API Reference
Base URL: https://enrich.sh
All endpoints require authentication via Bearer token unless noted otherwise.
Authentication
Include your API key in the Authorization header:
```
Authorization: Bearer sk_live_your_key_here
```

| Prefix | Environment | Usage |
|---|---|---|
| sk_live_ | Production | Real data, billing applies |
| sk_test_ | Sandbox | Testing, no billing |
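As a quick illustration of the prefix convention above, a client might route keys to the right environment like this (a minimal sketch; the helper name is hypothetical, not part of the API):

```python
def environment_for_key(api_key):
    """Classify an API key by its documented prefix."""
    if api_key.startswith("sk_live_"):
        return "production"   # real data, billing applies
    if api_key.startswith("sk_test_"):
        return "sandbox"      # testing, no billing
    raise ValueError("unrecognized key prefix")
```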
Ingest Events
POST /ingest

Send events to be buffered and stored as Parquet. Events are validated against the stream's schema_mode; in strict mode, invalid events are routed to the Dead Letter Queue.
Request Body:
| Field | Type | Required | Description |
|---|---|---|---|
| stream_id | string | ✅ | Target stream identifier |
| data | array | ✅ | Array of event objects |
Example:
```bash
curl -X POST https://enrich.sh/ingest \
  -H "Authorization: Bearer sk_live_your_key" \
  -H "Content-Type: application/json" \
  -d '{
    "stream_id": "events",
    "data": [
      { "event": "click", "url": "/buy", "ts": 1738776000 },
      { "event": "purchase", "amount": 99.99, "ts": 1738776001 }
    ]
  }'
```

Response (200 OK):

```json
{
  "accepted": 2,
  "buffered": 1502
}
```

Limits
| Limit | Value |
|---|---|
| Max payload size | 1 MB |
| Max events per request | ~3,000 (within 1 MB) |
| Min events per request | 1 |
| Rate limit | ~200 req/s per customer |
TIP
For high-volume ingestion, batch 50–100 events per request for optimal throughput.
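The batching advice above can be sketched in Python. `chunk_events` is a hypothetical client-side helper (not part of the API) that splits a list of events into request-sized batches while also respecting the 1 MB payload cap:

```python
import json

MAX_PAYLOAD_BYTES = 1_000_000  # documented 1 MB request limit

def chunk_events(events, batch_size=100):
    """Yield batches of at most `batch_size` events, starting a new
    batch early if adding an event would push the serialized request
    body past the 1 MB payload limit."""
    batch = []
    for event in events:
        candidate = batch + [event]
        body = json.dumps({"stream_id": "events", "data": candidate})
        if batch and (len(candidate) > batch_size or len(body) > MAX_PAYLOAD_BYTES):
            yield batch
            batch = [event]
        else:
            batch = candidate
    if batch:
        yield batch
```

Each yielded batch can then be sent as one POST /ingest request.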
Error Responses
| Status | Error | Description |
|---|---|---|
| 400 | stream_id is required | Missing stream_id field |
| 400 | data must be an array | Invalid data format |
| 400 | data array is empty | Empty data array |
| 404 | Stream 'x' not found | Stream doesn't exist |
| 413 | Payload too large | Exceeds 1 MB request limit |
| 429 | Monthly event limit exceeded | Upgrade plan required |
Streams
List Streams
GET /streams

Response:

```json
{
  "streams": [
    {
      "stream_id": "events",
      "schema_mode": "evolve",
      "template": null,
      "created_at": "2026-02-01T10:00:00.000Z"
    },
    {
      "stream_id": "clicks",
      "schema_mode": "flex",
      "template": "clickstream",
      "created_at": "2026-02-03T15:00:00.000Z"
    }
  ]
}
```

Create Stream

POST /streams

| Field | Type | Required | Default | Description |
|---|---|---|---|---|
| stream_id | string | ✅ | — | Unique identifier (alphanumeric + _ + -) |
| fields | object | ❌ | null | Field type definitions |
| template | string | ❌ | null | Enrichment template |
| schema_mode | string | ❌ | flex | strict, evolve, or flex |
| webhook_url | string | ❌ | null | URL to forward events on flush |
Example:
```bash
curl -X POST https://enrich.sh/streams \
  -H "Authorization: Bearer sk_live_your_key" \
  -H "Content-Type: application/json" \
  -d '{
    "stream_id": "purchases",
    "schema_mode": "strict",
    "fields": {
      "order_id": { "type": "string" },
      "amount": { "type": "float64" },
      "currency": { "type": "string" }
    }
  }'
```

Response (201 Created):

```json
{
  "stream_id": "purchases",
  "schema_mode": "strict",
  "template": null,
  "created_at": "2026-02-05T12:00:00.000Z"
}
```

Update Stream

PUT /streams/:stream_id

Same body as Create; include only the fields you want to update.
Delete Stream
DELETE /streams/:stream_id

WARNING
This deletes the stream configuration only. Data already stored in R2 is not deleted.
Dead Letter Queue
List DLQ Events
GET /streams/:stream_id/dlq

Get events rejected by strict-mode validation.

| Parameter | Type | Default | Description |
|---|---|---|---|
| days | integer | 7 | Time range (last N days) |
| limit | integer | 100 | Max events to return |
Response:
```json
{
  "stream_id": "transactions",
  "dlq_count": 42,
  "events": [
    {
      "rejected_at": "2026-02-15T10:30:00Z",
      "reason": "extra_field",
      "field": "unknown_col",
      "original": {
        "order_id": "abc123",
        "amount": 99.99,
        "unknown_col": "should not be here"
      }
    }
  ]
}
```

DLQ Event Metadata

| Field | Description |
|---|---|
| rejected_at | Timestamp of rejection |
| reason | missing_field, extra_field, or type_mismatch |
| field | Which field caused the rejection |
| original | Full original event payload |
DLQ events are also stored as Parquet at {stream_id}/_dlq/ and queryable via DuckDB.
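To make the three rejection reasons concrete, here is a minimal client-side sketch of strict-mode validation. This is a hypothetical local mirror of the behavior described above, not the service's actual implementation; the type table maps the documented type names onto Python types:

```python
# Map documented schema types onto Python types (an assumption for
# this sketch; the service validates server-side).
TYPES = {"string": str, "float64": float, "int64": int}

def validate_strict(event, fields):
    """Return None if the event matches the schema, otherwise a
    DLQ-style record with the first rejection reason found."""
    for name, spec in fields.items():
        if name not in event:
            return {"reason": "missing_field", "field": name, "original": event}
        if not isinstance(event[name], TYPES[spec["type"]]):
            return {"reason": "type_mismatch", "field": name, "original": event}
    for name in event:
        if name not in fields:
            return {"reason": "extra_field", "field": name, "original": event}
    return None
```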
Schema Events
Get Schema Change History
GET /streams/:stream_id/schema-events

View detected schema changes for streams in evolve mode.

| Parameter | Type | Default | Description |
|---|---|---|---|
| days | integer | 7 | Time range (last N days) |
Response:
```json
{
  "stream_id": "erp_events",
  "schema_events": [
    {
      "type": "new_field",
      "field": "discount_code",
      "detected_type": "string",
      "detected_at": "2026-02-15T10:30:00Z",
      "event_count": 127
    },
    {
      "type": "type_change",
      "field": "amount",
      "previous_type": "int64",
      "detected_type": "string",
      "detected_at": "2026-02-15T11:00:00Z",
      "event_count": 3
    },
    {
      "type": "missing_field",
      "field": "currency",
      "detected_at": "2026-02-15T12:00:00Z",
      "event_count": 45
    }
  ]
}
```

| Change Type | Description |
|---|---|
| new_field | A field appeared that isn't in the schema |
| type_change | A field's data type changed |
| missing_field | A previously present field is no longer being sent |
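A sketch of how evolve-mode detection could work conceptually: diff one incoming event against the known schema and emit the three documented change types. This is a hypothetical illustration (the function name is invented, and it uses Python type names in place of the service's string/int64 labels):

```python
def detect_schema_changes(event, schema):
    """Compare one event against a known field->type schema and report
    changes using the documented change types."""
    changes = []
    for name, value in event.items():
        detected = type(value).__name__
        if name not in schema:
            # field present in the event but not in the schema
            changes.append({"type": "new_field", "field": name,
                            "detected_type": detected})
        elif detected != schema[name]:
            # field present, but with a different type than recorded
            changes.append({"type": "type_change", "field": name,
                            "previous_type": schema[name],
                            "detected_type": detected})
    for name in schema:
        if name not in event:
            # field in the schema that the event no longer sends
            changes.append({"type": "missing_field", "field": name})
    return changes
```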
Connect (S3 Credentials)
Get S3 Access Credentials
GET /connect

Get S3-compatible credentials for direct warehouse access to your data.
Response:
```json
{
  "endpoint": "https://abcdef123456.r2.cloudflarestorage.com",
  "bucket": "enrich-cust_abc123",
  "access_key_id": "your_r2_access_key",
  "secret_access_key": "your_r2_secret_key",
  "region": "auto",
  "egress_cost": "$0",
  "example_path": "s3://enrich-cust_abc123/events/2026/02/**/*.parquet",
  "sql": {
    "duckdb": "SET s3_region='auto'; SET s3_endpoint='abcdef123456.r2.cloudflarestorage.com'; SET s3_access_key_id='...'; SET s3_secret_access_key='...'; SELECT * FROM read_parquet('s3://enrich-cust_abc123/events/2026/02/**/*.parquet');",
    "clickhouse": "SELECT * FROM s3('https://abcdef123456.r2.cloudflarestorage.com/enrich-cust_abc123/events/2026/02/**/*.parquet', '...', '...', 'Parquet');"
  }
}
```

TIP
The Dashboard Connect page provides ready-to-paste SQL for ClickHouse, BigQuery, DuckDB, Snowflake, and Python.
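If you prefer to build the DuckDB setup yourself, a small helper can assemble the SET statements from the /connect response fields shown above. A minimal sketch, assuming the response shape documented here (the function name is hypothetical):

```python
def duckdb_setup_sql(connect):
    """Build DuckDB S3 configuration statements from a /connect
    response dict (endpoint, region, access_key_id, secret_access_key)."""
    host = connect["endpoint"].removeprefix("https://")
    return (
        f"SET s3_region='{connect['region']}'; "
        f"SET s3_endpoint='{host}'; "
        f"SET s3_access_key_id='{connect['access_key_id']}'; "
        f"SET s3_secret_access_key='{connect['secret_access_key']}';"
    )
```

Run the returned statements once per DuckDB session, then query with read_parquet as in the sql.duckdb example.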
Stream Replay
Replay Events to Webhook
POST /streams/:stream_id/replay

Re-send historical events from a time range to a webhook URL.

| Field | Type | Required | Description |
|---|---|---|---|
| from | string | ✅ | Start date (YYYY-MM-DD) |
| to | string | ✅ | End date (YYYY-MM-DD) |
| webhook_url | string | ✅ | Target URL |
Example:
```bash
curl -X POST https://enrich.sh/streams/events/replay \
  -H "Authorization: Bearer sk_live_your_key" \
  -H "Content-Type: application/json" \
  -d '{
    "from": "2026-02-01",
    "to": "2026-02-14",
    "webhook_url": "https://your-api.com/replay-target"
  }'
```

Response:

```json
{
  "replay_id": "rpl_abc123",
  "status": "started",
  "stream_id": "events",
  "from": "2026-02-01",
  "to": "2026-02-14",
  "estimated_events": 450000
}
```

Use cases: ML model retraining, backfilling downstream systems, disaster recovery.
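Since a replay can touch hundreds of thousands of events, a client-side pre-flight check on the date range is cheap insurance. A minimal sketch (the helper is hypothetical, not part of the API):

```python
from datetime import date

def validate_replay_range(from_str, to_str):
    """Check that both dates parse as YYYY-MM-DD and from <= to
    before POSTing a replay request."""
    try:
        start = date.fromisoformat(from_str)
        end = date.fromisoformat(to_str)
    except ValueError:
        return False
    return start <= end
```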
Templates
List Templates
GET /templates

Get available enrichment templates with their schemas.
Response:
```json
{
  "templates": [
    {
      "id": "clickstream",
      "name": "clickstream",
      "description": "Web and app analytics with session tracking, geo, and domain enrichment",
      "fields": [
        { "name": "url", "type": "string" },
        { "name": "timestamp", "type": "int64" },
        { "name": "user_id", "type": "string" }
      ]
    },
    {
      "id": "transaction",
      "name": "transaction",
      "description": "Payment and purchase event enrichment",
      "fields": [
        { "name": "txn_id", "type": "string" },
        { "name": "txn_amount", "type": "string" },
        { "name": "txn_currency", "type": "string" }
      ]
    }
  ]
}
```

Usage

Get Usage Stats

GET /usage

| Parameter | Type | Default | Description |
|---|---|---|---|
| start_date | string | 30 days ago | Start date (YYYY-MM-DD) |
| end_date | string | today | End date (YYYY-MM-DD) |
| stream_id | string | all | Filter by stream |

Example:

```bash
curl "https://enrich.sh/usage?start_date=2026-02-01&end_date=2026-02-05" \
  -H "Authorization: Bearer sk_live_your_key"
```

Response:
```json
{
  "usage": [
    {
      "date": "2026-02-05",
      "stream_id": "events",
      "event_count": 125000,
      "bytes_stored": 4521000,
      "file_count": 3
    }
  ],
  "totals": {
    "event_count": 223000,
    "bytes_stored": 7721000,
    "file_count": 5
  }
}
```

Live Stats

Get Real-Time Stats

GET /stats

Real-time ingestion statistics powered by Cloudflare Analytics Engine.
Response:
```json
{
  "status": "ok",
  "customer_id": "cust_abc123",
  "hour": "2026-02-07T09",
  "requests": 1523,
  "events": 45690,
  "errors": 2,
  "histogram": [
    { "minute": "2026-02-07T09:30", "events": 1200, "requests": 40, "errors": 0 }
  ],
  "hourly": [
    { "hour": "2026-02-07T08", "events": 52000, "requests": 1700, "errors": 5 }
  ]
}
```

| Field | Description |
|---|---|
| status | ok if data is flowing, idle if no recent activity |
| histogram | Minute-by-minute stats for the last hour |
| hourly | Hourly aggregates for the last 24 hours |
Errors
Get Recent Errors
GET /errors

| Parameter | Type | Default | Description |
|---|---|---|---|
| limit | integer | 50 | Max errors to return (max 100) |
| stream_id | string | all | Filter by stream |
Response:
```json
{
  "customer_id": "cust_abc123",
  "errors_24h": 12,
  "by_type": {
    "validation": 8,
    "schema_rejection": 2,
    "rate_limit": 1,
    "processing": 1
  },
  "recent": [
    {
      "id": "err_xyz789",
      "type": "validation",
      "message": "stream_id is required",
      "endpoint": "/ingest",
      "request_body": "{ ... }",
      "created_at": "2026-02-07T09:45:00.000Z"
    }
  ]
}
```

Health Check

GET /health

Public endpoint; no authentication required.

```json
{
  "status": "ok",
  "timestamp": "2026-02-05T18:00:00.000Z",
  "version": "1.0.0"
}
```

HTTP Status Codes
| Code | Description |
|---|---|
| 200 | Success |
| 201 | Created |
| 400 | Bad Request (validation error) |
| 401 | Unauthorized (invalid/missing API key) |
| 404 | Not Found |
| 405 | Method Not Allowed |
| 413 | Payload Too Large |
| 429 | Rate Limited / Quota Exceeded |
| 500 | Internal Server Error |
Rate Limits
| Tier | Requests/sec | Events/month | Features |
|---|---|---|---|
| Starter | 50 | 10M | Flex mode, shared storage |
| Pro | 200 | 100M | All schema modes, dedicated R2 bucket, DLQ, alerts |
| Scale | 300 | 500M | Everything + webhook forwarding, stream replay |
| Enterprise | Custom | Custom | Custom SLAs, dedicated support |
When rate limited, you'll receive a 429 response with a Retry-After header.
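Clients should honor that header before retrying. A minimal sketch of the retry-delay decision, assuming Retry-After arrives in delta-seconds form (the function name is hypothetical; Retry-After can also be an HTTP date, which this sketch ignores):

```python
import random

def retry_delay(attempt, retry_after=None):
    """Pick a wait time after a 429: honor Retry-After when the server
    sends one, otherwise fall back to capped exponential backoff with
    jitter."""
    if retry_after is not None:
        try:
            return float(retry_after)  # delta-seconds form
        except ValueError:
            pass  # HTTP-date form not handled in this sketch
    base = min(2 ** attempt, 60)  # cap the backoff at 60 seconds
    return base + random.uniform(0, 1)
```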
