
Getting Started

Get up and running with Enrich.sh in under 5 minutes.

Prerequisites

  • An Enrich.sh account — sign up at dashboard.enrich.sh
  • Your API key (found in Dashboard → Settings → API Keys)

Step 1: Get Your API Key

Your API key looks like this:

sk_live_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

WARNING

Keep your API key secret! Never expose it in client-side code. Use a server-side proxy or the SDK's beacon() method.
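For example, a minimal server-side proxy can attach the key before forwarding browser events to the ingest endpoint, so the key never reaches the client. This is a sketch assuming Node 18+; `buildIngestRequest` is an illustrative helper, not part of the SDK:

```javascript
// Minimal server-side proxy sketch: the browser posts events to YOUR server,
// and your server attaches the secret key before forwarding to Enrich.sh.
const ENRICH_KEY = process.env.ENRICH_API_KEY // never shipped to the client

// Build the upstream request for a batch of events received from the browser.
function buildIngestRequest(streamId, events) {
  return {
    url: 'https://enrich.sh/ingest',
    options: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${ENRICH_KEY}`, // key attached server-side only
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ stream_id: streamId, data: events }),
    },
  }
}

// In your route handler (sketch):
// const { url, options } = buildIngestRequest('events', req.body.events)
// const res = await fetch(url, options)
```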

Step 2: Install the SDK

bash
npm install enrich.sh

Or use curl directly — no SDK required.

Step 3: Create a Stream

A stream defines where and how your data is stored. Create one via the Dashboard or API.

Option A: Dashboard

  1. Go to dashboard.enrich.sh → Streams
  2. Click Create New Stream
  3. Enter a stream_id (e.g., events, logs, purchases)
  4. Configure fields and click Create

Option B: API

Simple stream (flex mode — accepts everything):

bash
curl -X POST https://enrich.sh/streams \
  -H "Authorization: Bearer sk_live_your_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "stream_id": "events"
  }'

With typed fields + evolve mode (recommended):

bash
curl -X POST https://enrich.sh/streams \
  -H "Authorization: Bearer sk_live_your_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "stream_id": "events",
    "schema_mode": "evolve",
    "fields": {
      "event": { "type": "string" },
      "url": { "type": "string" },
      "user_id": { "type": "string" },
      "ts": { "name": "timestamp", "type": "int64" },
      "metadata": { "type": "json" }
    }
  }'

INFO

evolve mode auto-detects schema changes and alerts you in the Dashboard. See Streams Configuration for all modes.

Supported Field Types

Type      Description               Example
string    Text (default)            "hello"
int64     Integer / timestamp       1738776000
float64   Decimal number            99.99
boolean   True / false              true
json      Nested objects / arrays   {"a": 1}

TIP

See Streams Configuration for complete details on field types, nested objects, and schema modes.
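To make the table concrete, here is a hypothetical event that exercises each type, plus a rough client-side check that maps JavaScript values to the table above (illustrative only — not the SDK's validation):

```javascript
// One event touching each supported field type (hypothetical values).
const event = {
  event: 'purchase',                           // string
  ts: 1738776000,                              // int64 (Unix seconds)
  amount: 99.99,                               // float64
  is_trial: false,                             // boolean
  metadata: { plan: 'pro', tags: ['a', 'b'] }, // json: nested objects / arrays
}

// Rough mapping from a JS value to the field types in the table.
function fieldType(v) {
  if (typeof v === 'string') return 'string'
  if (typeof v === 'boolean') return 'boolean'
  if (typeof v === 'number') return Number.isInteger(v) ? 'int64' : 'float64'
  return 'json' // objects and arrays
}
```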

Step 4: Send Your First Event

Using curl

bash
curl -X POST https://enrich.sh/ingest \
  -H "Authorization: Bearer sk_live_your_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "stream_id": "events",
    "data": [
      {
        "event": "page_view",
        "url": "https://example.com/pricing",
        "user_id": "user_123",
        "ts": 1738776000
      }
    ]
  }'

Using the SDK

javascript
import { Enrich } from 'enrich.sh'

const enrich = new Enrich('sk_live_your_key_here')

// Buffer + auto-flush (recommended for high volume)
enrich.track('events', {
  event: 'page_view',
  url: 'https://example.com/pricing',
  user_id: 'user_123',
})

// Or send immediately
await enrich.ingest('events', {
  event: 'page_view',
  url: 'https://example.com/pricing',
  user_id: 'user_123',
})

Response

json
{
  "accepted": 1,
  "buffered": 1
}
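The `track()` call buffers events and flushes them in the background. A rough sketch of that buffering behavior (illustrative — the real SDK's internals may differ):

```javascript
// Sketch of track()-style buffering: queue events per stream, flush when a
// queue reaches maxSize or on a timer. `send` is your upload function.
class EventBuffer {
  constructor(send, { maxSize = 100, intervalMs = 5000 } = {}) {
    this.send = send                // (streamId, events) => void
    this.maxSize = maxSize
    this.queues = new Map()         // streamId -> pending events
    this.timer = setInterval(() => this.flush(), intervalMs)
    if (this.timer.unref) this.timer.unref() // don't keep a Node process alive
  }

  track(streamId, event) {
    const q = this.queues.get(streamId) ?? []
    q.push(event)
    this.queues.set(streamId, q)
    if (q.length >= this.maxSize) this.flush(streamId)
  }

  flush(streamId) {
    const ids = streamId ? [streamId] : [...this.queues.keys()]
    for (const id of ids) {
      const q = this.queues.get(id)
      if (q && q.length) {
        this.queues.set(id, []) // reset before sending
        this.send(id, q)
      }
    }
  }
}
```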

Step 5: Batch Your Events

For production use, always batch your events:

bash
curl -X POST https://enrich.sh/ingest \
  -H "Authorization: Bearer sk_live_your_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "stream_id": "events",
    "data": [
      { "event": "page_view", "url": "/home", "ts": 1738776000 },
      { "event": "page_view", "url": "/pricing", "ts": 1738776001 },
      { "event": "click", "element": "signup_btn", "ts": 1738776002 },
      { "event": "page_view", "url": "/signup", "ts": 1738776003 }
    ]
  }'

TIP

Batch 50–100 events per request for optimal throughput.
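If you accumulate events in memory, a small helper can split them into request-sized batches before sending. `chunk` is a hypothetical helper, not part of the SDK:

```javascript
// Split a large event array into batches of `size` (50–100 per the tip above).
function chunk(events, size = 100) {
  const batches = []
  for (let i = 0; i < events.length; i += size) {
    batches.push(events.slice(i, i + size))
  }
  return batches
}

// Usage sketch with the raw ingest API:
// for (const batch of chunk(allEvents)) {
//   await fetch('https://enrich.sh/ingest', {
//     method: 'POST',
//     headers: { Authorization: `Bearer ${key}`, 'Content-Type': 'application/json' },
//     body: JSON.stringify({ stream_id: 'events', data: batch }),
//   })
// }
```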

Step 6: Connect Your Warehouse

Go to Dashboard → Stream → Connect to get S3-compatible credentials and ready-to-paste SQL for your warehouse.

What You Get   Details
S3 endpoint    {account_id}.r2.cloudflarestorage.com
Bucket         enrich-{customer_id}
Access         Scoped read-only
Egress cost    $0
Works with     ClickHouse, BigQuery, DuckDB, Snowflake, Python

See Streams → Connecting Your Warehouse for copy-paste SQL examples.

Step 7: Query Your Data

Using DuckDB

sql
INSTALL httpfs;
LOAD httpfs;

-- Configure R2 credentials (from Dashboard → Connect)
SET s3_region = 'auto';
SET s3_endpoint = 'your-account.r2.cloudflarestorage.com';
SET s3_access_key_id = 'your_r2_access_key';
SET s3_secret_access_key = 'your_r2_secret';

-- Query your data
SELECT *
FROM read_parquet('s3://enrich-your-id/events/2026/02/**/*.parquet');

Using the SDK

javascript
const urls = await enrich.query('events', { days: 7 })

// Pass directly to DuckDB
await conn.query(
  `SELECT * FROM read_parquet(${JSON.stringify(urls)})`
)

Using Python

python
import duckdb

conn = duckdb.connect()
conn.execute("""
    SET s3_region = 'auto';
    SET s3_endpoint = 'your-account.r2.cloudflarestorage.com';
    SET s3_access_key_id = 'your_r2_access_key';
    SET s3_secret_access_key = 'your_r2_secret';
""")

df = conn.execute("""
    SELECT event, COUNT(*) as count
    FROM read_parquet('s3://enrich-your-id/events/2026/02/**/*.parquet')
    GROUP BY event
    ORDER BY count DESC
""").fetchdf()

print(df)

Next Steps

  • Streams Configuration — field types, nested objects, and schema modes
  • Streams → Connecting Your Warehouse — copy-paste SQL for ClickHouse, BigQuery, DuckDB, and Snowflake