Scheduled Data Pipeline and ETL Triggers
Trigger ETL pipelines, data syncs, and ML model retraining on a cron schedule with TrigRun. Kick off Airflow DAGs, dbt runs, or custom pipelines via HTTP.
Trigger your data pipeline at 1 AM every night. TrigRun calls your pipeline API — Airflow, dbt Cloud, Prefect, or a custom endpoint — so your ETL runs on time without maintaining a cron server.
The problem
Your data pipeline needs to run nightly: extract from production databases, transform, load into your warehouse, and refresh dashboards. Pipeline orchestrators (Airflow, Prefect) have built-in schedulers, but sometimes you need an external trigger — for cross-system coordination, as a backup scheduler, or for lightweight pipelines that don't justify a full orchestrator.
How it works with TrigRun
Trigger an Airflow DAG
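A minimal trigger request against the Airflow 2 stable REST API; the host, credentials, and DAG id below are placeholders you would replace with your own:

```shell
# Assumed values: replace with your Airflow host, credentials, and DAG id.
AIRFLOW_HOST="https://airflow.example.com"
DAG_ID="nightly_etl"

# Airflow 2 stable REST API: POST /api/v1/dags/{dag_id}/dagRuns creates a run.
# An empty "conf" object triggers the DAG with its default parameters.
curl --connect-timeout 5 -X POST "$AIRFLOW_HOST/api/v1/dags/$DAG_ID/dagRuns" \
  --user "airflow_user:airflow_password" \
  -H "Content-Type: application/json" \
  -d '{"conf": {}}'
```

Configure this URL, auth, and body in a TrigRun job and it fires on your cron expression.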
Trigger a dbt Cloud run
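dbt Cloud jobs can be kicked off through its v2 API; the account id, job id, and token below are placeholder values:

```shell
# Assumed values: replace with your dbt Cloud account id, job id, and API token.
ACCOUNT_ID="12345"
JOB_ID="67890"
DBT_TOKEN="your_dbt_cloud_api_token"

# dbt Cloud API v2: trigger a job run. "cause" is a free-text label
# that shows up in the run history.
curl --connect-timeout 5 -X POST \
  "https://cloud.getdbt.com/api/v2/accounts/$ACCOUNT_ID/jobs/$JOB_ID/run/" \
  -H "Authorization: Token $DBT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"cause": "Scheduled trigger from TrigRun"}'
```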
Trigger ML model retraining
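Retraining usually lives behind a custom endpoint, so there is no standard API to show; this sketch assumes a hypothetical training service where the path, token, and payload fields are all placeholders for whatever your service expects:

```shell
# Hypothetical retraining endpoint: path, token, and payload fields are
# placeholders, not a real API.
RETRAIN_TOKEN="your_training_service_token"

curl --connect-timeout 5 -X POST "https://ml.example.com/models/churn/retrain" \
  -H "Authorization: Bearer $RETRAIN_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"dataset": "latest", "notify": "ml-team@example.com"}'
```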
Expected results
Successful pipeline trigger:
| Field | Example value |
|---|---|
| Status | 200 OK |
| Duration | 1,120 ms |
| Response | {"dag_run_id": "manual__2026-03-16T01:00:00+00:00", "state": "queued", "execution_date": "2026-03-16T01:00:00Z"} |
Note: TrigRun logs the trigger response, not the pipeline duration. The pipeline itself runs asynchronously for minutes or hours. Use your pipeline tool's UI to monitor progress.
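If you prefer a terminal over the UI, the Airflow 2 REST API also exposes run state; a sketch assuming placeholder host, DAG id, and the run id returned by the trigger call:

```shell
# Assumed values: replace with your Airflow host, DAG id, and the
# dag_run_id from the trigger response.
AIRFLOW_HOST="https://airflow.example.com"
DAG_ID="nightly_etl"
RUN_ID="manual__2026-03-16T01:00:00+00:00"

# Airflow 2 stable REST API: fetch one DAG run. The "state" field in the
# response moves from queued to running to success or failed.
curl --connect-timeout 5 \
  "$AIRFLOW_HOST/api/v1/dags/$DAG_ID/dagRuns/$RUN_ID" \
  --user "airflow_user:airflow_password"
```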
Pipeline coordination pattern
Use multiple TrigRun jobs to orchestrate a multi-step pipeline:
| Job | Schedule | Purpose |
|---|---|---|
| Extract | 0 1 * * * | Pull data from production |
| Transform | 0 2 * * * | Run dbt models (1 hour after extract) |
| Load dashboards | 0 3 * * * | Refresh Metabase/Looker cache |
| Notify team | 0 4 * * * | POST summary to Slack |
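The final "Notify team" step can be a plain Slack incoming-webhook call; the webhook URL below is a placeholder you would generate in your Slack workspace:

```shell
# Placeholder Slack incoming-webhook URL: create a real one in Slack.
WEBHOOK_URL="https://hooks.slack.com/services/T000/B000/XXXX"

# Slack incoming webhooks accept a JSON body with a "text" field.
curl --connect-timeout 5 -X POST "$WEBHOOK_URL" \
  -H "Content-Type: application/json" \
  -d '{"text": "Nightly ETL finished: extract, transform, and dashboard refresh complete."}'
```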
Common schedules
| Pattern | Expression | Use case |
|---|---|---|
| Nightly 1 AM | 0 1 * * * | Standard ETL |
| Every 6 hours | 0 */6 * * * | Near-real-time warehouse |
| Weekly Sunday | 0 2 * * 0 | ML model retraining |
| Monthly 1st | 0 1 1 * * | Monthly aggregation |