TrigRun
Use Cases

Scheduled Data Pipeline and ETL Triggers

Trigger ETL pipelines, data syncs, and ML model retraining on a cron schedule with TrigRun. Kick off Airflow DAGs, dbt runs, or custom pipelines via HTTP.

Trigger your data pipeline at 1 AM every night. TrigRun calls your pipeline API — Airflow, dbt Cloud, Prefect, or a custom endpoint — so your ETL runs on time without maintaining a cron server.

The problem

Your data pipeline needs to run nightly: extract from production databases, transform, load into your warehouse, and refresh dashboards. Pipeline orchestrators (Airflow, Prefect) have built-in schedulers, but sometimes you need an external trigger — for cross-system coordination, as a backup scheduler, or for lightweight pipelines that don't justify a full orchestrator.

How it works with TrigRun

┌─────────────┐  daily 1 AM    ┌──────────────────┐              ┌──────────────┐
│   TrigRun   │ ──────────────▶│ Pipeline Trigger │ ────────────▶│ Data         │
│  Scheduler  │  POST /trigger │     Endpoint     │  kicks off   │ Warehouse    │
└─────────────┘                └──────────────────┘              └──────────────┘
       │                                │
       ▼                                ▼
  Execution log                  Starts the pipeline:
  "pipeline started,             Extract → Transform → Load
   run_id: abc123"

Trigger an Airflow DAG

curl -X POST https://api.trigrun.com/v1/jobs \
  -H "Authorization: Bearer YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Nightly ETL - production to warehouse",
    "kind": "cron",
    "schedule": {
      "cron": "0 1 * * *"
    },
    "request": {
      "url": "https://airflow.yourcompany.com/api/v1/dags/nightly_etl/dagRuns",
      "method": "POST",
      "headers": {
        "Authorization": "secret://airflow-api-key",
        "Content-Type": "application/json"
      },
      "body": {
        "conf": {
          "source": "production",
          "target": "warehouse",
          "mode": "incremental"
        }
      },
      "timeout_seconds": 30
    },
    "retry_policy": {
      "max_attempts": 3,
      "retry_on_statuses": [500, 502, 503]
    }
  }'
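The same job can also be created from code. Below is a minimal Python sketch using only the standard library, mirroring the curl example above; the helper names and token handling are illustrative, not part of the TrigRun API.

```python
import json
import urllib.request

TRIGRUN_API = "https://api.trigrun.com/v1/jobs"  # endpoint from the curl example


def build_nightly_etl_job() -> dict:
    """Return the same job definition as the curl example above."""
    return {
        "name": "Nightly ETL - production to warehouse",
        "kind": "cron",
        "schedule": {"cron": "0 1 * * *"},  # every day at 01:00
        "request": {
            "url": "https://airflow.yourcompany.com/api/v1/dags/nightly_etl/dagRuns",
            "method": "POST",
            "headers": {
                "Authorization": "secret://airflow-api-key",
                "Content-Type": "application/json",
            },
            "body": {
                "conf": {
                    "source": "production",
                    "target": "warehouse",
                    "mode": "incremental",
                }
            },
            "timeout_seconds": 30,
        },
        "retry_policy": {"max_attempts": 3, "retry_on_statuses": [500, 502, 503]},
    }


def create_job_request(api_token: str) -> urllib.request.Request:
    """Build the HTTP request; pass it to urllib.request.urlopen() to send."""
    return urllib.request.Request(
        TRIGRUN_API,
        data=json.dumps(build_nightly_etl_job()).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Keeping the job definition in version control this way makes schedule changes reviewable like any other code change.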

Trigger a dbt Cloud run

Job definition (POST it to /v1/jobs as in the curl example above):

{
  "name": "dbt Cloud - nightly model refresh",
  "kind": "cron",
  "schedule": { "cron": "0 3 * * *" },
  "request": {
    "url": "https://cloud.getdbt.com/api/v2/accounts/ACCOUNT_ID/jobs/JOB_ID/run/",
    "method": "POST",
    "headers": {
      "Authorization": "secret://dbt-cloud-token",
      "Content-Type": "application/json"
    },
    "body": { "cause": "Triggered by TrigRun nightly schedule" }
  }
}
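TrigRun records only the trigger response; the dbt run itself finishes later. If a downstream script needs to wait for the run, a polling loop like this sketch works. The status integers are assumptions based on dbt Cloud's run-status codes (10 = success, 20 = error, 30 = cancelled); verify them against your dbt Cloud API version. `fetch_status` is injected so it can wrap a real status request or a test stub.

```python
import time
from typing import Callable

# dbt Cloud run-status integers (assumed; check your API version):
# 1 queued, 2 starting, 3 running, 10 success, 20 error, 30 cancelled
TERMINAL = {10: "success", 20: "error", 30: "cancelled"}


def wait_for_run(fetch_status: Callable[[], int],
                 poll_seconds: float = 30,
                 timeout_seconds: float = 3600) -> str:
    """Poll fetch_status() until the run reaches a terminal state or we time out.

    fetch_status would typically wrap a GET to
    /api/v2/accounts/{account}/runs/{run_id}/ and return the status integer.
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        status = fetch_status()
        if status in TERMINAL:
            return TERMINAL[status]
        time.sleep(poll_seconds)
    raise TimeoutError("dbt run did not finish within the timeout")
```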

Trigger ML model retraining

{
  "name": "ML model retrain - recommendation engine",
  "kind": "cron",
  "schedule": { "cron": "0 2 * * 0" },
  "request": {
    "url": "https://ml.yourcompany.com/api/train",
    "method": "POST",
    "headers": { "Authorization": "secret://ml-api-key" },
    "body": {
      "model": "recommendations-v3",
      "dataset": "last_7_days",
      "deploy_on_success": true
    }
  }
}
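The cron expression `0 2 * * 0` fires at 02:00 every Sunday (cron counts Sunday as day-of-week 0). As a sanity check on a weekly schedule, a small stdlib-only helper can compute the next fire time; this is purely illustrative, not how TrigRun evaluates schedules.

```python
from datetime import datetime, timedelta


def next_weekly_fire(now: datetime, hour: int = 2, cron_dow: int = 0) -> datetime:
    """Next fire time for a cron like '0 2 * * 0' (hour=2, Sunday).

    cron counts Sunday as 0; Python's weekday() counts Monday as 0,
    so cron day-of-week d maps to Python weekday (d - 1) % 7.
    """
    target_weekday = (cron_dow - 1) % 7  # Sunday -> 6
    candidate = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    candidate += timedelta(days=(target_weekday - now.weekday()) % 7)
    if candidate <= now:  # already past this week's slot
        candidate += timedelta(days=7)
    return candidate
```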

Expected results

Successful pipeline trigger:

Field     Example value
Status    200 OK
Duration  1,120 ms
Response  {"dag_run_id": "manual__2026-03-16T01:00:00", "state": "queued", "execution_date": "2026-03-16T06:00:00Z"}

Note: TrigRun logs the trigger response, not the pipeline duration. The pipeline itself runs asynchronously for minutes or hours. Use your pipeline tool's UI to monitor progress.

Pipeline coordination pattern

Use multiple TrigRun jobs to orchestrate a multi-step pipeline:

Job              Schedule   Purpose
Extract          0 1 * * *  Pull data from production
Transform        0 2 * * *  Run dbt models (1 hour after extract)
Load dashboards  0 3 * * *  Refresh Metabase/Looker cache
Notify team      0 4 * * *  POST summary to Slack

Common schedules

Pattern        Expression   Use case
Nightly 1 AM   0 1 * * *    Standard ETL
Every 6 hours  0 */6 * * *  Near-real-time warehouse
Weekly Sunday  0 2 * * 0    ML model retraining
Monthly 1st    0 1 1 * *    Monthly aggregation
