Set up a data ingestion or sync pipeline — from source to destination, with transformation. Supports Airbyte, custom ETL scripts, and webhook-based ingestion. Use when the user says "sync data from X", "pull data into the DB", "set up a pipeline", or "ingest from API".
```
npx @senso-ai/shipables install KeyanVakil/wire-data-pipeline
```

Get data from where it is to where it needs to be.
Identify the pipeline shape (source, destination, transformation, and how often it runs), then choose the right approach:
**Airbyte**

Best for: syncing from SaaS tools (Stripe, Salesforce, Postgres, S3, etc.) into your warehouse/DB.

- Run `docker compose up` with Airbyte's `docker-compose.yml`
- Enable normalization to flatten nested JSON into relational tables
- Trigger syncs with `POST /api/v1/connections/sync`

**Custom ETL script**

Best for: a single API source with simple transformation.
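Triggering a sync through the `POST /api/v1/connections/sync` endpoint can be sketched as below. The base URL and connection ID are placeholders (a default local Airbyte deployment listens on port 8000), and the request body shape is an assumption you should check against your Airbyte version's API docs:

```typescript
// Sketch: trigger an Airbyte sync via its HTTP API.
// AIRBYTE_URL and the connectionId are placeholders, not real values.
const AIRBYTE_URL = 'http://localhost:8000'

export function buildSyncRequest(connectionId: string) {
  // the sync endpoint expects the connection ID in the JSON body (assumption)
  return {
    url: `${AIRBYTE_URL}/api/v1/connections/sync`,
    init: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ connectionId }),
    },
  }
}

export async function triggerSync(connectionId: string) {
  const { url, init } = buildSyncRequest(connectionId)
  const res = await fetch(url, init)
  if (!res.ok) throw new Error(`Sync trigger failed: ${res.status}`)
  return res.json() // job info you can poll for sync status
}
```

Separating request construction from the network call keeps the payload logic testable without a running Airbyte instance.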
```ts
// Pseudo-pattern for a scheduled ingestion function
export async function syncData() {
  // pull raw records from the source API
  const records = await fetchFromSource()
  // shape them for the destination schema
  const transformed = records.map(transform)
  // idempotent write: re-running the sync won't duplicate rows
  await upsertToDestination(transformed, { onConflict: 'id' })
}
```
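The scheduled function above can be extended into an incremental sync that only pulls rows changed since the last run. This is a minimal sketch under assumptions: `fetchSince` and `upsert` are hypothetical stand-ins for your own source and destination clients, and the checkpoint is kept in memory rather than a `pipeline_runs` table:

```typescript
// Sketch: incremental sync driven by an updated_at checkpoint.
// fetchSince and upsert are hypothetical; wire in your own clients.
type Row = { id: string; updated_at: string }
type Checkpoint = { lastSyncedAt: string }

export async function incrementalSync(
  fetchSince: (since: string) => Promise<Row[]>,
  upsert: (rows: Row[]) => Promise<void>,
  checkpoint: Checkpoint,
): Promise<Checkpoint> {
  // only pull rows changed since the last successful run
  const rows = await fetchSince(checkpoint.lastSyncedAt)
  if (rows.length === 0) return checkpoint

  await upsert(rows)

  // advance the checkpoint to the newest updated_at we saw,
  // so the next run resumes where this one left off
  const sorted = rows.map((r) => r.updated_at).sort()
  return { lastSyncedAt: sorted[sorted.length - 1] }
}
```

In a real pipeline you would persist the returned checkpoint (for example in a `pipeline_runs` row) so a crash between runs does not lose your place.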
- Use `updated_at` checks to sync only changed rows
- Log each run to a `pipeline_runs` table

**Webhook-based ingestion**

Best for: real-time data from Stripe, GitHub, etc.
- Track `last_synced_at` so you can detect stale pipelines
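A minimal webhook receiver for the real-time path might look like the sketch below, using Node's built-in `http` module. The route, event shape, and the commented-out upsert are illustrative assumptions; a production handler must also verify the provider's signature (e.g. Stripe's `Stripe-Signature` header), which is elided here:

```typescript
import { createServer } from 'node:http'

// Sketch: parse and validate an incoming webhook payload
// (the { type, data } shape is an assumption, not a provider spec)
export function parseEvent(raw: string): { type: string; data: unknown } {
  const event = JSON.parse(raw)
  if (typeof event.type !== 'string') throw new Error('malformed event')
  return event
}

export const server = createServer((req, res) => {
  if (req.method !== 'POST' || req.url !== '/webhooks') {
    res.writeHead(404).end()
    return
  }
  let raw = ''
  req.on('data', (chunk) => (raw += chunk))
  req.on('end', () => {
    try {
      const event = parseEvent(raw)
      // idempotent write keyed by the event's own id, e.g.:
      // await upsertToDestination([event.data], { onConflict: 'id' })
      res.writeHead(200).end('ok')
    } catch {
      // reject bad payloads so the provider retries or alerts
      res.writeHead(400).end('bad payload')
    }
  })
})

// server.listen(3000)
```

Returning 2xx quickly and doing heavy transformation out of band keeps the provider from timing out and re-delivering events.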