# TD Workflow Management
## Setup & CLI Commands

For the full CLI reference, see tdx-skills/workflow. Quick start:

```bash
tdx wf use my_project   # Set default project for session
tdx wf pull my_project  # Pull project locally for editing
tdx wf push             # Push changes with diff preview
```
## Debugging Steps

- `tdx wf sessions --status error` - find failed sessions
- `tdx wf timeline --session-id <id>` - visualize task execution
- `tdx wf attempt <id> logs +failed_task` - read error logs
- Verify query syntax if a `td>` task failed
- Check time ranges - does data exist for the `session_date`?
- Validate parameter values
- Check resource limits (memory, timeout)
- `tdx wf attempt <id> retry --resume-from +failed_task` - retry from the point of failure
## Alerting

```yaml
+critical_task:
  td>: queries/important.sql
  _error:
    +slack_alert:
      sh>: |
        curl -X POST ${secret:slack.webhook_url} \
          -H 'Content-Type: application/json' \
          -d '{"text": "Workflow failed: ${session_id}"}'
```
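The `_error:` hook fires only on failure; Digdag also provides a `_check:` hook that runs only after a task succeeds, which can be used for success notifications. A minimal sketch, reusing the same (assumed) Slack webhook secret:

```yaml
+critical_task:
  td>: queries/important.sql
  # _check tasks run only after +critical_task succeeds
  _check:
    +notify_success:
      sh>: |
        curl -X POST ${secret:slack.webhook_url} \
          -H 'Content-Type: application/json' \
          -d '{"text": "Workflow succeeded: ${session_id}"}'
```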
## Data Quality Checks

```yaml
+process:
  td>: queries/process.sql
  create_table: results

+validate:
  td>:
    query: |
      SELECT COUNT(*) as cnt,
             SUM(CASE WHEN id IS NULL THEN 1 ELSE 0 END) as nulls
      FROM results
  store_last_results: true

+check:
  if>: ${td.last_results.cnt == 0}
  _do:
    +fail:
      sh>: exit 1
```
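The same `store_last_results` output can also gate on the null count computed by the validation query. A sketch using Digdag's built-in `fail>` operator instead of `sh>: exit 1`, so the failure message appears in the logs:

```yaml
+check_nulls:
  if>: ${td.last_results.nulls > 0}
  _do:
    +fail:
      fail>: "Found ${td.last_results.nulls} NULL ids in results"
```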
## Wait for Data

```yaml
+wait_for_data:
  sh>: |
    # Poll up to 30 times, one minute apart (seq avoids bash-only brace expansion)
    for i in $(seq 1 30); do
      COUNT=$(tdx query -d analytics "SELECT COUNT(*) FROM src WHERE date='${session_date}'" --format csv | tail -1)
      if [ "$COUNT" -gt 0 ]; then exit 0; fi
      sleep 60
    done
    exit 1
```
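Treasure Data's Digdag distribution also ships a `td_wait>` operator that polls a query for you until its first result is true/non-zero, avoiding the hand-rolled loop. A sketch, assuming a hypothetical `queries/data_exists.sql` that returns a boolean or count:

```yaml
+wait_for_data:
  td_wait>: queries/data_exists.sql
  database: analytics
```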
## Idempotent Operations

```yaml
+safe_insert:
  td>:
    query: |
      DELETE FROM target WHERE date = '${session_date}';
      INSERT INTO target SELECT * FROM source WHERE date = '${session_date}'
```
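For full-table rebuilds, the `td>` operator's `create_table:` option is idempotent on its own: the target table is replaced on each run, so re-running the task cannot produce duplicates. A minimal sketch (the query file name is illustrative):

```yaml
+rebuild:
  td>: queries/build_target.sql
  create_table: target   # replaced on every run, safe to re-run
```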
## Backfill Pattern

```yaml
+backfill:
  for_each>:
    target_date: ["2024-01-01", "2024-01-02", "2024-01-03"]
  _do:
    +process:
      call>: main_workflow.dig
```

`for_each>` iterates the list (`loop>` only takes a count), and the called workflow reads each value as `${target_date}`. Built-in variables such as `session_date` cannot be overwritten, so pass the backfill date under a custom name.
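Backfill iterations are typically independent, so they can run concurrently with Digdag's `_parallel` option. A sketch capping concurrency (the limit value is illustrative):

```yaml
+backfill:
  for_each>:
    target_date: ["2024-01-01", "2024-01-02", "2024-01-03"]
  _parallel:
    limit: 2   # at most two dates processed at once
  _do:
    +process:
      call>: main_workflow.dig
```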
## Secrets

See tdx-skills/workflow for `tdx wf secrets` commands. Usage in `.dig` files:

```yaml
+task:
  sh>: curl -H "Authorization: ${secret:API_KEY}" https://api.example.com
```
## Common Issues

| Issue | Solution |
|---|---|
| Timeout | Add `timeout: 3600s`, `_retry: 2` |
| Intermittent failures | Add `_retry: 5` with exponential backoff |
| Out of memory | Reduce data volume, use approx functions |
| Duplicate runs | Use the idempotent DELETE+INSERT pattern |
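The exponential backoff mentioned above uses Digdag's extended `_retry` form (intervals are in seconds):

```yaml
+flaky_task:
  td>: queries/flaky.sql
  _retry:
    limit: 5
    interval: 60
    interval_type: exponential   # waits 60s, 120s, 240s, ...
```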