# dbt with Treasure Data Trino
## Installation

```bash
uv venv && source .venv/bin/activate
uv pip install dbt-core dbt-trino==1.9.3
```
## profiles.yml

```yaml
td:
  target: dev
  outputs:
    dev:
      type: trino
      method: none                  # Not 'ldap'
      user: "{{ env_var('TD_API_KEY') }}"
      password: dummy               # Not used
      host: api-presto.treasuredata.com
      port: 443
      database: td                  # Always 'td'
      schema: your_dev_database     # Your TD database name
      threads: 4
      http_scheme: https
      session_properties:
        query_max_run_time: 1h
```
Key TD settings:

- `method: none` for API key auth
- `database: td` (always)
- `schema: your_td_database` (what you see in TD Console)
## Required Override Macros

TD doesn't support `CREATE VIEW`. Create `macros/override_dbt_trino.sql`:
```sql
{% macro trino__create_view_as(relation, sql) -%}
  create or replace table {{ relation }} as (
    {{ sql }}
  );
{%- endmacro %}

{% macro trino__list_relations_without_caching(schema_relation) %}
  {% call statement('list_relations_without_caching', fetch_result=True) %}
    -- list tables/views from the catalog's information_schema
    select
      table_catalog as "database",
      table_schema as "schema",
      table_name as "name",
      table_type as "type"
    from {{ schema_relation.database }}.information_schema.tables
    where table_schema = '{{ schema_relation.schema }}'
  {% endcall %}
  {{ return(load_result('list_relations_without_caching').table) }}
{% endmacro %}
```
## dbt_project.yml

```yaml
name: 'my_td_project'
version: '1.0.0'
config-version: 2
profile: 'td'

flags:
  require_certificate_validation: true

vars:
  target_range: '-3M/now'

models:
  my_td_project:
    +materialized: table
    +on_schema_change: "append_new_columns"
```
## Model Patterns

Basic model (assumes a `raw.events` source declared in a sources `.yml` file; see the sketch after these examples):

```sql
{{ config(materialized='table') }}

SELECT
  TD_TIME_STRING(time, 'd!', 'JST') as date,
  COUNT(*) as event_count
FROM {{ source('raw', 'events') }}
WHERE TD_INTERVAL(time, '{{ var("target_range", "-7d") }}', 'JST')
GROUP BY 1
```
Incremental model:

```sql
{{
  config(
    materialized='incremental',
    unique_key='event_id'
  )
}}

SELECT *
FROM {{ source('raw', 'events') }}
WHERE TD_INTERVAL(time, '{{ var("target_range", "-1d") }}', 'JST')
{% if is_incremental() %}
  AND time > (SELECT MAX(time) FROM {{ this }})
{% endif %}
```
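Both models reference `source('raw', 'events')`, which dbt resolves through a sources definition. A minimal sketch, assuming the raw events live in a TD database named `raw` (the file path, source name, and table name below are illustrative):

```yaml
# models/sources.yml -- illustrative; adjust names to your TD account
version: 2

sources:
  - name: raw
    database: td        # TD catalog is always 'td'
    schema: raw         # TD database that holds the raw tables
    tables:
      - name: events
```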
## Commands

```bash
dbt debug                                  # Test connection
dbt run                                    # Run all models
dbt run --select daily_events              # Run a specific model
dbt run --vars '{"target_range": "-1d"}'   # Override a variable
dbt run --full-refresh                     # Rebuild incremental models
dbt test                                   # Run tests
```
## TD Workflow Deployment

```yaml
# dbt_workflow.dig
timezone: Asia/Tokyo

schedule:
  daily>: 03:00:00

_export:
  docker:
    image: "treasuredata/customscript-python:3.12.11-td1"
  _env:
    TD_API_KEY: ${secret:td.apikey}

+setup:
  py>: tasks.InstallPackages

+dbt_run:
  py>: dbt_wrapper.run_dbt
  command_args: ['run', '--target', 'prod']
```
`tasks.py`:

```python
def InstallPackages():
    import subprocess, sys
    subprocess.check_call([
        sys.executable, '-m', 'pip', 'install',
        'dbt-core==1.10.9', 'dbt-trino==1.9.3',
    ])
```
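`dbt_wrapper.py` (referenced by the `+dbt_run` task) isn't shown above. A minimal sketch, assuming the dbt project files are bundled in the workflow project directory, `+setup` has already installed dbt, and a `prod` output exists in `profiles.yml` alongside `dev`:

```python
# dbt_wrapper.py -- illustrative sketch, not part of the original skill
import os
import subprocess


def run_dbt(command_args):
    """Invoke the dbt CLI, e.g. run_dbt(['run', '--target', 'prod'])."""
    project_dir = os.path.dirname(os.path.abspath(__file__))
    cmd = ['dbt'] + list(command_args) + [
        '--project-dir', project_dir,
        # profiles.yml reads TD_API_KEY from the _env set in the .dig file
        '--profiles-dir', project_dir,
    ]
    subprocess.check_call(cmd, cwd=project_dir)
```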
## Common Errors

| Error | Fix |
|---|---|
| `connector does not support creating views` | Add the `trino__create_view_as` override macro above |
| `Table ownership information not available` | Add the `trino__list_relations_without_caching` override macro above |
| `Var 'target_range' is undefined` | Provide a default: `{{ var('target_range', '-1d') }}` |