# TD Hive SQL
## TD Time Functions
### td_interval (Recommended for relative time)

```sql
where td_interval(time, '-1d', 'JST')     -- Yesterday
where td_interval(time, '-1w', 'JST')     -- Previous week
where td_interval(time, '-1M', 'JST')     -- Previous month
where td_interval(time, '-1d/-1d', 'JST') -- 2 days ago
```
Note: TD_SCHEDULED_TIME() cannot be used as the first argument; the first argument must be the raw time column. Reference TD_SCHEDULED_TIME() elsewhere in the query to establish the anchor date for scheduled runs.
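One way to make the anchor explicit in a scheduled query is to surface TD_SCHEDULED_TIME() in the select list. A sketch of a daily rollup (the table name `access_logs` is a placeholder):

```sql
-- Hypothetical scheduled daily query; td_interval resolves '-1d'
-- relative to the run's TD_SCHEDULED_TIME().
select
  TD_TIME_FORMAT(TD_SCHEDULED_TIME(), 'yyyy-MM-dd', 'JST') as run_date,
  count(1) as events
from access_logs
where td_interval(time, '-1d', 'JST')
group by TD_TIME_FORMAT(TD_SCHEDULED_TIME(), 'yyyy-MM-dd', 'JST')
```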
### td_time_range (Explicit dates)

```sql
where td_time_range(time, '2024-01-01', '2024-01-31', 'JST') -- end date is exclusive
where td_time_range(time, '2024-01-01', null, 'JST')         -- Open-ended
```
### TD_TIME_FORMAT

```sql
TD_TIME_FORMAT(time, 'yyyy-MM-dd HH:mm:ss', 'JST')
```
### TD_TIME_PARSE

```sql
TD_TIME_PARSE('2024-01-01', 'JST') -- String to Unix timestamp
```
### TD_DATE_TRUNC

```sql
TD_DATE_TRUNC('day', time, 'JST')
TD_DATE_TRUNC('hour', time, 'UTC')
```
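TD_DATE_TRUNC returns a Unix timestamp, so pair it with TD_TIME_FORMAT for readable buckets. A daily-rollup sketch (the table name `events` is a placeholder):

```sql
-- Daily event counts in JST over the last 7 days; events is hypothetical.
select
  TD_TIME_FORMAT(TD_DATE_TRUNC('day', time, 'JST'), 'yyyy-MM-dd', 'JST') as day,
  count(1) as cnt
from events
where td_interval(time, '-7d', 'JST')
group by TD_TIME_FORMAT(TD_DATE_TRUNC('day', time, 'JST'), 'yyyy-MM-dd', 'JST')
```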
## Hive-Specific Features
### MAPJOIN Hint

```sql
select /*+ MAPJOIN(small_table) */ *
from large_table l
join small_table s on l.id = s.id
where td_time_range(l.time, '2024-01-01', null, 'JST')
```
### lateral view with explode

```sql
select user_id, tag
from user_profiles
lateral view explode(tags) tags_table as tag
where td_time_range(time, '2024-01-01', null, 'JST')
```
### get_json_object

```sql
select
  get_json_object(json_column, '$.user.id') as user_id,
  get_json_object(json_column, '$.event.type') as event_type
from raw_events
```
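When extracting several top-level keys, Hive's json_tuple parses the JSON once instead of once per get_json_object call. A sketch assuming top-level `user_id` and `event_type` keys in `json_column` (json_tuple cannot reach nested paths like `$.user.id`):

```sql
-- Single parse of json_column for two top-level keys.
select t.user_id, t.event_type
from raw_events
lateral view json_tuple(json_column, 'user_id', 'event_type') t as user_id, event_type
```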
### Dynamic Partitioning

```sql
set hive.exec.dynamic.partition = true;
set hive.exec.dynamic.partition.mode = nonstrict;

insert overwrite table target_table partition(dt)
select *, TD_TIME_FORMAT(time, 'yyyy-MM-dd', 'JST') as dt
from source_table
where td_time_range(time, '2024-01-01', '2024-01-31', 'JST')
```
## Differences from Trino

| Feature | Hive | Trino |
|---|---|---|
| Approx distinct | `count(distinct x)` (exact only) | `approx_distinct(x)` |
| Time format | `TD_TIME_FORMAT()` | `td_time_string()` |
| Small table join | `/*+ MAPJOIN(t) */` | Automatic |
| Flatten array | `lateral view explode()` | `unnest()` |
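The last row of the table, written both ways, reusing the `user_profiles`/`tags` names from the lateral view example above:

```sql
-- Hive: lateral view explode
select user_id, tag
from user_profiles
lateral view explode(tags) t as tag

-- Trino: cross join unnest
select user_id, tag
from user_profiles
cross join unnest(tags) as t (tag)
```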
## Common Errors

| Error | Fix |
|---|---|
| OutOfMemoryError | Reduce the time range; use a MAPJOIN hint for small tables |
| Too many dynamic partitions | Reduce the partition count, or raise `hive.exec.max.dynamic.partitions` |
## When to Use Hive vs Trino

- Use Hive: memory errors in Trino, batch ETL, Hive-specific UDFs
- Use Trino: interactive queries, faster execution, approximate functions