# Unity Catalog
Guidance for Unity Catalog system tables, volumes, and governance.
## When to Use This Skill
Use this skill when:
- Working with volumes (upload, download, list files in /Volumes/)
- Querying lineage (table dependencies, column-level lineage)
- Analyzing audit logs (who accessed what, permission changes)
- Monitoring billing and usage (DBU consumption, cost analysis)
- Tracking compute resources (cluster usage, warehouse metrics)
- Reviewing job execution (run history, success rates, failures)
- Analyzing query performance (slow queries, warehouse utilization)
- Profiling data quality (data profiling, drift detection, metric tables)
## Reference Files
| Topic | File | Description |
|---|---|---|
| System Tables | 5-system-tables.md | Lineage, audit, billing, compute, jobs, query history |
| Volumes | 6-volumes.md | Volume file operations, permissions, best practices |
| Data Profiling | 7-data-profiling.md | Data profiling, drift detection, profile metrics |
## Quick Start

### Volume File Operations (MCP Tools)
| Tool | Usage |
|---|---|
| `list_volume_files` | `list_volume_files(volume_path="/Volumes/catalog/schema/volume/path/")` |
| `get_volume_folder_details` | `get_volume_folder_details(volume_path="catalog/schema/volume/path", format="parquet")` - schema, row counts, stats |
| `upload_to_volume` | `upload_to_volume(local_path="/tmp/data/*", volume_path="/Volumes/.../dest")` |
| `download_from_volume` | `download_from_volume(volume_path="/Volumes/.../file.csv", local_path="/tmp/file.csv")` |
| `create_volume_directory` | `create_volume_directory(volume_path="/Volumes/.../new_folder")` |
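A typical workflow chains these tools: create a destination folder, upload local files, then verify what landed. Below is a minimal sketch using the tool-call syntax from the table; the `main.raw.landing` catalog, schema, and volume names are placeholders, not part of the skill.

```python
# Hypothetical example: stage local Parquet files into a Unity Catalog volume.
# Volume path components (main/raw/landing) are placeholders - substitute your own.

# 1. Create the destination folder inside the volume
create_volume_directory(volume_path="/Volumes/main/raw/landing/2024-06-01")

# 2. Upload everything from a local staging directory
upload_to_volume(
    local_path="/tmp/staging/*",
    volume_path="/Volumes/main/raw/landing/2024-06-01",
)

# 3. Confirm the files landed and inspect schema and row counts
list_volume_files(volume_path="/Volumes/main/raw/landing/2024-06-01/")
get_volume_folder_details(
    volume_path="main/raw/landing/2024-06-01",
    format="parquet",
)
```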
### Enable System Tables Access

```sql
-- Grant access to system tables
GRANT USE CATALOG ON CATALOG system TO `data_engineers`;
GRANT USE SCHEMA ON SCHEMA system.access TO `data_engineers`;
GRANT SELECT ON SCHEMA system.access TO `data_engineers`;
```
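Grants only help once the schemas exist: most system schemas (billing is on by default; others such as access typically are not) have to be enabled per metastore before their tables become visible. A minimal sketch using the Databricks Python SDK's system-schemas API follows; it assumes `databricks-sdk` is installed and you are a metastore admin, and the schema names in the loop are examples rather than a required list.

```python
# Sketch: enable system schemas with the Databricks Python SDK.
# Assumes the databricks-sdk package is installed and metastore admin rights;
# the schema names below are illustrative, not exhaustive.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
metastore_id = w.metastores.current().metastore_id

for schema_name in ["access", "query", "lakeflow"]:  # example schemas
    w.system_schemas.enable(metastore_id=metastore_id, schema_name=schema_name)

# Verify which system schemas are now available
for s in w.system_schemas.list(metastore_id=metastore_id):
    print(s.schema, s.state)
```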
### Common Queries

```sql
-- Table lineage: what tables feed into this table?
-- (column-level lineage lives in system.access.column_lineage)
SELECT DISTINCT source_table_full_name, source_type
FROM system.access.table_lineage
WHERE target_table_full_name = 'catalog.schema.table'
  AND event_date >= current_date() - 7;

-- Audit: recent permission changes
-- (ILIKE keeps the match case-insensitive for camelCase action names)
SELECT event_time, user_identity.email, action_name, request_params
FROM system.access.audit
WHERE action_name ILIKE '%permission%'
   OR action_name ILIKE '%grant%'
   OR action_name ILIKE '%revoke%'
ORDER BY event_time DESC
LIMIT 100;

-- Billing: DBU usage by workspace
SELECT workspace_id, sku_name, SUM(usage_quantity) AS total_dbus
FROM system.billing.usage
WHERE usage_date >= current_date() - 30
GROUP BY workspace_id, sku_name;
```
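DBU counts alone do not show spend. The sketch below estimates list-price cost by joining usage with published prices; it follows the documented `system.billing.list_prices` schema, but validate the column names in your workspace, and remember list price is not a negotiated contract price.

```sql
-- Estimated list-price cost by workspace over the last 30 days (sketch).
-- Assumes the documented system.billing.list_prices schema; actual spend
-- may differ from list price depending on your contract.
SELECT
  u.workspace_id,
  u.sku_name,
  SUM(u.usage_quantity * lp.pricing.default) AS estimated_list_cost
FROM system.billing.usage AS u
JOIN system.billing.list_prices AS lp
  ON u.sku_name = lp.sku_name
 AND u.cloud = lp.cloud
 AND u.usage_start_time >= lp.price_start_time
 AND (lp.price_end_time IS NULL OR u.usage_start_time < lp.price_end_time)
WHERE u.usage_date >= current_date() - 30
GROUP BY u.workspace_id, u.sku_name
ORDER BY estimated_list_cost DESC;
```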
## MCP Tool Integration

Use `mcp__databricks__execute_sql` for system table queries:

```python
# Query lineage
mcp__databricks__execute_sql(
    sql_query="""
        SELECT source_table_full_name, target_table_full_name
        FROM system.access.table_lineage
        WHERE event_date >= current_date() - 7
    """,
    catalog="system"
)
```
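The same tool runs any of the queries above. For example, a sketch of the audit query from Common Queries, again with an explicit date filter and row limit to keep results small:

```python
# Recent permission changes via the same MCP tool (sketch; reuses the audit
# query above with a date filter and row limit)
mcp__databricks__execute_sql(
    sql_query="""
        SELECT event_time, user_identity.email, action_name
        FROM system.access.audit
        WHERE event_date >= current_date() - 7
          AND (action_name ILIKE '%permission%' OR action_name ILIKE '%grant%')
        ORDER BY event_time DESC
        LIMIT 100
    """,
    catalog="system"
)
```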
## Best Practices
- Filter by date - System tables can be large; always use date filters
- Use appropriate retention - Check your workspace's retention settings
- Grant minimal access - System tables contain sensitive metadata
- Schedule reports - Create scheduled queries for regular monitoring (see the sketch after this list)
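As a starting point for a scheduled report, a minimal sketch of a date-filtered monitoring view; the `main.monitoring` catalog and schema are hypothetical placeholders, and the view would serve as the source for a scheduled query or alert.

```sql
-- Hypothetical monitoring view: daily DBU usage for the trailing 7 days.
-- main.monitoring is a placeholder catalog.schema - point this wherever your
-- team keeps reporting objects, then schedule a query or alert on top of it.
CREATE OR REPLACE VIEW main.monitoring.daily_dbu_usage AS
SELECT
  usage_date,
  sku_name,
  SUM(usage_quantity) AS total_dbus
FROM system.billing.usage
WHERE usage_date >= current_date() - 7
GROUP BY usage_date, sku_name;
```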
## Related Skills
- databricks-spark-declarative-pipelines - for pipelines that write to Unity Catalog tables
- databricks-jobs - for job execution data visible in system tables
- databricks-synthetic-data-gen - for generating data stored in Unity Catalog Volumes
- databricks-aibi-dashboards - for building dashboards on top of Unity Catalog data