01-yaml-table-setup
Pass
Audited by Gen Agent Trust Hub on Mar 8, 2026
Risk Level: SAFE
Tags: COMMAND_EXECUTION, EXTERNAL_DOWNLOADS
Full Analysis
- [COMMAND_EXECUTION]: The script `setup_tables.py` utilizes `spark.sql()` to execute dynamically generated DDL statements for table creation and constraint management. This is a standard automation pattern for Databricks environments.
- [EXTERNAL_DOWNLOADS]: The Asset Bundle configuration in `gold_setup_job.yml` identifies a dependency on the `pyyaml` package, which is a well-known and standard library for YAML processing.
- [INDIRECT_PROMPT_INJECTION]: The skill implements a data ingestion surface by reading schema definitions from YAML files to influence the structure of the generated SQL.
  - Ingestion points: The `load_yaml` function in `src/gold/setup_tables.py` reads files from the `gold_layer_design/yaml/` directory.
  - Boundary markers: The script does not utilize explicit delimiters between the YAML-sourced metadata and the SQL command structure.
  - Capability inventory: The skill possesses capabilities to perform file system reads via `pathlib` and execute SQL commands via the `pyspark.sql.SparkSession` object.
  - Sanitization: The implementation correctly uses `yaml.safe_load()` to prevent arbitrary code execution during the YAML parsing process and references an escaping function for column descriptions, although it does not strictly validate SQL identifiers.
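The pattern described above, plus the identifier validation the audit notes is missing, can be sketched as follows. This is a minimal illustration, not the audited code: the function names other than `load_yaml`, the allowed-type list, and the sample schema shape are assumptions made for the example.

```python
import re
from pathlib import Path

# Hypothetical allowlist of column types; the real script may accept more.
_ALLOWED_TYPES = {"STRING", "INT", "BIGINT", "DOUBLE", "DATE", "TIMESTAMP", "BOOLEAN"}
_IDENTIFIER_RE = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")


def load_yaml(path: Path) -> dict:
    # yaml.safe_load restricts parsing to plain data types, so a crafted
    # YAML file cannot trigger arbitrary Python object construction.
    import yaml  # pyyaml dependency declared in gold_setup_job.yml

    with path.open() as f:
        return yaml.safe_load(f)


def validate_identifier(name: str) -> str:
    # Reject anything that is not a plain SQL identifier, closing the
    # injection path from YAML-sourced table and column names.
    if not _IDENTIFIER_RE.match(name):
        raise ValueError(f"unsafe SQL identifier: {name!r}")
    return name


def escape_description(text: str) -> str:
    # Doubling single quotes is the standard escape inside a SQL string literal.
    return text.replace("'", "''")


def build_create_table(schema: dict) -> str:
    # Build the DDL from validated parts only; the result would then be
    # passed to spark.sql(...) in a Databricks environment.
    table = validate_identifier(schema["table"])
    cols = []
    for col in schema["columns"]:
        ctype = col["type"].upper()
        if ctype not in _ALLOWED_TYPES:
            raise ValueError(f"unsupported column type: {ctype!r}")
        cols.append(
            f"{validate_identifier(col['name'])} {ctype} "
            f"COMMENT '{escape_description(col.get('description', ''))}'"
        )
    return f"CREATE TABLE IF NOT EXISTS {table} ({', '.join(cols)})"
```

With identifier validation in place, a YAML value such as `orders; DROP TABLE x` raises an error instead of being interpolated into the DDL string.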
Audit Metadata