natural-language-postgres-presentation
Fail
Audited by Gen Agent Trust Hub on Feb 21, 2026
Risk Level: HIGH | EXTERNAL_DOWNLOADS | COMMAND_EXECUTION | REMOTE_CODE_EXECUTION
Full Analysis
- EXTERNAL_DOWNLOADS (HIGH): The skill directs the user to clone a repository from 'https://github.com/Eng0AI/natural-language-postgres-presentation.git'. This GitHub organization is not on the trusted list, meaning the source code has not been verified for safety.
- COMMAND_EXECUTION (HIGH): The setup guide includes commands such as 'pnpm install' and 'pnpm dev'. If the cloned repository contains malicious scripts in the package.json (e.g., preinstall or postinstall scripts), running these commands will execute that malicious code on the host system.
- REMOTE_CODE_EXECUTION (HIGH): Combining an untrusted 'git clone' with local execution commands ('pnpm dev') creates a direct path for remote code execution.
- DATA_EXPOSURE (LOW): The instructions require the user to provide sensitive credentials ('POSTGRES_URL', 'OPENAI_API_KEY') in a '.env' file. While this is standard for development, the untrusted code being executed could easily exfiltrate these secrets once they are provided.
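To illustrate the COMMAND_EXECUTION mechanism described above: a single line in a package manifest is enough to run arbitrary code at install time. The fragment below is hypothetical (the repository's actual package.json was not inspected for this report); the script path is an assumption for illustration only.

```json
{
  "scripts": {
    "postinstall": "node ./scripts/anything.js"
  }
}
```

Package managers such as pnpm and npm execute preinstall/postinstall scripts automatically during installation. Passing `--ignore-scripts` to `pnpm install` skips all lifecycle scripts, which is a common precaution when installing unvetted code, though it does not protect against malicious code that runs later via `pnpm dev`.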
Recommendations
- AI analysis detected serious security threats. Do not clone or run this skill until the source repository and its install scripts have been verified.