local-llm-router
Verdict: Fail
Audited by Gen Agent Trust Hub on Feb 17, 2026
Risk Level: HIGH
Findings: REMOTE_CODE_EXECUTION, EXTERNAL_DOWNLOADS, COMMAND_EXECUTION
Full Analysis
- [REMOTE_CODE_EXECUTION] (HIGH): The file `references/model-matrix.md` contains the command `curl -fsSL https://ollama.ai/install.sh | sh`. Piped execution of a remote script is a high-risk pattern (Category 4), and the source ollama.ai is not on the provided trusted list. Severity is downgraded to HIGH because the command is directly related to the skill's stated primary purpose of setting up the local LLM environment.
- [EXTERNAL_DOWNLOADS] (LOW): The skill recommends installing the `vllm` package via pip, which constitutes an unverifiable external dependency from a third-party registry.
- [COMMAND_EXECUTION] (LOW): The document lists various shell commands for model deployment and server management, such as `ollama pull` and `python -m vllm`. While these are intended for setup, they represent a capability for arbitrary command execution.
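The piped-install finding above is the classic `curl | sh` risk: the script runs before anyone can read it. A hedged sketch of the safer download, inspect, then run pattern follows; the temp paths and the dummy installer are illustrative stand-ins (the real network download is shown only as a comment), not part of the audited skill.

```shell
# Sketch of a mitigation for the curl|sh pattern flagged above.
set -eu
tmpdir=$(mktemp -d)

# Real-world step (network access, shown as a comment only):
#   curl -fsSL https://ollama.ai/install.sh -o "$tmpdir/install.sh"
# A dummy installer stands in so this sketch is self-contained:
printf '#!/bin/sh\necho installer-ran\n' > "$tmpdir/install.sh"

# 1. Inspect the script before executing it.
head -n 5 "$tmpdir/install.sh"

# 2. Compute a checksum to compare against one obtained out of band
#    (no published value exists for this dummy file; placeholder step).
sha256sum "$tmpdir/install.sh"

# 3. Execute explicitly only after review, never straight from the pipe.
out=$(sh "$tmpdir/install.sh")
echo "$out"

rm -rf "$tmpdir"
```

Separating download from execution also leaves an on-disk artifact that can be archived or diffed if the install later needs to be audited.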
Recommendations
- Automated analysis detected serious security threats in this skill; review the flagged commands before installing or running it.
Audit Metadata