Use llamactl - a CLI tool for LlamaAgents
llamactl is a CLI tool for developing and deploying LlamaIndex workflows as LlamaAgents. It provides commands to initialize projects, run local development servers, and manage cloud deployments.
Prerequisites
Before using llamactl, ensure you have:
- uv - Python package manager and build tool
- Node.js - Required for UI development (supports npm, pnpm, or yarn)
- llama-index-workflows and llamactl installed in your environment
Installation
Install llamactl globally using uv:
uv tool install -U llamactl
llamactl --help
Or try it without installing:
uvx llamactl --help
Initialize a Project
Create a new LlamaAgents project with starter templates:
llamactl init
This creates a Python module with LlamaIndex workflows and an optional UI frontend. Configuration is managed in pyproject.toml, where you define workflow instances, environment settings, and UI build options.
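As a rough sketch, the workflow registration in pyproject.toml might look like the following. The table name and keys shown here are illustrative assumptions, not the authoritative llamactl schema; check the file that `llamactl init` generates for the exact layout.

```toml
# Illustrative sketch only - the [tool.llamadeploy] table name and its
# keys are assumptions, not the documented schema. Use the pyproject.toml
# produced by `llamactl init` as the source of truth.
[tool.llamadeploy]
name = "my-agents"

[tool.llamadeploy.workflows]
# Assumed "module:attribute" mapping from a workflow name to its instance
greet = "my_agents.workflows:greet_workflow"
```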
Local Development
Start the local development server:
llamactl serve
This command:
- Installs dependencies
- Serves workflows as an API (configured in pyproject.toml)
- Starts the frontend development server
The server automatically detects file changes and can resume in-progress workflows.
Deploy to LlamaCloud
Push your code to a git repository:
git remote add origin https://github.com/org/repo
git add -A
git commit -m 'Set up new app'
git push -u origin main
Create a cloud deployment:
llamactl deployments create
This opens an interactive Terminal UI to configure:
- Deployment name
- Git repository (supports private GitHub repos via the LlamaDeploy GitHub app)
- Git branch/tag/commit
- Environment secrets
Manage Deployments
- View deployment status: llamactl deployments get
- Update secrets or branch: llamactl deployments edit
- Deploy a new version: llamactl deployments update
For detailed configuration options, see the Deployment Config Reference.
More from run-llama/vibe-llama
- pdf processing: Invoke this skill BEFORE implementing any text extraction/parsing logic to learn how to use LlamaParse to process any document accurately. Requires the llama_cloud_services package and LLAMA_CLOUD_API_KEY as an environment variable.
- classify files according to specific rules: Invoke this skill BEFORE implementing any text/document classification task to learn the correct llama_cloud_services API usage. Required reading before writing classification code. Requires the llama_cloud_services package and LLAMA_CLOUD_API_KEY as an environment variable.
- retrieve relevant information through rag: Leverage Retrieval Augmented Generation to retrieve relevant information from a LlamaCloud Index. Requires the llama_cloud_services package and LLAMA_CLOUD_API_KEY as an environment variable.