## Installation

### Basic Installation

```bash
pip install multi-agent-generator
```
### Development Installation

```bash
git clone https://github.com/meashish194/multi-agent-generator-pro.git
cd multi-agent-generator-pro
pip install -e ".[dev]"
```
### Prerequisites

- Python 3.8 or higher
- Access to at least one supported LLM provider (OpenAI, IBM WatsonX, Ollama, etc.)
### Environment Variables

Set the environment variables for your chosen LLM provider:

#### OpenAI

```bash
export OPENAI_API_KEY="your-openai-api-key"
```

#### IBM WatsonX

```bash
export WATSONX_API_KEY="your-watsonx-api-key"
export WATSONX_PROJECT_ID="your-project-id"
export WATSONX_URL="https://us-south.ml.cloud.ibm.com"
```

#### Ollama (Local)

```bash
export OLLAMA_URL="http://localhost:11434"
```

#### Generic LiteLLM

```bash
export API_KEY="your-api-key"
export API_BASE="https://your-api-endpoint"
```
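As a quick sanity check before running the generator, a small script can report which providers have all of their required variables set. The helper below is an illustrative sketch (not part of the package); the variable names mirror the exports above.

```python
import os

# Required environment variables per provider, matching the exports above.
PROVIDER_ENV_VARS = {
    "openai": ["OPENAI_API_KEY"],
    "watsonx": ["WATSONX_API_KEY", "WATSONX_PROJECT_ID", "WATSONX_URL"],
    "ollama": ["OLLAMA_URL"],
    "litellm": ["API_KEY", "API_BASE"],
}

def configured_providers(env=os.environ):
    """Return the providers whose required variables are all non-empty."""
    return [name for name, keys in PROVIDER_ENV_VARS.items()
            if all(env.get(k) for k in keys)]

print(configured_providers({"OPENAI_API_KEY": "sk-..."}))  # ['openai']
```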
### Provider Notes

- Agno currently works only with `OPENAI_API_KEY` and does not yet support tools. Support for additional providers and tools is planned for future releases.
- You can switch providers at any time with the `--provider` CLI flag or by setting the corresponding environment variables.
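For example, switching to a local Ollama provider might look like the following. This is a configuration sketch: `--provider` is the flag mentioned above, and the prompt and framework value follow the same shape as the generation command in this document; adjust them to your setup.

```shell
# Point the generator at a local Ollama instance, then select it explicitly.
export OLLAMA_URL="http://localhost:11434"
multi-agent-generator "Create a simple assistant" --framework crewai --provider ollama
```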
### Verifying Installation

After installation, verify that everything works:

```bash
# Check that the CLI is available
multi-agent-generator --help

# Test a simple generation
multi-agent-generator "Create a simple assistant" --framework crewai
```
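If you are scripting the verification step, you can also check for the CLI programmatically. `cli_available` below is a hypothetical helper written for this guide, not part of the package:

```python
import shutil

def cli_available(name="multi-agent-generator"):
    """Return True if the named executable is on PATH."""
    return shutil.which(name) is not None

if cli_available():
    print("multi-agent-generator found on PATH")
else:
    print("Not found; install it first: pip install multi-agent-generator")
```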
### Optional: Streamlit UI

The Streamlit UI is included by default. Launch it with:

```bash
streamlit run streamlit_app.py
```