# Calliope WAIIDE Agent Configuration Guide
> **Calliope Integration:** This component is integrated into the Calliope AI platform. Some features and configurations may differ from the upstream project.
## Overview

The Calliope WAIIDE Agent can be configured through multiple methods:

- Environment variables (highest priority)
- WAIIDE SecretStorage (secure storage)
- WAIIDE Settings (`settings.json`)
## Environment Variables

### API Keys

Each provider requires its specific API key environment variable:
| Provider | Environment Variable | Get API Key |
|---|---|---|
| OpenAI | OPENAI_API_KEY | https://platform.openai.com/api-keys |
| Anthropic | ANTHROPIC_API_KEY | https://console.anthropic.com/ |
| Google | GOOGLE_API_KEY | https://aistudio.google.com/app/apikey |
| AI21 | AI21_API_KEY | https://studio.ai21.com/ |
| Cohere | COHERE_API_KEY | https://dashboard.cohere.ai/ |
| Mistral | MISTRAL_API_KEY | https://console.mistral.ai/ |
| NVIDIA | NVIDIA_API_KEY | https://build.nvidia.com/ |
| Together | TOGETHER_API_KEY | https://api.together.xyz/ |
| Ollama | (no key needed) | Local installation |
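As a quick sanity check, a short shell loop can report which of the keys above are present in the current environment. This is only a sketch (the variable names match the table; trim the list to the providers you use), and it deliberately prints names only, never key values:

```bash
# Report which provider API keys are set in the current environment.
# Prints only the variable names, never the key values.
for var in OPENAI_API_KEY ANTHROPIC_API_KEY GOOGLE_API_KEY AI21_API_KEY \
           COHERE_API_KEY MISTRAL_API_KEY NVIDIA_API_KEY TOGETHER_API_KEY; do
  if [ -n "$(printenv "$var" || true)" ]; then
    echo "$var: set"
  else
    echo "$var: missing"
  fi
done
```

Ollama is intentionally absent from the list, since it needs no key.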
### Model Configuration

Use `CALLIOPE_MODELS_JSON` to configure models for different features:

```bash
export CALLIOPE_MODELS_JSON='{
  "chat": {
    "provider": "openai",
    "model": "gpt-4-turbo",
    "maxTokens": 4096,
    "temperature": 0.3
  },
  "autocomplete": {
    "provider": "openai",
    "model": "gpt-3.5-turbo",
    "maxTokens": 2048,
    "temperature": 0.2
  },
  "agent": {
    "provider": "openai",
    "model": "gpt-4",
    "maxTokens": 8192,
    "temperature": 0.3
  }
}'
```
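Shell quoting mistakes can easily corrupt a multi-line JSON value, so it can help to confirm the variable parses before the agent reads it. A minimal sketch, piping it through `python3 -m json.tool` (any JSON validator, such as `jq`, works equally well; the example value is just a placeholder for your own configuration):

```bash
# Example value; substitute your own configuration.
export CALLIOPE_MODELS_JSON='{"chat":{"provider":"openai","model":"gpt-4"}}'

# json.tool exits nonzero on malformed input.
if echo "$CALLIOPE_MODELS_JSON" | python3 -m json.tool >/dev/null 2>&1; then
  echo "CALLIOPE_MODELS_JSON is valid JSON"
else
  echo "CALLIOPE_MODELS_JSON is malformed" >&2
fi
```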
### Ollama Configuration

For local models with Ollama:

```bash
export CALLIOPE_OLLAMA_URL="http://localhost:11434"
```

## WAIIDE Settings
These settings can be configured in WAIIDE's `settings.json`:

```jsonc
{
  // Main provider selection
  "calliope.apiProvider": "openai", // Options: openai, anthropic, google, ollama

  // Model selection
  "calliope.model": "gpt-4",

  // API key (not recommended; use environment variables instead)
  "calliope.apiKey": "",

  // Ollama URL
  "calliope.ollamaUrl": "http://localhost:11434",

  // Token limits
  "calliope.maxTokens": 4096,

  // Temperature (0-2, lower = more deterministic)
  "calliope.temperature": 0.3,

  // Feature toggles
  "calliope.enableAutoComplete": true,

  // Autocomplete settings
  "calliope.autocomplete.debounceDelay": 300
}
```

## Docker Configuration
### Using docker-compose

1. Create a `.env` file in your project root:

   ```bash
   # Copy from example
   cp scripts/calliope-env.example .env

   # Edit and add your API keys
   nano .env
   ```

2. Start the container:

   ```bash
   docker-compose up -d
   ```

### Using docker run
```bash
docker run -d \
  -p 8070:8070 \
  -e OPENAI_API_KEY="your-key-here" \
  -e CALLIOPE_MODELS_JSON='{"chat":{"provider":"openai","model":"gpt-4"}}' \
  calliopeai/waiide:latest
```

## Configuration Examples
### Example 1: OpenAI GPT-4
```bash
export OPENAI_API_KEY="sk-..."
export CALLIOPE_MODELS_JSON='{
  "chat": {"provider": "openai", "model": "gpt-4-turbo"},
  "autocomplete": {"provider": "openai", "model": "gpt-3.5-turbo"}
}'
```

### Example 2: Anthropic Claude
```bash
export ANTHROPIC_API_KEY="sk-ant-..."
export CALLIOPE_MODELS_JSON='{
  "chat": {"provider": "anthropic", "model": "claude-3-5-sonnet-20241022"},
  "autocomplete": {"provider": "anthropic", "model": "claude-3-haiku-20240307"}
}'
```

### Example 3: Google Gemini
```bash
export GOOGLE_API_KEY="..."
export CALLIOPE_MODELS_JSON='{
  "chat": {"provider": "google", "model": "gemini-pro"},
  "autocomplete": {"provider": "google", "model": "gemini-flash"}
}'
```

### Example 4: Mixed Providers
Using Claude for chat and GPT for autocomplete:
```bash
export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."
export CALLIOPE_MODELS_JSON='{
  "chat": {"provider": "anthropic", "model": "claude-3-5-sonnet-20241022"},
  "autocomplete": {"provider": "openai", "model": "gpt-3.5-turbo"}
}'
```

### Example 5: Ollama (Local)
```bash
export CALLIOPE_OLLAMA_URL="http://localhost:11434"
export CALLIOPE_MODELS_JSON='{
  "chat": {"provider": "ollama", "model": "llama2"},
  "autocomplete": {"provider": "ollama", "model": "codellama"}
}'
```

## Configuration Priority
The extension checks for configuration in this order:

1. **Environment Variables** (highest priority)
   - Provider-specific API keys (`OPENAI_API_KEY`, etc.)
   - Model configuration (`CALLIOPE_MODELS_JSON`)
2. **WAIIDE SecretStorage**
   - Secure storage managed by WAIIDE
   - Set through the Settings UI
3. **WAIIDE Settings** (lowest priority)
   - `settings.json` configuration
   - Not recommended for API keys
## Verifying Configuration

To verify your configuration is working:

1. Open WAIIDE with the Calliope extension
2. Open the Command Palette (`Cmd+Shift+P` or `Ctrl+Shift+P`)
3. Run "Calliope: Show Configuration"
4. Check the output panel for configuration details
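The environment side can also be inspected directly from a terminal. This is only a sketch: it summarizes the variables this guide describes, not the extension's internal state, and it avoids printing key values:

```bash
# Summarize Calliope-related environment configuration (values are not printed).
echo "Provider API keys set: $(env | grep -c '_API_KEY=' || true)"
if [ -n "$CALLIOPE_MODELS_JSON" ]; then
  echo "CALLIOPE_MODELS_JSON: set"
else
  echo "CALLIOPE_MODELS_JSON: unset"
fi
echo "CALLIOPE_OLLAMA_URL: ${CALLIOPE_OLLAMA_URL:-unset}"
```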
## Troubleshooting

### API Key Not Found

If you see "No API key found", check that:

- The environment variable is set correctly
- The variable name matches the provider (e.g., `OPENAI_API_KEY` for OpenAI)
- There are no typos in the variable name
- The container has access to the environment variable (check `docker-compose.yml`)
### Model Not Available

If a model is not working:

- Verify the model name is correct
- Check that your API key has access to that model
- Ensure the provider supports that model
### Connection Issues

For connection problems:

- Check network connectivity
- Verify API endpoints are accessible
- For Ollama, ensure the service is running locally
- Check firewall/proxy settings
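For Ollama specifically, a reachability probe along these lines can separate network problems from extension problems. A sketch, assuming Ollama's standard `/api/tags` model-listing endpoint and the default URL used elsewhere in this guide:

```bash
# Probe the local Ollama server; falls back to the default URL from this guide.
OLLAMA_URL="${CALLIOPE_OLLAMA_URL:-http://localhost:11434}"
if curl -fsS --max-time 2 "$OLLAMA_URL/api/tags" >/dev/null 2>&1; then
  echo "Ollama reachable at $OLLAMA_URL"
else
  echo "Ollama NOT reachable at $OLLAMA_URL"
fi
```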
## Security Best Practices

- Never commit API keys to version control
- Use environment variables or WAIIDE SecretStorage
- Create `.env` files but add them to `.gitignore`
- Rotate API keys regularly
- Use separate keys for development and production
- Set appropriate rate limits on your API keys
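To back up the `.env` rule above, a pre-commit-style check can confirm the file is actually ignored. This is a sketch to run from the repository root; it looks for an exact-line `.env` entry, and the pattern in your `.gitignore` may legitimately differ (e.g. `*.env`):

```bash
# Warn if .env is not listed in .gitignore (checks for an exact-line match).
if grep -qx '\.env' .gitignore 2>/dev/null; then
  echo ".env is ignored by git"
else
  echo "WARNING: .env is not in .gitignore" >&2
fi
```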
## Support

For issues or questions:

- Check the Calliope Agent README
- View the CI Integration Guide
- Submit issues to the GitHub repository