Calliope WAIIDE Agent Configuration Guide

Calliope Integration: This component is integrated into the Calliope AI platform. Some features and configurations may differ from the upstream project.

Overview

The Calliope WAIIDE Agent can be configured through multiple methods:

  1. Environment Variables (highest priority)
  2. WAIIDE SecretStorage (secure storage)
  3. WAIIDE Settings (settings.json)

Environment Variables

API Keys

Each provider reads its API key from a provider-specific environment variable:

Provider     Environment Variable    Get API Key
---------    --------------------    -----------
OpenAI       OPENAI_API_KEY          https://platform.openai.com/api-keys
Anthropic    ANTHROPIC_API_KEY       https://console.anthropic.com/
Google       GOOGLE_API_KEY          https://aistudio.google.com/app/apikey
AI21         AI21_API_KEY            https://studio.ai21.com/
Cohere       COHERE_API_KEY          https://dashboard.cohere.ai/
Mistral      MISTRAL_API_KEY         https://console.mistral.ai/
NVIDIA       NVIDIA_API_KEY          https://build.nvidia.com/
Together     TOGETHER_API_KEY        https://api.together.xyz/
Ollama       (no key needed)         Local installation

Model Configuration

Use CALLIOPE_MODELS_JSON to configure models for different features:

export CALLIOPE_MODELS_JSON='{
  "chat": {
    "provider": "openai",
    "model": "gpt-4-turbo",
    "maxTokens": 4096,
    "temperature": 0.3
  },
  "autocomplete": {
    "provider": "openai",
    "model": "gpt-3.5-turbo",
    "maxTokens": 2048,
    "temperature": 0.2
  },
  "agent": {
    "provider": "openai",
    "model": "gpt-4",
    "maxTokens": 8192,
    "temperature": 0.3
  }
}'
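Because the value is a JSON string embedded in a shell variable, a stray quote or trailing comma is easy to introduce. One way to catch that early is to run the exported value through a JSON parser. A minimal sketch, assuming python3 is on your PATH:

```shell
# Validate CALLIOPE_MODELS_JSON before the agent tries to parse it.
# python3 -m json.tool is part of the Python standard library.
if printf '%s' "$CALLIOPE_MODELS_JSON" | python3 -m json.tool >/dev/null 2>&1; then
  echo "CALLIOPE_MODELS_JSON is valid JSON"
else
  echo "CALLIOPE_MODELS_JSON is missing or NOT valid JSON" >&2
fi
```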

Ollama Configuration

For local models with Ollama:

export CALLIOPE_OLLAMA_URL="http://localhost:11434"
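Before pointing Calliope at a local Ollama instance, it can help to confirm the server is actually reachable. Ollama's /api/tags endpoint lists the locally installed models; the sketch below degrades gracefully when the server is down:

```shell
# Check that the Ollama server answers at the configured URL.
# Falls back to the default URL when CALLIOPE_OLLAMA_URL is unset.
OLLAMA_URL="${CALLIOPE_OLLAMA_URL:-http://localhost:11434}"
if curl -fsS "$OLLAMA_URL/api/tags" >/dev/null 2>&1; then
  echo "Ollama reachable at $OLLAMA_URL"
else
  echo "Ollama NOT reachable at $OLLAMA_URL (is 'ollama serve' running?)"
fi
```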

WAIIDE Settings

These settings can be configured in WAIIDE’s settings.json:

{
  // Main provider selection
  "calliope.apiProvider": "openai",  // Options: openai, anthropic, google, ollama
  
  // Model selection
  "calliope.model": "gpt-4",
  
  // API Key (not recommended - use environment variables instead)
  "calliope.apiKey": "",
  
  // Ollama URL
  "calliope.ollamaUrl": "http://localhost:11434",
  
  // Token limits
  "calliope.maxTokens": 4096,
  
  // Temperature (0-2, lower = more deterministic)
  "calliope.temperature": 0.3,
  
  // Feature toggles
  "calliope.enableAutoComplete": true,
  
  // Autocomplete settings
  "calliope.autocomplete.debounceDelay": 300
}

Docker Configuration

Using docker-compose

  1. Create a .env file in your project root:

     # Copy from example
     cp scripts/calliope-env.example .env

     # Edit and add your API keys
     nano .env

  2. Start the container:

     docker-compose up -d
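After the container is up, you can check that the values from .env actually reach the service: docker compose config prints the fully resolved configuration, including the environment section. A sketch, guarded so it is safe to run on a machine without Docker:

```shell
# Print the resolved compose configuration and look for the environment block.
if command -v docker >/dev/null 2>&1; then
  docker compose config 2>/dev/null | grep -A 5 'environment:' \
    || echo "no environment section found (or no compose file in this directory)"
else
  echo "docker is not installed on this machine"
fi
```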

Using docker run

docker run -d \
  -p 8070:8070 \
  -e OPENAI_API_KEY="your-key-here" \
  -e CALLIOPE_MODELS_JSON='{"chat":{"provider":"openai","model":"gpt-4"}}' \
  calliopeai/waiide:latest

Configuration Examples

Example 1: OpenAI GPT-4

export OPENAI_API_KEY="sk-..."
export CALLIOPE_MODELS_JSON='{
  "chat": {"provider": "openai", "model": "gpt-4-turbo"},
  "autocomplete": {"provider": "openai", "model": "gpt-3.5-turbo"}
}'

Example 2: Anthropic Claude

export ANTHROPIC_API_KEY="sk-ant-..."
export CALLIOPE_MODELS_JSON='{
  "chat": {"provider": "anthropic", "model": "claude-3-5-sonnet-20241022"},
  "autocomplete": {"provider": "anthropic", "model": "claude-3-haiku-20240307"}
}'

Example 3: Google Gemini

export GOOGLE_API_KEY="..."
export CALLIOPE_MODELS_JSON='{
  "chat": {"provider": "google", "model": "gemini-pro"},
  "autocomplete": {"provider": "google", "model": "gemini-flash"}
}'

Example 4: Mixed Providers

Using Claude for chat and GPT for autocomplete:

export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."
export CALLIOPE_MODELS_JSON='{
  "chat": {"provider": "anthropic", "model": "claude-3-5-sonnet-20241022"},
  "autocomplete": {"provider": "openai", "model": "gpt-3.5-turbo"}
}'

Example 5: Ollama (Local)

export CALLIOPE_OLLAMA_URL="http://localhost:11434"
export CALLIOPE_MODELS_JSON='{
  "chat": {"provider": "ollama", "model": "llama2"},
  "autocomplete": {"provider": "ollama", "model": "codellama"}
}'

Configuration Priority

The extension checks for configuration in this order:

  1. Environment Variables (highest priority)

    • Provider-specific API keys (OPENAI_API_KEY, etc.)
    • Model configuration (CALLIOPE_MODELS_JSON)
  2. WAIIDE SecretStorage

    • Secure storage managed by WAIIDE
    • Set through the Settings UI
  3. WAIIDE Settings (lowest priority)

    • settings.json configuration
    • Not recommended for API keys
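The resolution order above can be sketched as a small shell function. The waiide-secret helper and the settings.json lookup below are hypothetical stand-ins, not real Calliope commands; only the precedence logic is the point:

```shell
# Sketch of the lookup order: environment, then SecretStorage, then settings.
# 'waiide-secret' is a hypothetical helper command, not a real CLI.
resolve_api_key() {
  # 1. Environment variable (highest priority).
  if [ -n "${OPENAI_API_KEY:-}" ]; then
    echo "$OPENAI_API_KEY"
    return 0
  fi
  # 2. WAIIDE SecretStorage (hypothetical helper; silently skipped if absent).
  key="$(waiide-secret get calliope.openai 2>/dev/null || true)"
  if [ -n "$key" ]; then
    echo "$key"
    return 0
  fi
  # 3. settings.json (lowest priority; not recommended for keys).
  sed -n 's/.*"calliope.apiKey": *"\([^"]*\)".*/\1/p' settings.json 2>/dev/null
}
```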

Verifying Configuration

To verify your configuration is working:

  1. Open WAIIDE with the Calliope extension
  2. Open the Command Palette (Cmd+Shift+P or Ctrl+Shift+P)
  3. Run “Calliope: Show Configuration”
  4. Check the output panel for configuration details

Troubleshooting

API Key Not Found

If you see “No API key found”, check:

  1. Environment variable is set correctly
  2. Variable name matches the provider (e.g., OPENAI_API_KEY for OpenAI)
  3. No typos in the variable name
  4. Container has access to the environment variable (check docker-compose.yml)
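A quick way to rule out items 1-3 is to ask the shell which provider keys it can currently see, without printing their values (so nothing secret lands in your terminal history or logs):

```shell
# List which provider API keys are set in the current shell, values redacted.
for var in OPENAI_API_KEY ANTHROPIC_API_KEY GOOGLE_API_KEY AI21_API_KEY \
           COHERE_API_KEY MISTRAL_API_KEY NVIDIA_API_KEY TOGETHER_API_KEY; do
  if [ -n "$(printenv "$var")" ]; then
    echo "$var: set"
  else
    echo "$var: NOT set"
  fi
done
```

For item 4, the same loop can be run inside the container, e.g. via docker compose exec <your-service> sh (substitute your actual service name).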

Model Not Available

If a model is not working:

  1. Verify the model name is correct
  2. Check you have access to that model with your API key
  3. Ensure the provider supports that model

Connection Issues

For connection problems:

  1. Check network connectivity
  2. Verify API endpoints are accessible
  3. For Ollama, ensure the service is running locally
  4. Check firewall/proxy settings

Security Best Practices

  1. Never commit API keys to version control
  2. Use environment variables or WAIIDE SecretStorage
  3. Create .env files but add them to .gitignore
  4. Rotate API keys regularly
  5. Use separate keys for development and production
  6. Set appropriate rate limits on your API keys

Support

For issues or questions: