Desktop Apps
Work locally with the full power of Calliope AI tools.
Overview
While Calliope is primarily a cloud platform, we offer desktop applications for developers who want to work offline or leverage local compute resources.
Available Apps
AI Lab Desktop
A standalone version of AI Lab—JupyterLab enhanced with AI capabilities.
Features:
- Full JupyterLab environment
- AI chat assistant
- Magic commands (`%%ai`, `%ai`)
- Local kernel support
- Offline capable (with local models)
Best for:
- Data science workflows
- Notebook-based development
- Working with sensitive data locally
- Leveraging local GPU resources
IDE Desktop
A standalone version of our AI-powered web IDE.
Features:
- Full VS Code-based IDE
- AI coding assistance
- Extension support
- Terminal access
- Git integration
Best for:
- Software development
- Working offline
- Projects requiring local resources
- Teams with local-first requirements
Downloads
Download Calliope Desktop Apps
Available for:
- macOS (Apple Silicon & Intel)
- Windows (64-bit)
- Linux (AppImage & .deb)
Installation
macOS
- Download the `.dmg` file for your Mac (Apple Silicon or Intel)
- Open the downloaded file
- Drag the app to your Applications folder
- First launch: Right-click the app and select “Open” (required for unsigned apps)
Windows
- Download the `.exe` installer
- Run the installer
- Follow the setup wizard
- Launch from the Start menu
Linux
AppImage
```shell
chmod +x Calliope-*.AppImage
./Calliope-*.AppImage
```
Debian/Ubuntu (.deb)
```shell
sudo dpkg -i calliope-*.deb
```
Configuration
AI Provider Setup
Configure your AI provider on first launch:
- Open Settings/Preferences
- Navigate to AI Configuration
- Select your provider (OpenAI, Anthropic, etc.)
- Enter your API key
- Choose your preferred model
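The choices above are typically persisted in a local settings file. The structure below is an illustrative sketch only — the file name, keys, and values are assumptions, not Calliope's documented schema:

```json
{
  "ai": {
    "provider": "anthropic",
    "apiKey": "<your-api-key>",
    "model": "<model-name>"
  }
}
```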
Local Models
For offline use, configure local models:
- Install Ollama on your machine
- Pull your preferred models: `ollama pull llama3`
- In the app, select “Ollama” as your provider
- Choose from your locally available models
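Before switching providers in the app, you can confirm the local setup from a terminal: by default, Ollama serves a local HTTP API on port 11434, so a quick check tells you whether the daemon is running and which models are pulled. A minimal sketch:

```shell
# Check that the Ollama daemon is reachable on its default local port (11434).
# /api/tags is Ollama's endpoint for listing locally available models.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  status="ollama reachable"
  ollama list    # shows locally available models
else
  status="ollama not running - start it with 'ollama serve'"
fi
echo "$status"
```

If the daemon is not running, start it with `ollama serve` (or launch the Ollama desktop app) and re-run the check.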
Working Offline
Desktop apps can work fully offline when configured with local models:
- Local LLM: Use Ollama or other local inference
- Local embeddings: Configure local embedding models for RAG
- Cached packages: Pre-install Python/Node packages while online
Syncing with Cloud
Connect your desktop app to your Calliope cloud workspace:
- Open Settings → Cloud Sync
- Sign in with your Calliope account
- Choose what to sync:
- Projects
- Settings
- Extensions
- Credentials (encrypted)
System Requirements
AI Lab Desktop
| Requirement | Minimum | Recommended |
|---|---|---|
| RAM | 8 GB | 16 GB+ |
| Storage | 2 GB | 10 GB+ |
| Python | 3.10+ | 3.11+ |
| GPU | Optional | CUDA-capable |
IDE Desktop
| Requirement | Minimum | Recommended |
|---|---|---|
| RAM | 4 GB | 8 GB+ |
| Storage | 1 GB | 5 GB+ |
| Node.js | 18+ | 20+ |
Troubleshooting
App won’t start
macOS: Right-click and select “Open” to bypass Gatekeeper on first launch.
Linux: Ensure the AppImage has execute permissions: chmod +x *.AppImage
Windows: Run as administrator if installation fails.
Slow performance
- Close unused tabs/terminals
- Reduce model size if using local inference
- Check available RAM and disk space
AI features not working
- Verify API key is set correctly
- Check internet connection (for cloud models)
- Ensure Ollama is running (for local models)
Updates
Desktop apps check for updates automatically. You can also:
- Open Help → Check for Updates
- Download the latest version from calliope.ai/download
Differences from Cloud
| Feature | Cloud | Desktop |
|---|---|---|
| Setup required | No | Yes |
| Works offline | No | Yes (with local models) |
| Team collaboration | Built-in | Manual sync |
| Compute resources | Cloud | Local |
| Updates | Automatic | Manual |