Desktop Apps

Work locally with the full power of Calliope AI tools.

Overview

While Calliope is primarily a cloud platform, we offer desktop applications for developers who want to work offline or leverage local compute resources.

Available Apps

AI Lab Desktop

A standalone version of AI Lab—JupyterLab enhanced with AI capabilities.

Features:

  • Full JupyterLab environment
  • AI chat assistant
  • Magic commands (%%ai, %ai)
  • Local kernel support
  • Offline capable (with local models)

Best for:

  • Data science workflows
  • Notebook-based development
  • Working with sensitive data locally
  • Leveraging local GPU resources

IDE Desktop

A standalone version of our AI-powered web IDE.

Features:

  • Full VS Code-based IDE
  • AI coding assistance
  • Extension support
  • Terminal access
  • Git integration

Best for:

  • Software development
  • Working offline
  • Projects requiring local resources
  • Teams with local-first requirements

Downloads

Download Calliope Desktop Apps

Available for:

  • macOS (Apple Silicon & Intel)
  • Windows (64-bit)
  • Linux (AppImage & .deb)

Installation

macOS

  1. Download the .dmg file for your Mac (Apple Silicon or Intel)
  2. Open the downloaded file
  3. Drag the app to your Applications folder
  4. First launch: Right-click the app and select “Open” (required for unsigned apps)

Windows

  1. Download the .exe installer
  2. Run the installer
  3. Follow the setup wizard
  4. Launch from the Start menu

Linux

AppImage

chmod +x Calliope-*.AppImage
./Calliope-*.AppImage
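Before running the AppImage, you can verify the download by comparing its SHA-256 hash against the one on the download page (this assumes Calliope publishes checksums with each release):

```shell
# Compute the hash of the downloaded AppImage; compare the printed
# value against the checksum shown on the download page.
sha256sum Calliope-*.AppImage
```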

Debian/Ubuntu (.deb)

sudo dpkg -i calliope-*.deb
sudo apt-get install -f   # resolves any missing dependencies

Configuration

AI Provider Setup

Configure your AI provider on first launch:

  1. Open Settings/Preferences
  2. Navigate to AI Configuration
  3. Select your provider (OpenAI, Anthropic, etc.)
  4. Enter your API key
  5. Choose your preferred model

Local Models

For offline use, configure local models:

  1. Install Ollama on your machine
  2. Pull your preferred models: ollama pull llama3
  3. In the app, select “Ollama” as your provider
  4. Choose from your locally available models
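From a terminal, the same setup looks roughly like this (llama3 is just an example model name; the desktop apps talk to Ollama's HTTP API, which listens on port 11434 by default):

```shell
# Download a model for offline use and list what is installed locally.
ollama pull llama3
ollama list

# Confirm the Ollama server is reachable on its default port; this is
# the API endpoint the desktop app connects to.
if curl -fsS http://localhost:11434/api/tags >/dev/null; then
  echo "Ollama is running"
else
  echo "Ollama is not running - start it with: ollama serve"
fi
```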

Working Offline

Desktop apps can work fully offline when configured with local models:

  1. Local LLM: Use Ollama or other local inference
  2. Local embeddings: Configure local embedding models for RAG
  3. Cached packages: Pre-install Python/Node packages while online
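For the Python side of step 3, pip can download packages into a local directory while you are online and install from it later with no network access (requirements.txt and the wheelhouse/ directory name here are just examples):

```shell
# While online: download packages (and their dependencies) into a
# local directory without installing them.
python3 -m pip download -d wheelhouse -r requirements.txt

# Later, offline: install from the local directory only, never
# contacting the package index.
python3 -m pip install --no-index --find-links wheelhouse -r requirements.txt
```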

Syncing with Cloud

Connect your desktop app to your Calliope cloud workspace:

  1. Open Settings → Cloud Sync
  2. Sign in with your Calliope account
  3. Choose what to sync:
    • Projects
    • Settings
    • Extensions
    • Credentials (encrypted)

System Requirements

AI Lab Desktop

Requirement   Minimum    Recommended
RAM           8 GB       16 GB+
Storage       2 GB       10 GB+
Python        3.10+      3.11+
GPU           Optional   CUDA-capable

IDE Desktop

Requirement   Minimum   Recommended
RAM           4 GB      8 GB+
Storage       1 GB      5 GB+
Node.js       18+       20+

Troubleshooting

App won’t start

macOS: Right-click and select “Open” to bypass Gatekeeper on first launch.

Linux: Ensure the AppImage has execute permissions: chmod +x *.AppImage

Windows: Run as administrator if installation fails.

Slow performance

  • Close unused tabs/terminals
  • Reduce model size if using local inference
  • Check available RAM and disk space

AI features not working

  • Verify API key is set correctly
  • Check internet connection (for cloud models)
  • Ensure Ollama is running (for local models)

Updates

Desktop apps check for updates automatically. You can also:

  1. Open Help → Check for Updates
  2. Download the latest version from calliope.ai/download

Differences from Cloud

Feature              Cloud       Desktop
Setup required       No          Yes
Works offline        No          Yes (with local models)
Team collaboration   Built-in    Manual sync
Compute resources    Cloud       Local
Updates              Automatic   Auto-check, manual install
