AI Lab Desktop
A standalone version of AI Lab for local data science work.
Overview
AI Lab Desktop brings the full power of Calliope’s AI-enhanced JupyterLab to your local machine. Work offline, leverage your GPU, and maintain full control of your data.
Features
JupyterLab Environment
- Full JupyterLab interface
- Multiple kernels (Python, R, Julia)
- Extensions support
- Terminal access
- File browser
AI Integration
- Jupyter AI chat: Converse with AI in a sidebar panel
- Magic commands: `%%ai` and `%ai` for inline AI
- Calliope magic: `%calliope` for SQL and data operations
- Code assistance: AI-powered completion and suggestions
Data Agent
Built-in data agent capabilities:
- Natural language SQL queries
- RAG over your documents
- Code execution assistance
- Visualization generation
Installation
Download
- Go to calliope.ai/download
- Select AI Lab Desktop
- Choose your platform:
- macOS (Apple Silicon)
- macOS (Intel)
- Windows
- Linux
macOS Installation
- Download the `.dmg` file
- Open the downloaded file
- Drag AI Lab to Applications
- First launch: Right-click → Open (to bypass Gatekeeper)
Windows Installation
- Download the `.exe` installer
- Run the installer
- Follow the setup wizard
- Launch from Start menu
Linux Installation
AppImage:

```
chmod +x AILab-*.AppImage
./AILab-*.AppImage
```

Debian/Ubuntu:

```
sudo dpkg -i ailab-*.deb
```

First Launch
Initial Setup
- Launch AI Lab Desktop
- The setup wizard opens
- Configure AI provider:
- Select provider (OpenAI, Anthropic, Ollama)
- Enter API key (for cloud providers)
- Click Complete Setup
Creating Your First Notebook
- Click + to create new notebook
- Select Python kernel
- Start coding!
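A minimal first cell to confirm the kernel is working — a sketch using only the standard library:

```python
# Sanity-check cell for a fresh notebook: print the interpreter version
# and run a trivial computation.
import sys

print(sys.version)
squares = [n ** 2 for n in range(5)]
print(squares)  # → [0, 1, 4, 9, 16]
```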
Testing AI Features
Try the AI chat:
- Open the Jupyter AI panel (right sidebar)
- Type: “Hello, can you help with data analysis?”
- See the AI respond
Try magic commands:
```
%ai explain this code
def fibonacci(n):
    return n if n < 2 else fibonacci(n-1) + fibonacci(n-2)
```

Configuring AI Providers
OpenAI
- Go to Settings → AI Provider
- Select OpenAI
- Enter your API key
- Choose model (gpt-4o recommended)
Anthropic
- Go to Settings → AI Provider
- Select Anthropic
- Enter your API key
- Choose model (claude-3-5-sonnet recommended)
Ollama (Local Models)
For fully offline AI:
- Install Ollama on your machine
- Pull a model: `ollama pull llama3`
- In AI Lab: Settings → AI Provider → Ollama
- Select from available local models
Using AI Features
Jupyter AI Chat
The sidebar chat for conversations:
You: How do I read a CSV file with pandas?

AI: Here's how to read a CSV file:

```python
import pandas as pd
df = pd.read_csv('your_file.csv')
```

AI Magic Commands
Cell magic (full cell):
```
%%ai gpt4o
Explain the pandas groupby operation with an example
```

Line magic (single line):

```
%ai summarize: df.describe()
```

Calliope Magic
For data operations:
```
%calliope list-datasources
```

```
%%calliope ask-sql mydb
What are the top 10 customers by revenue?
```

Working with Data
Local Files
Access files from your machine:
- Use the file browser
- Drag and drop into notebooks
- Access via standard paths
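For example, standard `pathlib` paths work exactly as they would in any local Python session (the `data` folder below is a hypothetical location):

```python
# List CSV files under a local folder via standard paths.
# "data" under the home directory is a hypothetical example location.
from pathlib import Path

data_dir = Path.home() / "data"
if data_dir.exists():
    for csv_path in sorted(data_dir.glob("*.csv")):
        print(csv_path.name)
else:
    print(f"No data directory at {data_dir}")
```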
Database Connections
Connect to local or remote databases:
```
%%calliope add-database
{
  "datasource_id": "local_db",
  "name": "Local PostgreSQL",
  "dialect": "postgresql",
  "connection_details": {
    "host": "localhost",
    "port": 5432,
    "database": "mydata",
    "user": "myuser"
  },
  "password_field": "local_db_pass",
  "password_value": "mypassword"
}
```

Working with Large Files
AI Lab Desktop can leverage your local resources:
- Large datasets that exceed cloud limits
- GPU acceleration for ML workloads
- Local file system for big data
Extensions
Pre-installed Extensions
- jupyterlab-git
- jupyterlab-lsp
- jupyter-ai
- calliope extensions
Installing Extensions
```
pip install jupyterlab-extension-name
```

Or via the JupyterLab UI:
- Go to Extension Manager
- Search for extension
- Click Install
Kernels
Default Kernels
- Python 3.11+
- Additional kernels available
Adding Kernels
R kernel:
```
R -e "install.packages('IRkernel'); IRkernel::installspec()"
```

Julia kernel:

```
julia -e 'using Pkg; Pkg.add("IJulia")'
```

Performance Tips
For Large Datasets
- Use chunked reading for big files
- Leverage local SSD for caching
- Use appropriate data types
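The chunked-reading and dtype tips can be sketched with pandas; the demo file name and columns below are made up for illustration (a real workload would point `read_csv` at an existing large file):

```python
# Read a CSV in fixed-size chunks instead of loading it all at once.
# We first write a tiny demo file so the example is self-contained.
import pandas as pd

pd.DataFrame(
    {"region": ["east", "west", "east", "west"],
     "revenue": [10.0, 20.0, 30.0, 40.0]}
).to_csv("demo_sales.csv", index=False)

total = 0.0
for chunk in pd.read_csv(
    "demo_sales.csv",
    chunksize=2,                    # rows per chunk (use ~100_000 for real files)
    dtype={"region": "category"},   # compact dtype saves memory
):
    total += chunk["revenue"].sum()

print(total)  # → 100.0
```

Each chunk is an ordinary DataFrame, so aggregations can be accumulated without ever holding the full dataset in memory.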
GPU Acceleration
If you have a compatible GPU:
- Install CUDA toolkit (NVIDIA)
- Install GPU-enabled libraries:
```
pip install torch --index-url https://download.pytorch.org/whl/cu118
```

Memory Management
- Monitor memory usage in status bar
- Restart kernels to free memory
- Use generators for large iterations
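The generator tip in action — a list comprehension materializes every element up front, while a generator expression yields one element at a time, keeping peak memory flat:

```python
# Compare memory footprint of an eager list vs. a lazy generator.
import sys

eager = [i * i for i in range(100_000)]   # whole list held in memory
lazy = (i * i for i in range(100_000))    # computed on demand

print(sys.getsizeof(eager) > sys.getsizeof(lazy))  # → True
print(sum(lazy) == sum(eager))  # → True, same values either way
```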
Offline Usage
Fully Offline Setup
- Configure Ollama with local models
- Pre-download any needed packages
- Prepare local data files
Required for Offline
- AI provider: Ollama with downloaded models
- Python packages: Pre-installed
- Data: Local files or databases
Syncing with Cloud
Connecting to Cloud Workspace
- Go to Settings → Cloud Sync
- Sign in with Calliope account
- Choose sync options
What Can Sync
- Notebooks
- Settings
- Extensions
- Credentials (encrypted)
Sync Behavior
- Two-way sync available
- Conflict resolution options
- Selective sync by folder
Troubleshooting
App Won’t Start
- macOS: Right-click → Open for first launch
- Windows: Run as administrator
- Linux: Check AppImage permissions
AI Not Responding
- Verify API key is correct
- Check internet connection (cloud providers)
- Ensure Ollama is running (local models)
Slow Performance
- Check available RAM
- Close unused kernels
- Reduce model size for local AI
Extension Issues
```
# Reset extension state
jupyter lab clean
jupyter lab build
```