Advanced AI

Direct access to AI models with fine-grained control over providers, formats, and conversation context.

Most users should start with Calliope AI for chat, SQL queries, and common tasks. Advanced AI gives you direct model access when you need more control. The %ai magics also power much of what %calliope does under the hood.

Loading the Extension

%load_ext jupyter_ai_magics
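
The magics read provider credentials from environment variables, so set the relevant keys before running any %%ai cells. A minimal sketch with placeholder values; the variable names shown are the ones the Anthropic and OpenAI providers typically expect:

import os

# Provider API keys are read from the environment (placeholder values shown)
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."
os.environ["OPENAI_API_KEY"] = "sk-..."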

Cell Magic (%%ai)

Generate content from AI models with full control over output format.

Basic Syntax

%%ai provider:model [-f format]
Your prompt here
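
For example, the full provider:model form looks like this (a sketch; the model ID is illustrative, and any entry from %ai list works):

%%ai anthropic:claude-3-5-sonnet-20241022 -f markdown
Summarize the trade-offs between SQL and NoSQL databases.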

Output Formats

| Format   | Flag        | Use Case                              |
|----------|-------------|---------------------------------------|
| markdown | -f markdown | Documentation, explanations (default) |
| code     | -f code     | Executable code blocks                |
| html     | -f html     | Rich HTML output                      |
| math     | -f math     | LaTeX mathematical expressions        |
| json     | -f json     | Structured data                       |
| text     | -f text     | Plain text                            |
| image    | -f image    | Image generation (HuggingFace)        |

Examples

Code Generation

%%ai claude -f code
Write a Python class for a binary search tree with insert,
search, and delete methods. Include docstrings.

Output:

class BinarySearchTree:
    """A binary search tree implementation."""

    def __init__(self):
        """Initialize an empty BST."""
        self.root = None

    def insert(self, value):
        """Insert a value into the BST."""
        # ... implementation

Mathematical Expressions

%%ai gpt4o -f math
Derive the quadratic formula from ax² + bx + c = 0

Output (rendered LaTeX): $$x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$$

HTML Generation

%%ai claude -f html
Create an SVG of a simple bar chart showing sales data:
Q1: 100, Q2: 150, Q3: 120, Q4: 200

Output: The generated SVG is rendered inline in the notebook

Structured Data

%%ai gpt4o -f json
Generate a JSON schema for a user profile with name,
email, age, and preferences (array of strings)

Output:

{
  "type": "object",
  "properties": {
    "name": {"type": "string"},
    "email": {"type": "string", "format": "email"},
    "age": {"type": "integer", "minimum": 0},
    "preferences": {
      "type": "array",
      "items": {"type": "string"}
    }
  },
  "required": ["name", "email"]
}

Variable Interpolation

Reference Python variables in your prompts using curly braces:

# Define the variables in one cell
language = "Python"
task = "web scraping"

# Then reference them from a separate %%ai cell
%%ai claude -f code
Write a {language} script for {task} using BeautifulSoup

Special Variables

Access notebook cell inputs, outputs, and errors directly: {In[n]} interpolates the input of cell n, {Out[n]} its output, and {Err[n]} its error message. Run each %%ai block as its own cell:

%%ai claude
Explain this code: {In[5]}

%%ai claude
Analyze these results: {Out[3]}

%%ai claude
Fix this error: {Err[7]}

Output Redirection

Capture AI output into variables for further processing:

%%ai claude -f code -o my_function
Write a function to validate email addresses

# In a later cell, run the generated code and call the function it defines
exec(my_function)
is_valid = validate_email("test@example.com")

Chaining Outputs

%%ai gpt4o -f json -o sample_data
Generate 5 sample user records with name, email, age

# In the next cell, parse and process the captured JSON
import json

users = json.loads(sample_data)
for user in users:
    print(f"Processing {user['name']}...")

Building Pipelines

Each step runs in its own cell:

# Step 1: generate analysis code
%%ai claude -f code -o analysis_code
Write pandas code to load 'sales.csv' and calculate
monthly revenue trends

# Step 2: execute the generated code and capture the results
exec(analysis_code)

# Step 3: ask for an interpretation of the latest output
%%ai claude
Interpret these results: {Out[-1]}
What business insights can we draw?

Conversation Context

AI maintains conversation history within a session:

%%ai claude
I'm building a REST API for a todo app.
What endpoints should I create?

%%ai claude
Now show me the Flask code for those endpoints

%%ai claude
Add authentication middleware to protect these routes

Reset Context

%ai reset

Configure History Length

%config AiMagics.max_history = 6  # Keep last 6 exchanges
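
You can inspect the configurable options through IPython's standard %config mechanism, which lists AiMagics traits such as max_history:

%config AiMagics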

Model Providers

Supported Providers

| Provider    | Model Examples            | Best For                        |
|-------------|---------------------------|---------------------------------|
| Anthropic   | claude, claude-3-opus     | Complex reasoning, long context |
| OpenAI      | gpt4o, gpt4o-mini         | General purpose, reliable       |
| Google      | gemini                    | Fast, multimodal                |
| Mistral     | codestral, mistral-large  | Code generation                 |
| Bedrock     | bedrock-claude            | AWS integration                 |
| Ollama      | ollama:llama3             | Local, private                  |
| HuggingFace | huggingface_hub:model-id  | Specialized models              |

List Available Models

%ai list
%ai list anthropic  # List models from specific provider

Model Shortcuts

Pre-configured shortcuts for common models:

| Shortcut       | Model                  |
|----------------|------------------------|
| claude         | Claude 3.5 Sonnet      |
| gpt4o          | GPT-4o                 |
| gpt4o-mini     | GPT-4o Mini            |
| gemini         | Gemini 2.0 Flash       |
| bedrock-claude | Claude via AWS Bedrock |

Advanced Configuration

Set Default Model

%config AiMagics.default_language_model = "anthropic:claude-3-5-sonnet-20241022"

Register Model Aliases

# Create a shortcut
%ai register mymodel anthropic:claude-3-5-sonnet-20241022

# Use the shortcut
%%ai mymodel -f code
Write a sorting algorithm

# Update an alias
%ai update mymodel openai:gpt-4o

# Delete an alias
%ai delete mymodel

Error Handling

Explain Errors

When a code cell fails, get AI help:

# After an error occurs
%ai error claude

The AI will analyze the traceback and suggest fixes.
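
As a minimal sketch, suppose a cell raises an exception (illustrative failing cell):

# A cell that fails
result = 10 / 0   # ZeroDivisionError

# In the next cell, ask the model to explain the most recent traceback
%ai error claude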

Fix Errors Automatically

In the chat sidebar, use:

/fix

Select the error cell, and AI will propose corrections.

SageMaker Endpoints

Connect to custom models deployed on AWS SageMaker. The --request-schema argument is the JSON body sent to the endpoint, with the string "<prompt>" replaced by your prompt text; --response-path is the path used to extract the generated text from the endpoint's JSON response:

%%ai sagemaker-endpoint:my-endpoint-name --region-name=us-east-1 --request-schema={"text_inputs":"<prompt>"} --response-path=generated_texts.[0] -f code
Write a data validation function
