Advanced AI
Direct access to AI models with fine-grained control over providers, formats, and conversation context.
%ai magics also power much of what %calliope does under the hood.

Loading the Extension

```
%load_ext jupyter_ai_magics
```

Cell Magic (%%ai)
Generate content from AI models with full control over output format.
Basic Syntax
```
%%ai provider:model [-f format]
Your prompt here
```

Output Formats
| Format | Flag | Use Case |
|---|---|---|
| markdown | -f markdown | Documentation, explanations (default) |
| code | -f code | Executable code blocks |
| html | -f html | Rich HTML output |
| math | -f math | LaTeX mathematical expressions |
| json | -f json | Structured data |
| text | -f text | Plain text |
| image | -f image | Image generation (HuggingFace) |
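
The less common formats follow the same pattern. For example, a plain-text response can be requested like this (the prompt is only illustrative):

```
%%ai claude -f text
Summarize the difference between a list and a tuple in one sentence
```
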
Examples
Code Generation
```
%%ai claude -f code
Write a Python class for a binary search tree with insert,
search, and delete methods. Include docstrings.
```

Output:

```python
class BinarySearchTree:
    """A binary search tree implementation."""

    def __init__(self):
        """Initialize an empty BST."""
        self.root = None

    def insert(self, value):
        """Insert a value into the BST."""
        # ... implementation
```

Mathematical Expressions
```
%%ai gpt4o -f math
Derive the quadratic formula from ax² + bx + c = 0
```

Output (rendered LaTeX):

$$x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$$
HTML Generation
```
%%ai claude -f html
Create an SVG of a simple bar chart showing sales data:
Q1: 100, Q2: 150, Q3: 120, Q4: 200
```

Output: an SVG bar chart rendered directly in the notebook.
Structured Data
```
%%ai gpt4o -f json
Generate a JSON schema for a user profile with name,
email, age, and preferences (array of strings)
```

Output:

```json
{
  "type": "object",
  "properties": {
    "name": {"type": "string"},
    "email": {"type": "string", "format": "email"},
    "age": {"type": "integer", "minimum": 0},
    "preferences": {
      "type": "array",
      "items": {"type": "string"}
    }
  },
  "required": ["name", "email"]
}
```

Variable Interpolation
Reference Python variables in your prompts using curly braces:
language = "Python"
task = "web scraping"%%ai claude -f code
Write a {language} script for {task} using BeautifulSoupSpecial Variables
Access notebook cells directly:
Reference input from cell 5:

```
%%ai claude
Explain this code: {In[5]}
```

Reference output from cell 3:

```
%%ai claude
Analyze these results: {Out[3]}
```

Reference an error from cell 7:

```
%%ai claude
Fix this error: {Err[7]}
```

Output Redirection
Capture AI output into variables for further processing:
```
%%ai claude -f code -o my_function
Write a function to validate email addresses
```

```python
# Now use the generated code
exec(my_function)
is_valid = validate_email("test@example.com")
```

Chaining Outputs
Generate the data:

```
%%ai gpt4o -f json -o sample_data
Generate 5 sample user records with name, email, age
```

Process the data:

```python
import json

users = json.loads(sample_data)
for user in users:
    print(f"Processing {user['name']}...")
```

Building Pipelines
Step 1: Generate analysis code:

```
%%ai claude -f code -o analysis_code
Write pandas code to load 'sales.csv' and calculate
monthly revenue trends
```

Step 2: Execute it and capture the results:

```python
exec(analysis_code)
```

Step 3: Get an AI interpretation:

```
%%ai claude
Interpret these results: {Out[-1]}
What business insights can we draw?
```

Conversation Context
AI maintains conversation history within a session:
```
%%ai claude
I'm building a REST API for a todo app.
What endpoints should I create?
```

```
%%ai claude
Now show me the Flask code for those endpoints
```

```
%%ai claude
Add authentication middleware to protect these routes
```

Reset Context

```
%ai reset
```

Configure History Length

```
%config AiMagics.max_history = 6  # Keep last 6 exchanges
```

Model Providers
Supported Providers
| Provider | Model Examples | Best For |
|---|---|---|
| Anthropic | claude, claude-3-opus | Complex reasoning, long context |
| OpenAI | gpt4o, gpt4o-mini | General purpose, reliable |
| Google | gemini | Fast, multimodal |
| Mistral | codestral, mistral-large | Code generation |
| Bedrock | bedrock-claude | AWS integration |
| Ollama | ollama:llama3 | Local, private |
| HuggingFace | huggingface_hub:model-id | Specialized models |
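
Local models use the same syntax as cloud providers. As a minimal sketch, assuming an Ollama server is running locally with the llama3 model already pulled:

```
%%ai ollama:llama3 -f code
Write a function that reverses a linked list
```
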
List Available Models
```
%ai list
%ai list anthropic  # List models from specific provider
```

Model Shortcuts
Pre-configured shortcuts for common models:
| Shortcut | Model |
|---|---|
| claude | Claude 3.5 Sonnet |
| gpt4o | GPT-4o |
| gpt4o-mini | GPT-4o Mini |
| gemini | Gemini 2.0 Flash |
| bedrock-claude | Claude via AWS Bedrock |
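
Shortcuts are interchangeable with full provider:model identifiers. For example, assuming the claude shortcut maps to the Claude 3.5 Sonnet release used in the configuration examples below, these two cells are equivalent:

```
%%ai claude
Explain Python decorators in two sentences
```

```
%%ai anthropic:claude-3-5-sonnet-20241022
Explain Python decorators in two sentences
```
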
Advanced Configuration
Set Default Model
```
%config AiMagics.default_language_model = "anthropic:claude-3-5-sonnet-20241022"
```

Register Model Aliases
Create a shortcut:

```
%ai register mymodel anthropic:claude-3-5-sonnet-20241022
```

Use the shortcut:

```
%%ai mymodel -f code
Write a sorting algorithm
```

Update or delete the alias:

```
# Update an alias
%ai update mymodel openai:gpt-4o

# Delete an alias
%ai delete mymodel
```

Error Handling
Explain Errors
When a code cell fails, get AI help:
```
# After an error occurs
%ai error claude
```

The AI will analyze the traceback and suggest fixes.
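
For example, a failing cell (the one below is purely illustrative) can be followed immediately by the magic:

```python
# This cell raises a TypeError
total = "10" + 5
```

```
%ai error claude
```
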
Fix Errors Automatically
In the chat sidebar, use:
```
/fix
```

Select the error cell, and AI will propose corrections.
SageMaker Endpoints
Connect to custom models deployed on AWS SageMaker:
All options must appear on the %%ai line itself:

```
%%ai sagemaker-endpoint:my-endpoint-name --region-name=us-east-1 --request-schema={"text_inputs":"<prompt>"} --response-path=generated_texts.[0] -f code
Write a data validation function
```
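
Here, --request-schema is the JSON body the endpoint expects, with the <prompt> placeholder replaced by the prompt text, and --response-path is a JSONPath-style expression used to extract the generated text from the endpoint's JSON response.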