# Models
Lab supports multiple AI providers and models. This reference covers available providers, popular models, and configuration.
## Supported Providers
| Provider | Prefix | Example |
|---|---|---|
| OpenAI | openai: | openai:gpt-4o |
| Anthropic | anthropic: | anthropic:claude-3-5-sonnet-latest |
| Google | google: | google:gemini-2.0-flash |
| AWS Bedrock | bedrock: | bedrock:anthropic.claude-3-5-sonnet |
| Cohere | cohere: | cohere:command-r-plus |
| Mistral | mistral: | mistral:mistral-large-latest |
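
For example, assuming the `%%ai` cell magic shown later on this page accepts a fully qualified model ID, a provider-prefixed model can be used directly (the prompt is just an illustration):

```
%%ai anthropic:claude-3-5-sonnet-latest
Explain the difference between a process and a thread.
```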
## Built-in Shortcuts
For convenience, these shortcuts are available without the provider prefix:
| Shortcut | Full Model ID |
|---|---|
| gpt4o | openai:gpt-4o |
| gpt4o-mini | openai:gpt-4o-mini |
| claude | anthropic:claude-3-5-sonnet-latest |
| claude-haiku | anthropic:claude-3-5-haiku-latest |
| gemini | google:gemini-2.0-flash |
| bedrock-claude | bedrock:anthropic.claude-3-5-sonnet |
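
For example, the claude shortcut stands in for anthropic:claude-3-5-sonnet-latest (this assumes the `%%ai` cell magic usage shown later on this page):

```
%%ai claude
Explain the difference between a list and a tuple in Python.
```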
## Popular Models
### OpenAI
| Model | Best For |
|---|---|
| gpt-4o | General purpose, complex reasoning |
| gpt-4o-mini | Fast, cost-effective tasks |
| gpt-4-turbo | Long context, detailed outputs |
| o1-preview | Advanced reasoning, math |
### Anthropic
| Model | Best For |
|---|---|
| claude-3-5-sonnet-latest | Best balance of capability and speed |
| claude-3-5-haiku-latest | Fast, efficient for simple tasks |
| claude-3-opus-latest | Most capable, complex analysis |
### Google
| Model | Best For |
|---|---|
| gemini-2.0-flash | Fast, multimodal |
| gemini-1.5-pro | Long context, complex tasks |
### AWS Bedrock
| Model | Best For |
|---|---|
| anthropic.claude-3-5-sonnet | Claude via AWS |
| anthropic.claude-3-haiku | Fast Claude via AWS |
| amazon.titan-text-premier | AWS native model |
## Configuration
### Environment Variables
Set API keys as environment variables:
```
# OpenAI
export OPENAI_API_KEY=sk-...

# Anthropic
export ANTHROPIC_API_KEY=sk-ant-...

# Google
export GOOGLE_API_KEY=...
```
### AWS Bedrock
Bedrock uses AWS credentials. Configure them via one of the following (a minimal example follows the list):
- Environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
- AWS credentials file
- IAM role (when running in AWS)
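
A minimal sketch of the environment-variable option; the values are placeholders, and the region variable is a standard AWS setting assumed here rather than something this page specifies:

```
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...
# Assumption: set the region where your Bedrock models are enabled
export AWS_DEFAULT_REGION=us-east-1
```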
### Default Model
Set a default model so you don’t need to specify it each time:
```
%config AiMagics.default_model_name = "anthropic:claude-3-5-sonnet-latest"
```
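
With a default configured, the cell magic can be invoked without naming a model each time (a sketch, assuming `%%ai` falls back to the configured default):

```
%%ai
Explain what a Python generator is in two sentences.
```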
## Listing Available Models
See what models are available:
```
# All models
%ai list

# Models for a specific provider
%ai list openai
%ai list anthropic
```
## Creating Aliases
Register custom shortcuts:
```
# Create aliases
%ai register fast openai:gpt-4o-mini
%ai register best anthropic:claude-3-opus-latest
```

Aliases can then be used anywhere a model name is expected:

```
%%ai fast
Quick question about Python syntax
```
## Choosing a Model
### For Speed
Use smaller, faster models:
- gpt4o-mini
- claude-haiku
- gemini-2.0-flash
### For Quality
Use larger, more capable models:
- gpt-4o
- claude (Sonnet)
- claude-3-opus-latest
### For Cost
Consider token usage and pricing:
- Haiku and mini models are most economical
- Use larger models only when needed
- Shorter prompts reduce costs
### For Privacy
If data sensitivity is a concern:
- AWS Bedrock keeps data in your AWS account
- Check provider data policies