Getting Started
Build your first AI workflow in minutes.
Launching Langflow
- Open the Hub
- Click Langflow in the tool list
- Wait for the environment to load
- You’ll see the visual editor
Interface Overview
Canvas
The main workspace where you build flows:
- Drag components here
- Connect them with lines
- Arrange your workflow visually
Component Sidebar
Left panel with available components:
- Inputs: Chat, Text, File
- LLMs: OpenAI, Anthropic, Ollama
- Prompts: Templates, System Messages
- Memory: Conversation storage
- Tools: Search, Calculator, Code
- Outputs: Chat, Text, JSON
Properties Panel
Right panel showing:
- Selected component settings
- Parameter configuration
- Connection details
Toolbar
Top bar with:
- Save/Load flows
- Run/Test buttons
- Settings access
- Export options
Your First Flow: Simple Chatbot
Step 1: Add Input
- Find Chat Input in the sidebar
- Drag it onto the canvas
- This receives user messages
Step 2: Add LLM
- Find OpenAI (or your preferred LLM)
- Drag it onto the canvas
- Position it to the right of the input
Step 3: Add Output
- Find Chat Output in the sidebar
- Drag it onto the canvas
- Position it to the right of the LLM
Step 4: Connect Components
- Click the output port (right side) of Chat Input
- Drag to the input port (left side) of OpenAI
- Click OpenAI’s output port
- Drag to Chat Output’s input port
You should see:
[Chat Input] → [OpenAI] → [Chat Output]
Step 5: Configure LLM
- Click on the OpenAI component
- In the properties panel:
- Select model: gpt-4o or gpt-3.5-turbo
- API key: Enter yours or use the workspace default
- Save settings
Step 6: Test Your Flow
- Click Run in the toolbar
- The Playground opens
- Type “Hello!” and press Enter
- See your chatbot respond!
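Conceptually, this three-component flow does little more than send each user message to a chat model and return the reply. The sketch below is an illustration of that idea, not code Langflow generates; it assumes the official openai Python package and an OPENAI_API_KEY environment variable.
```python
# Rough equivalent of the Chat Input → OpenAI → Chat Output flow.
# Illustrative sketch only; Langflow wires and runs this for you.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def chat_once(user_message: str) -> str:
    """Send one user message to the model and return the reply text."""
    response = client.chat.completions.create(
        model="gpt-4o",  # or "gpt-3.5-turbo", matching the component setting
        messages=[{"role": "user", "content": user_message}],
    )
    return response.choices[0].message.content

print(chat_once("Hello!"))
```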
Adding a System Prompt
Give your chatbot a personality:
Step 1: Add Prompt Component
- Find Prompt in the sidebar
- Drag it between Input and LLM
Step 2: Configure the Prompt
- Click the Prompt component
- Enter your system message:
You are a helpful assistant that specializes in cooking.
You always suggest recipes and cooking tips.
Keep responses concise and practical.
Step 3: Rewire Connections
- Delete the old Input → LLM connection
- Connect: Input → Prompt → LLM → Output
Step 4: Test Again
Now your chatbot will respond as a cooking assistant!
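In effect, the Prompt component prepends a system message before the user's message reaches the LLM. Continuing the earlier sketch (same openai package and API-key assumptions):
```python
# Same illustrative sketch, now with the cooking system prompt prepended.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a helpful assistant that specializes in cooking. "
    "You always suggest recipes and cooking tips. "
    "Keep responses concise and practical."
)

def chat_once_with_persona(user_message: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},  # added by the Prompt component
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

print(chat_once_with_persona("What should I make for dinner?"))
```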
Adding Memory
Make your chatbot remember the conversation:
Step 1: Add Memory Component
- Find Conversation Buffer Memory
- Drag it onto the canvas
Step 2: Connect Memory
Connect memory to your LLM component:
- Memory output → LLM memory input
- This gives the LLM conversation context
Step 3: Configure Memory
Set how much history to keep:
- k: Number of exchanges to remember
- Typical: 5-10 exchanges
Step 4: Test Conversation
- Ask: “What’s a good pasta dish?”
- Then ask: “How long does it take to make?”
- The bot remembers you’re discussing pasta!
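The memory component conceptually keeps a sliding window of the last k exchanges and feeds them back to the model as context on every turn. The sketch below mimics that behavior with a simple deque; it illustrates the idea rather than Langflow's actual implementation.
```python
from collections import deque
from openai import OpenAI

client = OpenAI()
K = 5  # number of past exchanges to keep, mirroring the component's k setting
history = deque(maxlen=2 * K)  # one exchange = one user + one assistant message

def chat_with_memory(user_message: str) -> str:
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=list(history),  # prior turns give the model context
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

chat_with_memory("What's a good pasta dish?")
print(chat_with_memory("How long does it take to make?"))  # "it" resolves to the pasta dish
```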
Saving Your Flow
Save to Workspace
- Click Save in toolbar
- Name your flow: “Cooking Chatbot”
- Click Save
Your flow is now saved in your Calliope workspace.
Export as JSON
- Click Export in toolbar
- Choose JSON
- Download the file
Use this to:
- Version control in Git
- Share with teammates
- Import into other environments
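Because the export is plain JSON, it diffs cleanly in Git and can be inspected with standard tooling. The field names below (data, nodes, edges) reflect the typical export structure; treat them, and the filename, as assumptions to check against your own export.
```python
import json

# Inspect an exported flow before committing it to version control.
with open("cooking_chatbot.json") as f:  # hypothetical export filename
    flow = json.load(f)

graph = flow.get("data", {})  # exports typically nest the graph under "data"
print("Components:", len(graph.get("nodes", [])))
print("Connections:", len(graph.get("edges", [])))
```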
Next Steps
Now that you’ve built a basic chatbot:
- Components - Learn all available components
- Building Workflows - Create complex flows
- Testing & Debugging - Debug your flows
- Deploying Flows - Use flows as APIs