Guide for building AI workflows with NodeTool.
Version: 1.0
Target Audience: Developers and Workflow Designers
Table of Contents
- Detailed Workflow Examples
- Core Concepts
- Streaming Architecture
- Data Flow Patterns
- Workflow Patterns
- Quick Reference: Choose Your Pattern
- Conclusion
- Next Steps
- Resources
- Automation with Agent CLI
Detailed Workflow Examples
For step-by-step guides with detailed explanations and Mermaid diagrams, browse our Workflow Examples Gallery.
Highlighted Examples:
- Image Enhance - Basic image enhancement workflow
- Transcribe Audio - Speech-to-text with Whisper
- Chat with Docs - RAG-based document Q&A
- Creative Story Ideas - Beginner tutorial workflow
- Movie Posters - Multi-stage AI generation
- Data Visualization Pipeline - Data fetching and visualization
View all 19+ workflow examples →
Core Concepts
What is a NodeTool Workflow?
A NodeTool workflow is a Directed Acyclic Graph (DAG) where:
- Nodes represent operations (processing, generation, transformation)
- Edges represent data flow between nodes
- Execution follows dependency order automatically
Input → Process → Transform → Output
Key Principles
- Data Flows Through Edges: Nodes connect via typed edges (image → image, text → text, etc.)
- Asynchronous Execution: Nodes execute when dependencies are satisfied
- Streaming by Default: Many nodes support real-time streaming output
- Type Safety: Connections enforce type compatibility
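To make the execution model concrete, here is a minimal sketch of dependency-ordered execution in plain Python using the standard library's graphlib. It is illustrative only, not NodeTool's scheduler; the node names and lambda bodies are placeholders.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical graph: each node lists the nodes it depends on.
dependencies = {
    "process":   {"input"},
    "transform": {"process"},
    "output":    {"transform"},
}

# Placeholder node implementations; real nodes would do image/text/LLM work.
nodes = {
    "input":     lambda results: "hello world",
    "process":   lambda results: results["input"].upper(),
    "transform": lambda results: f"[{results['process']}]",
    "output":    lambda results: print("result:", results["transform"]),
}

# Execute in dependency order, wiring each node's inputs from upstream results.
results = {}
for name in TopologicalSorter(dependencies).static_order():
    results[name] = nodes[name](results)
```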
Node Types
| Type | Purpose | Examples |
|---|---|---|
| Input Nodes | Accept parameters | StringInput, ImageInput, AudioInput |
| Processing Nodes | Transform data | Resize, Filter, ExtractText |
| Agent Nodes | LLM-powered logic | Agent, Summarizer, ListGenerator |
| Output Nodes | Return results | Output, Preview |
| Control Nodes | Flow control | Collect, FormatText |
| Storage Nodes | Persistence | CreateTable, Insert, Query |
Streaming Architecture
Why Streaming?
NodeTool workflows support streaming execution for:
- Real-time feedback: See results as they're generated
- Lower latency: Start processing before all data arrives
- Better UX: Progress indicators and incremental results
- Efficient memory: Process large data in chunks
Streaming Nodes
Nodes that support streaming output:
- Agent: Streams LLM responses token by token
- ListGenerator: Streams list items as they're generated
- RealtimeAgent: Streams audio + text responses
- RealtimeWhisper: Streams transcription as audio arrives
- RealtimeAudioInput: Streams audio from an input source
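Conceptually, a streaming node behaves like an async generator: it yields chunks as soon as they are produced, and downstream consumers react immediately. A minimal sketch in plain Python (not NodeTool's node API; agent_stream is a placeholder):

```python
import asyncio

async def agent_stream(prompt: str):
    """Stand-in for a streaming Agent node: yields tokens one at a time."""
    for token in f"Answer to: {prompt}".split():
        await asyncio.sleep(0.1)  # simulate generation latency
        yield token

async def main():
    # The downstream consumer sees each token as soon as it is produced.
    async for token in agent_stream("What is a DAG?"):
        print(token, end=" ", flush=True)

asyncio.run(main())
```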
Data Flow Patterns
Pattern 1: Sequential Pipeline
Input → Process → Transform → Output
Each node waits for the previous node to complete.
Pattern 2: Parallel Branches
        → ProcessA → OutputA
Input →
        → ProcessB → OutputB
Multiple branches execute in parallel.
Pattern 3: Streaming Pipeline
Input → StreamingAgent → Collect → Output
(yields chunks)
Data flows in chunks, enabling real-time updates.
Pattern 4: Fan-In Pattern
SourceA →
           → Combine → Process → Output
SourceB →
Multiple inputs combine before processing.
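Expressed in plain asyncio, the fan-in point simply waits for every upstream branch before the combine step runs. A rough sketch of Pattern 4 with placeholder sources:

```python
import asyncio

async def source_a() -> str:
    await asyncio.sleep(0.2)  # simulate I/O latency
    return "data from A"

async def source_b() -> str:
    await asyncio.sleep(0.1)
    return "data from B"

async def main():
    # Both sources run concurrently; Combine waits for both results.
    a, b = await asyncio.gather(source_a(), source_b())
    combined = f"{a} + {b}"       # Combine
    processed = combined.upper()  # Process
    print(processed)              # Output

asyncio.run(main())
```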
Workflow Patterns
To build any example: press Space to add nodes → drag connections → press Ctrl/⌘+Enter to run → add Preview nodes to inspect intermediate results.
Pattern 1: Simple Pipeline
Use Case: Transform input → process → output
Example: Image Enhancement
When to Use:
- Simple data transformations
- Single input, single output
- No conditional logic needed
Pattern 2: Agent-Driven Generation
Use Case: LLM generates content based on input
Example: Image to Story
When to Use:
- Creative generation tasks
- Multimodal transformations (image → text → audio)
- Need semantic understanding
Key Nodes:
- Agent: General-purpose LLM agent with streaming
- Summarizer: Specialized for text summarization
- ListGenerator: Streams a list of items
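To illustrate the ListGenerator idea of yielding list items as they are completed, here is a conceptual sketch; stream_llm_lines is a hypothetical stand-in for an LLM client that streams output line by line, not a NodeTool API:

```python
from typing import Iterator

def stream_llm_lines(prompt: str) -> Iterator[str]:
    """Hypothetical LLM client that yields output one line at a time."""
    yield "1. A robot learns to paint"
    yield "2. A city under the sea"
    yield "3. A detective who can taste lies"

def list_generator(prompt: str) -> Iterator[str]:
    """Yield a cleaned list item as soon as its line is complete."""
    for line in stream_llm_lines(prompt):
        item = line.split(".", 1)[-1].strip()  # drop the "1." numbering
        if item:
            yield item

for idea in list_generator("Give me three story ideas"):
    print("item:", idea)
```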
Pattern 3: Streaming with Multiple Previews
Use Case: Show intermediate results during generation
Example: Movie Poster Generator
When to Use:
- Complex multi-stage generation
- User needs to see progress
- Agent planning + execution workflow
Key Concepts:
- Strategy Phase: Agent plans approach
- Preview Nodes: Show intermediate results
- ListGenerator: Streams generated prompts
- Image Generation: Final output
Pattern 4: RAG (Retrieval-Augmented Generation)
Use Case: Answer questions using documents as context
Example: Chat with Docs
When to Use:
- Question-answering over documents
- Need factual accuracy from specific sources
- Reduce LLM hallucinations
Key Components:
- Search: Query vector database for relevant documents
- Format: Inject retrieved context into prompt
- Generate: Stream LLM response with context
Related Workflow: Index PDFs
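The three components map onto a retrieve, format, generate sequence. The sketch below is illustrative only; vector_search and llm_complete are hypothetical placeholders for your vector store and model client, not NodeTool node APIs.

```python
def vector_search(query: str, top_k: int = 3) -> list[str]:
    """Hypothetical: return the top-k most relevant document chunks."""
    return ["Chunk about topic A", "Chunk about topic B", "Chunk about topic C"][:top_k]

def llm_complete(prompt: str) -> str:
    """Hypothetical: send the prompt to an LLM and return its response."""
    return "(model answer grounded in the supplied context)"

def rag_answer(question: str) -> str:
    chunks = vector_search(question)   # 1. Search the vector database
    context = "\n---\n".join(chunks)   # 2. Format: inject retrieved context
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return llm_complete(prompt)        # 3. Generate with the grounded prompt

print(rag_answer("What does the document say about topic B?"))
```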
Pattern 5: Database Persistence
Use Case: Store generated data for later retrieval
Example: AI Flashcard Generator with SQLite
When to Use:
- Need persistent storage
- Building apps with memory
- Agent workflows that need to recall past interactions
Key Nodes:
- CreateTable: Initialize database schema
- Insert: Add records
- Query: Retrieve records
- Update: Modify records
- Delete: Remove records
Database Flow:
- Create table structure
- Generate data with agent
- Insert into database
- Query and display results
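Outside the node graph, the same Database Flow maps onto a handful of calls to Python's built-in sqlite3 module. The flashcard schema below is illustrative:

```python
import sqlite3

conn = sqlite3.connect("flashcards.db")

# CreateTable: initialize the schema
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS flashcards (
        id       INTEGER PRIMARY KEY AUTOINCREMENT,
        question TEXT NOT NULL,
        answer   TEXT NOT NULL
    )
    """
)

# Insert: add records (hard-coded here; an Agent node would generate them)
conn.execute(
    "INSERT INTO flashcards (question, answer) VALUES (?, ?)",
    ("What does DAG stand for?", "Directed Acyclic Graph"),
)
conn.commit()

# Query: retrieve and display results
for question, answer in conn.execute("SELECT question, answer FROM flashcards"):
    print(question, "->", answer)

conn.close()
```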
Pattern 6: Email & Web Integration
Use Case: Process emails or web content
Example: Summarize Newsletters
When to Use:
- Automate email processing
- Monitor RSS feeds
- Extract web content
Key Nodes:
- GmailSearch: Search Gmail with queries
- EmailFields: Extract email metadata
- FetchRSSFeed: Get RSS feed entries
- GetRequest: Fetch web content
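For comparison, fetching feed entries in plain Python takes only a few lines. This sketch uses the third-party feedparser package with a placeholder feed URL and is independent of NodeTool's nodes:

```python
import feedparser  # pip install feedparser

# Placeholder URL; substitute the newsletter or blog feed you want to monitor.
FEED_URL = "https://example.com/feed.xml"

feed = feedparser.parse(FEED_URL)
for entry in feed.entries[:5]:
    # Each entry exposes title/link/summary fields for downstream summarization.
    print(entry.title, "-", entry.link)
```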
Pattern 7: Realtime Processing
Use Case: Process streaming audio/video in real-time
Example: Realtime Agent
When to Use:
- Voice interfaces
- Live transcription
- Interactive audio applications
Key Nodes:
- RealtimeAudioInput: Streaming audio input
- RealtimeAgent: OpenAI Realtime API with streaming
- RealtimeWhisper: Live transcription
- RealtimeTranscription: OpenAI transcription streaming
Pattern 8: Multi-Modal Workflows
Use Case: Convert between different media types
Example: Audio to Image
When to Use:
- Converting between media types
- Creating rich multimedia experiences
- Accessibility applications
Common Chains:
- Audio → Text → Image
- Image → Text → Audio
- Video → Audio → Text → Summary
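Each chain is a straightforward composition of single-modality steps. A schematic sketch of the Audio → Text → Image chain, where transcribe and generate_image are hypothetical stand-ins for the underlying model calls:

```python
def transcribe(audio_path: str) -> str:
    """Hypothetical speech-to-text step (e.g. a Whisper-style model)."""
    return "a quiet beach at sunset with seagulls overhead"

def generate_image(prompt: str) -> bytes:
    """Hypothetical text-to-image step; returns encoded image bytes."""
    return b"<image bytes>"

def audio_to_image(audio_path: str) -> bytes:
    # Audio -> Text -> Image: each stage feeds the next.
    description = transcribe(audio_path)
    return generate_image(f"An illustration of: {description}")

image = audio_to_image("recording.wav")  # placeholder file name
```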
Pattern 9: Advanced Image Processing
Use Case: AI-powered image transformations
Example: Style Transfer
When to Use:
- Style transfer between images
- Controlled image generation
- Preserving structure while changing style
Key Techniques:
- ControlNet: Preserve structure with edge detection
- Image-to-Text: Generate descriptions
- Img2Img: Transform while maintaining composition
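The edge-detection step behind the ControlNet technique can be previewed with OpenCV's Canny operator. The file names and thresholds below are illustrative, and the downstream ControlNet generation call is omitted:

```python
import cv2  # pip install opencv-python

# Load the source image (placeholder path) and extract its edges; the edge map
# preserves composition while discarding color and texture.
image = cv2.imread("input.jpg", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(image, 100, 200)
cv2.imwrite("edges.png", edges)
# The edge map would then condition a ControlNet-guided generation step.
```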
Pattern 10: Data Processing Pipeline
Use Case: Fetch, transform, and visualize data
Example: Data Visualization Pipeline
When to Use:
- Fetch external data sources
- Transform and filter datasets
- Auto-generate visualizations
Key Nodes:
- GetRequest: Fetch web resources
- ImportCSV: Parse CSV data
- Filter: Transform data
- ChartGenerator: AI-generated charts with Plotly
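Outside NodeTool, the same fetch, transform, visualize flow looks roughly like this; the CSV URL and column names are placeholders:

```python
import io

import pandas as pd           # pip install pandas
import plotly.express as px   # pip install plotly
import requests               # pip install requests

# GetRequest: fetch the raw CSV (placeholder URL)
response = requests.get("https://example.com/data.csv", timeout=30)

# ImportCSV + Filter: parse the data and keep only the rows of interest
df = pd.read_csv(io.StringIO(response.text))
df = df[df["value"] > 0]  # placeholder filter on a hypothetical "value" column

# ChartGenerator: render a simple chart (hypothetical column names)
fig = px.bar(df, x="category", y="value")
fig.show()
```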
Quick Reference: Choose Your Pattern
I want to…
- Generate creative content → Use Pattern 2 (Agent-Driven Generation) → Nodes: Agent, ListGenerator, image/audio generators
- Answer questions about documents → Use Pattern 4 (RAG) → First: Index documents with IndexTextChunks → Then: Query with HybridSearch + Agent
- Process emails automatically → Use Pattern 6 (Email & Web Integration) → Nodes: GmailSearch, EmailFields, Classifier/Summarizer
- Build a voice interface → Use Pattern 7 (Realtime Processing) → Nodes: RealtimeAudioInput, RealtimeAgent
- Store data persistently → Use Pattern 5 (Database Persistence) → Nodes: CreateTable, Insert, Query
- Transform images with AI → Use Pattern 9 (Advanced Image Processing) → Nodes: StableDiffusionControlNet, Canny, ImageToText
- Process audio/video → Check the Audio/Video examples → Nodes: Whisper, AddSubtitles, RemoveSilence
- Fetch and visualize data → Use Pattern 10 (Data Processing Pipeline) → Nodes: GetRequest, ImportCSV, ChartGenerator
Conclusion
Key takeaways:
- Start Simple: Begin with basic pipelines, add complexity as needed
- Use Streaming: Prefer streaming nodes for better UX and performance
- Preview Often: Add Preview nodes to debug and validate
- Combine Patterns: Mix patterns for sophisticated workflows
- Test Incrementally: Build step by step, test each addition
Next Steps
- Explore Examples: Run the example workflows
- Build Your Own: Start with a simple pattern and customize
- Share Workflows: Export and share with the community
- Extend NodeTool: Create custom nodes
Resources
- MCP Server: Use export_workflow_digraph to visualize workflows
- Node Search: Use search_nodes to discover nodes
- Documentation: Check node descriptions with get_node_info
- Agent CLI: Run autonomous agents from the command line - see Agent CLI Documentation
Automation with Agent CLI
For headless execution and automation, use the nodetool agent command to run autonomous agents from the command line:
# Run research agent
nodetool agent \
--config examples/agents/research-agent.yaml \
--prompt "Research latest AI trends"
# Automate content generation
nodetool agent \
--config examples/agents/content-creator.yaml \
--prompt "Write blog post about NodeTool workflows" \
--output blog-post.md
# Interactive agent session
nodetool agent \
--config examples/agents/code-assistant.yaml \
--interactive
See Also:
- Agent CLI Documentation β Complete CLI reference
- Agent Configuration Examples β Sample YAML configurations
- Global Chat & Agents β Agent system overview
Version: 1.0
Last Updated: 2025-10-12
Generated with: NodeTool MCP Server + Claude Code