Open-Source Visual AI Workflow Builder

Build AI Workflows Visually

Connect nodes to generate content, analyze data, and automate tasks. Run models locally or via cloud APIs.

Start Here

NodeTool is a visual workflow builder for AI pipelines—connect nodes for images, video, text, data, and automation. Run locally or deploy to RunPod, Cloud Run, or self-hosted servers.

Who uses NodeTool?

Creators & Designers

Generate and transform media with AI.

  • Use Flux, SDXL, and custom models
  • Generate variations
  • Build reusable workflows

Developers & Researchers

Build agents, RAG systems, and pipelines.

  • Design multi-step LLM agents
  • Index and query documents locally
  • Deploy workflows as APIs
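
For illustration, a workflow deployed as an HTTP API can be called from any client. The host, path, and payload shape below are placeholders, not NodeTool's documented endpoint schema; check the deployment docs for the real contract.

```python
import requests

# Hypothetical request to a deployed workflow. URL, path, and payload shape are
# placeholders; consult the deployment docs for the actual endpoint contract.
resp = requests.post(
    "https://my-nodetool-host.example.com/api/workflows/<workflow-id>/run",
    json={"params": {"prompt": "A watercolor fox in a misty forest"}},
    timeout=120,
)
resp.raise_for_status()
print(resp.json())  # e.g. generated text or asset references
```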

Data & Business Users

Process documents and automate tasks without coding.

  • Process data with AI pipelines
  • Automate document workflows
  • Build custom tools

Your first 10 minutes

  1. Download NodeTool — install for macOS, Windows, or Linux.
  2. Choose your AI models — install local models like Flux/SDXL, or use cloud services.
  3. Try a template workflow — explore examples, press Run, watch results stream.
  4. Experiment and customize — change inputs, connect nodes differently, make it yours.
  5. Share your workflow — turn it into a simple app others can use.

Choose your path

Local-first or cloud-augmented

Local-only mode

All workflows, assets, and models run on your machine for maximum privacy and control.

  • Use MLX, llama.cpp, Whisper, and Flux locally (llama.cpp sketch below)
  • Store assets on disk or your own storage
  • Work offline once models are downloaded
Storage options →
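
The runtimes listed above are ordinary local libraries, so the same inference can be exercised outside the editor. A minimal llama.cpp sketch via llama-cpp-python, assuming you have already downloaded a GGUF model (the path is a placeholder):

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Load a quantized GGUF model from disk; the path is a placeholder for whatever
# model you have downloaded. Everything below runs on this machine.
llm = Llama(model_path="models/llama-3-8b-instruct.Q4_K_M.gguf", n_ctx=2048)

# One completion, no network access needed once the weights are on disk.
out = llm("Explain in one sentence why local inference helps privacy:", max_tokens=64)
print(out["choices"][0]["text"])
```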

Cloud-augmented mode

Mix local AI with cloud services for flexibility. Use the best tool for each task.

  • Add API keys for OpenAI, Anthropic, Replicate (example below)
  • Access cutting-edge models on demand
  • Deploy workflows to cloud infrastructure
Models & Providers →
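
Provider credentials are typically supplied as environment variables. The variable names below are the providers' conventional ones; whether NodeTool reads these exact names is an assumption, so treat this as a quick sanity check rather than its actual configuration:

```python
import os

# Conventional provider variables; whether NodeTool reads these exact names is
# an assumption, see Models & Providers for the authoritative list.
providers = {
    "OPENAI_API_KEY": "OpenAI",
    "ANTHROPIC_API_KEY": "Anthropic",
    "REPLICATE_API_TOKEN": "Replicate",
}

for var, name in providers.items():
    print(f"{name:10} {var}: {'set' if os.environ.get(var) else 'missing'}")
```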

📱 Personal AI Stack

Access self-hosted AI infrastructure from mobile devices.

NodeTool's mobile app connects to your own NodeTool server. Run workflows remotely, accessing models and data on hardware you control.

```mermaid
flowchart TB
    Mobile["📱 NodeTool Mobile"] -->|Secure Connection| VPN
    VPN["🔒 VPN / Tailscale / WireGuard"] -->|Encrypted Tunnel| Server
    Server["🖥️ NodeTool Server<br/>(Your Hardware)"]
    subgraph Stack["Your AI Stack"]
        direction TB
        LLMs["🧠 Local LLMs<br/>Llama, Mistral, Qwen, Phi"]
        Data["📁 Your Data<br/>Documents, Photos, Notes"]
        Media["🎨 Media AI<br/>Flux, Whisper, Music, Video"]
        Integrations["🔌 Integrations<br/>APIs, Tools, Home Automation"]
    end
    Server --> Stack
```
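
Once the tunnel is up, the server is just an HTTP host reachable by its VPN hostname from any device on the network. A hedged connectivity check, where the hostname, port, and endpoint path are placeholders rather than NodeTool's actual API:

```python
import requests

# Placeholder hostname (e.g. a Tailscale MagicDNS name), port, and path;
# not NodeTool's actual API surface.
SERVER = "http://nodetool-server.tailnet.example:8000"

resp = requests.get(f"{SERVER}/health", timeout=5)  # hypothetical health route
print("server reachable:", resp.ok)
```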

Deployment Options

🏠 Local Stack

Connect via VPN to a home server. All processing on your hardware. Data stays on your network.

[Mobile] → [VPN] → [Home Server] → [Local LLMs + Data]

☁️ NodeTool Cloud

Managed infrastructure. No hardware setup. Workflows sync across devices.

[Mobile] → [NodeTool Cloud] → [Managed LLMs + Storage]

🏢 Private Cloud

Deploy to your organization's VPC. Supports compliance requirements and multi-user access.

[Mobile] → [VPN] → [VPC] → [Self-Hosted NodeTool]

🌐 Hybrid Stack

Local processing for sensitive data, cloud APIs for additional models.

[Mobile] → [Local Server] → [Local LLMs + Cloud APIs]

Capabilities

  • Personal AI Assistant — Chat with an AI that accesses your documents and knowledge base.
  • Mobile AI Workspace — Generate images, audio, and video from your phone using local models.
  • Private Knowledge Base — Index and query your documents with RAG (see the example after this list).
  • Home Automation — Connect NodeTool to smart home devices.
  • Team Collaboration — Share workflows while keeping data in your infrastructure.
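
To make the RAG item concrete, here is a toy example of indexing and querying documents locally with ChromaDB. It illustrates the retrieval step in general, not NodeTool's own vector-store nodes:

```python
import chromadb  # pip install chromadb

# In-memory toy index; use chromadb.PersistentClient(path=...) to keep it on
# disk. First use downloads a small default embedding model.
client = chromadb.Client()
notes = client.create_collection("notes")
notes.add(
    ids=["1", "2", "3"],
    documents=[
        "Invoices from Q3 are stored in the finance folder.",
        "The home server runs nightly backups at 2am.",
        "Flux image workflows need at least 16 GB of RAM.",
    ],
)

# Retrieve the passage most relevant to the question; in a full RAG pipeline
# this text would be placed into the LLM prompt.
hits = notes.query(query_texts=["When do backups run?"], n_results=1)
print(hits["documents"][0][0])
```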

Key Features

🎯 Visual Editor

Build workflows by connecting nodes. No coding required. View the entire pipeline on one canvas.

👁️ Real-time Debugging

See every step execute in real-time. Inspect intermediate outputs. Streaming execution shows progress as it happens.

🔒 Local Execution

Run LLMs, Whisper, and diffusion models on your infrastructure. Cloud APIs are opt-in.

Build with the community

NodeTool is open-source under AGPL-3.0. Join the Discord, explore the GitHub repo, and share workflows with other builders.