
Langflow
Visual builder for AI agents and workflows with API deployment
- Stars: 145,340
- License: MIT
- Last commit: 5 hours ago
Build AI apps/agents with tools, retrieval, prompts and routing graphs.
AI application frameworks and orchestration tools supply reusable building blocks, such as prompt templates, retrieval modules, and tool adapters, to streamline the creation of conversational agents, retrieval-augmented generation (RAG) pipelines, and multi-step AI workflows. They abstract away low-level API calls and provide graph-oriented routing, enabling developers to focus on business logic rather than glue code. The open-source ecosystem includes projects like LangChain, LlamaIndex, Haystack, and Flowise, each offering varying degrees of extensibility, community support, and visual editors. These frameworks can be self-hosted on-premises or in cloud environments, giving organizations control over data residency, scaling, and integration with proprietary LLM providers.
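To make the "building blocks" idea concrete, here is a minimal sketch of the glue code these frameworks hide: a prompt template plus a pluggable model callable. All names here are illustrative, not the API of any real framework, and the LLM is a stand-in stub rather than a real client.

```python
# Minimal sketch of framework-style building blocks: a prompt template
# plus a swappable model interface. Names are illustrative, not a real API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class PromptTemplate:
    template: str

    def format(self, **kwargs: str) -> str:
        return self.template.format(**kwargs)

def run_chain(template: PromptTemplate, llm: Callable[[str], str], **inputs) -> str:
    """The glue a framework normally hides: format the prompt, call the model."""
    return llm(template.format(**inputs))

# Stand-in for a real LLM client (hypothetical); it just echoes its input.
echo_llm = lambda prompt: f"LLM saw: {prompt}"

print(run_chain(PromptTemplate("Summarize: {text}"), echo_llm, text="hello"))
```

A real framework adds provider adapters, retries, and streaming on top of exactly this shape, which is why swapping LLM providers becomes a one-line change.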

- Visual builder for AI agents and workflows with API deployment
- Open-source platform for building production-ready LLM applications
- Build reliable LLM agents with interchangeable components
- RAG engine with deep document understanding and agents
- Build and deploy AI agents visually with low-code workflows
- Modular Python framework to index and query data for LLMs
An open-source platform that lets developers and teams design, deploy, and manage AI agent workflows with a visual editor, local model support, and built-in orchestration.
Many maintained projects in this space are written in TypeScript as well as Python, so expect a strong TypeScript presence when reviewing codebases. When comparing frameworks, consider:
- Extensibility: assess whether the framework supports custom components, third-party tool integrations, and easy addition of new LLM providers through a modular plugin architecture.
- Community and documentation: consider the size of the contributor community, the quality of official docs, the availability of tutorials, and the responsiveness of issue trackers or forums.
- Performance and scaling: evaluate runtime efficiency, support for asynchronous execution, and the ability to scale horizontally via container orchestration or serverless deployments.
- Data integrations: check for built-in connectors for vector stores, databases, and document loaders that enable robust RAG capabilities.
- Visual tooling: determine whether the framework offers a graphical editor or drag-and-drop interface for constructing prompt graphs without extensive coding.
Most tools in this category support these baseline capabilities.
Visual workflow orchestration for AI agents and automation
Event-driven agent/workflow framework for building multi-step AI systems.
Hiveflow lets teams design and run multi-agent workflows with a visual builder, a browser extension for contextual flows, and an MCP server that connects assistants like Claude and Cursor, supporting process automation across email, documents, and APIs.
Combine LLM wrappers, memory stores, and tool plugins to build chatbots that maintain context and can invoke external APIs.
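A toy version of that combination can be sketched in a few lines. The model below is a stub, the memory store is a plain transcript list, and the "TOOL:name:arg" reply convention is an assumption invented for illustration; real frameworks use structured function-calling instead.

```python
# Illustrative chatbot loop: conversation memory plus a tool registry.
# The "llm" is a stub; a real build would wrap an API client.
from typing import Callable

def make_chatbot(llm: Callable[[str], str], tools: dict[str, Callable[[str], str]]):
    history: list[str] = []  # memory store: the full transcript so far
    def chat(user_msg: str) -> str:
        history.append(f"user: {user_msg}")
        reply = llm("\n".join(history))
        # Assumed convention: the model answers "TOOL:name:arg" to call a tool.
        if reply.startswith("TOOL:"):
            _, name, arg = reply.split(":", 2)
            reply = tools[name](arg)
        history.append(f"bot: {reply}")
        return reply
    return chat

tools = {"calc": lambda expr: str(eval(expr))}  # toy calculator tool
stub_llm = lambda transcript: "TOOL:calc:2+3" if "2+3" in transcript else "hi"
chat = make_chatbot(stub_llm, tools)
print(chat("what is 2+3?"))  # prints 5
```

Because the transcript is passed back on every call, the model keeps context across turns; frameworks add summarization or vector-backed memory when transcripts grow too long.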
Chain document loaders, vector search, and prompt templates to answer queries using up-to-date knowledge bases.
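The chain above can be sketched with a deliberately naive retriever: word overlap stands in for embedding similarity, and the document list stands in for a loader plus vector store. Everything here is a toy assumption, not any framework's real API.

```python
# Toy RAG pipeline: retrieve by word overlap, then fill a prompt template.
# Real frameworks swap in embedding models and vector stores for the scorer.
def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    score = lambda d: len(set(query.lower().split()) & set(d.lower().split()))
    return sorted(docs, key=score, reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Langflow is a visual builder for AI workflows.",
    "LlamaIndex indexes data for LLM queries.",
]
print(build_prompt("What is Langflow?", docs))
```

The resulting prompt, grounded in the retrieved context, is what gets sent to the LLM; keeping the knowledge base fresh is what lets answers track up-to-date information without retraining.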
Define sequential or conditional workflows where the LLM decides which tool (e.g., calculator, web scraper) to call next.
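A minimal sketch of that routing step, under stated assumptions: the "LLM" decision is faked with a regex (arithmetic goes to the calculator, everything else to a placeholder scraper), and both tools are toys invented for illustration.

```python
# Sketch of LLM-driven routing: a decision step names the next tool,
# a dispatcher runs it. The router is a stub standing in for a model call.
import re

TOOLS = {
    "calculator": lambda q: str(eval(re.sub(r"[^0-9+\-*/. ]", "", q))),
    "scraper": lambda q: f"[fetched page for: {q}]",  # placeholder web scraper
}

def route(question: str) -> str:
    """Stub 'LLM' decision: arithmetic goes to the calculator, else the scraper."""
    return "calculator" if re.search(r"\d\s*[-+*/]\s*\d", question) else "scraper"

def run(question: str) -> str:
    return TOOLS[route(question)](question)

print(run("what is 6 * 7"))         # calculator branch
print(run("latest langflow docs"))  # scraper branch
```

In a real workflow the `route` step is itself an LLM call returning a tool name (or a structured function call), and the dispatcher loop repeats until the model signals it is done.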
Use low-code canvases to sketch and iterate on AI workflows, then export the configuration to code for production.
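The export-to-code handoff can be illustrated with a tiny interpreter for a flow file. The JSON schema below is invented for illustration; each visual tool defines its own export format, so treat this only as the general shape of the idea.

```python
# Hedged sketch: running an exported low-code flow (JSON) as a step pipeline.
# The schema ("steps", "op", "text") is invented, not any real tool's format.
import json

flow_json = '''
{"steps": [{"op": "template", "text": "Q: {q}"},
           {"op": "upper"}]}
'''

OPS = {
    "template": lambda state, step: step["text"].format(q=state),
    "upper": lambda state, step: state.upper(),
}

def run_flow(flow: dict, user_input: str) -> str:
    state = user_input
    for step in flow["steps"]:          # execute each exported node in order
        state = OPS[step["op"]](state, step)
    return state

print(run_flow(json.loads(flow_json), "hello"))  # prints Q: HELLO
```

The point of the export step is exactly this: the canvas produces a declarative artifact, and a small runtime (or generated code) executes it in production.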
Package the assembled application as Docker images or serverless functions, and leverage built-in logging and metrics for observability.
What is an AI application framework?
It is a collection of libraries and utilities that simplify building AI-driven applications by handling prompt orchestration, LLM integration, retrieval, and tool usage in a reusable way.
How does orchestration differ from simple prompting?
Orchestration involves coordinating multiple steps, such as data retrieval, tool calls, and conditional branching, whereas simple prompting sends a single request to an LLM without additional logic.
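The difference is easiest to see side by side. Both functions below use the same stub model (an assumption standing in for a real API client), so only the control flow differs: one plain call versus retrieval plus a conditional branch.

```python
# Contrast: a single prompt call vs. a small orchestrated pipeline.
# Both use the same stub model so the difference in control flow is visible.
stub_llm = lambda p: f"answer({p})"

def simple_prompt(question: str) -> str:
    return stub_llm(question)               # one request, no extra logic

def orchestrated(question: str, kb: dict[str, str]) -> str:
    context = kb.get(question, "")          # step 1: retrieval
    if not context:                         # step 2: conditional branch
        return simple_prompt(question)      # fall back to plain prompting
    return stub_llm(f"{context}\n{question}")  # step 3: augmented call

kb = {"What is RAG?": "RAG = retrieval-augmented generation."}
print(simple_prompt("What is RAG?"))
print(orchestrated("What is RAG?", kb))
```

Real orchestrators extend this pattern with many steps, tool calls, and error handling, but the essence is the same: logic around the model call, not just the call itself.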
Which open-source frameworks are most widely adopted?
LangChain, LlamaIndex, Haystack, Flowise, and RAGFlow are among the top projects, each with over 20,000 GitHub stars and active contributor communities.
How should I choose between LangChain and LlamaIndex?
LangChain emphasizes flexible chain building and tool integration, while LlamaIndex focuses on data ingestion and retrieval. Choose based on whether your primary need is complex workflow orchestration (LangChain) or robust document indexing (LlamaIndex).
Can these frameworks be deployed on-premises?
Yes. All listed open-source projects can be run in self-hosted environments using Docker, Kubernetes, or direct installation on virtual machines, giving full control over data and compute.
Do the frameworks support tool usage like calculators or web searches?
Most frameworks provide a plugin system for external tools. Built-in adapters exist for common utilities such as calculators, web browsers, and custom APIs, allowing the LLM to invoke them during a workflow.
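Decorator-based registration is a common shape for such plugin systems; the sketch below shows one way it can look. The registry, decorator, and both tools are illustrative assumptions, not the API of any framework listed above.

```python
# Sketch of a tool-plugin registry like the adapter systems described above.
# All names here are illustrative; real frameworks define their own interfaces.
from typing import Callable

REGISTRY: dict[str, Callable[[str], str]] = {}

def tool(name: str):
    """Register a callable so the workflow engine can invoke it by name."""
    def wrap(fn: Callable[[str], str]):
        REGISTRY[name] = fn
        return fn
    return wrap

@tool("calculator")
def calculator(expr: str) -> str:
    return str(eval(expr))  # toy; a real tool would validate its input

@tool("search")
def search(query: str) -> str:
    return f"results for '{query}'"  # placeholder for a web search adapter

print(REGISTRY["calculator"]("10 / 4"))  # prints 2.5
```

At runtime, the orchestrator maps the tool name the LLM chooses to the registered callable, which is exactly how calculators, browsers, and custom APIs get wired into a workflow.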