LibreChat

Unified AI chat interface for all major models

Self-hosted ChatGPT alternative supporting OpenAI, Claude, Gemini, local models, and custom endpoints with agents, code execution, and multimodal capabilities.

Overview

What is LibreChat?

LibreChat is a comprehensive, self-hosted AI chat platform that unifies access to multiple AI providers through a single, familiar interface. Designed for developers, teams, and organizations seeking control over their AI infrastructure, it delivers a ChatGPT-inspired experience while supporting OpenAI, Anthropic Claude, Google Gemini, AWS Bedrock, Azure OpenAI, and any OpenAI-compatible endpoint—including local models via Ollama.

Core Capabilities

Beyond basic chat, LibreChat offers no-code agent creation with marketplace sharing, secure code execution across eight programming languages, web search with reranking, generative UI artifacts that render React components and Mermaid diagrams, and multimodal file analysis. Users can create shareable presets, fork conversations for fine-grained context control, and use speech-to-text and text-to-speech for hands-free interaction.

Deployment & Access

Built for flexibility, LibreChat supports Docker, Railway, Zeabur, and Sealos deployments. Multi-user authentication via OAuth2, LDAP, or email ensures secure team access, while built-in moderation and token tracking provide administrative oversight. The multilingual interface spans 30+ languages, making it accessible to global teams.
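
For a concrete sense of the Docker path, the commands below sketch the compose-based quickstart described in the project's documentation. The default port (3080) and the provider API keys you add to .env depend on your setup, so treat this as an outline rather than a guaranteed recipe.

```bash
# Clone the repository and move into it
git clone https://github.com/danny-avila/LibreChat.git
cd LibreChat

# Copy the example environment file, then add provider API keys as needed
cp .env.example .env

# Start the stack (the app plus its bundled services) in the background
docker compose up -d

# By default the web UI is then served at http://localhost:3080
```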

Highlights

Unified access to OpenAI, Claude, Gemini, Bedrock, local models, and custom endpoints
No-code agent builder with marketplace, MCP support, and collaborative sharing
Secure code interpreter for Python, Node.js, Go, Java, Rust, and more
Multimodal chat with file analysis, image generation, web search, and speech integration

Pros

  • Supports virtually any AI provider or OpenAI-compatible API without proxies
  • Self-hosted deployment ensures data privacy and full infrastructure control
  • Rich feature set including agents, code execution, web search, and generative UI
  • Multi-user authentication with OAuth2, LDAP, moderation, and token tracking

Considerations

  • Self-hosting requires Docker knowledge and infrastructure management
  • Feature breadth may overwhelm users seeking a simple chat interface
  • Advanced capabilities like agents and code execution demand configuration
  • Community-driven support may lack enterprise SLA guarantees

Managed products teams compare with

When teams consider LibreChat, these hosted platforms usually appear on the same shortlist.

ChatGPT

AI conversational assistant for answering questions, writing, and coding help

Claude

AI conversational assistant for reasoning, writing, and coding

Manus

General-purpose AI agent for automating complex tasks

Looking for a hosted option? These are the services engineering teams benchmark against before choosing open source.

Fit guide

Great for

  • Teams requiring unified access to multiple AI providers from one interface
  • Organizations prioritizing data privacy through self-hosted deployments
  • Developers building custom AI workflows with agents, tools, and code execution
  • Multi-user environments needing authentication, moderation, and usage tracking

Not ideal when

  • Non-technical users seeking zero-configuration cloud-hosted solutions
  • Projects requiring only a single AI provider without extensibility
  • Teams lacking infrastructure resources for Docker-based deployments
  • Use cases demanding vendor-provided enterprise support contracts

How teams use it

Multi-Provider AI Evaluation

Compare responses from OpenAI, Claude, and Gemini side-by-side to select optimal models for specific tasks without switching platforms.

Private Enterprise AI Assistant

Deploy a self-hosted chat interface with LDAP authentication, enabling secure team access to AI models while maintaining full data sovereignty.

Custom Agent Marketplace

Build specialized no-code agents for customer support, data analysis, or content generation, then share them across departments via the internal marketplace.

Code-Assisted Development

Leverage the sandboxed code interpreter to prototype Python scripts, analyze datasets, or generate visualizations directly within chat conversations.

Tech snapshot

TypeScript 66%
JavaScript 33%
CSS 1%
Handlebars 1%
Shell 1%
Smarty 1%

Tags

vision, artifacts, ai, aws, clone, o1, claude, mcp, gemini, google, gpt-5, anthropic, deepseek, webui, chatgpt-clone, chatgpt, librechat, azure, openai, responses-api

Frequently asked questions

Which AI providers does LibreChat support?

LibreChat supports OpenAI, Anthropic Claude, Google Gemini, AWS Bedrock, Azure OpenAI, Vertex AI, and any OpenAI-compatible API. It also works with local models via Ollama, as well as providers such as Groq, Mistral AI, and OpenRouter.

Can I run LibreChat completely offline with local models?

Yes. LibreChat integrates with Ollama and other local inference engines, allowing fully offline operation without external API calls when configured with local models.
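
As a rough illustration of that setup, the snippet below sketches a custom endpoint entry in librechat.yaml pointing at a local Ollama server. The field names follow the custom-endpoint schema from the documentation, but the base URL, model name, and schema version shown here are assumptions for a typical Docker install; check them against your own configuration.

```yaml
# librechat.yaml — sketch of a custom endpoint backed by a local Ollama server
version: 1.2.1   # schema version; match the one your LibreChat release expects
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"   # Ollama ignores the key, but the field is still required
      # Ollama exposes an OpenAI-compatible API under /v1; host.docker.internal
      # assumes LibreChat runs in Docker while Ollama runs on the host
      baseURL: "http://host.docker.internal:11434/v1"
      models:
        default: ["llama3.1"]   # placeholder; list whichever models you have pulled
        fetch: true             # query Ollama for its installed models at startup
      titleConvo: true
      titleModel: "current_model"
```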

What are LibreChat Agents and how do they work?

LibreChat Agents are no-code custom assistants you build using tools, file search, code execution, and MCP servers. You can share agents with specific users or groups and discover community-built agents in the marketplace.

Is multi-user authentication supported?

Yes. LibreChat includes OAuth2, LDAP, and email-based authentication with built-in moderation and token spend tracking for secure multi-user deployments.
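
For teams wiring up LDAP, the environment sketch below shows the general shape of the configuration. The variable names mirror those documented for LibreChat's LDAP integration, but the hosts, DNs, and filter value are placeholders, and the names themselves should be verified against the current .env.example before use.

```bash
# .env — illustrative LDAP settings (placeholder hosts and DNs; confirm variable
# names against the current .env.example and the LDAP documentation)
LDAP_URL=ldap://ldap.example.com:389
LDAP_BIND_DN=cn=librechat,ou=services,dc=example,dc=com
LDAP_BIND_CREDENTIALS=changeme
LDAP_USER_SEARCH_BASE=ou=people,dc=example,dc=com
LDAP_SEARCH_FILTER=mail={{username}}
```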

How does the code interpreter ensure security?

Code execution runs in a fully isolated, sandboxed environment supporting Python, Node.js, Go, Java, Rust, and other languages, ensuring no access to your host system or sensitive data.

Project at a glance

Status: Active
Stars: 33,240
Watchers: 33,240
Forks: 6,647
License: MIT
Repo age: 2 years
Last commit: 59 minutes ago
Self-hosting: Supported
Primary language: TypeScript

Last synced 54 minutes ago