Onyx

Self‑hosted AI chat UI with agents, RAG, and 40+ connectors

Onyx delivers a feature‑rich, self‑hostable chat interface that works with any LLM, offering custom agents, hybrid RAG, web search, 40+ connectors, and enterprise‑grade security, all deployable via Docker, Kubernetes, or Terraform.

Overview

Onyx is a self‑hosted chat platform designed for developers, product teams, and enterprises that need a flexible AI assistant capable of tapping into internal and external knowledge sources. It supports any LLM, from cloud services such as OpenAI, Anthropic, and Gemini to models you serve yourself with Ollama or vLLM, so you can choose the model that fits your performance, cost, and privacy requirements.

Core Capabilities

The platform includes custom AI agents, hybrid retrieval‑augmented generation backed by a knowledge graph, and built‑in web search via Google PSE, Exa, Serper, or an in‑house scraper. Over 40 connectors let you ingest data from SaaS apps, file stores, and databases. Advanced features such as deep multi‑step research, code interpretation, image generation, and action execution through MCP (Model Context Protocol) enable complex workflows, while collaboration tools provide chat sharing, role‑based access, analytics, and SSO/RBAC security.

Deployment & Community

Onyx can be launched with a single script or deployed through Docker, Kubernetes, or Terraform, and it runs in air‑gapped environments. Comprehensive guides cover major cloud providers, and a vibrant Discord community offers support and contributions. The Community Edition is MIT‑licensed, with an Enterprise Edition adding extra capabilities for large organizations.
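
To make the Docker path concrete, the commands below sketch a typical Docker Compose bring‑up. Treat this as a minimal illustration under assumptions: the repository URL, deployment directory, and compose file name follow the project's documented layout but may differ between releases, so confirm them against the official deployment guide.

  # Clone the repository and start the stack with Docker Compose
  # (directory and compose file names are assumptions; check the deployment docs for your release)
  git clone https://github.com/onyx-dot-app/onyx.git
  cd onyx/deployment/docker_compose
  docker compose -f docker-compose.dev.yml -p onyx-stack up -d --pull always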

Highlights

  • Custom AI agents with programmable instructions and actions
  • Hybrid RAG + knowledge graph for precise document retrieval
  • 40+ connectors and built‑in web search across multiple providers
  • Enterprise‑grade security (SSO, RBAC, encrypted credentials) and collaboration tools

Pros

  • Works with any LLM, including self‑hosted models
  • One‑command installation simplifies initial setup
  • Extensible connector ecosystem for diverse data sources
  • Robust security and role‑based management for teams

Considerations

  • Full feature set may require additional configuration for large deployments
  • Enterprise‑only features are not available in the free edition
  • Running self‑hosted LLMs adds infrastructure overhead
  • UI performance depends on modern browser capabilities

Managed products teams compare with

When teams consider Onyx, these hosted platforms usually appear on the same shortlist.

ChatGPT

AI conversational assistant for answering questions, writing, and coding help

Claude

AI conversational assistant for reasoning, writing, and coding

Manus

General purpose AI agent for automating complex tasks

Looking for a hosted option? These are the services engineering teams benchmark against before choosing open source.

Fit guide

Great for

  • Teams that need a unified chat interface across multiple LLM providers
  • Organizations requiring on‑premise deployment for data privacy
  • Developers building custom AI agents or workflow automations
  • Enterprises seeking searchable knowledge bases with fine‑grained access control

Not ideal when

  • Users looking for a fully managed SaaS solution without any infrastructure
  • Projects that only need a simple single‑LLM chatbot without data integration
  • Teams without capacity to maintain self‑hosted LLM infrastructure
  • Environments lacking modern browser support

How teams use it

Customer Support Knowledge Base

Agents retrieve up‑to‑date policy documents from internal repositories and answer tickets in real time.

Research Assistant for Market Analysis

Deep research agents combine web search, uploaded reports, and connector data to generate comprehensive market insights.

Internal DevOps Automation

MCP actions let AI agents trigger CI pipelines, fetch logs, and create incident tickets directly from chat.

Creative Content Generation

Users generate images, code snippets, and data visualizations on demand within the same conversational UI.

Tech snapshot

Python 68%
TypeScript 28%
JavaScript 2%
CSS 1%
HTML 1%
Shell 1%

Tags

ai, llm, chat, ui, information-retrieval, rag, python, enterprise-search, nextjs, llm-ui, gen-ai, chatgpt, ai-chat

Frequently asked questions

Can Onyx run without an internet connection?

Yes, it can be deployed in a completely air‑gapped environment as long as the required LLM and data sources are available locally.

Which LLMs are supported?

Onyx works with any API‑compatible LLM, including OpenAI, Anthropic, and Gemini, as well as models served locally through Ollama or vLLM.
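
As a hedged illustration of the self‑hosted path, the commands below run a local model with Ollama and exercise its OpenAI‑compatible endpoint, the kind of API Onyx can be pointed at. The model name and port are Ollama defaults rather than anything Onyx‑specific; wiring the endpoint into Onyx is done through its own LLM provider configuration.

  # Serve a local model with Ollama (skip `ollama serve` if it already runs as a service)
  ollama pull llama3
  ollama serve &

  # Smoke-test the OpenAI-compatible endpoint that Onyx would be configured to use
  curl http://localhost:11434/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "llama3", "messages": [{"role": "user", "content": "Say hello"}]}'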

How is data security handled?

Onyx provides SSO (OIDC/SAML/OAuth2), role‑based access control, and encryption of stored credentials. Document permissioning mirrors external app access.

What deployment options are available?

You can install via a single script, Docker Compose, Kubernetes manifests, or Terraform modules, with guides for major cloud providers.

Is there a free version?

The Community Edition is free under the MIT license; the Enterprise Edition adds extra features for larger organizations.

Project at a glance

Active
Stars: 17,101
Watchers: 17,101
Forks: 2,299
Repo age: 2 years
Last commit: 1 hour ago
Primary language: Python

Last synced 49 minutes ago