

Self-hosted AI search engine with local and cloud LLMs
Farfalle combines multiple search APIs and LLMs—local via Ollama or cloud via OpenAI, Groq, LiteLLM—to deliver AI-enhanced answers, all deployable via Docker and Vercel.

Farfalle is a self‑hosted AI‑powered search platform that merges web search providers (SearXNG, Tavily, Serper, Bing) with large language models. Users can run local models through Ollama (llama3, gemma, mistral, phi3) or tap cloud services such as OpenAI GPT‑4o, Groq Llama‑3, and any custom endpoint via LiteLLM. The system answers queries with a blend of live web results and LLM reasoning, offering a richer, more contextual response than standard search.
Designed for developers, teams, and privacy‑conscious users who need control over data and model costs. The stack—Next.js frontend, FastAPI backend, Redis rate limiting, and Logfire logging—runs in Docker containers, with a pre‑built image for quick start. After launching the backend, the frontend can be deployed on Vercel by pointing it to the API URL. Optional API keys enable additional search sources, but the core functionality works entirely offline with local models.
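The Redis-backed rate limiting in the stack above can be pictured as a fixed-window counter. The sketch below is stdlib-only and stands in for Redis INCR/EXPIRE; the class and parameter names are illustrative assumptions, not Farfalle's actual code.

```python
class FixedWindowRateLimiter:
    """Stand-in for a Redis INCR/EXPIRE fixed-window limiter (illustrative)."""

    def __init__(self, limit: int, window_seconds: int):
        self.limit = limit
        self.window = window_seconds
        self.counts = {}  # (client_id, window_index) -> request count

    def allow(self, client_id: str, now: float) -> bool:
        # With Redis this would be an INCR on a key that EXPIREs with the window.
        bucket = (client_id, int(now // self.window))
        self.counts[bucket] = self.counts.get(bucket, 0) + 1
        return self.counts[bucket] <= self.limit


limiter = FixedWindowRateLimiter(limit=3, window_seconds=60)
results = [limiter.allow("1.2.3.4", now=0) for _ in range(4)]
print(results)  # [True, True, True, False]
```

A fixed window resets the count at each window boundary; production limiters often use sliding windows or token buckets instead, but the shape of the check is the same.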
When teams consider Farfalle, these hosted platforms usually appear on the same shortlist; they are the services engineering teams benchmark against before choosing open source.
Research assistant for developers
Provides concise answers that blend web results with LLM reasoning, accelerating code exploration.
Enterprise knowledge base search
Combines internal search APIs with private LLMs to answer employee queries while keeping data on‑prem.
Educational Q&A portal
Students ask questions and receive AI‑generated explanations backed by live web results.
Prototype AI chatbot
Leverages LiteLLM to route requests to a preferred cloud provider, enabling rapid experimentation.
Do I need cloud API keys to use Farfalle?
No, you can operate entirely with local models via Ollama; API keys are only required for cloud providers.
How do I run Farfalle locally?
Run `docker-compose -f docker-compose.dev.yaml up -d` after configuring the `.env` file.
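Before running that command, the `.env` file needs provider settings. The fragment below is an illustrative sketch only; the variable names are assumptions, so check the repository's environment template for the exact keys.

```shell
# Illustrative .env sketch — variable names are assumptions.
OLLAMA_HOST=http://localhost:11434   # local models, no key needed
OPENAI_API_KEY=                      # optional: enables cloud models
TAVILY_API_KEY=                      # optional: extra search source
```

With only the local-model entry filled in, the stack runs fully offline; the optional keys switch on cloud providers and extra search sources.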
Can Farfalle work with other search or LLM providers?
Farfalle is built to integrate additional providers; you can extend the FastAPI backend to call custom APIs.
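Extending the backend can be pictured as registering a provider and routing model names to it. This registry is a hypothetical sketch; the `Provider`, `register`, and `resolve` names and the example URLs are illustrative, not Farfalle's internals.

```python
from dataclasses import dataclass


@dataclass
class Provider:
    name: str
    base_url: str


REGISTRY: dict = {}  # model-name prefix -> Provider


def register(prefix: str, provider: Provider) -> None:
    REGISTRY[prefix] = provider


def resolve(model: str) -> Provider:
    # Route e.g. "custom/my-model" to the provider registered under "custom".
    prefix = model.split("/", 1)[0]
    try:
        return REGISTRY[prefix]
    except KeyError:
        raise ValueError(f"no provider registered for {prefix!r}")


register("ollama", Provider("ollama", "http://localhost:11434"))
register("custom", Provider("custom", "https://llm.internal.example.com"))
print(resolve("custom/my-model").base_url)  # https://llm.internal.example.com
```

This prefix-routing pattern is also how LiteLLM-style model strings (`"groq/llama-3"`, `"ollama/llama3"`) are commonly dispatched, which is why adding a provider reduces to one `register` call plus the HTTP client for that API.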
Is a pre-built Docker image available?
Yes, a ready‑to‑use image is published as part of the repository's releases.
What license does Farfalle use?
The project is released under the Apache‑2.0 license.
Project at a glance
Dormant · Last synced 4 days ago