Open-Source Projects
Discover top open-source software, updated regularly with real-world adoption signals.

Modular Python framework to index and query data for LLMs.
LlamaIndex provides connectors, indices, and retrieval tools to augment large language models with private data, offering both a simple high‑level API and deep customization via over 300 integrations.

LlamaIndex is a Python data framework that helps developers augment large language models (LLMs) with private data sources. It offers a unified set of data connectors, indexing structures, and retrieval interfaces, making it easy for both beginners and advanced users to build knowledge‑augmented LLM applications.
The library ships as a starter package (llama-index) or a core package (llama-index-core) that can be extended with any of the 300+ integration packages on LlamaHub, covering LLMs, embeddings, vector stores, and data loaders. A high‑level API can create a searchable index in five lines of code, while low‑level components allow custom retrievers, query engines, and rerankers. Indices are stored in‑memory by default but can be persisted to disk for later reuse. Installation is via pip, and the project uses Poetry for dependency management.
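The five-line high-level flow described above can be sketched as follows. This is a minimal illustration, assuming the starter package (`pip install llama-index`) is installed, an OpenAI API key is set in the environment (the default LLM backend), and a `./data` directory of documents exists; the directory name and query text are placeholders.

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("data").load_data()  # load files from ./data
index = VectorStoreIndex.from_documents(documents)     # build an in-memory index
query_engine = index.as_query_engine()                 # high-level query interface
response = query_engine.query("Summarize the onboarding policy.")
print(response)
```

Running this requires network access to the configured LLM and embedding providers, so treat it as a starting point rather than a self-contained script.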
Corporate document Q&A
Employees can ask natural‑language questions and receive answers sourced from internal policies, manuals, and reports.
Product support chatbot
Customer‑facing bot retrieves relevant sections from product guides to provide accurate troubleshooting steps.
Research paper knowledge graph
Academic team builds an index of papers and citations, enabling semantic search and relationship exploration.
Contextual code assistant
IDE plugin queries a codebase index to suggest implementations and documentation snippets in real time.
How do I install LlamaIndex?
Use `pip install llama-index` for the starter package, or `pip install llama-index-core` plus any desired integration packages from LlamaHub.
Can an index be saved to disk and reloaded later?
Yes. Call `index.storage_context.persist()` to save it, then reload by building a storage context with `StorageContext.from_defaults(persist_dir=...)` and passing it to `load_index_from_storage()`.
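The persist-and-reload flow can be sketched as below. This assumes `llama-index-core` is installed and that `index` was already built (for example with `VectorStoreIndex.from_documents`); the `./storage` path is illustrative.

```python
from llama_index.core import StorageContext, load_index_from_storage

# Save the index (nodes, vectors, and metadata) to disk.
index.storage_context.persist(persist_dir="./storage")

# Later, in a new process: rebuild the storage context and reload the index.
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context)
```

Persisting avoids re-embedding the source documents on every run, which matters once a corpus grows beyond a handful of files.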
How many integrations are available?
Over 300 community‑maintained integration packages covering LLMs, embeddings, vector stores, and data loaders.
Can I use open‑source or locally hosted models instead of OpenAI?
Absolutely; integrations exist for Replicate, HuggingFace, Llama‑2, and many others.
Where can I find the documentation?
The official docs are linked in the README and kept up‑to‑date at the project's documentation site.
Project at a glance
Active. Last synced 4 days ago.