

AI‑enhanced local note‑taking with automatic linking and semantic search
Reor is a desktop markdown note‑taking app that stores data locally and uses AI to embed notes, auto‑link related content, and provide semantic search and LLM‑powered Q&A.

Reor delivers a privacy‑first knowledge‑management experience on macOS, Linux, and Windows. It stores every markdown file on your device and augments it with AI‑driven embeddings, allowing the app to surface related notes in real time and answer questions using retrieval‑augmented generation.
When you write a note, Reor chunks the text, creates vector embeddings, and inserts them into an internal LanceDB vector store. Similarity search automatically links relevant notes in a sidebar, while an integrated LLM (via Ollama or any OpenAI‑compatible API) can answer queries based on the retrieved context. The editor feels familiar to Obsidian users, and all AI processing can run entirely offline if you install local models.
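The chunk → embed → store → retrieve flow described above can be illustrated with a minimal sketch. This is not Reor's actual code: it uses a toy bag‑of‑words "embedding" and a plain in‑memory list in place of a real embedding model and the LanceDB vector store, but the retrieval logic (cosine similarity over chunk vectors) follows the same shape.

```python
# Illustrative sketch of a chunk/embed/retrieve pipeline (not Reor's code).
# Toy embedding: bag-of-words counts; real systems use learned dense vectors.
import math
from collections import Counter

def chunk(text, size=40):
    """Split a note into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def embed(text):
    """Toy embedding: lowercase bag-of-words term counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

notes = [
    "LanceDB stores vector embeddings for fast similarity search.",
    "Grocery list: eggs, milk, coffee beans.",
]
# Index every chunk of every note alongside its embedding.
index = [(c, embed(c)) for note in notes for c in chunk(note)]

# Retrieval: embed the query and return the most similar chunk.
query = embed("how does vector similarity search work")
best = max(index, key=lambda item: cosine(query, item[1]))
print(best[0])
```

In Reor, the retrieved chunks are then passed to the LLM as context for retrieval‑augmented answers; the sidebar's related‑notes links come from the same similarity ranking.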
Download a pre‑built binary from reorproject.org, choose a folder for your markdown vault, and optionally add local LLMs through Settings → Add New Local LLM. No external services are required unless you prefer a cloud API.
Research literature review
Automatically connects related papers and notes, enabling quick retrieval of supporting evidence.
Project brainstorming
Generates AI‑driven suggestions and surfaces past ideas relevant to current concepts.
Personal knowledge base
Provides semantic search across all markdown files, answering queries with context‑aware responses.
Technical documentation authoring
Links related API docs and code snippets, reducing duplication and improving navigation.
Does Reor work offline?
Yes, it operates fully offline when using locally installed LLMs via Ollama.
Which platforms does Reor support?
macOS, Linux, and Windows.
How do I add a local LLM?
In Settings → Add New Local LLM, specify the model name and let Reor download it through Ollama.
Does Reor sync notes to the cloud?
No, all notes are stored locally; cloud sync must be handled externally.
Can Reor use cloud‑hosted models?
Yes, Reor can connect to any OpenAI‑compatible API such as OpenAI, Ollama, or Oobabooga.
Project at a glance
Stable · Last synced 4 days ago