
Reor

AI‑enhanced local note‑taking with automatic linking and semantic search

Reor is a desktop markdown note‑taking app that stores data locally and uses AI to embed notes, auto‑link related content, and provide semantic search and LLM‑powered Q&A.


Overview

Reor delivers a privacy‑first knowledge‑management experience on macOS, Linux, and Windows. It stores every markdown file on your device and augments it with AI‑driven embeddings, allowing the app to surface related notes in real time and answer questions using retrieval‑augmented generation.

Capabilities

When you write a note, Reor chunks the text, creates vector embeddings, and inserts them into an internal LanceDB vector store. Similarity search automatically links relevant notes in a sidebar, while an integrated LLM (via Ollama or any OpenAI‑compatible API) can answer queries based on the retrieved context. The editor feels familiar to Obsidian users, and all AI processing can run entirely offline if you install local models.
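The chunk‑embed‑link pipeline above can be sketched in TypeScript. This is an illustrative stand‑in, not Reor's actual code: the function names are hypothetical, and a real embedding model and the LanceDB store replace the in‑memory array used here. Only the threshold‑and‑sort step that drives the "related notes" sidebar is shown faithfully.

```typescript
// Illustrative sketch of embed-and-link; names are hypothetical, not Reor's API.

// Split a note into fixed-size word chunks before embedding.
function chunkText(text: string, chunkSize = 64): string[] {
  const words = text.split(/\s+/).filter(Boolean);
  const chunks: string[] = [];
  for (let i = 0; i < words.length; i += chunkSize) {
    chunks.push(words.slice(i, i + chunkSize).join(" "));
  }
  return chunks;
}

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Return titles of stored notes whose embedding clears the threshold,
// most similar first -- the basis of a "related notes" sidebar.
function relatedNotes(
  query: number[],
  store: { title: string; vector: number[] }[],
  threshold = 0.8,
): string[] {
  return store
    .map((n) => ({ title: n.title, score: cosine(query, n.vector) }))
    .filter((n) => n.score >= threshold)
    .sort((a, b) => b.score - a.score)
    .map((n) => n.title);
}
```

In Reor itself the vectors come from an embedding model and are persisted in LanceDB, which performs the similarity search natively; the filter‑and‑rank logic is the same idea.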

Deployment

Download a pre‑built binary from reorproject.org, choose a folder for your markdown vault, and optionally add local LLMs through Settings → Add New Local LLM. No external services are required unless you prefer a cloud API.

Highlights

Local‑first storage with full‑text markdown editing
Automatic vector embedding and similarity linking of notes
Integrated LLM Q&A and semantic search via Ollama or compatible APIs
Cross‑platform desktop app for macOS, Linux, and Windows

Pros

  • Data never leaves the device, ensuring privacy
  • AI augmentation works offline with local models
  • Supports any LLM compatible with Ollama
  • Familiar Obsidian‑style markdown interface

Considerations

  • Local models can demand significant RAM/CPU
  • Front‑matter parsing may be limited on imported files
  • No built‑in cloud sync; manual sync required
  • Limited third‑party PKM integrations at present

Managed products teams compare with

When teams consider Reor, these hosted platforms usually appear on the same shortlist.


Coda

Docs, tables, and apps combined into one collaborative workspace


Craft

Collaborative documents and notes with rich formatting


Document360

Knowledge base software for product docs and self‑service help

Looking for a hosted option? These are the services engineering teams benchmark against before choosing open source.

Fit guide

Great for

  • Users who prioritize privacy and local data control
  • Researchers and writers needing AI‑assisted knowledge linking
  • Teams with on‑premise AI infrastructure
  • Individuals comfortable managing local LLM installations

Not ideal when

  • People seeking seamless cloud sync across devices
  • Users without sufficient hardware for local model inference
  • Those needing extensive integrations with other PKM tools
  • Fans of web‑based note‑taking solutions

How teams use it

Research literature review

Automatically connects related papers and notes, enabling quick retrieval of supporting evidence.

Project brainstorming

Generates AI‑driven suggestions and surfaces past ideas relevant to current concepts.

Personal knowledge base

Provides semantic search across all markdown files, answering queries with context‑aware responses.
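The context‑aware answering described here is retrieval‑augmented generation: the top‑matching chunks are assembled into the prompt sent to the model. A minimal sketch of that assembly step, with a hypothetical function name and template that are not Reor's actual implementation:

```typescript
// Illustrative RAG prompt assembly; template and names are hypothetical.
function buildRagPrompt(question: string, retrievedChunks: string[]): string {
  // Number each retrieved chunk so the model can ground its answer.
  const context = retrievedChunks
    .map((c, i) => `[${i + 1}] ${c}`)
    .join("\n");
  return (
    "Answer the question using only the notes below.\n\n" +
    `Notes:\n${context}\n\n` +
    `Question: ${question}\nAnswer:`
  );
}
```

The assembled prompt is then sent to whichever backend is configured, e.g. a local Ollama server or any OpenAI‑compatible chat endpoint.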

Technical documentation authoring

Links related API docs and code snippets, reducing duplication and improving navigation.

Tech snapshot

JavaScript 59%
TypeScript 40%
CSS 1%
SCSS 1%
HTML 1%
Makefile 1%

Tags

llama, ai, vector-database, note-taking, lancedb, ollama, markdown, rag, llamacpp, second-brain, pkm, local-first

Frequently asked questions

Can Reor run without an internet connection?

Yes, it operates fully offline when using locally installed LLMs via Ollama.

Which operating systems are supported?

macOS, Linux, and Windows.

How do I add a new language model?

In Settings → Add New Local LLM, specify the model name and let Reor download it through Ollama.

Is my data synced to the cloud?

No, all notes are stored locally; cloud sync must be handled externally.

Can I use OpenAI's API instead of a local model?

Yes, Reor can connect to any OpenAI‑compatible API such as OpenAI, Ollama, or Oobabooga.

Project at a glance

Stable
Stars
8,467
Watchers
8,467
Forks
515
License
AGPL-3.0
Repo age
2 years old
Last commit
8 months ago
Primary language
JavaScript