Letta

Build stateful AI agents with persistent, self‑editing memory

Letta lets developers build AI agents that retain and edit their own memory over time, with Python and TypeScript SDKs, a no-code UI, a desktop client, and a cloud service for rapid deployment.

Overview

Letta is a platform for building stateful AI agents that maintain and modify their own memory across interactions. Memory is organized as a hierarchy of in-context and out-of-context blocks that agents can edit, delete, and search with built-in memory tools, while tools such as web search and code execution let them pull fresh information into context.
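
The snippet below is a minimal sketch of what that looks like with the letta-client Python SDK: an agent is created with labeled memory blocks that it can later rewrite itself. The model handles, block labels, and exact parameter names are assumptions based on the Letta documentation and may differ across SDK versions.

    from letta_client import Letta  # pip install letta-client (assumed package name)

    client = Letta(token="YOUR_LETTA_API_KEY")  # Letta Cloud; see self-hosting below

    # Create an agent with in-context memory blocks it can edit via its memory tools.
    agent = client.agents.create(
        model="openai/gpt-4o-mini",                 # example model handle
        embedding="openai/text-embedding-3-small",  # example embedding handle
        memory_blocks=[
            {"label": "human", "value": "Name: Ada. Prefers concise answers."},
            {"label": "persona", "value": "A helpful assistant that remembers the user."},
        ],
    )

    # Conversations are stateful: the server persists history and memory between calls.
    response = client.agents.messages.create(
        agent_id=agent.id,
        messages=[{"role": "user", "content": "Hi, I'm Ada. I mostly work in Python."}],
    )
    print(response.messages[-1])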

Deployment Options

Developers can interact with Letta through Python or TypeScript SDKs, a no‑code Agent Development Environment (ADE), a fully‑local desktop client for macOS and Windows, or the Letta Cloud service. Self‑hosting is also supported by running a local server and pointing the SDK to a custom base URL, giving full control over data privacy and infrastructure.
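
As a rough sketch, switching between Letta Cloud and a self-hosted server is mostly a question of how the client is constructed; the constructor arguments and the default local port (8283) below are assumptions taken from the Letta docs and worth verifying against the current SDK.

    from letta_client import Letta

    # Letta Cloud: authenticate with an API key generated from the dashboard.
    cloud_client = Letta(token="YOUR_LETTA_API_KEY")

    # Self-hosted: run the Letta server locally and point the SDK at its base URL.
    local_client = Letta(base_url="http://localhost:8283")

    # Both clients expose the same API surface, e.g. listing existing agents.
    for agent in local_client.agents.list():
        print(agent.id, agent.name)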

Advanced Capabilities

The platform enables multi‑agent architectures with shared memory blocks, allowing agents to collaborate on a common knowledge store. Sleep‑time agents run in the background, continuously updating memory without user prompts, effectively acting as an agent’s subconscious.
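
A shared block might be wired up roughly as below: a single block is created once and attached to two agents, so edits made by either agent are visible to the other. The blocks API and the block_ids parameter are assumptions based on the Letta documentation; sleep-time agents are enabled separately and are not shown here.

    from letta_client import Letta

    client = Letta(base_url="http://localhost:8283")  # or token=... for Letta Cloud

    # A standalone memory block that acts as the shared knowledge store.
    shared = client.blocks.create(
        label="organization",
        value="Team roster, active projects, and deadlines live here.",
    )

    # Attach the same block to two agents; writes from one are read by the other.
    planner = client.agents.create(
        model="openai/gpt-4o-mini",
        embedding="openai/text-embedding-3-small",
        block_ids=[shared.id],
    )
    reporter = client.agents.create(
        model="openai/gpt-4o-mini",
        embedding="openai/text-embedding-3-small",
        block_ids=[shared.id],
    )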

Highlights

Self‑editing memory blocks with in‑context and out‑of‑context storage
Tool integration for dynamic context engineering (e.g., web search, code execution)
Multi‑agent shared memory and sleep‑time background agents
Python and TypeScript SDKs plus no‑code ADE and desktop client

Pros

  • Persistent memory enables long‑term personalization
  • Flexible deployment: cloud, local desktop, or self‑hosted
  • Rich SDKs support both Python and TypeScript ecosystems
  • Built‑in tools allow agents to modify their own context

Considerations

  • Requires an API key or self‑hosted server to run
  • Complex memory management may have a learning curve
  • Performance depends on the underlying LLM and hosting setup
  • Current tooling focuses on developers; end‑users need a UI layer

Managed products teams compare with

When teams consider Letta, these hosted platforms usually appear on the same shortlist.

CrewAI

Multi-agent automation framework & studio to build and run AI crews

LangGraph

Open-source framework for building stateful, long-running AI agents

Relevance AI

No-code platform to build a team of AI agents with rich integrations

Looking for a hosted option? These are the services engineering teams benchmark against before choosing open source.

Fit guide

Great for

  • Developers building AI assistants that need to remember past interactions
  • Enterprises wanting shared knowledge bases across multiple agents
  • Researchers exploring LLM operating system concepts
  • Teams requiring on‑premise deployment for data privacy

Not ideal when

  • Simple stateless chatbots with no memory requirements
  • Non‑technical users without access to the ADE UI
  • Edge devices with limited compute and no internet
  • Projects needing real‑time sub‑millisecond response times

How teams use it

Customer Support Agent with Personalized History

Remembers each user's preferences and past tickets, providing tailored responses and reducing repeat inquiries.

Project Management Assistant with Shared Organization Memory

Multiple agents access a common memory block to coordinate tasks, deadlines, and resources across teams.

Personal Productivity Bot with Sleep‑Time Processing

Background sleep‑time agent updates the user's knowledge base overnight, enabling proactive suggestions each morning.

Research Knowledge Curator that Self‑Improves

Agent continuously ingests new papers via web search tool, updates its memory, and offers up‑to‑date summaries.

Tech snapshot

Python 99%
Go 1%
Shell 1%
C++ 1%
Java 1%
Jinja 1%

Tags

ai, llm-agent, llm, ai-agents

Frequently asked questions

How do I obtain a Letta API key?

Sign up on Letta Cloud; after verification you can generate an API key from the dashboard.

Which language models can I use with Letta?

Letta is model-agnostic: it can use any model reachable through a supported provider API, such as OpenAI GPT-4.x and GPT-3.5, or Anthropic Claude models.

Can I run Letta entirely on my own hardware?

Yes, by self‑hosting the Letta server and pointing the SDK to your local base URL.

What is a shared memory block?

A memory block that multiple agents can attach to, allowing them to read and write common information.

Is there a free tier or trial?

Letta Cloud offers a free trial with limited usage; self‑hosted deployments depend on your own resources.

Project at a glance

Active
Stars: 20,762
Watchers: 20,762
Forks: 2,167
License: Apache-2.0
Repo age: 2 years old
Last commit: 2 days ago
Primary language: Python

Last synced 3 hours ago