
Cherry Studio
Cross-platform desktop AI client supporting multiple LLM providers
Cherry Studio is a desktop client for Windows, macOS, and Linux that connects to major LLM providers, AI web services, and local models from a single interface, and ships with 300+ pre-configured assistants.

Cherry Studio is a cross-platform desktop application that unifies access to diverse AI language models in a single interface. It supports major cloud LLM providers like OpenAI, Gemini, and Anthropic, integrates AI web services including Claude and Perplexity, and connects to local models via Ollama and LM Studio.
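Under the hood, the local-model integrations are plain HTTP. The snippet below is a rough sketch, not Cherry Studio's own code: it shows the kind of request a desktop client sends to a locally running Ollama server, assuming Ollama is listening on its default port (11434) and a model named llama3 has already been pulled.

```typescript
// Sketch of a chat request to a local Ollama server (default port 11434).
// Assumes `ollama pull llama3` has been run beforehand.
async function askLocalModel(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",
      messages: [{ role: "user", content: prompt }],
      stream: false, // return one JSON object instead of a token stream
    }),
  });
  if (!response.ok) {
    throw new Error(`Ollama request failed: HTTP ${response.status}`);
  }
  const data = await response.json();
  return data.message.content;
}

askLocalModel("Summarize the Model Context Protocol in one sentence.")
  .then(console.log)
  .catch(console.error);
```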
Designed for developers, researchers, and power users who work with multiple AI models, Cherry Studio ships with 300+ pre-configured AI assistants and enables custom assistant creation. The platform handles multi-format document processing (text, images, Office, PDF), offers WebDAV file management, and supports simultaneous conversations across different models. Built-in tools include global search, topic management, AI translation, and Model Context Protocol (MCP) server support.
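MCP support means Cherry Studio can act as a client to Model Context Protocol servers that expose extra tools to the model. The sketch below is a minimal stdio MCP server written with the official TypeScript SDK (@modelcontextprotocol/sdk plus zod); the tool name and logic are illustrative only, and registering the server in Cherry Studio is done through the app's own settings rather than anything shown here.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Minimal MCP server exposing one illustrative tool.
const server = new McpServer({ name: "demo-tools", version: "0.1.0" });

server.tool(
  "word_count",
  { text: z.string() },
  async ({ text }) => ({
    content: [{ type: "text", text: String(text.trim().split(/\s+/).length) }],
  })
);

// An MCP client such as Cherry Studio launches this script and talks to it over stdio.
const transport = new StdioServerTransport();
await server.connect(transport);
```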
Ready to use out of the box with no environment configuration required, Cherry Studio runs natively on Windows, macOS, and Linux. The interface features light/dark themes, transparent window options, complete Markdown rendering with Mermaid chart visualization, and code syntax highlighting. An active community contributes themes, and the project roadmap includes mobile apps, plugin systems, and advanced knowledge management features.

Use cases

Multi-Model Research Comparison
Researchers query GPT-4, Claude, and Gemini simultaneously to compare reasoning approaches and select the best response for academic work; a rough fan-out sketch follows these use cases.
Local Model Development Workflow
Developers test custom Ollama models alongside commercial APIs, iterating on prompts without switching tools or managing multiple API clients.
Document-Driven AI Analysis
Analysts upload PDFs and Office files, leverage AI translation and topic management to extract insights, then back up conversations via WebDAV.
Cross-Platform Team Collaboration
Distributed teams on Windows, Mac, and Linux share custom assistants and conversation topics, maintaining consistent AI workflows across operating systems.
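The multi-model comparison workflow above amounts to sending one prompt to several endpoints at once and reading the answers side by side. Below is a rough TypeScript sketch of that fan-out using OpenAI-compatible chat endpoints; the URLs, model names, and environment variable are placeholders rather than Cherry Studio configuration, and Anthropic or Gemini would need their own request formats or an OpenAI-compatible proxy.

```typescript
// Sketch: send the same prompt to several OpenAI-compatible endpoints in parallel.
type Target = { name: string; url: string; model: string; key: string | undefined };

const targets: Target[] = [
  {
    name: "OpenAI",
    url: "https://api.openai.com/v1/chat/completions",
    model: "gpt-4o",
    key: process.env.OPENAI_API_KEY, // placeholder env var for this sketch
  },
  {
    name: "Local Ollama",
    url: "http://localhost:11434/v1/chat/completions", // Ollama's OpenAI-compatible endpoint
    model: "llama3",
    key: "ollama", // Ollama accepts any non-empty key
  },
];

async function askAll(prompt: string): Promise<void> {
  const results = await Promise.allSettled(
    targets.map(async (t) => {
      const res = await fetch(t.url, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          Authorization: `Bearer ${t.key ?? ""}`,
        },
        body: JSON.stringify({
          model: t.model,
          messages: [{ role: "user", content: prompt }],
        }),
      });
      if (!res.ok) throw new Error(`${t.name}: HTTP ${res.status}`);
      const data = await res.json();
      return `${t.name}: ${data.choices[0].message.content}`;
    })
  );
  for (const r of results) {
    console.log(r.status === "fulfilled" ? r.value : `failed: ${r.reason}`);
  }
}

askAll("Compare two approaches to chain-of-thought prompting in one paragraph.");
```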

Frequently asked questions

Which AI providers does Cherry Studio support?
Cherry Studio connects to major cloud providers (OpenAI, Gemini, Anthropic), AI web services (Claude, Perplexity, Poe), and local models via Ollama and LM Studio. You supply your own API keys or local model endpoints.
Do I need to configure a development environment before using Cherry Studio?
No. Cherry Studio is ready to use out of the box with no environment setup required. Download the desktop client for Windows, macOS, or Linux and connect your LLM providers.
Can I create my own AI assistants?
Yes. In addition to 300+ pre-configured assistants, Cherry Studio lets you create custom assistants tailored to your workflows and run multi-model conversations simultaneously.
Which document formats does Cherry Studio handle?
Cherry Studio processes text, images, Office documents, and PDFs. It includes WebDAV file management for cloud backup, Mermaid chart visualization, and code syntax highlighting.
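WebDAV backup is ordinary HTTP underneath: the client uploads a file to the share with a PUT request. The sketch below shows that bare operation with a placeholder server URL, credentials, and payload; it is not Cherry Studio's backup code, which is driven from the app's settings.

```typescript
// Sketch: push a conversation export to a WebDAV share with a plain HTTP PUT.
// URL, credentials, and filename are placeholders.
async function backupToWebdav(json: string): Promise<void> {
  const url = "https://dav.example.com/cherry-studio/backup.json";
  const auth = Buffer.from("username:password").toString("base64");
  const res = await fetch(url, {
    method: "PUT",
    headers: {
      Authorization: `Basic ${auth}`,
      "Content-Type": "application/json",
    },
    body: json,
  });
  if (!res.ok) throw new Error(`WebDAV upload failed: HTTP ${res.status}`);
}

backupToWebdav(JSON.stringify({ topics: [], exportedAt: new Date().toISOString() }))
  .then(() => console.log("backup uploaded"))
  .catch(console.error);
```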
Does Cherry Studio have a mobile app?
Not yet. Android and iOS apps are in Phase 1 of the roadmap. For now, Cherry Studio runs on Windows, macOS, and Linux desktops, with a HarmonyOS PC edition also planned.
Project at a glance
Active. Last synced 4 days ago.