
Kuse Cowork
Rust-powered AI desktop agent with BYOK and Docker isolation
A native cross-platform desktop app written in Rust that works with any LLM, local or API-hosted, offering bring-your-own-key privacy, Docker-isolated execution, and extensible skill plugins.
Kuse Cowork targets developers, power users, and privacy-focused teams who need a local AI assistant that can work with any large language model. Built entirely in Rust and packaged with Tauri, the app runs on macOS, Windows, and Linux without heavyweight dependencies, delivering fast, memory-safe performance.
The agent supports BYOK for Anthropic Claude, OpenAI GPT, Ollama, LM Studio, or any OpenAI-compatible endpoint, and it can load custom skills such as docx, pdf, pptx, and xlsx processing. Docker containers isolate every external command, ensuring that code execution cannot affect the host system. Users configure their API keys and workspace folder through a simple UI, then launch tasks like file organization, receipt parsing, or meeting-note summarization.

The project can be built from source with Node.js, Rust, and Docker installed, or run in development mode via `npm run tauri dev`. Production builds are created with `npm run tauri build`, producing a ~10 MB native binary.
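Putting those steps together, a from-source workflow looks roughly like this (the `npm install` step is an assumption for an npm-managed Tauri project; the two `tauri` commands are the ones documented above):

```sh
# Prerequisites: Node.js, Rust, and Docker (Docker is optional; see the FAQ below).
npm install            # assumed: installs the JS dependencies of the Tauri shell
npm run tauri dev      # run the app in development mode
npm run tauri build    # production build; outputs a ~10 MB native binary
```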
Looking for a hosted option? These hosted platforms usually appear on the same shortlist when teams consider Kuse Cowork, and they are the services engineering teams benchmark against before choosing open source.

Claude Cowork
Desktop AI coworker that operates on your files to handle multi-step tasks.
Automated folder organization
Rearranges files, applies consistent naming, and updates an index within the selected workspace.
Expense report generation from receipts
Extracts line-item data from scanned receipts, compiles a CSV, and produces a summarized expense report.
Meeting notes summarization with TODO extraction
Creates a concise summary of meeting transcripts and lists actionable tasks for follow-up.
Custom document conversion via skill plugins
Converts docx, pptx, or xlsx files into other formats using user-defined skill extensions.
Do I need Docker to use the app?
Docker is needed for full container isolation, but the app can still run without it; commands will then execute directly on the host.
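To illustrate the isolation model (a sketch, not the app's exact invocation; the image name, mount path, and `--network` flag are assumptions), a container-wrapped command might look like:

```sh
# Hypothetical sketch of the isolation model: mount only the selected
# workspace into a throwaway container and run the command there, so it
# cannot touch the rest of the host filesystem.
docker run --rm --network none \
  -v "$HOME/workspace:/workspace" -w /workspace \
  ubuntu:24.04 sh -c "ls -R ."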
Which models are supported?
Anthropic Claude, OpenAI GPT, Ollama, LM Studio, or any model served through an OpenAI-compatible API, including self-hosted endpoints.
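For example, a locally running Ollama server exposes an OpenAI-compatible endpoint such a client can target (port 11434 is the Ollama default; the model name is an assumption, so substitute whatever you have pulled):

```sh
# Query a local Ollama server through its OpenAI-compatible /v1 API;
# LM Studio and other self-hosted servers expose the same endpoint shape.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Summarize this meeting note."}]
  }'
```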
Is any of my data sent to a third-party server?
No. All settings and data are stored locally, and API calls go directly to the provider you configure.
Can it handle file types beyond docx, pdf, pptx, and xlsx?
Yes. The extensible skill system lets you create custom handlers for additional file types or workflows.
How do I build it from source?
Run `npm run tauri build` after installing Node.js, Rust, and Docker; the output is a ~10 MB native executable.
Project at a glance
Active · Last synced 4 hours ago