Kuse Cowork

Rust-powered AI desktop agent with BYOK and Docker isolation

A native cross-platform desktop app written in Rust that lets you run any LLM locally or via API, with bring-your-own-key privacy, Docker-isolated execution, and extensible skill plugins.

Overview

Kuse Cowork targets developers, power users, and privacy-focused teams who need a local AI assistant that can work with any large language model. Built entirely in Rust and packaged with Tauri, the app runs on macOS, Windows, and Linux without heavyweight dependencies, delivering fast, memory-safe performance.

Capabilities & Deployment

The agent supports BYOK for Anthropic Claude, OpenAI GPT, Ollama, LM Studio, or any OpenAI-compatible endpoint, and it can load custom skills such as docx, pdf, pptx, and xlsx processing. Docker containers isolate every external command, ensuring that code execution cannot affect the host system. Users configure their API keys and workspace folder through a simple UI, then launch tasks like file organization, receipt parsing, or meeting-note summarization.

The project can be built from source with Node.js, Rust, and Docker installed, or run in development mode via `npm run tauri dev`. Production builds are created with `npm run tauri build`, producing a ~10 MB native binary.
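The isolation model can be illustrated with a minimal sketch. Note that the image name (`alpine:latest`) and flag set here are assumptions for illustration, not the app's actual configuration:

```rust
use std::process::Command;

/// Build a `docker run` invocation that executes a shell command inside a
/// throwaway container, mounting only the chosen workspace folder.
/// The base image and flags are illustrative assumptions.
fn isolated_command(workspace: &str, cmd: &str) -> Command {
    let mut docker = Command::new("docker");
    docker
        .arg("run")
        .arg("--rm")           // remove the container when the command exits
        .arg("--network=none") // no network access from inside the container
        .arg("-v")
        .arg(format!("{workspace}:/workspace")) // expose only the workspace
        .arg("-w")
        .arg("/workspace")
        .arg("alpine:latest")  // assumed base image
        .arg("sh")
        .arg("-c")
        .arg(cmd);
    docker
}

fn main() {
    let cmd = isolated_command("/home/user/project", "ls -la");
    // Inspect the argument list without actually launching Docker.
    let args: Vec<String> = cmd
        .get_args()
        .map(|a| a.to_string_lossy().into_owned())
        .collect();
    println!("docker {}", args.join(" "));
}
```

Because the host filesystem is visible only through the single mounted workspace and networking is disabled, a misbehaving command is limited to the folder the user explicitly selected.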

Highlights

Bring-Your-Own-Key support for any LLM provider
Pure Rust core without heavyweight external dependencies
Native cross-platform performance on macOS, Windows, Linux
Docker-based container isolation for secure command execution

Pros

  • Privacy-first: all data stays on the local machine
  • Model-agnostic: works with Claude, GPT, Ollama, etc.
  • Fast and memory-safe thanks to Rust implementation
  • Extensible skill system for custom file handling

Considerations

  • Full isolation requires Docker Desktop to be installed
  • Early-stage project; some features may be unstable
  • Manual development setup can be complex for newcomers
  • User interface is functional but not polished

Managed products teams compare with

When teams consider Kuse Cowork, these hosted platforms usually appear on the same shortlist.

Claude Cowork

Desktop AI coworker that operates on your files to handle multi-step tasks.

Fit guide

Great for

  • Developers who need local AI automation without telemetry
  • Teams that require strict data privacy and BYOK control
  • Cross-platform power users looking for a lightweight desktop agent
  • Organizations wanting to extend functionality via custom skills

Not ideal when

  • Users who cannot install Docker Desktop but need isolated execution
  • Non-technical users seeking a one-click install experience
  • Environments lacking Rust/Tauri build toolchains
  • Mobile-only scenarios, as mobile support is not yet available

How teams use it

Automated folder organization

Rearranges files, applies consistent naming, and updates an index within the selected workspace.

Expense report generation from receipts

Extracts line-item data from scanned receipts, compiles a CSV, and produces a summarized expense report.

Meeting notes summarization with TODO extraction

Creates a concise summary of meeting transcripts and lists actionable tasks for follow-up.

Custom document conversion via skill plugins

Transforms docx, pptx, or xlsx files into desired formats using user-defined skill extensions.

Tech snapshot

Rust 57%
TypeScript 31%
CSS 12%
HTML 1%

Frequently asked questions

Is Docker required to run Kuse Cowork?

Docker is needed for full container isolation, but the app can still run without it; commands will execute directly on the host.

Which AI models can I use?

Any model with an OpenAI-compatible API, including Anthropic Claude, OpenAI GPT, Ollama, LM Studio, or self-hosted endpoints.
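In practice, "OpenAI-compatible" means the provider accepts the standard `/v1/chat/completions` request shape, so switching models is mostly a matter of changing the base URL. A minimal sketch, where the Ollama and LM Studio URLs are their published local defaults and the model names are placeholder examples:

```rust
/// Build a minimal chat-completions JSON body. A real client would use a
/// JSON library and escape the prompt properly; this is a shape sketch.
fn chat_request_body(model: &str, prompt: &str) -> String {
    format!(
        r#"{{"model":"{model}","messages":[{{"role":"user","content":"{prompt}"}}]}}"#
    )
}

fn main() {
    // Example OpenAI-compatible base URLs: hosted OpenAI, local Ollama
    // (default port 11434), and local LM Studio (default port 1234).
    let endpoints = [
        ("https://api.openai.com/v1", "gpt-4o-mini"),
        ("http://localhost:11434/v1", "llama3"),
        ("http://localhost:1234/v1", "local-model"),
    ];
    for (base, model) in endpoints {
        let url = format!("{base}/chat/completions");
        let body = chat_request_body(model, "Summarize my meeting notes");
        println!("POST {url}\n{body}\n");
    }
}
```

The same request body works against all three endpoints, which is what lets a BYOK client stay provider-agnostic.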

Does the app send any data to external servers?

No. All settings and data are stored locally, and API calls go directly to the provider you configure.

Can I add my own skills or plugins?

Yes. The extensible skill system lets you create custom handlers for additional file types or workflows.
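The trait name, method signatures, and registration flow below are hypothetical assumptions, not the project's actual plugin interface, but they sketch how a skill system for extra file types could look in Rust:

```rust
/// Hypothetical skill interface: each skill declares which file
/// extensions it handles and how to process a file's contents.
trait Skill {
    fn name(&self) -> &str;
    fn handles(&self, extension: &str) -> bool;
    fn process(&self, contents: &str) -> String;
}

/// Example skill: counts words, standing in for a real docx/pdf handler.
struct WordCountSkill;

impl Skill for WordCountSkill {
    fn name(&self) -> &str {
        "word-count"
    }
    fn handles(&self, extension: &str) -> bool {
        matches!(extension, "txt" | "md")
    }
    fn process(&self, contents: &str) -> String {
        format!("{} words", contents.split_whitespace().count())
    }
}

/// Registry that dispatches a file to the first skill claiming its extension.
struct SkillRegistry {
    skills: Vec<Box<dyn Skill>>,
}

impl SkillRegistry {
    fn new() -> Self {
        Self { skills: Vec::new() }
    }
    fn register(&mut self, skill: Box<dyn Skill>) {
        self.skills.push(skill);
    }
    fn run(&self, extension: &str, contents: &str) -> Option<String> {
        self.skills
            .iter()
            .find(|s| s.handles(extension))
            .map(|s| s.process(contents))
    }
}

fn main() {
    let mut registry = SkillRegistry::new();
    registry.register(Box::new(WordCountSkill));
    let result = registry.run("md", "hello skill world");
    println!("{result:?}"); // Some("3 words")
}
```

A trait-object registry like this keeps the core agent unaware of individual file formats: adding pptx or xlsx support means registering one more boxed implementation.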

How do I build a production binary?

Run `npm run tauri build` after installing Node.js, Rust, and Docker; the output is a ~10 MB native executable.

Project at a glance

Status: Active
Stars: 173
Watchers: 173
Forks: 24
License: MIT
Repo age: 4 days old
Last commit: 8 hours ago
Primary language: Rust

Last synced 4 hours ago