OpenCrust

by opencrust-org

Personal AI assistant platform rewritten in Rust — single 16 MB binary, 13 MB RAM idle, encrypted credentials, config hot-reload.

Rust

Notable Features

  • 16 MB single binary
  • 13 MB RAM at idle
  • Encrypted credential storage
  • Config hot-reload (no restart required)
  • Five channels: Telegram, Discord, Slack, WhatsApp, iMessage

About

OpenCrust by opencrust-org is a ground-up rewrite of OpenClaw in Rust, targeting the “personal AI assistant platform” use case with a radically reduced resource footprint. The compiled binary weighs 16 MB and idles at 13 MB of RAM — a fraction of what the Node.js-based OpenClaw runtime requires. Config hot-reload means the agent can be reconfigured without a restart, and encrypted credential storage protects API keys and tokens at rest.
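To illustrate the hot-reload idea, here is a minimal standard-library sketch: poll the config file's mtime and re-parse only when it has moved forward. This is an assumption about the mechanism (OpenCrust's actual implementation and config schema are not documented here); the `Config` struct and `model=` key are hypothetical.

```rust
use std::fs;
use std::path::Path;
use std::time::SystemTime;

/// Hypothetical config type -- OpenCrust's real schema is not shown in this README.
#[derive(Debug, PartialEq)]
struct Config {
    model: String,
}

/// Minimal parser: one `key=value` pair per line.
fn load_config(path: &Path) -> std::io::Result<Config> {
    let text = fs::read_to_string(path)?;
    let mut model = String::from("default");
    for line in text.lines() {
        if let Some(v) = line.strip_prefix("model=") {
            model = v.trim().to_string();
        }
    }
    Ok(Config { model })
}

/// Returns a fresh Config only when the file's mtime has advanced past `last_seen`.
fn reload_if_changed(path: &Path, last_seen: &mut SystemTime) -> std::io::Result<Option<Config>> {
    let mtime = fs::metadata(path)?.modified()?;
    if mtime > *last_seen {
        *last_seen = mtime;
        return Ok(Some(load_config(path)?));
    }
    Ok(None)
}

fn main() -> std::io::Result<()> {
    let path = Path::new("opencrust.conf");
    fs::write(path, "model=claude\n")?;
    // Start from the epoch so the first poll always loads the initial config;
    // subsequent polls are no-ops until the file is rewritten.
    let mut last_seen = SystemTime::UNIX_EPOCH;
    if let Some(cfg) = reload_if_changed(path, &mut last_seen)? {
        println!("loaded model = {}", cfg.model);
    }
    fs::remove_file(path)?;
    Ok(())
}
```

A production agent would run the poll (or an OS file-watch) on a background task and swap the new `Config` in behind a lock or atomic pointer so in-flight requests keep a consistent view.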

OpenCrust supports five messaging channels: Telegram, Discord, Slack, WhatsApp, and iMessage. The iMessage support is notably rare in the OpenClaw-adjacent ecosystem, making OpenCrust one of the few implementations accessible to Apple ecosystem users who prefer iMessage over cross-platform alternatives. The channel implementations are built natively in Rust rather than wrapping JavaScript bridge libraries.
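One plausible way to structure native multi-channel support is a common trait that each messaging backend implements, so the agent core stays channel-agnostic. The trait and type names below are hypothetical sketches, not OpenCrust's actual API.

```rust
/// Hypothetical channel abstraction; OpenCrust's real trait is not documented here.
trait Channel {
    fn name(&self) -> &'static str;
    fn send(&self, recipient: &str, text: &str) -> Result<(), String>;
}

struct Telegram;

impl Channel for Telegram {
    fn name(&self) -> &'static str {
        "telegram"
    }

    fn send(&self, recipient: &str, text: &str) -> Result<(), String> {
        // A real implementation would call the Telegram Bot API over HTTPS;
        // this stub just logs the outgoing message.
        println!("[{}] -> {}: {}", self.name(), recipient, text);
        Ok(())
    }
}

fn main() {
    // Channels live behind trait objects, so adding Discord, Slack, WhatsApp,
    // or iMessage means implementing `Channel` and registering one more Box.
    let channels: Vec<Box<dyn Channel>> = vec![Box::new(Telegram)];
    for ch in &channels {
        ch.send("user", "hello from the agent").unwrap();
    }
}
```

Keeping each backend behind one narrow trait is also what makes single-binary deployment clean: every channel compiles into the same executable, and unused ones can be gated off with Cargo features.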

The Rust rewrite represents a deliberate engineering tradeoff: more upfront development effort in exchange for dramatically lower runtime costs, better memory safety, and single-binary deployment. For operators running OpenClaw-style agents on VPS instances where they’re paying for RAM, or on devices where the Node.js runtime itself is the limiting factor, OpenCrust offers a compelling alternative architecture.

Platform Support

Linux · macOS