Engineering · January 30, 2026

Why Every Line of Our Backend Is Rust

When we started building Athion's infrastructure, we evaluated Go, TypeScript (Node.js), and Rust. We chose Rust for everything — backend services, the Flux desktop app core, and our upcoming IDE.

Memory safety without compromise

Rust's ownership model eliminates entire classes of bugs at compile time. No null pointer dereferences. No data races. No use-after-free. In a voice chat application handling hundreds of concurrent audio streams, these guarantees aren't academic — they're essential.
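Here's a minimal sketch of what that compile-time guarantee looks like in practice (an illustrative stream counter, not Athion's actual code): shared mutable state has to be wrapped in `Arc<Mutex<_>>` before it can cross a thread boundary, or the program simply doesn't compile.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Spawn `n` worker threads that each register one audio stream in a
// shared counter. Handing a plain `&mut u32` to the threads would be
// rejected by the borrow checker; the Mutex makes the sharing safe.
fn register_streams(n: u32) -> u32 {
    let active = Arc::new(Mutex::new(0u32));

    let handles: Vec<_> = (0..n)
        .map(|_| {
            let active = Arc::clone(&active);
            thread::spawn(move || {
                // Lock, mutate, and unlock automatically when the
                // guard goes out of scope — no data race is possible.
                *active.lock().unwrap() += 1;
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }

    let count = *active.lock().unwrap();
    count
}

fn main() {
    assert_eq!(register_streams(4), 4);
    println!("all streams registered");
}
```

The same property is what lets hundreds of concurrent audio streams share routing state without a single runtime race check.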

Predictable performance

Garbage-collected runtimes can introduce latency spikes at unpredictable moments. When you're routing real-time audio with a target latency under 100ms, a 50ms GC pause is unacceptable. Rust gives us deterministic memory management with zero-cost abstractions.
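To make the contrast concrete, here's a small sketch (illustrative, not our production mixer) of deterministic cleanup: a buffer's `Drop` runs at a point the compiler determines statically, so there is never a collector to pause the audio path.

```rust
// An audio buffer is released the instant it goes out of scope,
// not at some future GC pass.
struct AudioBuffer {
    samples: Vec<f32>,
}

impl Drop for AudioBuffer {
    fn drop(&mut self) {
        // In a real mixer this might return the buffer to a pool;
        // the point is that this runs at a known, deterministic time.
        println!("released {} samples", self.samples.len());
    }
}

fn mix_frame() -> f32 {
    // 480 samples ≈ 10 ms of audio at 48 kHz.
    let buf = AudioBuffer { samples: vec![0.25; 480] };
    let sum: f32 = buf.samples.iter().sum();
    sum
    // `buf` is dropped right here — no pause, no collector.
}

fn main() {
    assert!((mix_frame() - 120.0).abs() < 1e-3);
}
```

The iterator chain in `mix_frame` is a zero-cost abstraction: it compiles down to a plain loop with no allocation or indirection.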

The numbers

Our Rust backend serving Flux voice connections:

  • Memory per connection: ~2 KB overhead
  • P99 latency: 12ms for API requests
  • CPU usage: 0.3% idle on a single core
  • Binary size: 8 MB for the complete server

Compare this to a typical Node.js backend, where a single Express server idles at 80+ MB of resident memory before serving its first request.
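As a rough illustration of why per-connection overhead can stay this small, consider a hypothetical fixed-size connection struct (not Athion's actual layout — the fields here are assumptions for the sketch):

```rust
use std::mem::size_of;

// Hypothetical per-connection bookkeeping for a voice session.
// Fixed-size fields, no hidden object headers, no GC metadata.
struct ConnState {
    peer_id: u64,        // 8 bytes
    session_key: [u8; 32], // 32 bytes
    last_seq: u32,       // 4 bytes
    jitter_ms: f32,      // 4 bytes
}

fn main() {
    // The core state is a few dozen bytes; in the ~2 KB figure above,
    // ring buffers and socket state account for the rest.
    assert!(size_of::<ConnState>() <= 64);
    println!("ConnState: {} bytes", size_of::<ConnState>());
}
```

Because the compiler lays structs out directly in memory, the per-connection cost is knowable at compile time rather than discovered in a heap profiler.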

The tradeoff

Rust has a steep learning curve. Development velocity is slower initially. The borrow checker will fight you until you internalize its rules. But the code you ship is more correct, more performant, and more reliable.

For infrastructure that runs 24/7 and handles real-time media — that tradeoff is worth it every time.