Real-Time Distributed Chat System

The core question behind this project: what actually breaks when you try to scale WebSocket connections across multiple server instances? Single-server chat is straightforward. Multi-server chat is where all the interesting problems live.
The backend is Go, chosen for its concurrency model — goroutines handle individual WebSocket connections cleanly. Redis pub/sub handles message fanout across server instances, so a message sent to one node reaches clients connected to any other node. A custom load balancer sits in front and distributes connections.
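The fan-out pattern described above can be sketched in-process: each connection goroutine reads from its own channel, and a publish step delivers to every registered client. In the real system Redis pub/sub plays this role across nodes; this minimal sketch (type and function names are illustrative, not from the project) shows the same pattern on a single node.

```go
package main

import (
	"fmt"
	"sync"
)

// Hub models one node's fan-out: a registry of per-client channels.
type Hub struct {
	mu      sync.RWMutex
	clients map[string]chan string
}

func NewHub() *Hub {
	return &Hub{clients: make(map[string]chan string)}
}

// Subscribe registers a client and returns the channel its connection
// goroutine would drain before writing frames to the WebSocket.
func (h *Hub) Subscribe(id string) <-chan string {
	ch := make(chan string, 8) // small buffer so one slow client can't block Publish
	h.mu.Lock()
	h.clients[id] = ch
	h.mu.Unlock()
	return ch
}

// Publish fans a message out to every subscriber, dropping it for clients
// whose buffers are full, a common backpressure choice for chat traffic.
func (h *Hub) Publish(msg string) {
	h.mu.RLock()
	defer h.mu.RUnlock()
	for _, ch := range h.clients {
		select {
		case ch <- msg:
		default: // slow client: drop rather than stall everyone
		}
	}
}

func main() {
	hub := NewHub()
	a := hub.Subscribe("alice")
	b := hub.Subscribe("bob")
	hub.Publish("hello")
	fmt.Println(<-a, <-b)
}
```

In the multi-node version, `Publish` is triggered by a message arriving on a Redis subscription rather than by a local caller, which is what lets a message sent to one node reach clients on another.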
The whole system runs in Docker containers, which made it easy to simulate multi-node scenarios locally. The React frontend is intentionally simple — the complexity is in the backend coordination.
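A multi-node local setup like this is typically expressed as a Compose file. The sketch below is hypothetical (service names, ports, and the `REDIS_ADDR` variable are assumptions, not taken from the project) and just illustrates the shape: several chat replicas sharing one Redis, with the load balancer as the only published port.

```yaml
services:
  redis:
    image: redis:7
  chat:
    build: .
    environment:
      - REDIS_ADDR=redis:6379   # each replica fans out through the same Redis
    deploy:
      replicas: 3               # simulate multiple nodes locally
  lb:
    build: ./loadbalancer
    ports:
      - "8080:8080"             # clients only ever see the balancer
    depends_on:
      - chat
      - redis
```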
Security focus
The main concern is trust boundaries between services: which component gets to trust which messages, connections, and identities. Supporting measures include WebSocket session constraints, separated service responsibilities, and container isolation.
Key learnings
- WebSocket mechanics
- Scaling chat past one node
- Load balancing for connection-heavy traffic
- Redis pub/sub across instances
- Distributed design tradeoffs
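One of the learnings above, load balancing for connection-heavy traffic, comes down to a policy choice: because WebSocket connections are long-lived, round-robin can leave an unlucky node overloaded for hours, so least-connections is the usual fit. A minimal sketch of that selection step (names are illustrative, not from the project):

```go
package main

import "fmt"

// backend tracks one chat node and its count of open WebSockets.
type backend struct {
	addr  string
	conns int
}

// pickLeastConnections returns the backend with the fewest open
// connections, the core decision of a least-connections balancer.
func pickLeastConnections(backends []*backend) *backend {
	var best *backend
	for _, b := range backends {
		if best == nil || b.conns < best.conns {
			best = b
		}
	}
	return best
}

func main() {
	nodes := []*backend{
		{addr: "node-a:9000", conns: 12},
		{addr: "node-b:9000", conns: 3},
		{addr: "node-c:9000", conns: 7},
	}
	b := pickLeastConnections(nodes)
	b.conns++ // assign the new WebSocket to the chosen node
	fmt.Println(b.addr)
}
```

A production balancer would also need to decrement counts on disconnect and guard the slice with a mutex, but the selection logic stays this simple.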