How We Built a Real-Time Dashboard with WebSockets & React
One of our fintech clients needed a live analytics dashboard — real-time transaction monitoring, user activity streams, and instant KPI updates. No polling. No refresh buttons. Pure real-time.
Here's how we engineered it to handle 50,000 concurrent connections with sub-50ms latency.
The Requirements
- Real-time transaction feed (< 100ms delay)
- Live KPI counters (revenue, active users, error rate)
- Interactive charts that update in real time
- Role-based dashboards (admin, analyst, viewer)
- Handle 50K+ concurrent WebSocket connections
Architecture Overview
┌─────────────┐     ┌──────────────┐     ┌──────────────┐
│  React UI   │◄────│  WebSocket   │◄────│  Event Bus   │
│  (Next.js)  │     │   Gateway    │     │   (Redis)    │
└─────────────┘     └──────────────┘     └──────────────┘
                                                ▲
                                                │
                    ┌──────────────┐     ┌──────────────┐
                    │  PostgreSQL  │◄────│  API Layer   │
                    │ (TimescaleDB)│     │  (Node.js)   │
                    └──────────────┘     └──────────────┘
Key Technology Choices
| Layer | Technology | Why |
|---|---|---|
| Frontend | Next.js + React | Server-rendered shell, client-side real-time |
| WebSocket | Socket.io on Node.js | Automatic reconnection, room-based broadcasting |
| Message Bus | Redis Pub/Sub | Horizontal scaling across WS servers |
| Database | TimescaleDB | Time-series optimized PostgreSQL for analytics |
| Hosting | AWS ECS + ALB | Auto-scaling container orchestration |
The WebSocket Layer
The critical piece was the WebSocket gateway. We built it as a standalone Node.js service that could scale horizontally behind a load balancer.
import { createServer } from "http";
import { Server } from "socket.io";
import { createAdapter } from "@socket.io/redis-adapter";
import Redis from "ioredis";

// Two Redis connections: one to publish, one to subscribe
const pubClient = new Redis(process.env.REDIS_URL);
const subClient = pubClient.duplicate();

const server = createServer();
const io = new Server(server, {
  cors: { origin: process.env.CLIENT_URL },
  adapter: createAdapter(pubClient, subClient),
});

io.on("connection", (socket) => {
  const { role, orgId } = socket.handshake.auth;

  // Join organization-specific room
  socket.join(`org:${orgId}`);

  // Join role-specific room for filtered data
  socket.join(`org:${orgId}:${role}`);

  console.log(`User connected: ${socket.id} (${role})`);
});

server.listen(Number(process.env.PORT ?? 3000));
Redis Pub/Sub for Horizontal Scaling
With @socket.io/redis-adapter, all WebSocket server instances share state through Redis. When a new transaction comes in, it's published once and broadcast to all connected clients across all server instances.
Frontend: React + Custom Hooks
On the client side, we built custom hooks that abstract the WebSocket connection:
function useRealtimeKPI(metric: string) {
  const [value, setValue] = useState(0);
  const [trend, setTrend] = useState<"up" | "down" | "flat">("flat");
  const prev = useRef(0); // last value, read via a ref to avoid a stale closure

  useEffect(() => {
    const socket = getSocket();
    const handler = (data: { value: number }) => {
      setTrend(data.value > prev.current ? "up" : data.value < prev.current ? "down" : "flat");
      prev.current = data.value;
      setValue(data.value);
    };
    socket.on(`kpi:${metric}`, handler);
    // Remove only this listener, not every subscriber to the event
    return () => { socket.off(`kpi:${metric}`, handler); };
  }, [metric]);

  return { value, trend };
}
Performance Results
After load testing with 50K concurrent connections:
- Message latency: 12ms average (p99: 47ms)
- Memory per connection: ~2.4KB
- Server instances needed: 3 (auto-scaled)
- Zero dropped connections during 24-hour stress test
Lessons Learned
- Redis Pub/Sub is essential for horizontal WebSocket scaling
- Debounce UI updates — Don't re-render on every message; batch at 60fps
- Use rooms aggressively — Don't broadcast data users don't need
- TimescaleDB > regular PostgreSQL for time-series analytics queries
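The "batch at 60fps" lesson can be sketched as a small buffer that flushes at most once per animation frame. The scheduler is injectable (our design choice, for testability off the DOM); `createBatcher` is a hypothetical name, not code from the project:

```typescript
type Schedule = (cb: () => void) => void;

// Collects incoming messages and flushes them once per scheduled tick,
// instead of triggering a re-render for every WebSocket message.
export function createBatcher<T>(
  flush: (batch: T[]) => void,
  schedule: Schedule = (cb) => requestAnimationFrame(() => cb())
): (msg: T) => void {
  let buffer: T[] = [];
  let pending = false;

  return (msg: T) => {
    buffer.push(msg);
    if (!pending) {
      pending = true;
      schedule(() => {
        pending = false;
        const batch = buffer;
        buffer = [];
        flush(batch);
      });
    }
  };
}
```

Usage: `const enqueue = createBatcher(renderBatch); socket.on("transaction:new", enqueue);` — however fast messages arrive, React state updates land at most once per frame.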
Need a real-time dashboard for your business? Let's build it together.