#31 Concurrent Hotel Viewers
"X users viewing this right now." Hot keys, HLL.
When a user opens a hotel detail page, show "127 people are viewing this right now." The display refreshes as users arrive and leave. It sounds trivial until you realize that on a viral hotel during a holiday weekend, you have one entity (one hotel ID) that 100K users are pinging every few seconds. Every detail of this design flips when you account for the hot-key reality: TTL choice, push vs. poll, exactness, and where you keep state.

This canonical is sized for **5M concurrent active hotel pages globally** (production tier 3), with viral hotels reaching **100K concurrent viewers on a single ID**. The design leans heavily on three load-bearing tricks:

- **Edge fan-in**: per-PoP coalescing collapses writes from 280 PoPs into one batched origin call per heartbeat interval.
- **Hashtag-pinned sharded HyperLogLog**: 16–64 parallel HLLs per viral hotel, co-located in one Redis slot so they can be merged with an atomic PFMERGE.
- **A 2-bucket sliding window** on top of the HLL, which kills the 30-second roll-over cliff.

Push fan-out follows Pusher's published coalesce-and-broadcast pattern (≤100 subscribers: broadcast every change; >100: broadcast at most once per 5 s), layered over Discord-style sticky-hash gateway routing.
Reading: Heule/Nunkesser/Hall — HyperLogLog in Practice (HLL++, EDBT 2013) · Vattani et al. — Optimal Probabilistic Cache Stampede Prevention (VLDB 2015) · Pusher — How we built subscription counting at scale · Discord — How we scaled Elixir to 5M concurrent users · Cloudflare — Durable Objects: easy / fast / correct, choose three · Google SRE Workbook Ch. 22 — Cascading Failures
presence
TTL
hot keys
HLL
sharded counters
edge aggregation
push vs poll
coalesced broadcast
stale-while-revalidate