Infrastructure

DNS, TLS 1.3, load balancing, gateways, queues

Every web request passes through multiple infrastructure layers, each adding latency but providing essential services. Understanding these components helps you optimize the critical path and make informed architectural decisions.

The key insight: connection reuse and caching are your best friends. A new HTTPS connection can take around 50 ms to establish (DNS + TCP + TLS), while reusing an already-open connection adds only microseconds of setup overhead to each subsequent request. Similarly, a cache hit is typically 100–1000× faster than fetching from origin.
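
To see this amortization in practice, here is a minimal sketch that times repeated requests through a single Go http.Client, whose default transport keeps connections alive. The target URL https://example.com/ is a placeholder, not something from this course; expect the first request to be noticeably slower than the rest.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

func main() {
	// One http.Client shares a connection pool: the first request pays
	// DNS + TCP + TLS setup, later requests reuse the open connection.
	client := &http.Client{Timeout: 10 * time.Second}

	// "https://example.com/" is a placeholder endpoint.
	for i := 1; i <= 3; i++ {
		start := time.Now()
		resp, err := client.Get("https://example.com/")
		if err != nil {
			panic(err)
		}
		io.Copy(io.Discard, resp.Body) // drain the body so the connection returns to the pool
		resp.Body.Close()
		fmt.Printf("request %d took %v\n", i, time.Since(start))
	}
}
```

Draining and closing the body before the next request is what lets the transport put the connection back into the pool instead of opening a new one.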

Infrastructure Components

Typical latency ranges for each component in the request path (a per-phase measurement sketch follows the list):

DNS Lookup (10–100 ms): Cold misses dominate; keep TTLs low enough for agility and rely on resolver caching.
TLS Handshake (10–30 ms): Amortize via connection reuse; TLS 1.3 needs only a single round trip.
Load Balancer, L7 (0.5–2 ms): Keep routing policies simple; colocate with services.
API Gateway (5–15 ms): Auth and transforms live here; push slow logic down to services.
WAF (1–5 ms): Rules run inline on every request; tune out false positives.
Message Queue (1–10 ms): Broker hops add small latency; batch where possible.
Cache Hit (0.1–1 ms): Data locality is king; maximize hit rate.
