Full Request Path

Budgeted end-to-end request breakdowns


A typical web request travels through 8-10 infrastructure components before reaching your application. Each hop adds latency, but also provides opportunities for optimization. The total journey can range from 50 ms for cached content to 500 ms or more for dynamic, uncached responses on slow networks.

The biggest wins come from eliminating entire steps: cache at the edge to skip the origin, reuse connections to skip handshakes, and compress responses to reduce transfer time. Every millisecond saved directly improves user experience and conversion rates.

Full Request Path Analysis

Breakdown of a typical HTTPS request from browser to database and back

DNS Resolution (20 ms)
If cached, this is sub‑millisecond; misses dominate cold paths.
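
A quick way to see the gap between a cold and a warm lookup is to time the resolver directly. This is a minimal sketch using Python's standard library; the hostname is illustrative, and whether the second call is served from a local cache depends on the OS and resolver configuration.

```python
import socket
import time

def time_lookup(host: str) -> float:
    """Return the wall-clock time of one address lookup, in milliseconds."""
    start = time.perf_counter()
    socket.getaddrinfo(host, 443)   # full resolver path on a cold cache
    return (time.perf_counter() - start) * 1000

host = "example.com"                # illustrative hostname
cold = time_lookup(host)            # may hit the network (tens of ms)
warm = time_lookup(host)            # often answered from a local cache
print(f"cold: {cold:.2f} ms, warm: {warm:.2f} ms")
```
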
TLS Handshake (30 ms)
TLS 1.3 is 1‑RTT; connection reuse removes this entirely.
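
Connection reuse is easy to observe from the client side: the first request pays for the TCP and TLS handshakes, while later requests on the same keep-alive connection skip both. A rough sketch using the third-party requests library; the URL is illustrative and the measured times also include server processing.

```python
import time
import requests

url = "https://example.com/"        # illustrative endpoint

with requests.Session() as session:  # one pooled, keep-alive connection
    for label in ("first (TCP + TLS handshake)", "second (reused connection)"):
        start = time.perf_counter()
        session.get(url)
        elapsed = (time.perf_counter() - start) * 1000
        print(f"{label}: {elapsed:.1f} ms")
```
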
CDN Edge (10 ms)
Edge POP localizes RTT; prioritize cacheable responses.
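
The edge can only skip the origin when responses declare themselves cacheable. A minimal, framework-free sketch of an origin handler that marks a response as cacheable by shared caches; the max-age values are illustrative, not a recommendation for any particular CDN.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class CacheableHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello from origin"
        self.send_response(200)
        # Shared caches (CDN edges) may keep this for 10 minutes,
        # while browsers revalidate after 60 seconds.
        self.send_header("Cache-Control", "public, max-age=60, s-maxage=600")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), CacheableHandler).serve_forever()
```
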
Load Balancer + WAF (3 ms)
Inline controls add small but consistent overhead.
API Gateway (10 ms)
Auth, routing, and transformations live here; keep rules lean.
Auth Service (2 ms)
JWT verification is sub‑millisecond with local keys.
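
With the signing key held locally, verification is pure CPU work and needs no network round trip. A sketch using the PyJWT package with a symmetric key; the secret, claims, and issuer are illustrative.

```python
import time
import jwt                                   # PyJWT

secret = "local-signing-key"                 # illustrative shared secret
token = jwt.encode({"sub": "user-123", "iss": "auth-service"},
                   secret, algorithm="HS256")

start = time.perf_counter()
claims = jwt.decode(token, secret, algorithms=["HS256"], issuer="auth-service")
elapsed_ms = (time.perf_counter() - start) * 1000

print(claims["sub"], f"verified in {elapsed_ms:.3f} ms")  # typically well under 1 ms
```
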
Application + DB (15 ms)
Indexing and hot read paths keep this bounded.
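
Keeping the hot read path on an index is what holds this step to a few milliseconds. A self-contained SQLite sketch that compares the query plan before and after adding an index; the schema and row counts are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (user_id, total) VALUES (?, ?)",
                 [(i % 1000, i * 1.5) for i in range(100_000)])

query = "SELECT total FROM orders WHERE user_id = ?"

# Without an index: the planner falls back to scanning the whole table.
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", (42,)).fetchall())

conn.execute("CREATE INDEX idx_orders_user_id ON orders (user_id)")

# With the index: the same query becomes a narrow index lookup.
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", (42,)).fetchall())
```
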
Response, 1 MB via 4G (200 ms)
Dominated by bandwidth and RTT (BDP). Compress aggressively.
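
Most of the 200 ms is simple arithmetic: bytes on the wire divided by bandwidth, plus at least one round trip. A back-of-envelope sketch; the 50 Mbit/s throughput and 50 ms RTT are assumed 4G figures, slow start is ignored, and the repetitive sample payload compresses better than typical production JSON.

```python
import gzip, json

RTT_MS = 50                   # assumed 4G round-trip time
BANDWIDTH_BPS = 50_000_000    # assumed good 4G throughput, ~50 Mbit/s

def transfer_ms(size_bytes: int) -> float:
    """Crude estimate: serialization delay plus one round trip (slow start ignored)."""
    return size_bytes * 8 / BANDWIDTH_BPS * 1000 + RTT_MS

print(f"1 MB raw: ~{transfer_ms(1_000_000):.0f} ms")  # roughly the 200 ms budgeted above

# Compression shrinks the bytes on the wire, and therefore the transfer time.
payload = json.dumps([{"id": i, "status": "ok"} for i in range(30_000)]).encode()
packed = gzip.compress(payload)
print(f"sample JSON: {len(payload):,} B -> {len(packed):,} B gzipped, "
      f"~{transfer_ms(len(payload)):.0f} ms -> ~{transfer_ms(len(packed)):.0f} ms")
```
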
Optimization Opportunities
  • Connection pooling: Save ~30 ms
  • HTTP/2 multiplexing: Save ~20 ms
  • Edge caching (95% hit): Save ~100 ms
  • Response compression: Save ~120 ms (60–80% smaller)
  • DB connection pool: Save 5–10 ms
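
Stacking those claimed savings against the budget above is simple arithmetic, and it also shows why they cannot just be added: treated as independent they nearly cancel the entire budget, so in practice they overlap (edge caching, for instance, already avoids most of the transfer that compression would shrink). A sketch of that calculation, using only the figures quoted above.

```python
# Budget from the breakdown above, in milliseconds.
budget = {
    "DNS": 20, "TLS": 30, "CDN edge": 10, "LB + WAF": 3,
    "API gateway": 10, "Auth": 2, "App + DB": 15, "Transfer (1 MB, 4G)": 200,
}

# Claimed savings, naively treated as independent.
savings = {
    "Connection pooling": 30,
    "HTTP/2 multiplexing": 20,
    "Edge caching (95% hit)": 100,
    "Response compression": 120,
    "DB connection pool": 7,   # within the quoted 5-10 ms range
}

baseline = sum(budget.values())            # 290 ms
naive = baseline - sum(savings.values())   # ~13 ms: clearly double-counted
print(f"baseline: {baseline} ms, naive savings: {sum(savings.values())} ms, "
      f"'optimized': {naive} ms (a lower bound, not a realistic target)")
```
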
Latency by Category
  • Networking & TLS: 50 ms
  • Edge & Load Balancing: 13 ms
  • App + DB: 25 ms
  • Payload Transfer: 200 ms
