HTTP Pipelining & Multiplexing
Discussions focus on the limitations of HTTP/1.1 pipelining and concurrent request handling, with comparisons to HTTP/2 multiplexing, which carries many requests efficiently over a single connection.
Sample Comments
Or use HTTP/2, which should make the number of requests largely irrelevant.
With HTTP/1.1, pipelining is so unreliably supported that in practice you can't start sending the second request until the first response is complete. As such, you can't have multiple requests out at the same time; it's very much linear.
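To make the ordering constraint concrete, here is a minimal sketch of HTTP/1.1 pipelining over a raw socket in Python. The host example.com and the request paths are placeholders, and the assumption that the server still honors pipelined requests often does not hold; many servers and intermediaries process them sequentially or simply close the connection.

```python
import socket

HOST = "example.com"  # hypothetical server; substitute one you control

with socket.create_connection((HOST, 80), timeout=10) as sock:
    # Send both requests back-to-back without waiting for the first response.
    pipelined = (
        f"GET / HTTP/1.1\r\nHost: {HOST}\r\n\r\n"
        f"GET /about HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n"
    )
    sock.sendall(pipelined.encode("ascii"))

    # Responses must come back in request order, so a slow first response
    # blocks the second one behind it (head-of-line blocking).
    raw = b""
    while chunk := sock.recv(4096):
        raw += chunk

print(raw.decode("latin-1", errors="replace")[:500])
```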
If it's all the same server then it can be batched up. Also, don't the newfangled HTTP/2 protocols and the like solve this, at least partially?
That seems mostly like the question of "Does this HTTP framework support HTTP pipelining?" While I don't know the answer, it doesn't seem highly relevant. Most clients moved away from pipelining, since follow-up requests on the same connection are subject to unknown latency (stuck behind the first request) and a connection failure can impact all of those requests. The better approach is to use either more connections, or proper request multiplexing via HTTP/2 or newer protocols.
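As a sketch of that multiplexing approach, the snippet below issues several requests concurrently over a single HTTP/2 connection using the third-party httpx library (installed with pip install 'httpx[http2]'). The URLs are placeholders, and the server is assumed to negotiate HTTP/2 via ALPN.

```python
import asyncio
import httpx

async def main() -> None:
    # Hypothetical URLs on one origin; with HTTP/2 they are multiplexed as
    # independent streams over a single TCP+TLS connection instead of being
    # queued behind one another.
    urls = [f"https://example.com/item/{i}" for i in range(5)]
    async with httpx.AsyncClient(http2=True) as client:
        responses = await asyncio.gather(*(client.get(url) for url in urls))
    for resp in responses:
        print(resp.http_version, resp.status_code, resp.url)

asyncio.run(main())
```

Without http2=True the same code falls back to pooled HTTP/1.1 connections, with one in-flight request per connection.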
This has been solved for 10+ years. Properly configure your load balancer with HTTP keep-alive and pipelining.
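The load-balancer configuration itself depends on the proxy in use, but the keep-alive half of that advice can be sketched from the client side with the requests library, as below. Note that requests (via urllib3) reuses connections yet does not pipeline, and the URL is a placeholder.

```python
import requests

with requests.Session() as session:
    # Sequential requests on one Session reuse the same pooled TCP connection
    # (HTTP keep-alive), avoiding a fresh handshake per request. There is no
    # pipelining here: each request waits for the previous response.
    for path in ("/a", "/b", "/c"):  # hypothetical paths
        resp = session.get(f"https://example.com{path}", timeout=10)
        print(resp.status_code, resp.headers.get("Connection"))
```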
If I understand your question correctly, I would say that HTTP pipelining solves that issue. It can only be used safely for idempotent requests such as GET, so there are limitations.
Probably for HTTP Pipelining: https://en.wikipedia.org/wiki/HTTP_pipelining
That's solved in HTTP/2 and other connection-multiplexing protocols.
With HTTP/2 support this is no longer really a problem, since HTTP/2 multiplexes requests over one connection.
Why not just make multiple HTTP/2 connections then?