Nginx
waltaki, 2021-01-21 11:22:11

How to give one response to parallel requests?

There is a location for /api/info/ requests configured like this:

location /api/info/ {
    limit_req zone=info burst=45;
    try_files $uri @webapi;

    proxy_cache_key $scheme$proxy_host$request_method$request_uri;
    proxy_cache proxy_cache;
    proxy_cache_valid 2s;
    proxy_cache_lock on;
    proxy_cache_use_stale error timeout updating;

    expires 2s;
}


If you make several parallel requests from one client, the following picture emerges:
the first request receives a response from the backend after 2 s;
the second after 4 s;
and so on, until the timeout is reached.

It turns out that just a couple of open tabs is enough to hit the timeout, and the wait on a single tab is too long.

Is it possible to make it so that after 2 s the backend's response is given to all parallel connections waiting for it, while requests from one client to the proxied server are still made no more than once every 2 s?


1 answer
ky0, 2021-01-21
@waltaki

Is there a nodelay in your limit_req directive, so that requests within the burst are processed immediately rather than being "smeared out" over the rate interval?
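
If I read the suggestion right, that would look something like the sketch below. The zone definition is an assumption (it is not shown in the question), and the rate is illustrative; the only substantive change to the questioner's config is the nodelay flag:

```nginx
# Assumed zone definition (lives in the http{} block); the original
# "info" zone's rate is not shown in the question, 1r/s is a guess.
limit_req_zone $binary_remote_addr zone=info rate=1r/s;

location /api/info/ {
    # nodelay: requests within the burst are served immediately
    # instead of being delayed to fit the configured rate.
    limit_req zone=info burst=45 nodelay;
    try_files $uri @webapi;

    proxy_cache_key $scheme$proxy_host$request_method$request_uri;
    proxy_cache proxy_cache;
    proxy_cache_valid 2s;
    proxy_cache_lock on;
    proxy_cache_use_stale error timeout updating;

    expires 2s;
}
```

With nodelay, the burst of parallel requests is passed through at once; proxy_cache_lock then lets only one of them hit the backend, and the rest are served from the cache entry it populates.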
