Nginx
Mike Goodwin, 2019-02-19 17:51:04

When proxying requests through 2 layers of nginx, is it possible to send the response directly to the user?

I have set up a system that proxies requests through two levels of nginx.
The first level acts as a balancer for the other two:
upstream balancer {
    server nginx0:444;
    server nginx1:445;
}

server {
    listen 80 default_server;
    server_name test.server.name;

    if ($host = test.server.name) {
        return 301 https://$host$request_uri;
    }

    return 404;
}

server {
    listen 443 ssl http2;
    server_name www.test.server.name;

    root /var/www/test.server.name;
    index index.html;

    location / {
        proxy_pass https://balancer;
        proxy_set_header Host $http_host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header realip $remote_addr;
    }

    ssl_certificate /etc/letsencrypt/live/fullchain.pem;      # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/privkey.pem;    # managed by Certbot
}
The second level handles caching and consists of two nginx containers running in Docker:
upstream apibackend {
    server back1;
    server back2;
}

server {
    listen 444 ssl http2;
    listen 445 ssl http2;
    server_name www.test.server.name;

    root /var/www/test.server.name;
    index index.html;

    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Real-IP $remote_addr;

    proxy_cache name_cache;
    proxy_cache_revalidate on;
    proxy_cache_use_stale updating;
    proxy_cache_background_update on;
    proxy_cache_valid any 5m;

    location /api/ {
        # proxy_pass requires a scheme; assuming plain HTTP to the backends here
        proxy_pass http://apibackend/api/;
    }
}
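For completeness, nginx requires the name_cache zone referenced above to be declared with proxy_cache_path in the http context; the snippet presumably omits it. A minimal sketch of such a declaration (the path, zone size, and limits are placeholders):

    proxy_cache_path /var/cache/nginx/name_cache levels=1:2 keys_zone=name_cache:10m
                     max_size=1g inactive=10m use_temp_path=off;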
The question is this: judging by the responses, once a request reaches the backend or the second-level nginx cache, the response travels back along the same path recorded in X-Forwarded-For, which makes the first-level nginx a bottleneck and keeps me from reaching the desired RPS. How can I make the second-level nginx cache return the response directly to the user? Any other suggestions are welcome. Thank you.


1 answer
Vladimir, 2019-02-19
@Mike_GoodWin

Instead of the first nginx, you can install an LVS (Linux Virtual Server) balancer in direct routing mode, with Keepalived or a similar tool to monitor the backends.
In direct routing mode the client's request goes through the balancer to the backend, while the backend's response goes directly to the client, bypassing the balancer.
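A minimal sketch of what the IPVS part of such a setup could look like in keepalived.conf (the virtual IP 203.0.113.10 and the real-server addresses are placeholders; each backend must also have the VIP assigned to a loopback or dummy interface with ARP replies for it suppressed, so it accepts packets addressed to the VIP):

    virtual_server 203.0.113.10 443 {
        delay_loop 6           # health-check interval, seconds
        lb_algo rr             # round-robin scheduling
        lb_kind DR             # direct routing: replies bypass the balancer
        protocol TCP

        real_server 10.0.0.11 443 {
            weight 1
            TCP_CHECK {
                connect_timeout 3
            }
        }

        real_server 10.0.0.12 443 {
            weight 1
            TCP_CHECK {
                connect_timeout 3
            }
        }
    }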
