Load balancer: what is the best way to set it up, and where should Varnish go?
Hello, I'm drawing a server diagram for a test setup and am now deciding where to put Varnish.
This scheme used to work well:
USERS <-> ( [VARNISH:80]{NGINX:8080}[APC]php-fpm:9000{memcached}[DB] )
Something like that...
But the load grew along with the number of users, so I decided to insert a balancer and split everything across 3-4 servers plus a dedicated database server.
So the question is how to arrange the cache and the web server behind the balancer.
1. USERS <-> Nginx LBS <-> (Varnish|Webserver1) <-> [Webserver2] <-> [DB]
2. USERS <-> Nginx LBS <-> (Varnish1|Webserver1) <-> (Varnish2| Webserver2) <-> [DB]
3. USERS <-> Nginx LBS <-> (Varnish) <-> [Webserver1] <-> [Webserver2] <-> [DB]
The balancer could be set up to distribute users evenly and pin each of them to a single server,
with something like:
upstream varnish {
    server varnish:6081;            # Varnish instance (host name assumed)
}

upstream webserver {
    ip_hash;                        # pin each client to one backend
    server webserver1:8080;
    server webserver2:8080;
}

location / {
    proxy_pass http://varnish;
    proxy_intercept_errors on;
    # on a 4xx/5xx from the cache, fall back to the web servers directly
    error_page 400 403 404 500 502 503 504 = @webserver;
}

location @webserver {
    proxy_pass http://webserver;
}
Or should Varnish go in front, with Nginx behind it balancing across the two servers...
Or do it like the very first option and just move the database out?
What do people think?
                        - ({NGINX:80}[APC]php-fpm:9000{memcached}) -
USERS <-> [VARNISH] <-> {NGINX LBS}
                        - ({NGINX:80}[APC]php-fpm:9000{memcached}) -

                      - ([VARNISH:80]{NGINX:8080}[APC]php-fpm:9000{memcached}) -
USERS <-> {NGINX LBS}
                      - ([VARNISH:80]{NGINX:8080}[APC]php-fpm:9000{memcached}) -
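For the second layout, the front Nginx balancer would proxy to the per-node Varnish instances. A minimal sketch of that upstream, assuming host names node1/node2 and reusing the ip_hash idea from above to keep each user on one node:

```nginx
# Front balancer: spread users across the per-node Varnish:80 instances
upstream varnish_nodes {
    ip_hash;                 # keep each user on a single node
    server node1:80;         # Varnish on web server 1 (host names assumed)
    server node2:80;         # Varnish on web server 2
}

server {
    listen 80;
    location / {
        proxy_pass http://varnish_nodes;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

One caveat with this layout: each node's Varnish builds its own cache, so the same page may end up cached twice; the first layout avoids that at the cost of a single cache box.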
Maybe it's worth starting with how caching is configured?
In shops, as a rule, the main dynamic parts are search and ordering. The latter happens relatively rarely and does not create much load. Search should be optimized on the application side. All other pages, including product description pages, are best regenerated only when the information on them is added or changed, and then served statically.
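That "regenerate on change, then serve statically" idea can be sketched in Nginx; the cache directory name and the single-entry-point PHP fallback are assumptions:

```nginx
# Serve a pre-generated copy of the page if one exists,
# otherwise fall back to the application (which can write the copy out).
location / {
    try_files /static_cache$uri/index.html $uri @app;
}

location @app {
    fastcgi_pass 127.0.0.1:9000;   # php-fpm, as in the scheme above
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root/index.php;
}
```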
It's a shop with a bunch of users on one server with 16 GB RAM and a 512M memory_limit, and a 10k-product database. I want a bit of headroom to start with and to spread people across servers; that seems more interesting than just cramming in more memory.
I read up on Redis earlier; OK, going to install it.
Products are added every day, and more people come in every day as well.
Hmm, Linode... it's still cheaper for me to get another small server and set it up myself.
thanks
Why do you need Varnish?
nbonvin.wordpress.com/2011/03/24/serving-small-static-files-which-server-to-use/
Update 2 (Apr 14, 2011)
@VBart
Varnish Cache 3.0.2
Submitted by tfheen on Wed, 2011-10-26 16:08
How about the option:
Users - Varnish+Nginx LB - [WebServer1..N] - Memcached - DB
Varnish will accept connections from users on port 80, serve from its cache, and forward everything else to localhost:8080, where Nginx sits and distributes traffic onward to the nodes.
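The Nginx half of that box might look like the sketch below (host names are assumptions); Varnish itself would listen on :80 (e.g. started with `-a :80`) with 127.0.0.1:8080 as its backend:

```nginx
# Nginx sits behind Varnish on the same machine, bound to localhost only,
# and spreads the cache misses across the application nodes.
upstream app_nodes {
    server webserver1:80;
    server webserver2:80;
}

server {
    listen 127.0.0.1:8080;
    location / {
        proxy_pass http://app_nodes;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```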
Good luck!
I think it would be better to start with this:
1. USERS <-> Nginx LBS <-> (Varnish|Webserver1) <-> [Webserver2] <-> [DB]