Is cache invalidation a problem in Nginx?
The question arose of caching the site - three options are offered:
I just finished building a caching system for a high-load Django + nginx project and ran into no invalidation problems at all. The stack is memcached plus nginx microcaching: nginx serves rendered pages straight out of memcached, and the Django backend is only hit on a cache miss, at which point it renders the page and writes it back to memcached.
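A minimal sketch of that bundle in nginx configuration terms (the upstream name, socket path, and cache-key format are my assumptions, not taken from the project described above):

```nginx
# Try memcached first; fall back to the Django backend on a miss.
location / {
    set $memcached_key "$scheme$host$request_uri";  # must match the key the backend writes
    memcached_pass 127.0.0.1:11211;
    default_type  text/html;
    error_page 404 502 504 = @django;               # miss or memcached down -> backend
}

location @django {
    include uwsgi_params;
    uwsgi_pass unix:/run/project/uwsgi.sock;        # hypothetical socket path
}
```

The backend then stores each rendered page in memcached under the same `$scheme$host$request_uri` key, so the next request never reaches Django.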
In its original form, this bundle has only one problem: competing requests. If several clients request the same "cold" page at once, every one of those requests hits the backend (easy to reproduce with ab -n 1000 -c 100). Fortunately, nginx prevents this with microcaching (uwsgi_cache_valid any 1s; uwsgi_cache_use_stale updating;): only the very first request reaches the backend, concurrent requests get the result from the file cache, and subsequent ones from memcached. It is not the most elegant setup, since it adds an extra (file) caching layer, but it works well. On real data we achieved a 1200x speedup and effectively unlimited horizontal scaling (by placing additional nginx servers in microcaching mode between the clients and the main server).
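The microcache layer described above can be sketched like this (zone name, sizes, and paths are illustrative assumptions; the two directives quoted in the answer are the essential part):

```nginx
# 1-second file microcache in front of the uwsgi backend.
uwsgi_cache_path /var/cache/nginx/micro levels=1:2
                 keys_zone=micro:10m max_size=100m inactive=1m;

location @django {
    include uwsgi_params;
    uwsgi_pass unix:/run/project/uwsgi.sock;        # hypothetical socket path

    uwsgi_cache       micro;
    uwsgi_cache_key   "$scheme$host$request_uri";
    uwsgi_cache_valid any 1s;        # every response is cacheable for 1 second
    uwsgi_cache_use_stale updating;  # concurrent requests get the stale copy
                                     # while one request refreshes the cache
}
```

With `uwsgi_cache_use_stale updating`, only one request per key is ever in flight to the backend; everyone else is served the (at most one second old) cached file.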
NGINX has fastcgi_cache_purge, which is only available with the commercial subscription. Otherwise, write a script, or use a plugin for your CMS, that pings NGINX to clear the cache whenever a page on the site changes. Be sure to exclude that purge endpoint from caching, and don't forget to restrict access to it!
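One way to write such a script without the commercial module is to delete cache files directly: nginx names each cache file after the MD5 hex digest of the cache key and, with the common `levels=1:2` layout, nests it in directories taken from the tail of that digest. A sketch (the cache directory and the key format are assumptions and must match your `fastcgi_cache_path` and `fastcgi_cache_key` settings exactly):

```python
import hashlib
import os


def nginx_cache_path(cache_dir: str, key: str, levels: str = "1:2") -> str:
    """Map a cache-key string to its on-disk nginx cache file.

    nginx stores each entry in a file named after md5(key), nested under
    subdirectories cut from the end of the digest according to `levels`
    (e.g. levels=1:2 -> <dir>/<last 1 char>/<previous 2 chars>/<md5>).
    """
    h = hashlib.md5(key.encode()).hexdigest()
    parts = []
    pos = len(h)
    for width in (int(n) for n in levels.split(":")):
        parts.append(h[pos - width:pos])
        pos -= width
    return os.path.join(cache_dir, *parts, h)


def purge(cache_dir: str, key: str) -> bool:
    """Delete the cached entry for `key`; True if a file was removed."""
    try:
        os.remove(nginx_cache_path(cache_dir, key))
        return True
    except FileNotFoundError:
        return False
```

A CMS save-hook can then call `purge("/var/cache/nginx", key)` for the changed page's key; nginx simply re-fetches from the backend on the next request.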