After updating a JS script via git, the file changes on the server, but users get the new version only after a random delay. Where should I dig?
The modified script reaches users after a random delay: sometimes within 5 minutes, but in most cases only after 5-8 hours.
We tried disabling caching completely, then set it to 30 seconds, but the problem remains. We no longer know where to dig or what to blame.
Response headers seen in the browser:
Cache-Control: max-age=30, public
Connection: keep-alive
Content-Encoding: gzip
Content-Type: application/x-javascript
Date: Thu, 27 Feb 2020 07:13:24 GMT
ETag: W/"5e538d92-8105"
Expires: Thu, 27 Feb 2020 07:13:54 GMT
Last-Modified: Mon, 24 Feb 2020 08:47:14 GMT
Server: nginx/1.12.2
Transfer-Encoding: chunked
The nginx config is now:
location ~* \.js$ {
    expires 30;
    add_header Cache-Control public;
}
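(For comparison, a minimal sketch, not the asker's actual config: a location block that makes browsers revalidate on every request instead of caching for 30 seconds; thanks to ETag/Last-Modified the server can still answer with a cheap 304 while the file is unchanged.)
location ~* \.js$ {
    # "no-cache" means: store the file, but revalidate it before each use;
    # the browser gets 304 Not Modified until the file actually changes
    add_header Cache-Control "no-cache";
    etag on;    # on by default in nginx >= 1.3.3
}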
It's worth digging into cache invalidation: change the URL of the .js file for every new version.
Instead of script.js, use some kind of versioning,
for example script.vXXX.js, or a hash of the file contents: script.MD5(contents).js = script.adadadasdadadsd.js. Then whenever the script changes, its URL changes as well.
This approach is better than all the others, because it gives you proper invalidation of the CSS/JS cache whenever a file changes, while still letting the client cache aggressively (you can safely set the expiry to a year right away).
You don't need to go far for an example: open the source code of this page and see this approach in action:
https://qna.habr.com/frontend.5e15b533-1a38-4c3c-b...
https://qna.habr.com/frontend.5e15b533-1a38-4c3c-b...
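A minimal sketch of the nginx side of this scheme (the location pattern is an assumption that matches a hash-like segment in the file name, not a config from this site): once the name contains a content hash, the file at that URL never changes, so it can be cached for a long time, and each new build simply produces a new URL.
location ~* "\.[0-9a-z]{8,}\.(js|css)$" {
    # a file whose name embeds a content hash never changes under that URL,
    # so the browser may cache it for a year; a new build yields a new name
    expires 1y;
    add_header Cache-Control "public, immutable";
}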
Dig into how Bitrix works. If you are using git to deploy to production, you probably don't know Bitrix development very well; I'd guess there are more crutches in your code than actual customization.
Enable script merging in Bitrix. It will output a link with a hash, so every change produces a new script URL for the browser, which is then cached anew, including on the server side (note that the new version is served starting from the second request: on the first request after a change, the previously cached version is served while the new one is being cached). In addition, this reduces the amount of data transferred.