How to optimize js loading for multipage sites?
The initial conditions are roughly as follows: there is a multi-page site, let's say of the "portal" type, but not very large. The backend is traditionally written in PHP. On the frontend we use jQuery with all sorts of third-party and home-grown plugins to improve the user interface - lightboxes, calendars, tooltips and the rest of the fairly standard zoo.
Not so long ago we decided to introduce a modular approach: asynchronous loading with require.js and separation of backend and frontend. The implementation went well, but now the question is how to optimize the loading of all this JavaScript.
The most popular solution for require.js, offered everywhere, is to use r.js to glue everything into one file and serve that in production. But excuse me, friends, why did we introduce all this modularity then? Just for convenience during development?
In total we use about 300-400 KB of various JavaScript on the site, and its use is completely uneven across sections. Packing it all into one file and serving it to every visitor does not seem quite right to me. The thing is, our average user may never reach the page or section where we use the datepicker, the photo uploader or some other specific script. So why should we burden him with these gifts at the first meeting, force him to download all 400 KB of scripts, and then make his browser parse all that code?
For now we are using an intermediate solution we are still not very happy with. On the first visit we give the user the minimum set of necessary scripts (100-150 KB), and pull in the rest as require.js modules. As a result, on pages with many interactive elements, 5-7 more JavaScript files are requested and downloaded on top of the basic set. From an optimization standpoint this doesn't look like a very pretty solution either.
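For reference, a minimal sketch of the kind of per-page loading described above (the module names and selectors are hypothetical):

```javascript
// Hypothetical page entry point. The base bundle (jQuery, core UI
// helpers) ships with every page; heavier widgets are requested
// only where they are actually used.
require(['jquery'], function ($) {
    // Only pages that contain an upload form pull in the uploader.
    if ($('#photo-upload').length) {
        require(['widgets/uploader'], function (uploader) {
            uploader.init('#photo-upload');
        });
    }
});
```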
We are probably not the first to face this issue: our site is not unique, and neither is the problem.
What is more profitable to do in terms of proper client-side optimization:
1. Pack all scripts into one file, give it away immediately and not worry that most of these scripts will not be in demand by many site visitors?
2. Try to somehow share the scripts. Give only what the visitor needs on a specific page of the site?
If we go the second way, what do we do about optimization then? What if a page requires loading a dozen small JS modules?
Minify everything and serve it gzipped with correct HTTP cache settings. Once loaded, the file will not be fetched again on the other pages.
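A rough sketch of what those settings can look like, shown here in Node/Express terms purely as an illustration (the same headers can be set from PHP or the web server itself):

```javascript
var express = require('express');
var compression = require('compression');

var app = express();
app.use(compression()); // gzip responses on the fly
// Far-future Cache-Control: the bundle is downloaded once and
// then served from the browser cache on every other page.
app.use('/js', express.static('public/js', { maxAge: '365d' }));
app.listen(8080);
```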
Handle the inclusion of JavaScript files in JavaScript itself.
For example: the frontend received jquery and jquery ui and recorded them in its list of loaded scripts. When the server later suggests loading jquery and JQueryCookie, the JavaScript checks that list and ends up requesting only JQueryCookie.
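A minimal sketch of that idea (names and URLs are illustrative; persisting the list across page loads would additionally need a cookie or localStorage):

```javascript
// Registry of scripts already requested on this page.
var loadedScripts = {};

function loadOnce(name, url) {
    if (loadedScripts[name]) {
        return; // already loaded, skip the request
    }
    loadedScripts[name] = true;

    var script = document.createElement('script');
    script.src = url;
    script.async = true;
    document.head.appendChild(script);
}

// The server emits the list of scripts the page wants;
// the client filters out what it already has.
['jquery', 'jquery.cookie'].forEach(function (name) {
    loadOnce(name, '/js/' + name + '.js');
});
```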
It seems to me that the issue with a one-time download of scripts is not about weight at all. What is 300-400 KB these days? Nonsense... (upd: for mobile devices it may not be nonsense, which does not change the approach as a whole)
The issue is synchronous execution: the HTML parser waits for scripts to be loaded and interpreted, and all that synchronous code collected in one single file may cause the browser to freeze, sometimes for a noticeably long time.
Again, in my opinion, it is better to spend an extra request on a heavy script that can slow down page construction than to force the user to wait for the document to respond. And load it asynchronously whenever possible.
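A small sketch of such asynchronous loading (the URL is illustrative): the heavy script is injected only after the page has rendered, so the HTML parser is never blocked by it:

```javascript
window.addEventListener('load', function () {
    var s = document.createElement('script');
    s.src = '/js/heavy-widget.js';
    s.async = true; // do not block anything while it downloads
    document.head.appendChild(s);
});
```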
Thanks to the discussion in the comments and some more research, I came up with the following. First, the issue of optimization should be divided into 2 points and each of them should be discussed separately.
1. Downloading scripts to the user's device. Here the option unanimously approved by everyone is to combine all the scripts into one file, minify it, compress it, and let the user's browser cache it on the first visit.
2. Execution of scripts on the user's device. This is more difficult. First of all, it happens every time a page is opened, regardless of whether the script comes from the cache or not. If you did not perform any optimization but simply combined all your scripts into one file, then immediately after loading it (or taking it from the cache) the user's browser will start parsing and executing it. This takes time, and during this time the browser "freezes". Depending on the power of the device (desktop or phone) and the size of your combined JavaScript, this can take anywhere from fractions of a second to several seconds. And even a freeze of 0.2-0.3 seconds is quite noticeable to the user.
Useful links on this topic:
calendar.perfplanet.com/2011/lazy-evaluation-of-co...
habrahabr.ru/post/145269
The question remains how to combine the first and second, i.e. give the user all the JS in one large file, but force the browser to execute only the code that is needed on the current page and skip the rest. There is a solution to this! It is called delayed (lazy) JavaScript evaluation. That is, if you combine all the code into one file and there is a lot of it, make sure that only the code needed on the current page actually runs.
There are plenty of solutions for this on the Internet, but since we use require.js, I looked at how it is solved in this library. Starting with version 2, require.js is able to defer JavaScript execution and does it automatically for all modules declared via define. It is important to know that for scripts written in non-AMD format and connected via shim (for example, jQuery plugins) deferred execution does not work: they are executed immediately upon loading. Unfortunately, we have quite a few such plugins and will have to do something with them, for example wrap them in require.js modules, as sketched below. But that does not look like a big problem.
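A minimal sketch of such a wrapper (the plugin name is hypothetical): inside define, the plugin body becomes a module factory, so require.js runs it only when some module actually requires it, instead of immediately on load as with shim:

```javascript
define('jquery.fancybox', ['jquery'], function ($) {
    // The original plugin body goes here instead of living in a
    // shim-loaded file that executes as soon as it is downloaded.
    $.fn.fancybox = function (options) {
        // ... original plugin implementation ...
        return this;
    };
    return $;
});
```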
So the short answer to the question is: pack all the scripts into one file, but be sure to take care of delayed script execution, so as not to load the browser with unnecessary work and the user with delays from executing unneeded JavaScript.
If you are so worried about scripts being constantly pulled from the server, put them in client-side storage using a tool built for that purpose.
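A rough sketch of what such a tool does under the hood (simplified, with no versioning or quota handling): the script text is kept in localStorage and executed from there on later visits instead of being re-downloaded:

```javascript
function loadCached(url) {
    var cached = localStorage.getItem(url);
    if (cached !== null) {
        (new Function(cached))(); // run the stored source
        return;
    }
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, true);
    xhr.onload = function () {
        if (xhr.status === 200) {
            localStorage.setItem(url, xhr.responseText);
            (new Function(xhr.responseText))();
        }
    };
    xhr.send();
}
```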
But in general this is overkill at your volumes. There is nothing wrong with some of the scripts not being executed at the moment, even if the user never reaches them: they do not load the processor and do not cause any slowdown. What do you care whether the user got to your datepicker? What difference does it make to the user experience whether he received the datepicker code immediately or only when he opened the page where it is used?
There is another opinion: treat the site as an application (like on smartphones, for example). What is the beauty of this approach? When the application starts, its entire shell is loaded at once, and after that only bare data travels over the wire, which gives unprecedented response speed and responsiveness. You won't find this on regular websites. In such a setup, not only scripts and styles but also all the site graphics would have to be preloaded and stored on the client somehow.
Compressing through r.js is the only correct solution; even a script weighing 1 MB is a trifle compared to the total weight of an average page, images included. And given that the script is loaded once and then cached by the browser, there is absolutely no reason to worry about the extra weight.
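For completeness, a typical r.js build profile (paths and module names are illustrative), run with node r.js -o build.js: everything reachable from the main module is traced, concatenated and minified into a single file:

```javascript
({
    baseUrl: 'js',
    name: 'main',
    mainConfigFile: 'js/main.js',
    out: 'build/main.min.js',
    optimize: 'uglify'
})
```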