Rinor, 2019-05-29 22:10:48

What is the best way to include third-party js libraries?

PageSpeed Insights shows slightly better scores when the bundle is built with webpack than when third-party modules are loaded from a CDN. On the other hand, with a CDN, popular libraries may already be cached by many users and would not need to be downloaded at all; also, if the server supports HTTP/2, modules from a CDN can be loaded in parallel, unlike a single webpack bundle. So which approach is better in modern conditions? Which one do you prefer?
PS There is no need to support IE, so Babel is not needed either.


2 answer(s)
ⓒⓢⓢ, 2019-05-29
@Rinor

if the server supports HTTP/2, modules from a CDN can be loaded in parallel, unlike a single webpack bundle

With a webpack bundle everything can also be loaded in parallel and without render blocking.
Modern realities call for modern methods, but whether the investment pays off depends on the project. If your resources are scattered across multiple servers, the benefit will be minimal.
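To illustrate the "parallel and without blocking" point: independent features split via dynamic import() are fetched concurrently, each as its own webpack chunk, over a single HTTP/2 connection. A minimal sketch (the module paths and feature names are made up for illustration):

```javascript
// Sketch: start all chunk downloads at once and wait for them together.
// Each loader is a function returning a promise, e.g. () => import('./chart.js');
// webpack turns every import() into a separate chunk fetched in parallel.
async function loadFeatures(loaders) {
  return Promise.all(loaders.map((load) => load()));
}

// In a real app:
// loadFeatures([() => import('./chart.js'), () => import('./map.js')])
//   .then(([chart, map]) => { /* initialize features */ });
```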
HTTP/1.1
Over HTTP/1.x the browser opens multiple connections, and people often forget (or simply don't know) about things like network latency, DNS resolution and so on.
So, as a result, you need a golden mean between the size of the downloaded file and the number of chunks (pieces of the bundle).
HTTP/2
Here things are somewhat different: we have one* connection carrying multiple requests, so the cost of network latency is paid only once, for a single connection.
That is why it is not recommended** to build very large files: the download speed will not change (it mostly depends on the user's connection speed); compression is not taken into account here.

Igor Vorotnev, 2019-05-30
@HeadOnFire

You are looking in slightly the wrong direction and don't quite understand the real purpose (and benefit) of a CDN as a service/technology in general.
The time it takes to download a file from a CDN server is not the main time sink. Before the download even begins, the browser has to resolve DNS, reach the specific server, perform the TLS handshake, verify the certificate and establish the connection. That is what takes most of the time. On top of that, because of TCP congestion control, the download itself does not start at the channel's maximum bandwidth: it starts with small packets and ramps up as they are delivered successfully. If the channel also suffers from packet loss (with the window size rolled back each time), as on mobile 3G, the resulting download speed will not be high. And if every script is served from its own CDN, the problem multiplies. The situation can be partly improved with prefetch / preconnect / preload.
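Those prefetch / preconnect / preload hints are just <link> tags in the page head. One way to generate them, as a sketch; the CDN host in the example is made up:

```javascript
// Build a resource-hint tag. rel is "preconnect", "prefetch" or "preload";
// preload additionally needs an "as" attribute naming the resource type.
function resourceHint(rel, href, as) {
  const asAttr = as ? ` as="${as}"` : '';
  return `<link rel="${rel}" href="${href}"${asAttr}>`;
}

// e.g. resourceHint('preconnect', 'https://cdn.example.com')
// gives a tag that lets the browser do DNS + TLS + connect ahead of time.
```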
Also, keep in mind that the claim "the user has already cached it" is greatly exaggerated. Caches are not big enough to hold everything; they are cleaned, invalidated and go stale more or less constantly. And given the whole zoo of library versions, the probability that a user has exactly your version is actually not that high.
At the same time, if you have HTTP/2 and everything is served as one or a few files (up to a certain point the exact number really does not matter for HTTP/2), then DNS, TLS and the connection are already established at the moment of the request, and the preceding data flow has already warmed up the channel. So the download (which is not the slowest part of the process anyway) happens quickly. On top of that, you can use server push to shift the start of the download earlier.
And now about the CDN itself and why it is needed.
If your main target audience is in Russia (or even better, in Moscow and the surrounding region, which is quite normal for a local business), put the server in the same region and don't worry. You need neither your own CDN, nor a third-party CDN for scripts, nor any Google Fonts. Your own server over HTTP/2 will always serve them faster.
But if your audience is the whole world and there is only one server, a CDN comes to the rescue. A real example: a client from Australia, with 60% of the audience in Australia. Naturally, we take a server in Melbourne, right next to the target audience, and for those 60% everything works as fast as possible without any CDN: on an average mobile device over 3G, a couple of seconds. But for the remaining 40% (the USA, Canada and Europe) a single RTT to Melbourne is 400-800 ms even on good office Wi-Fi, and on an average device over 3G it is really grim: TTI is around 30 seconds, sometimes sagging to 40-45. This is where a CDN helps: for users from Canada, for example, files start being served from the nearest edge server in Montreal. Yes, we lose <0.5 s establishing a connection to that CDN (it is right next to the user, RTT ~50 ms), but thanks to the data-transfer speed we gain far more.
That is what a CDN is for. And all this talk about "the user has already cached it" is nonsense. Using a CDN only for script libraries rather than for all files, let alone different CDNs for different libraries, is an anti-pattern and level-80 laziness.
update:
The situation is the same, by the way, with all the marketing scripts and trackers our clients love to hang on their sites. Here is an example of a waterfall graph of a page loading with all its resources. Note that all resources from our server over HTTP/2 (regardless of the resource's priority from the browser's point of view) arrive practically instantly and almost simultaneously; that is HTTP/2's doing. But all the third-party scripts from external CDNs, which you would expect to be fast (after all, these are Google, Facebook, etc.), take a lot of time, and most of that time is spent connecting to those servers. By the way, preconnect is worth adding here for every single external address; without it things are even sadder.
For comparison:
- Full page load without all these counters, all files from our own server over HTTP/2 (including, by the way, 2 autoplay videos of ~1 MB each): about 250-300 ms.
- With Google Tag Manager and the zoo of trackers under it: 3000 ms. Ten times more.
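The preconnect-for-every-external-address advice above can be automated: collect the unique origins of all third-party scripts and emit one preconnect hint per origin. A sketch; the tracker URLs below are just examples, not taken from the measured page:

```javascript
// Reduce a list of third-party script URLs to the unique origins
// (scheme + host, no path) that the browser should preconnect to.
function preconnectOrigins(scriptUrls) {
  return [...new Set(scriptUrls.map((u) => new URL(u).origin))];
}

// Each resulting origin then gets its own tag in the page head:
//   <link rel="preconnect" href="https://www.googletagmanager.com">
```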
