Client-side optimization
JunDevTest, 2018-08-17 21:53:26

Is it worth minifying and merging files if there is http2 and gzip?

Greetings.
Recently, I started looking for a utility or script for merging and minifying CSS and JS files and thought about the expediency of this action in our time, when there is GZIP and HTTP2. Besides, is such optimization really important in our time? If yes, then advise normal, not "smoothie-hipstor" solutions that do not require a separate server with a whole zoo of technologies to boot?


6 answers
Eugene Wolf, 2018-08-17
@Wolfnsex

Is it worth minifying and merging files if there is http2 and gzip?
GZIP plays a fairly small role here; besides, it was (and still is) available in HTTP/1.0 and 1.1. It also has no direct relation to minification at all. Minified files still end up smaller after GZIP than non-minified files compressed with the same GZIP (all else being equal).
If so, can you recommend sensible solutions, not "hipster smoothie" ones, that don't require a separate server with a whole zoo of technologies on top?
The only technology I know of that requires a server is the Google PageSpeed module, which plugs into Apache or is compiled into Nginx. Even then it does not require a separate server; the web server you already have (Nginx and/or Apache in 95% of cases) is enough. But you can easily minify and compress files without it, using plenty of ready-made solutions that require no server at all and usually run locally, for example NodeJS-based tooling (gulp): there are tons of modules, and if you like, I can look some up for you. There are also many online services that do the same thing and require nothing but a browser.
PS Gulp (and other tools like it) can also concatenate JS, "compress" images, minify CSS and perform many similar tasks, depending on your needs. You only need to configure it once; after that you can reuse the same setup in any number of projects, as in the sketch below.
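For illustration, a minimal gulpfile along these lines concatenates and minifies JS and CSS in one pass. This is only a sketch: it assumes gulp 4 with the gulp-concat, gulp-terser and gulp-clean-css plugins installed, and the src/dist paths are placeholders, not anything from the question.

// gulpfile.js (assumes: npm i -D gulp gulp-concat gulp-terser gulp-clean-css)
const gulp = require('gulp');
const concat = require('gulp-concat');
const terser = require('gulp-terser');       // JS minifier: renames locals, drops dead code
const cleanCSS = require('gulp-clean-css');  // CSS minifier

// Merge all JS into one minified bundle
function scripts() {
  return gulp.src('src/js/**/*.js')
    .pipe(concat('bundle.min.js'))
    .pipe(terser())
    .pipe(gulp.dest('dist/js'));
}

// Merge and minify CSS the same way
function styles() {
  return gulp.src('src/css/**/*.css')
    .pipe(concat('bundle.min.css'))
    .pipe(cleanCSS())
    .pipe(gulp.dest('dist/css'));
}

exports.default = gulp.parallel(scripts, styles);

Run it once with "npx gulp"; the same gulpfile can then be dropped into other projects unchanged.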

Moskus, 2018-08-17
@Moskus

As always, questions on this topic are underspecified or use terms that are never defined.
If by JS minification we mean its simplest form, that is, removing whitespace and comments (which your code may not even have), the effect of such minification after gzip compression tends to zero, because compression already eliminates redundancy, albeit not completely. Many solutions that work as an online service with a form where you paste your code do exactly this kind of imitation of minification.
If by JS minification we mean more advanced techniques (for example, what https://github.com/mishoo/UglifyJS2 can do), such as renaming functions and variables, removing functions from bundled libraries that are never called, and so on, this can have a significant effect.
The same goes for CSS.
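You can check the claim about gzip yourself. The rough Node sketch below (uglify-js is assumed to be installed, and app.js is a placeholder input file, not anything from the question) compares the gzipped size of the original, the whitespace-stripped, and the fully mangled versions of a script:

// compare-sizes.js (assumes: npm i uglify-js)
const { gzipSync } = require('zlib');
const { readFileSync } = require('fs');
const { minify } = require('uglify-js');

const source = readFileSync('app.js', 'utf8');

// "Imitation" minification: keep all names, only drop whitespace and comments
const whitespaceOnly = minify(source, { compress: false, mangle: false }).code;

// Real minification: drop dead code and rename locals
const mangled = minify(source, { compress: true, mangle: true }).code;

const gz = s => gzipSync(Buffer.from(s)).length;
console.log('original gzipped:', gz(source));
console.log('whitespace-only gzipped:', gz(whitespaceOnly)); // usually close to the original
console.log('mangled gzipped:', gz(mangled));                // usually noticeably smaller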
As for super-slow connections and the like: people on such connections do exist, but they are orders of magnitude rarer than people on weak and outdated hardware. So with limited development resources (put plainly, limited time), it is better to spend the effort making sure the site is not needlessly demanding on the client device's resources (pointless background videos, giant images, janky JS animation, infinite scrolling, a pile of different web fonts, single-page "sheets" instead of multi-page navigation, and other fashionable resource-hungry layout techniques).

batyrmastyr, 2018-08-24
@batyrmastyr

When it comes to JS compression, the first priority is to speed up parsing and execution of the code, because, roughly speaking, 10 KB of JS can be worse than a couple of megabytes of images ( https://medium.com/dev-channel/the-cost-of-javascr ... ). In general, Moskus is right: removing whitespace and comments from the code achieves little, while removing unused code plus renaming functions and variables reduces file size significantly.
But be careful: if the minifier goes further and pretends to be an archiver (like Dean Edwards's Packer, or webpack with its thousands of evals: !function(F,Q){for(var B in Q)F[B]=Q[B] }(global,webpackJsonp([0], {"use strict";eval("var __WEBPACK_AM), then you will not speed anything up; rather, you will add extra slowdown on the client.
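In webpack's case that eval-wrapped output typically comes from development-mode settings. A hedged sketch of the relevant part of a webpack.config.js (the entry and output paths here are placeholders) is to build with mode set to production and a non-eval devtool:

// webpack.config.js (sketch; entry/output paths are placeholders)
const path = require('path');

module.exports = {
  mode: 'production',     // enables Terser minification, no eval-wrapped modules
  devtool: 'source-map',  // still debuggable, unlike the eval* devtools used in dev builds
  entry: './src/index.js',
  output: {
    filename: 'bundle.[contenthash].js',
    path: path.resolve(__dirname, 'dist'),
  },
};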

Dmitry Belyaev, 2018-08-17
@bingo347

I'll add to the above about HTTP/2:
Without "smart" server push, a single bundle containing all of the site's code will load faster on each page than separate modules, even though with modules you wouldn't have to load everything.
Just picture it: you load the HTML -> it is parsed -> it starts loading styles, images and the main JS module -> that JS is parsed -> it loads 5 more modules, which after parsing load another 50 modules.
Now consider: you have a hundred files loading in parallel over a single TCP stream, wrapped in TLS on top of that, which adds overhead for encryption and decryption; add the overhead of JS parsing, which stalls the network while the network stalls it in turn.
There are two ways out of this:
- either ship everything in one bundle after all,
- or set up push (see the sketch below), but that is also troublesome: besides working out which files the initial request needs (difficult, but quite doable), you have to predict what the browser already has in its cache (guess wrong and you either get the initial overhead back (didn't push the needed file) or burn extra traffic (pushed something unnecessary)).
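For what the push variant looks like in practice, here is a minimal sketch with Node's built-in http2 module. The certificate and file paths are placeholders, and there is no cache prediction here, which is exactly the hard part mentioned above.

// push-server.js (sketch; key/cert and file paths are placeholders)
const http2 = require('http2');
const fs = require('fs');

const server = http2.createSecureServer({
  key: fs.readFileSync('server.key'),
  cert: fs.readFileSync('server.crt'),
});

server.on('stream', (stream, headers) => {
  if (headers[':path'] === '/') {
    // Push the main bundle along with the HTML, before the browser asks for it
    stream.pushStream({ ':path': '/bundle.js' }, (err, pushStream) => {
      if (err) return;
      pushStream.respondWithFile('dist/bundle.js', {
        'content-type': 'application/javascript',
      });
    });
    stream.respondWithFile('index.html', { 'content-type': 'text/html' });
  } else {
    stream.respond({ ':status': 404 });
    stream.end();
  }
});

server.listen(8443);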

lukoie, 2018-08-18
@lukoie

Take a look at CodeKit. It does everything itself, and does it smartly, without requiring a pile of extra tooling.
