It’s not primarily for SEO. Reducing file size and the number of requests helps a little with SEO, but that is what we could call a positive side effect; it is by no means a "deciding factor".
Is there any risk in doing so?
Not if the tool or platform you use for this is reliable, well-maintained and tested, for example if you use Node.js and Express:
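A minimal sketch of automatic compression in Express, assuming the widely used compression middleware package (the asset folder name here is hypothetical):

    const express = require('express');
    const compression = require('compression');

    const app = express();

    // Gzip-compress eligible responses on the fly
    app.use(compression());

    // Serve the already minified and merged static assets
    app.use(express.static('public'));

    app.listen(3000);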
Now, if you do it manually, the risk is the "human error" factor: it is very easy to make mistakes, especially when you have many scripts.
Why do some sites use normal CSS?
I think it’s because they don’t know or understand how HTTP works and how much it can affect the server, or they simply use "archaic" tools that don’t support automatic minification; after all, doing it by hand is extremely laborious.
So, is there some kind of criterion for compressing or not compressing CSS?
For production: whenever you can, minify and "merge" them, unless the file is already small and there is only one of it. For development and/or staging environments you can use an alternative version that serves the originals, without minification or compression. It is worth noting that gzip compression (see Content-Encoding) is something different.
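To illustrate the distinction, a gzip-compressed response announces it through a header, roughly like this (illustrative headers):

    HTTP/1.1 200 OK
    Content-Type: text/css
    Content-Encoding: gzip

Minification changes the file itself; Content-Encoding compression changes only how the bytes travel over the wire.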
And does this really make a big difference in page loading?
It does if you have many files, especially when it comes to render-blocking, since the download time will be shorter.
However, compressing and unifying are not the only things that help. Depending on the tool you use, doing this on the fly without a cache system will consume more of the server, since it has to compress on every request, and compressing manually can be a big problem.

So minifying is beneficial, as it reduces download time and thus improves response time for multiple requests and multiple users. But, as I said, there are factors to take into account, otherwise it can end up having the opposite of the desired effect, for example if you have no cache; I’ll come back to this further down.
Automating
You can do the process manually; sometimes, depending on the site, I do. However, that is not worth much if the project receives frequent modifications; in that case the best option is to use something made for the platform itself, or for the framework it uses. Examples of tools according to the technology:
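In a Node.js project, for instance, a small build step could look like this (a sketch assuming the terser and clean-css npm packages; the file names are hypothetical):

    const fs = require('fs');
    const { minify } = require('terser');
    const CleanCSS = require('clean-css');

    async function build() {
      // Merge all scripts into one string, then minify it
      const js = ['a.js', 'b.js'] // hypothetical source files
        .map(f => fs.readFileSync(f, 'utf8'))
        .join('\n');
      const minified = await minify(js);
      fs.writeFileSync('bundle.min.js', minified.code);

      // Same idea for the stylesheet
      const css = fs.readFileSync('style.css', 'utf8');
      fs.writeFileSync('style.min.css', new CleanCSS().minify(css).styles);
    }

    build();

Running this as part of a deploy script avoids the "human error" factor mentioned above, since no one edits the minified files by hand.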
What we can do besides minifying
In addition to minifying and unifying multiple CSS or JS files into a single one, removing comments, spaces and unnecessary line breaks, you can use HTTP cache and the 304 Not Modified response, as I explained in another question.
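A minimal sketch of the 304 flow in Express (the tag value and CSS content here are hypothetical):

    const express = require('express');
    const app = express();

    const minifiedCss = 'body{margin:0}'; // hypothetical minified content
    const etag = '"v42"';                 // hypothetical version tag

    app.get('/style.min.css', (req, res) => {
      // If the client cache already holds this version, skip the body entirely
      if (req.headers['if-none-match'] === etag) {
        return res.status(304).end();
      }
      res.set('ETag', etag);
      res.set('Cache-Control', 'public, max-age=604800'); // revalidate after a week
      res.type('text/css').send(minifiedCss);
    });

    app.listen(3000);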
If you do not do this and use automated tools on the server, resource consumption may increase to the point that the response time becomes longer than expected; a misused tool ends up being harmful.
Compressing with gzip/Deflate
Servers like Apache and Nginx can usually already do this automatically. As for the type of compression: Deflate alone is usually more effective than just minifying. A .js file of mine normally weighs 63 kB, but compressed by Apache it weighs 9 kB, i.e. it is about 86% lighter than normal. To use this, the mod_deflate module has to be enabled in Apache.
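On Debian-based systems, for instance, this is typically done with:

    a2enmod deflate

followed by a reload of the Apache service.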
Of course, it is essential that you use this together with HTTP cache, as I mentioned before. A very simple example is to add something like this to your .htaccess file (Apache):
<IfModule mod_deflate.c>
    <FilesMatch "\.(js|css)$">
        SetOutputFilter DEFLATE
    </FilesMatch>
</IfModule>
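To pair it with HTTP cache, as mentioned, a sketch using mod_expires could sit alongside it (assuming that module is also enabled):

    <IfModule mod_expires.c>
        ExpiresActive On
        # Let browsers reuse the files for a week before revalidating
        ExpiresByType text/css "access plus 1 week"
        ExpiresByType application/javascript "access plus 1 week"
    </IfModule>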
For Nginx servers, on-the-fly compression is enabled by adding gzip on; to the specific location (the related gzip_static directive is different: it serves .gz files you pre-compress yourself):
location / {
    gzip on;
    gzip_types text/javascript application/javascript text/css; # accepted MIME types
    gzip_min_length 1000; # minimum size to trigger compression
}
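To check whether compression is actually being applied, inspect the response headers; for example (hypothetical URL):

    curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" https://example.com/style.min.css

If the output contains Content-Encoding: gzip, the server compressed the response.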
Read more at: https://www.nginx.com/resources/admin-guide/compression-and-decompression/
Depending on how many elements your CSS has, yes, it makes a difference in page loading. To make an analogy: the compressed file is like an indexed search, making the lookup of elements faster, while the uncompressed one is like reading without an index. Besides being more organized :)
– Marcos Marques