Optimize loading of images


2

Good morning. I have a Bootstrap carousel that occupies the entire screen. I would like to know what methods are used to optimize the loading of this carousel's images and of the page as a whole.

I am using this carousel: http://bootsnipp.com/snippets/featured/responsive-bs-carousel-with-hero-headers. I have enabled gzip compression and caching via .htaccess, but I would like some way to keep the images from slowing down the loading of the site as a whole. It is a single-page site, and the client is free to add images as needed.

  • I don't understand; please be more specific.

  • Optimization of image loading?

  • Sorry, I edited the question.

  • Could you post what you have already done? That would make it easier for us to help.

  • Have you done any performance analysis on the page? Or are you just worried about image loading (downloading the images)?

  • "... what methods are used to optimize the loading of the images in this box..." -> of which? "... I have a Bootstrap Carousel that occupies the entire screen..." - Which one? How is it? What does it have? Can’t answer your question without more information.

  • I edited the question, added an example of the code used, and explained what I have done and what I intend to do.


4 answers

3

There are some techniques that can be employed:

Utilize Connection: Keep-Alive

Allows an established TCP connection to be reused for subsequent HTTP requests, removing the time needed to perform a new handshake.
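A minimal sketch of the relevant Apache directives (these normally belong in the main server configuration, e.g. httpd.conf, rather than in .htaccess; the values here are only illustrative):

# Reuse each TCP connection for multiple HTTP requests
KeepAlive On
# Maximum number of requests served over a single connection
MaxKeepAliveRequests 100
# Seconds to wait for the next request before closing the connection
KeepAliveTimeout 5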

Multiple Static Content Domains

Spread your images across hostnames such as static1.site.com.br, static2.site.com.br, etc., and serve each one from a different host. With multiple domains you ensure that the browser opens multiple connections to download the resources, minimizing the impact of handshake time; see the sketch below.
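A minimal HTML sketch of the idea, using the hostnames above (the image paths are hypothetical):

<!-- Each hostname gets its own pool of browser connections,
     so these images can be downloaded in parallel -->
<img src="http://static1.site.com.br/banners/slide-1.jpg" alt="Slide 1">
<img src="http://static2.site.com.br/banners/slide-2.jpg" alt="Slide 2">
<img src="http://static1.site.com.br/banners/slide-3.jpg" alt="Slide 3">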

Configure Cache-Control

Images usually do not change frequently. Use Cache-Control to allow the browser and other intermediate nodes to cache your images. Set the time correctly, according to the nature of each image. A tip (half kludge, but simple enough) is to split the images into folders according to their refresh rate and place .htaccess files (in the case of Apache) in each one to configure the cache control individually; a sketch of that follows the example below.

Example (.htaccess):

# Sets the cache as public with a maximum age of 3600 seconds (1 hour).
Header set Cache-Control "public, max-age=3600"

Remember that 1 hour is too short for the vast majority of images.
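Following the folder tip above, a sketch with one .htaccess per folder (the folder names and ages are hypothetical):

# images/permanent/.htaccess - rarely-changing assets, cached for 30 days
Header set Cache-Control "public, max-age=2592000"

# images/banners/.htaccess - frequently swapped banners, cached for 10 minutes
Header set Cache-Control "public, max-age=600"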

Etag

For banners, it may be best to revalidate the image every time the page is accessed. The advantage is that the client will always see the most current banner; the disadvantage is that one request per image will always be made.

How does this help if the request is still made? The server uses the ETag field to check whether the image has changed on the server. If nothing has changed, it returns a 304 Not Modified status instead of 200 OK, with no response body.

Apache does this automatically, unless configured otherwise.

See more at: http://en.wikipedia.org/wiki/HTTP_ETag
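A sketch of the revalidation exchange (the ETag value and image path are illustrative):

GET /banners/slide-1.jpg HTTP/1.1
Host: site.com.br
If-None-Match: "5f3e-1a2b3c4d"

HTTP/1.1 304 Not Modified
ETag: "5f3e-1a2b3c4d"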

Encoding

Selecting the correct image format and quality helps not only the final image quality, but also the loading time.

There are some tools that automatically optimize the image for you:

PNG: https://tinypng.com/

JPG: https://tinyjpg.com/

Note

With the exception of encoding, all of these techniques can also be employed for other files, such as PDF, JS, CSS, HTML, etc. Simply configure each case in the most appropriate way.

0

Cloudflare is an excellent tool for this. After properly configuring Cloudflare, look into Cloudflare's "Page Rules"!

0

The phrase may sound like a cliché, but I will quote it anyway: there is no silver bullet. Every type of software, every technology, and every particularity of your design choices will influence performance.

So, speaking more specifically of the Web, where do we start? Note that I said Web, not PHP, .NET, or Java; I said Web.

So go to the basics before you go to your code or design; yes, all of them have performance approaches for the Web, and fairly common ones at that.

And the basics are HTTP. That's right: the good old protocol that underpins our applications and that was originally designed with simpler things in mind than what we do today.

Understand the life cycle of your application's pages and apply specific techniques at each stage of that cycle.

What do I mean? Someone who started writing a lot about this was Steve Souders, who started at Yahoo and then went to the Oracle... oops, Google.

He did just that: analyzed the life cycle and applied a bunch of techniques on the front end. The assumption is that if the code that generates the page was well written, with performance in mind, it will run faster than the delivery of the response takes (downloading all the objects that make up the page and rendering them in the browser). The guy's website: https://www.stevesouders.com/

He has a lot of posts on the subject; he starts by describing the 80/20 rule and then lists a set of rules you should apply, providing a score for your page's performance based on those rules.

There are also tools there for page analysis (e.g., YSlow).

There is a lot of simple, cool material there, such as "decrease your requests": the fewer objects on a page, the faster it is delivered over the network; fewer TCP connections, fewer DNS resolutions, less queuing while waiting for an available connection, etc.

A simple example: a single 100 KB file downloads faster than ten 10 KB files. Why? The network. For each file that is not in the cache, the browser resolves DNS, opens a connection, receives the content, closes the connection, and repeats for the next file. Since all of this happens over the network, it costs roughly ten times the time of those actions, instead of performing them once and downloading the whole content.

So, 5 identical CSS files scattered across your pages, always used by your classes: oops, you are literally wasting time.

Merge them all, cache the result, and cut 4 requests; a sketch follows below.
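A minimal HTML sketch of that idea (the file names are hypothetical):

<!-- Before: five separate stylesheets, five requests -->
<link rel="stylesheet" href="reset.css">
<link rel="stylesheet" href="grid.css">
<link rel="stylesheet" href="typography.css">
<link rel="stylesheet" href="carousel.css">
<link rel="stylesheet" href="theme.css">

<!-- After: one merged, cacheable stylesheet, one request -->
<link rel="stylesheet" href="site.min.css">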

This is an example of how to use basic knowledge of network protocols to improve application design with a focus on performance.

Of course, there are more, um, robust infrastructure solutions, with additional cost to the project's infrastructure (CDN, load balancing, clusters). Let's stick to the basic concept of them here for you to research; one of them is really worth understanding: the CDN. It is, basically, another server where you put everything that is fixed content, so to speak.

Let's go back to the network. I want to download an image from a site: one request, DNS, TCP, data, and it all repeats. But there is an important detail: your browser. The browser tries to do a good job and downloads the page's content simultaneously. That is, if my page's HTML has 3 images, it reads the HTML, parses the image links, and opens 3 TCP connections (requests) to the server, downloading them at the same time.

Except that most of the time pages have well over 30 requests, and the problem is that the browser has a limit on the simultaneous connections it can make. The result? It creates a queue of requests waiting to be downloaded, and for us a queue is not cool, right?

The maximum number of connections varies from browser to browser; here is Souders on the subject: http://www.stevesouders.com/blog/2008/03/20/roundup-on-parallel-connections/

You end up having the same problem on the application server. Depending on the network, the services on the server, and the speed at which it delivers the responses, it may take longer to establish the TCP connections beneath HTTP, and there you have a queue again, etc.

So think double: put up another machine (a clone, so to speak), change only the address (DNS), and point all links to fixed content there. Your browser will open simultaneous connections to the two servers, which produces fewer queues, faster TCP, etc.

If you look at it the right way, by using cache you are simply decreasing the number of requests; it's a good rule ;)

One important thing: I did not talk about C#, Java, ASP.NET, PHP, or Ruby... I talked about HTTP and networks, so this knowledge carries over to many types of projects.

Take a look at the guy's website and, if you can, buy the book as a reference. Before you buy, go to a bookstore and see which of the books best matches what you want to do.

Oh, sorry, you're into this image and slider stuff... then there's a woman who wrote some material about exactly that, design and other things: http://larahogan.me/

That part is complicated because some of the techniques involve processing the images: softening the focus and combining that with JPEG, which discards sharpness, giving a smaller file that downloads faster. And that requires training the person who will put content on the site. But there is a lot of material there about design and a culture of performance, which is cool.

Something like: everybody is responsible for performance.

Anyway, if you think about the flow of a request, it goes something like this: the browser makes the request; the network (HTTP, DNS, TCP) works like crazy; the server processes the request (parses it, executes your code, goes to the database - more network - and builds the HTML); more network again (TCP, cache, etc.); and it returns to the browser, which starts rendering the page, runs local scripts, and shows you things. All of that for each page request.

Simplifying, it is: browser request -> network -> processing (your code) -> network -> browser.

We have talked a lot about requests, design, and the network, so what is missing is the code.

Yes, the request is only answered when the code has executed and is ready to return the object the request asked for (which usually means building the HTML that will be the body of the requested page).

So there is no recipe: analyze the full flow of your page and, if what your code does is responsible for more than 20% of the page load time (Souders' rule), then go after it. Of course you have to use common sense, right? If your page is like Google's, it has virtually no requests and barely any rendering because it has almost no objects, so load time is basically code + network: focus on the code. But if the main object of the page accounts for only 1% of the load time, it is your interface that needs the work.

That is where everything the masters say about performance comes in: cursors vs. subqueries, rebuilding table indexes, reducing joins, instantiating many objects that are not always used, not using lazy loading, inefficient ORM usage, an oversized ViewState, toolkits loaded without need, badly implemented AJAX, open SELECTs on listing screens, not using filters to limit searches, etc.

Man, it’s a lot of stuff, every technology’s gonna have its particularities.

I’ll leave some cool stuff here:

0

Base 64

Encoding images in Base64 is a very good alternative for saving loading time. How? Instead of making a request to the server for each image every time, the image (encoded in Base64) is sent along with the HTML file itself (inline, right in the tag, or sometimes in the CSS).

This is what the HTML looks like with a Base64 image:

<div>
    <img src="data:image/png;base64, iVBORw0KGgoAAAANSUhEUgAAAAUA
    AAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO
    9TXL0Y4OHwAAAABJRU5ErkJggg==" alt="Red dot" />
</div>
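The same technique applies in CSS, as mentioned above. A minimal sketch reusing the same red-dot data URI (the class name is hypothetical):

/* The image is inlined as a data URI, so no extra request is made */
.red-dot {
    width: 5px;
    height: 5px;
    background-image: url("data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU5ErkJggg==");
}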

Formats of Images

There is a discussion on SOen (the English Stack Overflow, part of the same network as this site) that addresses exactly what you are looking for: PNG vs. GIF vs. JPEG - When best to use?. I will give a summary of the content of that discussion (a small HTML sketch follows the list):

  • For a flat image (like the Twitter bird), the .gif format is the best choice, since it has higher compression with only about 3% quality loss (at most).
  • For photo-quality images, .jpg is more appropriate, because it achieves higher compression while keeping better image quality.
  • For a flat image with a transparent background, 8-bit .png (which represents up to 256 colors) is the best choice, since it compresses better than .gif.

Note: you can combine these solutions.
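A small HTML sketch of those choices (the file names are hypothetical):

<img src="logo.gif" alt="Flat logo">               <!-- flat image: GIF -->
<img src="photo.jpg" alt="Product photo">          <!-- photograph: JPEG -->
<img src="icon.png" alt="Icon with transparency">  <!-- flat + transparent: 8-bit PNG -->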

  • 1

    Thank you, I will study this subject.

  • Encoding to Base64 is a portability practice, not a performance one. The volume increases by 33% - a 100 KiB image grows to about 133 KiB. In addition, the payload of all the images present is reloaded every time the HTML file is re-fetched from the server, since data URIs cannot be cached independently of the document that carries them.

  • But only a single request is made; it would only be a loss if the user kept reloading the page.

  • @Ricardohenrique That single request is only more performant (and even then by an almost irrelevant margin) on the first request. After that, even in the worst possible scenario (HTTPS with handshake renegotiation), loading directly from the local cache is several orders of magnitude faster than re-fetching the document containing the data URIs. Note the OP's post - 'the client is free to put the images as needed'. The content cannot be static.

  • I see. In a scenario where the images are video thumbnails and each page shows different videos, would using Base64 be bad?

  • Could I generate Base64 from each image and save it in the DB? Would that be bad? Because I plan to use this method.

  • @Ricardohenrique If you have a DB that supports binary storage (such as MySQL's BINARY type or the MS-SQL equivalent), it is better to store the content as bytes, since with Base64 your database consumption would go up by 33%. Base64 is an excellent resource if you want to store thumbnails for offline consumption, for example (using AppCache, IndexedDB, and the like).

  • @bfavaretto is there a way to create a chat room so I can clarify some points? If so, invite me there. Thanks!

