Quoting this phrase may seem a cliché, but I won't pass it up:
There is no silver bullet.
Every type of software, every technology, and every design choice will influence performance.
So, speaking more specifically of the Web, where do we start? Note that I said Web, not PHP, .NET, or Java. I said Web.
So go back to basics before you go to your code or your design; believe it or not, all of them share pretty common performance approaches on the Web.
And the basics are HTTP. That's right: the good old protocol that underpins our applications, originally designed for much simpler things than what we do today.
Understand the lifecycle of your application's pages and apply specific techniques at each stage of that cycle.
What do I mean? A guy who pushed this discussion hard was Steve Souders, who started at Yahoo and then went to the Oracle... oops, Google.
He did just that: analyzed the lifecycle and applied a pile of techniques on the front end, assuming the code that generates the page was well written and tuned for performance, so it runs faster than the delivery of the response (all the objects that make up the page, plus the rendering in the browser). The guy's website: https://www.stevesouders.com/
He has a lot of posts about this: he starts by describing the 80/20 rule, then lists a set of rules you should apply, and provides a score for your page's performance based on those rules.
There are tools there for page analysis as well (e.g., YSlow).
There's a lot of simple, cool stuff there, such as "decrease your requests": the fewer objects on a page, the faster it is delivered over the network; fewer TCP connections, fewer DNS resolutions, less time queued waiting for an available connection, and so on.
A simple example: a single 100 KB file downloads faster than ten 10 KB files. Why? The network. For each file that isn't already in the cache, the browser resolves DNS, opens a connection, receives the content, and closes the connection; since all of this goes over the network, doing it ten times costs roughly ten times the overhead of doing it once and downloading the whole content in one go.
So, 5 identical CSS files scattered across your pages, always used by your classes... oops, that's literally wasted time.
Merge them all, put the result in the cache, and you've cut 4 requests.
This is an example of how to use basic knowledge of network protocols to improve application design with a focus on performance.
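Just to make that concrete, here's a minimal sketch of a build step that does the merge; the file names are made up for the example:

```typescript
// merge-css.ts - a minimal sketch of a build step that merges several
// CSS files into one, turning 5 requests into 1.
// The file names are hypothetical; use your own.
import { readFileSync, writeFileSync } from "fs";

const files = ["base.css", "layout.css", "forms.css", "grid.css", "print.css"];

// Read every file and join the contents, marking where each one came from.
const merged = files
  .map((name) => `/* ${name} */\n${readFileSync(name, "utf8")}`)
  .join("\n");

writeFileSync("site.min.css", merged);
console.log(`${files.length} files -> 1 file (site.min.css), 4 fewer requests`);
```

From then on, every page links a single file, and the browser can keep that one object in its cache.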
Of course there are more, um, robust infrastructure solutions that add cost to the project (CDN, load balancing, clusters), but let's stick to other things; I'll just leave the basic concepts here for you to research. One of them is really worth understanding: the CDN. Which is, basically, another server where you put everything that is fixed content, so to speak.
Let's go back to networks. I want to download an image from a site: a request, DNS, TCP, data, and it all repeats. But there's an important detail: your browser. The browser tries to do a good job and downloads the page's content simultaneously. That is, if my page's HTML has 3 images, it reads the HTML, parses the image links, and opens 3 TCP connections (requests) to the server, downloading them at the same time.
Except that most of the time pages have well over 30 requests, and the problem is that the browser has a limit on how many simultaneous connections it can open. The result? It builds a queue of requests waiting to be downloaded, and for us a queue is not cool, right?
The maximum number of connections varies from browser to browser; here's Souders on the subject:
http://www.stevesouders.com/blog/2008/03/20/roundup-on-parallel-connections/
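To see that queue in action, here's a rough sketch (Node 18+ or a browser, since it uses the global fetch) that imitates what the browser does: only a fixed number of downloads run at once and the rest wait. The limit of 6 and the URLs are made up:

```typescript
// queue.ts - imitates the browser's simultaneous-connection limit:
// only MAX_PARALLEL downloads run at a time, the rest sit in a queue.
// The limit of 6 and the URLs are made-up examples.
const MAX_PARALLEL = 6;
const urls = Array.from({ length: 30 }, (_, i) => `https://example.com/obj${i}.png`);

async function downloadAll(list: string[]): Promise<void> {
  const queue = [...list];

  async function slot(id: number): Promise<void> {
    while (queue.length > 0) {
      const url = queue.shift()!;
      console.log(`slot ${id}: ${url} (${queue.length} still waiting)`);
      try {
        await fetch(url); // with 30 objects and 6 slots, most requests wait
      } catch {
        /* it's a sketch: ignore network errors */
      }
    }
  }

  // 6 "connections" shared by 30 requests: queueing, just like the browser.
  await Promise.all(Array.from({ length: MAX_PARALLEL }, (_, i) => slot(i)));
}

downloadAll(urls);
```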
And you end up with the same problem on the application server. Depending on the network, the services running on the server, and the speed at which it answers requests, it may take longer to establish the TCP connections underneath HTTP, and then you're queuing again, etc.
So think double: put up another machine (a clone, so to speak), change only the address (DNS), and point all fixed-content links there, as the sketch below shows. Your browser will open simultaneous connections to both servers, which means shorter queues, faster TCP, etc.
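A minimal sketch of that idea; both hostnames are hypothetical:

```typescript
// assets.ts - point every fixed-content link at a second hostname, so the
// browser opens its simultaneous connections against two servers.
// Both hostnames are hypothetical.
const APP_HOST = "https://www.example.com";       // the dynamic pages
const STATIC_HOST = "https://static.example.com"; // the "clone" for fixed content

function assetUrl(path: string): string {
  return `${STATIC_HOST}${path}`;
}

// The generated HTML references the static host for every fixed object:
const html = `
<link rel="stylesheet" href="${assetUrl("/css/site.min.css")}">
<img src="${assetUrl("/img/logo.png")}" alt="logo">
<script src="${assetUrl("/js/app.js")}"></script>
`;
console.log(`pages from ${APP_HOST}, fixed content from ${STATIC_HOST}`);
console.log(html);
```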
If you look closely, using the cache is also just decreasing the number of requests; it's a good rule ;)
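For the cache to work, though, the server has to say so in the HTTP headers. A minimal sketch with Node's standard http module; the max-age of one year is just an example value:

```typescript
// static-server.ts - serve fixed content with aggressive cache headers so
// the browser stops asking for it. Sketch only: no path sanitization,
// no content types.
import { createServer } from "http";
import { readFile } from "fs";

createServer((req, res) => {
  readFile("." + req.url, (err, data) => {
    if (err) {
      res.writeHead(404);
      res.end();
      return;
    }
    // "immutable" + a long max-age: the browser won't even re-request it.
    res.writeHead(200, { "Cache-Control": "public, max-age=31536000, immutable" });
    res.end(data);
  });
}).listen(8080);
```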
One important thing: I didn't talk about C#, Java, ASP.NET, PHP, or Ruby. I talked about HTTP and networks, so this knowledge carries over to all kinds of projects.
Take a look at the guy's website and, if you can, buy a book as a reference. Before you buy, go to a bookstore and see which of his books best matches what you want to do.
Oh, my bad, you're after stuff with images, sliders and such... well, there's a woman who wrote some material about exactly that, plus design and other things:
http://larahogan.me/
There it gets complicated, because some of the techniques involve processing the images: softening the focus, combined with JPEG compression, which discards sharpness; smaller file, faster download. And that requires training whoever is going to put content on the site. But there's a lot of material there about design and performance culture, which is cool.
Something like: everybody is responsible for performance.
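One of those image techniques can be sketched in code. This assumes the sharp npm package, and the size and quality numbers are just examples; the real values come from looking at the result:

```typescript
// compress.ts - re-encode an image as JPEG, trading sharpness for size.
// Assumes the "sharp" npm package; file names and numbers are examples.
import sharp from "sharp";

sharp("hero-original.png")
  .resize({ width: 1600 }) // don't ship pixels the layout will never show
  .jpeg({ quality: 70 })   // JPEG discards detail -> smaller file, faster download
  .toFile("hero.jpg")
  .then((info) => console.log(`hero.jpg written: ${info.size} bytes`));
```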
Anyway, if you think about the flow of a request, there’s something like this:
The browser makes the request; the network (HTTP, DNS, TCP) works like crazy; the server processes the request (parses the HTML, executes your code, goes to the database - more network - comes back, builds the HTML again); more network (TCP, cache, etc.); and it returns to the browser, which starts rendering the page, runs local scripts, and shows you things. All of that for each page request.
So, simplifying, it's the same as:
Browser request -> network -> processing (your code) -> network -> browser.
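If you want to watch those phases yourself, Node's http module exposes events for each step. A rough sketch (the host is an example, and HTTPS would add a TLS handshake on top of this):

```typescript
// phases.ts - rough timing of a request's phases: DNS, TCP, server
// processing (time to first byte) and content download.
// The host is an example; HTTPS adds a TLS handshake on top of this.
import { get } from "http";

const t0 = Date.now();
const req = get("http://example.com/", (res) => {
  console.log(`first byte (network + server code): ${Date.now() - t0} ms`);
  res.on("data", () => {}); // drain the body
  res.on("end", () => console.log(`download finished: ${Date.now() - t0} ms`));
});
req.on("socket", (socket) => {
  socket.on("lookup", () => console.log(`DNS resolved: ${Date.now() - t0} ms`));
  socket.on("connect", () => console.log(`TCP connected: ${Date.now() - t0} ms`));
});
```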
We've talked a lot about requests, design, and the network, so what's missing is the code.
Yes, the request is only answered once the code has executed and is ready to return the requested object (usually by building the HTML that will be the body of the requested page).
So, there's no magic recipe: analyze the full flow of your page, and if what your code does is responsible for more than 20% of the page load time (Souders' rule), then complain about it. Of course you need common sense, right? If your page is like Google's, with virtually no requests and almost no objects, then load time is basically code + network: focus on the code. But if the main object of the page is responsible for only 1% of the load time, your front end is in rough shape, man, so focus there.
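One way to check which side of the 80/20 you're on, straight in the browser console, using the standard Navigation Timing API (run it after the page finishes loading):

```typescript
// Back end + network = everything up to the first byte of HTML;
// front end = everything after it (downloads, parsing, rendering).
const nav = performance.getEntriesByType("navigation")[0] as PerformanceNavigationTiming;
const total = nav.loadEventEnd - nav.startTime;    // full page load
const backend = nav.responseStart - nav.startTime; // up to first byte
console.log(`back end + network: ${((backend / total) * 100).toFixed(1)}%`);
console.log(`front end: ${(100 - (backend / total) * 100).toFixed(1)}%`);
// Souders' rule of thumb: only go after the server code if that first
// number is well above 20%.
```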
That's where everything the masters preach about performance comes in: cursors vs. subqueries, rebuilding table indexes, reducing joins, instantiating lots of objects that aren't always used, not using lazy load, inefficient ORM usage, ViewState that's too big, toolkits loaded without need, AJAX implemented badly, open SELECTs on listing screens, not using filters to limit searches, etc.
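Just to make one of those items concrete, take the open SELECT on listing screens: the fix is letting the database filter and paginate instead of dragging the whole table over the network. The table and column names below are made up, and the SQL is plain, independent of any driver or ORM:

```typescript
// Hypothetical listing-screen queries; table and columns are made up.

// Bad: an open SELECT pulls the entire table over the network,
// and the application throws most of it away.
const openSelect = `SELECT * FROM orders`;

// Better: filter and paginate on the database side; only one screenful
// of rows ever crosses the network.
const filtered = `
  SELECT id, customer_id, total
  FROM orders
  WHERE created_at >= $1     -- a filter to limit the search
  ORDER BY created_at DESC
  LIMIT 50 OFFSET $2         -- one page at a time
`;
console.log(openSelect, filtered);
```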
Man, it's a lot of stuff; every technology has its own particularities.
I’ll leave some cool stuff here:
I don't understand; could you be more specific?
– Jefferson Alison
Optimization of image loading?
– Ricardo
Sorry, I edited the question.
– Matheus Ilário
Could you post what you've already done? That would make it easier for us to help.
– Felipe Avelar
Have you done any performance analysis of the page? Or are you just worried about image loading (downloading the images)?
– Intruso
"... what methods are used to optimize the loading of the images in this box..." -> of which? "... I have a Bootstrap Carousel that occupies the entire screen..." - Which one? How is it? What does it have? Can’t answer your question without more information.
– Victor Stafusa
I edited it and added an example of the code I'm using, and explained what I've done and what I intend to do.
– Matheus Ilário