File Subdomain, Site Optimization

5

Well, I've heard that to get more parallel downloads the recommendation is to serve files from a subdomain, and also that the fact that cookies are not sent with those requests would make things faster.

Anyway, truth or myth: how much can this practice help? Does it also apply to JS and CSS files?

4 answers

6

It depends on several factors:

1 - If the subdomain stays on the same server as the website, I don't think you will get any optimization.

2 - If you use some kind of technology to protect your website, like Cloudflare, it may delay your downloads or even speed them up; it all depends on where you are located and on your Internet speed.

3 - I do not think cookies can delay downloads; I see no reason for that to happen.

Basically, if you want a site that delivers files to users quickly, then yes, you can create a subdomain that points to another server of your own, one that is used only for downloading and uploading files. Then you will have a superb website where downloads can be very fast; for example, downloads1.seuwebsite.com can point to one of your servers and downloads2.seuwebsite.com can point to another. Usually this technique is used when the first machine no longer has space available, or when another server has a very fast connection for larger files.
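Just to picture that setup (a sketch only: the record names come from this answer, and the 203.0.113.x addresses are placeholder documentation IPs), the DNS zone could contain something like:

```
; hypothetical fragment of the seuwebsite.com zone
www          IN  A   203.0.113.1    ; main web server
downloads1   IN  A   203.0.113.10   ; dedicated file server 1
downloads2   IN  A   203.0.113.20   ; file server 2, more disk or a faster link
```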

I hope I’ve helped.

  • 1

    Cookies do delay the download. When a request is made to subdominio.meusite.com, the browser sends (uploads) all existing cookies for subdominio.meusite.com and meusite.com, and in the response that data is downloaded again (+ details). That's why large websites use different domains (not subdomains) to serve static content, such as Google apps, which use googleusercontent.com.
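To see the first part of that comment in practice, here is a minimal sketch using only Python's standard library (the domains are the placeholder names from the comment, and make_cookie is just a hypothetical helper): a cookie set for .meusite.com is attached to requests to the subdomain too, but not to a separate domain.

```python
# Sketch (offline, standard library only) of browser cookie scoping.
from http.cookiejar import CookieJar, Cookie
from urllib.request import Request

def make_cookie(name, value, domain):
    # Hypothetical helper just to build a version-0 cookie for the jar.
    return Cookie(
        version=0, name=name, value=value, port=None, port_specified=False,
        domain=domain, domain_specified=True,
        domain_initial_dot=domain.startswith("."),
        path="/", path_specified=True, secure=False, expires=None,
        discard=True, comment=None, comment_url=None, rest={},
    )

jar = CookieJar()
jar.set_cookie(make_cookie("sessao", "abc123", ".meusite.com"))

for url in ("http://meusite.com/",
            "http://subdominio.meusite.com/img/logo.png",
            "http://meusite-static.com/img/logo.png"):
    req = Request(url)
    jar.add_cookie_header(req)                  # same step a browser performs
    print(url, "->", req.get_header("Cookie"))  # subdomain gets the cookie, separate domain does not
```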

5

If it is on the same server, a subdomain will not help at all, as it will keep downloading from the same location!

The interesting option for you would be to hire a CDN (Content Distribution Network), so it can detect the visitor's location and point them to the nearest server with your files, without any manual work!

The CDN can be Cloudflare, MaxCDN, or one of several other companies, such as Amazon, that offer CDN products or products with very similar functions!

Minifying JS and CSS can help speed!

This practice can help you rank higher on Google, attract more people to your site because it is fast, and give you more speed for any change!

  • So the closer the servers are to you, the faster the download?

  • Of course, but it also depends on the location of your ISP and on the connection speed you have to it.

1

Through the YSlow plugin for Firefox (it depends on Firebug too) I arrived at this page: http://yuiblog.com/blog/2007/04/11/performance-research-part-4/ . It is from 2007 and things may have changed a lot since publication, but it presents a study on exactly what you asked.

In summary: it starts from the principle that increasing the number of domains should improve the time it takes to load all the requested elements, because the HTTP/1.1 specification suggests that browsers download two components in parallel per hostname (as Henrique pointed out in his answer). This limit can be changed via configuration in both IE and Firefox (there is no information about other browsers), but whoever runs the server cannot configure every client's browser, and the way of configuring it is not very user-friendly.

However, the results of his tests were not exactly as expected. There are two tests, with 20 images each, one using small images and the other medium-sized ones. Both showed an improvement when going from 1 to 2 hosts, but the small images saw no significant impact from increasing the number of hosts further, and the medium images were actually hurt: with 3 hosts it was worse than with 1 host. At the end he concludes that the best range should be 2 to 4 hosts.
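A minimal sketch of the sharding idea the study measures (the static1/static2 hostnames are hypothetical; the hash keeps each asset on the same host so it is cached only once):

```python
# "Domain sharding": spread assets over 2-4 hostnames so an HTTP/1.1
# browser opens more parallel connections.
from hashlib import md5

SHARDS = ["static1.example.com", "static2.example.com"]  # hypothetical hostnames

def shard_url(path: str) -> str:
    # Hash the path so a given asset always maps to the same hostname,
    # otherwise it would be re-downloaded and cached once per shard.
    index = int(md5(path.encode()).hexdigest(), 16) % len(SHARDS)
    return f"https://{SHARDS[index]}{path}"

print(shard_url("/img/logo.png"))
print(shard_url("/css/site.css"))
```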

The CDN that Vinicius mentioned also helps a lot, by shortening the distance between the client's browser and the source of the requested content.

Another reason to use more hosts would be to have hostname(s) with dynamic content and other(s) with static content, using content-expiration headers so the static content gets cached, but this can also be done on a single hostname via configuration, for example with Apache's mod_expires module.
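For illustration, a hedged sketch of that single-hostname approach, assuming Apache 2.4 with mod_expires enabled (the durations are only examples):

```apache
# Hypothetical virtual host / .htaccess fragment using mod_expires
<IfModule mod_expires.c>
    ExpiresActive On
    # Static assets can be cached by the browser for a long time...
    ExpiresByType image/png               "access plus 1 month"
    ExpiresByType text/css                "access plus 1 week"
    ExpiresByType application/javascript  "access plus 1 week"
    # ...while dynamic HTML expires immediately.
    ExpiresByType text/html               "access plus 0 seconds"
</IfModule>
```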

If the intention is to find out what to improve on a page to make it faster or lighter, Firefox with Firebug and YSlow can help a lot. I don't know of a similar tool for other browsers, but there must be one.

If you have the means to test, Safari (at least on OS X) has a very good feature that times how long a page takes to load, showing each action. I'm not sure, but I think I've seen this in Chrome on Windows too.

0

This is a problem of HTTP/1.1, which is what we currently use and which is a sequential protocol. This means that once we open a connection we can make only one request at a time: the request goes out, we wait, the response arrives; only then can we fire off another request.

To try to lessen the negative impact of this behavior, HTTP/1.1 browsers open more than one connection at the same time. Nowadays this number is usually 4 to 8 simultaneous connections per hostname.
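A rough way to picture the difference (a sketch only: the URL list is a placeholder you would swap for your page's real assets, and the 6 workers merely mimic a typical browser limit):

```python
# Sequential requests on one-at-a-time connections vs. several connections
# in parallel, roughly what an HTTP/1.1 browser does per hostname.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URLS = ["https://example.com/"] * 12  # placeholder: in practice, your page's assets

def fetch(url: str) -> int:
    with urlopen(url) as resp:   # one request, then wait for the whole response
        return len(resp.read())

t0 = time.perf_counter()
sizes_sequential = [fetch(u) for u in URLS]          # request -> wait -> response, one by one
t1 = time.perf_counter()

with ThreadPoolExecutor(max_workers=6) as pool:      # up to 6 simultaneous connections
    sizes_parallel = list(pool.map(fetch, URLS))
t2 = time.perf_counter()

print(f"sequential: {t1 - t0:.2f}s, 6 parallel connections: {t2 - t1:.2f}s")
```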

HTTP 2.0 will have multiplexing, avoiding these workarounds that we do with subdomains, but you can already use SPDY, a protocol made by Google that already offers this option; to install it you need HTTPS on your server.
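If your server already supports it, a hedged sketch of enabling HTTP/2 on Apache 2.4.17+ with mod_http2 and mod_ssl (hostname and certificate paths are placeholders; as noted above, browsers only speak it over HTTPS):

```apache
<VirtualHost *:443>
    ServerName www.example.com
    Protocols h2 http/1.1        # prefer HTTP/2, fall back to HTTP/1.1
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/example.com.pem
    SSLCertificateKeyFile /etc/ssl/private/example.com.key
</VirtualHost>
```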

And combining the CSS and JS files into a single file does help, since only one request is made per file.
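A tiny sketch of that bundling step (file names are hypothetical), producing one file so the page needs a single request for all its scripts:

```python
# Concatenate several JS (or CSS) files into one bundle.
from pathlib import Path

sources = ["js/jquery.plugins.js", "js/menu.js", "js/forms.js"]  # hypothetical files

# A ";" between files guards against a missing trailing semicolon in any of them.
bundle = "\n;\n".join(Path(name).read_text(encoding="utf-8") for name in sources)
Path("js/bundle.js").write_text(bundle, encoding="utf-8")
# The page then references only <script src="js/bundle.js"></script>.
```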
