Client-Side Optimization
If I had to fix a slow-responding website, like a lot of developers my first thoughts would be server-side optimizations: maybe the database is not optimized, caching is not utilized, and so on. We often overlook another major part of our web application: the client. We tend to ignore this area, and trust me, it has a major impact on the performance of a website. So before jumping into optimization we need to know how things are processed at the client (the browser).
Our web pages have a lot of resources; they can contain images, JavaScript files, CSS files, Flash. For every such individual resource there is a separate request from the browser to the server. Requests cost time. Each request has two phases: the time required to make a connection to the server and the actual time to download that resource. (Use your browser developer tools' Network tab, HTTPWatch or Fiddler to check this.)
Each additional image/JavaScript/style file slows down your website. To help, browsers download these resources in parallel for a better experience. However, there is a limit on how many concurrent connections a browser can make to your website at a given time. Typically a modern browser can have 6 to 8 connections to a domain. This implies that no more than 6 to 8 parallel requests can be made to your website by the browser. If your page needs more resources, the browser has to wait for previous requests to complete before requesting the rest; until then they are queued and compete for a connection to become available. Moreover, some resources depend on other resources, e.g. one JavaScript file requiring another JavaScript file. If you take this into account, you will realize this limit can greatly affect performance.
To verify this, open the developer tools of your browser (F12), go to the Network tab and browse to a page of any website. You will see something like the following.
You can see that the browser first tries to get the HTML of the page (the first request). Only once it has the HTML does the browser know which resources are required to render the page correctly. After the HTML it starts downloading resources. In this case it had to wait 3.5 seconds just to get the HTML. After that you can see several parallel requests to the server for resources. A little later you will see something like
Focus on the lagging requests: those resources are needed, but they were queued until the previous resources had all downloaded. Once the page load is complete, you will find every image, CSS and JS file required by the page. At the bottom of the Network tab you will find a summary of all requests.
The browser made 54 requests just to display one page, and it took 5.95 seconds! Now that we know what is bad, let's look at solutions.
Two-point solution
- Reduce the number of requests.
- Minimize the data required.
Following are ways to do this:
Optimize HTML: Remove white space wherever possible, remove comments, remove useless tags.
Bundling: All the resources listed in the Network tab for the page are necessary; we cannot leave them out. So we club them together. This reduces the number of requests. The standard term for this is "bundling". Several JavaScript files and CSS stylesheets can be clubbed into a single file for JavaScript and a single file for styles. This reduces the number of requests, since all of the JavaScript or CSS can be downloaded in one request (it is true that this will be a bigger file, but one bigger download is faster than several requests). Moving a step further, we can also "minify" files.
Minification: This removes extra white space and comments from a file, and also renames variables to use short names. This drastically reduces the size of files.
The result will be something like the following.
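For illustration, here is a made-up JavaScript function (not taken from any real page) before and after minification:

```javascript
// Before minification: readable source with comments and whitespace
function calculateTotalPrice(unitPrice, quantity, taxRate) {
    // apply tax to the subtotal
    var subtotal = unitPrice * quantity;
    return subtotal + (subtotal * taxRate);
}

// After minification: comments and whitespace stripped, locals renamed
function calculateTotalPrice(n,t,r){var a=n*t;return a+a*r}
```

The logic is identical; only the number of bytes the browser has to download shrinks.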
ASP.Net developers, please look at Bundling-and-minification for more details.
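As a minimal sketch of how bundle registration looks in ASP.Net with System.Web.Optimization (the bundle names and file paths below are placeholders, not from a real project):

```csharp
using System.Web.Optimization;

public class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        // Club several script files into one bundle; with optimizations
        // enabled they are served as a single, minified response.
        bundles.Add(new ScriptBundle("~/bundles/site").Include(
            "~/Scripts/jquery-{version}.js",
            "~/Scripts/site.js"));

        // Same idea for stylesheets.
        bundles.Add(new StyleBundle("~/Content/css").Include(
            "~/Content/normalize.css",
            "~/Content/site.css"));
    }
}
```

RegisterBundles is called from Application_Start, and views emit the bundles with Scripts.Render("~/bundles/site") and Styles.Render("~/Content/css"); each Render call then costs the browser only one request.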
Other links: JsCompress (for JavaScript) and similar online minifiers for CSS.
Customize Header Expiry/Caching: We know about caching on the server; along with that, you can also control what is cached on the client side through the cache headers (Expires/Cache-Control) your server sends. Caching images and other static resources is generally a good idea, and it can be switched on for the whole site in configuration.
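For ASP.Net/IIS, a minimal web.config sketch; the seven-day lifetime is only an example value, pick one that matches how often your static files change:

```xml
<!-- web.config: let clients cache static content (images, css, js) for 7 days -->
<configuration>
  <system.webServer>
    <staticContent>
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>
```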
Or you can do it in code for a page, but that applies only to that particular page's response.
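A sketch of the per-page approach in ASP.Net (the page class name and the seven-day expiry are hypothetical):

```csharp
using System;
using System.Web;
using System.Web.UI;

public partial class ProductPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Cache headers for this response only
        Response.Cache.SetCacheability(HttpCacheability.Public);
        Response.Cache.SetExpires(DateTime.Now.AddDays(7));
        Response.Cache.SetMaxAge(TimeSpan.FromDays(7));
    }
}
```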
I would recommend using a combination of both. For Apache, the mod_expires module gives you the same control.
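A minimal mod_expires sketch (the lifetimes below are arbitrary examples):

```apache
# httpd.conf / .htaccess: let clients cache static resources
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType text/css "access plus 1 week"
ExpiresByType application/javascript "access plus 1 week"
```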
Get static resources out of your website: Host your static resources on a server other than your website server. If possible, use CDNs.
A content delivery network or content distribution network (CDN) is a system of computers containing copies of data, placed at various points in a network so as to maximize bandwidth for access to the data from clients throughout the network. A client accesses a copy of the data near to it, as opposed to all clients accessing the same central server, so as to avoid a bottleneck near that server.
Content types include web objects, downloadable objects (media files, software, documents), applications, real-time media streams, and other components of internet delivery (DNS, routes, and database queries).
For example, for jQuery: https://code.jquery.com/jquery-2.1.3.min.js. You can use free or commercial CDNs for all of your static content.
Optimize images: The images we use can shed a lot of kilobytes if properly optimized; this technique is very useful and definitely a must for websites with a lot of high-resolution images. Also, we should not scale down bigger images where a small image is needed; we should serve a smaller image instead.
Optimizers: the Yahoo service, Steelbyte; and for designers, PNG Optimization.
Once you are done with all the changes, there are a few online tools which tell you what can still be optimized; one is PageSpeed.
Explore these options and check out the difference!