Even Faster Web Sites

For the last few months I have been at a loose end, looking for something new to learn on the technical side. So I went to my company library, browsed through some books, and found this one: “Even Faster Web Sites”.

If you are a front-end developer or love web page design, you have probably come across a set of rules for making web pages faster – the book “High Performance Web Sites”, and the many articles, forums and blogs that explain the 14 rules from that book. Steve Souders, who wrote High Performance Web Sites, showed that most of a page’s load time (around 80%) is spent on the front end, which is why following these rules makes pages dramatically faster.

He and a few other contributors then wrote the book “Even Faster Web Sites”. I think he named it “even faster” because he had already written one book on performance that makes the web fast, so obviously the next one had to make it even faster, right?

The book covers 14 chapters plus an appendix on performance tools. I can’t give all the details here – you should read the book to go deeper – so I will talk about some of the things that surprised or shocked me about how the web and browsers behave.

 

1) Splitting the initial payload

If you take a look at the top 10 US web sites, about 75% of the JavaScript downloaded is not executed before the page loads. So why load all of it up front and delay the page? The idea here is deferred loading: scripts that are not needed before the onload event can be downloaded after the page has finished loading.
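
Here is a minimal sketch of this idea, assuming the non-critical code lives in a hypothetical file called deferred.js:

// Download the non-critical JavaScript only after the page has finished loading.
window.onload = function () {
    var script = document.createElement('script');
    script.src = 'deferred.js';   // hypothetical file with code not needed before onload
    document.getElementsByTagName('head')[0].appendChild(script);
};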

2) Scripts Blocking other components

This topic was new to me, and I was surprised that the web and browsers behave this way.

A script tag blocks the other components (images, stylesheets, iframes, anything) from downloading or rendering until the script itself has been downloaded and executed.

This happens because a script is capable of changing the page (by altering styles, adding new elements and so on). To preserve the order of execution and rendering, the browser blocks the other components.

So if you have scripts that do not modify the page, you can load them without blocking the other components using techniques such as XHR Eval, XHR Injection, Script in Iframe, Script DOM Element, Script Defer and document.write Script Tag.
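
For example, here is a rough sketch of the XHR Injection approach (the file name widgets.js is just a placeholder of mine, and remember that XMLHttpRequest only works for scripts served from the same domain):

var xhr = new XMLHttpRequest();
xhr.open('GET', 'widgets.js', true);
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        var script = document.createElement('script');
        script.text = xhr.responseText;   // inject the downloaded code into the page
        document.getElementsByTagName('head')[0].appendChild(script);
    }
};
xhr.send(null);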

CSS stylesheets also block other components from rendering, for much the same reason scripts do.

The best practice is to put stylesheets at the top and scripts at the bottom so the page loads efficiently.

3) Writing Efficient JavaScript

In JavaScript we have execution contexts (scopes), the global context and the scope chain. Whenever a variable is referenced, the lookup starts in the active execution context (scope), continues up the chain and finally ends at the global context.

We need to be careful with global variables, and for local variables don’t forget the var keyword; many developers skip it, which silently turns the variable into a global. Also, store frequently used values in local variables so that lookups are faster.

DOM access is one of the most expensive operations in JavaScript. Whenever we use the document object or look up elements, e.g. document.getElementById('...'), the result should be stored in a local variable if it is used in more than one statement; that keeps the scope lookup short and avoids repeated DOM searches.
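
A small sketch of both ideas; the element id 'status' is just a hypothetical example:

function updateStatus(messages) {
    var doc = document;                           // cache the global object in a local
    var statusEl = doc.getElementById('status');  // look the element up only once
    for (var i = 0, len = messages.length; i < len; i++) {
        statusEl.appendChild(doc.createTextNode(messages[i] + ' '));
    }
}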

Loop tip:

One way to improve the performance of a loop is to decrement the iterator towards 0 rather than incrementing it; depending on the complexity of the loop, this can save up to 50% of the original execution time, mainly because comparing against 0 is cheaper than re-reading a length on every iteration.
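
For example, an ordinary incrementing loop can be turned around like this (items and process are hypothetical names):

// Incrementing version: items.length is read on every pass.
for (var i = 0; i < items.length; i++) {
    process(items[i]);
}

// Decrementing version: one length lookup, and the exit test
// is a simple comparison against 0.
for (var j = items.length - 1; j >= 0; j--) {
    process(items[j]);
}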

4) Going Beyond Gzipping:

Even if gzip support is enabled on your server and in the browser, roughly 15% of users still do not receive the gzipped version.

This happens because of two culprits (web proxies and PC security software) that mangle the original Accept-Encoding header into something like “X-cept-EncodXng: XXXXX”. The server no longer recognises the header, so it fails to serve the gzipped content.

To overcome this, we can use a cookie to tell the server that the browser really does support gzip, so the server can serve the gzipped content anyway.
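
A rough sketch of the client side, under the assumption that the server always serves this tiny test script gzipped (the cookie name gzip_ok is my own, not something prescribed by the book):

// If the browser manages to execute this gzipped test script,
// gzip clearly works end to end, so record that in a cookie
// the server can check on later requests.
document.cookie = 'gzip_ok=1; path=/';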

5) Sharding Dominant Domains

Browsers limit the number of parallel connections they make to a single domain (the HTTP/1.1 specification originally suggested at most two). Older browsers open only 2 parallel connections per domain, while newer browsers open up to 6.

The more downloads a page can run in parallel, the faster it loads. To get more parallel connections we can use multiple domains, since the limit applies per domain. If the components are spread across multiple domains (for example, CSS and images on one domain and JavaScript on another), the CSS and JS can download at the same time from the different domains.

An important point here is that this connection limit does not apply per page; it applies across all the windows and tabs of the browser. If you load YouTube in two tabs at once and one tab is already using the maximum number of connections to that domain, the other tab has to wait for those downloads to finish, because the maximum applies to the browser as a whole.

Using multiple domains does not mean we need two different servers; the browser does not look at the IP address when enforcing the connection limit, only at the hostname in the URL. So we can create a CNAME (alias) for the same server and serve some of the components from the alias.

6) Simplifying CSS selectors

In today’s Web 2.0 applications we use a lot of CSS selectors to apply different styles.

The way CSS selectors are applied is that the browser tries to match each selector against the elements in the document, and the amount of matching the browser must perform depends on how the selectors are written.

We tend to write selectors the way we read, from left to right. For example, to apply a red background colour to every span inside a div under a div with the class middle, we would write:

div > div.middle > span { background-color: red; }

But the browser evaluates selectors from right to left. So our selectors should be written so that the rightmost part filters out as many elements as possible. For the example above, the browser first selects every span on the page and then checks each one to see whether it is a child of div.middle; giving those spans a class of their own (for example, span.middle-span) and matching on that class instead would be far cheaper.

In short: think about your CSS selectors from right to left.

 

Performance Tools:

There are a lot of tools covered in the book; I am just listing the major hits in the web developer circle.

HttpWatch

Fiddler

Firebug

YSlow

 

Here is the link to the examples from EFWS:

http://stevesouders.com/efws/links.php?ex

I hope this is helpful in some way; I have tried to write the post in an interesting way, and I am sorry if it bored you. Either way, read the books “Even Faster Web Sites” and “High Performance Web Sites” to dig deeper – they have a lot more information.

Geolocation using JavaScript

I have been thinking of buying a mobile phone with GPS support. But now, surprisingly, even my laptop can find my geolocation through JavaScript and the browser. Wondering how? So was I. In this post I will go through how it works and what its features are.

Geolocation Specification:

The W3C has been publishing the Geolocation API specification since 2008, and by now it is implemented in all the major browsers – starting with Firefox and Chrome, and now IE9 supports Geolocation as well.

The Geolocation API defines a high-level interface to location information associated only with the device hosting the implementation, such as latitude and longitude. The API itself is agnostic of the underlying location information sources. Common sources of location information include Global Positioning System (GPS) and location inferred from network signals such as IP address, RFID, WiFi and Bluetooth MAC addresses, and GSM/CDMA cell IDs, as well as user input. No guarantee is given that the API returns the device’s actual location.[W3C – Definition]

How to use it in your page:

You can get the Geolocation object from

navigator.geolocation

As a best practice, check whether the geolocation object exists; if it is null or undefined, the browser does not support Geolocation.

 

// Ask the browser for the current position and handle it in a callback.
if (navigator.geolocation) {
    navigator.geolocation.getCurrentPosition(getPositionFunction);
}

function getPositionFunction(position) {
    var latitude = position.coords.latitude;
    var longitude = position.coords.longitude;
}

Features:

  • Requesting repeated position updates, so the location is refreshed automatically (see the sketch after this list).
  • Requesting a cached position instead of forcing a fresh lookup.
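
Here is a minimal sketch of both features, assuming hypothetical callbacks named showPosition and handleError:

// Repeated updates: the success callback fires whenever the position changes.
var watchId = navigator.geolocation.watchPosition(showPosition, handleError);

// Cached position: accept a previously cached fix up to 10 minutes old
// (maximumAge is given in milliseconds).
navigator.geolocation.getCurrentPosition(showPosition, handleError, { maximumAge: 600000 });

// Stop the repeated updates once they are no longer needed.
navigator.geolocation.clearWatch(watchId);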

Privacy:

Because retrieving a user’s location raises privacy concerns, the W3C specification includes privacy considerations for implementors of the Geolocation API (i.e. browsers): user agents must not send location information to web sites without the user’s permission.

Browsers implement this by popping up a dialog bar at the top of the page, where the user can either “allow” or “deny” sending the location to the web site.

There are also privacy considerations for recipients (i.e. web sites that use the API): if geolocation information is stored, the user should be able to update or delete it.

Implementation and more properties:

This part is for those who are going to implement their own geolocation handling or want to dig into the API itself.

There are two main interfaces: NavigatorGeolocation and Geolocation.

The Geolocation interface declares the signatures of all the methods (getCurrentPosition, watchPosition and clearWatch).

The Position interface has a coords property and a timestamp.

Coordinates – holds all the location information. The properties in Coordinates are:

  1. latitude
  2. longitude
  3. altitude – height of the position in meters; null if the implementation cannot provide it.
  4. accuracy – accuracy of the location, in meters.
  5. altitudeAccuracy – accuracy of the altitude, in meters; both accuracy values should correspond to a 95% confidence level.
  6. heading – direction of travel, in degrees.
  7. speed – current ground speed, in m/s.
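
A short sketch that reads a few of these properties in the success callback; the element id 'geo-output' is just a placeholder of mine:

navigator.geolocation.getCurrentPosition(function (position) {
    var c = position.coords;
    var text = 'Lat: ' + c.latitude + ', Lon: ' + c.longitude +
               ' (accuracy: ' + c.accuracy + ' m)';
    // altitude, heading and speed may be null when the device cannot provide them.
    if (c.speed !== null) {
        text += ', speed: ' + c.speed + ' m/s';
    }
    document.getElementById('geo-output').innerHTML = text;
});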

Usage:

Many commercial sites have already started using it.

  • Mostly in maps, to show our current position.
  • Apple trailers – gets your current position and shows the movies playing in your city.

So use HTML5 Geolocation and provide a better RIA experience to your clients.

That said, when I tried to get my location through some test sites, it showed my current city but the position was off by nearly 20 km.


— The above information is taken from the W3C specification.

JavaScript Minifier

AJAX Minifier

Microsoft AJAX Minifier 4.0 has been released – do you wonder what it is about?

The Microsoft Ajax Minifier enables you to improve the performance of your Ajax applications by reducing the size of your JavaScript files. The Microsoft Ajax Minifier supports two levels of minification: normal crunching and hypercrunching. Normal crunching refers to the process of removing unnecessary whitespace, comments, semicolons, and curly braces. Hypercrunching refers to the process of shortening the names of local variables and removing unreachable code.
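
To make the two levels concrete, here is a hand-written illustration (not actual tool output) of what normal crunching and hypercrunching do to a small function:

// Original
function calculateTotal(priceList) {
    var runningTotal = 0;   // accumulate the sum
    for (var i = 0; i < priceList.length; i++) {
        runningTotal += priceList[i];
    }
    return runningTotal;
}

// Normal crunching: unnecessary whitespace and comments removed.
function calculateTotal(priceList){var runningTotal=0;for(var i=0;i<priceList.length;i++){runningTotal+=priceList[i]}return runningTotal}

// Hypercrunching: local variable names shortened as well.
function calculateTotal(p){var t=0;for(var i=0;i<p.length;i++){t+=p[i]}return t}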

The Microsoft Ajax Minifier includes the following components:

  • Command-line tool – ajaxmin.exe – Enables you to minify JavaScript files from a command prompt.
  • MSBuild task – ajaxmintask.dll – Enables you to minify JavaScript files in a Visual Studio project automatically when performing a build.
  • Component – ajaxmin.dll – Enables you to use the Microsoft Ajax Minifier programmatically.

Here is the link where you can download the AJAX Minifier:

http://aspnet.codeplex.com/releases/view/34488

And here is a tutorial link where it is explained briefly:

http://www.ajaxprojects.com/ajax/tutorialdetails.php?itemid=766

Yesterday I found it hard to add minifier support from my C# code, because I could not find a tutorial anywhere; at last I found this page:

http://www.asp.net/ajaxlibrary/AjaxMinDLL.ashx

There are many other minifiers out there:

JSMin

– one of the most famous tools for minifying JavaScript

http://www.crockford.com/javascript/jsmin.html

Shrinksafe

– part of the Dojo Toolkit

http://o.dojotoolkit.org/docs/shrinksafe

Packer

– by Dean Edwards

http://base2.googlecode.com/svn/trunk/src/apps/packer/packer.html

JSO

— JS-Optimizer

http://js-optimizer.sourceforge.net/

JAWR

– a tool for Java web applications

https://jawr.dev.java.net/

An online tool — http://jscompress.com/

YUI

– Yahoo’s high-performance, lightweight compressor

http://developer.yahoo.com/yui/compressor/

Finally, by using these minifiers you can make your application lightweight, high performance, and quicker to load than the rest.