Monday, October 31, 2011

Tools to analyze web pages


Here are six tools that can analyze web pages and tell you why they are slow. Use the following tools to:
  • Make your site faster.
  • Debug site problems, especially client-side and server-side issues.
  • Improve the user experience.
  • Improve the web.

#1: Yahoo! YSlow

The Firebug extension for Firefox allows you to debug, edit, and monitor any website's CSS, HTML, DOM, and JavaScript. YSlow works with the Firebug extension:
YSlow analyzes web pages and suggests ways to improve their performance based on a set of rules for high performance web pages. YSlow is a Firefox add-on integrated with the Firebug web development tool. YSlow grades a web page based on one of three predefined rulesets or a user-defined ruleset. It offers suggestions for improving the page's performance, summarizes the page's components, displays statistics about the page, and provides tools for performance analysis, including Smush.it and JSLint.
Fig.01 Yahoo! Yslow Providing Overall Score For Cyberciti.biz (click to enlarge)
If you apply the tips provided by YSlow, your corporate web site or personal blog can load noticeably faster than before.
Fig.02: YSlow Components Level Report (click to enlarge)
This is useful to find out whether Apache or Lighttpd is compressing (gzipping) files or not.
Fig.03: YSlow! Graphical Representation of Various Components
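You can also verify compression without a browser extension. Here is a quick sketch using curl (assuming curl is installed; the URL is this site's example host, substitute your own):

```shell
# Ask the server for a gzip-compressed response and print only the
# Content-Encoding header; if it says "gzip", compression is enabled.
curl -s -o /dev/null -D - -H 'Accept-Encoding: gzip' http://www.cyberciti.biz/ \
  | grep -i '^content-encoding'
```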

#2: Google Page Speed

Page Speed is an open-source Firefox/Firebug Add-on. You can use Page Speed to evaluate the performance of your web pages and to get suggestions on how to improve them.
Fig.04: Google Page Speed in Action  (click to enlarge)
Fig.05: Google Page Speed Suggestions

#3: Pagetest (IE specific tool)

This tool only works with MS Internet Explorer. From the project web page:
Pagetest is an open source tool for measuring and analyzing web page performance right from your web browser. AOL developed Pagetest internally to automate load time measurement of its many websites, and it has evolved into a powerful tool for web developers and software engineers in testing their web pages and getting instant feedback. We decided to release it to the grander web development community to further help evolve it into an even more useful - and free - web performance tool.
Fig.06: Waterfall Of My Web Page Load Performance Using Pagetest (click to enlarge)

#4: HTTP Server Benchmarking Tool

ab is a tool for benchmarking your Apache Hypertext Transfer Protocol (HTTP) server. It is designed to give you an impression of how your current Apache installation performs. In particular, it shows you how many requests per second your Apache installation is capable of serving. See how to use the ab command.
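A minimal ab run might look like the following; the URL is this site's example host, and the request counts are arbitrary illustrative values:

```shell
# Send 100 requests total, 10 at a time, and report requests/second.
# Note: ab requires a trailing slash (or a path) after the hostname.
ab -n 100 -c 10 http://www.cyberciti.biz/
```

The "Requests per second" line in ab's summary is the headline number; compare it before and after a configuration change rather than treating it as an absolute benchmark.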
httperf is a tool to measure web server performance. It speaks the HTTP protocol in both its HTTP/1.0 and HTTP/1.1 flavors and offers a variety of workload generators. The following command causes httperf to create a connection to the host www.cyberciti.biz, send a request, receive the reply, close the connection, and then print some performance statistics.
$ httperf --hog --server www.cyberciti.biz
Sample Outputs:
httperf --hog --client=0/1 --server=www.cyberciti.biz --port=80 --uri=/ --send-buffer=4096 --recv-buffer=16384 --num-conns=1 --num-calls=1
httperf: warning: open file limit > FD_SETSIZE; limiting max. # of open files to FD_SETSIZE
Maximum connect burst length: 0
Total: connections 1 requests 1 replies 1 test-duration 0.236 s
Connection rate: 4.2 conn/s (236.0 ms/conn, <=1 concurrent connections)
Connection time [ms]: min 236.0 avg 236.0 max 236.0 median 235.5 stddev 0.0
Connection time [ms]: connect 47.0
Connection length [replies/conn]: 1.000
Request rate: 4.2 req/s (236.0 ms/req)
Request size [B]: 70.0
Reply rate [replies/s]: min 0.0 avg 0.0 max 0.0 stddev 0.0 (0 samples)
Reply time [ms]: response 38.0 transfer 151.0
Reply size [B]: header 242.0 content 26976.0 footer 2.0 (total 27220.0)
Reply status: 1xx=0 2xx=1 3xx=0 4xx=0 5xx=0
CPU time [s]: user 0.01 system 0.22 (user 6.3% system 93.6% total 99.9%)
Net I/O: 112.9 KB/s (0.9*10^6 bps)
Errors: total 0 client-timo 0 socket-timo 0 connrefused 0 connreset 0
Errors: fd-unavail 0 addrunavail 0 ftab-full 0 other 0
The following is like the above, except that a total of 100 connections are created and that connections are created at a fixed rate of 10 per second:
# httperf --hog --server www.cyberciti.biz --num-conns 100 --rate 10 --timeout 5
Sample Outputs:
httperf --hog --timeout=5 --client=0/1 --server=www.cyberciti.biz --port=80 --uri=/ --rate=10 --send-buffer=4096 --recv-buffer=16384 --num-conns=100 --num-calls=1
httperf: warning: open file limit > FD_SETSIZE; limiting max. # of open files to FD_SETSIZE
Maximum connect burst length: 1
Total: connections 100 requests 100 replies 100 test-duration 10.089 s
Connection rate: 9.9 conn/s (100.9 ms/conn, <=4 concurrent connections)
Connection time [ms]: min 186.7 avg 193.6 max 302.3 median 187.5 stddev 20.8
Connection time [ms]: connect 36.4
Connection length [replies/conn]: 1.000
Request rate: 9.9 req/s (100.9 ms/req)
Request size [B]: 70.0
Reply rate [replies/s]: min 9.8 avg 9.9 max 10.0 stddev 0.1 (2 samples)
Reply time [ms]: response 39.5 transfer 117.7
Reply size [B]: header 242.0 content 26976.0 footer 2.0 (total 27220.0)
Reply status: 1xx=0 2xx=100 3xx=0 4xx=0 5xx=0
CPU time [s]: user 0.34 system 9.75 (user 3.4% system 96.6% total 99.9%)
Net I/O: 264.1 KB/s (2.2*10^6 bps)
Errors: total 0 client-timo 0 socket-timo 0 connrefused 0 connreset 0
Errors: fd-unavail 0 addrunavail 0 ftab-full 0 other 0
  • Download the httperf utility for UNIX-like operating systems.
  • See the ab and httperf man pages for more details.

#5: Full Page Test

The Full Page Test loads a complete HTML page including all objects (images, CSS, JavaScript, RSS, Flash and frames/iframes). It mimics the way a page is loaded in a web browser. The load time of all objects is shown visually with time bars.
Fig.07: Pingdom page test in action

#6: UNIX wget or fetch Utility

wget is used to retrieve the file(s) pointed to by the URL(s) given on the command line. It can tell you the exact time it took to download your files:
$ wget http://www.cyberciti.biz/files/test.pdf
$ wget http://www.cyberciti.biz/

Sample Outputs:
--2009-07-15 22:09:05--  http://www.cyberciti.biz/
Resolving www.cyberciti.biz... 74.86.48.99
Connecting to www.cyberciti.biz|74.86.48.99|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `index.html'
    [   <=>                                                                                                               ] 26,976      38.0K/s   in 0.7s
2009-07-15 22:09:07 (38.0 KB/s) - `index.html' saved [26976]
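To time a download without keeping a local copy of the file, one sketch (using the shell's time keyword and the same example host) is:

```shell
# -q silences progress output; -O /dev/null discards the downloaded body.
# The shell's time keyword then reports the total elapsed (real) time.
time wget -q -O /dev/null http://www.cyberciti.biz/
```

This is handy for quick before/after comparisons when tuning server settings, since no files accumulate in your working directory.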
