Aug 30, 2011

Marketing implications of benchmarking Firefox, Chrome, Internet Explorer

Benchmarks for Browsers
Material for this table is from Steven J. Vaughan-Nichols's blog post “Firefox 6: A Firefox too far?”
Benchmark system:
·   Gateway DX4710
·   Windows 7 SP1
·   2.5 GHz Intel Core 2 Quad processor
·   6 GB RAM
·   Intel GMA (Graphics Media Accelerator) 3100
·   Netgear Gigabit Ethernet switch
·   60 Mbps cable Internet connection
White cells indicate best scores
Browsers compared: Firefox 6, Chrome 13, IE 9
1.    Acid 3 compliance with web standards (JavaScript, CSS, XML, etc.); max score = 100; higher is better
2.    HTML5 Test; max score = 400; higher is better
3.    Mozilla Kraken 1.0 JavaScript benchmark; lower is better
4.    Google V8 JavaScript Benchmark Suite; higher is better
5.    Peacekeeper JavaScript benchmark; higher is better
6.    SunSpider 0.9.1, the oldest JavaScript benchmark; lower is better

Comparing browsers with benchmark programs can reveal browsers' strengths and weaknesses, and possibly implications for each browser's product strategy. The ZDNet blogger Steven Vaughan-Nichols chose six benchmarks.

Comparing the performance of several browsers on the same very capable computer with a high-speed connection is the most objective way to see which browser
  • enables you to browse the web faster
  • displays the most web page types with no problems
  • supports most-used web applications
  • taxes your local PC the least.
Of course, the configuration of your local PC (both hardware and software, especially the operating system) and the speed of your web connection will greatly affect how any program, including a browser, performs.

Acid 3, the first benchmark in the list, tests compliance with a collection of widely used web standards such as JavaScript, CSS, and XML. It's a general baseline rather than a benchmark optimized for any particular aspect such as JavaScript. Four of the six benchmarks mentioned here test JavaScript specifically.

The second benchmark, for HTML5, is the one I am most interested in because HTML5 is a new standard that adds capabilities with productivity benefits for users. For example, it builds capabilities that previously required plug-ins like Flash directly into the browser, which fits well with web applications delivered from the 'cloud'. It obviates plug-ins: no more downloading plug-ins before certain web content works.
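As a minimal sketch of what "obviates plug-ins" means in practice: before HTML5, playing a video typically meant embedding a Flash object such as `<object type="application/x-shockwave-flash" data="player.swf">`, which fails until the user installs the plug-in. With HTML5's native `<video>` element the browser itself handles playback (the file name below is hypothetical):

```html
<!DOCTYPE html>
<html>
  <head><title>Native HTML5 video</title></head>
  <body>
    <!-- The browser decodes and plays the file itself; no plug-in download.
         "clip.mp4" is a placeholder file name for illustration. -->
    <video src="clip.mp4" controls width="640">
      Fallback text shown by browsers without HTML5 video support.
    </video>
  </body>
</html>
```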

HTML5 is still a young standard, so HTML5-style web pages aren't very common yet. This might be why the newer browsers, Firefox 6 and Chrome, benchmarked about the same while Internet Explorer (IE) lagged significantly. The product-strategy implication may be that Microsoft decided to adopt new standards more slowly because it's more important to take the time to make new versions backward compatible, maintaining a workable upgrade path for IE's huge number of existing customers, especially those in enterprises, who tend to be slow to upgrade.

Chrome is the newest of the three browsers, so it has a smaller 'installed base' that it must 'bring along' and less legacy code with which it must stay compatible.

Firefox is neither the oldest browser nor the newest, but it has the most frequent releases of the three. It already has the largest market share (see table below), and its users tend to be enthusiasts and early adopters. So why run the release treadmill so much faster than the competition? Frequent releases do deliver new features faster, but every release also includes fixes, such as for security holes and compatibility problems.

The last four of the six benchmarks in the table above are for JavaScript. Good JavaScript performance is important because it's the main programming language of browsers and websites. It's also used in applications outside of web pages, such as PDF documents, site-specific browsers, and desktop widgets.
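The JavaScript benchmarks above all work on the same basic principle: run a small, self-contained workload many times and report the elapsed time, where lower is better. A minimal sketch of that idea (the workload below is purely illustrative, not an actual SunSpider or Kraken test):

```javascript
// Illustrative workload: stresses function calls, arithmetic, and
// string building, the kinds of operations benchmark suites exercise.
function workload() {
  function fib(n) { return n < 2 ? n : fib(n - 1) + fib(n - 2); }
  let s = "";
  for (let i = 0; i < 1000; i++) s += i.toString(36); // string churn
  return fib(18) + s.length; // checksum, so the work can't be optimized away
}

// Time `iterations` runs of `fn` and report elapsed milliseconds.
function benchmark(fn, iterations) {
  const start = Date.now();
  let result = 0;
  for (let i = 0; i < iterations; i++) result = fn();
  const elapsed = Date.now() - start;
  return { elapsed, result }; // elapsed ms: lower is better
}

const run = benchmark(workload, 50);
console.log("elapsed ms:", run.elapsed, "checksum:", run.result);
```

Real suites differ mainly in which workloads they choose, which is exactly why a browser can 'win' one JavaScript benchmark and lose another.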

It's surprising that Firefox 6 didn't 'win' Kraken (the third benchmark in the table above) because it is Mozilla's own JavaScript benchmark. Mozilla must identify which aspects of Kraken are causing Firefox's low marks; in any case, it should either optimize Firefox for Kraken or adjust the benchmark. Mozilla's marketing department must be distressed that it cannot use its own benchmark, or indeed any of these six benchmarks, in its campaigns. I should investigate Firefox's strategy to compete against Chrome.

Google aced Acid3 as well as its own JavaScript benchmark, V8. So either Google used the benchmark(s) to guide Chrome's development, or V8 was designed around Chrome, or the good match is an honestly fortuitous coincidence. Optimizing the product for well-known benchmarks is not only good engineering practice but good marketing too.

Firefox came in last by a wide margin in SunSpider (the last benchmark in the table above), the oldest JavaScript benchmark. This could mean (I am guessing) that Firefox has some fundamental incompatibility with older JavaScript idioms, that of the three browsers Firefox has drifted the furthest from them, or that other attributes the benchmark exercises, such as memory management, a very important browser concern, are weak. Nevertheless, market share wins over benchmark performance. Whatever the benchmark results may be, Firefox is almost twice as widely used as IE and Chrome (statistics from W3Schools).