
Benchmarking JavaScript Performance: Firefox 3.5 vs. Google Chrome 3.0

Many web pages and web applications rely on JavaScript, which is why browser vendors tend to treat JavaScript performance as a high priority. Browsers are not simply JavaScript engines, and other factors affect their performance, as a senior product manager on the Internet Explorer development team pointed out in this Computerworld article, which reports the results of JavaScript benchmark tests run on different web browsers. However, the same article says that Internet Explorer 8 Release Candidate 1 completed those benchmark tests four times faster than IE8 Beta 2 did, which suggests that those working on Internet Explorer do consider JavaScript performance important. And since browser vendors consider JavaScript performance important, doing well on these benchmark tests matters to them.

There are three major JavaScript performance test suites, each released by a different browser vendor: WebKit released SunSpider, Mozilla released Dromaeo, and Google released the V8 benchmark. As is often the case with benchmarks, they may not accurately reflect what performs best in the real situations they try to simulate, and these JavaScript benchmarks have their flaws, some of which are still being addressed. Despite those flaws, they can give an indication of how one JavaScript engine compares with another. In the Computerworld article mentioned above, the SunSpider suite was used to test the performance of the JavaScript engines of different browsers.

After I wrote about Google Chrome in the last entry here, I wanted to determine more precisely how much better its JavaScript performance is than that of other browsers. I considered running a few benchmark tests on a few different browsers, but that has already been done many times by others; one can view the results of such tests in the Computerworld article to which I previously posted a link, and here. However, those tests were run before Firefox 3.5 was released, and JavaScript performance in Firefox 3.0 is not as good as it is in Firefox 3.5. In fact, according to this page on Firefox performance, SunSpider results indicate that JavaScript performance in Firefox 3.5 is twice as good as in Firefox 3.0. I also wanted to run these tests myself, so I ran the SunSpider benchmark in both Firefox and Chrome and found that it completed much faster in Chrome.

The SunSpider test suite is a comprehensive one that simulates situations users encounter when browsing the web, such as generating tag clouds from JSON input. In running these tests, I ensured that I was using the latest version of each browser and that no other browser tabs were open, and I created a new Firefox profile for running the SunSpider tests in Firefox. I wanted these test results to be as accurate as possible.
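To give a sense of what a test like that involves, here is a small sketch of a tag-cloud-style task: parse JSON, count term frequencies, and scale each term's display size by frequency. This is only an illustration of the kind of work involved; the input data and the scaling formula are made up, and this is not SunSpider's actual string-tagcloud test code.

```javascript
// Illustrative sketch of a tag-cloud-from-JSON workload, similar in
// spirit to SunSpider's string-tagcloud test (not its actual code).
var input = '["chrome","firefox","chrome","sunspider","chrome","firefox"]';
var terms = JSON.parse(input);

// Count how many times each term appears.
var counts = {};
for (var i = 0; i < terms.length; i++) {
    var term = terms[i];
    counts[term] = (counts[term] || 0) + 1;
}

// Find the highest count, then map each term to a font size
// between 10px and 40px, proportional to its frequency.
var max = 0;
for (var t in counts) {
    if (counts[t] > max) max = counts[t];
}
var cloud = [];
for (var t in counts) {
    var size = Math.round(10 + 30 * (counts[t] / max));
    cloud.push('<span style="font-size:' + size + 'px">' + t + '</span>');
}
console.log(cloud.join(' '));
```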

I found it interesting to see the mean times it took to complete the tests in the SunSpider suite, along with the 95% confidence intervals, which, as some readers may know, are intervals constructed so that, 95% of the time, they contain the true mean. I viewed the source code on the SunSpider website to see how those values are calculated using a Student's t-distribution, and was reminded of concepts I learned in a university statistics course. After running the tests three times with Google Chrome, the average time it took to complete them was 822.2 ms. I then ran the tests three times with Firefox, and the average completion time was 1756.8 ms; Firefox took more than twice as long as Chrome to complete the suite. One can verify this by viewing the results of the first, second, and third runs with Chrome, and of the first, second, and third runs with Firefox. One can also copy URLs from one set of test results into the text box on the page containing the other browser's results to compare the two directly.
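For those curious about the statistics, here is a simplified sketch of how a 95% confidence interval for a mean completion time can be computed with a Student's t-distribution. SunSpider's comparison code works along these lines, but this is not its actual code, and the three run times at the end are hypothetical values, not my reported results.

```javascript
// Simplified sketch: compute the mean of a set of run times and a
// 95% confidence interval around it using Student's t-distribution.
function mean(xs) {
    var sum = 0;
    for (var i = 0; i < xs.length; i++) sum += xs[i];
    return sum / xs.length;
}

function confidenceInterval95(xs) {
    // Two-tailed 95% critical t-values, indexed by degrees of freedom.
    var tTable = { 1: 12.71, 2: 4.303, 3: 3.182, 4: 2.776, 5: 2.571 };
    var n = xs.length;
    var m = mean(xs);
    var sumSq = 0;
    for (var i = 0; i < n; i++) sumSq += (xs[i] - m) * (xs[i] - m);
    var stdDev = Math.sqrt(sumSq / (n - 1)); // sample standard deviation
    var stdErr = stdDev / Math.sqrt(n);      // standard error of the mean
    var margin = tTable[n - 1] * stdErr;     // half-width of the interval
    return { mean: m, low: m - margin, high: m + margin };
}

// Example: three hypothetical total times (ms) from three runs.
console.log(confidenceInterval95([820.4, 821.9, 824.3]));
```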

As you may have seen if you have viewed results of other JavaScript benchmark tests posted on the web, they tend to list only mean completion times; confidence intervals are not mentioned in the Computerworld article cited above. When one wants to see precisely how much faster one browser's JavaScript engine is than another's, one should take confidence intervals into account. In the tests that I ran, however, the confidence intervals were narrow, amounting to only a small percentage of the means, and both the means and the intervals were consistent across the three runs on each browser. Since the two browsers' intervals are nowhere near overlapping, this gives a strong indication that Chrome's JavaScript performance is significantly better than that of Firefox.
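To make that reasoning concrete, here is a small sketch of the comparison: if two 95% confidence intervals do not overlap, the difference between the means is statistically significant at that level. The means below are the ones I reported, but the interval bounds are hypothetical placeholders, not my exact measured intervals.

```javascript
// Sketch: two results differ significantly if their 95% confidence
// intervals do not overlap. Interval bounds here are hypothetical.
function intervalsOverlap(a, b) {
    return a.low <= b.high && b.low <= a.high;
}

var chrome  = { mean: 822.2,  low: 815.0,  high: 829.4 };   // hypothetical bounds
var firefox = { mean: 1756.8, low: 1740.0, high: 1773.6 };  // hypothetical bounds

if (!intervalsOverlap(chrome, firefox)) {
    var ratio = firefox.mean / chrome.mean;
    console.log('Significant: Firefox took ' + ratio.toFixed(2) +
                'x as long as Chrome');
}
```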

Google Chrome may outperform other browsers in JavaScript performance, but Internet Explorer's JavaScript performance is as good as Google Chrome's when Google Chrome Frame is used with it. According to this other Computerworld article, when Internet Explorer is used with Chrome's JavaScript engine via Chrome Frame, its JavaScript performance is, understandably, much better; once again, Computerworld used the SunSpider suite to measure these differences. Google Chrome's JavaScript engine may perform better than other JavaScript engines now, and it appears that it will continue to do so: in this post on the Google Chrome blog, it was said that JavaScript performance in the latest beta has increased by 30% since the last stable release of Google Chrome.