Measuring and benchmarking processing power of a javascript engine in a browser

Submitted by 随声附和 on 2019-12-19 09:44:21

Question


What is an accurate way to measure the performance of a JavaScript engine like V8 or SpiderMonkey? It should at least produce low variance from one run to the next, and ideally make it possible to rank different JavaScript engines across operating systems and hardware configurations.

My first attempt was the code below, placed in an otherwise empty web page that I loaded in several browsers. I then executed the same code in Google Chrome's JavaScript console, and the results came out very different, as you'll see:

// Arithmetic mean of an array of numbers
var mean = function (distr) {
    var sum = 0;
    for (var i = 0; i < distr.length; i++) {
        sum += distr[i];
    }
    return sum / distr.length;
};

// Population standard deviation, given the precomputed mean
var stdev = function (distr, mean) {
    var diffsquares = 0;
    for (var i = 0; i < distr.length; i++) {
        diffsquares += Math.pow(distr[i] - mean, 2);
    }
    return Math.sqrt(diffsquares / distr.length);
};


var OPs = 1000000;

var results = [];
// Repeat the timing loop 60 times and record an estimated "FLOPS" figure for each run
for (var t = 0; t < 60; t++) {
    var start = (new Date()).getTime();
    // i starts at 0.5 so every increment is a floating-point addition;
    // it is incremented in both the header and the body, roughly OPs additions in total
    for (var i = 0.5; i < OPs; i++) {
        i++;
    }
    var end = (new Date()).getTime();
    var took = end - start;      // elapsed time in milliseconds
    var FLOPS = OPs / took;      // note: this is really operations per millisecond
    results.push(FLOPS);
}

var average = mean(results);
var deviation = stdev(results, average);

console.log('Average: ' + average + ' FLOPS. Standard deviation: ' + deviation + ' FLOPS');

Here are the results:

NodeJS 0.5.0

  1. Average: 74607.30446024566 FLOPS. Standard deviation: 4129.4008527666265 FLOPS
  2. Average: 73974.89765136827 FLOPS. Standard deviation: 4574.367360870471 FLOPS
  3. Average: 73923.55086434036 FLOPS. Standard deviation: 5768.396926072297 FLOPS

Chrome 13.0.782.112 (From the Console (Ctrl+Shift+J))

  1. Average: 1183.409340319158 FLOPS. Standard deviation: 24.463468674550658 FLOPS
  2. Average: 1026.8727431432026 FLOPS. Standard deviation: 18.32394087291766 FLOPS
  3. Average: 1063.7000331534252 FLOPS. Standard deviation: 22.928786803808094 FLOPS

Chrome 13.0.782.112 (as a webpage)

  1. Average: 47547.03408688914 FLOPS. Standard deviation: 4064.7464541422833 FLOPS
  2. Average: 49273.65762892078 FLOPS. Standard deviation: 1553.1768207400576 FLOPS
  3. Average: 47849.72703247966 FLOPS. Standard deviation: 3445.930694070375 FLOPS

Firefox 6.0

  1. Average: 62626.63398692811 FLOPS. Standard deviation: 3543.4801728588277 FLOPS
  2. Average: 85572.76057276056 FLOPS. Standard deviation: 4336.354514715926 FLOPS
  3. Average: 63780.19323671495 FLOPS. Standard deviation: 3323.648677036589 FLOPS

Opera 11.50

  1. Average: 38462.49044165712 FLOPS. Standard deviation: 2438.527900104241 FLOPS
  2. Average: 37968.736460671964 FLOPS. Standard deviation: 2186.9271687271607 FLOPS
  3. Average: 38638.1851173518 FLOPS. Standard deviation: 1677.6876987114347 FLOPS

Something strange happened: the benchmark run from the Chrome console took far longer than in the other browsers and in NodeJS, something like 30 seconds in Chrome versus about 2 seconds elsewhere. The standard deviations from the Chrome console are also very small compared to the others. Why is there such a huge difference between executing the code in the console and executing it in a web page?

If this is all too stupid, let me remind you that I "learned" JavaScript (and how to code in general) by myself and not very long ago, so I suck at a lot of things.

What is a good measure of this? I'd like to focus on the speed of math operations rather than things like regex speed. What do you recommend? I also tried generating 10×10 matrices of floating-point numbers and multiplying them lots of times; the result comes out at 7, 8, or 9 MFLOPS every time (mostly 7) on Chrome. If that isn't a completely stupid approach and someone wants the code, I'm happy to pastebin it.
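In rough outline it does something like this (a simplified sketch, not the exact code that produced those numbers):

// Build an n x n matrix of random floating-point numbers
function randomMatrix(n) {
    var m = [];
    for (var i = 0; i < n; i++) {
        m.push([]);
        for (var j = 0; j < n; j++) {
            m[i].push(Math.random());
        }
    }
    return m;
}

// Naive matrix multiplication: each entry costs n multiplies and n adds
function multiply(a, b) {
    var n = a.length, c = [];
    for (var i = 0; i < n; i++) {
        c.push([]);
        for (var j = 0; j < n; j++) {
            var sum = 0;
            for (var k = 0; k < n; k++) {
                sum += a[i][k] * b[k][j];
            }
            c[i].push(sum);
        }
    }
    return c;
}

var N = 10, RUNS = 10000;
var a = randomMatrix(N), b = randomMatrix(N);
var flopsPerMultiply = 2 * N * N * N;   // N*N*N multiplies + N*N*N adds

var sink = 0;                           // use the results so the work can't be skipped
var start = (new Date()).getTime();
for (var r = 0; r < RUNS; r++) {
    sink += multiply(a, b)[0][0];
}
var elapsedSeconds = ((new Date()).getTime() - start) / 1000;

var mflops = RUNS * flopsPerMultiply / elapsedSeconds / 1e6;
console.log('MFLOPS: ' + mflops.toFixed(2) + ' (checksum ' + sink.toFixed(3) + ')');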


Answer 1:


JS performance optimization is a huge area in general, and it's rather ambitious to start from scratch.

If I were you, I'd take a look at some existing projects around this space:

  • Benchmark.js handles the timing and statistical analysis (averaging, computing variance); a minimal usage sketch follows this list.
  • JSPerf lets anyone create and run tests and then look at results for any browser. There's a large repository of tests there that you can peruse.
  • BrowserScope is the results storage for JSPerf tests, and tracks results per-UA.
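For example, a minimal Benchmark.js suite looks roughly like this (a sketch; it assumes the benchmark.js script, plus the lodash dependency that recent versions require, is already loaded on the page):

// Compare two ways of doing the same floating-point work and report ops/sec
var suite = new Benchmark.Suite();

suite
  .add('Math.pow', function () {
    Math.pow(Math.PI, 2);
  })
  .add('multiplication', function () {
    Math.PI * Math.PI;
  })
  .on('cycle', function (event) {
    // each cycle prints something like "multiplication x 123,456,789 ops/sec ±0.45%"
    console.log(String(event.target));
  })
  .on('complete', function () {
    console.log('Fastest is ' + this.filter('fastest').map('name'));
  })
  .run({ async: true });

Benchmark.js keeps sampling each test until the timing is statistically significant, which is essentially the averaging and variance work the hand-rolled loop above is trying to do.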



Answer 2:


The Chrome console has a "weird" execution environment that's not quite the web page itself and incurs some performance costs due to that, I would think. That's certainly true for the console in Firefox.

To answer your original question... it really depends on what you want to measure. Different JS engines are good at different things, so depending on the test program you could have Chrome being 5x faster than Firefox, say, or vice versa.

Also, the optimizations browser JITs do can be very heavily dependent on the overall code flow, so the time it takes to do operation A followed by operation B is in general not the same as the sum of the times needed to do A and B separately (it can be much larger, or it can be smaller). As a result, benchmarking anything other than the code you actually want to run is of very limited utility. Running any single piece of code is nearly useless for "ranking web browsers according to performance".
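As a rough illustration of that last point, you can time two operations separately and then back-to-back in the same loop (a sketch using performance.now() where available; the concrete numbers will vary by engine and hardware):

// Run fn(i) `iterations` times and return the elapsed time in milliseconds
function time(label, iterations, fn) {
  var start = performance.now();
  var acc = 0;
  for (var i = 0; i < iterations; i++) {
    acc += fn(i);               // use the result so the work isn't skipped
  }
  var ms = performance.now() - start;
  console.log(label + ': ' + ms.toFixed(1) + ' ms (checksum ' + acc + ')');
  return ms;
}

var N = 1000000;

// Operation A: floating-point math
function opA(i) { return Math.sqrt(i + 0.5); }

// Operation B: string work
function opB(i) { return ('' + i).length; }

var a = time('A alone', N, opA);
var b = time('B alone', N, opB);
var ab = time('A then B', N, function (i) { return opA(i) + opB(i); });

// On a JIT-compiled engine `ab` is generally not equal to `a + b`:
// inlining decisions, type feedback and code layout depend on the combined code path.
console.log('A + B separately: ' + (a + b).toFixed(1) + ' ms, combined: ' + ab.toFixed(1) + ' ms');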



Source: https://stackoverflow.com/questions/7128057/measuring-and-benchmarking-processing-power-of-a-javascript-engine-in-a-browser
