NeuQuant.js (JavaScript color quantization) hidden bug in JS conversion


The JavaScript code seems to ignore that C truncates the fractional part of a division result when it is assigned to an integer variable. So int i = 5 / 2; is 2 in C, but var i = 5 / 2; is 2.5 in JavaScript.

That said, change this line:

    delta = samplepixels / ncycles;

to:

    delta = (samplepixels / ncycles) | 0;

This solves the issue, but it is not clear to me whether this change fixes all of the possible integer-conversion problems or only the one exposed in the question.

Note that I have used the bitwise OR operator to truncate the result. This is a classic way to truncate a number in JavaScript, because bitwise operators treat their operands as 32-bit integers.
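
As a quick illustration of the difference (plain JavaScript, independent of NeuQuant):

    var a = 5 / 2;        // 2.5 — JavaScript division always produces a float
    var b = (5 / 2) | 0;  // 2   — OR with 0 truncates toward zero, like C's int division
    var c = (-5 / 2) | 0; // -2  — truncation, not flooring (Math.floor(-2.5) is -3)
    console.log(a, b, c); // 2.5 2 -2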

It took me a few days to figure out this subtle bug. The problem is not limited to the JavaScript implementation; the C/C++ and C# versions have it too. The bug is in the way pixels are sampled during the learning process. The code steps through the image using one of four prime numbers: 499, 491, 487, and 503. If the image is 500x499, for example, it chooses 491 and the pixels are sampled fairly uniformly, as follows (red dots are sampled pixels):

[image: good sampling]

Now, with a 500x500 image, 499 is chosen and the sampling is awfully bad:

[image: bad sampling]
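
For context, the step selection in the original learn() routine works roughly like this (a sketch based on the published NeuQuant ports, where lengthcount is the pixel buffer size in bytes; not a verbatim copy):

    // Sketch: how learn() picks the sampling step from the image byte count.
    function pickStep(lengthcount) {
        var prime1 = 499, prime2 = 491, prime3 = 487, prime4 = 503;
        if (lengthcount % prime1 !== 0) return 3 * prime1;
        if (lengthcount % prime2 !== 0) return 3 * prime2;
        if (lengthcount % prime3 !== 0) return 3 * prime3;
        return 3 * prime4;
    }

    console.log(pickStep(500 * 499 * 3)); // 1473 = 3 * 491 for a 500x499 image
    console.log(pickStep(500 * 500 * 3)); // 1497 = 3 * 499 for a 500x500 image

The sampling loop then repeatedly advances by that many bytes, wrapping around at the end of the buffer, which is why an unlucky combination of image size and prime can produce the poor coverage shown above.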

I got rid of all the prime number nonsense and used a good old random number generator:

    /* pick a uniformly random byte offset into the image buffer */
    int step = ((float)rand() / (float)RAND_MAX) * lengthcount;
    if( step >= lengthcount )       /* rand() can return RAND_MAX, so clamp */
        step = lengthcount - 1;
    p = thepicture + step;          /* sample the pixel at that offset */

Works great with any image size!
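
The same idea carries over to the JavaScript port. Something like the following should work (a sketch only; lengthcount and the 3-bytes-per-pixel layout are assumptions taken from the usual NeuQuant.js source, so check them against your copy):

    // Sketch: sample a uniformly random pixel instead of stepping by a prime.
    var npixels = (lengthcount / 3) | 0;            // buffer holds 3 bytes per pixel
    var pix = ((Math.random() * npixels) | 0) * 3;  // byte offset of a random pixel
    // Math.random() is always < 1, so no clamp is needed as in the C version.

Rounding to a multiple of 3 keeps the offset on a pixel boundary, which the original prime-based steps also guaranteed.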
