CGImageRef width doesn't agree with bytes-per-row

北海茫月 2021-01-05 18:51

I'm trying to read pixels out of the screen buffer. I'm creating a CGImageRef with CGDisplayCreateImage, but the values returned by CGImageGetWidth and CGImageGetBytesPerRow don't agree: the width is 1366 pixels, yet the bytes per row divided by the bytes per pixel works out to 1376.

1 Answer
  • 2021-01-05 19:27

    The bytes per row (also called the “stride”) can be larger than the width of the image. The extra bytes at the end of each row are simply ignored. The bytes for the pixel at (x, y) start at offset y * bpr + x * bpp (where bpr is bytes-per-row and bpp is bytes-per-pixel). A short code sketch at the end of this answer shows this indexing.

    Notice that 1376 is exactly divisible by 32 (and all smaller powers of 2), while 1366 is not. The CPUs in modern Macs have instructions that operate on 16 or 32 or 64 bytes at a time, so the CGImage algorithms can be more efficient if the image's stride is a multiple of 16 or 32 or 64. CGDisplayCreateImage was written by someone who knows this.
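
    As a quick check on those numbers (assuming the usual 4 bytes per pixel for a display capture, which is an assumption, not something stated in the question):

        let width = 1366           // CGImageGetWidth
        let strideInPixels = 1376  // CGImageGetBytesPerRow / bytes per pixel
        let bytesPerPixel = 4      // assumed 32-bit BGRA/ARGB pixels

        print(width * bytesPerPixel, (width * bytesPerPixel) % 32)                    // 5464, 24 — not aligned
        print(strideInPixels * bytesPerPixel, (strideInPixels * bytesPerPixel) % 64)  // 5504, 0  — 64-byte aligned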
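
    And here is the sketch mentioned above: a minimal Swift example of reading the pixel at (x, y) from a CGDisplayCreateImage capture, indexing with bytesPerRow rather than width * bytesPerPixel. The function name and the (100, 100) coordinates are only for illustration; it also assumes a 32-bit pixel format, so check bitsPerPixel on a real image, and note that capturing the screen may require the screen-recording permission on recent macOS versions.

        import CoreGraphics
        import Foundation

        // Return the raw bytes of the pixel at (x, y) from a capture of the main display.
        func pixelBytes(atX x: Int, y: Int) -> [UInt8]? {
            guard let image = CGDisplayCreateImage(CGMainDisplayID()),
                  let data = image.dataProvider?.data,
                  let base = CFDataGetBytePtr(data),
                  x < image.width, y < image.height else { return nil }

            let bytesPerRow = image.bytesPerRow         // the stride; may exceed width * bytesPerPixel
            let bytesPerPixel = image.bitsPerPixel / 8  // typically 4 (e.g. BGRA)

            // Offset of the pixel: y * bpr + x * bpp, exactly as described above.
            let offset = y * bytesPerRow + x * bytesPerPixel
            return (0..<bytesPerPixel).map { base[offset + $0] }
        }

        if let px = pixelBytes(atX: 100, y: 100) {
            print("pixel bytes:", px)  // byte order depends on the image's bitmapInfo
        }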
