I'm trying to read pixels out of the screen buffer. I'm creating a `CGImageRef` with `CGDisplayCreateImage`, but the values from `CGImageGetWidth` and `CGImageGetBytesPerRow` don't add up: the image's width is 1366 pixels, yet the bytes per row correspond to a width of 1376 pixels. Why the discrepancy, and how do I index into the pixel data correctly?
The bytes per row (also called the “stride”) can be larger than the width of the image. The extra bytes at the end of each row are simply ignored. The bytes for the pixel at (x, y) start at offset `y * bpr + x * bpp` (where `bpr` is bytes-per-row and `bpp` is bytes-per-pixel).
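
For concreteness, here's a minimal sketch in C of reading one pixel this way. It assumes the common 32-bit-per-pixel format that display captures typically use, and the pixel coordinate is chosen arbitrarily for illustration:

```c
// Sketch: read one pixel from a display capture, indexing by stride.
// Assumes a 32-bit-per-pixel format (4 bytes per pixel).
#include <ApplicationServices/ApplicationServices.h>
#include <stdio.h>

int main(void) {
    CGImageRef image = CGDisplayCreateImage(CGMainDisplayID());
    if (!image) return 1;

    size_t width = CGImageGetWidth(image);
    size_t height = CGImageGetHeight(image);
    size_t bpr = CGImageGetBytesPerRow(image);       // stride: may exceed width * bpp
    size_t bpp = CGImageGetBitsPerPixel(image) / 8;  // bytes per pixel

    // Copy the raw pixel bytes out of the image.
    CFDataRef data = CGDataProviderCopyData(CGImageGetDataProvider(image));
    const UInt8 *bytes = CFDataGetBytePtr(data);

    // Index the pixel at (x, y) using the stride, NOT the width.
    size_t x = 10, y = 20;
    const UInt8 *pixel = bytes + y * bpr + x * bpp;
    printf("image %zux%zu, bytesPerRow %zu\n", width, height, bpr);
    printf("pixel (%zu, %zu): %02x %02x %02x %02x\n",
           x, y, pixel[0], pixel[1], pixel[2], pixel[3]);

    CFRelease(data);
    CGImageRelease(image);
    return 0;
}
```

The key point is the indexing line: if you compute the offset as `y * width * bpp` instead of `y * bpr`, every row after the first will be shifted and the image will appear skewed.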
Notice that 1376 is exactly divisible by 32 (and all smaller powers of 2), while 1366 is not. The CPUs in modern Macs have instructions that operate on 16, 32, or 64 bytes at a time, so the `CGImage` algorithms can be more efficient if the image's stride is a multiple of 16, 32, or 64. `CGDisplayCreateImage` was written by someone who knows this.
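
To make the padding concrete, here's a small, illustrative computation. The 64-byte alignment below is an assumption on my part (the alignment Core Graphics actually chooses is an undocumented implementation detail), but it happens to reproduce the 1366 → 1376 numbers above:

```c
// Illustrative arithmetic only: round a row's byte count up to a multiple
// of a power-of-two alignment. 64 bytes is assumed here, not documented.
#include <stdio.h>

static size_t round_up(size_t n, size_t alignment) {
    return (n + alignment - 1) / alignment * alignment;
}

int main(void) {
    size_t width = 1366, bpp = 4;            // 1366 pixels, 4 bytes each
    size_t bpr = round_up(width * bpp, 64);  // 5464 -> 5504 bytes
    printf("bytesPerRow = %zu (= %zu pixels)\n", bpr, bpr / bpp);
    return 0;
}
```

This prints `bytesPerRow = 5504 (= 1376 pixels)`, matching the stride you're seeing for a 1366-pixel-wide display.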