Question
I'm trying to use Cocoa to grab images from a webcam. I'm able to get the image in RGBA format using QTKit and the didOutputVideoFrame delegate call, converting the CVImageBuffer to a CIImage and then to an NSBitmapImageRep.
I know my camera captures natively in YUV. What I want is to get the YUV data directly from the CVImageBuffer and process the frame before displaying it.
My question is: How can I get the YUV data from the CVImageBuffer?
Thanks.
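For reference, the pipeline described above looks roughly like this. This is a minimal sketch, not code from the question: the FrameGrabber class name is hypothetical, while the delegate selector is the actual QTCaptureDecompressedVideoOutput callback.

```objc
#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>
#import <QuartzCore/QuartzCore.h>

@interface FrameGrabber : NSObject // hypothetical delegate class
@end

@implementation FrameGrabber
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    // The path described in the question:
    // CVImageBuffer -> CIImage -> NSBitmapImageRep (RGBA).
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:videoFrame];
    NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithCIImage:ciImage];
    // ... hand rep off for display ...
    [rep release]; // manual retain/release, typical for QTKit-era code
}
@end
```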
Answer 1:
You might be able to create a CIImage from the buffer using +[CIImage imageWithCVImageBuffer:] and then render that CIImage into a CGBitmapContext of the desired pixel format. Note, I have not tested this solution.
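A minimal sketch of what that might look like, untested as the answer says; the RGBA layout and the caller-supplied dimensions are assumptions. Note that CGBitmapContext supports RGB-family pixel formats rather than YUV, so this route hands back converted bytes, not the camera's native YUV.

```objc
#import <ApplicationServices/ApplicationServices.h>
#import <QuartzCore/QuartzCore.h>
#include <stdlib.h>

// Render a CVImageBuffer into a caller-owned RGBA bitmap via Core Image.
static void *CopyFrameAsRGBA(CVImageBufferRef videoFrame, size_t width, size_t height)
{
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:videoFrame];

    size_t bytesPerRow = width * 4; // 8-bit RGBA
    void *bitmapData = calloc(height, bytesPerRow);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef cgContext = CGBitmapContextCreate(bitmapData, width, height,
                                                   8, bytesPerRow, colorSpace,
                                                   kCGImageAlphaPremultipliedLast);

    // Draw the frame into the bitmap context, letting Core Image convert
    // from the buffer's native format to the context's pixel format.
    CIContext *ciContext = [CIContext contextWithCGContext:cgContext options:nil];
    [ciContext drawImage:ciImage
                  inRect:CGRectMake(0, 0, width, height)
                fromRect:[ciImage extent]];

    CGContextRelease(cgContext);
    CGColorSpaceRelease(colorSpace);
    return bitmapData; // caller frees
}
```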
Answer 2:
I asked what I thought was a different question, but it turned out to have the same answer as this one: raw data from CVImageBuffer without rendering?
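For completeness, the approach the linked question converges on is to lock the pixel buffer and read its base address directly, with no rendering step. A minimal sketch, assuming the frame is a CVPixelBuffer and that the capture output has been configured for a YUV pixel format such as kCVPixelFormatType_422YpCbCr8 ('2vuy'):

```objc
#import <CoreVideo/CoreVideo.h>

static void InspectRawBytes(CVImageBufferRef videoFrame)
{
    // For decompressed video frames, the image buffer is a pixel buffer.
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)videoFrame;

    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    OSType format = CVPixelBufferGetPixelFormatType(pixelBuffer); // e.g. '2vuy'
    size_t height = CVPixelBufferGetHeight(pixelBuffer);

    if (CVPixelBufferIsPlanar(pixelBuffer)) {
        // Planar YUV: each plane has its own base address and row stride.
        size_t planeCount = CVPixelBufferGetPlaneCount(pixelBuffer);
        for (size_t i = 0; i < planeCount; i++) {
            void *plane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, i);
            size_t stride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, i);
            // ... process this plane's bytes ...
        }
    } else {
        // Packed formats such as '2vuy': one block of interleaved Y/Cb/Cr samples.
        void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        // ... process height * bytesPerRow bytes starting at base ...
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}
```

If the capture output is a QTCaptureDecompressedVideoOutput, its pixelBufferAttributes dictionary (keyed by kCVPixelBufferPixelFormatTypeKey) is the usual place to request a specific YUV format so these bytes arrive unconverted.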
Source: https://stackoverflow.com/questions/2837383/how-can-i-obtain-raw-data-from-a-cvimagebuffer-object