How can I do fast image processing from the iPhone camera?

离开以前 · 2021-01-30 02:03

I am trying to write an iPhone application that will do some real-time camera image processing. I used the example presented in the AVFoundation docs as a starting point: sett…

2 Answers
  •  猫巷女王i
    2021-01-30 02:21

    As Steve points out, in my answer here I encourage people to look at OpenGL ES for the best performance when processing and rendering images to the screen from the iPhone's camera. The reason for this is that using Quartz to continually update a UIImage onto the screen is a fairly slow way to send raw pixel data to the display.
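
    For concreteness, here is a minimal sketch of that slow Quartz/UIImage route (in Swift; the class and property names are this sketch's own, and it assumes the capture output is configured for kCVPixelFormatType_32BGRA frames):

    ```swift
    import AVFoundation
    import UIKit

    // The CPU-bound route: wrap each frame's raw pixels in a CGImage and
    // hand it to UIKit, forcing a full Quartz redraw per frame.
    class QuartzFrameRenderer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let imageView: UIImageView

        init(imageView: UIImageView) {
            self.imageView = imageView
        }

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
            defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

            // Assumes 32BGRA frames, the format usually requested from the camera.
            guard let context = CGContext(
                    data: CVPixelBufferGetBaseAddress(pixelBuffer),
                    width: CVPixelBufferGetWidth(pixelBuffer),
                    height: CVPixelBufferGetHeight(pixelBuffer),
                    bitsPerComponent: 8,
                    bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                    space: CGColorSpaceCreateDeviceRGB(),
                    bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                        | CGBitmapInfo.byteOrder32Little.rawValue),
                  let cgImage = context.makeImage() else { return }

            let image = UIImage(cgImage: cgImage)
            DispatchQueue.main.async {
                self.imageView.image = image  // full Quartz redraw of the view each frame
            }
        }
    }
    ```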

    If possible, I encourage you to look to OpenGL ES to do your actual processing, because of how well-tuned GPUs are for this kind of work. If you need to maintain OpenGL ES 1.1 compatibility, your processing options are much more limited than with 2.0's programmable shaders, but you can still do some basic image adjustment.
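
    As an illustration of the ES 2.0 route, here is a minimal GLSL fragment shader (held in a Swift string; the varying and uniform names are this sketch's own choices and must match your vertex shader and setup code) doing one such basic adjustment, a grayscale conversion:

    ```swift
    // OpenGL ES 2.0 fragment shader: Rec. 601 luminance-weighted grayscale.
    let grayscaleFragmentShader = """
    varying highp vec2 textureCoordinate;
    uniform sampler2D inputImageTexture;

    void main()
    {
        lowp vec4 color = texture2D(inputImageTexture, textureCoordinate);
        lowp float luminance = dot(color.rgb, vec3(0.299, 0.587, 0.114));
        gl_FragColor = vec4(vec3(luminance), color.a);
    }
    """
    ```

    Under ES 1.1 you are instead limited to the fixed-function pipeline (texture combiners and blend modes) for effects like this.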

    Even if you're doing all of your image processing using the raw data on the CPU, you'll still be much better off using an OpenGL ES texture for the image data, updating it with each frame. You'll see a jump in performance just by switching to that rendering route.
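
    A sketch of that per-frame texture update, assuming an EAGLContext is already current on this thread and `texture` was created with glGenTextures (the function names are standard OpenGL ES; GL_BGRA_EXT comes from Apple's APPLE_texture_format_BGRA8888 extension):

    ```swift
    import OpenGLES
    import CoreVideo

    // From the APPLE_texture_format_BGRA8888 extension; matches the
    // kCVPixelFormatType_32BGRA frames delivered by the camera.
    let GL_BGRA_EXT = GLenum(0x80E1)

    // Re-upload one camera frame into an existing OpenGL ES texture.
    func upload(_ pixelBuffer: CVPixelBuffer, to texture: GLuint) {
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        glBindTexture(GLenum(GL_TEXTURE_2D), texture)
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
        glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA,
                     GLsizei(CVPixelBufferGetWidth(pixelBuffer)),
                     GLsizei(CVPixelBufferGetHeight(pixelBuffer)),
                     0, GL_BGRA_EXT, GLenum(GL_UNSIGNED_BYTE),
                     CVPixelBufferGetBaseAddress(pixelBuffer))
    }
    ```

    (On iOS 5 and later, CVOpenGLESTextureCache can avoid even this copy.)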

    (Update: 2/18/2012) As I describe in my update to the above-linked answer, I've made this process much easier with my new open source GPUImage framework. This handles all of the OpenGL ES interaction for you, so you can just focus on applying the filters and other effects you'd like to apply to your incoming video. It's anywhere from 5-70X faster than doing this processing using CPU-bound routines and manual display updates.
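
    A minimal GPUImage setup looks roughly like this (shown in Swift against the framework's Objective-C API; the session-preset constant and method names are from the framework, though exact Swift bridging may vary by version):

    ```swift
    import GPUImage
    import AVFoundation
    import UIKit

    // Camera -> filter -> on-screen view; all processing stays on the GPU.
    let videoCamera = GPUImageVideoCamera(
        sessionPreset: AVCaptureSession.Preset.vga640x480.rawValue,
        cameraPosition: .back)
    videoCamera.outputImageOrientation = .portrait

    let filter = GPUImageSepiaFilter()
    let filteredView = GPUImageView(frame: UIScreen.main.bounds)

    videoCamera.addTarget(filter)
    filter.addTarget(filteredView)
    videoCamera.startCameraCapture()
    ```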
