I'm looking for the fastest way to decode a local MPEG-4 video's frames on the iPhone. I'm simply interested in the luminance values of the pixels in every 10th frame.
If you are willing to use an iOS 5-only solution, take a look at the sample app ChromaKey from the 2011 WWDC session on AVCaptureSession.
That demo captures 30 FPS of video from the built-in camera and passes each frame to OpenGL as a texture. It then uses OpenGL to manipulate the frame and optionally writes the result to an output video file.
The code uses some serious low-level magic (a CVOpenGLESTextureCache) to bind a Core Video pixel buffer from an AVCaptureSession to an OpenGL texture so they share memory in the graphics hardware.
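To give a feel for that binding, here is a rough sketch in Swift using the iOS 5-era CVOpenGLESTextureCache API. Binding plane 0 as a GL_LUMINANCE texture is my own assumption about how you'd grab just the Y channel; context setup, rendering, and error checking are elided.

```swift
import CoreVideo
import OpenGLES

// Create the texture cache once, tied to your GL context.
let glContext = EAGLContext(api: .openGLES2)!
var textureCache: CVOpenGLESTextureCache?
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, glContext, nil, &textureCache)

// Wrap a pixel buffer's luma plane as a GL texture with no CPU copy.
func lumaTexture(for pixelBuffer: CVPixelBuffer) -> CVOpenGLESTexture? {
    var texture: CVOpenGLESTexture?
    CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, textureCache!, pixelBuffer, nil,
        GLenum(GL_TEXTURE_2D), GL_LUMINANCE,
        GLsizei(CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)),
        GLsizei(CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)),
        GLenum(GL_LUMINANCE), GLenum(GL_UNSIGNED_BYTE),
        0,  // plane 0 of a biplanar buffer is the Y (luma) plane
        &texture)
    return texture
}
```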
It should be fairly straightforward to use a movie file as input rather than camera input. One wrinkle: an AVCaptureSession itself only accepts capture devices, so the way to read a local file is to decode it with an AVAssetReader, which hands you the same kind of pixel buffers; the rest of the pipeline can stay as-is.
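A minimal sketch of that reader setup (Swift; the helper name, error handling, and pixel-format choice are mine):

```swift
import AVFoundation

// Hypothetical helper: set up an AVAssetReader that decodes a local
// movie file and vends decompressed frames as CVPixelBuffers.
func makeFrameReader(for url: URL) throws -> (AVAssetReader, AVAssetReaderTrackOutput) {
    let asset = AVURLAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else {
        throw NSError(domain: "FrameReader", code: -1)
    }
    let reader = try AVAssetReader(asset: asset)
    // Ask for biplanar Y/CbCr output so the luma plane is contiguous.
    let settings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String:
            kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    ]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(output)
    guard reader.startReading() else {
        throw reader.error ?? NSError(domain: "FrameReader", code: -2)
    }
    return (reader, output)
}
```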
You could probably set up the output to deliver frames in a biplanar Y/CbCr format rather than RGB (e.g. kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, as in the sketch above), where the Y plane is the luminance. Failing that, it would be a pretty simple matter to write a shader that converts each pixel's RGB values to luminance.
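If you skip the GPU entirely, pulling the luminance values out on the CPU is just a walk over plane 0 of that biplanar buffer. A sketch, assuming the buffer was decoded with the pixel format above:

```swift
import CoreVideo

// Copy the luminance (Y) plane out of a biplanar pixel buffer.
// Plane 0 of a 420YpCbCr8BiPlanar buffer is one luma byte per pixel.
func lumaBytes(from pixelBuffer: CVPixelBuffer) -> [UInt8] {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
    let rowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    let base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0)!
        .assumingMemoryBound(to: UInt8.self)

    var luma = [UInt8]()
    luma.reserveCapacity(width * height)
    for row in 0..<height {
        // Rows may be padded, so copy exactly `width` bytes per row.
        luma.append(contentsOf: UnsafeBufferPointer(start: base + row * rowBytes,
                                                    count: width))
    }
    return luma
}
```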
You should be able to do all this on *all* frames, not just every 10th frame.
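Tying the two sketches together, a driver loop that decodes every frame but only does the per-pixel work on every 10th might look like this (`movieURL` is a placeholder path):

```swift
import AVFoundation

// `movieURL` is a placeholder; point it at your local MPEG-4 file.
let movieURL = URL(fileURLWithPath: "/path/to/movie.m4v")

do {
    let (reader, output) = try makeFrameReader(for: movieURL)
    var frameIndex = 0
    while reader.status == .reading,
          let sample = output.copyNextSampleBuffer() {
        defer { frameIndex += 1 }
        // Every frame still has to be decoded, but only every 10th
        // one pays for the per-pixel luminance copy.
        guard frameIndex % 10 == 0,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sample) else { continue }
        let luma = lumaBytes(from: pixelBuffer)
        print("frame \(frameIndex): \(luma.count) luma bytes")
    }
} catch {
    print("failed to read movie: \(error)")
}
```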