For my current project I'm reading the main camera output of the iPhone. I'm then converting the pixel buffer to a cached OpenGL texture through the method CVOpenGLESTextureCacheCreateTextureFromImage.
You can't cast a still image texture to a CVOpenGLESTextureCacheRef. Core Video lets you map video frames directly to OpenGL ES textures: you pass it a video pixel buffer and Core Video creates the texture for you, already in video memory.
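Below is a minimal sketch of that path in Swift, assuming you already have an EAGLContext and receive BGRA pixel buffers from the camera (e.g. an AVCaptureVideoDataOutput configured for kCVPixelFormatType_32BGRA). The `CameraTextureMapper` and `makeTexture` names are purely illustrative, not part of any Apple API.

```swift
import CoreVideo
import OpenGLES

/// Illustrative wrapper: maps camera pixel buffers to OpenGL ES textures
/// through a CVOpenGLESTextureCache, avoiding a CPU-side copy.
final class CameraTextureMapper {
    private var textureCache: CVOpenGLESTextureCache?

    init?(context: EAGLContext) {
        // The texture cache is created once and reused for every frame.
        let status = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil,
                                                  context, nil, &textureCache)
        if status != kCVReturnSuccess { return nil }
    }

    /// Wraps the pixel buffer's memory as a GL texture; keep the returned
    /// CVOpenGLESTexture alive for as long as you draw with it.
    func makeTexture(from pixelBuffer: CVPixelBuffer) -> CVOpenGLESTexture? {
        guard let cache = textureCache else { return nil }

        var texture: CVOpenGLESTexture?
        let width  = GLsizei(CVPixelBufferGetWidth(pixelBuffer))
        let height = GLsizei(CVPixelBufferGetHeight(pixelBuffer))

        let status = CVOpenGLESTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault,
            cache,
            pixelBuffer,               // the camera frame (CVImageBuffer)
            nil,                       // texture attributes
            GLenum(GL_TEXTURE_2D),
            GL_RGBA,                   // internal format
            width,
            height,
            GLenum(GL_BGRA),           // assumes kCVPixelFormatType_32BGRA frames
            GLenum(GL_UNSIGNED_BYTE),
            0,                         // plane index (0 for a non-planar BGRA buffer)
            &texture)
        guard status == kCVReturnSuccess, let tex = texture else { return nil }

        // Bind and set filtering before drawing with the texture.
        glBindTexture(CVOpenGLESTextureGetTarget(tex), CVOpenGLESTextureGetName(tex))
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
        return tex
    }

    /// Call after rendering each frame so the cache can release old textures.
    func flush() {
        if let cache = textureCache {
            CVOpenGLESTextureCacheFlush(cache, 0)
        }
    }
}
```

A typical flow would be: create one `CameraTextureMapper` per GL context, call `makeTexture(from:)` in your capture output callback, draw, then call `flush()`.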
To create the OpenGL ES texture, this link may help you: Link