I want to bump-map the camera's images with a fixed texture...
I found the following thread on general bump-mapping techniques: Bump Mapping on the iPhone. But that covers only the technique itself, not how to apply it to the live camera feed.
Responding to your comment: to add OpenGL effects to camera output you need to use AVFoundation, which was introduced in iOS 4. It can be used to get a live feed of pixel data from the camera, which you can upload to OpenGL as a texture, allowing you to do whatever you want with it subsequently.
If you'll forgive me referencing an answer I've supplied in the past, basic stuff to start receiving a suitable buffer is here.
You'll receive CMSampleBuffers on a secondary thread in your delegate method:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
You can use CMSampleBufferGetImageBuffer to get a CVImageBuffer, and from that use a combination of CVPixelBufferLockBaseAddress, CVPixelBufferGetBaseAddress, CVPixelBufferGetBytesPerRow, and possibly CVPixelBufferGetWidth and CVPixelBufferGetHeight to get at a C array of the pixels. In my testing across an iPhone 4 and an iPhone 3GS, the buffer has only ever come back tightly packed (no padding between rows), meaning it can be uploaded immediately as GL_BGRA, since the APPLE_texture_format_BGRA8888 extension is supported on all iPhones to date.
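To make the packing point concrete, here's a plain-C sketch of the defensive copy you'd do before the texture upload if a buffer ever did come back with row padding (the function name `repack_bgra` is mine, not from any Apple API; in the tight case described above you'd skip this and hand the base address straight to glTexImage2D):

```c
#include <stdint.h>
#include <string.h>

/* Copy a possibly row-padded BGRA buffer into a tightly packed one.
   base        - pointer from CVPixelBufferGetBaseAddress
   bytesPerRow - value from CVPixelBufferGetBytesPerRow
   width/height- values from CVPixelBufferGetWidth/Height
   dst         - caller-allocated buffer of width * height * 4 bytes.
   If bytesPerRow == width * 4 the buffer is already tight and a
   single memcpy (or no copy at all) suffices. */
static void repack_bgra(const uint8_t *base, size_t bytesPerRow,
                        size_t width, size_t height, uint8_t *dst)
{
    size_t tightRow = width * 4; /* 4 bytes per BGRA pixel */
    if (bytesPerRow == tightRow) {
        memcpy(dst, base, tightRow * height);
        return;
    }
    for (size_t y = 0; y < height; y++)
        memcpy(dst + y * tightRow, base + y * bytesPerRow, tightRow);
}
```

Remember to call CVPixelBufferLockBaseAddress before reading the base address and to unlock it afterwards.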
From there you obviously need to figure out what to do in terms of vertex and fragment shaders, depending on whether you have a normal map, a bump map that you actually want to affect visible shape (and, if so, whether by vertex perturbation or by ray casting), or something else.
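For the normal-map case, a fragment shader along these lines is one possible starting point. This is a sketch, not the asker's eventual solution; the uniform and varying names (`u_camera`, `u_normalMap`, `u_lightDir`, `v_texCoord`) are placeholders I've chosen, and the light direction is assumed to already be in the normal map's tangent space:

```glsl
// OpenGL ES 2.0 fragment shader: modulate the live camera frame by
// diffuse lighting computed from a fixed tangent-space normal map.
precision mediump float;

varying vec2 v_texCoord;
uniform sampler2D u_camera;    // live camera frame (uploaded as GL_BGRA)
uniform sampler2D u_normalMap; // the fixed bump/normal texture
uniform vec3 u_lightDir;       // normalised light direction, tangent space

void main()
{
    // Unpack the stored normal from [0, 1] back to [-1, 1].
    vec3 n = texture2D(u_normalMap, v_texCoord).rgb * 2.0 - 1.0;
    float diffuse = max(dot(normalize(n), u_lightDir), 0.0);
    gl_FragColor = vec4(texture2D(u_camera, v_texCoord).rgb * diffuse, 1.0);
}
```

The vertex shader only needs to pass through position and texture coordinates for a full-screen quad.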