Question
I use AVFoundation to take photos. The trouble is that even with constant exposure duration, ISO and white balance settings I get photos with different brightness. Torch, flash and all available stabilization options are disabled.
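Concretely, the constant settings are applied once during session setup, roughly like this (a minimal Swift sketch; `device` is the AVCaptureDevice, and `fixedDuration`, `fixedISO` and `fixedGains` are placeholders rather than my exact values):
import AVFoundation
// Sketch only: apply fixed exposure and white balance once the device is locked for configuration.
// fixedDuration, fixedISO and fixedGains stand in for whatever constant values are chosen.
try device.lockForConfiguration()
device.setExposureModeCustom(duration: fixedDuration, iso: fixedISO, completionHandler: nil)
device.setWhiteBalanceModeLocked(with: fixedGains, completionHandler: nil)
device.unlockForConfiguration()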
This issue is also present in Apple's standard sample app that introduces working with the camera: https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010112
This is a video I filmed with that app, with no changes except setting manual exposure during session initialization:
https://www.youtube.com/watch?v=g9NOWGVeftI
Notice the sudden brightening, which should not happen if the camera's exposure settings are constant (and no, it is not the moment when the exposure settings are applied and the completion handler is called; the settings have already been set).
The brightening does not always happen, but there can be a significant brightness difference if I move the camera away and then aim at the object again. If I take a series of photos without moving the camera, the brightness stays the same.
(Of course, the photos of the object are taken under the same lighting conditions.)
Maybe this brightening is part of applying the custom exposure settings (it usually happens at first), and its late activation is a process I should speed up, but I don't know how to do that.
I see this effect on an iPod touch 5 and an iPad Air, and I suppose it can happen on other iOS devices too.
It seems that scene brightness affects the final image brightness (and the preview layer's brightness too). The device doesn't just apply the given exposure settings; it adds some correction based on the current scene brightness and changes that correction when the overall brightness of the visible scene changes a lot.
If I cover the camera with my hand, move my hand away and take a photo, it can be brighter than one captured without covering the camera first.
Maybe it's not [only] brightness but contrast, because when I move the camera away from a white display there can be a brightening at the moment the [relatively dark] objects beyond the screen become visible.
The exposure target offset is a little less than zero before the brightening and a little more than zero after it.
I assumed this value is the parameter the unexpected adjustment is based on (as in auto exposure mode).
But attempts to prevent it by observing changes to the target offset and setting an equal exposure target bias failed, because the target offset changes all the time and it is impossible to have a working camera while permanently changing its target bias.
Attempts to force the adjustment by setting an exposure target bias that pushes the exposure target offset far from zero before capturing failed too: nothing happens, and the brightening can still occur after the attempt to compensate. Even in custom mode the target bias affects the exposure offset [visible to the client], but it seems not to affect the device's behavior in the parts responsible for exposure.
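For reference, the compensation attempt amounted to something like this (a sketch rather than my exact code): observe exposureTargetOffset via KVO and mirror it into the bias.
// Sketch of the failed compensation idea: push exposureTargetBias opposite to exposureTargetOffset.
// In practice the offset keeps changing constantly, and the bias does not prevent the brightness jump.
let offsetObservation = device.observe(\.exposureTargetOffset, options: [.new]) { device, _ in
    let desiredBias = max(device.minExposureTargetBias,
                          min(device.maxExposureTargetBias, -device.exposureTargetOffset))
    do {
        try device.lockForConfiguration()
        device.setExposureTargetBias(desiredBias, completionHandler: nil)
        device.unlockForConfiguration()
    } catch {
        // Ignore locking errors in this sketch.
    }
}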
I also found that there are no brightness jumps in locked exposure mode (or I missed them…). I tried setting this mode after setting custom exposure values, but the trouble is that in locked mode the device not only fixes the current exposure values but also performs an initial adjustment that changes the exposure settings.
The exposure values I get from the EXIF data and from the AVCaptureDevice instance after taking photos do not change after the jump. I tried observing the exposure values via KVO, but there is nothing suspicious: when I set custom mode, the exposure duration and ISO change a few times and then the completion handler is called. The brightening can come later, but it does not affect the exposure values I can read.
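The KVO check itself is nothing special, roughly like this (sketch):
// Sketch: log the reported exposure values whenever they change.
// They settle once setExposureModeCustom's completion handler fires and do not move again,
// even when the brightness jump happens later.
let durationObservation = device.observe(\.exposureDuration, options: [.new]) { device, _ in
    print("exposureDuration:", device.exposureDuration.seconds)
}
let isoObservation = device.observe(\.iso, options: [.new]) { device, _ in
    print("ISO:", device.iso)
}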
All of this is confusing. How can I get a straightforward relationship between an image's brightness and the exposure settings?
Answer 1:
- (void)setupAVCapture {
    //-- Set up the capture session.
    _session = [[AVCaptureSession alloc] init];
    [_session beginConfiguration];

    //-- Set the preset session size.
    [_session setSessionPreset:AVCaptureSessionPreset1920x1080];

    //-- Create a video device and an input from that device. Add the input to the capture session.
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice == nil)
        assert(0);

    //-- Add the device to the session.
    NSError *error;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (input == nil)
        assert(0);
    [_session addInput:input];

    //-- Create the output for the capture session.
    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [dataOutput setAlwaysDiscardsLateVideoFrames:YES]; // Probably want to set this to NO when recording

    //-- Set the pixel format to YUV420 (necessary for manual preview).
    [dataOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
                                                             forKey:(id)kCVPixelBufferPixelFormatTypeKey]];

    // Dispatch to the main queue so OpenGL can do things with the data.
    [dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    [_session addOutput:dataOutput];

    [_session commitConfiguration];
    [_session startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Read the EXIF metadata attached to the sample buffer (the kCGImageProperty… keys come from <ImageIO/ImageIO.h>).
    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = [[NSMutableDictionary alloc] initWithDictionary:(__bridge NSDictionary *)metadataDict];
    CFRelease(metadataDict);

    NSDictionary *exifMetadata = [[metadata objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    self.autoBrightness = [[exifMetadata objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];

    // Map the EXIF brightness value from its observed range onto [1.0, 3.0].
    float oldMin = -4.639957; // dark
    float oldMax = 4.639957;  // light
    if (self.autoBrightness > oldMax) oldMax = self.autoBrightness; // extend the range if the scene is brighter than expected
    self.lumaThreshold = ((self.autoBrightness - oldMin) * ((3.0 - 1.0) / (oldMax - oldMin))) + 1.0;

    NSLog(@"brightnessValue %f", self.autoBrightness);
    NSLog(@"lumaThreshold %f", self.lumaThreshold);
}
The lumaThreshold variable is sent as a uniform to my fragment shader, which multiplies the Y sampler texture by it to find the ideal luminosity based on the brightness of the environment. Right now it uses the back camera; I'll probably switch to the front camera, since I'm only changing the "brightness" of the screen to adjust for indoor/outdoor viewing, and the user's eyes face the front camera (not the back one).
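Sending that value to the shader is just an ordinary per-frame float uniform update, roughly like this (a Swift/OpenGL ES sketch; `program` and the uniform name "lumaThreshold" are placeholders for your own GL setup):
import OpenGLES
// Sketch: update the fragment shader's luma uniform each frame before drawing.
// `program` is the compiled shader program; "lumaThreshold" must match the uniform name in the shader.
glUseProgram(program)
let lumaUniform = glGetUniformLocation(program, "lumaThreshold")
glUniform1f(lumaUniform, GLfloat(lumaThreshold))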
Answer 2:
After tinkering I figured out how to easily lock your exposure. During primary camera initialization add:
device.exposureMode = AVCaptureDevice.ExposureMode.custom
as soon as the device is locked for configuration,
and (very important)
device.exposureMode = AVCaptureDevice.ExposureMode.locked
These both ensure that:
1. You get to initialize the camera with your custom settings
2. The camera remains completely locked after changes are made
Your camera init code should look something like this:
try device.lockForConfiguration()
device.exposureMode = AVCaptureDevice.ExposureMode.custom
// durationCust, minISO and deviceGains are your own previously chosen values.
device.setExposureModeCustom(duration: durationCust, iso: minISO, completionHandler: nil)
device.setWhiteBalanceModeLocked(with: deviceGains) { (timestamp: CMTime) -> Void in }
device.exposureMode = AVCaptureDevice.ExposureMode.locked
device.unlockForConfiguration()
When you later want to change the exposure parameters, do not re-declare the exposure mode as locked or custom; only change the exposure itself. The code in your function should look something like this:
try device.lockForConfiguration()
device.setExposureModeCustom(duration: durationCust, iso: minISO, completionHandler: nil)
device.unlockForConfiguration()
I was happy to figure this out - I hope someone finds it helpful :)
Source: https://stackoverflow.com/questions/34511431/ios-avfoundation-different-photo-brightness-with-the-same-manual-exposure-set