Question
I want to implement a custom camera in my app, so I am creating it using AVCaptureDevice.
Now I want to show only gray output in my custom camera, and I am trying to achieve this with setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains: and AVCaptureWhiteBalanceGains. I am using the AVCamManual: Extending AVCam to Use Manual Capture sample as a reference.
- (void)setWhiteBalanceGains:(AVCaptureWhiteBalanceGains)gains
{
    NSError *error = nil;
    if ( [videoDevice lockForConfiguration:&error] ) {
        AVCaptureWhiteBalanceGains normalizedGains = [self normalizedGains:gains]; // Conversion can yield out-of-bound values, cap to limits
        [videoDevice setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:normalizedGains completionHandler:nil];
        [videoDevice unlockForConfiguration];
    }
    else {
        NSLog( @"Could not lock device for configuration: %@", error );
    }
}
But for that, I have to pass RGB gain values between 1 and 4, so I created this method to clamp the values to the MIN and MAX limits.
- (AVCaptureWhiteBalanceGains)normalizedGains:(AVCaptureWhiteBalanceGains)gains
{
    AVCaptureWhiteBalanceGains g = gains;
    g.redGain = MAX( 1.0, g.redGain );
    g.greenGain = MAX( 1.0, g.greenGain );
    g.blueGain = MAX( 1.0, g.blueGain );
    g.redGain = MIN( videoDevice.maxWhiteBalanceGain, g.redGain );
    g.greenGain = MIN( videoDevice.maxWhiteBalanceGain, g.greenGain );
    g.blueGain = MIN( videoDevice.maxWhiteBalanceGain, g.blueGain );
    return g;
}
I am also trying to get different effects by passing static RGB gain values, for example:
- (AVCaptureWhiteBalanceGains)normalizedGains:(AVCaptureWhiteBalanceGains)gains
{
    AVCaptureWhiteBalanceGains g = gains;
    g.redGain = 3;
    g.greenGain = 2;
    g.blueGain = 1;
    return g;
}
Now I want to apply this grayscale formula (Pixel = 0.30078125f * R + 0.5859375f * G + 0.11328125f * B) to my custom camera. This is what I have tried for that formula:
- (AVCaptureWhiteBalanceGains)normalizedGains:(AVCaptureWhiteBalanceGains)gains
{
    AVCaptureWhiteBalanceGains g = gains;
    g.redGain = g.redGain * 0.30078125;
    g.greenGain = g.greenGain * 0.5859375;
    g.blueGain = g.blueGain * 0.11328125;
    float grayScale = g.redGain + g.greenGain + g.blueGain;
    g.redGain = MAX( 1.0, grayScale );
    g.greenGain = MAX( 1.0, grayScale );
    g.blueGain = MAX( 1.0, grayScale );
    g.redGain = MIN( videoDevice.maxWhiteBalanceGain, g.redGain );
    g.greenGain = MIN( videoDevice.maxWhiteBalanceGain, g.greenGain );
    g.blueGain = MIN( videoDevice.maxWhiteBalanceGain, g.blueGain );
    return g;
}
So how can I pass this value in the range of 1 to 4?
Is there any way or scale to compare these things?
Any help would be appreciated.
Answer 1:
CoreImage provides a host of filters for adjusting images using the GPU, and can be used efficiently with video data, either from a camera feed or a video file.
There is an article on objc.io showing how to do this. The examples are in Objective-C, but the explanation should be clear enough to follow.
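At its core, the grayscale conversion is just a CIColorMonochrome filter applied to a CIImage. As a rough standalone sketch (written in the same Swift 2 era syntax as the full example further down; inputImage here is only a placeholder for a CIImage you already have):
import CoreImage

// Rough sketch: convert one CIImage to grayscale with CIColorMonochrome.
// inputImage is a placeholder for a CIImage obtained elsewhere.
let monochromeFilter = CIFilter(
    name: "CIColorMonochrome",
    withInputParameters: [
        kCIInputImageKey: inputImage,
        "inputColor": CIColor(red: 1.0, green: 1.0, blue: 1.0),
        "inputIntensity": 1.0
    ]
)
let grayImage = monochromeFilter?.outputImage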
The basic steps are:
- Create an EAGLContext, configured to use OpenGLES2.
- Create a GLKView to display the rendered output, using the EAGLContext.
- Create a CIContext, using the same EAGLContext.
- Create a CIFilter using a CIColorMonochrome CoreImage filter.
- Create an AVCaptureSession with an AVCaptureVideoDataOutput.
- In the AVCaptureVideoDataOutputSampleBufferDelegate method, convert the CMSampleBuffer to a CIImage, apply the CIFilter to the image, and draw the filtered image into the CIContext.
This pipeline keeps the video pixel buffers on the GPU all the way from the camera to the display, avoiding copies to the CPU and maintaining realtime performance.
To save the filtered video, implement an AVAssetWriter and append the sample buffers in the same AVCaptureVideoDataOutputSampleBufferDelegate method where the filtering is done.
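A rough sketch of that writer setup, again in the Swift 2 era syntax used by the example below. The names assetWriter, writerInput, pixelBufferAdaptor, outputURL, and filteredPixelBuffer are placeholders for this sketch, not part of the original answer:
// Rough sketch of an AVAssetWriter pipeline for the filtered frames (assumes AVFoundation is imported).
// outputURL (NSURL) and filteredPixelBuffer (CVPixelBuffer) are placeholders you would provide.
let writerInput = AVAssetWriterInput(
    mediaType: AVMediaTypeVideo,
    outputSettings: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: 1920,
        AVVideoHeightKey: 1080
    ]
)
writerInput.expectsMediaDataInRealTime = true

let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: writerInput,
    sourcePixelBufferAttributes: nil
)

let assetWriter = try! AVAssetWriter(URL: outputURL, fileType: AVFileTypeQuickTimeMovie)
assetWriter.addInput(writerInput)
assetWriter.startWriting()
assetWriter.startSessionAtSourceTime(kCMTimeZero)

// In captureOutput(_:didOutputSampleBuffer:fromConnection:), after rendering the filtered
// CIImage into a CVPixelBuffer (for example with CIContext's render(_:toCVPixelBuffer:)):
if writerInput.readyForMoreMediaData {
    let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    pixelBufferAdaptor.appendPixelBuffer(filteredPixelBuffer, withPresentationTime: time)
}

// When recording stops, call assetWriter.finishWritingWithCompletionHandler(_:).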
Here is an example in Swift.
Example on GitHub.
import UIKit
import GLKit
import AVFoundation

// Rotate the sensor's landscape buffers into portrait orientation.
private let rotationTransform = CGAffineTransformMakeRotation(CGFloat(-M_PI * 0.5))

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

    private var context: CIContext!
    private var targetRect: CGRect!
    private var session: AVCaptureSession!
    private var filter: CIFilter!

    @IBOutlet var glView: GLKView!

    override func prefersStatusBarHidden() -> Bool {
        return true
    }

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)

        // Grayscale filter: CIColorMonochrome with a white input color.
        let whiteColor = CIColor(
            red: 1.0,
            green: 1.0,
            blue: 1.0
        )

        filter = CIFilter(
            name: "CIColorMonochrome",
            withInputParameters: [
                "inputColor" : whiteColor,
                "inputIntensity" : 1.0
            ]
        )

        // GL context shared by the GLKView and the CIContext, so rendering stays on the GPU.
        let glContext = EAGLContext(
            API: .OpenGLES2
        )
        glView.context = glContext
        glView.enableSetNeedsDisplay = false

        context = CIContext(
            EAGLContext: glContext,
            options: [
                kCIContextOutputColorSpace: NSNull(),
                kCIContextWorkingColorSpace: NSNull(),
            ]
        )

        // Destination rectangle in pixels (points * scale).
        let screenSize = UIScreen.mainScreen().bounds.size
        let screenScale = UIScreen.mainScreen().scale
        targetRect = CGRect(
            x: 0,
            y: 0,
            width: screenSize.width * screenScale,
            height: screenSize.height * screenScale
        )

        // Setup capture session.
        let cameraDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        let videoInput = try? AVCaptureDeviceInput(
            device: cameraDevice
        )
        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self, queue: dispatch_get_main_queue())

        session = AVCaptureSession()
        session.beginConfiguration()
        session.addInput(videoInput)
        session.addOutput(videoOutput)
        session.commitConfiguration()
        session.startRunning()
    }

    // Called for every captured frame: filter it and draw it to the GLKView.
    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {

        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return
        }

        let originalImage = CIImage(
            CVPixelBuffer: pixelBuffer,
            options: [
                kCIImageColorSpace: NSNull()
            ]
        )

        let rotatedImage = originalImage.imageByApplyingTransform(rotationTransform)

        filter.setValue(rotatedImage, forKey: kCIInputImageKey)

        guard let filteredImage = filter.outputImage else {
            return
        }

        context.drawImage(filteredImage, inRect: targetRect, fromRect: filteredImage.extent)

        glView.display()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didDropSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        let seconds = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
        print("dropped sample buffer: \(seconds)")
    }
}
Source: https://stackoverflow.com/questions/38122040/set-grayscale-on-output-of-avcapturedevice-in-ios