Question
Here is a challenge I am facing when saving an image taken from the camera in an iOS app written in Swift: I am not saving exactly what I want; I am saving more than necessary.
These are the two relevant chunks of code for this issue.
First, where the session starts; as one can see, I am only looking at an ellipse-shaped area:
previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
previewLayer?.frame = self.view.layer.frame
self.view.layer.addSublayer(previewLayer!)

// Restrict the visible preview to an ellipse-shaped area.
let maskLayer = CAShapeLayer()
maskLayer.path = CGPathCreateWithEllipseInRect(CGRect(x: 50.0, y: 100.0, width: 200.0, height: 100.0), nil)
previewLayer!.mask = maskLayer

captureSession.startRunning()
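My understanding is that a CALayer mask only changes what is rendered on screen, not what AVCaptureStillImageOutput returns, so I suspect the ellipse would have to be applied to the captured image afterwards. As a rough, untested sketch of the first step, this is how I would try to map the on-screen ellipse into the captured image's coordinate space using metadataOutputRectOfInterestForRect; ellipseRectInImage is my own hypothetical helper, and orientation and any preview/still aspect-ratio differences are ignored:

// Rough sketch, untested: map the on-screen ellipse from preview-layer
// coordinates into the captured image's coordinate space.
func ellipseRectInImage(image: UIImage) -> CGRect {
    let ellipseInLayer = CGRect(x: 50.0, y: 100.0, width: 200.0, height: 100.0)
    // Normalized (0...1) rect of interest within the capture output.
    let normalized = previewLayer!.metadataOutputRectOfInterestForRect(ellipseInLayer)
    // Scale up to the pixel size of the captured image (orientation ignored).
    return CGRect(x: normalized.origin.x * image.size.width,
                  y: normalized.origin.y * image.size.height,
                  width: normalized.size.width * image.size.width,
                  height: normalized.size.height * image.size.height)
}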
Second, where the image is saved and the session stops:
if captureSession.running {
    if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
            (imageDataSampleBuffer, error) -> Void in
            if error == nil {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
                UIImageWriteToSavedPhotosAlbum(UIImage(data: imageData)!, nil, nil, nil)
            } else {
                print("Error on taking a picture:\n\(error)")
            }
        }
    }
    captureSession.stopRunning()
}
This works, but when I look in the photo album to see what has been saved, I find the whole picture, as if there were no limiting ellipse, and that is not what I want. Ideally I would save only the pixels within the ellipse. If that is not possible, I would at least like to save the pixels within the ellipse as they are and everything outside the ellipse as a single transparent color. Is there a way to control how AVCaptureStillImageOutput works, or something else I need to do, to achieve this?
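If it helps clarify what I am after, here is the kind of post-processing I imagine for the fallback (everything outside the ellipse transparent). This is a rough, untested sketch: maskImageToEllipse is my own hypothetical helper, ellipseRect is assumed to already be in the image's coordinate space (see the mapping sketch above), and since JPEG has no alpha channel the result would presumably have to be saved as PNG:

// Rough sketch, untested: draw the captured image clipped to the ellipse,
// leaving everything outside it transparent.
func maskImageToEllipse(image: UIImage, ellipseRect: CGRect) -> UIImage? {
    // Non-opaque bitmap context the size of the ellipse's bounding box.
    UIGraphicsBeginImageContextWithOptions(ellipseRect.size, false, image.scale)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }

    // Clip to the ellipse; pixels outside it stay transparent.
    CGContextAddEllipseInRect(context, CGRect(origin: CGPointZero, size: ellipseRect.size))
    CGContextClip(context)

    // Draw the full image offset so the ellipse region fills the context.
    image.drawAtPoint(CGPoint(x: -ellipseRect.origin.x, y: -ellipseRect.origin.y))

    return UIGraphicsGetImageFromCurrentImageContext()
}

Inside the capture callback I would then call something like maskImageToEllipse(image, ellipseRect: ellipseRectInImage(image)) before saving, and use UIImagePNGRepresentation to keep the transparency, though I am not sure whether saving to the photo album preserves the alpha channel.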
Source: https://stackoverflow.com/questions/34603557/avcapturestillimageoutput-area-selection