avcaptureoutput

AVCaptureStillImageOutput & UIImagePNGRepresentation

北城余情 submitted on 2019-12-13 07:07:56
Question: I am having a hard time with something I think shouldn't be so difficult, so I presume I must be looking at the problem from the wrong angle. To understand how AVCaptureStillImageOutput and the camera work, I made a tiny app. The app can take a picture and save it as a PNG file (I do not want JPEG). The next time the app is launched, it checks whether the file is present; if it is, the image stored in the file is used as the background view of the app. The idea is rather simple
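That flow can be sketched roughly as below. This is a hedged illustration, not the asker's code: the helper names and the file URL are made up, and it assumes a deprecated-era `AVCaptureStillImageOutput` already attached to a running session. Note that the still-image pipeline delivers JPEG data; `UIImagePNGRepresentation` only re-encodes the decoded image as PNG.

```swift
import AVFoundation
import UIKit

// Hypothetical helper: capture one still frame and persist it as PNG.
// `stillImageOutput` is assumed to belong to a running AVCaptureSession.
func capturePNG(from stillImageOutput: AVCaptureStillImageOutput,
                to url: URL,
                completion: @escaping (Bool) -> Void) {
    guard let connection = stillImageOutput.connection(with: .video) else {
        completion(false); return
    }
    stillImageOutput.captureStillImageAsynchronously(from: connection) { buffer, error in
        guard let buffer = buffer, error == nil,
              let jpegData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer),
              let image = UIImage(data: jpegData),
              // Re-encode the decoded frame as PNG, as the question wants.
              let pngData = UIImagePNGRepresentation(image) else {
            completion(false); return
        }
        completion((try? pngData.write(to: url)) != nil)
    }
}

// On the next launch: if the PNG exists, use it as the background.
func restoreBackground(in view: UIView, from url: URL) {
    if let data = try? Data(contentsOf: url), let image = UIImage(data: data) {
        view.layer.contents = image.cgImage
    }
}
```

On iOS 10 and later the same idea would use `AVCapturePhotoOutput`, but the sketch follows the API named in the question.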

AVCaptureStillImageOutput area selection

拥有回忆 submitted on 2019-12-12 19:01:57
Question: Here is a challenge I am facing when saving an image taken from the camera in an iOS app written in Swift. I am not saving exactly what I want; I am saving more than necessary, and that is not good. These are the two relevant chunks of code for this issue. First, where the session starts; as one can see, I am only interested in an ellipse-shaped area: previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) previewLayer?.frame = self.view.layer.frame self.view.layer.addSublayer
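One way to keep only the elliptical region is to crop the captured image to the ellipse's bounding box and then mask it with an oval path. The sketch below is illustrative, not the asker's solution: `rectInImage` is assumed to be the ellipse's bounding box already converted into image-pixel coordinates (that conversion depends on the preview layer's `videoGravity`, e.g. via `AVCaptureVideoPreviewLayer.metadataOutputRectConverted(fromLayerRect:)`).

```swift
import UIKit

// Hypothetical helper: crop `image` to the ellipse's bounding box,
// then clip to an oval so only the ellipse-shaped area survives.
func cropToEllipse(_ image: UIImage, rectInImage: CGRect) -> UIImage? {
    // Crop the underlying CGImage to the bounding rectangle first.
    guard let cgImage = image.cgImage?.cropping(to: rectInImage) else { return nil }
    let cropped = UIImage(cgImage: cgImage,
                          scale: image.scale,
                          orientation: image.imageOrientation)

    // Redraw through an oval clipping path; pixels outside the
    // ellipse come out transparent.
    UIGraphicsBeginImageContextWithOptions(rectInImage.size, false, image.scale)
    defer { UIGraphicsEndImageContext() }
    let bounds = CGRect(origin: .zero, size: rectInImage.size)
    UIBezierPath(ovalIn: bounds).addClip()
    cropped.draw(in: bounds)
    return UIGraphicsGetImageFromCurrentImageContext()
}
```

The key point for the question is the coordinate conversion: saving "more than necessary" usually means the full sensor frame was written out instead of the cropped region shown in the preview.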

Deep Copy of CMImageBuffer or CVImageBuffer

血红的双手。 submitted on 2019-12-02 07:11:46
Question: Hi, I am currently working on an app which needs to capture video and at the same time take frames to blend them. The problem I am having is that frames coming from func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) start to drop after blending about 10-12 frames. I tried blending only every 10th frame, but it still drops after 10-12 blended frames. I know that I should copy the CVImageBuffer to release the imageBuffer, which I got using the following: let imageBuffer =
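The drop after ~10-12 frames is characteristic of holding on to buffers from the capture pipeline's small, fixed recycling pool: once all pooled buffers are retained, the camera has nowhere to write and frames are dropped. A hedged sketch of a deep copy that lets the original buffer go back to the pool (handles only non-planar formats such as BGRA; planar formats like 420v would need a per-plane copy):

```swift
import CoreVideo
import Foundation

// Hypothetical helper: deep-copy a CVPixelBuffer so the sample buffer
// it came from can be released back to the capture pipeline's pool.
func deepCopy(_ src: CVPixelBuffer) -> CVPixelBuffer? {
    var copyOut: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(src),
                        CVPixelBufferGetHeight(src),
                        CVPixelBufferGetPixelFormatType(src),
                        CVBufferGetAttachments(src, .shouldPropagate),
                        &copyOut)
    guard let dst = copyOut else { return nil }

    CVPixelBufferLockBaseAddress(src, .readOnly)
    CVPixelBufferLockBaseAddress(dst, [])
    defer {
        CVPixelBufferUnlockBaseAddress(dst, [])
        CVPixelBufferUnlockBaseAddress(src, .readOnly)
    }

    // Copy row by row: source and destination strides may differ.
    let height = CVPixelBufferGetHeight(src)
    let srcStride = CVPixelBufferGetBytesPerRow(src)
    let dstStride = CVPixelBufferGetBytesPerRow(dst)
    let rowBytes = min(srcStride, dstStride)
    if let srcBase = CVPixelBufferGetBaseAddress(src),
       let dstBase = CVPixelBufferGetBaseAddress(dst) {
        for row in 0..<height {
            memcpy(dstBase + row * dstStride, srcBase + row * srcStride, rowBytes)
        }
    }
    return dst
}
```

Inside `captureOutput`, the idea is to call `deepCopy(CMSampleBufferGetImageBuffer(sampleBuffer)!)`, keep only the copy for blending, and let the delegate return promptly.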

iOS: error in __connection_block_invoke_2: Connection interrupted [duplicate]

谁说我不能喝 submitted on 2019-11-29 10:05:27
This question already has an answer here: What is "error in __connection_block_invoke_2: Connection interrupted" in iOS? (1 answer) Xcode/iOS 8/AVFoundation-related error in the console: error in __connection_block_invoke_2: Connection interrupted I am just adding an AVCaptureVideoDataOutput to Apple's sample app 'AVCamManualUsingtheManualCaptureAPI'. What I added was: // CoreImage wants BGRA pixel format NSDictionary *outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInteger:kCVPixelFormatType_32BGRA]}; // create and configure video data output AVCaptureVideoDataOutput
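For reference, the truncated Objective-C addition above can be sketched in Swift roughly as follows. This is an illustrative completion, not the poster's actual code: the function name and queue label are made up, and `delegate` stands for whatever object implements `AVCaptureVideoDataOutputSampleBufferDelegate` in the sample app.

```swift
import AVFoundation

// Hypothetical helper: attach a BGRA video data output to an
// existing capture session, as described in the question.
func addVideoDataOutput(to session: AVCaptureSession,
                        delegate: AVCaptureVideoDataOutputSampleBufferDelegate) {
    let output = AVCaptureVideoDataOutput()
    // Core Image wants the BGRA pixel format.
    output.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    // Discard late frames so the delegate never starves the buffer pool.
    output.alwaysDiscardsLateVideoFrames = true
    // Deliver sample buffers on a dedicated serial queue, off the main thread.
    output.setSampleBufferDelegate(delegate,
                                   queue: DispatchQueue(label: "video.data.queue"))
    if session.canAddOutput(output) {
        session.addOutput(output)
    }
}
```

The "Connection interrupted" console message itself is widely reported as harmless XPC noise from the media server rather than a configuration error in the added output.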