avcapture

Using AVCapturePhotoOutput in iOS10 - NSGenericException

泄露秘密 submitted on 2019-12-01 13:41:20
I am currently trying to figure out how to use iOS 10's AVCapturePhotoOutput API and am having trouble doing so. I feel like I am about to get it right but keep receiving this error:

Terminating app due to uncaught exception 'NSGenericException', reason: '-[AVCapturePhotoOutput capturePhotoWithSettings:delegate:] No active and enabled video connection'

I have tried putting this line of code in either the AVCapturePhotoCaptureDelegate or my didPressTakePhoto function:

    if let videoConnection = stillImageOutput.connection(withMediaType: AVMediaTypeVideo) {
        videoConnection.videoOrientation = …
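
This exception is usually raised when capturePhoto(with:delegate:) runs before the photo output has been added to a running session, so no video connection exists yet. Below is a minimal sketch of a setup that avoids it, written with current Swift API names (iOS 10's Swift 3 spellings such as connection(withMediaType:) differ slightly); CameraController and takePhoto are illustrative names, not from the question.

    import AVFoundation

    final class CameraController: NSObject, AVCapturePhotoCaptureDelegate {
        let session = AVCaptureSession()
        let photoOutput = AVCapturePhotoOutput()

        func configure() throws {
            guard let device = AVCaptureDevice.default(for: .video) else { return }
            session.beginConfiguration()
            session.addInput(try AVCaptureDeviceInput(device: device))
            session.addOutput(photoOutput)  // must be added before capturePhoto is called
            session.commitConfiguration()
            session.startRunning()          // a stopped session has no active connection
        }

        func takePhoto() {
            if let connection = photoOutput.connection(with: .video),
               connection.isVideoOrientationSupported {
                connection.videoOrientation = .portrait
            }
            photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
        }

        func photoOutput(_ output: AVCapturePhotoOutput,
                         didFinishProcessingPhoto photo: AVCapturePhoto,
                         error: Error?) {
            // photo.fileDataRepresentation() holds the encoded image data
        }
    }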

iOS: captureOutput:didOutputSampleBuffer:fromConnection is NOT called

狂风中的少年 submitted on 2019-11-30 10:24:28
I want to pull frames from the live feed of an AVCaptureSession, and I am using Apple's AVCam sample as a test case. Here is the link to AVCam: https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html I found that captureOutput:didOutputSampleBuffer:fromConnection is NOT called, and I would like to know why, or what I am doing wrong. Here is what I have done:

(1) I make the AVCamViewController a delegate:

    @interface AVCamViewController () <AVCaptureFileOutputRecordingDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>

(2) I created an AVCaptureVideoDataOutput object and add …
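
In AVCam's case the likely culprit is that the session already contains an AVCaptureMovieFileOutput; an iOS capture session generally cannot deliver to a movie file output and a video data output at the same time, so the sample-buffer callback silently never fires. A self-contained sketch of the pieces that must all be in place for the callback to run (FrameGrabber and the queue label are illustrative):

    import AVFoundation

    final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()
        let videoOutput = AVCaptureVideoDataOutput()
        let queue = DispatchQueue(label: "frame.queue")

        func start() throws {
            guard let device = AVCaptureDevice.default(for: .video) else { return }
            session.addInput(try AVCaptureDeviceInput(device: device))
            // Both the delegate AND the queue must be set, or nothing is delivered.
            videoOutput.setSampleBufferDelegate(self, queue: queue)
            guard session.canAddOutput(videoOutput) else { return }  // fails if a conflicting output is present
            session.addOutput(videoOutput)
            session.startRunning()
        }

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            // called once per frame while the session runs
        }
    }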

Run multiple AVCaptureSessions or add multiple inputs

社会主义新天地 submitted on 2019-11-30 03:29:51
I want to display the streams of the front- and back-facing cameras of an iPad 2 in two UIViews next to each other. To stream the image of one device I use the following code:

    AVCaptureDeviceInput *captureInputFront = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [session addInput:captureInputFront];
    [session setSessionPreset:AVCaptureSessionPresetMedium];
    [session startRunning];
    AVCaptureVideoPreviewLayer *prevLayer = [AVCaptureVideoPreviewLayer layerWithSession …
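
On an iPad 2 this cannot work: only one camera may stream at a time, and starting a second session interrupts the first. On hardware and iOS versions that do support it (iOS 13+), AVCaptureMultiCamSession is the supported route. A hedged Swift sketch, assuming the caller supplies the two preview layers and wiring each camera to its own layer manually:

    import AVFoundation

    func configureMultiCam(frontLayer: AVCaptureVideoPreviewLayer,
                           backLayer: AVCaptureVideoPreviewLayer) -> AVCaptureMultiCamSession? {
        guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }
        let session = AVCaptureMultiCamSession()
        session.beginConfiguration()
        for (position, layer) in [(AVCaptureDevice.Position.front, frontLayer), (.back, backLayer)] {
            guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                       for: .video, position: position),
                  let input = try? AVCaptureDeviceInput(device: device),
                  session.canAddInput(input) else { return nil }
            // Add without implicit connections, then route this input's video
            // port to its own preview layer.
            session.addInputWithNoConnections(input)
            guard let port = input.ports(for: .video, sourceDeviceType: device.deviceType,
                                         sourceDevicePosition: position).first else { return nil }
            layer.setSessionWithNoConnection(session)
            let connection = AVCaptureConnection(inputPort: port, videoPreviewLayer: layer)
            guard session.canAddConnection(connection) else { return nil }
            session.addConnection(connection)
        }
        session.commitConfiguration()
        session.startRunning()
        return session
    }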

Capture Metal MTKView as Movie in realtime?

岁酱吖の submitted on 2019-11-28 23:51:57
What is the most efficient way to capture frames from an MTKView? If possible, I would like to save a .mov file from the frames in realtime. Is it possible to render into an AVPlayer frame or something? It is currently drawing with this code (based on @warrenm's PerformanceShaders project):

    func draw(in view: MTKView) {
        _ = inflightSemaphore.wait(timeout: DispatchTime.distantFuture)
        updateBuffers()
        let commandBuffer = commandQueue.makeCommandBuffer()
        commandBuffer.addCompletedHandler { [weak self] commandBuffer in
            if let strongSelf = self {
                strongSelf.inflightSemaphore.signal()
            }
        }
        // Dispatch …
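
One common approach (a sketch, not the only route) is an AVAssetWriter fed through an AVAssetWriterInputPixelBufferAdaptor: after each frame is drawn, copy the drawable's texture into a CVPixelBuffer and append it with a presentation time, typically derived from CACurrentMediaTime() relative to the recording start. This assumes view.framebufferOnly is set to false so the texture is CPU-readable; MetalRecorder is an illustrative name.

    import Metal
    import AVFoundation

    final class MetalRecorder {
        let writer: AVAssetWriter
        let input: AVAssetWriterInput
        let adaptor: AVAssetWriterInputPixelBufferAdaptor

        init(url: URL, size: CGSize) throws {
            writer = try AVAssetWriter(outputURL: url, fileType: .mov)
            input = AVAssetWriterInput(mediaType: .video, outputSettings: [
                AVVideoCodecKey: AVVideoCodecType.h264,
                AVVideoWidthKey: size.width,
                AVVideoHeightKey: size.height])
            input.expectsMediaDataInRealTime = true
            adaptor = AVAssetWriterInputPixelBufferAdaptor(
                assetWriterInput: input,
                sourcePixelBufferAttributes: [
                    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
            writer.add(input)
            writer.startWriting()
            writer.startSession(atSourceTime: .zero)
        }

        func append(texture: MTLTexture, at time: CMTime) {
            guard input.isReadyForMoreMediaData,
                  let pool = adaptor.pixelBufferPool else { return }
            var pixelBuffer: CVPixelBuffer?
            CVPixelBufferPoolCreatePixelBuffer(nil, pool, &pixelBuffer)
            guard let buffer = pixelBuffer else { return }
            CVPixelBufferLockBaseAddress(buffer, [])
            if let base = CVPixelBufferGetBaseAddress(buffer) {
                // Blocking CPU copy of the drawable's contents; requires
                // framebufferOnly = false on the MTKView.
                texture.getBytes(base,
                                 bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                 from: MTLRegionMake2D(0, 0, texture.width, texture.height),
                                 mipmapLevel: 0)
            }
            CVPixelBufferUnlockBaseAddress(buffer, [])
            adaptor.append(buffer, withPresentationTime: time)
        }
        // When done: input.markAsFinished(), then writer.finishWriting { ... }
    }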

Set GrayScale on Output of AVCaptureDevice in iOS

此生再无相见时 submitted on 2019-11-27 23:58:28
Question: I want to implement a custom camera in my app, so I am creating this camera using AVCaptureDevice. Now I want to show only gray output in my custom camera, and I am trying to get this using setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains: and AVCaptureWhiteBalanceGains. I am using AVCamManual: Extending AVCam to Use Manual Capture for this.

    - (void)setWhiteBalanceGains:(AVCaptureWhiteBalanceGains)gains {
        NSError *error = nil;
        if ( [videoDevice lockForConfiguration:&error] ) { …
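
White-balance gains only rebalance the R/G/B channels (each gain must stay between 1.0 and the device's maxWhiteBalanceGain), so they cannot desaturate the feed to gray. A sketch of one alternative: filter each frame from an AVCaptureVideoDataOutput with Core Image, e.g. CIColorControls with saturation 0. grayscaleImage(from:) is an illustrative helper, not part of AVCamManual.

    import AVFoundation
    import CoreImage

    let ciContext = CIContext()

    func grayscaleImage(from sampleBuffer: CMSampleBuffer) -> CGImage? {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
              let filter = CIFilter(name: "CIColorControls") else { return nil }
        filter.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
        filter.setValue(0.0, forKey: kCIInputSaturationKey)  // saturation 0 == grayscale
        guard let output = filter.outputImage else { return nil }
        return ciContext.createCGImage(output, from: output.extent)
    }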

AVCaptureSession specify resolution and quality of captured images obj-c iphone app

社会主义新天地 submitted on 2019-11-27 20:14:54
Hi, I want to set up an AV capture session to capture images with a specific resolution (and, if possible, with a specific quality) using the iPhone camera. Here is the AV session setup code:

    // Create and configure a capture session and start it running
    - (void)setupCaptureSession {
        NSError *error = nil;
        // Create the session
        self.captureSession = [[AVCaptureSession alloc] init];
        // Configure the session to produce lower resolution video frames, if your
        // processing algorithm can cope. We'll specify medium quality for the
        // chosen device.
        captureSession.sessionPreset = AVCaptureSessionPresetMedium;
        // …
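
Resolution is selected through the session preset rather than set to arbitrary values; each preset maps to a fixed frame size on a given device. A short Swift sketch of choosing the largest preset the session accepts (the fallback chain here is illustrative, not exhaustive):

    import AVFoundation

    let session = AVCaptureSession()
    for preset: AVCaptureSession.Preset in [.hd1920x1080, .hd1280x720, .vga640x480] {
        if session.canSetSessionPreset(preset) {
            session.sessionPreset = preset  // fixes the capture resolution for this device
            break
        }
    }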

Method to find device's camera resolution iOS

爷,独闯天下 submitted on 2019-11-27 17:32:47
What's the best method to find the image resolution that will be captured when using the AVCaptureSessionPresetPhoto setting? I am trying to find the resolution before capturing the image.

With the function below, you can programmatically get the resolution from activeFormat before capture begins, though not before adding inputs and outputs: https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureDevice_Class/index.html#//apple_ref/occ/instp/AVCaptureDevice/activeFormat

    private func getCaptureResolution() -> CGSize {
        // Define default resolution
        var resolution = CGSize …
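
The excerpt above is cut off; a hedged completion, reading the pixel dimensions from activeFormat.formatDescription and swapping them when the device is held in portrait, would look roughly like this (the device parameter is added here for self-containment):

    import AVFoundation
    import UIKit

    private func getCaptureResolution(for device: AVCaptureDevice) -> CGSize {
        // Read the active format's pixel dimensions.
        let dimensions = CMVideoFormatDescriptionGetDimensions(device.activeFormat.formatDescription)
        var resolution = CGSize(width: CGFloat(dimensions.width),
                                height: CGFloat(dimensions.height))
        // Formats report landscape dimensions; swap for portrait.
        if UIDevice.current.orientation.isPortrait {
            resolution = CGSize(width: resolution.height, height: resolution.width)
        }
        return resolution
    }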

AVCaptureVideoPreviewLayer orientation - need landscape

Deadly submitted on 2019-11-27 11:05:42
My app is landscape only. I'm presenting the AVCaptureVideoPreviewLayer like this:

    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [self.previewLayer setBackgroundColor:[[UIColor blackColor] CGColor]];
    [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
    NSLog(@"previewView: %@", self.previewView);
    CALayer *rootLayer = [self.previewView layer];
    [rootLayer setMasksToBounds:YES];
    [self.previewLayer setFrame:[rootLayer bounds]];
    NSLog(@"previewlayer: %f, %f, %f, %f", self.previewLayer.frame.origin.x, self.previewLayer.frame.origin.y, self …
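
For a landscape-only app, the usual fix is to pin the preview layer's connection orientation once the layer exists. A short sketch, in Swift for brevity, assuming the same previewLayer property; .landscapeRight is illustrative and should match the actual interface orientation:

    if let connection = previewLayer.connection,
       connection.isVideoOrientationSupported {
        connection.videoOrientation = .landscapeRight
    }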