avcapture

How to save a movie from AVCapture

丶灬走出姿态 submitted on 2019-12-13 04:34:04
Question: I've been trying to figure out AVCapture for the last couple of days and am struggling to save a video. My understanding is that you call [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self]; and then, at a later time, call [movieFileOutput stopRecording]; which should then invoke the delegate method -(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:
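For reference, a minimal Swift sketch of that flow using the modern AVCaptureMovieFileOutput API; the session is assumed to be configured and running elsewhere, and the temporary-directory destination is an arbitrary choice:

import AVFoundation

final class MovieRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    // Assumed to be attached to a configured, running AVCaptureSession.
    let movieOutput = AVCaptureMovieFileOutput()

    func startRecording() {
        // Record into a unique file in the temporary directory.
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString)
            .appendingPathExtension("mov")
        movieOutput.startRecording(to: url, recordingDelegate: self)
    }

    func stopRecording() {
        movieOutput.stopRecording()
    }

    // Called once the movie file has been finalized on disk.
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        if let error = error {
            print("Recording failed: \(error)")
        } else {
            print("Movie saved at \(outputFileURL)")
        }
    }
}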

AVCaptureStillImageOutput area selection

拥有回忆 submitted on 2019-12-12 19:01:57
Question: Here is a challenge I am facing when saving an image taken from the camera in an iOS app written in Swift. I am not saving exactly what I want; I am saving more than necessary, and that is not good. These are the two relevant chunks of code for this issue. The first is where the session starts; as one can see, I am only looking at an ellipse-shaped area: previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) previewLayer?.frame = self.view.layer.frame self.view.layer.addSublayer
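A sketch of one common fix (not the poster's code): convert the on-screen region of interest, e.g. the ellipse's bounding rect, into image coordinates via the preview layer, then crop the captured photo. The ellipseRect parameter and the orientation handling it glosses over are assumptions:

import AVFoundation
import UIKit

// Crop a captured photo to the part of the frame shown under a rect of the
// preview layer (e.g. the ellipse's bounding box in layer coordinates).
func crop(_ image: UIImage,
          toLayerRect ellipseRect: CGRect,
          in previewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
    // Normalized (0...1) rect in the capture output's coordinate space.
    let normalized = previewLayer.metadataOutputRectConverted(fromLayerRect: ellipseRect)
    guard let cgImage = image.cgImage else { return nil }
    // Scale the normalized rect up to pixel coordinates. Note: for portrait
    // captures the still image may be rotated relative to the preview, so a
    // real implementation must also account for image orientation.
    let cropRect = CGRect(x: normalized.origin.x * CGFloat(cgImage.width),
                          y: normalized.origin.y * CGFloat(cgImage.height),
                          width: normalized.size.width * CGFloat(cgImage.width),
                          height: normalized.size.height * CGFloat(cgImage.height))
    guard let cropped = cgImage.cropping(to: cropRect) else { return nil }
    return UIImage(cgImage: cropped, scale: image.scale, orientation: image.imageOrientation)
}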

Recursive callback call crashing app after some iterations with watchdog error (iOS)

删除回忆录丶 submitted on 2019-12-12 04:01:32
Question: I'm making an iOS app that continuously takes two pictures, does post-processing on them, and then displays the result in the UI. The only way I was able to capture two pictures in sequence was by using recursion inside the completionHandler of the captureStillImageAsynchronouslyFromConnection method of AVCaptureStillImageOutput. The app eventually crashes after taking around 30 photos; here is the crash log: Date/Time: 2017-05-18 14:06:38.5714 -0400 Launch Time: 2017-05-18 14:06:22.1703 -0400 OS
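A watchdog kill usually means the main thread stayed busy or blocked for too long. One common restructuring, sketched below with the same (deprecated) AVCaptureStillImageOutput API the question uses and with assumed property names, keeps the post-processing and the scheduling of the next capture on a background queue:

import AVFoundation

final class PairCapturer {
    // Assumed to be attached to a configured, running session.
    let stillImageOutput = AVCaptureStillImageOutput()
    private let processingQueue = DispatchQueue(label: "capture.processing")

    func captureNextPhoto() {
        guard let connection = stillImageOutput.connection(with: .video) else { return }
        stillImageOutput.captureStillImageAsynchronously(from: connection) { buffer, error in
            guard error == nil, let buffer = buffer else { return }
            // Keep heavy post-processing off the main thread.
            self.processingQueue.async {
                let jpegData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
                _ = jpegData // ... post-process, then update the UI via DispatchQueue.main.async ...
                // Schedule the next capture from here rather than calling it
                // directly inside the completion handler, so each iteration
                // starts cleanly and the main thread is never tied up.
                self.captureNextPhoto()
            }
        }
    }
}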

iPhone front camera - tap to focus?

怎甘沉沦 submitted on 2019-12-12 02:57:11
Question: I'm trying to figure out whether tap-to-focus is possible with the front-facing camera. The iPhone 4 Wikipedia page says it supports focus, but gives little more detail. When I call isFocusPointOfInterestSupported on the front-facing camera it returns NO, so I would assume it isn't supported. However, the Apple Camera app seems to respond to taps on the front-facing camera; is this focus, or is it adjusting exposure at the tap point? To what capacity is focus supported on the
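For reference, a Swift sketch of how a tap handler typically probes both capabilities; devicePoint is assumed to have already been converted to device coordinates (for example with the preview layer's captureDevicePointConverted(fromLayerPoint:)):

import AVFoundation
import CoreGraphics

// Handle a tap when focus point-of-interest may not be supported
// (isFocusPointOfInterestSupported is NO on many front cameras):
// fall back to the exposure point of interest.
func handleTap(at devicePoint: CGPoint, on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }

        if device.isFocusPointOfInterestSupported, device.isFocusModeSupported(.autoFocus) {
            device.focusPointOfInterest = devicePoint
            device.focusMode = .autoFocus
        }
        if device.isExposurePointOfInterestSupported, device.isExposureModeSupported(.autoExpose) {
            device.exposurePointOfInterest = devicePoint
            device.exposureMode = .autoExpose
        }
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}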

AVCaptureVideo not showing label

空扰寡人 submitted on 2019-12-11 16:00:10
Question: I am trying to take the objects printed to the console and show them in a label (classifierText). The warning "UILabel.text must be used from main thread only" appears. What is the problem, and why aren't the items being shown in the label? var previewLayer: AVCaptureVideoPreviewLayer! let classifierText: UILabel = { let classifier = UILabel() classifier.translatesAutoresizingMaskIntoConstraints = false classifier.textColor = .black classifier.font = UIFont(name: "Times-New-Roman", size:
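The warning means the classification callback runs on the capture output's background queue, while UIKit objects must only be touched on the main thread. A minimal sketch of the usual fix (the function name is mine):

import UIKit

// Hop to the main queue before updating any UIKit object from the
// capture/classification callback, which runs on a background queue.
func showClassification(_ text: String, in label: UILabel) {
    DispatchQueue.main.async {
        label.text = text
    }
}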

AVCaptureVideoPreviewLayer is not visible on the screenshot

旧时模样 submitted on 2019-12-11 09:18:06
Question: I have an application that adds live animations and images on top of the preview view of an AV Foundation camera. I can take a "hardware screenshot" (holding the Side button and the Volume Up button) and that looks fine. However, I need a button that takes the screenshot. All the usual ways of taking a screenshot, like UIGraphicsGetImageFromCurrentImageContext (or view.drawHierarchy()), produce a black area where the video preview is. All other elements are in the screenshot and the images are visible except
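The preview layer is composited outside the normal Core Animation snapshot path, which is why it comes out black. One workaround, sketched below, is to grab a frame yourself (for example from an AVCaptureVideoDataOutput or a still capture) and draw the overlay view on top of it; cameraFrame and overlayView are assumed to be provided by the caller, and the overlay view is assumed to have a transparent background:

import UIKit

// Compose a "screenshot" from a camera frame captured via the session plus
// a snapshot of the overlay view holding the animations and images.
func composeScreenshot(cameraFrame: UIImage, overlayView: UIView) -> UIImage {
    let renderer = UIGraphicsImageRenderer(bounds: overlayView.bounds)
    return renderer.image { _ in
        // Draw the camera frame where the preview layer would normally be.
        cameraFrame.draw(in: overlayView.bounds)
        // drawHierarchy does capture ordinary views, just not the preview layer.
        overlayView.drawHierarchy(in: overlayView.bounds, afterScreenUpdates: true)
    }
}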

AVCaptureDevice isFlashModeSupported deprecated iOS 10

最后都变了- submitted on 2019-12-10 16:26:16
Question: I am using AVCaptureDevice's instance method isFlashModeSupported as below: NSArray *captureDeviceType = @[AVCaptureDeviceTypeBuiltInWideAngleCamera, AVCaptureDeviceTypeBuiltInMicrophone]; AVCaptureDeviceDiscoverySession *captureDevice = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:captureDeviceType mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionUnspecified]; NSArray *deviceList = [captureDevice devices]; AVCaptureDevice *selectedCamera = [deviceList
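On iOS 10 the replacement path is to ask AVCapturePhotoOutput rather than the device, and to set the flash mode on the per-capture AVCapturePhotoSettings. A Swift sketch, assuming photoOutput is already attached to a configured session:

import AVFoundation

// Build photo settings whose flash mode is chosen from what the
// photo output actually supports, instead of calling isFlashModeSupported:.
func makePhotoSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    settings.flashMode = photoOutput.supportedFlashModes.contains(.auto) ? .auto : .off
    return settings
}

The returned settings would then be passed to capturePhoto(with:delegate:).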

AVCaptureVideoPreviewLayer and preview from camera position

余生颓废 submitted on 2019-12-05 04:11:41
Question: I'm developing an app that lets the user take photos. I started from the AVCam sample Apple provides, but I have a problem: I simply cannot position the camera layer where I want; it is placed automatically in the center of the view. On the left you can see what I actually have, on the right what I'd like to have. The view that contains the preview coming from the camera is a UIView subclass, and this is the code: class AVPreviewView : UIView { override class func layerClass(
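The pattern the snippet starts is to make the view's backing layer the preview layer itself, so the video follows the view's frame and Auto Layout constraints instead of staying wherever a manually added sublayer was placed. A sketch of the same idea in current Swift syntax:

import AVFoundation
import UIKit

class AVPreviewView: UIView {
    // Make the backing layer an AVCaptureVideoPreviewLayer so the video
    // is always exactly where the view is.
    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }

    var previewLayer: AVCaptureVideoPreviewLayer {
        return layer as! AVCaptureVideoPreviewLayer
    }

    var session: AVCaptureSession? {
        get { return previewLayer.session }
        set { previewLayer.session = newValue }
    }
}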

Exporting AVCaptureSession video in a size that matches the preview layer

余生颓废 submitted on 2019-12-04 09:34:34
Question: I'm recording video using AVCaptureSession with the session preset AVCaptureSessionPreset640x480. I'm using an AVCaptureVideoPreviewLayer at a non-standard size (300 x 300) with the gravity set to aspect fill while recording. It's set up like this: self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession]; _previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; _previewLayer.frame = _previewView.bounds; // 300 x 300 [_previewView.layer addSublayer:
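A sketch of one way to make the export match what the square, aspect-filled preview showed: render through an AVMutableVideoComposition whose renderSize is the centered square of the 640x480 source. The 480-point square and the lack of rotation/orientation handling are assumptions:

import AVFoundation
import CoreMedia

// Build a video composition that crops a 640x480 track to its centered square,
// mimicking an aspect-fill square preview.
func makeSquareComposition(for asset: AVAsset) -> AVMutableVideoComposition? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }

    let side: CGFloat = 480                       // largest centered square in 640x480
    let xOffset = (track.naturalSize.width - side) / 2.0

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)

    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    // Shift the video left so the centered square lands at the origin.
    layerInstruction.setTransform(CGAffineTransform(translationX: -xOffset, y: 0), at: .zero)
    instruction.layerInstructions = [layerInstruction]

    let composition = AVMutableVideoComposition()
    composition.renderSize = CGSize(width: side, height: side)
    composition.frameDuration = CMTime(value: 1, timescale: 30)
    composition.instructions = [instruction]
    return composition
}

The returned composition would then be assigned to an AVAssetExportSession's videoComposition before exporting.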

Getting actual NSString of AvCaptureVideoDataOutput availableVideoCVPixelFormatTypes

限于喜欢 submitted on 2019-12-04 03:18:28
I am trying to find the accepted formats on an AVFoundation output: self.theOutput = [[AVCaptureVideoDataOutput alloc] init]; if ([self.theSession canAddOutput:self.theOutput]) [self.theSession addOutput:self.theOutput]; I then insert a breakpoint right after that and run: po [self.theOutput availableVideoCVPixelFormatTypes] and I get this: (NSArray *) $5 = 0x2087ad00 <__NSArrayM 0x2087ad00>( 875704438, 875704422, 1111970369 ) How do I get the string values of these format types? Thanks. On an iPhone 5 running iOS 6, here are the AVCaptureVideoDataOutput availableVideoCVPixelFormatTypes:
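Those numbers are FourCC codes packed into an OSType. A small Swift helper (the name is mine) that turns them back into their four-character form, which can then be matched against the kCVPixelFormatType_* constants:

import CoreVideo

// Unpack an OSType pixel format (e.g. 875704438) into its four-character code.
func fourCCString(_ type: OSType) -> String {
    let bytes: [UInt8] = [
        UInt8((type >> 24) & 0xFF),
        UInt8((type >> 16) & 0xFF),
        UInt8((type >> 8) & 0xFF),
        UInt8(type & 0xFF),
    ]
    return String(bytes: bytes, encoding: .ascii) ?? String(type)
}

// fourCCString(875704438)  -> "420v" (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)
// fourCCString(875704422)  -> "420f" (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
// fourCCString(1111970369) -> "BGRA" (kCVPixelFormatType_32BGRA)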