avcapturedevice

Can't use AVCaptureDevice with a flash

Submitted by 早过忘川 on 2019-12-01 22:53:12
I am having a difficult time with something I think ought to be simple: I just want the flash to fire when taking a picture in my iOS app, and everything I have tried either fails or works only about 20 percent of the time. Here is the code that lights the flash:

// Here we have: captureDevice.hasFlash && captureDevice.isFlashModeSupported(.On)
do {
    try captureDevice.lockForConfiguration()
    captureDevice.flashMode = .On
    captureDevice.unlockForConfiguration()
} catch let error as NSError {
    print("captureDevice.lockForConfiguration FAILED")
    print(error.code)
}

I have tried several flavors of the code, by moving the 2…
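
A minimal sketch of the per-photo approach available on newer SDKs, where the flash is requested through AVCapturePhotoSettings instead of setting flashMode on the device; the photoOutput and delegate parameters here are assumptions, and the surrounding session setup is omitted:

import AVFoundation

// Sketch: request the flash per photo via AVCapturePhotoSettings (iOS 10+).
// Assumes `photoOutput` is an AVCapturePhotoOutput already added to a
// running AVCaptureSession and `delegate` handles the captured photo.
func capturePhotoWithFlash(using photoOutput: AVCapturePhotoOutput,
                           delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()
    if photoOutput.supportedFlashModes.contains(.on) {
        settings.flashMode = .on
    }
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}

With this route there is no need to lock the device configuration at all; the photo output fires the flash for just that capture.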

How do I record a video on iOS without using a preset?

Submitted by 一个人想着一个人 on 2019-12-01 21:39:55
The simplest way to record a video on iOS is to set an AVCaptureSession.sessionPreset, but that doesn't work for me, since I want to control parameters like binning, stabilization (cinematic, standard, or none) and ISO. I find the format I want and assign it to activeFormat, but when I try to start recording I get an error:

Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCaptureMovieFileOutput startRecordingToOutputFileURL:recordingDelegate:] No active/enabled connections'

Here is my initialisation code:

let device = AVCaptureDevice.defaultDevice(…
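
The excerpt cuts off before the full setup, so the following is only a sketch of an ordering that is commonly reported to avoid the "No active/enabled connections" error: add the input and the movie-file output first, then lock the device and choose the activeFormat. The 60 fps selection criterion is a placeholder assumption:

import AVFoundation

let session = AVCaptureSession()
let movieOutput = AVCaptureMovieFileOutput()

guard let device = AVCaptureDevice.default(for: .video),
      let input = try? AVCaptureDeviceInput(device: device) else { fatalError("no camera") }

// Build the connections first...
session.beginConfiguration()
if session.canAddInput(input) { session.addInput(input) }
if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }
session.commitConfiguration()

// ...then pick a format; setting activeFormat overrides the session preset.
if let format = device.formats.first(where: { f in
    f.videoSupportedFrameRateRanges.contains { $0.maxFrameRate >= 60 }
}) {
    try? device.lockForConfiguration()
    device.activeFormat = format
    device.unlockForConfiguration()
}

session.startRunning()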

AVAudioSession setCategory not working

Submitted by 瘦欲@ on 2019-11-30 21:56:27
I have a video capturing app and I want to be able to play background music while recording audio+video. I can accomplish this if I set the AVAudioSession category to PlayAndRecord in didFinishLaunchingWithOptions. However, this causes a glitch in the audio whenever the view with the camera enters or exits the foreground, and it's apparently impossible to get rid of: https://forums.developer.apple.com/message/74778#74778 I can live with the glitch if it only happens when I start/stop recording video, but that means I need to change the AVAudioSession category from Ambient to PlayAndRecord when…
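
A sketch of that switch, assuming the category is changed only around recording and that mixing with other audio is wanted; the function names are illustrative, not from the question:

import AVFoundation

// Switch to .playAndRecord (mixing with other audio) right before recording...
func enterRecordingAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .videoRecording,
                            options: [.mixWithOthers, .defaultToSpeaker])
    try session.setActive(true)
}

// ...and drop back to .ambient once recording stops.
func leaveRecordingAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setActive(false, options: .notifyOthersOnDeactivation)
    try session.setCategory(.ambient)
}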

What is the role of AVCaptureDeviceType.builtInDualCamera

Submitted by 怎甘沉沦 on 2019-11-30 15:56:08
I am playing with Swift and an iPhone 7 Plus. I am working with builtInWideAngleCamera and builtInTelephotoCamera. This is great, even though I cannot get the two images simultaneously. I saw in the Apple documentation that AVCaptureDeviceType contains a builtInDualCamera entry. What is the purpose of this device in AVFoundation, given that we cannot do anything (zoom, depth effect) with it through the Apple API? In other words, I cannot see the difference between builtInDualCamera and builtInWideAngleCamera when working with AVCaptureDeviceType, AVCaptureSession and so on. Thanks

Dual-camera option is to choose the…
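
To make the distinction concrete, here is a short sketch of how the dual camera shows up in code; only the discovery step is shown, and the surrounding session wiring is assumed:

import AVFoundation

// The dual camera is a single "virtual" device that wraps the wide and tele
// modules. It is looked up the same way as the physical cameras:
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInDualCamera, .builtInWideAngleCamera],
    mediaType: .video,
    position: .back)

if let dual = discovery.devices.first(where: { $0.deviceType == .builtInDualCamera }) {
    // When this device is the session input, zooming via videoZoomFactor can
    // switch between the two modules automatically, and it is the device type
    // needed for depth-data capture on the iPhone 7 Plus.
    print("Found dual camera: \(dual.localizedName)")
}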

iOS: captureOutput:didOutputSampleBuffer:fromConnection is NOT called

Submitted by 狂风中的少年 on 2019-11-30 10:24:28
I want to pull frames from the live feed of an AVCaptureSession, and I am using Apple's AVCam as a test case. Here is the link to AVCam: https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html I found that captureOutput:didOutputSampleBuffer:fromConnection: is NOT called, and I would like to know why, or what I am doing wrong. Here is what I have done: (1) I make AVCamViewController a delegate: @interface AVCamViewController () <AVCaptureFileOutputRecordingDelegate, AVCaptureVideoDataOutputSampleBufferDelegate> (2) I created an AVCaptureVideoDataOutput object and add…
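
For reference, a Swift sketch of the minimal wiring that makes the per-frame callback fire (the question itself is Objective-C; the class and queue names here are illustrative). One point worth noting is that AVCam also uses an AVCaptureMovieFileOutput, and on iOS a session containing a movie-file output typically stops delivering sample buffers to an AVCaptureVideoDataOutput, which matches this symptom:

import AVFoundation

final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let videoOutput = AVCaptureVideoDataOutput()
    private let queue = DispatchQueue(label: "video.frames")   // illustrative label

    func configure() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        // The delegate and its queue must be set, and the output must be
        // added successfully, before any frames are delivered.
        videoOutput.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }
        session.commitConfiguration()
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Called once per frame while the session runs.
    }
}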

AVCaptureSession cancels background audio

Submitted by ε祈祈猫儿з on 2019-11-29 14:55:47
Question: Whenever I start an AVCaptureSession running with the microphone as an input, it cancels whatever background music is currently playing (iPod music, for instance). If I comment out the line adding the audio input, the background audio continues. Does anyone know of a way to record video clips with the microphone while continuing to allow background audio to play? I've looked around a lot and can't seem to find any references to this behavior. Thanks for any help!

Answer 1: Try setting…
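
The answer is cut off at "Try setting", so the following is only a sketch of one approach commonly used for this problem, not necessarily what the answer goes on to recommend: keep the capture session from reconfiguring the audio session itself and use a category that mixes with other audio:

import AVFoundation

let captureSession = AVCaptureSession()
// Stop the capture session from taking over the app's audio session.
captureSession.automaticallyConfiguresApplicationAudioSession = false

// Record while mixing with already-playing background audio.
let audioSession = AVAudioSession.sharedInstance()
try? audioSession.setCategory(.playAndRecord, options: [.mixWithOthers])
try? audioSession.setActive(true)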

Preventing AVCaptureVideoPreviewLayer from rotating, but allow UI layer to rotate with orientation

Submitted by 对着背影说爱祢 on 2019-11-29 02:34:40
I have two view controllers. One is the root VC and contains the UI, such as the record button. On this view controller, I also display the view of another VC at index 0. That view contains an AVCaptureVideoPreviewLayer. I would like my video camera to mimic the Apple camera app, where the interface layout adjusts with the rotation, but the video preview layer does not. You can see how the recording timer (a UILabel) in the stock video app disappears and reappears at the top depending on the orientation. Any idea how to do this? I found one suggestion that recommends adding the…
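
The suggestion referred to is cut off, so here is only a sketch of one technique often used for this layout: let the interface rotate normally, but counter-rotate the view hosting the preview layer using the transition coordinator's targetTransform, so the camera image appears fixed while the controls rotate. The class and property names are illustrative:

import AVFoundation
import UIKit

final class CameraViewController: UIViewController {
    let previewView = UIView()                   // hosts the preview layer
    private var previewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        previewView.frame = view.bounds
        view.addSubview(previewView)
    }

    func attach(session: AVCaptureSession) {
        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.videoGravity = .resizeAspectFill
        layer.frame = previewView.bounds
        previewView.layer.addSublayer(layer)
        previewLayer = layer
    }

    override func viewWillTransition(to size: CGSize,
                                     with coordinator: UIViewControllerTransitionCoordinator) {
        super.viewWillTransition(to: size, with: coordinator)
        coordinator.animate(alongsideTransition: { context in
            // targetTransform is the rotation the interface is about to make;
            // applying its inverse to the preview container cancels that
            // rotation visually, while the rest of the UI rotates as usual.
            self.previewView.transform = self.previewView.transform
                .concatenating(context.targetTransform.inverted())
        }, completion: nil)
    }
}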