Question
I'm using an ARSession combined with an ARFaceTrackingConfiguration to track my face. At the same time, I would like to record a video from the front-facing camera of my iPhone X. To do so I'm using AVCaptureSession, but as soon as I start recording, the ARSession gets interrupted.
These are two snippets of code:
// Face tracking
let configuration = ARFaceTrackingConfiguration()
configuration.isLightEstimationEnabled = false
let session = ARSession()
session.run(configuration, options: [.removeExistingAnchors, .resetTracking])
// Video recording
let session = AVCaptureSession()
let output = AVCaptureMovieFileOutput()
let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front)!
let input = try! AVCaptureDeviceInput(device: camera)
session.addInput(input)
session.addOutput(output)
Does anybody know how to do the two things at the same time? Apps like Snapchat allow users to record and use the True Depth sensor at the same time so I imagine what I'm asking is perfectly feasible. Thanks!
Answer 1:
ARKit runs its own AVCaptureSession, and there can be only one capture session running at a time: if you run your own capture session, you preempt ARKit's, which prevents ARKit from working.

However, ARKit does provide access to the camera pixel buffers it receives from its capture session, so you can record video by feeding those sample buffers to an AVAssetWriter. (It's basically the same workflow you'd use when recording video from AVCaptureVideoDataOutput, a lower-level way of doing video recording compared to AVCaptureMovieFileOutput.)
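Here is a minimal sketch of that AVAssetWriter workflow, assuming your object is the ARSession's delegate; the ARVideoRecorder class name, the H.264 output settings, and the timestamp handling are illustrative assumptions rather than code from the answer:

import ARKit
import AVFoundation

// Hypothetical recorder: appends ARKit's captured pixel buffers to an AVAssetWriter.
final class ARVideoRecorder: NSObject, ARSessionDelegate {
    private var writer: AVAssetWriter?
    private var writerInput: AVAssetWriterInput?
    private var adaptor: AVAssetWriterInputPixelBufferAdaptor?
    private var startTime: CMTime?

    func startRecording(to url: URL, width: Int, height: Int) throws {
        let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        input.expectsMediaDataInRealTime = true
        let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                           sourcePixelBufferAttributes: nil)
        writer.add(input)
        writer.startWriting()
        self.writer = writer
        self.writerInput = input
        self.adaptor = adaptor
    }

    // ARSessionDelegate: ARKit hands you every camera frame it captures.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let writer = writer, let input = writerInput, let adaptor = adaptor else { return }
        let time = CMTime(seconds: frame.timestamp, preferredTimescale: 600)
        if startTime == nil {
            writer.startSession(atSourceTime: time)
            startTime = time
        }
        if input.isReadyForMoreMediaData {
            // frame.capturedImage is the CVPixelBuffer from ARKit's own capture session.
            adaptor.append(frame.capturedImage, withPresentationTime: time)
        }
    }

    func stopRecording(completion: @escaping () -> Void) {
        writerInput?.markAsFinished()
        writer?.finishWriting(completionHandler: completion)
        writer = nil; writerInput = nil; adaptor = nil; startTime = nil
    }
}

You would set an instance of this recorder as the ARSession's delegate before calling session.run(_:options:), so its session(_:didUpdate:) method receives every frame.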
You can also feed the ARKit camera pixel buffers (see ARFrame.capturedImage) to other technologies that work with live camera imagery, like the Vision framework. Apple has a sample code project demonstrating such usage.
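As a rough illustration of the Vision idea (the face-rectangles request and the .right orientation are assumptions for a portrait front camera, not part of the answer):

import ARKit
import Vision

// In your ARSessionDelegate, pass ARKit's pixel buffer to a Vision request.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let request = VNDetectFaceRectanglesRequest { request, _ in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        print("Vision sees \(faces.count) face(s)")
    }
    // The captured image is delivered unrotated; .right is typical in portrait orientation.
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                        orientation: .right,
                                        options: [:])
    try? handler.perform([request])
}

In practice you would throttle this or run it off the delegate queue so the Vision work does not stall ARKit's frame delivery.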
Source: https://stackoverflow.com/questions/50880868/record-video-from-front-facing-camera-during-arkit-arsession-on-iphone-x