How to capture depth data from camera in iOS 11 and Swift 4?

Asked by 一个人的身影, 2020-12-09 06:36

I'm trying to get depth data from the camera in iOS 11 with AVDepthData, but when I set up a photoOutput with the AVCapturePhotoCaptureDelegate, photo.depthData is nil.

4 Answers
  • 2020-12-09 06:50

    To add more detail to @klinger's answer, here is what you need to do to get the depth value for each pixel. I wrote some comments, hope it helps!

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    
        //## Convert Disparity to Depth ##
    
        guard let depthData = photo.depthData?.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32) else { return }
        let depthDataMap = depthData.depthDataMap //AVDepthData -> CVPixelBuffer
    
        //## Data Analysis ##
    
        // Useful data
        let width = CVPixelBufferGetWidth(depthDataMap) //768 on an iPhone 7+
        let height = CVPixelBufferGetHeight(depthDataMap) //576 on an iPhone 7+
        CVPixelBufferLockBaseAddress(depthDataMap, CVPixelBufferLockFlags(rawValue: 0))
    
        // Convert the base address to a pointer of the appropriate type
        let floatBuffer = unsafeBitCast(CVPixelBufferGetBaseAddress(depthDataMap), to: UnsafeMutablePointer<Float32>.self)
    
        // Read the data (each value is a Float32 distance in meters)
        // Valid indices run from 0 to (width * height) - 1;
        // the value for the pixel at coordinates (x, y) is at index y * width + x
        let distanceAtXYPoint = floatBuffer[y * width + x]
    
        CVPixelBufferUnlockBaseAddress(depthDataMap, CVPixelBufferLockFlags(rawValue: 0))
    }
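
    If you need more than one pixel, you can walk the whole buffer the same way. Below is a rough sketch (not part of the original answer, and assuming `import AVFoundation`, which pulls in CoreVideo) that averages every valid depth value in the map; it assumes the buffer has already been converted to kCVPixelFormatType_DepthFloat32 as above and that rows are tightly packed (strictly you should honor CVPixelBufferGetBytesPerRow).

    // Sketch: mean depth (in meters) of a DepthFloat32 pixel buffer
    func averageDepth(of depthDataMap: CVPixelBuffer) -> Float32 {
        CVPixelBufferLockBaseAddress(depthDataMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthDataMap, .readOnly) }
    
        let width = CVPixelBufferGetWidth(depthDataMap)
        let height = CVPixelBufferGetHeight(depthDataMap)
        let floatBuffer = unsafeBitCast(CVPixelBufferGetBaseAddress(depthDataMap), to: UnsafeMutablePointer<Float32>.self)
    
        var sum: Float32 = 0
        var count = 0
        for y in 0..<height {
            for x in 0..<width {
                let depth = floatBuffer[y * width + x]
                if depth.isFinite { // depth maps contain NaN where no depth was measured
                    sum += depth
                    count += 1
                }
            }
        }
        return count > 0 ? sum / Float32(count) : 0
    }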
    
  • 2020-12-09 06:52

    There are two ways to do this, and you are trying to do both at once:

    1. Capture depth data along with the image. This is done by using the photo.depthData object from photoOutput(_:didFinishProcessingPhoto:error:). I explain why this did not work for you below.
    2. Use an AVCaptureDepthDataOutput and implement depthDataOutput(_:didOutput:timestamp:connection:). I am not sure why this did not work for you, but implementing depthDataOutput(_:didDrop:timestamp:connection:reason:) might help you figure out why.

    I think that #1 is a better option, because it pairs the depth data with the image. Here's how you would do that:

    @IBAction func capture(_ sender: Any) {
    
        let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        settings.isDepthDataDeliveryEnabled = true
        self.sessionOutput?.capturePhoto(with: settings, delegate: self)
    
    }
    
    // ...
    
    override func viewDidLoad() {
        // ...
        self.sessionOutput = AVCapturePhotoOutput()
        // ... (add the output to a session that uses a depth-capable device first,
        // then check support before enabling depth delivery)
        if self.sessionOutput?.isDepthDataDeliverySupported == true {
            self.sessionOutput?.isDepthDataDeliveryEnabled = true
        }
        // ...
    }
    

    Then, photo.depthData shouldn't be nil. Make sure to read Apple's documentation on photo capture and depth data for more information about obtaining depth data.
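
    For reference, here is a rough sketch (not from the original answer; captureSession and configureSession are placeholder names) of the session setup the snippets above assume. The dual camera is required so that depth delivery is actually supported:

    let captureSession = AVCaptureSession()
    var sessionOutput: AVCapturePhotoOutput?
    
    func configureSession() {
        captureSession.beginConfiguration()
        captureSession.sessionPreset = .photo
    
        // Depth data is only available from a depth-capable device (dual camera here)
        guard let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: device),
              captureSession.canAddInput(input) else { return }
        captureSession.addInput(input)
    
        let photoOutput = AVCapturePhotoOutput()
        guard captureSession.canAddOutput(photoOutput) else { return }
        captureSession.addOutput(photoOutput)
        // Enable depth delivery only after the output is attached to the configured session
        photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
        sessionOutput = photoOutput
    
        captureSession.commitConfiguration()
        captureSession.startRunning()
    }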

    For #2, I'm not quite sure why depthDataOutput(_:didOutput:timestamp:connection:) isn't being called, but you should implement depthDataOutput(_:didDrop:timestamp:connection:reason:) to see if depth data is being dropped for some reason.
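
    If you do want to try #2, here is a hedged sketch (not from the original answer; DepthReceiver and dataOutputQueue are placeholder names) of wiring up an AVCaptureDepthDataOutput and its delegate, including the didDrop callback for debugging:

    class DepthReceiver: NSObject, AVCaptureDepthDataOutputDelegate {
    
        let depthDataOutput = AVCaptureDepthDataOutput()
        let dataOutputQueue = DispatchQueue(label: "depth queue") // placeholder queue
    
        func attach(to captureSession: AVCaptureSession) {
            guard captureSession.canAddOutput(depthDataOutput) else { return }
            captureSession.addOutput(depthDataOutput)
            depthDataOutput.isFilteringEnabled = true // hole-filled, temporally smoothed maps
            depthDataOutput.setDelegate(self, callbackQueue: dataOutputQueue)
        }
    
        // Called for every depth frame that is delivered
        func depthDataOutput(_ output: AVCaptureDepthDataOutput, didOutput depthData: AVDepthData,
                             timestamp: CMTime, connection: AVCaptureConnection) {
            let depthMap = depthData.depthDataMap
            print("Got depth frame:", CVPixelBufferGetWidth(depthMap), "x", CVPixelBufferGetHeight(depthMap))
        }
    
        // Called when a depth frame is dropped – useful for seeing why nothing arrives
        func depthDataOutput(_ output: AVCaptureDepthDataOutput, didDrop depthData: AVDepthData,
                             timestamp: CMTime, connection: AVCaptureConnection,
                             reason: AVCaptureOutput.DataDroppedReason) {
            print("Dropped depth frame, reason:", reason.rawValue)
        }
    }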

  • 2020-12-09 06:52

    The way you initialize your capture device is not right.

    You should use the dual camera.

    In Objective-C, it looks like this:

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInDualCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionBack];
    
  • 2020-12-09 07:05

    First, you need to use the dual camera, otherwise you won't get any depth data.

    let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back)
    

    And keep a reference to your queue

    let dataOutputQueue = DispatchQueue(label: "data queue", qos: .userInitiated, attributes: [], autoreleaseFrequency: .workItem)
    

    You'll also probably want to synchronize the video and depth data

    var outputSynchronizer: AVCaptureDataOutputSynchronizer?
    

    Then you can synchronize the two outputs in your viewDidLoad() method like this

    if sessionOutput?.isDepthDataDeliverySupported == true {
        sessionOutput?.isDepthDataDeliveryEnabled = true
        depthDataOutput?.connection(with: .depthData)?.isEnabled = true
        depthDataOutput?.isFilteringEnabled = true
        outputSynchronizer = AVCaptureDataOutputSynchronizer(dataOutputs: [sessionOutput!, depthDataOutput!])
        outputSynchronizer!.setDelegate(self, queue: self.dataOutputQueue)
    }
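
    The synchronizer then hands you the matched outputs in a single callback. Here is a rough sketch (not from the original answer; ViewController is a placeholder for whatever class owns depthDataOutput) of the AVCaptureDataOutputSynchronizerDelegate method:

    extension ViewController: AVCaptureDataOutputSynchronizerDelegate {
    
        func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                                    didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
            // Pull the depth data that was captured together with this frame
            guard let syncedDepth = synchronizedDataCollection.synchronizedData(for: depthDataOutput!) as? AVCaptureSynchronizedDepthData,
                  !syncedDepth.depthDataWasDropped else { return }
    
            let depthMap = syncedDepth.depthData.depthDataMap
            print("Depth frame:", CVPixelBufferGetWidth(depthMap), "x", CVPixelBufferGetHeight(depthMap))
        }
    }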
    

    I would recommend watching WWDC session 507 - they also provide a full sample app that does exactly what you want.

    https://developer.apple.com/videos/play/wwdc2017/507/
