Run and Pause an ARSession in a specified period of time

Submitted by 冷眼眸甩不掉的悲伤 on 2019-12-10 12:27:52

Question


I'm developing an ARKit/Vision iOS app with gesture recognition. The app has a simple UI containing a single UIView; there's no ARSCNView/ARSKView at all. I'm taking the sequence of captured ARFrames as CVPixelBuffers, which I then use for VNRecognizedObjectObservation requests.

I don't need any tracking data from the session. I just need currentFrame.capturedImage as a CVPixelBuffer, and I need to capture ARFrames at 30 fps; 60 fps is excessive.

The preferredFramesPerSecond instance property is useless in my case, because it only controls the rendering frame rate of an ARSCNView/ARSKView, which I don't have. It doesn't affect the session's capture frame rate.

So I decided to use the run() and pause() methods to decrease the session's frame rate.

Question

I'd like to know how to automatically run and pause an ARSession at a specified interval. Each run and pause phase must last 16 ms (0.016 s). I suppose it might be possible with DispatchQueue, but I don't know how to implement it.

How to do it?

Here's a pseudo-code:

session.run(configuration)

    /*  run lasts 16 ms  */

session.pause()

    /*  pause lasts 16 ms  */

session.run(session.configuration!)

    /*  etc...  */
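The pseudo-code above could be sketched with a repeating DispatchSourceTimer that flips between the two states on each tick. This is a minimal, hypothetical sketch: `SessionToggler`, `onRun`, and `onPause` are illustrative names, not ARKit API; in the real app the two closures would call session.run(configuration) and session.pause(). Note that GCD timers carry no real-time guarantee, which is exactly the objection raised in the first answer below.

```swift
import Dispatch
import Foundation

// Hypothetical helper: toggles between a "running" and a "paused" state
// every `interval` on the given queue. In an ARKit app, `onRun` would call
// session.run(configuration) and `onPause` would call session.pause().
final class SessionToggler {
    private let timer: DispatchSourceTimer
    private(set) var isRunning = false
    private(set) var toggleCount = 0
    private let onRun: () -> Void
    private let onPause: () -> Void

    init(queue: DispatchQueue,
         interval: DispatchTimeInterval = .milliseconds(16),
         onRun: @escaping () -> Void,
         onPause: @escaping () -> Void) {
        self.onRun = onRun
        self.onPause = onPause
        timer = DispatchSource.makeTimerSource(queue: queue)
        timer.schedule(deadline: .now(), repeating: interval)
        timer.setEventHandler { [weak self] in
            guard let self = self else { return }
            self.isRunning.toggle()
            self.toggleCount += 1
            // Alternate between the two phases on every tick.
            if self.isRunning { self.onRun() } else { self.onPause() }
        }
    }

    func start() { timer.resume() }
    func stop()  { timer.cancel() }
}
```

Because the timer only approximates the 16 ms period (GCD applies leeway and queue scheduling delays), the phases will drift; this is a sketch of the idea, not a recommendation.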

P.S. I can use neither CocoaPods nor Carthage in my app.

Update: here's how the ARSession's currentFrame.capturedImage is retrieved and used:

let session = ARSession()

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)

    session.delegate = self
    let configuration = ARImageTrackingConfiguration() // 6DOF
    configuration.providesAudioData = false
    configuration.isAutoFocusEnabled = true            
    configuration.isLightEstimationEnabled = false
    configuration.maximumNumberOfTrackedImages = 0
    session.run(configuration)  

    spawnCoreMLUpdate()
}

func spawnCoreMLUpdate() {    // Spawning new async tasks

    dispatchQueue.async {
        self.spawnCoreMLUpdate()
        self.updateCoreML()
    }
}

func updateCoreML() {

    guard let pixelBuffer = session.currentFrame?.capturedImage else { return }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let imageRequestHandler = VNImageRequestHandler(ciImage: ciImage, options: [:])
    do {
        try imageRequestHandler.perform(self.visionRequests)
    } catch {
        print(error)
    }
}

Answer 1:


I don't think the run() and pause() strategy is the way to go, because the DispatchQueue API is not designed for real-time accuracy: there is no guarantee that the pause will last exactly 16 ms each time. On top of that, restarting a session may not be immediate and could add further delay.

Also, the code you shared will capture at most one image, and since session.run(configuration) is asynchronous it will probably capture no frame at all.

Since you're not using an ARSCNView/ARSKView, the only way is to implement the ARSession delegate to be notified of every captured frame.

Of course the delegate will most likely be called every 16 ms, because that's how the camera works. But you can decide which frames to process: using the frame's timestamp, you can process one frame every 32 ms and drop the others, which is equivalent to 30 fps processing.

Here is some code to get you started; make sure that dispatchQueue is serial (not concurrent) so your buffers are processed sequentially:

var lastProcessedFrame: ARFrame?

func session(_ session: ARSession, didUpdate frame: ARFrame) {
  dispatchQueue.async {
    self.updateCoreML(with: frame)
  }
}

private func shouldProcessFrame(_ frame: ARFrame) -> Bool {
  guard let lastProcessedFrame = lastProcessedFrame else {
    // Always process the first frame
    return true
  }
  return frame.timestamp - lastProcessedFrame.timestamp >= 0.032 // 32ms for 30fps
}

func updateCoreML(with frame: ARFrame) {

  guard shouldProcessFrame(frame) else {
    // Less than 32ms with the previous frame
    return
  }
  lastProcessedFrame = frame
  let pixelBuffer = frame.capturedImage
  let imageRequestHandler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
  do {
    try imageRequestHandler.perform(self.visionRequests)
  } catch {
    print(error)
  }
}



Answer 2:


If what you want is to reduce the frame rate from 60 to 30, you should use the preferredFramesPerSecond property of SCNView. I'm assuming you're using an ARSCNView, which is a subclass of SCNView.

Property documentation.
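For completeness, the relevant configuration fragment (only meaningful if you do render through an ARSCNView, since the property throttles rendering, not the session's capture rate; `sceneView` is a hypothetical outlet name):

```swift
// Ask SceneKit to render at 30 fps instead of the display's native rate.
// This does not change how often the camera delivers ARFrames.
sceneView.preferredFramesPerSecond = 30
```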




Answer 3:


If I understand it correctly, you can achieve this via DispatchQueue. If you run the code below, it prints HHH first, then waits for one second, then prints ABC. You can substitute your own functions, and of course change the time interval from 1 to your desired value.

let syncConc = DispatchQueue(label: "con", attributes: .concurrent)

DispatchQueue.global(qos: .utility).async {
    syncConc.async {
        for _ in 0...10 {
            print("HHH - \(Thread.current)")
            Thread.sleep(forTimeInterval: 1)
            print("ABC - \(Thread.current)")
        }
    }
}

PS: I'm still not sure if Thread.sleep will block your process; if it does, I'll edit my answer.



Source: https://stackoverflow.com/questions/53685090/run-and-pause-an-arsession-in-a-specified-period-of-time
