Convert ARFrame's captured image to UIImage orientation issue

小鲜肉 2020-12-31 20:27

I want to detect a ball and have an AR model interact with it. I use OpenCV for the ball detection and send the center of the ball, which I can then use in hitTest to get world coordinates.
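
Roughly, the pipeline I have in mind is sketched below (detectBallCenter(in:) and currentCameraImage are just placeholders for my OpenCV code and the camera frame, and sceneView is the ARSCNView):

    // Rough sketch of the intended pipeline. detectBallCenter(in:) is a
    // placeholder for the OpenCV detection; it is assumed to return the
    // ball's center as a CGPoint in the scene view's coordinate space.
    if let center = detectBallCenter(in: currentCameraImage) {
        // Project the 2D point back into the 3D scene
        let results = sceneView.hitTest(center, types: .featurePoint)
        if let transform = results.first?.worldTransform {
            let position = SCNVector3(transform.columns.3.x,
                                      transform.columns.3.y,
                                      transform.columns.3.z)
            // place or move the AR model at `position`
        }
    }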

1 Answer
  • 2020-12-31 20:32

    1. Is there another way to convert the image with the correct orientation?

    You can try the snapshot() method of ARSCNView (inherited from SCNView), which:

    Draws the contents of the view and returns them as a new image object

    So, if you have an outlet like:

    @IBOutlet var arkitSceneView: ARSCNView!
    

    you only need to do this:

    let imageFromArkitScene: UIImage = arkitSceneView.snapshot()   // snapshot() returns a non-optional UIImage
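
    As a rough sketch of how this could feed the ball detection (detectBallCenter(in:) is the hypothetical OpenCV wrapper from the question, not a real API):

    func processCurrentFrame() {
        // The snapshot already matches the view's current orientation, so no
        // manual re-orientation is needed. Note that it also contains any
        // virtual content SceneKit has already rendered over the camera feed.
        let viewImage: UIImage = arkitSceneView.snapshot()
        // Assuming the detector reports the center in the view's point space:
        if let center = detectBallCenter(in: viewImage) {
            _ = arkitSceneView.hitTest(center, types: .featurePoint)
        }
    }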
    

    2. Why does this happen?

    It's because the CVPixelBuffer comes from ARFrame, which is:

    captured (continuously) from the device camera, by the running AR session.

    Well, since the camera sensor's orientation does not change when the device rotates (the two are independent), you need to re-orient the frame captured by the camera so that it matches the current view, by applying the affine transform returned by displayTransform(for:viewportSize:):

    Returns an affine transform for converting between normalized image coordinates and a coordinate space appropriate for rendering the camera image onscreen.

    The documentation covers this in more detail; a usage example:

    let orient = UIApplication.shared.statusBarOrientation    // current interface orientation
    let viewportSize = yourSceneView.bounds.size               // size the frame is displayed at
    // Transform from the sensor's orientation to the on-screen orientation
    let transform = frame.displayTransform(for: orient, viewportSize: viewportSize).inverted()
    let pixelBuffer = frame.capturedImage                      // the CVPixelBuffer from ARFrame
    let finalImage = CIImage(cvPixelBuffer: pixelBuffer).transformed(by: transform)
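
    Since the goal is a UIImage, you can then render the transformed CIImage through a CIContext (a sketch; in practice create the context once and reuse it rather than per frame):

    let ciContext = CIContext()
    if let cgImage = ciContext.createCGImage(finalImage, from: finalImage.extent) {
        let uiImage = UIImage(cgImage: cgImage)
        // uiImage now matches the on-screen orientation
    }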
    