Question
I want to detect a ball and have an AR model interact with it. I used OpenCV for ball detection and send the center of the ball, which I can use with hitTest to get coordinates in the sceneView. I have been converting the CVPixelBuffer to a UIImage using the following function:
static func convertToUIImage(buffer: CVPixelBuffer) -> UIImage? {
    // Wrap the pixel buffer in a CIImage and rasterize it with a CIContext.
    let ciImage = CIImage(cvPixelBuffer: buffer)
    let temporaryContext = CIContext(options: nil)
    let rect = CGRect(x: 0, y: 0,
                      width: CVPixelBufferGetWidth(buffer),
                      height: CVPixelBufferGetHeight(buffer))
    if let temporaryImage = temporaryContext.createCGImage(ciImage, from: rect) {
        return UIImage(cgImage: temporaryImage)
    }
    return nil
}
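For context, a minimal sketch of how such a helper could be driven from the ARSessionDelegate callback; the enclosing type name ImageConverter and the detectBall(in:) wrapper are placeholders for illustration, not actual API or code from this project:

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // frame.capturedImage is the raw camera pixel buffer for this frame.
    guard let image = ImageConverter.convertToUIImage(buffer: frame.capturedImage) else { return }
    // Placeholder OpenCV-based detection; the returned point is in image
    // coordinates and still has to be mapped into view coordinates before
    // it can be used with hitTest.
    if let ballCenter = detectBall(in: image) {
        // ... map ballCenter to view coordinates, then call sceneView.hitTest(...)
    }
}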
This gave me a rotated image.
Then I found out about changing the orientation using:
let capturedImage = UIImage(cgImage: temporaryImage, scale: 1.0, orientation: .right)
While this gave the correct orientation while the device is in portrait, rotating the device to landscape again gave a rotated image.
Now I am thinking about handling it using viewWillTransition. But before that, I want to know:
- Is there another way to convert the image with the correct orientation?
- Why does this happen?
Answer 1:
1. Is there another way to convert the image with the correct orientation?
You may try to use snapshot() of ARSCNView (inherited from SCNView), which:
Draws the contents of the view and returns them as a new image object
So if you have an outlet like:
@IBOutlet var arkitSceneView: ARSCNView!
you only need to do this:
let imageFromArkitScene: UIImage? = arkitSceneView.snapshot()
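As a minimal sketch of how this could feed the detection step (detectBall(in:) is again a placeholder, not a real API): the snapshot already matches the view's current interface orientation, so no manual rotation is needed; it does, however, include any virtual SceneKit content already rendered on top of the camera image:

func processCurrentFrame() {
    // snapshot() renders the view's current contents in the view's own
    // (screen) orientation.
    let image = arkitSceneView.snapshot()
    // Placeholder OpenCV-based detection; the detected point corresponds to
    // the view's coordinate space (up to the image's scale factor), so it can
    // be fed to hitTest(_:types:) with little extra mapping.
    if let ballCenter = detectBall(in: image) {
        let hits = arkitSceneView.hitTest(ballCenter, types: .featurePoint)
        // ... use hits to position the AR model
    }
}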
2. Why does this happen?
It's because the CVPixelBuffer comes from ARFrame, which is:
captured (continuously) from the device camera, by the running AR session.
Well, since the camera orientation does not change with the rotation of the device (they are separate), to adjust the orientation of your frame to the current view you should re-orient the image captured by your camera, applying the affine transform obtained from displayTransform(for:viewportSize:):
Returns an affine transform for converting between normalized image coordinates and a coordinate space appropriate for rendering the camera image onscreen.
You may find good documentation here; usage example:
let orient = UIApplication.shared.statusBarOrientation
let viewportSize = yourSceneView.bounds.size
let transform = frame.displayTransform(for: orient, viewportSize: viewportSize).inverted()
var finalImage = CIImage(cvPixelBuffer: pixelBuffer).transformed(by: transform)
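To actually hand the re-oriented frame to something like OpenCV, the transformed CIImage still needs to be rasterized. A minimal sketch continuing the snippet above (the helper name and the assumption that the scene view and frame are passed in are mine, not from the original answer):

func orientedImage(from frame: ARFrame, in sceneView: ARSCNView) -> UIImage? {
    let orient = UIApplication.shared.statusBarOrientation
    let viewportSize = sceneView.bounds.size
    // Transform that maps the sensor-oriented captured image into the
    // coordinate space used to draw the camera image on screen.
    let transform = frame.displayTransform(for: orient, viewportSize: viewportSize).inverted()
    let ciImage = CIImage(cvPixelBuffer: frame.capturedImage).transformed(by: transform)
    // Rasterize with a CIContext; in production, reuse one context instead of
    // creating a new one per frame, since context creation is expensive.
    let context = CIContext(options: nil)
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}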
Source: https://stackoverflow.com/questions/48456015/convert-arframes-captured-image-to-uiimage-orientation-issue