Question
I am using Firebase ML Kit for Face Detection, and in the documentation it says:
If necessary, rotate the image so that its imageOrientation property is .up. Create a VisionImage object using the correctly-rotated UIImage. Do not specify any rotation metadata—the default value, .topLeft, must be used.
I am running into a problem where photos I download from the internet tend to work properly, but photos I take with my camera do not. I have a feeling it is due to the way the images are oriented, and I can't figure out how to check the images to ensure the two requirements listed above are satisfied. I tried printing out the image's imageOrientation, but it wasn't helping me much, and for some reason I could not use UIImageOrientationUp (the Swift name is UIImage.Orientation.up), which I saw used in a different Stack Overflow answer.
This is what gets printed when I try to print the image's orientation:
"<NSLayoutConstraint:0x2809f9a40 'UISV-alignment' UIImageView:0x13de4d4b0.bottom == UILabel:0x13dec1630'orient's Profile'.bottom (active)>",
"<NSLayoutConstraint:0x2809f9a90 'UISV-alignment' UIImageView:0x13de4d4b0.top == UILabel:0x13dec1630'orient's Profile'.top (active)>",
Anyway, if someone could help me write a function I can use to ensure the image I am about to pass to ML Kit is oriented correctly, I would really appreciate it. Thanks! I am an iOS novice and this is my first "real" app, so I'm sorry if there is a better or easier way to accomplish my goal.
*** So I found that when I take a picture with my camera it is oriented to .right, but it looks fine in the actual image view. I tried changing the orientation to .up, but then the image is actually rotated to the right and the detection still failed... I think I need to change the orientation to .up without actually rotating the image, if that is possible, because when I try to set imageOrientation directly it says it is a get-only property.
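One approach I'm considering is to redraw the image into a graphics context, which bakes the rotation into the pixel data so the result reports .up without looking rotated. A minimal sketch (normalizedImage is just a helper name I made up):

import UIKit

// Sketch: redraw the image so the pixels themselves are upright
// and imageOrientation comes back as .up.
func normalizedImage(_ image: UIImage) -> UIImage {
    // Already upright; nothing to do.
    guard image.imageOrientation != .up else { return image }
    let renderer = UIGraphicsImageRenderer(size: image.size, format: image.imageRendererFormat)
    return renderer.image { _ in
        // draw(in:) honors imageOrientation, so the output looks identical.
        image.draw(in: CGRect(origin: .zero, size: image.size))
    }
}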
Answer 1:
Thanks for reaching out to us. I'm Julie from the ML Kit team; sorry for catching up on this thread late.
Yes, when a photo is captured from the camera, the default orientation is not always .up; e.g., if it is taken in portrait mode, the image's imageOrientation is .right.
The face detector is actually quite flexible in handling images whose orientation is not .up; the key step is to set the orientation correctly.
Here is an example of detecting faces in photos captured from the camera in our quickstart app; check it out to see if it solves your problem.
Basically you only need to set the imageMetadata.orientation value correctly, like this:
// Define the metadata for the image.
let imageMetadata = VisionImageMetadata()
imageMetadata.orientation = UIUtilities.visionImageOrientation(from: image.imageOrientation)
// Initialize a VisionImage object with the given UIImage.
let visionImage = VisionImage(image: image)
visionImage.metadata = imageMetadata
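The VisionImage can then be handed to the face detector as usual; roughly like this, along the lines of the quickstart (detector options omitted for brevity):

import FirebaseMLVision

// Run face detection on the correctly annotated VisionImage.
let faceDetector = Vision.vision().faceDetector()
faceDetector.process(visionImage) { faces, error in
    guard error == nil, let faces = faces, !faces.isEmpty else {
        print("No faces detected")
        return
    }
    for face in faces {
        print("Face at \(face.frame)")
    }
}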
The mapping between the two orientation types is as follows:
public static func visionImageOrientation(
  from imageOrientation: UIImage.Orientation
) -> VisionDetectorImageOrientation {
  switch imageOrientation {
  case .up:
    return .topLeft
  case .down:
    return .bottomRight
  case .left:
    return .leftBottom
  case .right:
    return .rightTop
  case .upMirrored:
    return .topRight
  case .downMirrored:
    return .bottomLeft
  case .leftMirrored:
    return .leftTop
  case .rightMirrored:
    return .rightBottom
  @unknown default:
    // Fall back to the default metadata value for any future orientation case.
    return .topLeft
  }
}
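For example, for the portrait camera photo from the question (imageOrientation is .right), the helper yields .rightTop:

let orientation = UIUtilities.visionImageOrientation(from: .right)
// orientation == .rightTop; with this set in VisionImageMetadata, the detector
// can interpret the pixel buffer correctly without the image being redrawn.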
This statement about UIImage is intended as more general guidance that applies across all ML Kit detectors:
Create a VisionImage object using the correctly-rotated UIImage. Do not specify any rotation metadata—the default value, .topLeft, must be used.
but for faces, it can be handled in a lightweight way by just setting the orientation correctly. We apologize for the confusion this brought you, and we will update the statement in the next release.
Thanks for reporting the problem, and I hope the quickstart app will be helpful to your development.
Cheers,
Julie
Source: https://stackoverflow.com/questions/61149127/ensuring-image-data-is-correctly-oriented-on-ios-app-in-swift-5