Question
This is the camera overlay for my app.
The yellow square indicates to the user that only the part of the photo inside it (in the camera preview) will be saved. It's like a crop.
When I save the captured image, it saves a zoomed photo [a big, zoomed-in photo].
What I found is that when I take a photo, it is of size {2448, 3264}.
I'm cropping the image like this:
- (UIImage *)imageByCroppingImage:(UIImage *)image toSize:(CGSize)size
{
    // Center a size.width x size.height crop rect within the image.
    double x = (image.size.width - size.width) / 2.0;
    double y = (image.size.height - size.height) / 2.0;

    CGRect cropRect = CGRectMake(x, y, size.width, size.height);
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], cropRect);

    UIImage *cropped = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);

    return cropped;
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    if (image) {
        // Crop to the 300x300 yellow square, then save to the photo album.
        UIImage *newImage = [self imageByCroppingImage:image toSize:CGSizeMake(300.f, 300.f)];
        UIImageWriteToSavedPhotosAlbum(newImage, nil, nil, nil);
    }
}
Notes:
Orientation was fixed before performing the crop, using this: http://pastebin.com/WYUkDLS0 (see the sketch after these notes).
The yellow square on the camera overlay is the same size: width=300 and height=300.
If I set the front camera for UIImagePickerController, then it gives me a perfect cropped image. Yes, this is really strange! I've tried everything from here: Cropping an UIImage. Even https://github.com/Nyx0uf/NYXImagesKit didn't help.
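For reference, a typical orientation fix of this kind redraws the image so the EXIF orientation gets baked into the pixel data before any CGImage-level cropping; the sketch below shows the common approach, and is not necessarily the exact pastebin code:

// Sketch: normalize orientation by redrawing. After this, the CGImage's
// pixel rows match what the user saw, so crop rects behave predictably.
- (UIImage *)normalizedImage:(UIImage *)image
{
    if (image.imageOrientation == UIImageOrientationUp) return image;

    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}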
Any ideas/suggestions?
Update:
From this question: Trying to crop my UIImage to a 1:1 aspect ratio (square) but it keeps enlarging the image causing it to be blurry. Why?
I followed @DrummerB's answer, like this:
CGFloat originalWidth = image.size.width * image.scale;
CGFloat originalHeight = image.size.height * image.scale;

// Largest square that fits in the image, anchored at the origin (not centered).
float smallestDimension = fminf(originalWidth, originalHeight);
CGRect square = CGRectMake(0, 0, smallestDimension, smallestDimension);

CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], square);
UIImage *squareImage = [UIImage imageWithCGImage:imageRef scale:image.scale orientation:image.imageOrientation];

UIImageWriteToSavedPhotosAlbum(squareImage, nil, nil, nil);
CGImageRelease(imageRef);
This is what I captured:
And it resulted in the following:
Now I'm getting a square photo, but note that the output still contains parts of the photo outside the yellow square. What I want is only the photo that sits inside the yellow square. The captured image is still of size {w=2448, h=3264}. Note the red circles, which mark the outer parts of the image that should not appear in the output, since they are not inside the yellow square.
What's wrong with this?
Answer 1:
It looks like the crop in your implementation is returning an image of 300 by 300 pixels, while the yellow square you have on screen is 300 by 300 points. Points are not the same as pixels: on a Retina screen one point spans two or more screen pixels, and a captured photo has far more pixels still. So if your photo is 3264 pixels wide, cropping it to 300 pixels returns an image of about 1/10th of the original size, which is why the result looks so heavily zoomed in. You need to convert the on-screen rect from points into the captured image's pixel coordinate space before cropping.
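A minimal sketch of that conversion, assuming an orientation-normalized capture and a preview that aspect-fills the screen (the method name, cropRectInPoints, and previewBounds are illustrative, not from the question):

// Sketch: map an on-screen crop rect (in points) to pixel coordinates in
// the full-resolution captured image. Assumes the preview aspect-fills
// previewBounds; for camera captures image.scale is 1, so image.size is
// already in pixels.
- (UIImage *)imageByCroppingImage:(UIImage *)image
                     toScreenRect:(CGRect)cropRectInPoints
                    previewBounds:(CGRect)previewBounds
{
    // Pixels per preview point: with aspect-fill the smaller ratio applies,
    // because the other axis overflows the preview and is clipped.
    CGFloat pixelsPerPoint = MIN(image.size.width / previewBounds.size.width,
                                 image.size.height / previewBounds.size.height);

    // Only the centered region of the image is visible in the preview.
    CGFloat visibleWidth = previewBounds.size.width * pixelsPerPoint;
    CGFloat visibleHeight = previewBounds.size.height * pixelsPerPoint;
    CGFloat offsetX = (image.size.width - visibleWidth) / 2.0;
    CGFloat offsetY = (image.size.height - visibleHeight) / 2.0;

    CGRect cropRect = CGRectMake(offsetX + cropRectInPoints.origin.x * pixelsPerPoint,
                                 offsetY + cropRectInPoints.origin.y * pixelsPerPoint,
                                 cropRectInPoints.size.width * pixelsPerPoint,
                                 cropRectInPoints.size.height * pixelsPerPoint);

    CGImageRef imageRef = CGImageCreateWithImageInRect(image.CGImage, cropRect);
    UIImage *cropped = [UIImage imageWithCGImage:imageRef
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(imageRef);
    return cropped;
}

With this, the 300x300-point yellow square, offset by its position in the preview, maps to the matching region of the full {2448, 3264} capture instead of a tiny 300-pixel sliver at the center.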
Source: https://stackoverflow.com/questions/24533731/how-to-do-properly-cropping-of-uiimage-taken-with-uiimagepickercontroller