Crop image to a square according to the size of a UIView/CGRect

Question


I have an AVCaptureSession implementation, and my goal is for the user to take a photo and save only the part of the image that falls within the red square border overlaid on the camera preview.

AVCaptureSession's previewLayer (the camera feed) spans from (0,0) (top left) to the bottom of my camera controls bar (the bar just above the view that contains the shutter button). My navigation bar and controls bar are semi-transparent, so the camera shows through.

I'm using [captureSession setSessionPreset:AVCaptureSessionPresetPhoto]; to ensure that the original image saved to the camera roll matches what Apple's Camera app produces.

The user will be able to take the photo in portrait, landscape left, and landscape right, so the cropping method must take orientation into account.

So far, I've tried to crop the original image using this code:

DDLogVerbose(@"%@: Image crop rect: (%f, %f, %f, %f)", THIS_FILE, self.imageCropRect.origin.x, self.imageCropRect.origin.y, self.imageCropRect.size.width, self.imageCropRect.size.height);

// Create new image context (retina safe)
UIGraphicsBeginImageContextWithOptions(CGSizeMake(self.imageCropRect.size.width, self.imageCropRect.size.width), NO, 0.0);

// Create rect for image
CGRect rect = self.imageCropRect;

// Draw the image into the rect
[self.captureManager.stillImage drawInRect:rect];

// Save the image and end the image context
UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

However, when I look at the cropped image in the camera roll, it seems the original image has simply been squashed, rather than having its top and bottom discarded as I intended. There are also 53 pixels of white space at the top of the "cropped" image, likely because of the y origin of my CGRect.

This is my logging output for the CGRect:

 Image crop rect: (0.000000, 53.000000, 320.000000, 322.000000)

This rect also matches the frame of the red-bordered view in its superview.

Is there something crucial I'm overlooking?

P.S. The original image size (taken with a camera in portrait mode) is:

Original image size: (2448.000000, 3264.000000)

Answer 1:


You can crop images with CGImageCreateWithImageInRect:

CGImageRef imageRef = CGImageCreateWithImageInRect([uncroppedImage CGImage], bounds);
UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
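
One catch worth spelling out: CGImageCreateWithImageInRect works in the pixel coordinate space of the underlying CGImage, while the red square's frame is in view points (a 320-point-wide view versus a 2448-pixel-wide photo). Below is a minimal sketch of the conversion, assuming the photo's orientation has already been normalized to Up and the preview fills the view's full width; the method name and parameters are illustrative placeholders, not from the original post:

// Convert a crop rect given in view points into the photo's pixel
// space, then crop. Assumes photo.imageOrientation is Up (normalize
// first if not) and the preview fills the view edge to edge.
- (UIImage *)croppedImage:(UIImage *)photo
             viewCropRect:(CGRect)viewCropRect
             previewWidth:(CGFloat)previewWidth
{
    // One view point corresponds to this many photo pixels
    // (2448 / 320 = 7.65 for the sizes quoted in the question).
    CGFloat pointsToPixels = (photo.size.width * photo.scale) / previewWidth;

    CGRect pixelRect = CGRectMake(viewCropRect.origin.x * pointsToPixels,
                                  viewCropRect.origin.y * pointsToPixels,
                                  viewCropRect.size.width * pointsToPixels,
                                  viewCropRect.size.height * pointsToPixels);

    // CGImageCreateWithImageInRect clips pixelRect to the image bounds
    // and returns NULL if the rects don't intersect.
    CGImageRef croppedRef = CGImageCreateWithImageInRect(photo.CGImage, pixelRect);
    if (croppedRef == NULL) {
        return nil;
    }

    UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                           scale:photo.scale
                                     orientation:photo.imageOrientation];
    CGImageRelease(croppedRef);
    return cropped;
}

For non-Up orientations (a portrait capture is typically tagged UIImageOrientationRight), the rect has to be rotated into sensor space first, or the image redrawn upright before cropping; AVCaptureVideoPreviewLayer's metadataOutputRectOfInterestForRect: can also help map preview coordinates when the layer uses aspect-fill.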



Answer 2:


Don't forget to add the scale parameter, otherwise you will get a low-resolution image:

CGImageRef imageRef = CGImageCreateWithImageInRect([uncroppedImage CGImage], CGRectMake(0, 0, 30, 120));
[imageView setImage:[UIImage imageWithCGImage:imageRef scale:[[UIScreen mainScreen] scale] orientation:UIImageOrientationUp]];
CGImageRelease(imageRef);
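
If the cropped photo should instead keep the source image's own scale and orientation (so a portrait capture, typically tagged UIImageOrientationRight, doesn't come out rotated), a variant along these lines may help; cropRect is assumed here to already be in pixel coordinates:

CGImageRef imageRef = CGImageCreateWithImageInRect([uncroppedImage CGImage], cropRect);
// Carry over the source photo's scale and orientation instead of
// hardcoding the screen scale and UIImageOrientationUp
UIImage *croppedImage = [UIImage imageWithCGImage:imageRef
                                            scale:uncroppedImage.scale
                                      orientation:uncroppedImage.imageOrientation];
CGImageRelease(imageRef);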



Answer 3:


Swift 3:

let imageRef = uncroppedImage.cgImage!.cropping(to: bounds)!
let croppedImage = UIImage(cgImage: imageRef)
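
Note that both force unwraps can crash: cgImage is nil for a CIImage-backed UIImage, and cropping(to:) returns nil when the rect doesn't intersect the image, so production code should prefer optional binding (if let / guard let). As with the Objective-C answers, cropping(to:) operates in the CGImage's pixel space, so the same point-to-pixel conversion applies.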


Source: https://stackoverflow.com/questions/21011671/crop-image-to-a-square-according-to-the-size-of-a-uiview-cgrect
