CGImage

CGImage Masking stopped working on iOS 12

走远了吗 · Submitted on 2021-02-09 15:41:20
Question: I've got a method for masking a B&W image by cutting out (i.e. making transparent) any pixels that are above or below a certain brightness. The result would be the same B&W image, but with everything above 70% or below 25% brightness (or whatever you choose) changed to transparent. It was working perfectly on iOS 11, but it broke on iOS 12: it now returns the original, solid image with no modifications every time. -(UIImage*)imageWithLumaMaskFromDark:(CGFloat)lumaFloor toLight:(CGFloat …
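The thresholding step the question describes can be sketched at the raw-pixel level in plain C. This is not the poster's Objective-C method, just the core idea with illustrative names and a Rec. 601 luma weighting:

```c
#include <stddef.h>
#include <stdint.h>

/* Walk an RGBA8 buffer and zero the alpha of every pixel whose luma
 * falls outside [luma_floor, luma_ceil]; other pixels stay untouched.
 * The weights are the Rec. 601 luma coefficients. */
static void luma_mask(uint8_t *rgba, size_t pixel_count,
                      double luma_floor, double luma_ceil)
{
    for (size_t i = 0; i < pixel_count; i++) {
        uint8_t *p = rgba + i * 4;
        double luma = (0.299 * p[0] + 0.587 * p[1] + 0.114 * p[2]) / 255.0;
        if (luma < luma_floor || luma > luma_ceil)
            p[3] = 0; /* out-of-range pixel becomes transparent */
    }
}
```

With a 0.25 floor and 0.70 ceiling, a near-black pixel such as (10, 10, 10) has luma ≈ 0.04 and is masked, while mid-gray (128, 128, 128), luma ≈ 0.50, keeps its alpha.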

CGImage from images stored on iCloud looks wrong

浪子不回头ぞ · Submitted on 2020-03-04 07:02:42
Question: I'm extracting pixel colors from a CGImage using the code described in this answer. However, I just realized that if I load an image that was created on another device, the pixel values look wrong. The first obvious problem is that the alpha is gone: the CGImageAlphaInfo reports .noneSkipLast, but I know the image is RGBA. If I read it on the same device it was created on, it looks fine. The second problem is that there is some color bleeding, as if the image had been resized. Perhaps is …
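Why does a buffer tagged noneSkipLast read as opaque? Because the fourth byte is padding, not alpha, so an extractor must honor the alpha info rather than blindly reading byte 3. A portable C sketch with stand-in names (not the actual CoreGraphics enums):

```c
#include <stdint.h>

/* Stand-in for CGImageAlphaInfo; names and values are illustrative. */
typedef enum {
    ALPHA_NONE_SKIP_LAST,    /* RGBX: byte 3 is padding, image is opaque */
    ALPHA_PREMULTIPLIED_LAST /* RGBA: byte 3 is real alpha */
} alpha_info;

/* Return the effective alpha of a 4-byte pixel given its layout:
 * skip-last layouts ignore the stored byte and report fully opaque. */
static uint8_t effective_alpha(const uint8_t px[4], alpha_info info)
{
    return (info == ALPHA_NONE_SKIP_LAST) ? 255 : px[3];
}
```

If the decoder on the second device hands back an RGBX bitmap, every pixel reports alpha 255 this way, which matches the "alpha is gone" symptom.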

Processing images with pixel bitmaps (CGImageRef) in iOS

血红的双手。 · Submitted on 2020-02-29 08:41:21
A summary of approaches to redrawing images in iOS. 1. What is CGImageRef? CGImageRef is a structure pointer written in C and defined in the CoreGraphics framework. In CGImage.h we can see the following definition: typedef struct CGImage *CGImageRef; — so CGImageRef and struct CGImage * are completely equivalent. The structure is used to build pixel bitmaps, and an image can be edited by manipulating the stored pixel bits. The framework itself is portable C. 2. Notes on some CGImageRef-related functions CFTypeID CGImageGetTypeID( void ) This function returns an identifier; every structure in the Core Foundation framework has one. CFTypeID is defined as follows: #if __LLP64__ typedef unsigned long long CFTypeID; typedef unsigned long long CFOptionFlags; typedef unsigned long long CFHashCode; typedef signed long long CFIndex; #else typedef unsigned long CFTypeID; typedef unsigned long CFOptionFlags; typedef …
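The conditional typedefs quoted in the excerpt can be reproduced in portable C. The point of the #if is integer width: on LLP64 targets "long" stays 32-bit, so the 64-bit Core Foundation typedefs need "long long". A sketch with stand-in names (the real condition is Apple's __LLP64__ macro):

```c
#include <stdint.h>

/* Stand-in reproduction of the CFTypeID width selection. On LLP64
 * targets (e.g. 64-bit Windows) "long" is 32-bit, so the 64-bit-wide
 * typedefs must use "long long"; on LP64 targets plain "long" is
 * already pointer-sized. */
#if defined(_WIN64) /* stand-in condition for __LLP64__ */
typedef unsigned long long MyCFTypeID;
typedef signed long long   MyCFIndex;
#else
typedef unsigned long MyCFTypeID;
typedef signed long   MyCFIndex;
#endif
```

Either branch leaves MyCFTypeID exactly pointer-sized, which is what lets Core Foundation use it as an opaque per-type identifier on every platform.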

Convert bitmap image information into CGImage in iPhone OS 3

只谈情不闲聊 · Submitted on 2020-01-15 09:31:29
Question: I want to create a CGImage from color information I already have. Here is the code for converting a CGImage to CML (CML_color is a matrix structure): - (void)CGImageReftoCML:(CGImageRef)image destination:(CML_color &)dest { CML_RGBA p; NSUInteger width = CGImageGetWidth(image); NSUInteger height = CGImageGetHeight(image); CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); unsigned char *rawData = (unsigned char *)malloc(height * width * 4); NSUInteger bytesPerPixel = 4; NSUInteger bytesPerRow …
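The truncated snippet sets up a height*width*4 byte buffer with 4 bytes per pixel and bytesPerRow = width*4. The indexing that layout implies can be sketched in C (names are illustrative, not the poster's helpers):

```c
#include <stddef.h>
#include <stdint.h>

typedef struct { uint8_t r, g, b, a; } rgba_px;

/* Read pixel (x, y) out of a tightly packed RGBA8 buffer whose row
 * stride is width * 4 bytes, the layout the excerpt allocates. */
static rgba_px read_pixel(const uint8_t *raw, size_t width,
                          size_t x, size_t y)
{
    const size_t bytes_per_pixel = 4;
    const size_t bytes_per_row = width * bytes_per_pixel;
    const uint8_t *p = raw + y * bytes_per_row + x * bytes_per_pixel;
    rgba_px px = { p[0], p[1], p[2], p[3] };
    return px;
}
```

Note that real CGImage bitmaps may pad each row, so production code should use the image's actual bytes-per-row rather than assuming width * 4.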

Converting NSData to CGImage and then back to NSData makes the file too big

柔情痞子 · Submitted on 2020-01-04 04:10:48
Question: I have built a camera using AVFoundation. Once my AVCaptureStillImageOutput has completed its captureStillImageAsynchronouslyFromConnection:completionHandler: method, I create an NSData object like this: NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer]; Once I have the NSData object, I would like to rotate the image without converting it to a UIImage. I have found out that I can convert to a CGImage to do so. After I have the imageData, …
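Once the JPEG bytes are decoded into a raw pixel buffer, a 90° rotation is a pure index remap; it is the re-encoding pass afterwards (a fresh JPEG compression, often at higher quality than the camera's) that inflates the file. A C sketch of the remap over RGBA8 pixels, with illustrative names:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Rotate a w x h RGBA8 image 90 degrees clockwise into dst, which
 * must hold h x w pixels: source (x, y) lands at (h - 1 - y, x). */
static void rotate90_cw(const uint8_t *src, uint8_t *dst,
                        size_t w, size_t h)
{
    for (size_t y = 0; y < h; y++) {
        for (size_t x = 0; x < w; x++) {
            const uint8_t *s = src + (y * w + x) * 4;
            uint8_t *d = dst + (x * h + (h - 1 - y)) * 4; /* row x, col h-1-y */
            memcpy(d, s, 4);
        }
    }
}
```

An alternative that avoids decoding entirely is to leave the pixels alone and rewrite the EXIF orientation tag, which keeps the original file size.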

iPhone - Get a pointer to the data behind CGDataProvider?

 ̄綄美尐妖づ · Submitted on 2020-01-01 18:44:32
Question: I'm trying to take a CGImage and copy its data into a buffer for later processing. The code below is what I have so far, but there's one thing I don't like about it: it copies the image data twice, once for CGDataProviderCopyData() and once for the getBytes:length: call on imgData. I haven't been able to find a way to copy the image data directly into my buffer and cut out the CGDataProviderCopyData() step, but there has to be a way... any pointers? (...pun ftw) NSData *imgData = (NSData *…
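The usual fix is to drop the second copy, not the first: keep the CFData returned by CGDataProviderCopyData(), borrow its internal pointer with CFDataGetBytePtr(), and memcpy once into the destination. A portable C sketch of that single-copy pattern, with stand-in types instead of the CFData API:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Stand-in for a CFData-style owner that exposes a read-only pointer
 * (the CFDataGetBytePtr pattern); names here are illustrative. */
typedef struct {
    const uint8_t *bytes;
    size_t length;
} borrowed_data;

/* Copy the borrowed bytes into buffer exactly once, clamped to the
 * buffer's capacity; return how many bytes were copied. */
static size_t copy_once(borrowed_data d, uint8_t *buffer, size_t buf_len)
{
    size_t n = d.length < buf_len ? d.length : buf_len;
    memcpy(buffer, d.bytes, n); /* the only copy */
    return n;
}
```

The borrowed pointer is only valid while the owning CFData is alive, so the release must come after the memcpy, never before.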