gpuimage

iOS Live Streaming: Audio/Video Capture Technology Sharing

情到浓时终转凉″ submitted on 2020-03-11 09:19:11
1. The iOS live-streaming pipeline
The live-streaming pipeline can roughly be divided into these steps: data capture, image processing (real-time filters), video encoding, packaging, upload, cloud processing (transcoding, recording, distribution), and the playback client.
Data capture: obtain real-time audio and video data from the camera and microphone;
Image processing: apply real-time filters to the captured input stream to produce the beautified video frames;
Video encoding: encoding is done either in software or in hardware. H.264 is the codec in general use today; the newer H.265 is said to achieve better compression, but its algorithm is considerably more complex and it is not yet widely adopted. Software encoding uses the CPU, hardware encoding uses the GPU; software encoding works on every system version, but because Apple only opened the hardware-encoding API in iOS 8, hardware encoding requires iOS 8 or later;
Packaging: FLV is the format generally used for live-stream pushing;
Upload: the stream is usually pushed with the RTMP protocol;
Cloud: the stream is transcoded, distributed, and recorded;
Playback client: responsible for pulling the stream, decoding, and playback.
2. Obtaining system authorization
The first step of a live stream is capturing data, both video and audio. Because of iOS permission requirements, you must first obtain access to the camera and the microphone. Request permission to access the camera:
__weak typeof(self) _self = self; AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]; switch
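The excerpt above is cut off at the switch statement. Below is a minimal sketch of how such a permission check is commonly completed with AVFoundation; the startCapture method is a hypothetical placeholder for whatever sets up the capture session, not something from the original article.

#import <AVFoundation/AVFoundation.h>

// Check the current camera authorization status and request access if needed.
- (void)checkCameraAuthorization {
    __weak typeof(self) _self = self;
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    switch (status) {
        case AVAuthorizationStatusNotDetermined:
            // Not asked yet: prompt the user, then continue on the main thread.
            [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (granted) {
                        [_self startCapture];   // hypothetical helper that builds the capture session
                    }
                });
            }];
            break;
        case AVAuthorizationStatusAuthorized:
            [_self startCapture];               // already authorized
            break;
        case AVAuthorizationStatusDenied:
        case AVAuthorizationStatusRestricted:
            // Access refused or restricted: typically direct the user to Settings.
            break;
    }
}

The same pattern applies to the microphone with AVMediaTypeAudio.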

Objective-C block skips code and then executes it later

拥有回忆 submitted on 2020-01-17 06:10:29
Question: I'm using the GPUImage framework and I've noticed that the compiler automatically skips everything that is within the brackets of setColorAverageProcessingFinishedBlock. It completely skips over these contents and continues on, executing everything else in the code. Once everything else has been executed, it comes back to the content within the brackets. Obviously, this has unintended side effects. NSMutableArray *redValues = [NSMutableArray array]; NSMutableArray *arrayOne =
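Since this is GPUImage's average-color output, the likely cause is that the block is a completion callback that only fires after the image has been processed, so by design it runs later than the surrounding code. A minimal sketch, assuming GPUImageAverageColor is the filter in use: any work that depends on the collected values has to happen inside (or be triggered from) the block, not after it.

NSMutableArray *redValues = [NSMutableArray array];
GPUImageAverageColor *averageColor = [[GPUImageAverageColor alloc] init];
[averageColor setColorAverageProcessingFinishedBlock:^(CGFloat red, CGFloat green, CGFloat blue, CGFloat alpha, CMTime frameTime) {
    // This runs only once the averaging pass has finished.
    [redValues addObject:@(red)];
    // Continue with whatever depends on redValues here, or dispatch back to the main queue.
}];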

GPUImage failed to init ACVFile with data:(null)

为君一笑 submitted on 2020-01-17 01:39:23
Question: First of all, I must say that GPUImage is an excellent framework. However, when loading an ACV file that I exported from Photoshop CS6, it gives me an error saying: failed to init ACVFile with data:(null). The thing is, though, that the same code works for some other ACV files, and the file definitely has data, 64 bytes of it in fact. Here is how I am trying to load it: GPUImageToneCurveFilter *stillImageFilter2 = [[GPUImageToneCurveFilter alloc] initWithACV:@"test"]; UIImage
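A hedged sketch of an alternative that makes the nil-data case visible: load the curve through GPUImage's URL-based initializer and verify the file is actually in the bundle. The file name test.acv is taken from the question; everything else is standard Foundation / GPUImage API.

NSURL *acvURL = [[NSBundle mainBundle] URLForResource:@"test" withExtension:@"acv"];
if (acvURL == nil) {
    // A file that is not included in the app target is one common cause of
    // the "failed to init ACVFile with data:(null)" message.
    NSLog(@"test.acv is missing from the bundle");
}
GPUImageToneCurveFilter *toneCurve = [[GPUImageToneCurveFilter alloc] initWithACVURL:acvURL];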

GPUImage masking transparency

旧时模样 submitted on 2020-01-16 04:42:10
Question: I'm using Brad Larson's GPUImage framework in my app and I need to mask a photo with mask.jpg. The mask images are two-color images with a black background and a white geometric figure. Masking basically works; the problem is that I can't make the mask transparent. To do that I need to replace the black color of mask.png with a lighter color (closer to gray or white). So I tried to use the code from here: Want to change a particular color inside an image with another color - iPhone But it
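One possible approach, sketched below under the assumption that the goal is a semi-transparent mask rather than recoloring individual pixels: blend the photo and the mask with GPUImageAlphaBlendFilter and use its mix property to control how strongly the mask shows through. The photoImage and maskImage variables are placeholders for the question's images.

GPUImagePicture *photo = [[GPUImagePicture alloc] initWithImage:photoImage];
GPUImagePicture *mask  = [[GPUImagePicture alloc] initWithImage:maskImage];

GPUImageAlphaBlendFilter *blend = [[GPUImageAlphaBlendFilter alloc] init];
blend.mix = 0.5;   // 0.0 = photo only, 1.0 = mask fully opaque

[photo addTarget:blend];   // first input: base image
[mask addTarget:blend];    // second input: overlay

[blend useNextFrameForImageCapture];
[photo processImage];
[mask processImage];
UIImage *result = [blend imageFromCurrentFramebuffer];

The effect is that the black areas of the mask are lightened toward the underlying photo instead of staying fully opaque.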

Android Live Streaming, Part 2: Rendering and Encoding

北战南征 submitted on 2020-01-15 19:24:20
1. Rendering
The rendering discussed here mainly means post-processing the data captured from the camera, commonly called beautification (beautification means re-processing the raw image data with algorithms to enhance the picture, including but not limited to removing jarring edges, edge detection, and so on). Well-known commercial beautification vendors include SenseTime (商汤) and FaceUnity; if you build your own beautification wrapper, the main open-source library available is GPUImage.
How the GPU works: the GPU is a microprocessor dedicated to image computation. It takes the image's vertex coordinates and, through primitive assembly, runs a pipeline of rasterization, vertex shading, fragment shading, and other stages.
OpenGL ES (OpenGL for Embedded Systems) is an interface between graphics software and hardware, used to display the processed image on screen.
GPUImage is a cross-platform image- and video-processing framework built on OpenGL ES 2.0. It offers a wide variety of image-processing filters, supports real-time filters on camera and video input, ships with over a hundred built-in filter effects, and also allows custom image processing. The principle behind a filter is to apply a graphics transform to a still image or to every frame of a video before displaying it; in essence it is a change of each pixel's coordinates and color.
A brief look at how GPUImage processes a frame: GPUImage processes frames as a chain. Each stage object is added to the chain with the addTarget function; once a target finishes, it hands the image data processed by the previous stage on to the next target. This is called the GPUImage processing chain. Think of sunglasses: light from the outside world is filtered by the lenses before reaching our eyes, which is why everything looks dark even in broad daylight. Targets generally fall into two categories
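A hedged Objective-C illustration of the addTarget chain described above, using the iOS GPUImage API; the specific filters and the sample.jpg asset name are arbitrary choices for the example.

// source -> sepia -> Gaussian blur -> rendered UIImage
UIImage *inputImage = [UIImage imageNamed:@"sample.jpg"];
GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageSepiaFilter *sepia = [[GPUImageSepiaFilter alloc] init];
GPUImageGaussianBlurFilter *blur = [[GPUImageGaussianBlurFilter alloc] init];

[source addTarget:sepia];   // each addTarget: adds one link to the processing chain
[sepia addTarget:blur];     // sepia's output becomes blur's input

[blur useNextFrameForImageCapture];
[source processImage];
UIImage *result = [blur imageFromCurrentFramebuffer];

Each target receives the framebuffer produced by the previous stage, which is exactly the sunglasses relay described in the article.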

Save applied filter in camera preview with GPUImage

不羁的心 submitted on 2020-01-14 03:59:29
Question: I am using the GPUImage Android library to apply filters to the camera preview and save the image with the filters applied after taking a picture. The problem is that when I take the picture, I can't get the image with the filters. I am using the following code: @Override public void onPictureTaken(byte[] data, Camera camera) { Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length); mGPUImage.setImage(bitmap); bitmap = mGPUImage.getBitmapWithFilterApplied(); saveImage(bitmap); } The sample

GPUImage: Apply filter to existing video file

不问归期 submitted on 2020-01-13 18:56:32
Question: I was trying to use the video filters of the GPUImage framework. I followed the Filtering and re-encoding a movie tutorial, but it gives me the error Unknown type name 'GPUImageRotationFilter'. So I tried to apply a simple filter to my video file. Here is my code, viewController.h: @interface ViewController : UIViewController<GPUImageMovieWriterDelegate,GPUImageInput> { GPUImageMovie *_movieFile; GPUImageOutput<GPUImageInput> *_sketchFilter; GPUImageMovieWriter *_movieWriter; } @property (nonatomic,retain)
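For reference, a hedged sketch of the movie-filtering pattern from the GPUImage documentation, adapted to the ivars declared above; the sample.m4v asset, output path, and 640x480 output size are assumptions. The "Unknown type name GPUImageRotationFilter" error most likely means the tutorial predates the current framework, in which that class no longer exists.

NSURL *inputURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"m4v"];
_movieFile = [[GPUImageMovie alloc] initWithURL:inputURL];
_sketchFilter = [[GPUImageSketchFilter alloc] init];
[_movieFile addTarget:_sketchFilter];

// GPUImageMovieWriter cannot overwrite an existing file, so remove any old output first.
NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"filtered.m4v"];
[[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];
NSURL *outputURL = [NSURL fileURLWithPath:outputPath];

_movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:outputURL size:CGSizeMake(640.0, 480.0)];
[_sketchFilter addTarget:_movieWriter];

[_movieWriter startRecording];
[_movieFile startProcessing];

__weak typeof(self) weakSelf = self;
[_movieWriter setCompletionBlock:^{
    __strong typeof(weakSelf) strongSelf = weakSelf;
    if (strongSelf == nil) return;
    [strongSelf->_sketchFilter removeTarget:strongSelf->_movieWriter];
    [strongSelf->_movieWriter finishRecording];
}];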

“GPUImage.h” not found

淺唱寂寞╮ submitted on 2020-01-12 06:58:48
Question: I am trying to set up GPUImage in a project, but I am not able to track down why I'm getting the error: "GPUImage.h" not found. I have added the framework, set up the target dependency, added the Header Search Path (as: framework), and added the other linker flag -ObjC. Still no luck. I have included my super simple test project here and linked below if anyone wants to take a look. I know this must be documented and basic, but I searched on GitHub and did not find a reference to this particular issue.
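One detail worth double-checking (an assumption, since the project settings are not shown): the import form has to match the integration style. A framework integration is normally imported with angle brackets, while a source or static-library integration uses the quoted form.

// When GPUImage is linked as a framework:
#import <GPUImage/GPUImage.h>

// When the GPUImage sources or static library are added directly to the project:
#import "GPUImage.h"

If the angle-bracket form also fails, the header search path usually does not point at the directory that actually contains GPUImage.h.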

iOS Filter Series: An Overview of Filter Development

≡放荡痞女 submitted on 2020-01-12 03:55:40
Overview
The earliest filters were glass elements mounted in front of a camera lens to filter natural light and adjust color. In software development, however, "filter" usually refers to a software filter that simulates a lens filter. This approach is of course faster and more convenient; its obvious drawback is that it cannot reproduce the real scene at shooting time, for example the effect of a polarizer or a UV filter. This article gives a short introduction to the right way to approach iOS filter development, so that people who are new to it can avoid some detours. The common ways to develop filters on iOS roughly include CIFilter, GPUImage, and OpenCV.
CoreImage / CIFilter
CIFilter lives in the Core Image framework. It processes images with OpenGL shaders (the newest versions are implemented on top of Metal). Its advantage is, of course, speed: it can fully exploit GPU acceleration for image rendering, and it natively supports filter chains, so combining several filters stays fast and efficient. CIFilter currently provides 21 categories (listed in the code below) and 196 filters:
public let kCICategoryDistortionEffect: String
public let kCICategoryGeometryAdjustment: String
public let kCICategoryCompositeOperation: String
public let kCICategoryHalftoneEffect: String
public let kCICategoryColorAdjustment
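A minimal sketch of using one of those filters from Objective-C, with names taken from Apple's Core Image documentation; sourceImage stands in for whatever UIImage is being filtered.

#import <CoreImage/CoreImage.h>

// Apply a sepia-tone CIFilter and render the result back into a UIImage.
CIImage *input = [CIImage imageWithCGImage:sourceImage.CGImage];
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
[sepia setValue:input forKey:kCIInputImageKey];
[sepia setValue:@(0.8) forKey:kCIInputIntensityKey];

CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgResult = [context createCGImage:sepia.outputImage fromRect:sepia.outputImage.extent];
UIImage *result = [UIImage imageWithCGImage:cgResult];
CGImageRelease(cgResult);

Chaining works the same way: the outputImage of one filter becomes the kCIInputImageKey of the next, and only the final CIContext render pays the GPU cost.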

How to achieve this filter chaining with the GPUImage framework?

纵然是瞬间 submitted on 2020-01-11 07:49:08
Question: I'm trying to chain blended layers and then filter the result (Origin -> Texture1 (opacity 30%) / HardLight -> Texture2 / SoftLight) => level(45, 0.95, 238) + saturation(-100) + hue(+42). Here is what I tried (edited: the code below works, thanks for the answer): // Textures GPUImagePicture *origin = [[GPUImagePicture alloc] initWithImage:originImage smoothlyScaleOutput:NO]; GPUImagePicture *text1 = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"filter_landscape_vintage_1.png"] smoothlyScaleOutput
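Since the excerpt is cut off, here is a hedged sketch of how the described chain can be expressed with GPUImage blend and adjustment filters. The parameter scales are GPUImage's rather than Photoshop's, so the level/saturation/hue values below are approximations that need tuning, and filter_landscape_vintage_2.png is an assumed name for the second texture.

GPUImagePicture *origin = [[GPUImagePicture alloc] initWithImage:originImage smoothlyScaleOutput:NO];
GPUImagePicture *text1  = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"filter_landscape_vintage_1.png"] smoothlyScaleOutput:NO];
GPUImagePicture *text2  = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"filter_landscape_vintage_2.png"] smoothlyScaleOutput:NO];

// Texture1 at 30% opacity, hard-light blended over the original.
GPUImageOpacityFilter *opacity = [[GPUImageOpacityFilter alloc] init];
opacity.opacity = 0.3;
GPUImageHardLightBlendFilter *hardLight = [[GPUImageHardLightBlendFilter alloc] init];
[origin addTarget:hardLight];   // first input: base image
[text1 addTarget:opacity];
[opacity addTarget:hardLight];  // second input: faded texture

// Texture2 soft-light blended over the previous result.
GPUImageSoftLightBlendFilter *softLight = [[GPUImageSoftLightBlendFilter alloc] init];
[hardLight addTarget:softLight];
[text2 addTarget:softLight];

// level(45, 0.95, 238), saturation(-100), hue(+42), mapped onto GPUImage's ranges.
GPUImageLevelsFilter *levels = [[GPUImageLevelsFilter alloc] init];
[levels setMin:45.0/255.0 gamma:0.95 max:238.0/255.0 minOut:0.0 maxOut:1.0];
GPUImageSaturationFilter *saturation = [[GPUImageSaturationFilter alloc] init];
saturation.saturation = 0.0;    // fully desaturated
GPUImageHueFilter *hue = [[GPUImageHueFilter alloc] init];
hue.hue = 42.0;

[softLight addTarget:levels];
[levels addTarget:saturation];
[saturation addTarget:hue];

[hue useNextFrameForImageCapture];
[origin processImage];
[text1 processImage];
[text2 processImage];
UIImage *finalImage = [hue imageFromCurrentFramebuffer];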