AVFoundation blur background in video

Submitted by 十年热恋 on 2019-12-05 15:30:17

Question


In my application I have a fixed composition render size of 1280 x 720. So if I import a portrait video, I have to show a blurred, filled background with the aspect-fitted frame of the video in the centre, same as this:

https://www.youtube.com/watch?v=yCOrqUA0ws4

I managed to play both videos using AVMutableComposition, but I don't know how to blur a particular background track. I did the following in my code:

self.composition = [AVMutableComposition composition];
AVAsset *firstAsset = [AVAsset assetWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"ScreenFlow_Blend" ofType:@"mp4"]]];


// The same asset is added twice: one copy for the foreground, one for the background that should end up blurred.
[self addAsset:firstAsset toComposition:self.composition withTrackID:1];
[self addAsset:firstAsset toComposition:self.composition withTrackID:2];
//  [self addAsset:ThirdAsset toComposition:self.composition withTrackID:3];

AVAssetTrack *backVideoTrack = [firstAsset tracksWithMediaType:AVMediaTypeVideo][0];

self.videoComposition = [AVMutableVideoComposition videoComposition];
self.videoComposition.renderSize = CGSizeMake(1280, 720);
self.videoComposition.frameDuration = CMTimeMake(1, 30);

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = [backVideoTrack timeRange];

// Scale so the track fills the 1280-pixel render width, then shift it vertically toward the centre.
CGFloat scale = 1280/backVideoTrack.naturalSize.width;
CGAffineTransform t = CGAffineTransformMakeScale(scale, scale);
t = CGAffineTransformTranslate(t, 0, -backVideoTrack.naturalSize.height/2 + self.videoComposition.renderSize.height/2);

AVMutableVideoCompositionLayerInstruction *frontLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
frontLayerInstruction.trackID = 1;
[frontLayerInstruction setTransform:t atTime:kCMTimeZero];

// Scale so the track fits the 720-pixel render height; the translation below centres it horizontally.
CGFloat scaleSmall = 720/backVideoTrack.naturalSize.height;

CGAffineTransform  translate = CGAffineTransformMakeTranslation(self.videoComposition.renderSize.width/2 - ((backVideoTrack.naturalSize.width/2)*scaleSmall),0);

CGAffineTransform  scaleTransform = CGAffineTransformMakeScale(scaleSmall,scaleSmall);

CGAffineTransform finalTransform = CGAffineTransformConcat(scaleTransform, translate);


CGAffineTransform t1 = CGAffineTransformMakeScale(scaleSmall,scaleSmall);
t1 = CGAffineTransformTranslate(t1,1280, 0);

AVMutableVideoCompositionLayerInstruction *backLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
backLayerInstruction.trackID = 2;
[backLayerInstruction setTransform:finalTransform atTime:kCMTimeZero];


//    AVMutableVideoCompositionLayerInstruction *maskLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
//    maskLayerInstruction.trackID = 3;
//    [maskLayerInstruction setTransform:t atTime:kCMTimeZero];


instruction.layerInstructions = @[backLayerInstruction,frontLayerInstruction];

self.videoComposition.instructions = @[ instruction ];

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:self.composition];
playerItem.videoComposition = self.videoComposition;
self.player = [AVPlayer playerWithPlayerItem:playerItem];

AVPlayerLayer *newPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:[self player]];
[newPlayerLayer setFrame:[[[self playerView] layer] bounds]];
// [newPlayerLayer setHidden:YES];

[[[self playerView] layer] addSublayer:newPlayerLayer];
[self setPlayerLayer:newPlayerLayer];

Using the above code I can achieve this:

https://drive.google.com/open?id=0B2jCvCt5fosyOVNOcGZ1MU1laEU

I know about the customVideoCompositor class for filtering composition frames. I tried it, but if I use a customVideoCompositor then I lose my transformations on the composition layers. Also, from the customVideoCompositor I don't know how to filter a particular track ID.
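For reference, this is roughly the shape such a compositor takes: a minimal sketch, assuming the track IDs used above (1 for the foreground, 2 for the background to be blurred) and a placeholder blur radius of 30. Note that a custom compositor ignores the transforms set on the layer instructions, so they have to be re-applied manually inside startVideoCompositionRequest:.

#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

@interface BlurBackgroundCompositor : NSObject <AVVideoCompositing>
@end

@implementation BlurBackgroundCompositor {
    CIContext *_ciContext;
}

- (instancetype)init {
    if ((self = [super init])) {
        _ciContext = [CIContext context];
    }
    return self;
}

- (NSDictionary *)sourcePixelBufferAttributes {
    return @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

- (NSDictionary *)requiredPixelBufferAttributesForRenderContext {
    return @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

- (void)renderContextChanged:(AVVideoCompositionRenderContext *)newRenderContext {
    // Nothing cached in this sketch.
}

- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request {
    // Pull the frames of specific tracks by their track IDs.
    CVPixelBufferRef backBuffer  = [request sourceFrameByTrackID:2]; // background copy
    CVPixelBufferRef frontBuffer = [request sourceFrameByTrackID:1]; // foreground copy
    if (backBuffer == NULL || frontBuffer == NULL) {
        [request finishWithError:[NSError errorWithDomain:AVFoundationErrorDomain
                                                     code:AVErrorInvalidVideoComposition
                                                 userInfo:nil]];
        return;
    }

    CIImage *background = [CIImage imageWithCVPixelBuffer:backBuffer];
    CIImage *foreground = [CIImage imageWithCVPixelBuffer:frontBuffer];

    // Blur only the background track's frame.
    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setValue:background forKey:kCIInputImageKey];
    [blur setValue:@30 forKey:kCIInputRadiusKey];
    CIImage *blurred = [blur.outputImage imageByCroppingToRect:background.extent];

    // Re-apply your scale/translate transforms here with -imageByApplyingTransform:,
    // then composite the foreground over the blurred background.
    CIImage *composed = [foreground imageByCompositingOverImage:blurred];

    CVPixelBufferRef outBuffer = [request.renderContext newPixelBuffer];
    [_ciContext render:composed toCVPixelBuffer:outBuffer];
    [request finishWithComposedVideoFrame:outBuffer];
    CVPixelBufferRelease(outBuffer);
}

@end

It would then be hooked up with self.videoComposition.customVideoCompositorClass = [BlurBackgroundCompositor class];.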

If someone has any documentation links or suggestions on how to move forward with this, it would be really appreciated.


Answer 1:


Before adding the second video layer, which is in the centre of the screen, add this code:

UIVisualEffect *blurEffect;
blurEffect = [UIBlurEffect effectWithStyle:UIBlurEffectStyleExtraLight]; // change to whichever effect you want

UIVisualEffectView *visualEffectView;
visualEffectView = [[UIVisualEffectView alloc] initWithEffect:blurEffect];

visualEffectView.frame = self.view.bounds;
[self.view addSubview:visualEffectView];

In Swift

    let blurEffect = UIBlurEffect(style: .light) // change the style to the one that suits you
    let blurEffectView = UIVisualEffectView(effect: blurEffect)
    blurEffectView.frame = view.bounds
    blurEffectView.autoresizingMask = [.flexibleWidth, .flexibleHeight] // for supporting device rotation
    view.addSubview(blurEffectView)



Answer 2:


A way to achieve that is to use 2 different AVPlayers and a blur overlay view: backgroundLayer -> blur overlay view -> frontLayer. You only need to make sure both players start and stop at the same time.
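A minimal sketch of that layout, assuming a videoURL and a containerView sized to the 1280 x 720 render area (both are placeholders); setRate:time:atHostTime: is one way to start the two players at the same host time:

AVPlayer *backPlayer  = [AVPlayer playerWithURL:videoURL];
AVPlayer *frontPlayer = [AVPlayer playerWithURL:videoURL];

// Background: fills the container and gets blurred by the overlay.
AVPlayerLayer *backLayer = [AVPlayerLayer playerLayerWithPlayer:backPlayer];
backLayer.frame = containerView.bounds;
backLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[containerView.layer addSublayer:backLayer];

// Blur overlay sitting between the two player layers.
UIVisualEffectView *blurView =
    [[UIVisualEffectView alloc] initWithEffect:[UIBlurEffect effectWithStyle:UIBlurEffectStyleLight]];
blurView.frame = containerView.bounds;
[containerView addSubview:blurView];

// Foreground: the same video, aspect-fitted and centred.
AVPlayerLayer *frontLayer = [AVPlayerLayer playerLayerWithPlayer:frontPlayer];
frontLayer.frame = containerView.bounds;
frontLayer.videoGravity = AVLayerVideoGravityResizeAspect;
[containerView.layer addSublayer:frontLayer];

// Start both players at the same host time so they stay in sync.
backPlayer.automaticallyWaitsToMinimizeStalling = NO;  // required for setRate:time:atHostTime: (iOS 10+)
frontPlayer.automaticallyWaitsToMinimizeStalling = NO;
CMTime startTime = CMTimeAdd(CMClockGetTime(CMClockGetHostTimeClock()),
                             CMTimeMakeWithSeconds(0.5, 600));
[backPlayer setRate:1.0 time:kCMTimeZero atHostTime:startTime];
[frontPlayer setRate:1.0 time:kCMTimeZero atHostTime:startTime];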

Another option is using 1 AVPlayer and a time observer. Extract the current image of the frontLayer on every frame, blur it, and display it in a backgroundLayer. The blur function can be found in the same link I provided above.
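A minimal sketch of that variant, assuming player, playerItem, a backgroundImageView placed behind the player layer, and a timeObserverToken property (kept so the observer can be removed later) are all yours to provide; an AVPlayerItemVideoOutput hands back the current pixel buffer, Core Image blurs it, and the result is pushed into the background view:

AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc]
    initWithPixelBufferAttributes:@{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) }];
[playerItem addOutput:videoOutput];

CIContext *ciContext = [CIContext context];

// Observe roughly once per frame (the composition above runs at 30 fps).
self.timeObserverToken =
    [player addPeriodicTimeObserverForInterval:CMTimeMake(1, 30)
                                         queue:dispatch_get_main_queue()
                                    usingBlock:^(CMTime time) {
    if (![videoOutput hasNewPixelBufferForItemTime:time]) {
        return;
    }
    CVPixelBufferRef pixelBuffer =
        [videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
    if (pixelBuffer == NULL) {
        return;
    }

    // Blur the current frame and push it into the background view.
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blurFilter setValue:frame forKey:kCIInputImageKey];
    [blurFilter setValue:@30 forKey:kCIInputRadiusKey];
    CIImage *blurred = [blurFilter.outputImage imageByCroppingToRect:frame.extent];

    CGImageRef cgImage = [ciContext createCGImage:blurred fromRect:blurred.extent];
    backgroundImageView.image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CVPixelBufferRelease(pixelBuffer);
}];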



Source: https://stackoverflow.com/questions/39313688/avfoundation-blur-background-in-video
