Stitching two videos

我在风中等你 2021-01-15 12:52

I need to create a single video composed of two videos stitched together side by side, like in the following image:

\"Two

1 Answer
  • 2021-01-15 13:21

    I found the solution. Here it is:

    - (void)stitchVideos:(NSURL *)file1 video2:(NSURL *)file2 {
        AVAsset *video1Asset = [AVAsset assetWithURL:file1];
        AVAsset *video2Asset = [AVAsset assetWithURL:file2];
    
        AVMutableComposition* mixComposition = [AVMutableComposition composition];
    
        AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        // Insert each source video's first video track into its own composition track.
        // (In production code you would check the returned NSError instead of ignoring it.)
        NSError *error = nil;
        [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video1Asset.duration)
                            ofTrack:[[video1Asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                             atTime:kCMTimeZero error:&error];
    
        AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    
        [secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video2Asset.duration)
                             ofTrack:[[video2Asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                              atTime:kCMTimeZero error:&error];
    
        // Create an AVMutableVideoCompositionInstruction. This object holds the array of
        // AVMutableVideoCompositionLayerInstruction objects and sets the duration of the
        // layer. Its time range should span the duration of the longer of the two assets.
        AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMaximum(video1Asset.duration, video2Asset.duration));
    
        // Create two AVMutableVideoCompositionLayerInstruction objects, one for each
        // AVMutableCompositionTrack. Here we create the layer instruction for the first
        // track, using an affine transform to scale it down and position it on the left
        // side of the frame. (The first track is the one that stays on top.)
        AVMutableVideoCompositionLayerInstruction *firstLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
        CGAffineTransform firstScale = CGAffineTransformMakeScale(0.68f, 0.68f);
        CGAffineTransform firstMove = CGAffineTransformMakeTranslation(0, 0);
        [firstLayerInstruction setTransform:CGAffineTransformConcat(firstScale, firstMove) atTime:kCMTimeZero];
    
        // Create the AVMutableVideoCompositionLayerInstruction for the second track,
        // again using an affine transform to scale it down and shift it to the right
        // half of the frame.
        AVMutableVideoCompositionLayerInstruction *secondLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
        CGAffineTransform secondScale = CGAffineTransformMakeScale(0.68f, 0.68f);
        CGAffineTransform secondMove = CGAffineTransformMakeTranslation(318, 0);
        [secondLayerInstruction setTransform:CGAffineTransformConcat(secondScale, secondMove) atTime:kCMTimeZero];
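        // (Editor's note: the 0.68 scale and 318-point translation above are magic
        // numbers tuned to these particular source videos and the 640x480 render
        // size set below. For arbitrary inputs you would typically derive the scale
        // from each track's naturalSize and translate the second video by half the
        // render width, i.e. 640 / 2 = 320, which the hard-coded 318 approximates.)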
    
        // Add the two AVMutableVideoCompositionLayerInstruction objects to the
        // AVMutableVideoCompositionInstruction as an array.
        mainInstruction.layerInstructions = [NSArray arrayWithObjects:firstLayerInstruction, secondLayerInstruction, nil];
    
        // Create the AVMutableVideoComposition. It can hold multiple
        // AVMutableVideoCompositionInstruction objects; this example uses only one. You
        // can use several instructions to add layered effects such as fades and
        // transitions, but make sure their time ranges don't overlap.
        AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
        mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
        mainCompositionInst.frameDuration = CMTimeMake(1, 30);
        mainCompositionInst.renderSize = CGSizeMake(640, 480);
    
        // Finally, add the newly created AVMutableComposition to an AVPlayerItem and
        // play it with an AVPlayer.
        AVPlayerItem *newPlayerItem = [AVPlayerItem playerItemWithAsset:mixComposition];
        newPlayerItem.videoComposition = mainCompositionInst;
        self.mPlayer = [AVPlayer playerWithPlayerItem:newPlayerItem];
        [self.mPlaybackView setPlayer:self.mPlayer];
        [self.mPlayer play];
    
        // Create the export session with the composition and set the preset to the highest quality.
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
        // Set the desired output URL for the file created by the export process.
        // (newUniqueAudioFileURL is a helper from the original post that is not shown
        // here; a sketch of what it might look like follows the method.)
        exporter.outputURL = [self newUniqueAudioFileURL];
        exporter.videoComposition = mainCompositionInst;
        // Set the output file type to be a QuickTime movie.
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        exporter.shouldOptimizeForNetworkUse = YES;
    
        // Asynchronously export the composition to a video file and save this file to the camera roll once export completes.
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                if (exporter.status == AVAssetExportSessionStatusCompleted) {
                    // ALAssetsLibrary has been deprecated since iOS 9; see the
                    // Photos framework note below.
                    ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];

                    self.mPlayer = [AVPlayer playerWithURL:exporter.outputURL];

                    if ([assetsLibrary videoAtPathIsCompatibleWithSavedPhotosAlbum:exporter.outputURL]) {
                        [assetsLibrary writeVideoAtPathToSavedPhotosAlbum:exporter.outputURL completionBlock:NULL];
                    }
                }
            });
        }];
    }
    
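    With the method in place, it can be called with two local file URLs. A minimal usage sketch (the clip names are hypothetical):

        NSURL *clipA = [[NSBundle mainBundle] URLForResource:@"clipA" withExtension:@"mov"];
        NSURL *clipB = [[NSBundle mainBundle] URLForResource:@"clipB" withExtension:@"mov"];
        [self stitchVideos:clipA video2:clipB];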

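    The method also calls a newUniqueAudioFileURL helper that is never shown (the "Audio" in the name is presumably left over from an earlier audio version of the code). A minimal sketch of such a helper, assuming a unique .mov file in the temporary directory is acceptable as the export destination:

        - (NSURL *)newUniqueAudioFileURL {
            // Hypothetical implementation: build a unique .mov path in the temp directory.
            NSString *fileName = [NSString stringWithFormat:@"%@.mov", [[NSUUID UUID] UUIDString]];
            NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:fileName];
            return [NSURL fileURLWithPath:path];
        }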
    Source: https://abdulazeem.wordpress.com/2012/04/02/
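    As noted in the code comments, ALAssetsLibrary has been deprecated since iOS 9. A sketch of the equivalent save step inside the export completion handler using the Photos framework instead, assuming the app has already been granted photo-library add access:

        #import <Photos/Photos.h>

        [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
            // Create a new video asset in the user's photo library from the exported file.
            [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:exporter.outputURL];
        } completionHandler:^(BOOL success, NSError *error) {
            if (!success) {
                NSLog(@"Saving to Photos failed: %@", error);
            }
        }];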
