Question
I have a weird problem. In my app I am combining multiple audio and video files using the code below. The resulting video works fine once I download it from the device to my computer and play it with QuickTime, but whenever I try to play the newly composed video using either UIWebView or AVPlayer, I can only see the first part of the merged video files.
Furthermore, when I tried to use MPMoviePlayerController to play it, it hangs on "Loading".
I can hear the audio for the whole composition. To make it clear, I have two arrays: 1. audioPieces with paths to the audio files [song1, song2, song3]; 2. moviePieces with paths to the video files [movie1, movie2, movie3]. After merging those files I can see only movie1, but I can hear song1 + song2 + song3. P.S. The songs and movies have slightly different lengths (less than 0.2 s difference). Any help will be appreciated. Thank you in advance, Janusz
-(void)putFilesTogether{
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoCompositionTrack = [[AVMutableCompositionTrack alloc] init];
    AVMutableCompositionTrack *audioCompositionTrack = [[AVMutableCompositionTrack alloc] init];

    NSLog(@" movie %@ audio %@ ", moviePieces, audioPieces);

    NSError *error;
    for (int i = 0; i < moviePieces.count; i++) {
        NSFileManager *fm = [NSFileManager defaultManager];
        NSString *movieFilePath = [moviePieces objectAtIndex:i];
        NSString *audioFilePath = [audioPieces objectAtIndex:i];

        if (![fm fileExistsAtPath:movieFilePath]) {
            NSLog(@"Movie doesn't exist %@ ", movieFilePath);
        } else {
            NSLog(@"Movie exists %@ ", movieFilePath);
        }
        if (![fm fileExistsAtPath:audioFilePath]) {
            NSLog(@"Audio doesn't exist %@ ", audioFilePath);
        } else {
            NSLog(@"Audio exists %@ ", audioFilePath);
        }

        NSURL *videoUrl = [NSURL fileURLWithPath:movieFilePath];
        NSURL *audioUrl = [NSURL fileURLWithPath:audioFilePath];

        AVURLAsset *videoasset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
        AVAssetTrack *videoAssetTrack = [[videoasset tracksWithMediaType:AVMediaTypeVideo] lastObject];
        AVURLAsset *audioasset = [[AVURLAsset alloc] initWithURL:audioUrl options:nil];
        AVAssetTrack *audioAssetTrack = [[audioasset tracksWithMediaType:AVMediaTypeAudio] lastObject];

        videoCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        audioCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

        CMTime tempTime = mixComposition.duration;
        [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioasset.duration) ofTrack:audioAssetTrack atTime:tempTime error:&error];
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoasset.duration) ofTrack:videoAssetTrack atTime:tempTime error:&error];

        if (error) {
            NSLog(@"Ups. Something went wrong! %@", [error debugDescription]);
        }
    }

    NSDate *now = [NSDate dateWithTimeIntervalSinceNow:0];
    NSString *caldate = [now description];
    float ran = arc4random() % 1000;
    NSString *pathToSave = [NSString stringWithFormat:@"Output%@%f.mp4", caldate, ran];
    pathToSave = [DOCUMENTS_FOLDER stringByAppendingPathComponent:pathToSave];
    NSURL *movieUrl = [NSURL fileURLWithPath:pathToSave];

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetPassthrough];
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.outputURL = movieUrl;
    exporter.shouldOptimizeForNetworkUse = YES;

    CMTimeValue val = mixComposition.duration.value;
    CMTime start = CMTimeMake(0, 600);
    CMTime duration = CMTimeMake(val, 600);
    CMTimeRange range = CMTimeRangeMake(start, duration);
    exporter.timeRange = range;

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch ([exporter status]) {
            case AVAssetExportSessionStatusFailed: {
                NSLog(@"Export failed: %@ %@", [[exporter error] localizedDescription], [[exporter error] debugDescription]);
                NSString *message = @"Movie wasn't created. Try again later.";
                [self performSelectorOnMainThread:@selector(dismissMe:) withObject:message waitUntilDone:NO];
                break;
            }
            case AVAssetExportSessionStatusCancelled: {
                NSLog(@"Export canceled");
                NSString *message1 = @"Movie wasn't created. Try again later.";
                [self performSelectorOnMainThread:@selector(dismissMe:) withObject:message1 waitUntilDone:NO];
                break;
            }
            case AVAssetExportSessionStatusCompleted: {
                NSString *message = @"Movie was successfully created.";
                CMTime duration = mixComposition.duration;
                [self saveData:duration ofPath:pathToSave];
                [self cleanFiles];
                [self performSelectorOnMainThread:@selector(dismissMe:) withObject:message waitUntilDone:NO];
            }
        }
    }];
}
Answer 1:
The problem lies in:
    videoCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    audioCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
These two calls need to be moved outside the for-loop body, so the composition ends up with a single video track and a single audio track instead of one pair of tracks per piece.
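A minimal sketch of the corrected structure (untested, reusing the question's variable names, with the export part left exactly as in the original code): create the two composition tracks once, then append every piece into those same tracks inside the loop.

- (void)putFilesTogether {
    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    // Create exactly ONE video track and ONE audio track, before the loop.
    AVMutableCompositionTrack *videoCompositionTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioCompositionTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                    preferredTrackID:kCMPersistentTrackID_Invalid];

    NSError *error = nil;
    for (NSUInteger i = 0; i < moviePieces.count; i++) {
        NSURL *videoUrl = [NSURL fileURLWithPath:[moviePieces objectAtIndex:i]];
        NSURL *audioUrl = [NSURL fileURLWithPath:[audioPieces objectAtIndex:i]];

        AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
        AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:nil];

        AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
        AVAssetTrack *audioAssetTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];

        // Append each piece at the current end of the composition,
        // into the same two tracks every iteration.
        CMTime insertAt = mixComposition.duration;
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                       ofTrack:videoAssetTrack
                                        atTime:insertAt
                                         error:&error];
        [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                                       ofTrack:audioAssetTrack
                                        atTime:insertAt
                                         error:&error];
        if (error) {
            NSLog(@"Insert failed: %@", error);
        }
    }

    // ... export with AVAssetExportSession exactly as in the original code ...
}

This also explains the symptoms you saw: with one new pair of tracks per piece, the exported file contains several video tracks with consecutive time ranges. QuickTime appears to play them back to back, while AVPlayer and UIWebView typically render only one video track unless you supply an AVVideoComposition, so you only see the first movie; the audio tracks, on the other hand, are all mixed together, which is why every song is audible.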
Source: https://stackoverflow.com/questions/14247985/composing-video-and-audio-using-avmutablecomposition