I am using the code below to stream two videos sequentially, but it is not showing any video in the simulator; it's totally blank.
Also, how can I seek through the video?
The simulator is NOT ABLE to display video. Neither the built-in UIImagePickerController nor any other video controller will work: it's not implemented, and the view mostly appears black or red in the iOS Simulator. You have to debug on an actual iOS device. Sometimes debugging will not work properly there either; use NSLog() instead. That will always work (i.e. even if you compile without debug information in a 'release' build).
You can seek using the player. If mp is your media player:
[mp pause];
CMTime position = mp.currentTime;
// maybe replace the current item
[mp replaceCurrentItemWithPlayerItem:[AVPlayerItem playerItemWithAsset:self.composition]];
[mp seekToTime:position];
[mp play];
Summary:
- Edit: use a composition and a player item
- Seek: use the player
Here is a short formal example of how to do this (and it is already thread-safe):
AVMutableComposition *_composition = [AVMutableComposition composition];
// iterate through all files
// and build the mutable composition
for (int i = 0; i < filesCount; i++) {
    NSURL *movieURL = [NSURL fileURLWithPath:[paths objectAtIndex:i]];
    AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
    // the edit range covers the whole asset (timescale 600)
    CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), sourceAsset.duration);
    NSError *editError = nil;
    BOOL result = [_composition insertTimeRange:editRange
                                        ofAsset:sourceAsset
                                         atTime:_composition.duration
                                          error:&editError];
    if (!result) {
        NSLog(@"insertTimeRange failed: %@", editError);
    }
    // this loop is assumed to run on a background queue;
    // a dispatch_sync to the main queue would deadlock otherwise
    dispatch_sync(dispatch_get_main_queue(), ^{
        // maybe you need a progress bar
        self.loaderBar.progress = (float)i / filesCount;
        [self.loaderBar setNeedsDisplay];
    });
}
// copy the composition to make it immutable (thread-safe) if you need it later
self.composition = [[_composition copy] autorelease];
// the player wants the main thread
dispatch_sync(dispatch_get_main_queue(), ^{
    mp = [AVPlayer playerWithPlayerItem:[[[AVPlayerItem alloc] initWithAsset:self.composition] autorelease]];
    self.observer = [mp addPeriodicTimeObserverForInterval:CMTimeMake(60, 600) queue:nil usingBlock:^(CMTime time) {
        // this is our callback block to set the progress bar
        if (mp.status == AVPlayerStatusReadyToPlay) {
            float actualTime = (float)time.value / time.timescale;
            // avoid division by zero
            if (time.value > 0) {
                CMTime length = mp.currentItem.asset.duration;
                float lengthTime = (float)length.value / length.timescale;
                if (lengthTime > 0) {
                    self.progressBar.value = actualTime / lengthTime;
                } else {
                    self.progressBar.value = 0.0f;
                }
            }
        }
    }];
});
// the last task must be on the main thread again
dispatch_sync(dispatch_get_main_queue(), ^{
    // create our playerLayer
    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:mp];
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    self.playerLayer.frame = [self view].layer.bounds;
    // insert it into our view (make it visible)
    [[self view].layer insertSublayer:self.playerLayer atIndex:0];
    // and now start the playback; maybe mp should be a property (self.mp),
    // depending on your needs
    [mp play];
});
I hope this helps.