Perhaps you have noticed one of the latest trends in iOS apps: using videos as backgrounds, mainly on login or "first launch" screens. Yesterday I attempted to mimic this.
I realize that this is an old post, but since I have some experience bringing down the CPU usage in my iOS apps, I'll respond.

The first place to look is the AVFoundation framework. Implementing AVPlayer should help bring the CPU usage down a little.

But the best solution is to use Brad Larson's GPUImage library, which utilizes OpenGL and reduces CPU usage greatly. Download the library; it comes with examples of how to use it. I recommend using GPUImageMovieWriter.
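For a background view specifically, the playback side of the library (a GPUImageMovie feeding a GPUImageView) is what does the work; GPUImageMovieWriter is for recording processed output to a file. Here is a minimal sketch, assuming the library is already linked and that a "background.mp4" ships in the bundle. The backgroundMovie property and the file name are illustrative, and flags like shouldRepeat may vary between library versions, so treat this as a starting point rather than the definitive API:

#import "GPUImage.h"

- (void)setupVideoBackground {
    NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"background" withExtension:@"mp4"];

    // GPUImageMovie decodes the file and pushes frames through the GPU pipeline.
    self.backgroundMovie = [[GPUImageMovie alloc] initWithURL:movieURL];
    self.backgroundMovie.playAtActualSpeed = YES;
    self.backgroundMovie.shouldRepeat = YES; // loop the clip

    // GPUImageView renders the frames with OpenGL ES, keeping the CPU mostly idle.
    GPUImageView *movieView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    movieView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;
    [self.view insertSubview:movieView atIndex:0];

    [self.backgroundMovie addTarget:movieView];
    [self.backgroundMovie startProcessing];
}

Note that backgroundMovie needs to be a strong property you declare yourself, so the movie object isn't deallocated while it is still playing.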
The best way is to use AVFoundation; then you control the video layer itself.

In the header file, import AVFoundation and declare the player layer:

#import <AVFoundation/AVFoundation.h>

@property (nonatomic, strong) AVPlayerLayer *playerLayer;
- (void)viewDidLoad {
    [super viewDidLoad];
    [self.view.layer addSublayer:self.playerLayer];

    // Loop the movie: get notified when playback reaches the end.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(replayMovie:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:nil];
}

- (AVPlayerLayer *)playerLayer {
    if (!_playerLayer) {
        // Find the movie file in the app bundle.
        NSString *moviePath = [[NSBundle mainBundle] pathForResource:@"arenaVideo" ofType:@"mp4"];
        NSURL *movieURL = [NSURL fileURLWithPath:moviePath];

        _playerLayer = [AVPlayerLayer playerLayerWithPlayer:[[AVPlayer alloc] initWithURL:movieURL]];
        _playerLayer.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
        [_playerLayer.player play];
    }
    return _playerLayer;
}

- (void)replayMovie:(NSNotification *)notification {
    // On newer iOS versions you may also need to seek back to kCMTimeZero
    // before calling play (see the iOS 9 note below).
    [self.playerLayer.player play];
}
Swift 2.0
lazy var playerLayer: AVPlayerLayer = {
    let moviePath = NSBundle.mainBundle().pathForResource("LaunchMovie", ofType: "mov")!
    let player = AVPlayer(URL: NSURL(fileURLWithPath: moviePath))
    player.muted = true
    player.allowsExternalPlayback = false
    player.appliesMediaSelectionCriteriaAutomatically = false

    // Use the ambient session category so the video does not cut off
    // the user's audio (if they are listening to music, etc.).
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
    } catch let error as NSError {
        print(error)
    }

    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = self.view.frame
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    playerLayer.backgroundColor = UIColor.blackColor().CGColor
    player.play()

    NSNotificationCenter.defaultCenter().addObserver(self,
        selector: "playerDidReachEnd",
        name: AVPlayerItemDidPlayToEndTimeNotification,
        object: nil)
    return playerLayer
}()

override func viewDidLoad() {
    super.viewDidLoad()
    self.view.layer.addSublayer(self.playerLayer)
}

override func viewWillDisappear(animated: Bool) {
    super.viewWillDisappear(animated)
    NSNotificationCenter.defaultCenter().removeObserver(self)
}

// Keep the layer sized to the view when the orientation changes.
override func willAnimateRotationToInterfaceOrientation(toInterfaceOrientation: UIInterfaceOrientation, duration: NSTimeInterval) {
    playerLayer.frame = self.view.frame
}

// Rewind and replay so the video loops.
func playerDidReachEnd() {
    self.playerLayer.player!.seekToTime(kCMTimeZero)
    self.playerLayer.player!.play()
}
Tested on iOS 7 through iOS 9.
For iOS 9, I used Andrius' code and added the following for the loop:

-(void)replayBG:(NSNotification *)n {
    [playerLayer.player seekToTime:kCMTimeZero];
    [playerLayer.player play];
}
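For this to fire, the notification has to be registered somewhere (for example in viewDidLoad), pointing at replayBG: instead of replayMovie:. A minimal sketch, following the same pattern as Andrius' answer:

// Register so replayBG: is called each time playback reaches the end.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(replayBG:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:nil];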
I found this code on GitHub that worked for me on iOS 8/9:

- (void)viewDidLoad {
    [super viewDidLoad];

    // Load the video from the app bundle.
    NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"mov"];

    // Create and configure the movie player.
    self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
    self.moviePlayer.controlStyle = MPMovieControlStyleNone;
    self.moviePlayer.scalingMode = MPMovieScalingModeAspectFill;
    self.moviePlayer.view.frame = self.view.frame;
    [self.view insertSubview:self.moviePlayer.view atIndex:0];
    [self.moviePlayer play];

    // Loop the video.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(loopVideo)
                                                 name:MPMoviePlayerPlaybackDidFinishNotification
                                               object:self.moviePlayer];
}

- (void)loopVideo {
    [self.moviePlayer play];
}
I do it with an AVAssetReader and a GLKView, rendering through a CIImage pipeline. Playing unfiltered video in the simulator eats about 80% CPU, but on a real device it costs around 10-19% with real-time filtering (CIFilter). It can be set to loop, and the FPS can be controlled as well. I've put it on GitHub and welcome anyone to take a copy. It's a good alternative for anyone who doesn't want to drop in the whole GPUImage library just for a video background view. Drag and drop the view and it works. https://github.com/matthewlui/FSVideoView
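For a rough idea of what such a pipeline involves (FSVideoView's actual internals may differ; everything here other than the Apple APIs is illustrative), the core is pulling decoded frames with AVAssetReader and drawing them into a GLKView through a GPU-backed CIContext:

#import <AVFoundation/AVFoundation.h>
#import <GLKit/GLKit.h>
#import <CoreImage/CoreImage.h>

// Sketch: read decoded frames and render them, optionally through a CIFilter.
- (void)startReadingAsset:(AVAsset *)asset intoView:(GLKView *)glkView
{
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];

    NSError *error = nil;
    AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
    AVAssetReaderTrackOutput *output =
        [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack
                                         outputSettings:@{ (id)kCVPixelBufferPixelFormatTypeKey :
                                                           @(kCVPixelFormatType_32BGRA) }];
    [reader addOutput:output];
    [reader startReading];

    // A CIContext backed by the GLKView's EAGLContext renders on the GPU,
    // which is why the CPU load stays low on a real device.
    CIContext *ciContext = [CIContext contextWithEAGLContext:glkView.context];

    // A CADisplayLink (not shown) would drive this once per frame to control FPS.
    CMSampleBufferRef sampleBuffer = [output copyNextSampleBuffer];
    if (sampleBuffer) {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];

        // Apply any CIFilter here, e.g. [filter setValue:frame forKey:kCIInputImageKey];

        [glkView bindDrawable];
        [ciContext drawImage:frame
                      inRect:CGRectMake(0, 0, glkView.drawableWidth, glkView.drawableHeight)
                    fromRect:frame.extent];
        [glkView display];
        CFRelease(sampleBuffer);
    }
    // When copyNextSampleBuffer returns NULL, recreate the reader to loop.
}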