Filtering video with GPUImage

Asked by 醉梦人生 on 2021-01-24 02:33

I'm using GPUImage in my application and trying to filter video. Live video filtering is working well. The trouble comes up when I try to read a video into memory from the file system.

2 Answers
  • 2021-01-24 02:51

    There's at least one thing here that might be behind this. In the code above, you're not hanging on to a strong reference to your movieFile source object.

    If this is an ARC-enabled project, that object will be deallocated the instant your setup method completes (and if it's not, you'll be leaking that object). That will stop movie playback, deallocate the movie itself, and lead to black frames being sent down the filter pipeline, among other potential instabilities.

    You need to make movieFile a strongly-referenced instance variable to make sure it hangs on past this setup method, since all movie processing is asynchronous.
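
    For example, a minimal sketch of that fix (the class name, outlet, and setup method here are placeholders, not code from the question):

        class VideoFilterController: UIViewController {
            // Strong instance-variable references keep the source and filter
            // alive while the asynchronous movie processing runs; locals
            // would be deallocated under ARC as soon as this method returns.
            var movieFile: GPUImageMovie!
            var filter: GPUImageGrayscaleFilter!

            @IBOutlet var filterView: GPUImageView!

            func setupMoviePlayback(movieURL: NSURL) {
                movieFile = GPUImageMovie(URL: movieURL)  // stored in a property, not a local
                filter = GPUImageGrayscaleFilter()
                movieFile.addTarget(filter)
                filter.addTarget(filterView)
                movieFile.startProcessing()
            }
        }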

  • 2021-01-24 02:51
    Here is a solution:
    
    Declare the GPUImage objects as instance variables, so they stay alive for the whole asynchronous run:
        var movieFile: GPUImageMovie!
        var movieWriter: GPUImageMovieWriter!
        var filter: GPUImageInput!
        var paths: NSURL!       // URL of the filtered output movie
        var hasoutput = false
    
    // Filter the movie file and write the filtered result to disk
    
      func startWriting()
        {
            // Show a progress HUD while the movie is processed
            let loadingNotification = MBProgressHUD.showHUDAddedTo(self.view, animated: true)
            loadingNotification.mode = MBProgressHUDMode.Indeterminate
            loadingNotification.labelText = "Loading"

            // Step 1 - load the source movie from the Documents directory
            let documentsURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)[0] as! NSURL
            let pathToMovie = documentsURL.URLByAppendingPathComponent("temp.mov")
            self.movieFile = GPUImageMovie(URL: pathToMovie)
            self.movieFile.runBenchmark = true
            self.movieFile.playAtActualSpeed = false

            // Step 2 - build the filter pipeline
            self.filter = GPUImageGrayscaleFilter()
            self.movieFile.addTarget(self.filter)

            // Step 3 - delete any previous output file before writing a new one
            self.paths = documentsURL.URLByAppendingPathComponent("temp1.mov")
            let fileManager = NSFileManager.defaultManager()
            fileManager.removeItemAtURL(self.paths, error: nil)

            // Sanity check that the source file actually exists
            let movieData = NSData(contentsOfURL: pathToMovie)
            println(movieData?.length)
    
            // Step 4 - read the source dimensions so the writer matches them
            let anAsset = AVAsset.assetWithURL(pathToMovie) as! AVAsset
            let videoAssetTrack = anAsset.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
            let naturalSize = videoAssetTrack.naturalSize
            println(naturalSize)

            // Step 5 - create the movie writer and attach it to the filter output
            self.movieWriter = GPUImageMovieWriter(movieURL: self.paths, size: naturalSize)
            let output = self.filter as! GPUImageOutput
            output.addTarget(self.movieWriter)
            // Step 6 - pass the audio track through untouched, if there is one
            self.movieWriter.shouldPassthroughAudio = true
            if anAsset.tracksWithMediaType(AVMediaTypeAudio).count > 0 {
                self.movieFile.audioEncodingTarget = self.movieWriter
            } else {
                self.movieFile.audioEncodingTarget = nil
            }

            // Step 7 - start encoding
            self.movieFile.enableSynchronizedEncodingUsingMovieWriter(self.movieWriter)
            self.movieWriter.startRecording()
            self.movieFile.startProcessing()
    
            // Step 8 - finish the recording once processing completes,
            // then hand the filtered movie on for further use
            self.movieWriter.completionBlock = { () -> Void in
                self.movieWriter.finishRecording()
                self.obj.performWithAsset(self.paths)
            }

            // Hide the HUD after a fixed delay
            let delayTime = dispatch_time(DISPATCH_TIME_NOW, Int64(15 * Double(NSEC_PER_SEC)))
            dispatch_after(delayTime, dispatch_get_main_queue()) {
                MBProgressHUD.hideAllHUDsForView(self.view, animated: true)
            }
            hasoutput = true
        }
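
    Calling finishRecording from the writer's completionBlock follows the same pattern as GPUImage's SimpleVideoFileFilter example. A hypothetical call site (the action name is illustrative):

        @IBAction func applyFilterTapped(sender: AnyObject) {
            startWriting()   // kicks off asynchronous filtering and writing
        }

    Note that the fixed 15-second delay for hiding the HUD is only a guess at the processing time; hiding it from inside the completion block would be more reliable.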
    