How to apply a filter to video in real time using Swift

说谎 2020-12-04 15:00

Is it possible to apply a filter to an AVLayer and add it to a view via addSublayer? I want to change the colors and add some noise to the video.

2 Answers
  • 2020-12-04 15:20

    If you're using an AVPlayerViewController, you can set the compositingFilter property of the view's layer:

      playerController.view.layer.compositingFilter = "multiplyBlendMode"
    

    See Apple's documentation for CALayer's compositingFilter property for the filter names you can use, e.g. "multiplyBlendMode", "screenBlendMode", etc.

    Example of doing this in a UIViewController:

    import UIKit
    import AVKit
    import AVFoundation

    class ViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()

            // load a movie called my_movie.mp4 that's in your Xcode project
            let path = Bundle.main.path(forResource: "my_movie", ofType: "mp4")
            let player = AVPlayer(url: URL(fileURLWithPath: path!))

            // make a movie player and set the filter
            let playerController = AVPlayerViewController()
            playerController.player = player
            playerController.view.layer.compositingFilter = "multiplyBlendMode"

            // add the player view controller to this view controller
            addChild(playerController)
            playerController.view.frame = view.bounds
            view.addSubview(playerController.view)
            playerController.didMove(toParent: self)

            // play the movie
            player.play()
        }
    }
    

    For let path = Bundle.main.path(forResource: "my_movie", ofType: "mp4") to return a value, make sure you add the .mp4 file to Build Phases > Copy Bundle Resources in your Xcode project, or check the 'Add to targets' boxes when you import the file. A safer way to load the URL is sketched below.
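
    As a safer alternative (my addition, not part of the original answer; "my_movie" is just the example's placeholder name), you can load the URL with a guard instead of force-unwrapping the path:

    // safer variant: bail out gracefully if the movie isn't bundled,
    // instead of crashing on a force-unwrapped path
    guard let url = Bundle.main.url(forResource: "my_movie", withExtension: "mp4") else {
        print("my_movie.mp4 not found - check Copy Bundle Resources")
        return
    }
    let player = AVPlayer(url: url)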

  • 2020-12-04 15:28

    There's another alternative: use an AVCaptureSession to produce CIImage instances to which you can apply CIFilters (of which there are loads, from blurs to color correction to VFX).

    Here's an example using the Comic Book effect (CIComicEffect). In a nutshell, create an AVCaptureSession:

    let captureSession = AVCaptureSession()
    captureSession.sessionPreset = .photo
    

    Create an AVCaptureDevice to represent the camera; the default video device is the back camera:

    let backCamera = AVCaptureDevice.default(for: .video)
    

    Then create an input from the device and attach it to the session. Instantiating AVCaptureDeviceInput can throw an error, so we need to catch that:

    do {
        // AVCaptureDevice.default(for:) returns an optional, so unwrap it first
        guard let backCamera = backCamera else {
            print("can't find a camera")
            return
        }

        let input = try AVCaptureDeviceInput(device: backCamera)
        captureSession.addInput(input)
    } catch {
        print("can't access camera")
        return
    }
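
    A note the original answer predates (so treat this sketch as an addition): since iOS 10 the app must declare NSCameraUsageDescription in its Info.plist, and you can request camera access explicitly before configuring the session:

    // addition: request camera permission up front
    // (an NSCameraUsageDescription entry in Info.plist is also required)
    AVCaptureDevice.requestAccess(for: .video) { granted in
        guard granted else { return }
        DispatchQueue.main.async {
            // safe to configure and start the capture session here
        }
    }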
    

    Now, here's a little 'gotcha': although we don't actually display the AVCaptureVideoPreviewLayer, it's required to get the sample buffer delegate working, so we create one of those:

    // although we don't use this, it's required to get captureOutput invoked
    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    
    view.layer.addSublayer(previewLayer)
    

    Next, we create a video output, an AVCaptureVideoDataOutput, which we'll use to access the video feed:

    let videoOutput = AVCaptureVideoDataOutput()
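
    Optionally (my addition, not in the original answer), you can ask the output for BGRA pixel buffers, which Core Image consumes efficiently, and discard frames that arrive while a previous frame is still being filtered:

    // optional tuning: request BGRA frames and drop late ones
    videoOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    videoOutput.alwaysDiscardsLateVideoFrames = true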
    

    Ensuring that self conforms to AVCaptureVideoDataOutputSampleBufferDelegate, we can set the sample buffer delegate on the video output:

    videoOutput.setSampleBufferDelegate(self,
        queue: DispatchQueue(label: "sample buffer delegate"))
    

    The video output is then attached to the capture session:

     captureSession.addOutput(videoOutput)
    

    ...and, finally, we start the capture session:

    captureSession.startRunning()
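
    Note that startRunning() blocks the calling thread while the session starts up, so Apple recommends invoking it off the main thread; a minimal sketch:

    // startRunning() blocks until the session is live,
    // so kick it off on a background queue to keep the UI responsive
    DispatchQueue.global(qos: .userInitiated).async {
        captureSession.startRunning()
    }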
    

    Because we've set the delegate, captureOutput(_:didOutput:from:) will be invoked for each captured frame. It is passed a sample buffer of type CMSampleBuffer, and it takes just two lines of code to convert that data to a CIImage for Core Image to handle:

    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    let cameraImage = CIImage(cvPixelBuffer: pixelBuffer!)
    

    ...and that image data is passed to our Comic Book effect which, in turn, is used to populate an image view:

    let comicEffect = CIFilter(name: "CIComicEffect")

    comicEffect!.setValue(cameraImage, forKey: kCIInputImageKey)

    let filteredImage = UIImage(ciImage: comicEffect!.outputImage!)

    DispatchQueue.main.async {
        self.imageView.image = filteredImage
    }
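
    Putting those pieces together, the complete per-frame callback looks something like this (a sketch: the imageView property and the conforming view controller are assumed from the description above, not copied from the answer):

    // sketch of the full delegate method, assuming `self` conforms to
    // AVCaptureVideoDataOutputSampleBufferDelegate and has an `imageView` property
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let cameraImage = CIImage(cvPixelBuffer: pixelBuffer)

        let comicEffect = CIFilter(name: "CIComicEffect")
        comicEffect?.setValue(cameraImage, forKey: kCIInputImageKey)
        guard let outputImage = comicEffect?.outputImage else { return }

        let filteredImage = UIImage(ciImage: outputImage)
        DispatchQueue.main.async {
            self.imageView.image = filteredImage
        }
    }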
    

    I have the source code for this project available in my GitHub repo.
