Applying filter to real time camera preview - Swift

Submitted by 隐身守侯 on 2019-12-03 08:46:19

There are a few things wrong with your code:

You are using an AVCaptureVideoPreviewLayer, but that layer transports pixels captured by the camera directly to the screen, bypassing your image processing and CIFilter entirely, so it is not what you want here.

Your conformance to AVCaptureVideoDataOutputSampleBufferDelegate is out of date. The old method

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!)

is now called

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)
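As a rough sketch of what that updated callback can look like when applying a filter: the names `ViewController`, `filter`, `context`, and `imageView` are assumptions here (a configured CIFilter, a reusable CIContext, and a UIImageView on your view controller), not part of your original code.

```swift
import AVFoundation
import CoreImage
import UIKit

extension ViewController: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Pull the raw pixel buffer out of the sample buffer.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Wrap it in a CIImage and run it through the filter.
        let cameraImage = CIImage(cvPixelBuffer: pixelBuffer)
        filter.setValue(cameraImage, forKey: kCIInputImageKey)

        // Render the filtered result; creating the CIContext once and
        // reusing it (rather than per frame) matters for performance.
        guard let outputImage = filter.outputImage,
              let cgImage = context.createCGImage(outputImage, from: outputImage.extent)
        else { return }

        // UIKit updates must happen on the main thread.
        DispatchQueue.main.async {
            self.imageView.image = UIImage(cgImage: cgImage)
        }
    }
}
```

Displaying through a UIImageView is the simplest route; a Metal-backed view is faster if you need it, but the structure above is the core idea.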

Because you won't be using AVCaptureVideoPreviewLayer you'll need to ask for permission before you'll be able to start getting pixels from the camera. This is typically done in viewDidAppear(_:) Like:

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    if AVCaptureDevice.authorizationStatus(for: .video) == .authorized {
        setupInputOutput()
    } else {
        AVCaptureDevice.requestAccess(for: .video) { authorized in
            DispatchQueue.main.async {
                if authorized {
                    self.setupInputOutput()
                }
            }
        }
    }
}
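For completeness, here is a hypothetical sketch of what setupInputOutput() might contain; it assumes a stored captureSession property and that your view controller is the sample buffer delegate. Your actual session setup may differ.

```swift
import AVFoundation

func setupInputOutput() {
    captureSession.beginConfiguration()
    captureSession.sessionPreset = .high

    // Attach the back wide-angle camera as input.
    guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back),
          let input = try? AVCaptureDeviceInput(device: camera),
          captureSession.canAddInput(input)
    else { return }
    captureSession.addInput(input)

    // Deliver frames to the delegate on a background queue.
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "video.queue"))
    if captureSession.canAddOutput(videoOutput) {
        captureSession.addOutput(videoOutput)
    }

    captureSession.commitConfiguration()
    captureSession.startRunning()
}
```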

Also, if you are supporting rotation, you will need to update the AVCaptureConnection's video orientation on rotation in your didOutput callback.

After making these changes (full source code) your code worked, producing a correctly filtered camera image (screenshot omitted here).
