captureOutput function is not being called using setSampleBufferDelegate

Date: 2021-07-05 03:41:45

I'm starting to develop an iOS app and this is my first SO post. I'm trying to implement a UI view which can show the preview video of the rear camera and process the captured frames. My preview layer works perfectly and I can see the picture display in my UI view. However, the captureOutput function is never called.


I have searched online for similar issues and solutions for a while and tried to tweak different things, including the output, connection, and dispatch queue settings, but none have worked. Can anyone help me out or share some insights and directions? Thanks a lot in advance!


Here is my code; I'm using the Xcode 11 beta with iOS 10 as the build target.


class ThreeDScanningViewController: UIViewController, 
AVCaptureVideoDataOutputSampleBufferDelegate {

    @IBOutlet weak var imageView: UIImageView!

    var session : AVCaptureSession!
    var device : AVCaptureDevice!
    var output : AVCaptureVideoDataOutput!
    var previewLayer : AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()
        //NotificationCenter.default.addObserver(self, selector: #selector(self.startedNotif), name: NSNotification.Name.AVCaptureSessionDidStartRunning, object: nil)
        _ = initCamera()
    }

    func initCamera() -> Bool {
        session = AVCaptureSession()
        session.sessionPreset = AVCaptureSession.Preset.medium

        let devices = AVCaptureDevice.devices()

        for d in devices { 
            if ((d as AnyObject).position == AVCaptureDevice.Position.back) {
                device = d as! AVCaptureDevice
            }
        }
        if device == nil {
            return false
        }

        do {
            // Set up the input

            let input : AVCaptureDeviceInput!
            try input = AVCaptureDeviceInput(device: device)

            if session.canAddInput(input) {
                session.addInput(input)
            } else {
                return false
            }

            // Set up the device

            try device.lockForConfiguration()
            device.activeVideoMinFrameDuration = CMTimeMake(1, 15)
            device.unlockForConfiguration()

            // Set up the preview layer

            previewLayer = AVCaptureVideoPreviewLayer(session: session)
            previewLayer.frame = imageView.bounds
            imageView.layer.addSublayer(previewLayer)

            // Set up the output

            output = AVCaptureVideoDataOutput()
            output.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString) as String: kCVPixelFormatType_32BGRA]

            let queue = DispatchQueue(label: "myqueue")
            output!.setSampleBufferDelegate(self, queue: queue)

            output.alwaysDiscardsLateVideoFrames = true

            if session.canAddOutput(output) {
                session.addOutput(output)
            } else {
                return false
            }

            for connection in output.connections {
                if let conn = connection as? AVCaptureConnection {
                    if conn.isVideoOrientationSupported {
                        conn.videoOrientation = AVCaptureVideoOrientation.portrait
                    }
                }
            }

            session.startRunning()

        } catch let error as NSError {
            print(error)
            return false
        }

        return true
    }

    func captureOutput (captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        print("captureOutput!\n");
        DispatchQueue.main.async(execute: {
            // Do stuff
        })
    }
}

Here are some links I've looked into; none of them solved my issue:


4 Answers

#1 (17 votes)

I have finally managed to find the cause of the issue. You need to make sure to use the correct function signature for the captureOutput function for the Swift 3 syntax.


func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)

NOT

func captureOutput(_ output: AVCaptureOutput, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)

I was using an older version of the Swift syntax and the compiler did not warn me of the issue! After correcting the function signature, the captureOutput function gets called beautifully :-)

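As a sketch of how the corrected delegate methods fit together (the class name is illustrative; the method names are the current AVFoundation delegate signatures):

```swift
import AVFoundation

class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    // Swift 4+ signature: note "didOutput", not "didOutputSampleBuffer"
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Runs on the queue passed to setSampleBufferDelegate(_:queue:)
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        print("got frame: \(width)x\(height)")
    }

    // Companion callback, invoked when frames are discarded
    // (e.g. with alwaysDiscardsLateVideoFrames = true)
    func captureOutput(_ output: AVCaptureOutput,
                       didDrop sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        print("dropped a frame")
    }
}
```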

#2 (9 votes)

From Swift 4:


func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [Any]!, from connection: AVCaptureConnection!)

won't be called as it no longer exists.


It has been changed to the following :


func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) 
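For reference, a minimal sketch of conforming to the renamed metadata delegate (the class name is an illustrative assumption; the delegate method is the current AVFoundation API):

```swift
import AVFoundation

class CodeScanner: NSObject, AVCaptureMetadataOutputObjectsDelegate {

    // Swift 4+ name: metadataOutput(_:didOutput:from:),
    // replacing captureOutput(_:didOutputMetadataObjects:from:)
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        for object in metadataObjects {
            if let code = object as? AVMetadataMachineReadableCodeObject,
               let value = code.stringValue {
                print("scanned: \(value)")
            }
        }
    }
}
```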

#3 (0 votes)

According to this tutorial you need to commit your configuration before starting to run the session.


I also see that you have multiple points where you return false before the session can start to run. Have you checked whether you are exiting prematurely at one of these locations? A simple console output, or a breakpoint on the return statements, can give you some info.

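If you do batch your setup with beginConfiguration(), the pairing would look like this sketch (the function name is illustrative; session, input, and output are assumed to exist already):

```swift
import AVFoundation

func configureAndStart(session: AVCaptureSession,
                       input: AVCaptureDeviceInput,
                       output: AVCaptureVideoDataOutput) {
    session.beginConfiguration()      // batch the changes below
    if session.canAddInput(input) {
        session.addInput(input)
    }
    if session.canAddOutput(output) {
        session.addOutput(output)
    }
    session.commitConfiguration()     // changes take effect atomically here
    if !session.isRunning {
        session.startRunning()        // start only after committing
    }
}
```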

#4 (0 votes)

The problem was fixed when I changed dualCamera to AVCaptureDeviceType.builtInWideAngleCamera (Swift 4). Hope it helps anyone in need.

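On Swift 4, that device lookup can be done with a discovery session, which also replaces the deprecated AVCaptureDevice.devices() loop in the question. A sketch (the function name is illustrative; the types are from the AVFoundation API):

```swift
import AVFoundation

// Prefer the wide-angle back camera: .builtInDualCamera is
// unavailable on many devices, whereas every iPhone has this one.
func findBackCamera() -> AVCaptureDevice? {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInWideAngleCamera],
        mediaType: .video,
        position: .back)
    return discovery.devices.first
}
```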
