How do I avoid blocking the UI when using the iPhone camera via AVFoundation?

Asked: 2022-08-26 21:00:19

I am trying to embed a simple view in my iPhone application for taking quick snapshots. Everything works, but I am running into issues with the camera's startup time. In one of Apple's sample projects, AVCaptureSession's -startRunning is not executed on the main thread, which seems to be necessary. I set up the capture session during the view's initialization and start it on a separate thread, then add the AVCaptureVideoPreviewLayer in -didMoveToSuperview. Without multithreading everything is fine (the UI is blocked for about a second), but with GCD the UI sometimes works, and sometimes it takes far too long for the UI to 'unfreeze' or for the preview to appear.

How can I deal with the camera's startup delay in a reliable way, without blocking the main thread (the delay itself is not the problem)?

I hope you guys understand my problem :D

Thanks in advance!

BTW: Here is my proof-of-concept project (without GCD) that I am now reusing for another app: http://github.com/dariolass/QuickShotView

2 Answers

#1 (score: 10)

So I figured it out myself. This code works for me and produces the least UI freezing:

- (void)willMoveToSuperview:(UIView *)newSuperview {
    // Capture session setup
    NSError *error = nil;
    AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:self.rearCamera error:&error];
    if (!newVideoInput) {
        NSLog(@"Failed to create device input: %@", error);
        return;
    }

    AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                            AVVideoCodecJPEG, AVVideoCodecKey,
                            nil];
    [newStillImageOutput setOutputSettings:outputSettings];

    AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];

    if ([newCaptureSession canAddInput:newVideoInput]) {
        [newCaptureSession addInput:newVideoInput];
    }

    if ([newCaptureSession canAddOutput:newStillImageOutput]) {
        [newCaptureSession addOutput:newStillImageOutput];
        self.stillImageOutput = newStillImageOutput;
        self.captureSession = newCaptureSession;
    }

    // -startRunning only returns once the session has started (i.e. the camera
    // is ready), so call it on a background queue to keep the main thread free.
    dispatch_queue_t layerQ = dispatch_queue_create("layerQ", NULL);
    dispatch_async(layerQ, ^{
        [self.captureSession startRunning];

        AVCaptureVideoPreviewLayer *prevLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
        prevLayer.frame = self.previewLayerFrame;
        prevLayer.masksToBounds = YES;
        prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        prevLayer.cornerRadius = PREVIEW_LAYER_EDGE_RADIUS;

        // To make sure we're not modifying the UI on a thread other than the
        // main thread, hop back with dispatch_async onto the main queue.
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.layer insertSublayer:prevLayer atIndex:0];
        });
    });
}
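Once the session is running, a snapshot can be taken from the still image output configured above. The following is only a sketch (the method name and the comment about displaying the image are assumptions, not part of the answer); it uses AVCaptureStillImageOutput, which matches the answer's era but was later deprecated in favor of AVCapturePhotoOutput:

```objc
// Sketch: capture a JPEG snapshot from the stillImageOutput set up above.
- (void)captureSnapshot {
    AVCaptureConnection *connection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
        completionHandler:^(CMSampleBufferRef buffer, NSError *error) {
            if (!buffer) {
                NSLog(@"Snapshot failed: %@", error);
                return;
            }
            NSData *jpeg = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer];
            UIImage *image = [UIImage imageWithData:jpeg];
            // Hand the image back to the main thread before touching the UI,
            // e.g. to display it in a (hypothetical) image view or save it.
            dispatch_async(dispatch_get_main_queue(), ^{
                (void)image;
            });
        }];
}
```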

#2 (score: -1)

I think another way to avoid this is to put your "start camera" code in -viewDidAppear: instead of -viewWillAppear:.

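For illustration, a minimal sketch of that idea (the captureSession property and its earlier setup, e.g. in -viewDidLoad, are assumed and not taken from the answer). Starting in -viewDidAppear: lets the push/present animation finish first, and a background queue keeps the main thread responsive while the camera warms up:

```objc
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Start the (already configured) session off the main thread,
    // only if it is not running yet.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        if (!self.captureSession.isRunning) {
            [self.captureSession startRunning];
        }
    });
}
```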
