I'm capturing an image using AVFoundation. I'm using AVCaptureVideoPreviewLayer to display the camera feed on screen. This preview layer's frame is set to the bounds of a UIView with dynamic dimensions:
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [self.cameraFeedView layer];
[rootLayer setMasksToBounds:YES];
// Size the preview layer to the host view's own bounds.
previewLayer.frame = rootLayer.bounds;
[rootLayer insertSublayer:previewLayer atIndex:0];
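(For reference, session above is an ordinary AVCaptureSession. A rough sketch of a front-camera setup, not my exact code:)

// Sketch: configure a capture session with the front camera.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];

AVCaptureDevice *frontCamera = nil;
for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    if (device.position == AVCaptureDevicePositionFront) {
        frontCamera = device;
    }
}

NSError *inputError = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&inputError];
if (input) {
    [session addInput:input];
}
[session startRunning];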
And I'm using AVCaptureStillImageOutput to capture an image:
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                               completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer != NULL) {
        // Convert the sample buffer to JPEG data, then to a UIImage.
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *capturedImage = [UIImage imageWithData:imageData];
    }
}];
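(videoConnection comes from the still image output once it has been added to the session; roughly:)

// Sketch: the output must be attached to the session before a connection exists.
[session addOutput:stillImageOutput];
AVCaptureConnection *videoConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];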
My problem is that the captured image comes out at the full size of the iPhone camera (1280x960 for the front camera), but I need it to have the same aspect ratio as the preview layer. For example, if the preview layer is 150x100, I need the captured image to be 960x640. Is there any solution for this?
1 Answer
#1
I also encountered the same problem. You have to crop or resize the output still image. But you should pay attention to the output still's scale and image orientation.
Give the preview layer a square frame:
// Make the preview layer a square spanning the full width of the view.
CGFloat width = CGRectGetWidth(self.view.bounds);
[self.captureVideoPreviewLayer setFrame:CGRectMake(0, 0, width, width)];
[self.cameraView.layer addSublayer:self.captureVideoPreviewLayer];
Calculate the cropped image's frame:
[self.captureStillImageOutput captureStillImageAsynchronouslyFromConnection:captureConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:data];
    // The underlying CGImage is rotated 90° relative to image.size, so crop
    // in its coordinate space: a centered square of side image.size.width.
    CGRect cropRect = CGRectMake((image.size.height - image.size.width) / 2, 0, image.size.width, image.size.width);
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], cropRect);
    // Re-apply the original scale and orientation (always UIImageOrientationRight here).
    UIImage *croppedImage = [UIImage imageWithCGImage:imageRef scale:image.scale orientation:image.imageOrientation];
    CGImageRelease(imageRef);
}];
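The square case above is the simplest. For a non-square preview like the 150x100 one in the question, the same idea generalizes: take the largest centered rect in the photo that has the preview's aspect ratio. Here is a sketch of that aspect-fill math (CropToPreviewAspect is a hypothetical helper, not something I have battle-tested; note the width/height swap for rotated orientations):

// Sketch: crop a captured UIImage to the preview layer's aspect ratio,
// mimicking AVLayerVideoGravityResizeAspectFill with a centered crop.
UIImage *CropToPreviewAspect(UIImage *image, CGSize previewSize) {
    CGImageRef cgImage = image.CGImage;
    CGFloat imageWidth = CGImageGetWidth(cgImage);
    CGFloat imageHeight = CGImageGetHeight(cgImage);

    // For UIImageOrientationLeft/Right the CGImage is rotated 90°,
    // so swap the preview's width and height for the target aspect.
    BOOL rotated = (image.imageOrientation == UIImageOrientationRight ||
                    image.imageOrientation == UIImageOrientationLeft);
    CGFloat targetAspect = rotated ? previewSize.height / previewSize.width
                                   : previewSize.width / previewSize.height;

    CGRect cropRect;
    if (imageWidth / imageHeight > targetAspect) {
        // Image is wider than the target: trim the sides.
        CGFloat newWidth = imageHeight * targetAspect;
        cropRect = CGRectMake((imageWidth - newWidth) / 2, 0, newWidth, imageHeight);
    } else {
        // Image is taller than the target: trim top and bottom.
        CGFloat newHeight = imageWidth / targetAspect;
        cropRect = CGRectMake(0, (imageHeight - newHeight) / 2, imageWidth, newHeight);
    }

    CGImageRef croppedRef = CGImageCreateWithImageInRect(cgImage, cropRect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(croppedRef);
    return cropped;
}

Inside the completion handler you would call it as CropToPreviewAspect(image, self.captureVideoPreviewLayer.bounds.size).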