I got some code online that captures video from the iPhone camera and stores it in a video file, and it is working fine. But my purpose is not to save it locally; I want to send it to a server. I found out that there is a free media server named Wowza which allows streaming, that Apple has its HTTP Live Streaming (HLS) feature, and that these servers expect the video in H.264 format and the audio in MP3. By reading some of the documents about Apple's HLS I also learned that the playlist file gives a different URL for each segment of the media file, and the segments are then played in the correct order on the device through the browser. I am not sure how to get small segments of the file recorded by the phone's camera, nor how to convert it into the required format. Following is the code for capturing video:
Implementation File
#import "THCaptureViewController.h"
#import <AVFoundation/AVFoundation.h>
#import "THPlayerViewController.h"
#define VIDEO_FILE @"test.mov"
@interface THCaptureViewController ()
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureMovieFileOutput *captureOutput;
@property (nonatomic, weak) AVCaptureDeviceInput *activeVideoInput;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@end
@implementation THCaptureViewController
- (void)viewDidLoad
{
    [super viewDidLoad];

#if TARGET_IPHONE_SIMULATOR
    self.simulatorView.hidden = NO;
    [self.view bringSubviewToFront:self.simulatorView];
#else
    self.simulatorView.hidden = YES;
    [self.view sendSubviewToBack:self.simulatorView];
#endif

    // Hide the toggle button if the device has fewer than two cameras.
    self.toggleCameraButton.hidden = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count] < 2;

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self setUpCaptureSession];
    });
}
#pragma mark - Configure Capture Session
- (void)setUpCaptureSession
{
    self.captureSession = [[AVCaptureSession alloc] init];
    NSError *error;

    // Set up hardware devices.
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice) {
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (input) {
            [self.captureSession addInput:input];
            self.activeVideoInput = input;
        }
    }

    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    if (audioDevice) {
        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
        if (audioInput) {
            [self.captureSession addInput:audioInput];
        }
    }

    // Create an AVCaptureVideoDataOutput and add it to the session.
    // Note: a session generally cannot drive an AVCaptureVideoDataOutput and an
    // AVCaptureMovieFileOutput at the same time, so this output will stop
    // receiving frames once movie recording starts.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    if ([self.captureSession canAddOutput:output]) {
        [self.captureSession addOutput:output];
    }

    // Set up the still image output.
    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
    if ([self.captureSession canAddOutput:stillImageOutput]) {
        [self.captureSession addOutput:stillImageOutput];
    }

    // Start the session so the preview becomes available.
    [self.captureSession startRunning];

    // Set up the preview layer on the main queue.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        self.previewLayer.frame = self.previewView.bounds;
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [[self.previewLayer connection] setVideoOrientation:[self currentVideoOrientation]];
        [self.previewView.layer addSublayer:self.previewLayer];
    });
}
#pragma mark - Start Recording
- (IBAction)startRecording:(id)sender
{
    if ([sender isSelected]) {
        [sender setSelected:NO];
        [self.captureOutput stopRecording];
    } else {
        [sender setSelected:YES];

        if (!self.captureOutput) {
            self.captureOutput = [[AVCaptureMovieFileOutput alloc] init];
            [self.captureSession addOutput:self.captureOutput];
        }

        // Delete the old movie file if it exists
        //[[NSFileManager defaultManager] removeItemAtURL:[self outputURL] error:nil];

        [self.captureSession startRunning];

        AVCaptureConnection *videoConnection = [self connectionWithMediaType:AVMediaTypeVideo
                                                             fromConnections:self.captureOutput.connections];
        if ([videoConnection isVideoOrientationSupported]) {
            videoConnection.videoOrientation = [self currentVideoOrientation];
        }
        if ([videoConnection isVideoStabilizationSupported]) {
            videoConnection.enablesVideoStabilizationWhenAvailable = YES;
        }

        [self.captureOutput startRecordingToOutputFileURL:[self outputURL] recordingDelegate:self];
    }

    // Disable the toggle button while recording.
    self.toggleCameraButton.enabled = ![sender isSelected];
}
- (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections
{
    for (AVCaptureConnection *connection in connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:mediaType]) {
                return connection;
            }
        }
    }
    return nil;
}
#pragma mark - AVCaptureFileOutputRecordingDelegate
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    if (!error) {
        [self presentRecording];
    } else {
        NSLog(@"Error: %@", [error localizedDescription]);
    }
}
#pragma mark - Show Last Recording
- (void)presentRecording
{
    NSString *tracksKey = @"tracks";
    AVAsset *asset = [AVURLAsset assetWithURL:[self outputURL]];
    [asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
        NSError *error;
        AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
        if (status == AVKeyValueStatusLoaded) {
            dispatch_async(dispatch_get_main_queue(), ^{
                UIStoryboard *mainStoryboard = [UIStoryboard storyboardWithName:@"MainStoryboard" bundle:nil];
                THPlayerViewController *controller = [mainStoryboard instantiateViewControllerWithIdentifier:@"THPlayerViewController"];
                controller.title = @"Capture Recording";
                controller.asset = asset;
                [self presentViewController:controller animated:YES completion:nil];
            });
        }
    }];
}
#pragma mark - Recording Destination URL
- (NSURL *)outputURL
{
    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSLog(@"documents directory: %@", documentsDirectory);
    NSString *filePath = [documentsDirectory stringByAppendingPathComponent:VIDEO_FILE];
    NSLog(@"output url: %@", filePath);
    return [NSURL fileURLWithPath:filePath];
}
@end
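One way to get small segments without touching raw frames, sketched here as an assumption rather than a tested recipe: cap each recording's length with maxRecordedDuration and start a new file each time the delegate fires. When the cap is hit, the delegate receives an AVErrorMaximumDurationReached error, but the userInfo key AVErrorRecordingSuccessfullyFinishedKey still marks the file as usable. The helpers uploadSegmentAtURL: and nextSegmentURL below are hypothetical:

// During setup: roll a new output file roughly every 5 seconds.
self.captureOutput.maxRecordedDuration = CMTimeMakeWithSeconds(5, 600);

// Variant of the delegate method above.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    // With maxRecordedDuration set, a non-nil error can still mean a good file.
    BOOL usable = (error == nil) ||
                  [error.userInfo[AVErrorRecordingSuccessfullyFinishedKey] boolValue];
    if (usable) {
        [self uploadSegmentAtURL:outputFileURL];   // hypothetical upload helper
        [captureOutput startRecordingToOutputFileURL:[self nextSegmentURL]  // hypothetical
                                   recordingDelegate:self];
    }
}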
I found this link which shows how to capture the video frame by frame. But I am not sure whether capturing the video in frames will help me send it to the server in H.264 format. Can this be done, and if so, how?
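Capturing frames can in fact get you H.264: the CMSampleBufferRefs delivered by AVCaptureVideoDataOutput can be fed to an AVAssetWriter whose video input is configured with AVVideoCodecH264. A minimal sketch under that assumption (the class name THSegmentWriter is made up; error handling is omitted):

// Minimal sketch: encode raw camera frames to H.264 with AVAssetWriter.
#import <AVFoundation/AVFoundation.h>

@interface THSegmentWriter : NSObject
@property (nonatomic, strong) AVAssetWriter *writer;
@property (nonatomic, strong) AVAssetWriterInput *videoInput;
@end

@implementation THSegmentWriter

- (instancetype)initWithURL:(NSURL *)url
{
    if ((self = [super init])) {
        _writer = [AVAssetWriter assetWriterWithURL:url
                                           fileType:AVFileTypeMPEG4
                                              error:NULL];
        // H.264 is requested through the video output settings.
        NSDictionary *settings = @{
            AVVideoCodecKey  : AVVideoCodecH264,
            AVVideoWidthKey  : @640,
            AVVideoHeightKey : @480
        };
        _videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                         outputSettings:settings];
        _videoInput.expectsMediaDataInRealTime = YES;
        [_writer addInput:_videoInput];
    }
    return self;
}

// Call this from captureOutput:didOutputSampleBuffer:fromConnection:
- (void)appendSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    if (self.writer.status == AVAssetWriterStatusUnknown) {
        [self.writer startWriting];
        [self.writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if (self.videoInput.isReadyForMoreMediaData) {
        [self.videoInput appendSampleBuffer:sampleBuffer];
    }
}

- (void)finishWithCompletion:(void (^)(void))completion
{
    [self.videoInput markAsFinished];
    [self.writer finishWritingWithCompletionHandler:completion];
}

@end

Audio could be handled the same way with a second AVAssetWriterInput using AAC output settings (kAudioFormatMPEG4AAC). Closing the writer every few seconds and starting a new one on a fresh URL yields small MP4 chunks that a server such as Wowza, or ffmpeg, can repackage into MPEG-TS segments for HLS.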
Here the person who asked the question says (in the comments below the question) that he was able to do it successfully, but he hasn't mentioned how he captured the video.
Please tell me which data type should be used to get small segments of the captured video, how to convert the captured data into the required format, and how to send it to the server.
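For context, the per-segment URLs mentioned above live in an HLS media playlist (.m3u8). A minimal example with made-up segment names; a live stream omits the #EXT-X-ENDLIST tag and keeps appending entries:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
http://example.com/stream/segment0.ts
#EXTINF:10.0,
http://example.com/stream/segment1.ts
#EXT-X-ENDLIST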
1 Solution
#1
You can use the Live SDK. You have to set up an nginx-powered streaming server. Please follow this link; I have used it and it is a very efficient solution: https://github.com/ltebean/Live
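For reference, one common shape for an "nginx powered streaming server" is the nginx-rtmp-module; the sketch below is an assumption about such a setup (ports and paths are placeholders) and may differ from what the linked project expects:

rtmp {
    server {
        listen 1935;                  # RTMP ingest
        application live {
            live on;
            hls on;                   # repackage the incoming stream as HLS
            hls_path /tmp/hls;        # where the .m3u8 and .ts segments land
            hls_fragment 3s;          # target segment length
        }
    }
}

http {
    server {
        listen 8080;
        location /hls {
            types { application/vnd.apple.mpegurl m3u8; video/mp2t ts; }
            root /tmp;                # serves /tmp/hls/*.m3u8 and *.ts
        }
    }
}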