How do I live stream video from an iPhone to a server, the way Ustream or Qik do? I know there's something called HTTP Live Streaming from Apple, but most resources I've found only talk about streaming video from the server to the iPhone.
Is Apple's HTTP Live Streaming something I should use? Or something else? Thanks.
2 Answers
#1
45
There isn't a built-in way to do this, as far as I know. As you say, HTTP Live Streaming is for downloads to the iPhone.
The way I'm doing it is to implement an AVCaptureSession, which has a delegate with a callback that's run on every frame. That callback sends each frame over the network to the server, which has a custom setup to receive it.
See Apple's AVFoundation media capture guide: https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW2
And here's some code:
// make input device
NSError *deviceError;
AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *inputDevice = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:&deviceError];
// make output device
AVCaptureVideoDataOutput *outputDevice = [[AVCaptureVideoDataOutput alloc] init];
[outputDevice setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
// initialize capture session
AVCaptureSession *captureSession = [[[AVCaptureSession alloc] init] autorelease];
[captureSession addInput:inputDevice];
[captureSession addOutput:outputDevice];
// make preview layer and add so that camera's view is displayed on screen
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.frame = view.bounds;
[view.layer addSublayer:previewLayer];
// go!
[captureSession startRunning];
Then the output device's delegate (here, self) has to implement the callback:
-(void) captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer( sampleBuffer );
CGSize imageSize = CVImageBufferGetEncodedSize( imageBuffer );
// also in the 'mediaSpecific' dict of the sampleBuffer
NSLog( @"frame captured at %.fx%.f", imageSize.width, imageSize.height );
}
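If you really do want to push raw frames one by one, the callback can pull the pixel data out of the sample buffer and hand it to your own networking code. Here's a minimal sketch of the callback body; uploadFrameData: is a hypothetical helper you'd implement yourself (a socket, an HTTP POST, whatever), and keep in mind that uncompressed frames are large, so in practice you'd compress or downscale first:
// inside captureOutput:didOutputSampleBuffer:fromConnection:
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer( sampleBuffer );
// lock the buffer so its base address stays valid while we copy from it
CVPixelBufferLockBaseAddress( pixelBuffer, kCVPixelBufferLock_ReadOnly );
size_t length = CVPixelBufferGetBytesPerRow( pixelBuffer ) * CVPixelBufferGetHeight( pixelBuffer );
NSData *frameData = [NSData dataWithBytes:CVPixelBufferGetBaseAddress( pixelBuffer ) length:length];
CVPixelBufferUnlockBaseAddress( pixelBuffer, kCVPixelBufferLock_ReadOnly );
[self uploadFrameData:frameData]; // hypothetical: your own upload code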
EDIT/UPDATE
Several people have asked how to do this without sending the frames to the server one by one. The answer is complex...
Basically, in the didOutputSampleBuffer function above, you add the samples to an AVAssetWriter. I actually had three asset writers active at a time -- past, present, and future -- managed on different threads.
The past writer is in the process of closing its movie file and uploading it. The current writer is receiving the sample buffers from the camera. The future writer is in the process of opening a new movie file and preparing it for data. Every 5 seconds, I set past=current; current=future and restart the sequence.
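Here's a rough sketch of one such writer, assuming each chunk goes to a local file and is fed the sample buffers from the callback above; the chunk path, the 5-second timer, and the upload step are placeholders you'd supply yourself:
// create a writer for the next 5-second chunk (minimal sketch, MRC-style)
NSError *writerError = nil;
NSURL *chunkURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"chunk0.mov"]];
AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:chunkURL fileType:AVFileTypeQuickTimeMovie error:&writerError];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    AVVideoCodecH264, AVVideoCodecKey,
    [NSNumber numberWithInt:640], AVVideoWidthKey,
    [NSNumber numberWithInt:480], AVVideoHeightKey,
    nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
writerInput.expectsMediaDataInRealTime = YES; // we're feeding live camera frames
[assetWriter addInput:writerInput];
// in the capture callback, append each buffer to the *current* writer
if (assetWriter.status == AVAssetWriterStatusUnknown) {
    [assetWriter startWriting];
    [assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp( sampleBuffer )];
}
if (writerInput.readyForMoreMediaData) {
    [writerInput appendSampleBuffer:sampleBuffer];
}
// every 5 seconds, finish the "past" writer so its file can be uploaded,
// promote current to past and future to current, then build a new future writer
[writerInput markAsFinished];
if ([assetWriter finishWriting]) {
    // upload the finished chunk file to the server (your own networking code)
}
[assetWriter release];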
This then uploads video in 5-second chunks to the server. You can stitch the videos together with ffmpeg if you want, or transcode them into MPEG-2 transport streams for HTTP Live Streaming. The video data itself is H.264-encoded by the asset writer, so transcoding merely changes the file's header format.
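For example, remuxing one chunk into a transport stream segment on the server can look something like this (file names are placeholders); since the streams are copied, nothing is re-encoded:
ffmpeg -i chunk0.mov -c copy -bsf:v h264_mp4toannexb -f mpegts chunk0.ts
Transport stream segments can also simply be concatenated back to back if you want one continuous file.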
#2
-3
I'm not sure you can do that with HTTP Live Streaming. HTTP Live Streaming segments the video into chunks of roughly 10 seconds each and creates a playlist of those segments. So if you want the iPhone to be the streaming server side with HTTP Live Streaming, you will have to figure out a way to segment the video file and create the playlist.
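For reference, the playlist HTTP Live Streaming expects is a plain-text .m3u8 file listing the segments, roughly like this (segment names and durations are placeholders; a live playlist keeps growing and only gets the end tag when the stream finishes):
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
#EXT-X-ENDLIST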
How to do that segmenting and playlist generation on the iPhone is beyond my knowledge, though. Sorry.