I'm using Node.js as a stream server to stream real-time WebM video that is sent by FFMPEG (executed from another application; the stream is done over HTTP) and received by a webapp that uses the <video> tag.
This is what I'm doing: FFMPEG streams the received frames using the following command:
ffmpeg -r 30 -f rawvideo -pix_fmt bgra -s 640x480
-i \\.\pipe\STREAM_PIPE -r 60
-f segment -s 240x160 -codec:v libvpx -f webm
http://my.domain.com/video_stream.webm
(the stream comes from an application that uses the Kinect as its source and communicates with FFMPEG through a pipe, sending one frame after another)
When the webapp connects, it immediately receives this response from the server:
HTTP/1.1 200 OK
X-Powered-By: Express
content-type: video/webm
cache-control: private
connection: close
Date: Fri, 06 Dec 2013 14:36:31 GMT
and a WebM header (previously stored on the server, with the same resolution and frame rate as the source stream, and tested as working in VLC) is immediately appended. Then the webapp starts to receive the data streamed by FFMPEG. Here is a screenshot of the Mkvinfo GUI showing the fields of the header:
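For reference, here is a minimal sketch (not the actual server code, which isn't shown) of the kind of relay described above: Express receives the FFMPEG push on one route, and for each connecting viewer it sends the response headers, prepends the stored WebM header, and then fans out the incoming chunks. Route paths, the header file name, and the port are assumptions.

// relay.js - minimal sketch of the relay described above (paths and file
// names are assumptions, not the asker's actual code)
const express = require('express');
const fs = require('fs');

const app = express();
const clients = [];                                        // connected <video> viewers
const header = fs.readFileSync('./stream_header.webm');    // pre-stored WebM/EBML header

// FFMPEG pushes the encoded stream here (its http output normally issues a POST)
app.post('/video_stream.webm', (req, res) => {
  req.on('data', (chunk) => {
    // forward every encoded chunk to all connected viewers
    clients.forEach((client) => client.write(chunk));
  });
  req.on('end', () => res.end());
});

// Browsers connect here with the <video> tag
app.get('/video_stream.webm', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'video/webm',
    'Cache-Control': 'private',
    'Connection': 'close',
  });
  res.write(header);                                       // prepend the stored header first
  clients.push(res);
  req.on('close', () => {
    const i = clients.indexOf(res);
    if (i !== -1) clients.splice(i, 1);                    // drop viewers that disconnect
  });
});

app.listen(8080);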
However, even though the Network tab of the Chrome console shows that there is an actual stream of data (meaning that what is streamed is not complete garbage, otherwise the connection would be dropped), the player doesn't display anything. We tried manually prepending our header to the dumped video received by the webapp, and VLC plays it just fine, but this does not happen with the <video> tag.
What can cause this problem? Are we missing something about the encoding on the FFMPEG side, or did we store wrong values in the header (or are they not enough)?
PS: I cannot rely on an external streaming server.
PPS: We tried the following experiments:
- substituting the video header with the one stored on the server makes the video playable in both VLC and the <video> tag
- if we dump a video that has already started (without a header) and prepend either the header stored on the server or its original header, the video is playable in VLC but not in the <video> tag (we are careful to prepend the header just before the beginning of the first cluster; see the sketch after this list).
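For illustration, a minimal sketch of that second experiment, assuming the dump and the stored header are plain files on disk: it searches for the first Cluster element ID (0x1A tier: bytes 0x1F 0x43 0xB6 0x75) and writes the header followed by everything from that point on. The file names are assumptions, and Buffer.prototype.indexOf requires a reasonably recent Node.

// prepend_header.js - sketch of the experiment: prepend the stored header
// just before the first Cluster of a headerless dump (file names assumed)
const fs = require('fs');

const header = fs.readFileSync('./stream_header.webm');
const dump = fs.readFileSync('./dumped_stream.bin');

// the Cluster element ID marks where the playable media data begins
const CLUSTER_ID = Buffer.from([0x1f, 0x43, 0xb6, 0x75]);
const clusterStart = dump.indexOf(CLUSTER_ID);
if (clusterStart === -1) throw new Error('no Cluster element found in the dump');

// everything before the first Cluster is discarded; the stored header replaces it
const repaired = Buffer.concat([header, dump.slice(clusterStart)]);
fs.writeFileSync('./repaired.webm', repaired);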
1 Answer
#1
There are so many variables added to this problem when you consider that you're using a technology outside of (and not integrated into) Node to stream your video. This could cause issues with the load balancer or proxy you are using, or it could be that you're hosting two applications on the same port.
Could you do the streaming in just Node? Or could you even stream from FFMPEG to the filesystem and serve that with fs.createReadStream()? This would reuse the same web server instead of spawning an entirely new server on the same box. And if you're just streaming that content from point to point, then you need to buffer the data coming through and forward the buffer as a stream through Node.
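As a rough illustration of that suggestion (not a drop-in solution), the sketch below keeps everything in one Node process: the web server spawns FFMPEG as a child, has it write WebM to stdout, and pipes that straight to the HTTP response, so no second HTTP endpoint is needed. The encoding flags and the named-pipe path simply mirror the question's command and are assumptions.

// single-process sketch: Node spawns ffmpeg and forwards its output directly
const express = require('express');
const { spawn } = require('child_process');

const app = express();

app.get('/video_stream.webm', (req, res) => {
  res.writeHead(200, { 'Content-Type': 'video/webm' });

  // have ffmpeg write WebM to stdout ('-') so Node can forward it directly
  const ffmpeg = spawn('ffmpeg', [
    '-r', '30', '-f', 'rawvideo', '-pix_fmt', 'bgra', '-s', '640x480',
    '-i', '\\\\.\\pipe\\STREAM_PIPE',
    '-codec:v', 'libvpx', '-f', 'webm', '-',
  ]);

  ffmpeg.stdout.pipe(res);                          // ffmpeg output -> browser
  req.on('close', () => ffmpeg.kill('SIGKILL'));    // stop encoding when the viewer leaves
});

app.listen(8080);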
The reason technologies get integrated, wrapped, and extended into other frameworks is uniformity. Your question, though well detailed, still leaves a lot unclear. It raises questions about how FFMPEG converts and serves HTTP content, and how your load balancer/proxy handles that. Does Node have anything to do with this? Is there a replacement for FFMPEG that would let you standardize around Node's framework? Is Node right for this application?