A really cool new feature in recent browser versions is HTMLMediaElement.captureStream() (this has recently been shipped in Chrome).
Now I understand how it works on the client side. You can reroute the stream to another HTMLMediaElement. However, I want to be able to send it to the server and handle it in Node.js.
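For example, the client-side rerouting looks roughly like this (the element ids are hypothetical):

```js
// Two <video> elements: #source is playing a file, #mirror shows the captured stream.
const source = document.getElementById('source');
const mirror = document.getElementById('mirror');

const stream = source.captureStream(); // MediaStream of whatever #source is rendering
mirror.srcObject = stream;             // route the same stream into the second element
mirror.play();
```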
How could this be done?
1 Answer
As you don't want to use WebRTC, you could potentially use the MediaStream Recording API: https://developer.mozilla.org/en-US/docs/Web/API/MediaStream_Recording_API
Once you have the Blob objects, you could send this data to Node using WebSockets, distribute them back to browsers, re-assemble the Blobs, and play back the media.
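A minimal client-side sketch of that idea, assuming the stream comes from captureStream() and a WebSocket server listening at ws://localhost:8080 (the server URL, MIME type, and one-second timeslice are placeholders, not part of the original answer):

```js
// Browser side: record the captured stream and ship chunks over a WebSocket.
const socket = new WebSocket('ws://localhost:8080'); // hypothetical server URL

const stream = video.captureStream(); // 'video' is the source HTMLMediaElement
const recorder = new MediaRecorder(stream, { mimeType: 'video/webm; codecs=vp8,opus' });

recorder.ondataavailable = (event) => {
  // Each event.data is a Blob containing roughly one timeslice of recorded media.
  if (event.data && event.data.size > 0 && socket.readyState === WebSocket.OPEN) {
    socket.send(event.data);
  }
};

recorder.start(1000); // fire a dataavailable event about once per second
```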
However, you couldn't stream these Blobs live, as they may not be individually playable. For that, you'll need WebRTC.
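On the Node side, the relay step could look like the following sketch, assuming the ws package (the package choice and port are assumptions):

```js
// Node side: accept chunks from the sender and fan them out to the other clients.
const WebSocket = require('ws');
const server = new WebSocket.Server({ port: 8080 }); // port is a placeholder

server.on('connection', (socket) => {
  socket.on('message', (chunk) => {
    // Relay each recorded chunk to every other connected browser.
    for (const client of server.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(chunk);
      }
    }
  });
});
```

The receiving browsers would then collect the chunks, re-assemble them into a single Blob, and play it back (e.g. via URL.createObjectURL()), as described above.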