I'm trying to figure out the best approach to display a video on a GL texture while preserving the transparency of the alpha channel.
Information about using video as a GL texture is here: "Is it possible to use video as a texture for GL in iOS?" and "iOS4: how do I use a video file as an OpenGL texture?".
An approach that uses ffmpeg to handle alpha transparency (but is not App Store friendly) is here: "iPhone: Display a semi-transparent video on top of a UIView?"
The video source would be filmed in front of a green screen for chroma keying. The video could be left untouched, keeping the green screen, or processed in a video editing suite and exported with an alpha channel to QuickTime Animation or Apple ProRes 4444.
There are multiple approaches that I think could potentially work, but I haven't found a full solution.
- Real-time threshold processing of the video to detect and remove the green
- Figure out how to use the above-mentioned QuickTime codecs to preserve the alpha channel
- Blending two videos together: 1) a main video with RGB, 2) a separate video with an alpha mask
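To make the first approach concrete, here is a minimal sketch of the per-pixel test a real-time green knockout might perform. On-device this logic would live in a fragment shader; the function name and threshold heuristic below are illustrative assumptions, not code from any of the linked answers.

```c
#include <stdint.h>

typedef struct { uint8_t r, g, b, a; } Pixel;

/* Treat a pixel as keyed green when the green channel dominates both red
 * and blue by more than `threshold` (0-255), and zero its alpha so the
 * GL scene behind the video shows through. */
Pixel knockout_green(Pixel p, uint8_t threshold) {
    if (p.g > p.r + threshold && p.g > p.b + threshold) {
        p.a = 0;  /* fully transparent */
    }
    return p;
}
```

A hard dominance test like this is cheap but produces harsh edges; a production shader would feather the alpha near the threshold instead of snapping it to zero.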
I would love to get your thoughts on the best approach for iOS and OpenGL ES 2.0.
Thanks.
3 Answers
#1
4
The easiest way to do chroma keying for simple blending of a movie and another scene would be to use the GPUImageChromaKeyBlendFilter from my GPUImage framework. You can supply the movie source as a GPUImageMovie, and then blend that with your background content. The chroma key filter allows you to specify a color, a proximity to that color, and a smoothness of blending to use in the replacement operation. All of this is GPU-accelerated via tuned shaders.
Images, movies, and the live cameras can be used as sources, but if you wish to render this with OpenGL ES content behind your movie, I'd recommend rendering your OpenGL ES content to a texture-backed FBO and passing that texture in via a GPUImageTextureInput.
You could possibly use this to output a texture containing your movie frames with the keyed color replaced by a constant color with a 0 alpha channel, as well. This texture could be extracted using a GPUImageTextureOutput for later use in your OpenGL ES scene.
#2
0
Apple showed a sample app at WWDC in 2011 called ChromaKey that shows how to handle frames of video passed to an OpenGL texture, manipulated, and optionally written out to a video file.
(In a very performant way)
It's written to use a feed from the video camera, and uses a very crude chromakey algorithm.
As the other poster said, you'll probably want to skip the chromakey code and do the color knockout yourself beforehand.
It shouldn't be that hard to rewrite the ChromaKey sample app to use a video file as input instead of a camera feed, and it's quite easy to disable the chromakey code.
You'd need to modify the setup on the video input to expect RGBA data instead of RGB or Y/UV. The sample app is set up to use RGB, but I've seen other example apps from Apple that use Y/UV instead.
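For context on the RGB vs. Y/UV distinction, here is a minimal sketch of the per-pixel Y/UV-to-RGB conversion that Y/UV-based sample shaders perform. The full-range BT.601 coefficients are an assumption for illustration; a real app must use coefficients matching its capture pixel format.

```c
#include <stdint.h>

static uint8_t clamp255(int v) {
    return (uint8_t)(v < 0 ? 0 : (v > 255 ? 255 : v));
}

/* Full-range BT.601 YUV -> RGB, one pixel at a time. U and V are centered
 * on 128; the chroma offsets are scaled and added to luma. */
void yuv_to_rgb(uint8_t y, uint8_t u, uint8_t v,
                uint8_t *r, uint8_t *g, uint8_t *b) {
    int c = y, d = u - 128, e = v - 128;
    *r = clamp255(c + (int)(1.402f * e));
    *g = clamp255(c - (int)(0.344f * d) - (int)(0.714f * e));
    *b = clamp255(c + (int)(1.772f * d));
}
```

On the GPU this is typically a single matrix multiply in the fragment shader, sampling luma and chroma from two texture planes.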
#3
0
Have a look at the free "APNG" app on the App Store. It shows how an animated PNG (.apng) can be rendered directly to an iOS view. The key is that APNG supports an alpha channel in the file format, so you don't need to mess around with chroma tricks that will not really work for all your video content. This approach is more efficient than multiple layers or chroma tricks, since another round of processing is not needed each time a texture is displayed in a loop.
If you want to have a look at a small example Xcode project that displays an alpha channel animation on the side of a spinning cube with OpenGL ES 2, it can be found at "Load OpenGL textures with alpha channel on iOS". The example code shows a simple call to glTexImage2D() that uploads a texture to the graphics card once per display link callback.