If you need to grab frames from an external USB camera, OpenCV is the usual choice; see my other post on using OpenCV to open a USB camera for the details. Even when no camera is involved, reading local images and video files for processing and analysis is also typically done with OpenCV.
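As a quick illustration (not part of the original post), here is a minimal capture sketch with cv::VideoCapture; the device index 0 and window name are assumptions of mine and may differ on your machine:

#include <opencv2/opencv.hpp>

int main()
{
    cv::VideoCapture cap(0);           // device index 0 is an assumption
    if (!cap.isOpened())
        return -1;

    cv::Mat frame;
    while (cap.read(frame))            // each read() yields one BGR frame
    {
        cv::imshow("usb camera", frame);
        if (cv::waitKey(30) == 27)     // press ESC to quit
            break;
    }
    return 0;
}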
For 3D modelling you program against the OpenGL library, and the NeHe tutorials recommend the SOIL library for loading textures (see the SOIL official homepage); calling SOIL_load_OGL_texture or one of its sibling functions is enough (see my post on loading textures with SOIL). SOIL is mainly convenient for loading image files of various formats from disk. But what about an image that has already been read into memory as a cv::Mat, how do we load that as a texture? The details follow.
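For reference, a minimal SOIL call might look like the sketch below; the file name "image.png", the include path, and the flag choices are placeholders of mine, not from the original post:

#include "SOIL.h"   // header location depends on how SOIL is installed

GLuint tex = SOIL_load_OGL_texture(
    "image.png",                // hypothetical file name
    SOIL_LOAD_AUTO,             // keep the file's own channel count
    SOIL_CREATE_NEW_ID,         // let SOIL generate the texture ID
    SOIL_FLAG_MIPMAPS | SOIL_FLAG_INVERT_Y);

if (tex == 0)
{
    // SOIL returns 0 on failure; SOIL_last_result() gives the reason
}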
For a cv::Mat mat0 that already exists in memory (see my separate post on the Mat class), the following function uploads it and returns the bound textureID:
GLuint matToTexture(cv::Mat mat, GLenum minFilter,
                    GLenum magFilter, GLenum wrapFilter)
{
    // Generate a number for our textureID's unique handle
    GLuint textureID;
    glGenTextures(1, &textureID);

    // Bind to our texture handle
    glBindTexture(GL_TEXTURE_2D, textureID);

    // Catch silly-mistake texture interpolation method for magnification
    if (magFilter == GL_LINEAR_MIPMAP_LINEAR ||
        magFilter == GL_LINEAR_MIPMAP_NEAREST ||
        magFilter == GL_NEAREST_MIPMAP_LINEAR ||
        magFilter == GL_NEAREST_MIPMAP_NEAREST)
    {
        std::cout << "You can't use MIPMAPs for magnification - setting filter to GL_LINEAR" << std::endl;
        magFilter = GL_LINEAR;
    }

    // Set texture interpolation methods for minification and magnification
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, minFilter);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, magFilter);

    // Set texture clamping method
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, wrapFilter);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, wrapFilter);

    // Set incoming texture format to:
    // GL_BGR for CV_CAP_OPENNI_BGR_IMAGE,
    // GL_LUMINANCE for CV_CAP_OPENNI_DISPARITY_MAP,
    // Work out other mappings as required ( there's a list in comments in main() )
    GLenum inputColourFormat = GL_BGR_EXT;
    if (mat.channels() == 1)
    {
        inputColourFormat = GL_LUMINANCE;
    }

    // Create the texture
    glTexImage2D(GL_TEXTURE_2D,     // Type of texture
                 0,                 // Pyramid level (for mip-mapping) - 0 is the top level
                 GL_RGB,            // Internal colour format to convert to
                 mat.cols,          // Image width i.e. 640 for Kinect in standard mode
                 mat.rows,          // Image height i.e. 480 for Kinect in standard mode
                 0,                 // Border width in pixels (can either be 1 or 0)
                 inputColourFormat, // Input image format (i.e. GL_RGB, GL_RGBA, GL_BGR etc.)
                 GL_UNSIGNED_BYTE,  // Image data type
                 mat.ptr());        // The actual image data itself

    // If we're using mipmaps then generate them. Note: This requires OpenGL 3.0 or higher
    if (minFilter == GL_LINEAR_MIPMAP_LINEAR ||
        minFilter == GL_LINEAR_MIPMAP_NEAREST ||
        minFilter == GL_NEAREST_MIPMAP_LINEAR ||
        minFilter == GL_NEAREST_MIPMAP_NEAREST)
    {
        glGenerateMipmap(GL_TEXTURE_2D);
    }

    return textureID;
}
Usage is as follows:
cv::Mat mat1;
GLuint textureID;
textureID = matToTexture(mat1,GL_LINEAR,GL_LINEAR,GL_CLAMP);
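A side note not covered in the original post: when mat1 holds live camera frames that change constantly, it is usually cheaper to create the texture once (as above) and then refresh its contents each frame with glTexSubImage2D, rather than calling matToTexture / glTexImage2D again. A rough sketch, assuming the frame size and channel layout never change:

// Each new frame: overwrite the pixels of the existing texture in place
glBindTexture(GL_TEXTURE_2D, textureID);
glTexSubImage2D(GL_TEXTURE_2D, 0,
                0, 0,                    // x/y offset inside the texture
                mat1.cols, mat1.rows,    // must match the size used at creation
                GL_BGR_EXT,              // Mat stores pixels in BGR order
                GL_UNSIGNED_BYTE,
                mat1.ptr());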
Here mat.ptr() returns a pointer to the matrix data; mat.data works just as well. Once you know how to get at the raw pixel values of the Mat, a plain glTexImage2D call is all it takes. A more compact version looks like this:
glGenTextures(1, &textureID);
glBindTexture(GL_TEXTURE_2D, textureID);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, mat1.cols, mat1.rows, 0, GL_BGR_EXT, GL_UNSIGNED_BYTE, mat1.ptr()); // GL_BGR_EXT
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
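One caveat worth adding (my note, not from the original post): OpenGL's default row alignment for pixel uploads is 4 bytes, while an OpenCV Mat row may be padded or may not end on a 4-byte boundary (for example, a 3-channel image whose width times 3 is not a multiple of 4). A defensive sketch before the glTexImage2D call:

// Tell OpenGL the rows are tightly packed, byte-aligned
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

// If the Mat is a padded sub-view, make it contiguous before uploading
if (!mat1.isContinuous())
    mat1 = mat1.clone();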
One thing to note here: why use the enum GL_BGR_EXT? The three channels of a Mat are actually stored in BGR order rather than the usual RGB. The explanation most often given is historical convention: early camera drivers and Windows bitmap formats stored pixels as BGR, and OpenCV kept that ordering (the big-endian versus little-endian memory-layout story is also frequently cited). In any case, the layout is as follows:
uchar B_channel = mat1.at<cv::Vec3b>(i, j)[0];
uchar G_channel = mat1.at<cv::Vec3b>(i, j)[1];
uchar R_channel = mat1.at<cv::Vec3b>(i, j)[2];
At first I wrote GL_RGB instead of GL_BGR_EXT and got an image with an overall purplish tint, which is exactly what swapping the R and B channels looks like.
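If GL_BGR_EXT / GL_BGR is not available with your headers, or you simply prefer to upload RGB data, an alternative (my suggestion, not from the original post) is to convert the Mat first:

cv::Mat rgb;
cv::cvtColor(mat1, rgb, cv::COLOR_BGR2RGB);   // swap B and R (CV_BGR2RGB in older OpenCV)
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, rgb.cols, rgb.rows, 0,
             GL_RGB, GL_UNSIGNED_BYTE, rgb.ptr());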
---------------------
Author: FakePanDa
Source: CSDN
Original: https://blog.csdn.net/TTTTzTTTT/article/details/53456324
Copyright notice: this is an original post by the blogger; please include a link to the original when reposting!