I'm trying to do some image manipulation on the iPhone, basing things on the GLImageProcessing example from Apple.
Ultimately what I'd like to do is to load an image into a texture, perform one or more of the operations in the example code (hue, saturation, brightness, etc.), then read the resulting image back out for later processing/saving. For the most part, this would never need to touch the screen, so I thought that FBOs might be the way to go.
To start with, I've cobbled together a little example that creates an offscreen FBO, draws to it, then reads the data back out as an image. I was psyched when this worked perfectly in the simulator, then bummed as I realized I just got a black screen on the actual device.
Disclaimer: my OpenGL knowledge is old enough that moving to OpenGL ES has been quite a learning curve, and I've never been much of a texture wizard. I do know that the device has different characteristics from the simulator in terms of framebuffer access (a mandatory offscreen FBO and swap on the device, direct access on the simulator), but I haven't been able to find what I've been doing wrong, even after a fairly extensive search.
Any suggestions?
// set up the offscreen FBO sizes
int renderBufferWidth = 1280;
int renderBufferHeight = 720;

// now the FBO
GLuint fbo = 0;
glGenFramebuffersOES(1, &fbo);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, fbo);

GLuint renderBuffer = 0;
glGenRenderbuffersOES(1, &renderBuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, renderBuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES,
                         GL_RGBA8_OES,
                         renderBufferWidth,
                         renderBufferHeight);

glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES,
                             GL_COLOR_ATTACHMENT0_OES,
                             GL_RENDERBUFFER_OES,
                             renderBuffer);

GLenum status = glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES);
if (status != GL_FRAMEBUFFER_COMPLETE_OES) {
    NSLog(@"Problem with OpenGL framebuffer after specifying color render buffer: %x", status);
}
// throw in a test drawing
glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);

static const GLfloat triangleVertices[] = {
    -0.5f, -0.33f,
     0.5f, -0.33f,
    -0.5f,  0.33f
};

static const GLfloat triangleColors[] = {
    1.0, 0.0, 0.0, 0.5,
    0.0, 1.0, 0.0, 0.5,
    0.0, 0.0, 1.0, 0.5
};

GLint backingWidth = 320;
GLint backingHeight = 480;

NSLog(@"setting up view/model matrices");
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

glVertexPointer(2, GL_FLOAT, 0, triangleVertices);
glEnableClientState(GL_VERTEX_ARRAY);
glColorPointer(4, GL_FLOAT, 0, triangleColors);
glEnableClientState(GL_COLOR_ARRAY);

// draw the triangle
glDrawArrays(GL_TRIANGLE_STRIP, 0, 3);

// Extract the resulting rendering as an image
int samplesPerPixel = 4; // R, G, B and A
int rowBytes = samplesPerPixel * renderBufferWidth;
char *bufferData = (char *)malloc(rowBytes * renderBufferHeight);
if (bufferData == NULL) {
    NSLog(@"Unable to allocate buffer for image extraction.");
}

// works on simulator with GL_BGRA, but not on device
glReadPixels(0, 0, renderBufferWidth,
             renderBufferHeight,
             GL_BGRA,
             GL_UNSIGNED_BYTE, bufferData);
NSLog(@"reading pixels from framebuffer");

// Flip it vertically - images read from OpenGL buffers are upside-down
char *flippedBuffer = (char *)malloc(rowBytes * renderBufferHeight);
if (flippedBuffer == NULL) {
    NSLog(@"Unable to allocate flipped buffer for corrected image.");
}
for (int i = 0; i < renderBufferHeight; i++) {
    bcopy(bufferData + i * rowBytes,
          flippedBuffer + (renderBufferHeight - i - 1) * rowBytes,
          rowBytes);
}
// unbind my FBO
glBindFramebufferOES(GL_FRAMEBUFFER_OES, 0);

// Output the image to a file
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
int bitsPerComponent = 8;
CGBitmapInfo bitmapInfo = kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Host;
CGContextRef contextRef = CGBitmapContextCreate(flippedBuffer,
                                                renderBufferWidth,
                                                renderBufferHeight,
                                                bitsPerComponent,
                                                rowBytes, colorSpace, bitmapInfo);
if (contextRef == NULL) {
    NSLog(@"Unable to create CGContextRef.");
}

CGImageRef imageRef = CGBitmapContextCreateImage(contextRef);
if (imageRef == NULL) {
    NSLog(@"Unable to create CGImageRef.");
} else if (savedImage == NO) { // savedImage: instance variable from the surrounding class
    UIImage *myImage = [UIImage imageWithCGImage:imageRef];
    UIImageWriteToSavedPhotosAlbum(myImage, nil, nil, nil);
    savedImage = YES;
}

// release the Quartz objects and pixel buffers (these CG*Release calls are NULL-safe)
CGImageRelease(imageRef);
CGContextRelease(contextRef);
CGColorSpaceRelease(colorSpace);
free(flippedBuffer);
free(bufferData);
Edit:
The answer, of course, was that the bitmap format should be GL_RGBA, not GL_BGRA:
// works on simulator with GL_BGRA, but not on device
glReadPixels(0, 0, renderBufferWidth,
             renderBufferHeight,
             GL_RGBA, // <-- was GL_BGRA
             GL_UNSIGNED_BYTE, bufferData);
1 Answer
#1 (score: 4)
As Andrew answered himself:
The answer was that the bitmap format should be GL_RGBA, not GL_BGRA:
// works on simulator with GL_BGRA, but not on device
glReadPixels(0, 0, renderBufferWidth,
             renderBufferHeight,
             GL_RGBA, // <--
             GL_UNSIGNED_BYTE, bufferData);