Rendering an SCNView into an offscreen buffer to generate an image

Posted: 2020-12-02 19:39:54

I have an SCNView that renders my scene. However, when I save I want to capture a couple of screenshots at different time points. My best guess is to create an SCNRenderer and render the scene, specifying different times. I have tried, but my image just comes back blank. Here's my code; any ideas?


- (void)test
{
    // First create a new OpenGL context to render to.
    NSOpenGLPixelFormatAttribute pixelFormatAttributes[] = {
        NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersionLegacy,
        NSOpenGLPFADoubleBuffer,
        NSOpenGLPFANoRecovery,
        NSOpenGLPFAAccelerated,
        NSOpenGLPFADepthSize, 24,
        0
    };

    NSOpenGLPixelFormat *pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:pixelFormatAttributes];
    if (pixelFormat == nil)
    {
        NSLog(@"Error: No appropriate pixel format found");
    }

    NSOpenGLContext *context = [[NSOpenGLContext alloc] initWithFormat:pixelFormat shareContext:nil];

    // Set the renderer to render to that context.
    SCNRenderer *lRenderer = [SCNRenderer rendererWithContext:context.CGLContextObj options:nil];
    lRenderer.scene = myscnview.scene;
    lRenderer.pointOfView = [myscnview.pointOfView clone];

    // Render the scene.
    [lRenderer render];

    // I think I should now have the scene rendered into the context,
    // so I could just do:
    NSImage *image = [self imageFromSceneKitView:controller.docView fromWindow:window ctx:context];
}

- (NSImage *)imageFromSceneKitView:(SCNView *)sceneKitView fromWindow:(NSWindow *)window ctx:(NSOpenGLContext *)ctx
{
    NSInteger width = sceneKitView.bounds.size.width * window.backingScaleFactor;
    NSInteger height = sceneKitView.bounds.size.height * window.backingScaleFactor;
    width = width - (width % 32);
    height = height - (height % 32);

    NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL
                                                                          pixelsWide:width
                                                                          pixelsHigh:height
                                                                       bitsPerSample:8
                                                                     samplesPerPixel:4
                                                                            hasAlpha:YES
                                                                            isPlanar:NO
                                                                      colorSpaceName:NSCalibratedRGBColorSpace
                                                                         bytesPerRow:width * 4
                                                                        bitsPerPixel:4 * 8];

    // Read the rendered pixels back from the OpenGL context into the bitmap.
    CGLLockContext((CGLContextObj)[ctx CGLContextObj]);
    [ctx makeCurrentContext];
    glReadPixels(0, 0, (int)width, (int)height, GL_RGBA, GL_UNSIGNED_BYTE, [imageRep bitmapData]);
    [NSOpenGLContext clearCurrentContext];
    CGLUnlockContext((CGLContextObj)[ctx CGLContextObj]);

    NSImage *outputImage = [[NSImage alloc] initWithSize:NSMakeSize(width, height)];
    [outputImage addRepresentation:imageRep];

    // Flip vertically, since OpenGL's origin is the lower-left corner.
    NSImage *flippedImage = [NSImage imageWithSize:NSMakeSize(width, height) flipped:YES drawingHandler:^BOOL(NSRect dstRect) {
        [imageRep drawInRect:dstRect];
        return YES;
    }];
    return flippedImage;
}

1 Answer

#1


This is a rather old question, but let me mention the fairly new SCNView.snapshot() and SCNRenderer.snapshot(atTime:with:antialiasingMode:) APIs.


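For reference, here is a minimal sketch of how those snapshot APIs could replace the manual OpenGL readback above. It is not the original poster's code: myscnview is assumed to be the SCNView from the question, the time points and the /tmp output paths are placeholders, the WritePNG helper is hypothetical, and it assumes a macOS target with a Metal-capable device.

#import <Cocoa/Cocoa.h>
#import <SceneKit/SceneKit.h>
#import <Metal/Metal.h>

// Hypothetical helper: encode an NSImage as PNG and write it to disk.
static void WritePNG(NSImage *image, NSString *path)
{
    NSData *tiff = [image TIFFRepresentation];
    NSBitmapImageRep *rep = [NSBitmapImageRep imageRepWithData:tiff];
    NSData *png = [rep representationUsingType:NSBitmapImageFileTypePNG properties:@{}];
    [png writeToFile:path atomically:YES];
}

- (void)saveSnapshots
{
    // A one-off grab of whatever the view is currently displaying.
    WritePNG([myscnview snapshot], @"/tmp/current.png");

    // For specific time points, render the same scene offscreen with SCNRenderer.
    SCNRenderer *renderer = [SCNRenderer rendererWithDevice:MTLCreateSystemDefaultDevice() options:nil];
    renderer.scene = myscnview.scene;
    renderer.pointOfView = myscnview.pointOfView;

    CGSize size = myscnview.bounds.size;
    for (NSNumber *t in @[@0.0, @1.5, @3.0]) {   // placeholder time points
        NSImage *frame = [renderer snapshotAtTime:t.doubleValue
                                         withSize:size
                                 antialiasingMode:SCNAntialiasingModeMultisampling4X];
        WritePNG(frame, [NSString stringWithFormat:@"/tmp/snapshot-%@.png", t]);
    }
}

Because snapshotAtTime:withSize:antialiasingMode: does the offscreen rendering itself, there is no need to create an OpenGL context or call glReadPixels by hand.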