Solved. Thanks for the quick response.
I discarded the code-based pixelFormat stuff and it cleared up the rendering issue. As for the two aux buffers, I got rid of those as well and changed the setting to none. I'm working on building a framework similar to open-source projects like Processing and openFrameworks, but strictly for Cocoa. I don't know too much about setting up pixel formats; I had that code-based stuff in there because I was modeling it on an example I'd found in the documentation for creating movies.

As for the 16-bit color: is 24-bit RGB better on currently shipping hardware? Do you know where I can read more about this?

I've looked at QCCompositionLayer and am actually thinking about shifting toward it for the design of the framework I'm building. For the moment, though, the project is built around an NSOpenGLView, and I'm just trying to put a few extra classes (e.g. one that makes it easier for people to work with Quartz Composer patches) into the project before redesigning it from scratch. Thanks for the advice.
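In case it helps anyone else who finds this thread: a pixel format matching the 24-bit advice quoted below would look roughly like this. This is only a sketch, not the exact code from my project, and the attribute set you actually want will depend on what you're rendering.

```objc
// Sketch: a 24-bit RGBA pixel format instead of the XIB's 5551 setting.
NSOpenGLPixelFormatAttribute attribs[] = {
    NSOpenGLPFAAccelerated,
    NSOpenGLPFADoubleBuffer,
    NSOpenGLPFAColorSize, 24,   // 8 bits per color channel instead of 5/5/5
    NSOpenGLPFAAlphaSize, 8,    // 256 alpha levels instead of 2
    NSOpenGLPFADepthSize, 24,
    0                           // attribute list is zero-terminated
};
NSOpenGLPixelFormat *format =
    [[NSOpenGLPixelFormat alloc] initWithAttributes:attribs];
if (!format)
    NSLog(@"No matching OpenGL pixel format");
```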
On Fri, Jun 24, 2011 at 4:33 PM, Christopher Wright <[email protected]> wrote:

> > When I load the view, I call the following method:
> >
> > -(void)prepareOpenGL {
> >     [[self openGLContext] makeCurrentContext];
> >
> >     // Synchronize buffer swaps with vertical refresh rate
> >     GLint swapInt = 1;
> >     [[self openGLContext] setValues:&swapInt
> >                        forParameter:NSOpenGLCPSwapInterval];
> >
> >     GLint opacity = 0;
> >     [[self openGLContext] setValues:&opacity
> >                        forParameter:NSOpenGLCPSurfaceOpacity];
> >     readyToDraw = YES;
> >
> >     NSOpenGLPixelFormatAttribute attribs[] =
> >     {
> >         kCGLPFAAccelerated,
> >         kCGLPFANoRecovery,
> >         kCGLPFADoubleBuffer,
> >         kCGLPFAColorSize, 24,
> >         kCGLPFADepthSize, 24,
> >         0
> >     };
> >
> >     pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:attribs];
> >     [self setPixelFormat:pixelFormat];
> >
> >     if (!pixelFormat)
> >         NSLog(@"No OpenGL pixel format");
> > }
>
> I'm not entirely sure why you're creating a pixel format here -- it's not a
> pixel format that matches your XIB's configuration either; your XIB is
> configured for 16-bit color (5551). I wouldn't recommend this for
> performance reasons: 16-bit color buffers aren't a particularly fast path
> on currently shipping hardware, and you lose a lot of alpha channel
> fidelity (you get 2 alpha levels instead of 256).
>
> 2 aux buffers is a bit unusual too -- are you using them for something?
> > -(NSOpenGLPixelFormat *)pixelFormat {
> >     return pixelFormat;
> > }
> >
> > Here is the drawRect method in my custom view:
> >
> > -(void)drawRect:(NSRect)dirtyRect {
> >     if (readyToDraw) {
> >         glClearColor(0, 0, 0, 0);
> >         glClear(GL_COLOR_BUFFER_BIT);
> >         if (isSetup) {
> >             if (backgroundShouldDraw == YES) {
> >                 [self drawBackground];
> >                 backgroundShouldDraw = NO;
> >             }
> >             [self draw];
> >         }
> >     }
> > }
> >
> > Here is the render method for the Patch:
> >
> > -(void)render {
> >     glEnable(GL_DEPTH_TEST);
> >     glClear(GL_DEPTH_BUFFER_BIT);
> >     [patchRenderer renderAtTime:[NSDate timeIntervalSinceReferenceDate]
> >                       arguments:patchArguments];
> > }
>
> These look roughly correct at a glance.
>
> You've got a root-level Clear patch in your composition that clears both
> the color and depth buffers anyway, so these should be unnecessary unless
> you're planning on doing additional rendering outside of QC in your view.
>
> Plugging away some more, it looks like you're subclassing CALayer
> (CAOpenGLLayer). In this case, your pixel format should probably match
> your context's pixel format (-[NSOpenGLView pixelFormat] can help get
> exactly the right thing).
>
> Have you looked at QCCompositionLayer (which might be a better starting
> point for subclassing)?
>
> --
> Christopher Wright
> [email protected]
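For the archives: the QCCompositionLayer approach Christopher suggests would look roughly like this. The composition file name and view variable here are placeholders, not from my actual project; QCCompositionLayer handles its own OpenGL context and rendering, so there's no prepareOpenGL or drawRect to maintain.

```objc
// Sketch: hosting a composition via QCCompositionLayer instead of NSOpenGLView.
// "MyComposition.qtz" and someView are hypothetical names.
NSString *path = [[NSBundle mainBundle] pathForResource:@"MyComposition"
                                                 ofType:@"qtz"];
QCCompositionLayer *qcLayer =
    [QCCompositionLayer compositionLayerWithFile:path];

[someView setWantsLayer:YES];          // the host NSView must be layer-backed
[[someView layer] addSublayer:qcLayer];
qcLayer.frame = [[someView layer] bounds];
```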
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Quartzcomposer-dev mailing list ([email protected])
Help/Unsubscribe/Update your Subscription:
http://lists.apple.com/mailman/options/quartzcomposer-dev/archive%40mail-archive.com
This email sent to [email protected]

