Aha!

Ian Grant's workaround does the trick!
In fact, in my case it can be simplified to just adding a Clear patch
inside the RII macro (which I had already), with both
Clear Color and
Clear Depth
unchecked (so in theory, it shouldn't actually do anything).

Very obscure...

a|x
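
For anyone hitting the noise() slowdown discussed in the thread below: the usual workaround is to avoid the built-in noise() entirely and roll a cheap hash-based value noise in the shader. This is a generic sketch of that widely-circulated idiom (the magic constants are the common folklore values, not anything QC-specific), not the code from the original patch:

```glsl
// Cheap hash in place of the built-in noise(), which tends to punt
// the whole render to software on consumer cards.
float hashNoise(vec2 p)
{
    return fract(sin(dot(p, vec2(12.9898, 78.233))) * 43758.5453);
}

// Smooth value noise built on the hash; usable in a vertex or
// fragment shader as a drop-in for simple noise() calls.
float valueNoise(vec2 p)
{
    vec2 i = floor(p);
    vec2 f = fract(p);
    vec2 u = f * f * (3.0 - 2.0 * f);   // smoothstep-style interpolation

    float a = hashNoise(i);
    float b = hashNoise(i + vec2(1.0, 0.0));
    float c = hashNoise(i + vec2(0.0, 1.0));
    float d = hashNoise(i + vec2(1.0, 1.0));

    return mix(mix(a, b, u.x), mix(c, d, u.x), u.y);
}
```

It isn't as high-quality as proper Perlin noise, but it stays on the GPU.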

--- On Sun, 23/11/08, Alex Drinkwater <[EMAIL PROTECTED]> wrote:

> From: Alex Drinkwater <[EMAIL PROTECTED]>
> Subject: Re: Render In Image and GLSL noise()
> To: [email protected]
> Date: Sunday, 23 November, 2008, 11:08 PM
> Thanks very much vade and cwright for getting back to me. I
> was under the impression noise() was supposed to run on the
> GPU, but maybe that's not always the case. There
> certainly must have been a time when it didn't, which
> presumably is why it didn't make it into the CIKernel
> subset of the GLSL language. It doesn't seem to run too
> slowly on my MBP/X1600, though.
> 
> I will give Ian Grant's workaround a go. It definitely
> seems like an RII bug to me, which doesn't surprise me
> too much. Thanks for the link, CW.
> 
> Cheers,
> 
> a|x
> 
> 
> --- On Sun, 23/11/08, vade <[EMAIL PROTECTED]> wrote:
> 
> > From: vade <[EMAIL PROTECTED]>
> > Subject: Re: Render In Image and GLSL noise()
> > To: "Christopher Wright" <[EMAIL PROTECTED]>
> > Cc: "Alex Drinkwater" <[EMAIL PROTECTED]>, [email protected]
> > Date: Sunday, 23 November, 2008, 10:38 PM
> > Right, because it *always* renders in software: noise() punts
> > to SW, since noise() doesn't run on normal "non-workstation"
> > class cards. At least, this is what I understood to be the
> > case when I attempted some similar vertex displacement code
> > using noise3() and ran into massive slowdowns. A kind
> > "GL wizard" explained that to me and told me I should not
> > rely on noise() on the GPU, since it will almost always be
> > slow. This may have changed, but I figured it would be the same.
> > 
> > Now, that doesn't explain why there are artifacts, but it
> > may mean there is an issue in the SW renderer? No idea. :) :P
> > 
> > 
> > On Nov 23, 2008, at 5:10 PM, Christopher Wright wrote:
> > 
> > >> Well, I'm not sure about the rendering artifact,
> > >> but noise3() and all noise() functions are typically not
> > >> GPU accelerated, from what I understand, so I think part
> > >> of the code punts to software. Maybe that is a clue?
> > > 
> > > 
> > > Not sure if I support that hypothesis -- rendering in
> > > all-software mode still produces an artifact (a transparent
> > > triangle, no less; see attached picture). No
> > > 'puntage' in this case, since there's nothing
> > > hardware accelerated.
> > > 
> > > <Image-1.png>
> > > 
> > > Probably some kind of CoreImage/GL bug in QC.
> > > It's been reported before on the list; perhaps there was
> > > a more in-depth explanation at that point, or there were
> > > other conditions where it happened, to refine what's
> > > happening.
> > > 
> > > (Here's your link:
> > > http://lists.apple.com/archives/Quartzcomposer-dev/2008/Aug/msg00129.html
> > > credits to Ian Grant for starting the thread)
> > > 
> > > --
> > > [ christopher wright ]
> > > [EMAIL PROTECTED]
> > > http://kineme.net/
> > >
> 
> 
> 



 _______________________________________________
Do not post admin requests to the list. They will be ignored.
Quartzcomposer-dev mailing list      ([email protected])
Help/Unsubscribe/Update your Subscription:
http://lists.apple.com/mailman/options/quartzcomposer-dev/archive%40mail-archive.com

This email sent to [EMAIL PROTECTED]
