Redshift can handle that just fine. I know the term 'out-of-core' has been tossed around a lot, but it bears repeating. When Redshift either 1) reaches its maximum geocache amount (which is currently capped at 4GB for various reasons), or 2) reaches the RAM limits of the card, it sends data out-of-core, which means it starts offloading it from the card to system RAM. For this reason, you still need plenty of system RAM. But what this means is that you can use your GPU to render scenes whose render-time memory footprint far exceeds the amount of RAM on the card. I have a GTX 470 at home with a measly 1.2GB, and I render all kinds of stuff on it that would never be possible with something like Octane (unless they've changed something recently?).

The chief effect of the whole OOC thing is on the final render time: the less frequently Redshift has to go OOC, the less time it will take to render the frame. If you have a card with 4GB or 6GB or higher, RS will have to go OOC far less frequently. And yes, there are plans to raise that geocache cap to a higher number to accommodate cards with larger amounts of addressable RAM.
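For anyone curious what "going OOC" looks like conceptually, here's a rough sketch (not Redshift's actual code; the function and variable names are made up) of the per-block decision an out-of-core renderer has to make: keep the data in VRAM while it fits under a cache cap, or fall back to pinned system RAM that the GPU streams over PCIe. The 4GB constant below just mirrors the geocache cap mentioned above.

// Illustrative sketch only, assuming a simple cap-or-spill policy.
#include <cuda_runtime.h>
#include <cstddef>

// Stand-in for the geocache cap discussed above (4GB).
static const size_t kGeocacheCap = 4ull * 1024 * 1024 * 1024;

// Returns device memory if the block fits in-core, otherwise pinned,
// GPU-mapped system RAM that the card reads over the bus.
void* allocGeometryBlock(size_t bytes, size_t& geocacheUsed, bool& wentOOC)
{
    size_t freeVram = 0, totalVram = 0;
    cudaMemGetInfo(&freeVram, &totalVram);

    wentOOC = false;
    if (geocacheUsed + bytes <= kGeocacheCap && bytes <= freeVram) {
        void* d = nullptr;
        if (cudaMalloc(&d, bytes) == cudaSuccess) {
            geocacheUsed += bytes;
            return d;               // in-core: fast path, no bus traffic at render time
        }
    }

    // Out-of-core fallback: page-locked system RAM the GPU can map and
    // stream on demand. Slower, which is why frames take longer the more
    // often this path is hit, and why plenty of system RAM still matters.
    void* h = nullptr;
    if (cudaHostAlloc(&h, bytes, cudaHostAllocMapped) != cudaSuccess)
        return nullptr;             // not even enough system RAM
    wentOOC = true;
    return h;
}

The real thing is of course far more involved (caching, eviction, streaming textures vs. geometry), but the cap-then-spill idea is the gist of why a 1.2GB card can still get through big scenes.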

-Tim


On 5/21/2014 3:30 AM, Dan Yargici wrote:
All said and done, Redshift is crazy-good though... Just saying.

...and how often are we rendering 100 million polys in commercials? In those instances I'll split my scenes into layers and render them 50x faster (not a joke), thank you very much.

DAN


