Hey Chad,

1. With Redshift it will roll over (out-of-core) to system RAM once the
VRAM is filled, which does slow performance down (though it's still
faster than CPU rendering). As an example, when I was comparing a 780 (3 GB)
to a Titan (6 GB) on smaller scenes they were fairly evenly matched; once we
added something memory-consuming like hair, the Titan was often 2 to 3 times
faster.

2. I'm sure RR7 added something to address per-GPU render jobs; Deadline
certainly can.
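
In case it helps to picture the mechanics, here's a minimal sketch (not
specific to Royal Render or Deadline) of how a per-GPU launcher can pin each
render process to a single card via NVIDIA's CUDA_VISIBLE_DEVICES environment
variable. The renderer binary and scene path below are placeholders, and it
assumes the renderer respects CUDA_VISIBLE_DEVICES rather than its own GPU
preference settings:

import os
import subprocess

RENDERER = "redshiftCmdLine"       # placeholder for whatever binary your farm launches
SCENE = "/jobs/shot010/scene.rs"   # placeholder scene file

def render_on_gpu(gpu_index):
    """Start one render process that can only see a single GPU."""
    env = os.environ.copy()
    # Only the listed device is visible to the child process,
    # so each job gets exactly one card to itself.
    env["CUDA_VISIBLE_DEVICES"] = str(gpu_index)
    return subprocess.Popen([RENDERER, SCENE], env=env)

# One concurrent job per physical GPU on a two-GPU node.
processes = [render_on_gpu(i) for i in range(2)]
for p in processes:
    p.wait()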

Tim - A man's cup of tea is no joke ;)


On 7 August 2015 at 03:18, Chad Briggs <chad_fo...@elementxcreative.com>
wrote:

> We are eyeballing a new farm next year (about that time again) and this
> thread has been great. I do have a few questions, as we are currently using
> Arnold for everything and don't have much experience with Redshift.
>
>    1. Everyone keeps chiming in on how you want the graphics card to have
>    as much memory as possible. What happens when you run out of VRAM during a
>    render? Does it just crap out? Or get really slow?
>    2. Some folks mentioned they assign one render job per GPU rather than
>    per node. Does your farm software have to be set up to handle that? (We
>    use Royal Render.)
>
> Chad Briggs
> Element X
>
>
>
