Thanks for sharing. For multiple GPUs, do you have to manually split the data 
across the GPUs, or does that get taken care of automatically? BTW, for multi-GPU 
work I assume you don't need SLI, and that SLI is just for gaming.
 
On Friday, April 29, 2016 at 4:31:32 PM UTC-4, Chris Rackauckas wrote:
>
> Works great for me. Here's a tutorial where I describe something I did on 
> XSEDE's Comet 
> <http://www.stochasticlifestyle.com/julia-on-the-hpc-with-gpus/> which 
> has Tesla K80s. It works great. I have had code running on GTX970s, 980Tis, 
> K40s, and K80s with no problem.
>
> On Thursday, April 28, 2016 at 1:13:56 PM UTC-7, feza wrote:
>>
>> Hi All, 
>>
>> Has anyone here had experience programming in Julia with Nvidia's 
>> Tesla K80 or K40 GPUs? What was the experience like: is it buggy, or does 
>> Julia handle them without problems?
>>
>