Re: [Computer-go] Using GPUs?

2015-06-27 Thread Daniel Shawul
Yes, it performed well for Hex 8x8; I got a speedup of 60x compared to the CPU
when I tested it about two years ago on a not-so-modern GPU (128 cores, IIRC).
However, the playouts in Hex are much simpler than those of Go.
For instance, I check for termination of the game only once, when the board is
completely full, i.e. after all 64 stones are placed. This lets me get by with
just two bitboards: empty_squares and white_stones_squares.
Oddly enough, you don't even need to store black_stones. Also, there are
no captures to complicate matters.
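The two-bitboard trick can be sketched in a few lines (Python here for brevity; the names are illustrative, not taken from the GpuHex source): a cell that is neither empty nor white must hold a black stone, so black never needs a bitboard of its own.

```python
# Sketch of the two-bitboard representation for 8x8 Hex:
# 64 cells map onto one 64-bit integer per bitboard.
MASK64 = (1 << 64) - 1

def black_stones(empty, white):
    """A cell that is neither empty nor white must be black."""
    return ~(empty | white) & MASK64

def place(empty, white, sq, is_white):
    """Drop a stone on square sq (0..63); stones are never removed."""
    empty &= ~(1 << sq)
    if is_white:
        white |= 1 << sq
    return empty, white

empty, white = MASK64, 0                      # all 64 cells empty
empty, white = place(empty, white, 0, True)   # white stone at cell 0
empty, white = place(empty, white, 9, False)  # black stone at cell 9
assert black_stones(empty, white) == 1 << 9
```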

I have a chess branch in the GitHub repo that I experimented with, but it
didn't work out well. First, chess is not well suited to Monte-Carlo search.
Second, its board representation requires more register/shared memory, so it
is difficult to have one thread do a whole playout by itself. Right now a warp
(32 threads) gets the same position from the MCTS tree, then each thread does
its own playout. There isn't much divergence, since all they do is place
stones until the board is completely full. I suspect the memory limitation
affects Go as well. GPU Go is definitely harder than Hex 8x8, which I
handpicked for good performance, but I believe one should be able to get a
good Go or Checkers engine using MCTS on the GPU.
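A minimal CPU-side sketch of such a playout, under the same assumptions (no captures, the game ends only when the board is full; connection checking to score the finished position is omitted):

```python
import random

MASK64 = (1 << 64) - 1

def playout(empty, white, white_to_move, rng):
    """Fill the board: each turn, drop a stone on a random empty cell.
    There is no capture logic and no per-move termination test -- the
    playout simply runs until all 64 cells are occupied."""
    while empty:
        choices = [sq for sq in range(64) if empty >> sq & 1]
        sq = rng.choice(choices)
        empty &= ~(1 << sq)
        if white_to_move:
            white |= 1 << sq
        white_to_move = not white_to_move
    return empty, white

rng = random.Random(0)
empty, white = playout(MASK64, 0, True, rng)
assert empty == 0                     # board is completely full
assert bin(white).count("1") == 32    # alternating moves -> 32 white stones
```

Because every thread runs this same fixed-length loop, warps stay largely in lockstep, which is why divergence is low in this scheme.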
Daniel


On Fri, Jun 26, 2015 at 6:29 AM, Darren Cook dar...@dcook.org wrote:

  It is not exactly Go, but i have a monte-carlo tree searcher on the GPU
 for
  the game of Hex 8x8
  Here is a github link https://github.com/dshawul/GpuHex

 The engine looks to be just the middle 450 lines of code; quite compact!
 So running playouts on a GPU worked out well?

 Would doing the same thing for go be just a matter of writing more lines
 of code, or needing more memory on the GPU, or is there some more
 fundamental difference between hex and go that makes the latter less
 suitable? (e.g. in hex pieces are only added to the board, whereas in go
 they can be removed and loops can happen - does that make GPU-ing
 algorithms harder?)

 Darren

 ___
 Computer-go mailing list
 Computer-go@computer-go.org
 http://computer-go.org/mailman/listinfo/computer-go


Re: [Computer-go] Using GPUs?

2015-06-27 Thread Nikos Papachristou
On Fri, Jun 26, 2015 at 4:20 PM, Darren Cook dar...@dcook.org wrote:


 Confirmed here:
   http://blogs.nvidia.com/blog/2015/03/17/pascal/

 So, currently they use a 32-bit float, rather than a 64-bit double, but
 will reduce that to 16-bit to get a double speed-up. Assuming they've
 been listening to customers properly, that must mean 16-bit floats are
 good enough for neural nets?

 Apparently you can use 16-bit representations in DNNs with little or no
degradation in accuracy:
http://arxiv.org/pdf/1502.02551.pdf
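The effect of 16-bit storage is easy to inspect in Python, whose standard struct module exposes IEEE half precision as the 'e' format (a sketch of the rounding behaviour only; training frameworks handle this internally):

```python
import struct

def to_fp16(x):
    """Round-trip a Python float through IEEE half precision
    to see what a 16-bit float actually keeps."""
    return struct.unpack('e', struct.pack('e', x))[0]

# Half precision has an 11-bit significand (~3 decimal digits):
assert to_fp16(1.0) == 1.0                # small values survive exactly
assert to_fp16(0.1) != 0.1                # 0.1 is not exactly representable
assert abs(to_fp16(0.1) - 0.1) < 1e-4     # but the error is tiny
assert to_fp16(2049.0) == 2048.0          # above 2048, spacing grows to 2
```

Errors of this size are well below the noise already present in stochastic gradient training, which is consistent with the paper's finding that accuracy barely degrades.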

Nikos

Re: [Computer-go] Using GPUs?

2015-06-26 Thread Nikos Papachristou
Not go related, but you may find this deep learning GPU hardware guide
useful:
https://timdettmers.wordpress.com/2015/03/09/deep-learning-hardware-guide/

As for hardware breakthroughs, Nvidia has announced that its next
generation GPUs (codenamed Pascal) will offer 10x the performance in 2016,
so you might want to wait a little more.

Nikos

On Thu, Jun 25, 2015 at 8:18 PM, Darren Cook dar...@dcook.org wrote:

 I wondered if any of the current go programs are using GPUs.

 If yes, what is good to look for in a GPU? Links to essential reading on
 this topic would be welcome. (*)

 If not, is there some hardware breakthrough being waited for, or some
 algorithmic one?

 Darren

 *: After many years of being happy with built-in graphics, I'm now
 thinking to get a gaming PC, to show off some WebGL data
 visualizations. Assuming the cost is in the same ballpark, I thought I'd
 get one that would allow some scientific computing experiments too.

Re: [Computer-go] Using GPUs?

2015-06-26 Thread Darren Cook
Steven wrote:
 http://arxiv.org/abs/1412.6564 (nvidia gtx titan black)
 http://arxiv.org/abs/1412.3409 (nvidia gtx 780)

Thanks - I had read those papers but hadn't realized the neural nets
were run on GPUs.

Nikos wrote:
 https://timdettmers.wordpress.com/2015/03/09/deep-learning-hardware-guide/

This was very useful, thanks!

 As for hardware breakthroughs, Nvidia has announced that its next
 generation GPUs (codenamed Pascal) will offer 10x the performance in 2016,
 so you might want to wait a little more.

One of the comments, on the above blog, questions that 10x speed-up:

https://timdettmers.wordpress.com/2015/03/09/deep-learning-hardware-guide/comment-page-1/#comment-336

Confirmed here:
  http://blogs.nvidia.com/blog/2015/03/17/pascal/

So, currently they use a 32-bit float, rather than a 64-bit double, but
will reduce that to 16-bit to get a double speed-up. Assuming they've
been listening to customers properly, that must mean 16-bit floats are
good enough for neural nets?

Darren


Re: [Computer-go] Using GPUs?

2015-06-26 Thread Steven Clark
Here are the papers I was thinking of:

http://arxiv.org/abs/1412.6564 (nvidia gtx titan black)
http://arxiv.org/abs/1412.3409 (nvidia gtx 780)

On Fri, Jun 26, 2015 at 2:09 AM, Nikos Papachristou nikp...@gmail.com
wrote:

 Not go related, but you may find this deep learning GPU hardware guide
 useful:
 https://timdettmers.wordpress.com/2015/03/09/deep-learning-hardware-guide/

 As for hardware breakthroughs, Nvidia has announced that its next
 generation GPUs (codenamed Pascal) will offer 10x the performance in 2016,
 so you might want to wait a little more.

 Nikos

 On Thu, Jun 25, 2015 at 8:18 PM, Darren Cook dar...@dcook.org wrote:

 I wondered if any of the current go programs are using GPUs.

 If yes, what is good to look for in a GPU? Links to essential reading on
 this topic would be welcome. (*)

 If not, is there some hardware breakthrough being waited for, or some
 algorithmic one?

 Darren

 *: After many years of being happy with built-in graphics, I'm now
 thinking to get a gaming PC, to show off some WebGL data
 visualizations. Assuming the cost is in the same ballpark, I thought I'd
 get one that would allow some scientific computing experiments too.

Re: [Computer-go] Using GPUs?

2015-06-26 Thread Darren Cook
 It is not exactly Go, but i have a monte-carlo tree searcher on the GPU for
 the game of Hex 8x8
 Here is a github link https://github.com/dshawul/GpuHex

The engine looks to be just the middle 450 lines of code; quite compact!
So running playouts on a GPU worked out well?

Would doing the same thing for go be just a matter of writing more lines
of code, or needing more memory on the GPU, or is there some more
fundamental difference between hex and go that makes the latter less
suitable? (e.g. in hex pieces are only added to the board, whereas in go
they can be removed and loops can happen - does that make GPU-ing
algorithms harder?)

Darren


[Computer-go] Using GPUs?

2015-06-25 Thread Darren Cook
I wondered if any of the current go programs are using GPUs.

If yes, what is good to look for in a GPU? Links to essential reading on
this topic would be welcome. (*)

If not, is there some hardware breakthrough being waited for, or some
algorithmic one?

Darren

*: After many years of being happy with built-in graphics, I'm now
thinking to get a gaming PC, to show off some WebGL data
visualizations. Assuming the cost is in the same ballpark, I thought I'd
get one that would allow some scientific computing experiments too.

Re: [Computer-go] Using GPUs?

2015-06-25 Thread Daniel Shawul
It is not exactly Go, but I have a Monte-Carlo tree searcher on the GPU for
the game of Hex 8x8.
I got about a 60x speedup from it when I tested it about two years ago. I
specifically chose this game because the moves and WDL rules are much simpler
than those of Go.
Here is the GitHub link: https://github.com/dshawul/GpuHex

On Thu, Jun 25, 2015 at 10:18 AM, Darren Cook dar...@dcook.org wrote:

 I wondered if any of the current go programs are using GPUs.

 If yes, what is good to look for in a GPU? Links to essential reading on
 this topic would be welcome. (*)

 If not, is there some hardware breakthrough being waited for, or some
 algorithmic one?

 Darren

 *: After many years of being happy with built-in graphics, I'm now
 thinking to get a gaming PC, to show off some WebGL data
 visualizations. Assuming the cost is in the same ballpark, I thought I'd
 get one that would allow some scientific computing experiments too.

Re: [Computer-go] Using GPUs?

2015-06-25 Thread Steven Clark
I can't speak to current Go programs, but there's lots of exciting work going
on in machine learning / deep neural networks, most of which uses GPUs
heavily. I know some research has been done on convolutional neural networks
for Go; I don't have any links handy at the moment, though.

I'd recommend a recent NVIDIA GPU (for CUDA support), say a GTX 780 or 980.
Either of these would be fine for your visualization purposes as well.

On Thu, Jun 25, 2015 at 1:18 PM, Darren Cook dar...@dcook.org wrote:

 I wondered if any of the current go programs are using GPUs.

 If yes, what is good to look for in a GPU? Links to essential reading on
 this topic would be welcome. (*)

 If not, is there some hardware breakthrough being waited for, or some
 algorithmic one?

 Darren

 *: After many years of being happy with built-in graphics, I'm now
 thinking to get a gaming PC, to show off some WebGL data
 visualizations. Assuming the cost is in the same ballpark, I thought I'd
 get one that would allow some scientific computing experiments too.