Hi Mark!
How's your rally driving going? ;-)
On Sun, Dec 11, 2011 at 4:45 AM, Mark Higgins migg...@gmail.com wrote:
I notice in gnubg and other neural networks the probability of gammon gets
its own output node, alongside the probability of (any kind of) win.
Doesn't this sometimes mean the network's gammon estimate can come out higher than its win estimate?
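For concreteness, here is a minimal sketch of the inconsistency being asked about. The five-output layout and the independent sigmoid activations are illustrative assumptions, not gnubg's actual code:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Hypothetical raw output-layer activations for a gnubg-style net:
    # (win, gammon win, backgammon win, gammon loss, backgammon loss).
    # Each output is squashed independently, so nothing ties them together.
    raw = np.array([0.1, 0.4, -2.0, -1.5, -3.0])
    p = sigmoid(raw)

    p_win, p_gammon_win = p[0], p[1]
    # Here p_gammon_win (~0.60) exceeds p_win (~0.53), which is
    # impossible in a real game: every gammon win is also a win.
    print(p_win, p_gammon_win)

Nothing in the architecture prevents this; it has to be handled by clamping or by how the outputs are interpreted.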
Softmax activation looks pretty interesting! I guess in that case you'd need to
change the meaning of the outputs to ( prob of single win, prob of single loss,
prob of gammon win, prob of gammon loss, prob of bg win, prob of bg loss );
then they all have to sum to 1, but there's no other restriction to worry about.
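A minimal sketch of that parametrisation (the logit values are made up; only the structure matters): six outputs pass through a softmax, so they sum to 1 by construction, and the overall win probability falls out as a sum:

    import numpy as np

    def softmax(z):
        e = np.exp(z - z.max())  # shift for numerical stability
        return e / e.sum()

    # Illustrative logits for the six mutually exclusive outcomes:
    # (single win, gammon win, bg win, single loss, gammon loss, bg loss)
    logits = np.array([1.2, 0.1, -2.0, 0.5, -0.5, -3.0])
    p = softmax(logits)
    assert abs(p.sum() - 1.0) < 1e-12  # sums to 1 by construction

    p_win = p[:3].sum()          # any kind of win
    p_gammon_win = p[1:3].sum()  # gammon or backgammon win
    # Consistency is now automatic: p_gammon_win <= p_win, always.
    print(p_win, p_gammon_win)

Because the six outcomes partition the possibilities, the gammon probability can never exceed the win probability under this scheme.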
I tried a little experiment on this: a 10-hidden-node network with a single
probability-of-win output, but two setups. The first doesn't have a "whose turn
is it" input and doesn't add any symmetry constraints. The second has the extra
inputs for the turn and imposes the symmetry constraint I described.
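Presumably the symmetry meant here is the standard one: mirror the board to the opponent's point of view and toggle the turn input, and the win probability should map to 1 - p. A minimal sketch of checking that constraint, with a hypothetical board encoding and a hypothetical net callable:

    import numpy as np

    def flip(board):
        # Mirror a board vector to the opponent's point of view.
        # Hypothetical encoding: positive entries are my checkers,
        # negative entries the opponent's, with point order reversed.
        return -board[::-1]

    def check_symmetry(net, board, turn, tol=1e-6):
        # For a net with a turn input, the constraint reads:
        #   p(board, turn) == 1 - p(flip(board), 1 - turn)
        p = net(board, turn)
        p_mirror = net(flip(board), 1 - turn)
        return abs(p - (1.0 - p_mirror)) < tol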
My experience tells me that 100,000 trials may not be sufficient.
With today's computing power it should be easy to do at least a
couple of million.
-Joseph
On 12 December 2011 11:22, Mark Higgins migg...@gmail.com wrote:
I tried a little experiment on this: a 10-hidden-node network with a single probability-of-win output [...]
Thx - I'll run it longer and with more hidden nodes and see what happens.
On Dec 11, 2011, at 5:44 PM, Joseph Heled jhe...@gmail.com wrote:
My experience tells me that 100,000 trials may not be sufficient.
With today's computing power it should be easy to do at least a
couple of million.