See below:
On Tue, Mar 22, 2016 at 5:21 PM, Lukas van de Wiel <
lukas.drinkt.t...@gmail.com> wrote:
> It would weaken AlphaGo, because there is less training material in the
> form of high-dan games to train the policy network.
>
Maybe not a concern. There has been a suggestion that AlphaGo be
A conv net should be robust. In the image-processing domain, convolutional
layers are feature detectors (shape detectors, in the case of go) that are
invariant to translation (moving a shape left, right, up, or down along the
board). Enlarging the board wouldn't put the bot at a disadvantage in
evaluating local positions.
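The translation-invariance claim is easy to demonstrate. Here is a minimal sketch (plain NumPy, with a toy 3x3 diagonal-stone detector of my own invention, not anything from AlphaGo's actual network) showing that a convolutional detector's response to a local shape is identical wherever the shape sits, and on any board size:

```python
# Hand-rolled 2D cross-correlation (no padding) to show that a
# convolutional "shape detector" fires identically on a go pattern
# wherever it appears, and regardless of board size.
import numpy as np

def conv2d(board, kernel):
    """Valid-mode cross-correlation of a 2D board with a small kernel."""
    kh, kw = kernel.shape
    h, w = board.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(h - kh + 1):
        for j in range(w - kw + 1):
            out[i, j] = np.sum(board[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 3x3 detector for a diagonal of stones (1 = stone, 0 = empty).
kernel = np.eye(3)

def place_pattern(size, r, c):
    """Empty size x size board with the diagonal pattern at (r, c)."""
    board = np.zeros((size, size))
    board[r:r + 3, c:c + 3] = np.eye(3)
    return board

# Same pattern at two positions on a 19x19 board...
resp_a = conv2d(place_pattern(19, 2, 2), kernel)
resp_b = conv2d(place_pattern(19, 10, 14), kernel)
# ...and on an enlarged 25x25 board.
resp_c = conv2d(place_pattern(25, 20, 5), kernel)

# Peak response is 3.0 in every case, at exactly the placed location:
# the detector is translation-invariant and indifferent to board size.
print(resp_a.max(), resp_b.max(), resp_c.max())  # 3.0 3.0 3.0
```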
On 3/22/2016 5:21 PM, Lukas van de Wiel wrote:
It would weaken AlphaGo, because there is less training material in the
form of high-dan games to train the policy network.
It would also reduce the skill of a human opponent, because (s)he would
have less experience on a larger board, just as AlphaGo would.
It would be fun to see which can adapt better!
On 3/22/2016 11:25 AM, Tom M wrote:
I suspect that even with a similarly large training sample for
initialization that AlphaGo would suffer a major reduction in apparent
skill level.
I think a human would also.
The CNN would require many more layers of convolution;
the valuation of positions would be much more uncertain.
Ko is what makes this game difficult, from a theoretical point of view.
I suspect ko+unresolved groups is where it's at.
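The point about needing many more layers of convolution can be made concrete with receptive-field arithmetic. The 3x3 kernel size and the layer counts below are illustrative assumptions, not AlphaGo's actual architecture: stacked 3x3 convolutions widen a unit's view by two intersections per layer, so the depth required before any single unit can see the whole board grows linearly with board size.

```python
# Back-of-the-envelope receptive-field arithmetic for stacked 3x3
# convolutions (stride 1): after L layers, one output unit sees a
# (2L + 1) x (2L + 1) patch of the board.
def receptive_field(layers, kernel=3):
    """Side length of the square one output unit can see."""
    return 1 + layers * (kernel - 1)

def layers_to_cover(board_size, kernel=3):
    """Fewest stacked layers before one unit sees the whole board."""
    layers = 0
    while receptive_field(layers, kernel) < board_size:
        layers += 1
    return layers

for n in (19, 25, 37):
    print(n, layers_to_cover(n))
# 19 -> 9 layers, 25 -> 12, 37 -> 18: depth must grow with the board
# before any unit can weigh whole-board context such as distant ko fights.
```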
I suspect that even with a similarly large training sample for
initialization that AlphaGo would suffer a major reduction in apparent
skill level. The CNN would require many more layers of convolution;
the valuation of positions would be much more uncertain; play in the
corner, edges, and center w