On Wed, Apr 30, 2008 at 8:56 PM, Russell Wallace <[EMAIL PROTECTED]> wrote:
>
> Well yes, like I said, we can recap the fact that logic gates are a
> special case of neurons, but this fact is of limited use:
>
> 1) _You_ can set up circuits in this way, but the network itself can't,
>
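For concreteness, here is a minimal sketch (in Python, not from the original post) of the "logic gates are a special case of neurons" point: a single threshold neuron with hand-picked weights acts as a NAND gate, and NAND is universal for combinational circuits. The weights (-2, -2) and bias 3 are illustrative values, not anything canonical.

    # One threshold neuron wired by hand to compute NAND.
    # Weights and bias are illustrative, hand-picked values.
    def nand_neuron(x1, x2):
        w1, w2, bias = -2.0, -2.0, 3.0
        activation = w1 * x1 + w2 * x2 + bias
        return 1 if activation > 0 else 0  # step activation

    # NAND is universal, so any combinational circuit can be assembled
    # from copies of this neuron by hand -- which is point 1) above.
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", nand_neuron(a, b))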
I believe it can, given the right learning dynamics and some rather
simple conditions on teaching strategy. Starting from vaguer concepts,
it can refine itself until it starts giving reliable intermediate
calculations, which can then be stacked up and connected into more and
more elaborate circuits by the same 'gate-learning' process.

> 2) Even then, it only works for purely combinatorial problems and
> breaks down once e.g. recursion comes into play - how would you go
> about implementing Quicksort in this way? (If the answer starts with
> "well, von Neumann machines are made of logic gates...", I'll consider
> my case made :))
>

That is why you can't learn to multiply numbers in your head like a
calculator (or maybe it's possible with a sufficient understanding of
the learning dynamics, but it was never implemented...). You
unfortunately don't have *memory*, so it's not a von Neumann machine;
it's more limited in its abilities. You can configure the circuit, but
the circuit can't be connected to a reliable tape. You can only sort of
emulate the tape for some of the circuits by using other circuits.
Overall, processing is composed of reactive transitions. Is this what
you meant by 'combinatorial problems'?

--
Vladimir Nesov
[EMAIL PROTECTED]
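A minimal sketch of the 'gate-learning' idea above, assuming the classic perceptron update rule stands in for the unspecified learning dynamics; the learning rate, epoch count, and the name train_nand are illustrative choices, not from the thread. The network starts from blank (zero) weights and refines itself into a reliable NAND gate from the truth table alone.

    import itertools

    # Perceptron update rule learning NAND from its truth table.
    # Learning rate and epoch count are arbitrary illustrative choices.
    def train_nand(lr=0.1, epochs=100):
        w1 = w2 = b = 0.0
        data = [((0, 0), 1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
        for _ in range(epochs):
            for (x1, x2), target in data:
                out = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
                err = target - out  # 0 once the gate is already reliable
                w1 += lr * err * x1
                w2 += lr * err * x2
                b += lr * err
        return w1, w2, b

    w1, w2, b = train_nand()
    for x1, x2 in itertools.product((0, 1), repeat=2):
        print(x1, x2, "->", 1 if w1 * x1 + w2 * x2 + b > 0 else 0)

Once such learned gates become reliable, they can be stacked into larger combinational circuits by the same process; what no amount of stacking provides is the reliable tape, which is exactly the limitation discussed above.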