Measuring deltas would be one form of feedback loop. It still would not explain 
the assumption that Delta+1 = learning. Elasticity perhaps, but not learning in 
the sense of an increase in base knowledge, in knowledge competency.

Unless some want to conclude that learning = aggregate recording and recall, 
which is the current view of learning, and one that is proving rather useless.

The "modern" view does not cater for many aspects of knowledge, including the 
generalized effect of knowledge obsolescence. I.e., what if Delta+1 = minus 
232? Who, or what, would be able to tell reliably?

________________________________
From: Matt Mahoney <mattmahone...@gmail.com>
Sent: Friday, 21 June 2019 01:32
To: AGI
Subject: Re: [agi] Re: A mathematics of conceptual relations?

I disagree. By what mechanism would neurons representing feet and meters 
connect, but not kilograms and liters?

Neurons form connections by Hebb's rule. Neurons representing words form 
connections when they appear close together or in the same context.
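As a toy sketch of this co-occurrence idea (the window size, corpus, and function name here are invented for illustration, not anything Matt specified):

```python
from collections import defaultdict


def hebbian_cooccurrence(tokens, window=3):
    """Strengthen a 'connection' between any two distinct words that appear
    within `window` positions of each other -- a crude stand-in for Hebb's
    rule: units that fire together wire together."""
    weights = defaultdict(int)
    for i, w in enumerate(tokens):
        for v in tokens[i + 1 : i + window]:
            if v != w:
                weights[frozenset((w, v))] += 1
    return weights


tokens = "six feet is about two meters two meters is six feet".split()
w = hebbian_cooccurrence(tokens)
# In this toy corpus "six"/"feet" and "two"/"meters" co-occur and get linked;
# "kilograms"/"liters" never appear together, so no connection forms.
```

The point of the sketch: the mechanism is purely statistical proximity, so whatever units happen to co-occur in experience are what get wired together.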

On Thu, Jun 20, 2019, 4:14 PM Jim Bromer <jimbro...@gmail.com> wrote:
Steve said: I strongly suspect biological synapses are tagged in some way to 
only connect with other synapses carrying dimensionally compatible information.

I totally agree. So one thing that I am wondering about is whether that can be 
computed using a novel kind of mathematics? Intuitively, I would say absolutely.

A truly innovative AI mathematical system would not 'solve' every AI problem 
but could it be developed so that it helped speed up and direct an initial 
analysis of input? Intuitively I am pretty sure it can be done, but I am not at 
all sure that I could come up with a method.
Jim Bromer


On Thu, Jun 20, 2019 at 1:13 PM Steve Richfield <steve.richfi...@gmail.com> wrote:
Jim,

Many systems mix these cases. For example, adding probabilities to compute a 
probability doesn't make sense; but adding counts of poor significance, which 
can look a lot like adding probabilities, can make sense as a way to produce a 
count.

Where this gets confusing is in sensory fusion. Present practice is usually 
some sort of weighted summation, when CAREFUL analysis would probably involve 
various nonlinearities to convert inputs to a canonical form that makes sense 
to add, followed by another nonlinearity to convert the sum to suitable output 
units.
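A minimal sketch of such a pipeline, assuming log-odds as the canonical form (my choice for illustration; the post doesn't specify which nonlinearity is appropriate):

```python
import math


def to_log_odds(p):
    """Input nonlinearity: convert a probability estimate to log-odds,
    a canonical form in which independent evidence can legitimately be added."""
    return math.log(p / (1.0 - p))


def from_log_odds(x):
    """Output nonlinearity: convert the summed evidence back to a probability."""
    return 1.0 / (1.0 + math.exp(-x))


def fuse(estimates):
    """Fuse independent probability estimates: nonlinearity in,
    summation in canonical units, nonlinearity out."""
    return from_log_odds(sum(to_log_odds(p) for p in estimates))


# Two weak independent 0.6 estimates fuse to roughly 0.69 -- stronger than
# either input, which a naive weighted average of 0.6 and 0.6 can never give.
fused = fuse([0.6, 0.6])
```

The contrast with a weighted sum is the whole point: averaging two 0.6 readings returns 0.6, while summing in canonical units correctly accumulates the evidence.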

I strongly suspect biological synapses are tagged in some way to only connect 
with other synapses carrying dimensionally compatible information.

Everyone seems to focus on values being computed, when it appears that it is 
the dimensionality that restricts learning to potentially rational processes.

Steve

On Thu, Jun 20, 2019, 9:14 AM Jim Bromer <jimbro...@gmail.com> wrote:
I originally thought about novel computational rules. Arithmetic is not 
reversible, because the input operands cannot be recovered from the 
computational result. That makes it a type of compression. Furthermore, it 
uses a limited set of rules. That makes it a super compression method.
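The many-to-one character of addition can be seen directly by counting preimages of a sum (a small demonstration, not anything from the thread itself):

```python
# Many distinct input pairs collapse to the same sum, so the operation is
# not reversible: (a, b) cannot be recovered from a + b alone.
preimages = {}
for a in range(10):
    for b in range(10):
        preimages.setdefault(a + b, []).append((a, b))

# The sum 9 has ten distinct single-digit preimages, (0, 9) through (9, 0),
# while the information about which pair produced it is discarded.
```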

On Thu, Jun 20, 2019, 12:08 PM Jim Bromer <jimbro...@gmail.com> wrote:
I guess I understand what you mean.

On Thu, Jun 20, 2019, 12:07 PM Jim Bromer <jimbro...@gmail.com> wrote:
I think your use of metaphors, especially metaphors that were intended to 
emphasize your thoughts through exaggeration, may have confused me. Would you 
explain your last post Steve?

On Thu, Jun 20, 2019, 12:02 PM Steve Richfield <steve.richfi...@gmail.com> wrote:
Too much responding without sufficient thought. After a week of thought 
regarding earlier postings on this thread...

Genuine computation involves manipulating numerically expressible value (e.g. 
0.62), dimensionality (e.g. probability), and significance (e.g. +/- 0.1). 
Outputs of biological neurons appear to fit this model.
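A minimal sketch of a quantity carrying all three ingredients, which refuses to combine across incompatible dimensionalities (the class, the string tags, and the quadrature rule for significance are my assumptions, chosen only to make the idea concrete):

```python
import math
from dataclasses import dataclass


@dataclass
class Quantity:
    """A genuinely computable quantity: a numeric value, a dimensionality
    tag, and a significance (here modeled as +/- uncertainty, combined in
    quadrature on addition)."""
    value: float
    dim: str      # e.g. "probability", "meters", "kilograms"
    sigma: float  # significance, e.g. +/- 0.1

    def __add__(self, other):
        # Dimensionality, not value, is what gates the computation.
        if self.dim != other.dim:
            raise TypeError(f"cannot add {self.dim} to {other.dim}")
        return Quantity(self.value + other.value, self.dim,
                        math.hypot(self.sigma, other.sigma))


length = Quantity(0.62, "meters", 0.1) + Quantity(1.00, "meters", 0.1)
# Quantity(1.38, "kilograms", 0.1) + length would raise TypeError.
```

This mirrors the point made further down: the type check, not the arithmetic on values, is what restricts the system to potentially rational operations.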

HOWEVER, much of AI does NOT fit this model - yet still appears to "work". If 
this is useful then use it, but there usually is no path to better solutions. 
You can't directly understand, optimize, adapt, or debug, because it is 
difficult/impossible to wrap your brain around quantities representing nothing.

Manipulations that don't fit this model are numerology, not mathematics, akin 
to doing astrology instead of astronomy.

It seems perfectly obvious to me that AGI, when it comes into being, will 
involve NO numerological faux "computation".

Sure, learning could involve developing entirely new computation, but it would 
have to perform potentially valid computations on its inputs. For example, 
adding probabilities is NOT valid, but ORing them could be valid.
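The distinction can be made concrete. For independent events, ORing probabilities means taking the complement of neither event occurring, which is always a valid probability; naive addition is not (a small sketch of the standard formula, assuming independence):

```python
def p_or(a, b):
    """Probability that at least one of two independent events occurs:
    1 - P(neither occurs). Always stays in [0, 1], unlike naive addition."""
    return 1.0 - (1.0 - a) * (1.0 - b)


# Naive addition: 0.7 + 0.7 = 1.4, which is not a probability at all.
# ORing:          p_or(0.7, 0.7) is about 0.91, a valid probability.
```

This is the same dimensional-validity point as above: the operation respects what the quantities mean, rather than just manipulating their numeric values.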

Steve

On Thu, Jun 20, 2019, 8:22 AM Alan Grimes via AGI <agi@agi.topicbox.com> wrote:
It has the basic structure and organization of a conscious agent,
obviously it lacks the other ingredients required to produce a complete
mind.

Stefan Reich via AGI wrote:
> Prednet develops consciousness?
>
> On Wed, Jun 19, 2019, 06:51 Alan Grimes via AGI 
> <agi@agi.topicbox.com> wrote:
>
>     Yay, it seems peeps are finally ready to talk about this!! =P
>
>
>     Let's see if I can fool anyone into thinking I'm actually making
>     sense by
>     starting with a first principles approach...
>
>


--
Please report bounces from this address to a...@numentics.com

Powers are not rights.


------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T395236743964cb4b-M84a28f9027d90b5a0f9e6d01
Delivery options: https://agi.topicbox.com/groups/agi/subscription
