> From: Ed Porter [mailto:[EMAIL PROTECTED]
> >> ED PORTER ============>>
> > I am not an expert at computational efficiency, but I think graph
> > structures like semantic nets are probably close to as efficient as
> > possible given the type of connectionism they are representing and the
> > type of computing that is to be done on them, which includes,
> > importantly, selective spreading activation.
> 
> 
> > JOHN ROSE  ============>>
> Uhm, have you checked this out? Is there any evidence of this? It would
> make it easier if this were in fact the case.
> 
> ED PORTER ============>>
> No, I have no evidence other than that I do not know of any structure
> that is more appropriate than graph structures --- which are largely
> pointer-based structures --- for efficiently representing information
> that has relatively irregular, highly sparse connections in an extremely
> high-dimensional space.
> 
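For concreteness, here is roughly what I picture when you say "pointer-based": a node that stores only its actual outgoing links, so storage grows with the number of connections rather than with the dimensionality of the space. This is just my own toy sketch in Python; the Node class, the concept names, and the weights are made up for illustration, not anything from your system:

# Toy sketch (my assumption of what "pointer-based" means here): each node
# keeps a sparse adjacency list of (target, weight) links, so memory scales
# with actual connections, not with the dimensionality of the concept space.

class Node:
    def __init__(self, concept):
        self.concept = concept   # label for this semantic-net node
        self.links = []          # sparse list of (target Node, weight) pairs

    def connect(self, target, weight):
        self.links.append((target, weight))

# A tiny, highly sparse net with made-up concepts and weights.
dog = Node("dog")
animal = Node("animal")
bark = Node("bark")
dog.connect(animal, 0.9)   # dog -> animal (is-a)
dog.connect(bark, 0.7)     # dog -> bark (can)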


The activation dynamics that occur in this graph: have you thought out the
equations that describe them? That is where efficiency could be gained, and
where a simulation of the simulation could be used to zero in on optimal
flow-network efficiencies and capabilities. Unless you define precisely what
the graph is made of and get some exact metrics on the processing
granularity, you don't know much about what will really happen in that
sparsely connected denseness, so you can't fully understand the resultant
behavior or discover further requirements. It's difficult with rich
connectionism because a mathematical model leaves similar questions
unanswered...
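
To make the question concrete, here is the kind of update rule and toy simulation I have in mind: a decayed, thresholded spread step over a sparse adjacency structure, something like a_i(t+1) = decay * a_i(t) + sum_j w_ji * a_j(t), restricted to source nodes above a threshold. The decay constant, threshold, weights, and function names below are all my own assumptions, just to show where the metrics would have to be pinned down:

# Toy spreading-activation step (my own made-up rule and parameters, not
# anything specified in this thread): each update decays existing activation
# and propagates weighted activation only from nodes above a threshold,
# which is the "selective" part of selective spreading activation.

def spread_step(activations, edges, decay=0.5, threshold=0.2):
    """One synchronous update over a sparse adjacency structure.

    activations: dict mapping node label -> current activation level
    edges: dict mapping node label -> list of (target label, weight) pairs
    """
    new_act = {n: decay * a for n, a in activations.items()}
    for src, a in activations.items():
        if a < threshold:              # sub-threshold nodes do not spread
            continue
        for target, w in edges.get(src, []):
            new_act[target] = new_act.get(target, 0.0) + w * a
    return new_act

# Run a few steps on a tiny made-up net and watch activation flow.
edges = {"dog": [("animal", 0.9), ("bark", 0.7)], "animal": [("alive", 0.8)]}
act = {"dog": 1.0, "animal": 0.0, "bark": 0.0, "alive": 0.0}
for step in range(3):
    act = spread_step(act, edges)
    print(step, act)

Even a toy like this makes it obvious how sensitive the resultant behavior is to the decay and threshold choices, which is exactly what a simulation of the simulation would have to explore.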



> > ED PORTER ============>>
> Don't you sense that at some moments your consciousness feels richer
> than at other moments?  Many people who have had sudden close brushes
> with death have reported feeling as if suddenly much of their life were
> passing before their eyes.  This results from extreme emotional arousal
> that causes the brain to operate at many times what it could on any
> sustainable basis.
> 


This is probably a resource adaptation. It'd be nice if our consciousness
were always elevated, but eventually other capacities suffer.

 
> 
> ED PORTER ============>> I think consciousness is highly applicable to
> AGI's, if we want them to think like humans --- because I think
> consciousness plays a key role in human thought.  It is the amphitheater
> in
> which our thoughts are spoken and listened to.
> 


It is highly applicable, but I still don't know if it is required for
general intelligence. Consciousness brings so much baggage, yet it seems
it can amplify intelligence in some ways. Perhaps there are aspects of
consciousness that improve intelligence but don't carry the baggage.

John


