Re: [agi] Relevance of SE in AGI

2008-12-22 Thread Richard Loosemore

Valentina Poletti wrote:
I have a question for you AGIers... from your experience as well as from 
your background, how relevant do you think software engineering is in 
developing AI software and, in particular, AGI software? Just wondering... 
does software verification, as well as correctness proving, serve any use 
in this field? Or is this something used just for NASA and critical 
applications?


1) Software engineering (if we take that to mean the conventional 
repertoire of techniques taught as SE) is relevant to any project that 
gets above a certain size, but it is less important when the project 
is much smaller, serves a more exploratory function, or has a constantly 
changing design.  To this extent I agree with Pei's comments.


2) If you are looking beyond the idea of simply grabbing some SE 
techniques off the shelf, and are instead asking whether SE can have an 
impact on AGI, then the answer is a dramatic "Yes!".  Why?  Because 
tools determine the way that we *can* think about things.  Tools shape 
our thoughts.  They can sometimes enable us to think in new ways that 
were simply not possible before the tools were invented.


I decided a long time ago that if cognitive scientists had easy-to-use 
tools that enabled them to construct realistic components of 
thinking systems, their entire style of explanation would be 
revolutionized.  Right now, cog sci people cannot afford the time to be 
both cog sci experts *and* sophisticated software developers, so they 
have to make do with programming that is, by and large, trivially 
simple.  This determines the kinds of models and explanations they can 
come up with.  (Ditto in spades for the neuroscientists, by the way.)


So, the more global answer to your question is that nothing could be 
more important for AGI than software engineering.


The problem is that the kind of software engineering we are talking 
about is not a matter of grabbing SE components off the shelf, but of 
asking what the needs of cognitive scientists and AGIers might be, and 
then inventing new techniques and tools that will give these people the 
ability to think about intelligent systems in new ways.


That is why I am working on Safaire.





Richard Loosemore




Re: [agi] Relevance of SE in AGI

2008-12-22 Thread Ben Goertzel
Well, we have attempted to use sound software engineering principles to
architect the OpenCog framework, with a view toward making it usable for
prototyping speculative AI ideas and, ultimately, for building scalable,
robust, mature AGI systems as well.

But we are fairly confident in our overall architecture for this system,
because there have been a number of predecessor systems based on similar
principles, which we implemented and learned a lot from ...

If one has a new AGI idea and wants to start experimenting with it, SE is
basically a secondary matter ... the point is to explore the algorithms and
ideas by whatever means is least time-wasting and frustrating...

OTOH, if one has an AGI idea that's already been fleshed out a fair bit and
one is ready to try to use it as the basis for a scalable, extensible
system, SE is more worth paying attention to...

Premature attention to engineering when one should be focusing on science is
a risk, but so is ignoring engineering when one wants to build a scalable,
extensible system...

ben g


-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
b...@goertzel.org

I intend to live forever, or die trying.
-- Groucho Marx





RE: [agi] Relevance of SE in AGI

2008-12-22 Thread John G. Rose
I've been experimenting with extending OOP to potentially implement
functionality that could make a particular AGI design easier to build.
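
For concreteness, here is a purely hypothetical sketch (Python, with names I
invented; it is not a description of any actual design) of the general flavor
of "extending OOP": a metaclass quietly gives every ordinary class an extra,
architecture-level capability, without changing how the class author writes
code.

    # Purely hypothetical illustration: one way "extending OOP" can look in
    # practice. A metaclass transparently adds an architecture-level
    # capability (here, automatic registration of every concept class) on
    # top of ordinary object-oriented code.

    class ConceptMeta(type):
        """Metaclass that records every class built with it in a shared registry."""
        registry = {}

        def __new__(mcls, name, bases, namespace):
            cls = super().__new__(mcls, name, bases, namespace)
            ConceptMeta.registry[name] = cls
            return cls


    class Concept(metaclass=ConceptMeta):
        """Base class for 'concepts'; subclasses get registered for free."""
        def __init__(self, label):
            self.label = label
            self.activation = 0.0

        def activate(self, amount):
            self.activation += amount
            return self.activation


    class Goal(Concept):
        pass


    if __name__ == "__main__":
        g = Goal("stay-charged")
        g.activate(0.5)
        print(sorted(ConceptMeta.registry))   # ['Concept', 'Goal']
        print(g.activation)                   # 0.5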

 

The problem with SE is that it brings along much baggage that can totally
obscure AGI thinking.

 

Many AGI and AI people are automatically top-of-the-line software
engineers.  So the type of SE needed for AGI is different from typical SE,
and the challenges are different.

 

I think, though, that proto-AGIs will emerge from hybrid SE/AGI organizations,
either independent or embedded within larger orgs.  Some AGI principles seem
so tantalizingly close to a potential software implementation that you can
almost taste it... though these typically turn out to be mirages...

 

John

 

 







Re: [agi] Relevance of SE in AGI

2008-12-22 Thread Richard Loosemore

Ben Goertzel wrote:


Premature attention to engineering when one should be focusing on 
science is a risk, but so is ignoring engineering when one wants to 
build a scalable, extensible system...


I think you missed my point, but no matter.

My point was that premature attention to engineering is absolutely 
vital in a field such as the cognitive science approach to AGI. 
Cognitive scientists simply do not have the time to be experts in 
cognitive science AND software engineers at the same time.  For that 
reason, their models, and the way they think about theoretical models, 
are severely constrained by their weak ability to build software systems.


In this case, the science is being crippled by the lack of tools, so 
there is no such thing as premature attention to engineering.




Richard Loosemore

Re: [agi] Relevance of SE in AGI

2008-12-21 Thread Pei Wang
At the current time, almost all AGI projects are still working on
conceptual design issues, and the systems developed are just
prototypes, so software engineering is not that relevant. In the
future, when most of the theoretical problems have been solved,
and especially when it becomes clear that one approach is going to lead
us to AGI, software engineering will become really relevant.

The existing AI applications are not that different from ordinary
computer applications, for which software engineering is necessary,
but there isn't much intelligence in them.

BTW, in a sense software engineering is just the opposite of
artificial intelligence: while the latter tries to make machines
work as flexibly as humans, the former tries to make humans
(programmers) work as rigidly as machines. ;-)

Pei





Re: [agi] Relevance of SE in AGI

2008-12-21 Thread Philip Hunt
2008/12/21 Valentina Poletti jamwa...@gmail.com:
 I have a question for you AGIers.. from your experience as well as from your
 background, how relevant do you think software engineering is in developing
 AI software and, in particular AGI software?

If by software engineering you mean techniques for writing software
better, then software engineering is relevant to all production of
software, whether for AI or anything else.

AI can be thought of as a particularly hard field of software development.

 Just wondering.. does software
 verification as well as correctness proving serve any use in this field?

I've never used formal proofs of correctness of software, so I can't
comment. I use software testing (unit tests) on pretty much all
non-trivial software that I write; I find doing so makes things
much easier.
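
As a minimal sketch of this kind of unit testing (Python's built-in
unittest module; the normalise() function below is invented purely for
illustration):

    # Minimal sketch of unit testing with Python's built-in unittest module.
    # The function under test, normalise(), is invented for illustration.
    import unittest


    def normalise(weights):
        """Scale a list of non-negative weights so that they sum to 1.0."""
        total = sum(weights)
        if total == 0:
            raise ValueError("weights must not all be zero")
        return [w / total for w in weights]


    class TestNormalise(unittest.TestCase):
        def test_sums_to_one(self):
            result = normalise([2.0, 2.0, 4.0])
            self.assertEqual(result, [0.25, 0.25, 0.5])
            self.assertAlmostEqual(sum(result), 1.0)

        def test_rejects_all_zero_weights(self):
            with self.assertRaises(ValueError):
                normalise([0.0, 0.0])


    if __name__ == "__main__":
        unittest.main()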

-- 
Philip Hunt, cabala...@googlemail.com
Please avoid sending me Word or PowerPoint attachments.
See http://www.gnu.org/philosophy/no-word-attachments.html




Re: [agi] Relevance of SE in AGI

2008-12-21 Thread Steve Richfield
Valentina,

Having written http://www.DrEliza.com, several NN programs, and a LOT of
financial applications, and holding a CDP (widely recognized in financial
programming circles), here are my comments.

The real world is a little different from the theoretical world of CS, in
that people want results rather than proofs. However, especially in the
financial world, errors CAN be expensive. Hence, the usual approaches
involve extensive internal checking (lots of Assert statements, etc.),
careful code reviews (which often uncover errors that testing just can't
catch, because a tester may not think of all the ways a piece of code
might be stressed), and code-coverage analysis to identify what has NOT been
exercised/exorcised.
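
As a rough sketch of that assertion-heavy style of internal checking
(Python here for brevity; the blend() function and its invariants are my
own example, not from any real system):

    # Sketch of "lots of Assert statements" as internal checking. The blend()
    # function and its invariants are invented for illustration. Running with
    # `python -O` removes the asserts, which is loosely analogous to
    # recompiling with error checking turned off once the code is wrung out.

    def blend(a, b, weight):
        """Linear blend of two activation values; all inputs must lie in [0, 1]."""
        # Preconditions: catch garbage at the point where it enters.
        assert 0.0 <= weight <= 1.0, f"weight out of range: {weight}"
        assert 0.0 <= a <= 1.0 and 0.0 <= b <= 1.0, "activations must be in [0, 1]"

        result = (1.0 - weight) * a + weight * b

        # Postcondition: the output must respect the same bounds.
        assert 0.0 <= result <= 1.0, f"blend escaped [0, 1]: {result}"
        return result

A coverage tool (coverage.py, for instance) can then report which of these
checks a test run never actually exercised.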

I write AI software pretty much the same way that I have written financial
software.

Note that really good internal checking can almost replace early testing,
because as soon as something produces garbage, it will almost immediately
get caught. Hence, just write it and throw it into the rest of the code, and
let its environment test it. Initially, it might contain temporary code to
display its results, which will soon get yanked when everything looks OK.

Finally, really good error handling is an absolute MUST, because no such
complex application is ever completely wrung out. If it isn't fail-soft,
then it will probably never make it as a product. This pretty much
excludes C/C++ from consideration, but still leaves C# in the running.
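
A small sketch of what fail-soft can mean in practice (all names here are
invented; the point is simply that a failure in one component is logged and
degrades the answer instead of killing the application):

    # Sketch of fail-soft error handling: a failure in one analysis component
    # is logged and replaced by a neutral fallback rather than crashing the
    # whole application. All names are invented for illustration.
    import logging

    logger = logging.getLogger("failsoft_demo")


    def safe_score(analyser, text, fallback=0.0):
        """Run one analyser; on any failure, log it and return a neutral score."""
        try:
            return analyser(text)
        except Exception:
            logger.exception("analyser %r failed; using fallback", analyser)
            return fallback


    def overall_score(analysers, text):
        """Combine whatever analysers still work; never raise to the caller."""
        scores = [safe_score(a, text) for a in analysers]
        return sum(scores) / len(scores) if scores else 0.0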

I prefer programming in environments that check everything possible, like
Visual Basic or .NET. These save a LOT of debugging effort by catching
nearly all of the really hard bugs that languages like C/C++ seem to generate
in bulk. Further, when you think your application is REALLY wrung out, you
can then recompile with most of the error checking turned off to get C-like
speed.

Note that the same can be said for Java, but most Java implementations
don't provide compilers that can turn off error checking, which cuts their
speed to roughly 1/3 that of the other approaches. Losing 2/3 of the speed
is a high price to pay for a platform.

Steve Richfield





Re: [agi] Relevance of SE in AGI

2008-12-21 Thread Daniel Allen
Great post, Steve.  Thanks.

