Re: [agi] Priors and indefinite probabilities

2007-02-18 Thread Aki Iskandar


Thanks Ben - this makes complete sense, and you've answered my  
question precisely.


~Aki


On 19-Feb-07, at 1:03 AM, Ben Goertzel wrote:


Aki Iskandar wrote:


Hello -

I'm new on this email list.  I'm very interested in AI / AGI - but  
do not have any formal background at all.  I do have a degree in  
Finance, and have been a professional consultant / developer for  
the last 9 years (including having worked at Microsoft for almost  
3 of those years).


I am extremely happy to see that there are people out there that  
believe AGI will become a reality - I share the same belief.   
Most, if not all, of my colleagues see AI as never becoming a  
reality.  Some that do see intelligent machines becoming a reality  
- believe that it is hardware, not software, that will make it  
so.  I believe the opposite ... in that the key is in the software  
- the hardware we have today is ample.


The reason I'm writing is that I am curious (after watching a  
couple of the videos on google linked off of Ben's site) as to why  
you're using C++ instead of other languages, such as C#, Java, or  
Python.  The latter 2, and others, do the grunt work of cleaning up  
resources - thus allowing for more time to work on the problem  
domain, as well as saving time in compiling, linking, and debugging.


I'm not questioning your decision - I'm merely curious to learn  
about your motivations for selecting C++ as your language of choice.




The Novamente AI system is designed to run efficiently on SMP  
multiprocessor machines, using large amounts of RAM (as many  
gigabytes as the machine will support), and requiring complex and  
customized patterns of garbage collection.  The automated GC  
supplied by languages like Java or C# will not do the trick.  C++  
is the only language that has been intensively battle-tested under  
this kind of scenario.  (In principle, C# could be used, with  
copious use of unsafe code blocks, but it has not been intensively  
tested in this kind of scenario.)


C++ is a large language that can be used in many different ways.   
Early Novamente code was somewhat C-ish and is gradually being  
replaced.  New Novamente code makes heavy use of STL, generic  
design patterns, and the Boost library, which is a more elegant C++  
dialect.  STL and Boost do a lot of the gruntwork for you too,  
although they're not as simple to use as Java or Python, of course.


I personally love the Ruby language, and have prototyped some  
Novamente stuff in Ruby prior to its incorporation in the main C++  
codebase.  But Ruby is really slow and can't handle complex GC  
situations.




-- Ben G



Thanks,
~Aki


-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: [agi] Re: Languages for AGI

2007-02-18 Thread Ben Goertzel




In Abraham Lincoln's case I think it makes sense, since he already
knows how he'll use the axe. I doubt that most people who are worrying
about which language they'll use actually have a good idea of how to
actually design an AGI...

You can spend all the time you want sharpening your axes, it'll do you
no good if you don't know what you'll use it for...

Ricardo


Well put, Ricardo.

In the case of Novamente, we have the AGI design in hand; and in this
context it's clear that different programming languages would have their
different plusses and minuses for implementing Novamente ... but
ultimately it is not going to be the programming language that's going to
be the bottleneck.  It's going to be tuning and tweaking the details of
the numerous component algorithms that's going to be the bottleneck.
And making **this** process tractable, within the context of the
Novamente design, is much more dependent on how the code is
structured and how thoughtfully the detailed design is done, than on
what programming language is chosen.  It's true that some languages
more strongly encourage well-structured code than others do, but this
is not really such a major point.

We are currently restructuring some of our older Novamente code to
use a  more modern C++ idiom, heavier on templates and Boost.
But the most critical aspect of this restructuring is the greater insight
we've achieved recently into our AGI design itself, which has told
us what kind of abstract interfaces our Novamente core system really
needs, for interacting with the various AI  modules.  Ultimately the
nature of these abstract interfaces would not be so different, no
matter what the programming language (so long as the programming
language was reasonably expressive and supplied modern
programming constructs, i.e. no COBOL or FORTRAN...)

-- Ben G

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: [agi] Re: Languages for AGI

2007-02-18 Thread Ben Goertzel


BTW: I really loved Haskell when I used it in the 90's, and if there 
were a rip-roaring fast SMP Haskell implementation with an effective 
customizable garbage collector, Novamente would probably be written in 
Haskell.


But, there is not, and so Novamente is written in C++ ... but 
Novamente's "internal procedures" are written (i.e. learned by the 
system) in a language called Combo that has more in common with LISP or 
Haskell than C++ (it's purely functional, for one thing).  So, the 
"codic modality" of the system refers to this internal Combo language, 
not to the underlying C++ or assembly language layers.


-- Ben


Mark Waser wrote:

One reason for picking a language more powerful than the run-of-the-mill
imperative ones (of which virtually all the ones mentioned so far are 
just
different flavors) is that they can give you access to different 
paradigms

that will enhance your view of how an AGI should work internally.


Very true.  Arguably, before choosing a language, an AGI researcher 
should know an ordinary imperative language (Pascal/Java/C++/C#), some 
flavor of LISP, Prolog, and some flavor of ML -- just to know what the 
real choices are . . . .


The differences between most of the (ordinary imperative) languages 
that have been cited in this debate, from Python to Ruby to Java to 
C++ to C#, are merely syntax and the supporting infrastructure.  
Python is a little looser so that it is faster to develop in but it is 
more of a b*tch to debug.  Ruby has a lot of infrastructure that makes 
web development really fast and easy but doesn't have a lot beyond 
that realm.  Java and C# are virtually the same language -- except C# 
has a *lot* more infrastructure. And (in my opinion), C++ needs to be 
retired (get over it).


Personally, I've got a bias against Perl and Python because I've had 
far too much experience that shows that quick to develop turns into 
difficult to maintain and expand past a given point.  That's not what 
you want to see in something like AGI.


I also have a bias against lower-level languages.  Writing in machine 
language is for compilers, not human beings.  Writing in assembly 
language is only justified when doing heavy duty algorithms on 
specialized floating point processors (and only if someone else hasn't 
done it first).  Writing in C is just plain dumb (these days).  I've 
done all three of these things *when it was appropriate* but it just 
isn't appropriate any longer.  The successful developer is the one who 
uses the existing tools and infrastructure as the foundation for 
serious progress with clean, elegant architecture (and builds more 
layers of tools/infrastructure in their own personal toolbox).  The 
programmers who are ending up out of work are the ones who keep 
re-inventing the wheel over and over again.


As I've pointed out before in this venue, AGI is a hard enough task 
that it
makes sense to do some serious work on tools-to-build-the-tools. As 
Abraham
Lincoln put it, "If I had 8 hours to chop down a tree, I'd spend 6 
sharpening

my axe."


Amen.

- Original Message - From: "J. Storrs Hall, PhD." 
<[EMAIL PROTECTED]>

To: 
Sent: Sunday, February 18, 2007 3:29 PM
Subject: [agi] Re: Languages for AGI



One reason for picking a language more powerful than the run-of-the-mill
imperative ones (of which virtually all the ones mentioned so far are 
just
different flavors) is that they can give you access to different 
paradigms

that will enhance your view of how an AGI should work internally.

A classic example is Prolog (which I use for most of my day-to-day 
programming).
I suggest reading "Clause and Effect" by Clocksin, a slim, high-level 
volume,

to get a "For God's sake, why didn't they ever mention this in school"
reaction when you see how 5 lines of Prolog do more than 100 lines of 
C for a
wide range of AI-like problems. (Not the actual AI itself, mind you, 
but all

the sort of thing that forms the infrastructure of a system).

Surveys of languages in common use very often show that O'Caml leads 
the pack
in a combination of conciseness of code and fast execution time, for 
what

it's worth.
http://shootout.alioth.debian.org/sandbox/index.php

For the past month or two, I've been delving into a new paradigm that 
promises

to have a deep effect on the way programming in general is done, but is
especially germane to AI. It's reactive programming (see this 
discussion at

http://lambda-the-ultimate.org/node/2068 )

The idea is a language that looks a lot more like the 
signals-and-systems
mindset of cybernetics than the logic-based one of McCarthy and early 
AI.


As I've pointed out before in this venue, AGI is a hard enough task 
that it
makes sense to do some serious work on tools-to-build-the-tools. As 
Abraham
Lincoln put it, "If I had 8 hours to chop down a tree, I'd spend 6 
sharpening

my axe."

Josh

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303

Re: [agi] Priors and indefinite probabilities

2007-02-18 Thread Ben Goertzel

Aki Iskandar wrote:


Hello -

I'm new on this email list.  I'm very interested in AI / AGI - but do 
not have any formal background at all.  I do have a degree in Finance, 
and have been a professional consultant / developer for the last 9 
years (including having worked at Microsoft for almost 3 of those years).


I am extremely happy to see that there are people out there that 
believe AGI will become a reality - I share the same belief.  Most, if not 
all, of my colleagues see AI as never becoming a reality.  Some that 
do see intelligent machines becoming a reality - believe that it is 
hardware, not software, that will make it so.  I believe the opposite 
... in that the key is in the software - the hardware we have today is 
ample.


The reason I'm writing is that I am curious (after watching a couple 
of the videos on google linked off of Ben's site) as to why you're 
using C++ instead of other languages, such as C#, Java, or Python.  
The latter 2, and others, do the grunt work of cleaning up resources - 
thus allowing for more time to work on the problem domain, as well as 
saving time in compiling, linking, and debugging.


I'm not questioning your decision - I'm merely curious to learn about 
your motivations for selecting C++ as your language of choice.




The Novamente AI system is designed to run efficiently on SMP 
multiprocessor machines, using large amounts of RAM (as many gigabytes 
as the machine will support), and requiring complex and customized 
patterns of garbage collection.  The automated GC supplied by languages 
like Java or C# will not do the trick.  C++ is the only language that 
has been intensively battle-tested under this kind of scenario.  (In 
principle, C# could be used, with copious use of unsafe code blocks, but 
it has not been intensively tested in this kind of scenario.)


C++ is a large language that can be used in many different ways.  Early 
Novamente code was somewhat C-ish and is gradually being replaced.  New 
Novamente code makes heavy use of STL, generic design patterns, and the 
Boost library, which is a more elegant C++ dialect.  STL and Boost do a 
lot of the gruntwork for you too, although they're not as simple to use 
as Java or Python, of course.


I personally love the Ruby language, and have prototyped some Novamente 
stuff in Ruby prior to its incorporation in the main C++ codebase.  But 
Ruby is really slow and can't handle complex GC situations.




-- Ben G



Thanks,
~Aki


-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Samantha Atkins
Eliezer S. Yudkowsky wrote:
>
>
> If you know in advance what code you plan on writing, choosing a
> language should not be a big deal.  This is as true of AI as any other
> programming task.
>

It is still a big deal.  You want to chose a language that allows you to
express your intent as concisely and clearly as possible with a minimum
of language choice induced overhead.  Ideally you want a language that
actually helps you sharpen your thoughts as you express them.  You want
the result to run at reasonable speed and to be maintainable over time. 
You almost never know fully what you plan on writing, let alone what
it will need to handle an iteration or two down the road. You
learn what kind of flexibility to build in to help with inevitable
change.  But the choice of programming language can make a very large
difference in how easy it is to create and maintain that. 

- samantha

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Samantha Atkins
Mark Waser wrote:
>
>> And, from a practical programmatic way of having  code generate code,
>> those are the only two ways.  The way you  mentioned - a text file -
>> you still have to call the compiler (which  you can do through the
>> above namespaces), but then you still have to  bring the dll into the
>> same appdomain and process.  In short, it is a  huge performance hit,
>> and in no way would seem to be a smooth  transition.
>
> Spoken by a man who has clearly never tried it.  I have functioning
> code that does *exactly* what I outlined.  There is no perceptible
> delay when the program writes, compiles, links, starts a new thread,
> and executes the second piece of new code (the first piece generates a
> minor delay which I attribute to loading the compiler and other tools
> into memory).
>

I have tried it.  I was writing code and especially classes to files,
compiling and loading them into memory back in the mid 80s.  There is no
way that opening a file, writing the code to it, closing the file,
invoking another process or several to compile and link it and still
another file I/O set to load it is going to be of no real performance
cost.  There is also no way it will outperform creating code directly in
a language tuned for it in memory and immediately evaluating it with or
without JIT machine code generation.  .NET is optimized for certain
stack based classes of languages.  Emulating other types of languages on
top of it is not going to be as efficient as implementing them closer to
the hardware.  If the IDL allowed creating a broader class of VMs than
it apparently does I would be much more interested.

> Also, even if it *did* generate a delay, this function shouldn't happen
> often enough to be a problem, and there are numerous ways around
> the delay (multi-tasking, etc.).
>
How would it help you that much to do a bunch of context switching or
IPC on top of the original overhead?

- samantha

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Samantha Atkins
Eugen Leitl wrote:
> On Sun, Feb 18, 2007 at 12:40:03AM -0800, Samantha Atkins wrote:
>
>   
>> Really?  I question whether you can get anywhere near the same level of
>> reflection and true data <-> code equivalence in any other standard
>> language.  I would think this capability might be very important
>> especially to a Seed AI.
>> 
>
> 
>
> However, the AI school represented here seems to assume a seed AI (an 
> open-ended agent
> capable of directly extracting information from its environment) is 
> sufficiently simple
> to be specified by a team of human programmers, and implemented explicitly by
> a team of human programmers. This type of approach is most clearly 
> represented
> by Cyc, which is sterile. 

Cyc was never intended to be a Seed AI, to the best of my knowledge.  If
it wasn't, it doesn't make a very clear case against seed AI.

> The reason is the assumption that the internal architecture
> of human cognition is fully inspectable by human analyst introspection alone, 
> and 
> that furthermore the resulting extracted architecture is below the complexity 
> ceiling 
> accessible to a human team of programmers. I believe both assumptions are 
> incorrect.
>   

I don't believe that any real intelligence will be reasonably
inspectable by human analysts.  As a working software geek these last
three decades or so I am quite aware of the limits of human
understanding of even perfectly mundane moderately large systems of
code.  I think the primary assumption with Seed AI is that humans can
put together something that has some small basis of generalizable
learning ability and the capacity to self improve from there.  That is
still a tall order but it doesn't require that humans are going to
understand the code very well, especially after an iteration or two.
> There are approaches which involve stochastic methods,
> information theory and evolutionary computation which appear potentially 
> fertile,
> though the details of the projects are hard to evaluate, since lacking 
> sufficient
> numbers of peer-reviewed publications, source code, or even interactive 
> demonstrations.
> Lisp does not particularly excel at these numerics-heavy applications, though 
> e.g.
> Koza used a subset of Lisp sexpr with reasonably good results. 
It is quite possible, where needed, to write numerics-heavy applications
in Lisp that approach the speed of C.  With suitable declarations and
tuned code generation there is no reason for any significant gap. 
Unlike in most languages, such tuned subsystems can be created within the
language itself fairly seamlessly.  Among other things, Lisp excels as a
DSL environment.

What I find problematic with Lisp is that it has been stuck in the
academic/specialist closet too long.  Python, for instance, has a far
greater wealth of libraries and glue for many tasks.  The Common Lisp
standard doesn't even specify a threading and IPC model.  Too much is
done differently in different implementations.  Too much has to be
created from scratch, or adapted from the efforts of others, in order to
produce many types of practical systems as efficiently.  That is what I
have a problem with.

- samantha

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: [agi] Re: Languages for AGI

2007-02-18 Thread Mike Dougherty

On 2/18/07, Mark Waser <[EMAIL PROTECTED]> wrote:

personal toolbox).  The programmers who are ending up out of work are the
ones who keep re-inventing the wheel over and over again.


Thinking about the amount of redundant (wasted) effort involved with
starting from scratch on an AI project, I considered an old adage and
modified it:

If you are not standing on the shoulders of giants, you are likely to
be trampled by them.

.. though I guess in the case of AGI, even giants have only taken a
few tentative steps

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: [agi] Re: Languages for AGI

2007-02-18 Thread Ricardo Barreira

> The idea is a language that looks a lot more like the signals-and-systems
> mindset of cybernetics than the logic-based one of McCarthy and early AI.
>
> As I've pointed out before in this venue, AGI is a hard enough task that
> it
> makes sense to do some serious work on tools-to-build-the-tools. As
> Abraham
> Lincoln put it, "If I had 8 hours to chop down a tree, I'd spend 6
> sharpening
> my axe."


In Abraham Lincoln's case I think it makes sense, since he already
knows how he'll use the axe. I doubt that most people who are worrying
about which language they'll use actually have a good idea of how to
actually design an AGI...

You can spend all the time you want sharpening your axes, it'll do you
no good if you don't know what you'll use it for...

Ricardo

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Bob Mottram

I've seen the programming language merry-go-round on AI related forums too
many times to become embroiled, but for what it's worth I'm using C# /
.NET.  My master plan for robotic domination involves using Mono.

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Russell Wallace

On 2/18/07, Chuck Esterbrook <[EMAIL PROTECTED]> wrote:


You are absolutely...correct. I think the utility of existing database
servers is very underappreciated in academia and many AI researchers
are from academia or working on academia style projects (gov't
research grants or work to support research--not that there's anything
wrong with that!). But it's too bad as databases have a lot to offer.
Anyone, feel free to ask if you want me to expand.



Please do; it hadn't jumped out at me that commercial database systems are
suitable for AI work, but I'm not a database expert; I could well be
overlooking something.

Regarding platform, while you and I like .NET some people will reject

it because Microsoft (and the former Borland engineers they hired to
work on it), created it. I've talked to people who said they would use
it if it were open source. So I point them to Novell Mono (the open
source clone) at which point they claim they can't use it because
Microsoft will eventually shut Novell down. After I point out that
Microsoft submitted .NET as a published standard so that projects like
Novell Mono could take place, well... then it's on to the next excuse.



How well does Mono work? In particular, if I write a GUI-intensive program
in Visual C# and try to use Mono to run it on Linux, Solaris or whatever,
will it work entirely, or only mostly with a few glitches to work around, or
will the GUI part crash and burn with only the internal computation part
continuing to function? (I've heard people say the latter, but I haven't
tried it personally, and the question strikes me as relevant since while
Windows dominates the office desktop, Unix is a lot stronger in many
potential AI markets.)

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Chuck Esterbrook

On 2/18/07, Charles D Hixson <[EMAIL PROTECTED]> wrote:

Chuck Esterbrook wrote:
> On 2/18/07, Eliezer S. Yudkowsky <[EMAIL PROTECTED]> wrote:
>> Mark Waser wrote:
>> >...
>
> I find C++ overly complex while simultaneously lacking well known
> productivity boosters including:
> * garbage collection
> * language level bounds checking
> * contracts
> * reflection / introspection (complete and portable)
> * dynamic loading (portable)
> * dynamic invocation
>
> Having benefited from these in other languages such as Python and C#,
> I'm not going back. Ever.
> ...
> Best regards,
>
> -Chuck
You might check out D ( http://www.digitalmars.com/d/index.html ).  Mind
you, it's still in the quite early days, and missing a lot of libraries
... which means you need to construct interfaces to the C versions.
Still, it answers several of your objections, and has partial answers to
at least one of the others.


Thanks for the suggestion. I cranked out lots of D for a few weeks and
overall it's a nice language. In fact, I was jealous to see my "unit
testing as a language feature" idea already implemented before I had a
chance to implement it myself.

D still isn't as high level as I'd like (think Python, Ruby) and its
evolution felt painfully slow. It's also a language unto itself,
whereas I'm fan of using .NET/mono to get quick access to existing
libraries and tools. Oh yeah, and I could never get a debugger going.
Compounding that pain: there was no stack trace output for runtime
errors like there is for C# or Python.

All my D comments come with a big grain of salt because that was in
late 2005 that I checked it out.

I checked out Boo after that which also has some nice things going for
it, but also had various deficiencies I wasn't willing to live with
(or rework the code for).

Although Cobra is young, it's usable (I rewrote the compiler in Cobra
last fall) and, not surprisingly, I'm especially happy with its choices in
various areas. :-)

It's full steam ahead. Okay, "part-time steam-ahead" since it's not my day job.

-Chuck

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Charles D Hixson

Chuck Esterbrook wrote:

On 2/18/07, Eliezer S. Yudkowsky <[EMAIL PROTECTED]> wrote:

Mark Waser wrote:
>...


I find C++ overly complex while simultaneously lacking well known
productivity boosters including:
* garbage collection
* language level bounds checking
* contracts
* reflection / introspection (complete and portable)
* dynamic loading (portable)
* dynamic invocation

Having benefited from these in other languages such as Python and C#,
I'm not going back. Ever.
...
Best regards,

-Chuck
You might check out D ( http://www.digitalmars.com/d/index.html ).  Mind 
you, it's still in the quite early days, and missing a lot of libraries 
... which means you need to construct interfaces to the C versions.  
Still, it answers several of your objections, and has partial answers to 
at least one of the others.


-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Chuck Esterbrook

On 2/18/07, Aki Iskandar <[EMAIL PROTECTED]> wrote:

On another note, are you planning on an IDE for Cobra?  Can you write
an extension for VS.NET, or for WingWare's Wing IDE?  How does one
develop in Cobra?  Now and in the future.


Now: Your favorite text editor and invocation from the command line.

Future: VS plugin. Would love to write it now, but there is more
pressing work on the language still to complete.

Note that there is a "superstacktrace" option to Cobra which will give
you a stacktrace for uncaught exceptions that includes the values of
all parameters and local variables for each stackframe. This is a
productivity booster in itself, but also an aid to make up for the
lack of an interactive debugger.

-Chuck

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: [agi] Re: Languages for AGI

2007-02-18 Thread Mark Waser

One reason for picking a language more powerful than the run-of-the-mill
imperative ones (of which virtually all the ones mentioned so far are just
different flavors) is that they can give you access to different paradigms
that will enhance your view of how an AGI should work internally.


Very true.  Arguably, before choosing a language, an AGI researcher should 
know an ordinary imperative language (Pascal/Java/C++/C#), some flavor of 
LISP, Prolog, and some flavor of ML -- just to know what the real choices 
are . . . .


The differences between most of the (ordinary imperative) languages that 
have been cited in this debate, from Python to Ruby to Java to C++ to C#, 
are merely syntax and the supporting infrastructure.  Python is a little 
looser so that it is faster to develop in but it is more of a b*tch to 
debug.  Ruby has a lot of infrastructure that makes web development really 
fast and easy but doesn't have a lot beyond that realm.  Java and C# are 
virtually the same language -- except C# has a *lot* more infrastructure. 
And (in my opinion), C++ needs to be retired (get over it).


Personally, I've got a bias against Perl and Python because I've had far too 
much experience that shows that quick to develop turns into difficult to 
maintain and expand past a given point.  That's not what you want to see in 
something like AGI.


I also have a bias against lower-level languages.  Writing in machine 
language is for compilers, not human beings.  Writing in assembly language 
is only justified when doing heavy duty algorithms on specialized floating 
point processors (and only if someone else hasn't done it first).  Writing 
in C is just plain dumb (these days).  I've done all three of these things 
*when it was appropriate* but it just isn't appropriate any longer.  The 
successful developer is the one who uses the existing tools and 
infrastructure as the foundation for serious progress with clean, elegant 
architecture (and builds more layers of tools/infrastructure in their own 
personal toolbox).  The programmers who are ending up out of work are the 
ones who keep re-inventing the wheel over and over again.


As I've pointed out before in this venue, AGI is a hard enough task that 
it
makes sense to do some serious work on tools-to-build-the-tools. As 
Abraham
Lincoln put it, "If I had 8 hours to chop down a tree, I'd spend 6 
sharpening

my axe."


Amen.

- Original Message - 
From: "J. Storrs Hall, PhD." <[EMAIL PROTECTED]>

To: 
Sent: Sunday, February 18, 2007 3:29 PM
Subject: [agi] Re: Languages for AGI



One reason for picking a language more powerful than the run-of-the-mill
imperative ones (of which virtually all the ones mentioned so far are just
different flavors) is that they can give you access to different paradigms
that will enhance your view of how an AGI should work internally.

A classic example is Prolog (which I use for most of my day-to-day 
programming).
I suggest reading "Clause and Effect" by Clocksin, a slim, high-level 
volume,

to get a "For God's sake, why didn't they ever mention this in school"
reaction when you see how 5 lines of Prolog do more than 100 lines of C 
for a
wide range of AI-like problems. (Not the actual AI itself, mind you, but 
all

the sort of thing that forms the infrastructure of a system).

Surveys of languages in common use very often show that O'Caml leads the 
pack

in a combination of conciseness of code and fast execution time, for what
it's worth.
http://shootout.alioth.debian.org/sandbox/index.php

For the past month or two, I've been delving into a new paradigm that 
promises

to have a deep effect on the way programming in general is done, but is
especially germane to AI. It's reactive programming (see this discussion 
at

http://lambda-the-ultimate.org/node/2068 )

The idea is a language that looks a lot more like the signals-and-systems
mindset of cybernetics than the logic-based one of McCarthy and early AI.

As I've pointed out before in this venue, AGI is a hard enough task that 
it
makes sense to do some serious work on tools-to-build-the-tools. As 
Abraham
Lincoln put it, "If I had 8 hours to chop down a tree, I'd spend 6 
sharpening

my axe."

Josh

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303




-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Jey Kottalam

You might want to consider the Boo programming language for a
Python-like language on .NET.
http://en.wikipedia.org/wiki/Boo_programming_language
http://boo.codehaus.org/



-Jey Kottalam

On 2/18/07, Aki Iskandar <[EMAIL PROTECTED]> wrote:


Chuck, I looked at Cobra yesterday, and I like it :-)

I will try to get some time and play with it.  My love of Python, and
reluctant admittance of appreciating .NET, are pointing me in the
direction of using one of 3 languages:

In no particular order:

1 - Python (CPython)
2 - IronPython
3 - Cobra

but I will also continue to explore Common Lisp as time permits ...
its macros look promising ... but admittedly, it will take me some
time to absorb the language - so for now, its regular Python,
IronPython, or Yours (Cobra)!

One thing for sure though ... at least from my view ... Java and C++
are just not good enough - when I consider several factors ...
including productivity.   With the languages out there today, C++
makes absolutely no sense.   Java is just not as good as .NET ... but
this is because it came first, and was the .NET guinea pig.  Java was
great before C# / .NET.

~Aki




On 18-Feb-07, at 12:29 PM, Chuck Esterbrook wrote:

> On 2/18/07, Mark Waser <[EMAIL PROTECTED]> wrote:
>> Chuck is also absolutely incorrect that the only way to generate
>> code by
>> code is to use Reflection.Emit.  It is very easy to have your code
>> write
>> code in any language to a file (either real or virtual), compile
>> it, and
>> then load the resulting library (real or virtual) anytime you want/
>> need it.
>
> I'm not incorrect--because I never said that. Aki Iskandar brought
> that issue up. Then I pointed out that .NET code executes much faster
> than Python. I was not stating or implying that Reflection.Emit was
> the only means to produce .NET code.
>
> My Cobra compiler, for example, currently generates C# instead of
> bytecode, for numerous advantages:
> (a) faster bootstrapping (C# is higher level than bytecode)
> (b) leverage the excellent bytecode generation of the C# compiler
> (c) use C#'s error checking as an extra guard against deficiencies in
> my pre-1.0 compiler
>
>> There is absolutely no run-time cost to this method (if you're
>> keeping the
>> compiled code somewhere in your knowledge base) since you're
>> dealing with
>> compiled code (as long as you know how to manage spawning and killing
>> threads and processes so that you don't keep nine million
>> libraries loaded
>> that you'll never use again).
>
> Well "absolutely no run-time cost" is a bit strong. Code generation
> itself takes time, no matter what technique you use. And if you go the
> "generate source code route" then writing it to disk, invoking a
> compiler and linking it back in is a pretty slow process. I've looked
> for a way to do it all in memory, but haven't found one. (You can
> actually link in the C# compiler as a DLL so it's resident in your
> process, but its API still wants a disk-based file.)
>
> But unless you're throwing away your generated code very quickly
> without using it much (seems unlikely), you'll make up the difference
> quite easily.
>
> And even dynamically loading DLLs and managing how you use them,
> unload them, etc. has *some* cost.
>
>> I also wouldn't sneer at using an established enterprise-class
>> database to
>> serve as one or more of your core knowledge stores.  There is *a
>> lot* of
> ...
>
> You are absolutely...correct. I think the utility of existing database
> servers is very underappreciated in academia and many AI researchers
> are from academia or working on academia style projects (gov't
> research grants or work to support research--not that there's anything
> wrong with that!). But it's too bad as databases have a lot to offer.
> Anyone, feel free to ask if you want me to expand.
>
>> The dumbest thing AGI researchers do is re-invent the wheel
>> constantly when
> it isn't necessary.  I'm heartily with Richard Loosemoore and his 
>> call for
>> building a research infrastructure instead of all the walled
>> gardens (with
>> long, low learning curves and horrible enhancement curves) that we
>> have
>> currently.
>
> Some reuse is easy. Fairly generic components like languages and
> databases are easy to leverage on a project. After that, it gets very
> difficult. Normally, something has be documented, be stable, run fast,
> be on the same platform *and* be the right fit before it will be
> adopted on a serious project.
>
> Regarding platform, while you and I like .NET some people will reject
> it because Microsoft (and the former Borland engineers they hired to
> work on it), created it. I've talked to people who said they would use
> it if it were open source. So I point them to Novell Mono (the open
> source clone) at which point they claim they can't use it because
> Microsoft will eventually shut Novell down. After I point out that
> Microsoft submitted .NET as a published standard so that projects like
> Novell Mono could take place, well.

Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Eugen Leitl
On Sun, Feb 18, 2007 at 09:51:45AM -0800, Eliezer S. Yudkowsky wrote:

> As Michael Wilson pointed out, only one thing is certain when it comes 
> to a language choice for FAI development:  If you build an FAI in 
> anything other than Lisp, numerous Lisp fanatics will spend the next 
> subjective century arguing that it would've been better to use Lisp.

All languages are shallow as far as AI is concerned, and only useful
to figure out the shape of the dedicated hardware for the target.
C-like things are more or less useful with meshed FPGA cores with
embedded RAM, but for a really minimalistic cellular architecture
C is also quite useless. However, C/MPI is very useful for running
a prototype on a large scale machine, with some 10^4..10^6 nodes.

It doesn't matter (much) which language you use in the initial prototype
phase, you will have to throw it away anyway.

Oh, and as for Python being slow: IronPython is .NET, and extending/expanding
Python with C for the prototype is the standard approach. 

A possible solution for those who're loath to touch hardware design: Erlang.

-- 
Eugen* Leitl  http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820  http://www.ativel.com
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303




Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Mark Waser

Aki,

   Picking a language, like any other choice, should be based upon 
articulable criteria (even if only "because I enjoy writing in it more than 
anything else").


   Your e-mail(s) provide(d) no substance other than unsupported opinions 
(and incorrect facts).


   I called you on it (and provided supporting facts, criteria, and other 
info).  Instead of providing substance to refute me or continue a *useful* 
discussion, you continue down the path of no substance (whining about my 
e-mail rather than discussing or rebutting facts).


   Dude, develop the thick skin you referenced and play science the right 
way, with facts.


- Original Message - 
From: "Aki Iskandar" <[EMAIL PROTECTED]>

To: 
Sent: Sunday, February 18, 2007 1:45 PM
Subject: **SPAM** Re: Languages for AGI [WAS Re: [agi] Priors and indefinite 
probabilities]





Mark -

I don't know you, and have no bones to pick with you.  I have no basis, 
nor do I have any motivation, for doing so.


Picking a language is not a science - so to "prove" or "test" things, 
well ...


If you believe I'm wasting your time - don't bother reading - or  replying 
to my posts.


I, as much as you (or anyone else on this thread / list) have the  right 
to say what we like.  And by consequence, your email to me  below - as 
inapropriate, and frankly childish, as it was - was well  within your 
right.


My only comment is ... Stop taking things as attacks.  Get some  thick 
skin.  Because in science, you need it.   And believe it or  not, I am 
saying that out of respect to you.  Maybe you're having a  bad day - we 
all do - but if anyone wastes time, it is people  shouting at others.


Look at your email to me again.  Was this called for?  Look at your 
subsequent email to Eliezer.  Come on man. Lighten up a little.


Everyone else ... I apologize for taking your time to read this  email. 
I'm just hoping it'll keep anyone from flaming people and calling them 
stupid.


Enough said.  I think we can all get along, and learn something from  each 
other.


~Aki



On 18-Feb-07, at 1:21 PM, Mark Waser wrote:

[Aki]  This is by far too strong a statement - and most likely 
incorrect.


Don't play with "most likely"s.  Either disprove my statement or  don't 
waste our time.



Mark, do you work at Microsoft?


No, but the question is irrelevant (as is your working at Microsoft  --  
except insofar as your believing that it does prove something shows that 
your beliefs are questionable).


there are more reasons than time I have to elaborate why I can't  agree 
with your statement.


So give us ONE!  Why are you wasting my attention if you won't back  up 
your statements with verifiable facts?


And, from a practical programmatic way of having  code generate  code, 
those are the only two ways.  The way you  mentioned - a  text file - 
you still have to call the compiler (which  you can do  through the 
above namespaces), but then you still have to  bring  the dll into the 
same appdomain and process.  In short, it is a   huge performance hit, 
and in no way would seem to be a smooth   transition.


Spoken by a man who has clearly never tried it.  I have functioning  code 
that does *exactly* what I outlined.  There is no perceptible  delay when 
the program writes, compiles, links, starts a new  thread, and executes 
the second piece of new code (the first piece  generates a minor delay 
which I attribute to loading the compiler  and other tools into memory).


Also, even if it *did* generate a delay, this function shouldn't happen 
often enough to be a problem, and there are numerous ways around the 
delay (multi-tasking, etc.).


BTW - My apologies to Chuck for misattributing the quote.


- Original Message - From: "Aki Iskandar" <[EMAIL PROTECTED]>
To: 
Sent: Sunday, February 18, 2007 12:36 PM
Subject: **SPAM** Re: Languages for AGI [WAS Re: [agi] Priors and 
indefinite probabilities]





Before I comment on Mark's response, I think that the best comment  on 
this email thread came from Pei, who wrote ...



"I guess you can see, from the replies so far, that what language
people choose is strongly influenced by their conception of AI. Since
people have very different opinions on what an AI is and what is the
best way to build it, it is natural that they selected different
languages, based mainly on its convenience for their concrete  goal, or
even tried to invent new ones.

Therefore, I don't think there is a consensus on what the most
suitable language is for AI."


However, there was an upshot to all the replies to the original 
question - as with any emotionally charged discourse, there are 
nuggets of learning (I'm gaining insights into languages, so 
others have likely learned things as well).


OK - now to briefly reply

[Mark]  Far and away, the best answer to the best language  question  is 
the .NET framework.


[Aki]  This is by far too strong a statement - and most likely 
incorrect. Mark, do you work at Microsoft?  I 

[agi] Re: Languages for AGI

2007-02-18 Thread J. Storrs Hall, PhD.
One reason for picking a language more powerful than the run-of-the-mill 
imperative ones (of which virtually all the ones mentioned so far are just 
different flavors) is that they can give you access to different paradigms 
that will enhance your view of how an AGI should work internally.

A classic example is Prolog (which I use for most of my day-to-day programming). 
I suggest reading "Clause and Effect" by Clocksin, a slim, high-level volume, 
to get a "For God's sake, why didn't they ever mention this in school" 
reaction when you see how 5 lines of Prolog do more than 100 lines of C for a 
wide range of AI-like problems. (Not the actual AI itself, mind you, but all 
the sort of thing that forms the infrastructure of a system).

Surveys of languages in common use very often show that O'Caml leads the pack 
in a combination of conciseness of code and fast execution time, for what 
it's worth.
http://shootout.alioth.debian.org/sandbox/index.php

For the past month or two, I've been delving into a new paradigm that promises 
to have a deep effect on the way programming in general is done, but is 
especially germane to AI. It's reactive programming (see this discussion at
http://lambda-the-ultimate.org/node/2068 )

The idea is a language that looks a lot more like the signals-and-systems 
mindset of cybernetics than the logic-based one of McCarthy and early AI. 

As I've pointed out before in this venue, AGI is a hard enough task that it 
makes sense to do some serious work on tools-to-build-the-tools. As Abraham 
Lincoln put it, "If I had 8 hours to chop down a tree, I'd spend 6 sharpening 
my axe."

Josh

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Eliezer S. Yudkowsky

Chuck Esterbrook wrote:

On 2/18/07, Eliezer S. Yudkowsky <[EMAIL PROTECTED]> wrote:


Heh.  Why not work in C++, then, and write your own machine language?
No need to write files to disk, just coerce a pointer to a function
pointer.  I'm no Lisp fanatic, but this sounds more like a case of
Greenspun's Tenth Rule to me.


I find C++ overly complex while simultaneously lacking well known
productivity boosters including:
* garbage collection
* language level bounds checking
* contracts
* reflection / introspection (complete and portable)
* dynamic loading (portable)
* dynamic invocation


I was being sarcastic, not advocating C++ as the One True AI language.


Eliezer, do you write code at the institute? What language do you use and
for what reasons? What do you like and dislike about it with respect
to your project? Just curious.


I'm currently a theoretician.  My language-of-choice is Python for 
programs that are allowed to be slow.  C++ for number-crunching. 
Incidentally, back when I did more programming in C++, I wrote my own 
reflection package for it.  (In my defense, I was rather young at the time.)


B. Sheil once suggested that LISP excels primarily at letting you change 
your code after you realize that you wrote the wrong thing, and this is 
why LISP is the language of choice for AI work.  Strongly typed 
languages enforce boundaries between modules, and provide redundant 
constraints for catching bugs, which is helpful for coding conceptually 
straightforward programs.  But this same enforcement and redundancy 
makes it difficult to change the design of the program in midstream, for 
things that are not conceptually straightforward.  Sheil wrote in the 
1980s, but it still seems to me like a very sharp observation.


If you know in advance what code you plan on writing, choosing a 
language should not be a big deal.  This is as true of AI as any other 
programming task.


--
Eliezer S. Yudkowsky  http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Aki Iskandar


lol ... I enjoy your humor.

Good point on the Microsoft thing.  And you're right.  I certainly  
didn't mean it to be a snide remark.  When I used to work at  
Microsoft, I got tired of the "Microsoft is king" attitude - it was  
rampant - unfortunately.  So my comment was only contextual - the  
poster's comment "Far and away, the best answer to the best language  
question is the .NET framework."  was very reminiscent of the  
Microsoft culture - that is the only reason I wrote it.


In fact, I made sure to claim that I was NOT a .NET expert.   
Microsoft was a proud moment in my life, but I'm glad its over.


But I agree.  The Microsoft comment could have, and may have been,  
taken the wrong way.  So, I am sorry if it sounded snooty.  I assure  
everyone that this was not my intention.


I've learned that the motivation / preference for selection of  
languages - for any domain, not just AI - are like belly buttons,  
everybody has one :-)


On another note, are you planning on an IDE for Cobra?  Can you write  
an extension for VS.NET, or for WingWare's Wing IDE?  How does one  
develop in Cobra?  Now and in the future.


Thanks Chuck





On 18-Feb-07, at 2:09 PM, Chuck Esterbrook wrote:


On 2/18/07, Aki Iskandar <[EMAIL PROTECTED]> wrote:

Enough said.  I think we can all get along, and learn something from
each other.


Oh, yeah??? Prove it!

LOL No, I'm totally kidding. I couldn't resist making that joke.  :-)

There are certainly a couple people on this list that take every
comment as an arguing point when in fact, some of our comments are
conversational, usually to provide context for subsequent points.

But please keep in mind that a statement like "Do you work at
Microsoft?" especially followed by "I do" can *easily* be taken the
wrong way even if you did not mean it that way.


Peace,

-Chuck

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Chuck Esterbrook

On 2/18/07, Aki Iskandar <[EMAIL PROTECTED]> wrote:

Enough said.  I think we can all get along, and learn something from
each other.


Oh, yeah??? Prove it!

LOL No, I'm totally kidding. I couldn't resist making that joke.  :-)

There are certainly a couple people on this list that take every
comment as an arguing point when in fact, some of our comments are
conversational, usually to provide context for subsequent points.

But please keep in mind that a statement like "Do you work at
Microsoft?" especially followed by "I do" can *easily* be taken the
wrong way even if you did not mean it that way.


Peace,

-Chuck

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Chuck Esterbrook

On 2/18/07, Eliezer S. Yudkowsky <[EMAIL PROTECTED]> wrote:

Mark Waser wrote:
>
> Chuck is also absolutely incorrect that the only way to generate code by
> code is to use Reflection.Emit.  It is very easy to have your code write
> code in any language to a file (either real or virtual), compile it, and
> then load the resulting library (real or virtual) anytime you want/need
> it. There is absolutely no run-time cost to this method (if you're
> keeping the compiled code somewhere in your knowledge base) since you're
> dealing with compiled code (as long as you know how to manage spawning
> and killing threads and processes so that you don't keep nine million
> libraries loaded that you'll never use again).

Heh.  Why not work in C++, then, and write your own machine language?
No need to write files to disk, just coerce a pointer to a function
pointer.  I'm no Lisp fanatic, but this sounds more like a case of
Greenspun's Tenth Rule to me.


I find C++ overly complex while simultaneously lacking well known
productivity boosters including:
* garbage collection
* language level bounds checking
* contracts
* reflection / introspection (complete and portable)
* dynamic loading (portable)
* dynamic invocation

Having benefited from these in other languages such as Python and C#,
I'm not going back. Ever.

Regarding the machine code generation, I don't find it easy to do. The
Intel instruction and register set looks like an exercise in
obfuscation and frustration. RISC chips would be far easier, but I
don't think anyone is beating Intel/AMD at price/performance/power.
With .NET I can generate a fairly straightforward bytecode with
reasonable effort and leverage all the work Microsoft and Novell have
put into the arcane art of optimal machine code generation.


As Michael Wilson pointed out, only one thing is certain when it comes
to a language choice for FAI development:  If you build an FAI in
anything other than Lisp, numerous Lisp fanatics will spend the next
subjective century arguing that it would've been better to use Lisp.


Eliezer, do you write code at the institute? What language do you use and
for what reasons? What do you like and dislike about it with respect
to your project? Just curious.

Best regards,

-Chuck

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Chuck Esterbrook

On 2/18/07, Aki Iskandar <[EMAIL PROTECTED]> wrote:

Chuck, I looked at Cobra yesterday, and I like it :-)


Glad to hear that.  :-)


I will try to get some time and play with it.  My love of Python, and
reluctant admittance of appreciating .NET, are pointing me in the
direction of using one of 3 languages:

In no particular order:

1 - Python (CPython)
2 - IronPython
3 - Cobra

but I will also continue to explore Common Lisp as time permits ...
its macros look promising ... but admittedly, it will take me some
time to absorb the language - so for now, its regular Python,
IronPython, or Yours (Cobra)!


Thanks!

-Chuck

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Aki Iskandar


Mark -

I don't know you, and have no bones to pick with you.  I have no  
basis, nor do I have any motivation, for doing so.


Picking a language is not a science - so to "prove" or "test" things,  
well ...


If you believe I'm wasting your time - don't bother reading - or  
replying to my posts.


I, as much as you (or anyone else on this thread / list) have the  
right to say what we like.  And by consequence, your email to me  
below - as inappropriate, and frankly childish, as it was - was well  
within your right.


My only comment is ... Stop taking things as attacks.  Get some  
thick skin.  Because in science, you need it.   And believe it or  
not, I am saying that out of respect to you.  Maybe you're having a  
bad day - we all do - but if anyone wastes time, it is people  
shouting at others.


Look at your email to me again.  Was this called for?  Look at your  
subsequent email to Eliezer.  Come on man. Lighten up a little.


Everyone else ... I apologize for taking your time to read this  
email.  I'm just hoping it'll keep anyone from flaming people and  
calling them stupid.


Enough said.  I think we can all get along, and learn something from  
each other.


~Aki



On 18-Feb-07, at 1:21 PM, Mark Waser wrote:

[Aki]  This is by far too strong a statement - and most likely  
incorrect.


Don't play with "most likely"s.  Either disprove my statement or  
don't waste our time.



Mark, do you work at Microsoft?


No, but the question is irrelevant (as is your having worked at Microsoft
-- except insofar as your believing that it proves something shows
that your beliefs are questionable).


there are more reasons than time I have to elaborate why I can't  
agree with your statement.


So give us ONE!  Why are you wasting my attention if you won't back  
up your statements with verifiable facts?


And, from a practical, programmatic standpoint, those are the only two
ways of having code generate code.  The way you mentioned - a
text file - you still have to call the compiler (which you can do
through the above namespaces), but then you still have to bring
the DLL into the same appdomain and process.  In short, it is a
huge performance hit, and in no way would seem to be a smooth
transition.


Spoken by a man who has clearly never tried it.  I have functioning  
code that does *exactly* what I outlined.  There is no perceptible  
delay when the program writes, compiles, links, starts a new  
thread, and executes the second piece of new code (the first piece  
generates a minor delay which I attribute to loading the compiler  
and other tools into memory).


Also, even if it *did* generate a delay, this function shouldn't
happen often enough for it to be a problem, and there are numerous
ways around the delay (multi-tasking, etc.).


BTW - My apologies to Chuck for misattributing the quote.


- Original Message - From: "Aki Iskandar" <[EMAIL PROTECTED]>
To: 
Sent: Sunday, February 18, 2007 12:36 PM
Subject: **SPAM** Re: Languages for AGI [WAS Re: [agi] Priors and  
indefinite probabilities]





Before I comment on Mark's response, I think that the best comment  
on this email thread came from Pei, who wrote ...



"I guess you can see, from the replies so far, that what language
people choose is strongly influenced by their conception of AI. Since
people have very different opinions on what an AI is and what is the
best way to build it, it is natural that they selected different
languages, based mainly on its convenience for their concrete goal, or
even tried to invent new ones.

Therefore, I don't think there is a consensus on what the most
suitable language is for AI."


However, there was an upshot to all the replies to the original
question - as with any emotionally charged discourse, there are
nuggets of learning in it (I'm gaining insights into languages, so
others have likely learned things as well).


OK - now to briefly reply

[Mark]  Far and away, the best answer to the best language  
question  is the .NET framework.


[Aki]  This is by far too strong a statement - and most likely   
incorrect. Mark, do you work at Microsoft?  I have, for 3 years  
(not  that it makes me a .NET expert by any means), and there are  
more  reasons than time I have to elaborate why I can't agree with  
your  statement.  Two of the nicest things about .NET are ADO.NET  
and  Reflection.  Java (which I think is not as strong or as  
pleasurable  to work with) has reflection.  But something that is  
readily available for Java (and soon for .NET - but not yet) is object
database management systems (ODBMS) - which may be of better use
than traditional RDBMS - and if not, still much better than
ADO.NET - from a developer's viewpoint when programming against a
datastore.



Chuck is also absolutely incorrect that the only way to generate   
code by code is to use Reflection.Emit.  It is very easy to have   
your code write code in any language to a file (either real or   
vir

Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Mark Waser

Heh.  Why not work in C++, then, and write your own machine language?


What are you babbling about?  Why would anyone want to write their own 
machine language?  You can easily write and use code in any .NET language 
without reinventing the wheel.


No need to write files to disk, just coerce a pointer to a function 
pointer.


Actually, I don't write files to disk.  I just didn't want to get into the 
additional details that would then be required to convince snide individuals 
like you that it can also be done a more sophisticated way.  The simple, 
stupid method was sufficient to my proof.


I'm no Lisp fanatic, but this sounds more like a case of Greenspun's Tenth 
Rule to me.


Wow.  Did you have a lobotomy?  Do you actually want to contend that "Any 
sufficiently complicated C or Fortran program contains an ad hoc, 
informally-specified, bug-ridden, slow implementation of half of Common 
Lisp."?


As Michael Wilson pointed out, only one thing is certain when it comes to 
a language choice for FAI development:  If you build an FAI in anything 
other than Lisp, numerous Lisp fanatics will spend the next subjective 
century arguing that it would've been better to use Lisp.


So, this is supposed to prove exactly what?  LISP is great for some things. 
ML is awesome for some things.  Prolog is cool.  So what?  Why not use them 
all where most appropriate?


You re-invented the wheel with yet another language that never saw the light 
of day.  Didn't you learn anything from the experience?


- Original Message - 
From: "Eliezer S. Yudkowsky" <[EMAIL PROTECTED]>

To: 
Sent: Sunday, February 18, 2007 12:51 PM
Subject: **SPAM** Re: Languages for AGI [WAS Re: [agi] Priors and indefinite 
probabilities]




Mark Waser wrote:


Chuck is also absolutely incorrect that the only way to generate code by 
code is to use Reflection.Emit.  It is very easy to have your code write 
code in any language to a file (either real or virtual), compile it, and 
then load the resulting library (real or virtual) anytime you want/need 
it. There is absolutely no run-time cost to this method (if you're 
keeping the compiled code somewhere in your knowledge base) since you're 
dealing with compiled code (as long as you know how to manage spawning 
and killing threads and processes so that you don't keep nine million 
libraries loaded that you'll never use again).


Heh.  Why not work in C++, then, and write your own machine language? No 
need to write files to disk, just coerce a pointer to a function pointer. 
I'm no Lisp fanatic, but this sounds more like a case of Greenspun's Tenth 
Rule to me.


As Michael Wilson pointed out, only one thing is certain when it comes to 
a language choice for FAI development:  If you build an FAI in anything 
other than Lisp, numerous Lisp fanatics will spend the next subjective 
century arguing that it would've been better to use Lisp.


--
Eliezer S. Yudkowsky  http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303




-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Mark Waser

[Aki]  This is by far too strong a statement - and most likely incorrect.


Don't play with "most likely"s.  Either disprove my statement or don't waste 
our time.



Mark, do you work at Microsoft?


No, but the question is irrelevant (as is your having worked at Microsoft --
except insofar as your believing that it proves something shows that your
beliefs are questionable).


there are more reasons than time I have to elaborate why I can't agree 
with your statement.


So give us ONE!  Why are you wasting my attention if you won't back up your 
statements with verifiable facts?


And, from a practical, programmatic standpoint, those are the only two ways
of having code generate code.  The way you mentioned - a text file - you
still have to call the compiler (which you can do through the above
namespaces), but then you still have to bring the DLL into the same
appdomain and process.  In short, it is a huge performance hit, and in no
way would seem to be a smooth transition.


Spoken by a man who has clearly never tried it.  I have functioning code 
that does *exactly* what I outlined.  There is no perceptible delay when the 
program writes, compiles, links, starts a new thread, and executes the 
second piece of new code (the first piece generates a minor delay which I 
attribute to loading the compiler and other tools into memory).


Also, even if it *did* generate a delay, this function shouldn't happen often
enough for it to be a problem, and there are numerous ways around the delay
(multi-tasking, etc.).


BTW - My apologies to Chuck for misattributing the quote.


- Original Message - 
From: "Aki Iskandar" <[EMAIL PROTECTED]>

To: 
Sent: Sunday, February 18, 2007 12:36 PM
Subject: **SPAM** Re: Languages for AGI [WAS Re: [agi] Priors and indefinite 
probabilities]





Before I comment on Mark's response, I think that the best comment on 
this email thread came from Pei, who wrote ...



"I guess you can see, from the replies so far, that what language
people choose is strongly influenced by their conception of AI. Since
people have very different opinions on what an AI is and what is the
best way to build it, it is natural that they selected different
languages, based mainly on its convenience for their concrete goal, or
even tried to invent new ones.

Therefore, I don't think there is a consensus on what the most
suitable language is for AI."


However, there was an upshot to all the replies to the original
question - as with any emotionally charged discourse, there are
nuggets of learning in it (I'm gaining insights into languages, so
others have likely learned things as well).


OK - now to briefly reply

[Mark]  Far and away, the best answer to the best language question  is 
the .NET framework.


[Aki]  This is by far too strong a statement - and most likely  incorrect. 
Mark, do you work at Microsoft?  I have, for 3 years (not  that it makes 
me a .NET expert by any means), and there are more  reasons than time I 
have to elaborate why I can't agree with your  statement.  Two of the 
nicest things about .NET are ADO.NET and  Reflection.  Java (which I think 
is not as strong or as pleasurable  to work with) has reflection.  But 
something that is readily available for Java (and soon for .NET - but not
yet) is object database management systems (ODBMS) - which may be of better
use than traditional RDBMS - and if not, still much better than ADO.NET -
from a developer's viewpoint when programming against a datastore.



Chuck is also absolutely incorrect that the only way to generate  code by 
code is to use Reflection.Emit.  It is very easy to have  your code write 
code in any language to a file (either real or  virtual), compile it, and 
then load the resulting library (real or  virtual) anytime you want/need 
it. There is absolutely no run-time  cost to this method (if you're 
keeping the compiled code somewhere  in your knowledge base) since you're 
dealing with compiled code



I'm the one that made that comment about Reflection.Emit - but I also
included CodeDOM.  And, from a practical, programmatic standpoint, those
are the only two ways of having code generate code.  The way you
mentioned - a text file - you still have to call the compiler (which
you can do through the above namespaces), but then you still have to
bring the DLL into the same appdomain and process.  In short, it is a
huge performance hit, and in no way would seem to be a smooth
transition.  There would be lots and lots of "hang time" or waiting -
and if you did this often, it's just completely impractical.  Any
execution speed advantage that .NET, in its compiled form, has over a
comparatively slower runtime - such as Python, for example - is lost.
Way lost.


However, I completely agree with Mark's comment about using existing
technologies such as RDBMSs - and not reinventing the wheel.  I know
nothing about Novamente, and so this comment is not meant as  "Novamente 
should have ...".   Its a ge

Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Eliezer S. Yudkowsky

Mark Waser wrote:


Chuck is also absolutely incorrect that the only way to generate code by 
code is to use Reflection.Emit.  It is very easy to have your code write 
code in any language to a file (either real or virtual), compile it, and 
then load the resulting library (real or virtual) anytime you want/need 
it. There is absolutely no run-time cost to this method (if you're 
keeping the compiled code somewhere in your knowledge base) since you're 
dealing with compiled code (as long as you know how to manage spawning 
and killing threads and processes so that you don't keep nine million 
libraries loaded that you'll never use again).


Heh.  Why not work in C++, then, and write your own machine language? 
No need to write files to disk, just coerce a pointer to a function 
pointer.  I'm no Lisp fanatic, but this sounds more like a case of 
Greenspun's Tenth Rule to me.


As Michael Wilson pointed out, only one thing is certain when it comes 
to a language choice for FAI development:  If you build an FAI in 
anything other than Lisp, numerous Lisp fanatics will spend the next 
subjective century arguing that it would've been better to use Lisp.


--
Eliezer S. Yudkowsky  http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Aki Iskandar


Chuck, I looked at Cobra yesterday, and I like it :-)

I will try to get some time and play with it.  My love of Python, and
my reluctant appreciation of .NET, are pointing me in the
direction of using one of 3 languages:


In no particular order:

1 - Python (CPython)
2 - IronPython
3 - Cobra

but I will also continue to explore Common Lisp as time permits ...  
its macros look promising ... but admittedly, it will take me some  
time to absorb the language - so for now, it's regular Python,
IronPython, or yours (Cobra)!


One thing for sure though ... at least from my view ... Java and C++  
are just not good enough - when I consider several factors ...  
including productivity.   With the languages out there today, C++  
makes absolutely no sense.   Java is just not as good as .NET ... but  
this is because it came first, and was the .NET guinea pig.  Java was  
great before C# / .NET.


~Aki




On 18-Feb-07, at 12:29 PM, Chuck Esterbrook wrote:


On 2/18/07, Mark Waser <[EMAIL PROTECTED]> wrote:
Chuck is also absolutely incorrect that the only way to generate  
code by
code is to use Reflection.Emit.  It is very easy to have your code  
write
code in any language to a file (either real or virtual), compile  
it, and
then load the resulting library (real or virtual) anytime you want/ 
need it.


I'm not incorrect--because I never said that. Aki Iskandar brought
that issue up. Then I pointed out that .NET code executes much faster
than Python. I was not stating or implying that Reflection.Emit was
the only means to produce .NET code.

My Cobra compiler, for example, currently generates C# instead of
bytecode, for numerous advantages:
(a) faster bootstrapping (C# is higher level than bytecode)
(b) leverage the excellent bytecode generation of the C# compiler
(c) use C#'s error checking as an extra guard against deficiencies in
my pre-1.0 compiler

There is absolutely no run-time cost to this method (if you're
keeping the compiled code somewhere in your knowledge base) since you're
dealing with compiled code (as long as you know how to manage spawning and
killing threads and processes so that you don't keep nine million
libraries loaded that you'll never use again).


Well "absolutely no run-time cost" is a bit strong. Code generation
itself takes time, no matter what technique you use. And if you go the
"generate source code route" then writing it to disk, invoking a
compiler and linking it back in is a pretty slow process. I've looked
for a way to do it all in memory, but haven't found one. (You can
actually link in the C# compiler as a DLL so it's resident in your
process, but its API still wants a disk-based file.)

But unless you're throwing away your generated code very quickly
without using it much (seems unlikely), you'll make up the difference
quite easily.

And even dynamically loading DLLs and managing how you use them,
unload them, etc. has *some* cost.

I also wouldn't sneer at using an established enterprise-class  
database to
serve as one or more of your core knowledge stores.  There is *a  
lot* of

...

You are absolutely...correct. I think the utility of existing database
servers is very underappreciated in academia and many AI researchers
are from academia or working on academia style projects (gov't
research grants or work to support research--not that there's anything
wrong with that!). But it's too bad as databases have a lot to offer.
Anyone, feel free to ask if you want me to expand.

The dumbest thing AGI researchers do is re-invent the wheel constantly when
it isn't necessary.  I'm heartily with Richard Loosemore and his call for
building a research infrastructure instead of all the walled gardens (with
long, low learning curves and horrible enhancement curves) that we have
currently.


Some reuse is easy. Fairly generic components like languages and
databases are easy to leverage on a project. After that, it gets very
difficult. Normally, something has to be documented, be stable, run fast,
be on the same platform *and* be the right fit before it will be
adopted on a serious project.

Regarding platform, while you and I like .NET some people will reject
it because Microsoft (and the former Borland engineers they hired to
work on it), created it. I've talked to people who said they would use
it if it were open source. So I point them to Novell Mono (the open
source clone) at which point they claim they can't use it because
Microsoft will eventually shut Novell down. After I point out that
Microsoft submitted .NET as a published standard so that projects like
Novell Mono could take place, well... then it's on to the next excuse.

One legit excuse is that some people already have a huge investment in
other platforms (Java) and cannot turn that around in terms of time
and money. We're already fragmented.

...

dealing with a whole framework rather than just a language).  And, of
course, all of this ignores the ultimate trump that several flavors
of LISP are

Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Aki Iskandar


Before I comment on Mark's response, I think that the best comment on  
this email thread came from Pei, who wrote ...



"I guess you can see, from the replies so far, that what language
people choose is strongly influenced by their conception of AI. Since
people have very different opinions on what an AI is and what is the
best way to build it, it is natural that they selected different
languages, based mainly on its convenience for their concrete goal, or
even tried to invent new ones.

Therefore, I don't think there is a consensus on what the most
suitable language is for AI."


However, there was an upshot to all the replies to the original
question - as with any emotionally charged discourse, there are
nuggets of learning in it (I'm gaining insights into languages, so
others have likely learned things as well).


OK - now to briefly reply

[Mark]  Far and away, the best answer to the best language question  
is the .NET framework.


[Aki]  This is by far too strong a statement - and most likely  
incorrect.  Mark, do you work at Microsoft?  I have, for 3 years (not  
that it makes me a .NET expert by any means), and there are more  
reasons than time I have to elaborate why I can't agree with your  
statement.  Two of the nicest things about .NET are ADO.NET and  
Reflection.  Java (which I think is not as strong or as pleasurable  
to work with) has reflection.  But something that is readily  
available for Java (and soon for .NET - but not yet) is object database
management systems (ODBMS) - which may be of better use than
traditional RDBMS - and if not, still much better than ADO.NET - from
a developer's viewpoint when programming against a datastore.



Chuck is also absolutely incorrect that the only way to generate  
code by code is to use Reflection.Emit.  It is very easy to have  
your code write code in any language to a file (either real or  
virtual), compile it, and then load the resulting library (real or  
virtual) anytime you want/need it. There is absolutely no run-time  
cost to this method (if you're keeping the compiled code somewhere  
in your knowledge base) since you're dealing with compiled code



I'm the one that made that comment about Reflection.Emit - but I also
included CodeDOM.  And, from a practical, programmatic standpoint, those
are the only two ways of having code generate code.  The way you
mentioned - a text file - you still have to call the compiler (which
you can do through the above namespaces), but then you still have to
bring the DLL into the same appdomain and process.  In short, it is a
huge performance hit, and in no way would seem to be a smooth
transition.  There would be lots and lots of "hang time" or waiting -
and if you did this often, it's just completely impractical.  Any
execution speed advantage that .NET, in its compiled form, has over a
comparatively slower runtime - such as Python, for example - is lost.
Way lost.


However, I completely agree with Mark's comment about using existing
technologies such as RDBMSs - and not reinventing the wheel.  I know
nothing about Novamente, and so this comment is not meant as
"Novamente should have ...".   It's a general comment to not reinvent
wheels.  If the wheel doesn't fit perfectly, you can build an
"adapter" for it.


Bottom line ... Pei is correct.  There will not be a consensus on
what the most suitable language is for AI.

Regards,
~Aki


On 18-Feb-07, at 11:39 AM, Mark Waser wrote:

What is the best language for AI begs the question --> For which  
aspect of AI?
And also --> What are the requirements of *this particular part*  
of  your AI and who is programming it.


Far and away, the best answer to the best language question is  
the .NET framework.  If you're using the framework, you can use any  
language that has been implemented on the framework (which includes  
everything from C# to the OCaml-like F# and nearly every language
in between -- though obviously some implementations are better than
others) AND you can easily intermix languages (so the answer to  
best language will vary from piece to piece).


 (as long as you know how to manage spawning and killing threads  
and processes so that you don't keep nine million libraries loaded  
that you'll never use again).


I also wouldn't sneer at using an established enterprise-class  
database to serve as one or more of your core knowledge stores.   
There is *a lot* of infrastructure where many ongoing AI projects  
have re-invented the wheel over and over again as they have to add  
features that come free with such a product.  I have to wonder how  
much further along Novamente would be if it had used something like  
Oracle or SQL Server instead of building their own custom knowledge  
store and having to constantly upgrade it (and yes, I am quite  
convinced that you could implement all of the necessary  
functionality in either with less programming time and with faster  
execution than your curre

Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Chuck Esterbrook

On 2/18/07, Mark Waser <[EMAIL PROTECTED]> wrote:

Chuck is also absolutely incorrect that the only way to generate code by
code is to use Reflection.Emit.  It is very easy to have your code write
code in any language to a file (either real or virtual), compile it, and
then load the resulting library (real or virtual) anytime you want/need it.


I'm not incorrect--because I never said that. Aki Iskandar brought
that issue up. Then I pointed out that .NET code executes much faster
than Python. I was not stating or implying that Reflection.Emit was
the only means to produce .NET code.

My Cobra compiler, for example, currently generates C# instead of
bytecode, for numerous advantages:
(a) faster bootstrapping (C# is higher level than bytecode)
(b) leverage the excellent bytecode generation of the C# compiler
(c) use C#'s error checking as an extra guard against deficiencies in
my pre-1.0 compiler


There is absolutely no run-time cost to this method (if you're keeping the
compiled code somewhere in your knowledge base) since you're dealing with
compiled code (as long as you know how to manage spawning and killing
threads and processes so that you don't keep nine million libraries loaded
that you'll never use again).


Well "absolutely no run-time cost" is a bit strong. Code generation
itself takes time, no matter what technique you use. And if you go the
"generate source code route" then writing it to disk, invoking a
compiler and linking it back in is a pretty slow process. I've looked
for a way to do it all in memory, but haven't found one. (You can
actually link in the C# compiler as a DLL so it's resident in your
process, but its API still wants a disk-based file.)

But unless you're throwing away your generated code very quickly
without using it much (seems unlikely), you'll make up the difference
quite easily.

And even dynamically loading DLLs and managing how you use them,
unload them, etc. has *some* cost.
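
As a rough illustration of that bookkeeping (a sketch under the .NET 2.0
AppDomain model, with a made-up Worker type; not anyone's production code):
an assembly can never be unloaded from the AppDomain it was loaded into, so
throwaway generated code is typically hosted in a child AppDomain that can be
discarded as a unit.

using System;

// Stand-in for a piece of generated logic; MarshalByRefObject lets the main
// domain talk to it across the AppDomain boundary via a proxy.
public class Worker : MarshalByRefObject
{
    public int Run(int x) { return x + 1; }
}

class DomainSketch
{
    static void Main()
    {
        AppDomain scratch = AppDomain.CreateDomain("scratch");

        Worker w = (Worker)scratch.CreateInstanceAndUnwrap(
            typeof(Worker).Assembly.FullName, typeof(Worker).FullName);

        Console.WriteLine(w.Run(41));   // 42, executed in the child domain

        // Unloading the domain releases every assembly loaded into it.
        AppDomain.Unload(scratch);
    }
}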


I also wouldn't sneer at using an established enterprise-class database to
serve as one or more of your core knowledge stores.  There is *a lot* of

...

You are absolutely...correct. I think the utility of existing database
servers is very underappreciated in academia and many AI researchers
are from academia or working on academia style projects (gov't
research grants or work to support research--not that there's anything
wrong with that!). But it's too bad as databases have a lot to offer.
Anyone, feel free to ask if you want me to expand.


The dumbest thing AGI researchers do is re-invent the wheel constantly when
it isn't necessary.  I'm heartily with Richard Loosemore and his call for
building a research infrastructure instead of all the walled gardens (with
long, low learning curves and horrible enhancement curves) that we have
currently.


Some reuse is easy. Fairly generic components like languages and
databases are easy to leverage on a project. After that, it gets very
difficult. Normally, something has to be documented, be stable, run fast,
be on the same platform *and* be the right fit before it will be
adopted on a serious project.

Regarding platform, while you and I like .NET some people will reject
it because Microsoft (and the former Borland engineers they hired to
work on it), created it. I've talked to people who said they would use
it if it were open source. So I point them to Novell Mono (the open
source clone) at which point they claim they can't use it because
Microsoft will eventually shut Novell down. After I point out that
Microsoft submitted .NET as a published standard so that projects like
Novell Mono could take place, well... then it's on to the next excuse.

One legit excuse is that some people already have a huge investment in
other platforms (Java) and cannot turn that around in terms of time
and money. We're already fragmented.

...

dealing with a whole framework rather than just a language).  And, of
course, all of this ignores the ultimate trump that several flavors of LISP
are available on the .NET framework.


Python also runs on .NET. In fact, Microsoft hired the guy that was
implementing Python on .NET and the project (IronPython) is now hosted
by Microsoft. So now you can have your cake, generate a new one at
runtime, dynamically load it, and eat it, too!

-Chuck

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Mark Waser
What is the best language for AI begs the question --> For which aspect of 
AI?
And also --> What are the requirements of *this particular part* of  your AI 
and who is programming it.


Far and away, the best answer to the best language question is the .NET 
framework.  If you're using the framework, you can use any language that has 
been implemented on the framework (which includes everything from C# to the 
OCaml-like F# and nearly every language in between -- though obviously some
implementations are better than others) AND you can easily intermix
languages (so the answer to best language will vary from piece to piece).


Chuck is also absolutely incorrect that the only way to generate code by 
code is to use Reflection.Emit.  It is very easy to have your code write 
code in any language to a file (either real or virtual), compile it, and 
then load the resulting library (real or virtual) anytime you want/need it. 
There is absolutely no run-time cost to this method (if you're keeping the 
compiled code somewhere in your knowledge base) since you're dealing with 
compiled code (as long as you know how to manage spawning and killing 
threads and processes so that you don't keep nine million libraries loaded 
that you'll never use again).


I also wouldn't sneer at using an established enterprise-class database to 
serve as one or more of your core knowledge stores.  There is *a lot* of 
infrastructure where many ongoing AI projects have re-invented the wheel 
over and over again as they have to add features that come free with such a 
product.  I have to wonder how much further along Novamente would be if it 
had used something like Oracle or SQL Server instead of building their own 
custom knowledge store and having to constantly upgrade it (and yes, I am 
quite convinced that you could implement all of the necessary functionality 
in either with less programming time and with faster execution than your 
current Novamente version).


The dumbest thing AGI researchers do is re-invent the wheel constantly when 
it isn't necessary.  I'm heartily with Richard Loosemore and his call for
building a research infrastructure instead of all the walled gardens (with 
long, low learning curves and horrible enhancement curves) that we have 
currently.


I also have to dispute Samantha's "I question whether you can get anywhere 
near the same level of reflection and true data <-> code equivalence in any 
other standard language."  Reflection is a core functionality of the .NET 
framework and available to *all* .NET languages in a much more 
computationally convenient form than how most of LISP's reflection turns 
out.  I would also argue that a higher level retrospection framework is more 
necessary and more easily built in .NET than in LISP (given that you're 
dealing with a whole framework rather than just a language).  And, of 
course, all of this ignores the ultimate trump that several flavors of LISP
are available on the .NET framework.
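
A small illustration of the reflection being referred to (a sketch with a
made-up Calculator type, not code from any actual project): types and methods
are ordinary run-time objects that any .NET language can enumerate and invoke
by name.

using System;
using System.Reflection;

public class Calculator
{
    public int Square(int x) { return x * x; }
}

class ReflectionSketch
{
    static void Main()
    {
        Type t = typeof(Calculator);

        // Treat code as data: list the declared public instance methods.
        foreach (MethodInfo m in t.GetMethods(
            BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly))
        {
            Console.WriteLine("{0} {1}", m.ReturnType.Name, m.Name);
        }

        // ...and call one by name, with no compile-time reference to it.
        object calc = Activator.CreateInstance(t);
        object result = t.GetMethod("Square").Invoke(calc, new object[] { 7 });
        Console.WriteLine(result);   // 49
    }
}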


- Original Message - 
From: "Chuck Esterbrook" <[EMAIL PROTECTED]>

To: 
Sent: Saturday, February 17, 2007 5:49 PM
Subject: **SPAM** Re: Languages for AGI [WAS Re: [agi] Priors and indefinite 
probabilities]




On 2/17/07, Aki Iskandar <[EMAIL PROTECTED]> wrote:

Richard, Danny, Pei, Chuck, Eugen, Peter ... thanks all for answering
my question.

...

C# is definitely a productive language, mainly due to the IDE, and it
is faster than Java - however, it is strongly typed.
Perhaps the disadvantage to C#, from my perspective, is that the only
ways to generate code (by code) are by using the Reflection.Emit and
CodeDOM namespaces.  However, the performance hit is far too costly -
because the code has to be compiled (to MSIL / bytecode) and
then the class type has to be loaded, and only then interpreted / run.

Java suffers the same fate, and is slower than C#.

Python is a duck typed language, and has very rich flexibility when
designing datastructures.  In addition, it has a few ways to evaluate
code on the fly (enabling code that writes code).


I've cranked out mounds of Python and C#, so I have a few things to
offer on the subject. Regarding C#'s productivity coming mostly from
the IDE, I think that is only part of the picture. C# offers many high
level, productive features including garbage collection, classes,
exception handling, bounds checking, delegates, etc. while at the same
time offering excellent runtime speed. Those features aren't available
in C and some of them aren't even available in C++. C# is also better
designed and easier to use than Java primarily because it was designed
after Java as a better version of Java.

Python is still faster to crank out code with (and Ruby as well), but
both Python and Ruby are ridiculously slow. That will be a serious
problem if your application is CPU intensive and I believe any AGI
will be (though early exploratory programs may not).

One approach is to use two languages: Yahoo cranked out their
web-based mail site 

Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Eugen Leitl
On Sun, Feb 18, 2007 at 12:40:03AM -0800, Samantha Atkins wrote:

> Really?  I question whether you can get anywhere near the same level of
> reflection and true data <-> code equivalence in any other standard
> language.  I would think this capability might be very important
> especially to a Seed AI.

Lisp is really great as a language for large-scale software systems - ones that
really push the envelope of software development in terms of the sheer size and
complexity of the result while still being functional and useful. With parallel
extensions (asynchronous message-passing primitives equivalent to at least a
subset of MPI) and run on suitable (10^6..10^9 node) hardware, there's no reason
why Lisp couldn't do AI, in principle. It might not be the best tool for the job,
but certainly not the worst, either.

However, the AI school represented here seems to assume a seed AI (an 
open-ended agent
capable of directly extracting information from its environment) is 
sufficiently simple
to be specified by a team of human programmers, and implemented explicitly by
a team of human programmers. This type of approach is most clearly represented
by Cyc, which is sterile. The reason is the assumption that the internal 
architecture
of human cognition is fully inspectable by human analyst introspection alone, 
and 
that furthermore the resulting extracted architecture is below the complexity 
ceiling 
accessible to a human team of programmers. I believe both assumptions are 
incorrect.

There are approaches which involve stochastic methods,
information theory and evolutionary computation which appear potentially 
fertile,
though the details of the projects are hard to evaluate, since they lack
sufficient peer-reviewed publications, source code, or even interactive
demonstrations.
Lisp does not particularly excel at these numerics-heavy applications, though 
e.g.
Koza used a subset of Lisp sexpr with reasonably good results. MIT Scheme folks 
demonstrated
automated chip design long ago, so in principle Lisp could play well with 
today's large FPGAs. 

-- 
Eugen* Leitl  http://leitl.org
__
ICBM: 48.07100, 11.36820            http://www.ativel.com
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303




Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Samantha Atkins
Eugen Leitl wrote:
> On Sat, Feb 17, 2007 at 08:24:21AM -0800, Chuck Esterbrook wrote:
>
>   
>> What is the nature of your language and development environment? Is it
>> in the same neighborhood as imperative OO languages such as Python and
>> Java? Or something "different" like Prolog?
>> 
>
> There are some very good Lisp systems (SBCL) with excellent compilers,
> rivalling C and Fortran in code quality (if you avoid common pitfalls
> like consing). Together with code and data being represented by
> the same data structure and good support of code generation by code
> (more so than any other language I've heard of) makes Lisp an evergreen
> for classical AI domains. (Of course AI is a massively parallel
> number-crunching application, so Lisp isn't all that helpful here).
>
>   
Really?  I question whether you can get anywhere near the same level of
reflection and true data <-> code equivalence in any other standard
language.  I would think this capability might be very important
especially to a Seed AI.

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303


Re: Languages for AGI [WAS Re: [agi] Priors and indefinite probabilities]

2007-02-18 Thread Samantha Atkins
Richard Loosemore wrote:
> Aki Iskandar wrote:
>>
>> Hello -
>>
>> I'm new on this email list.  I'm very interested in AI / AGI - but do
>> not have any formal background at all.  I do have a degree in
>> Finance, and have been a professional consultant / developer for the
>> last 9 years (including having worked at Microsoft for almost 3 of
>> those years).
>>
>> I am extremely happy to see that there are people out there that
>> believe AGI will become a reality - I share the same belief.  Most,
>> to all, of my colleagues see AI as never becoming a reality.  Some
>> that do see intelligent machines becoming a reality - believe that it
>> is hardware, not software, that will make it so.  I believe the
>> opposite ... in that the key is in the software - the hardware we
>> have today is ample.
>>
>> The reason I'm writing is that I am curious (after watching a couple
>> of the videos on google linked off of Ben's site) as to why you're
>> using C++ instead of other languages, such as C#, Java, or Python. 
>> The later 2, and others, do the grunt work of cleaning up resources -
>> thus allowing for more time to work on the problem domain, as well as
>> saving time in compiling, linking, and debugging.
>>
>> I'm not questioning your decision - I'm merely curious to learn about
>> your motivations for selecting C++ as your language of choice.
>>
>> Thanks,
>> ~Aki
>
> It is not always true that C++ is used (I am building my own language
> and development environment to do it, for example), but if C++ is most
> common in projects overall, that probably reflects the facts that:
>
> (a) it is most widely known, and
> (b) for many projects, it does not hugely matter which language is used.
>
> Frankly, I think most people choose the language they are already most
> familiar with.  There just don't happen to be any Cobol-trained AI
> researchers ;-).
>
> Back in the old days, it was different.  Lisp and Prolog, for example,
> represented particular ways of thinking about the task of building an
> AI.  The framework for those paradigms was strongly represented by the
> language itself.
>

What do you have in mind?  Pretty much every mechanism in any computer
language known was initially developed and often perfected in Lisp. 
Thus it does not seem to me that Lisp was at all tied to a particular form
of program or programming much less to some forms of AI.

- samantha

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303