Re: P.S. Re: [IAEP] etoys now available in Debian's non-free repository

2008-06-28 Thread K. K. Subramaniam
On Saturday 28 Jun 2008 4:51:47 pm Alan Kay wrote:
> It was realized that most computing of the 50s and 60s was rather like
> ...
> state in which they will become part of the ecology.
I propose that this overview be included as part of Squeak. Squeak is very 
different from conventional programming toolkits. A good overview like this 
would help set the right perspective at the beginning and eliminate many 
misunderstandings down the road.

Subbu
___
Devel mailing list
Devel@lists.laptop.org
http://lists.laptop.org/listinfo/devel


Re: P.S. Re: [IAEP] etoys now available in Debian's non-free repository

2008-06-28 Thread Yoshiki Ohshima
> Continuing with the biological analogy, the folks who want to be able to
> bootstrap a Squeak/etoys image (starting from 'scratch' without such an image)
> want literally to be able to make ontogeny recapitulate phylogeny -- not
> necessarily every time an image starts, possibly not necessarily every time
> Squeak is 'built' -- but at least with similar frequency and the ease of
> bootstrapping gcc using a different C compiler. (Like using a turtle egg to
> hatch a dinosaur ;-)

  I believe that the question is not whether a system can do it or
not.  Smalltalk-72 did it, GNU Smalltalk and Scheme 48 do it, and our
fonc project is taking a similar approach right now.  So bootstrapping
is possible.  However, given the current status, the real question is
the trade-off between the value of doing so and having to change the
existing system in a fundamental way.

-- Yoshiki


Re: P.S. Re: [IAEP] etoys now available in Debian's non-free repository

2008-06-28 Thread Dan Krejsa
On Sat, 2008-06-28 at 04:21 -0700, Alan Kay wrote:

> It was realized that most computing of the 50s and 60s was rather like
> synthetic chemistry in which structures are built atom by atom and
> molecule by molecule. This gets more and more difficult for larger and
> more complex combinations. "Life" generally uses a process quite
> different -- instead of building cells and organisms, it grows them.
> This leads to seeming paradoxes in the epistemology of making -- i.e.
> to make a cell we need a cell; to make a chicken we need a chicken.
> However, all works fine in the epistemology of growing. But the
> initial bootstrapping is a little tricky. Once the bootstrap is done
> to make life then life can assist much more powerfully in making more
> life, and to vary life, etc. As mentioned before, the Internet is one
> of these, and so is Smalltalk.
> 
> In "biologically inspired" architectures one is much more interested
> in how the organism and ecologies of them are to be sustained, and how
> the dynamical systems can be understood, fixed, changed, evolved,
> reformulated, etc., while they are running and with the help of tools
> that work within them. Just as a cell, and especially e.g. a human
> baby, is not made by hand, we are more interested in making growth
> processes that can be influenced with far more ease than direct
> construction. So, most objects are made by other objects acting under
> conditions that include the dynamic state in which they will become
> part of the ecology.

It seems to me that this analogy is a fairly good one -- although
there are definite differences, in that the 'cell' needed to make
another 'cell' in the etoys/squeak case has been designed with lots
of tools to make it easy to inspect and modify itself, as well as
to completely sequence its DNA or produce a new generation on demand.

Living systems are however notorious for carrying historical baggage
along with them in their genotype, and, since the phenotype cannot
easily be recreated without starting with a parent phenotype, the Ken
Thompson hack implies that inheritable baggage can (paradoxically) be 
carried in the phenotype as well. I think the number of somewhat independent
tools provided to in[tro]spect a running image would make an intentional
malicious Thompson hack in Squeak quite difficult to maintain for long
without discovery; but I would naively guess that there is some 'harmless'
baggage that looks reasonable and is allowed to continue just due to inertia.
Since the 'DNA' (source code) of etoys/squeak is readily available in a
transparent, human-understandable form, it seems to me that the only issue
of possible concern is the lesser visibility of the 'paradoxical' inheritance
via phenotype/image. Or at least, its lesser visibility if one refuses
to run etoys/squeak to use the tools it provides to inspect itself or
its images.
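
The trusting-trust idea above can be made concrete with a toy model. The sketch below (all names are illustrative; it stands in for no real compiler) models a compromised compiler as a function: it compiles faithfully except that it backdoors the login program and re-inserts its own hack whenever it compiles the compiler -- so rebuilding from fully clean, audited source still yields a compromised binary.

```python
# Toy model of the Ken Thompson "trusting trust" hack.
# All names here are illustrative; this sketches the idea only.

CLEAN_COMPILER_SRC = "compiler v1 (clean source)"
LOGIN_SRC = "login program source"

def hacked_compile(source):
    """A compromised compiler binary, modeled as a function.

    It compiles faithfully, except for two special cases: it inserts a
    backdoor when compiling the login program, and it re-inserts its
    own hack when compiling the compiler -- even from clean source.
    """
    if source == LOGIN_SRC:
        return "login binary + BACKDOOR"
    if source == CLEAN_COMPILER_SRC:
        return hacked_compile  # propagate the hack into the new binary
    return f"binary({source})"

# Rebuild the compiler from fully clean, audited source ...
new_compiler = hacked_compile(CLEAN_COMPILER_SRC)
# ... and the backdoor survives, invisible in any source file:
print(new_compiler(LOGIN_SRC))  # login binary + BACKDOOR
```

The point Dan makes is that Squeak's many independent inspection tools operate on the running image, which is exactly where such "phenotype" baggage would have to hide.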

Continuing with the biological analogy, the folks who want to be able to
bootstrap a Squeak/etoys image (starting from 'scratch' without such an image)
want literally to be able to make ontogeny recapitulate phylogeny -- not
necessarily every time an image starts, possibly not necessarily every time
Squeak is 'built' -- but at least with similar frequency and the ease of
bootstrapping gcc using a different C compiler. (Like using a turtle egg to
hatch a dinosaur ;-)
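
The gcc-style bootstrap mentioned above has a well-known shape, sketched here as a toy model (the names and the "binary" encoding are illustrative): build the compiler with some other host compiler, rebuild it with itself, rebuild once more, and check that the last two stages agree -- the fixed point that shows the result no longer depends on the compiler you started from.

```python
# Toy model of a three-stage bootstrap, in the spirit of building gcc
# with a different C compiler. Names are illustrative; the point is
# the fixed-point comparison at the end.

GCC_SRC = "gcc source"

def host_cc(source):
    """Some other, pre-existing compiler (e.g. the vendor's cc)."""
    return ("host-codegen", source)

def make_compiler(binary):
    """Running a binary built from GCC_SRC yields a compiler whose
    output no longer depends on who compiled that binary."""
    def cc(source):
        return ("gcc-codegen", source)
    return cc

stage1 = make_compiler(host_cc(GCC_SRC))  # gcc built by the host cc
stage2 = make_compiler(stage1(GCC_SRC))   # gcc built by stage1
stage3 = make_compiler(stage2(GCC_SRC))   # gcc built by stage2

# The classic bootstrap sanity check: stage2 and stage3 must agree.
assert stage2(GCC_SRC) == stage3(GCC_SRC)
```

In the biological framing, this is the "egg" route: a foreign parent hatches the first generation, after which the lineage breeds true.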

I don't have a strong opinion on this myself, but I do find the
discussion interesting.

- Dan





P.S. Re: [IAEP] etoys now available in Debian's non-free repository

2008-06-28 Thread Alan Kay
P.S. I thought of a different way to possibly resolve this.

It occurs to me that this discussion and differences of opinion could really be 
about how executables are made. One of the main issues cited has to do with 
security, and a notion that being able to see the sources will help.

The simplest application model today has a file which contains executable code 
and one or more files of non-executable data which is manipulated by the 
executable file when they are combined into an OS process. For example, a text 
editor in Linux will have a file of executable code and a file that is a text 
document that (presumably) has no executables in it.

Since (Squeak) Smalltalk is set up to run under various OSs (even though it 
doesn't need them), this model is followed. There is an executable file which 
contains all the executable machine code in the system -- it usually has a name 
like Squeak.exe -- and there can be any number of "document" files which 
contain only data. The "image" file is such a document file, and it contains no 
executable code wrt any computer which might use it -- it is a pure data 
structure.
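
The image-as-document point can be illustrated with a short sketch: the image file is opened and inspected like any data file, and nothing in it is ever handed to the CPU as machine code by the host OS. The magic numbers used below (6502 for classic 32-bit images, 6504 with closure support) are what I believe Squeak uses, but treat them as an assumption; the stand-in file is fabricated so the sketch is self-contained.

```python
# Sketch: a Squeak-style image file is pure data that any program can
# read. Magic numbers 6502/6504 are assumed, not verified here.
import struct

def read_image_version(path):
    """Read the version word from the start of an image-style file."""
    with open(path, "rb") as f:
        (version,) = struct.unpack("<I", f.read(4))
    return version

# Build a stand-in "image" file so the sketch is self-contained.
with open("demo.image", "wb") as f:
    f.write(struct.pack("<I", 6504) + b"\x00" * 60)  # header word + padding

print(read_image_version("demo.image"))  # 6504
```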

Squeak.exe or its equivalent is ultimately made by (a subset of) the C compiler 
and tools of the operating system that will invoke it, give it access to the 
screen, keyboard, file system, etc. Every piece of C code that is fed to 
the C compiler is in one file or another and available for perusal. In 
practice, we don't write most of this code by hand, but generate it from a 
higher level architecture and debugging process that is internal to the Squeak 
system. But still, the result is C code that is compiled by someone else's C 
compiler (in this case one of the Linux compilers) into standard Linux 
executables that will be run in a standard way. Any Linux person can examine 
all the source files using tools they are familiar with.
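
The generation step Alan describes -- C code produced mechanically from a higher-level description inside Squeak -- can be suggested with a deliberately tiny translator. The real pipeline (VMMaker, translating the "Slang" subset of Smalltalk to C) is far more elaborate; the function name and the one-message grammar below are purely illustrative.

```python
# Toy illustration of generating C from a higher-level description,
# in the spirit of Squeak's Slang-to-C translation. The grammar here
# (a single binary arithmetic message) is purely illustrative.

def slang_to_c(selector, arg_names, body_expr):
    """Translate a one-expression Slang-style method into a C function.

    body_expr is a tuple (receiver, operator, argument), i.e. a
    Smalltalk-style binary message such as `a + b`.
    """
    recv, op, arg = body_expr
    params = ", ".join(f"long {a}" for a in arg_names)
    return (f"long {selector}({params}) {{\n"
            f"    return {recv} {op} {arg};\n"
            f"}}\n")

print(slang_to_c("addOopWith", ["a", "b"], ("a", "+", "b")))
# long addOopWith(long a, long b) {
#     return a + b;
# }
```

Whatever the generator looks like, the output is ordinary C source, which is Alan's point: any Linux person can read it with tools they already know.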

The executable file is like an OS kernel that can incorporate executable 
plugins -- for graphics, sound, the particle system (Mozilla has even been run 
as a plugin), etc. -- so there can be lots of stuff in this file -- but again, 
all of it ultimately has to be made in a form that is acceptable to the 
underlying OS which owns the machine resources.

Seems as though this should do it. And all these files are readily available 
for all of the OSs we deal with.

We would build the Squeak kernel very differently if there were no OS to 
contend with or that has to be cooperated with. But, since we don't regard 
ourselves as a religion or the one right way, we are content to go along with 
current conventions where the alternative is needless argumentation.

The rest of the misunderstandings seem to me to be epistemological, and 
ultimately go back to very different POVs taken by different research groups in 
the 60s.

From several different motivations, the ARPA (and then the PARC branch of 
ARPA) community got into thinking about highly scalable architectures, partly 
motivated by the goal of a world-wide network which had to scale better than 
any computer artifact by many orders of magnitude. This community also took 
Moore's Law seriously, particularly wrt personal computing and just how many 
nodes the proposed network might have. This led to a "no-centers" style which 
manifested itself most strongly in the 70s. The most extreme successful 
versions of this style eliminated the OS, file systems, applications, data 
structures, simulated punched cards/teletype, etc., in favor of what Marvin 
Minsky called a "heterarchical" (as opposed to hierarchical) organization of as 
much as possible.

Several of the formulators of this style had considerable backgrounds in 
biology, whose no-center scaling then and now goes far beyond anything 
successfully done in computing.

It was realized that most computing of the 50s and 60s was rather like 
synthetic chemistry in which structures are built atom by atom and molecule by 
molecule. This gets more and more difficult for larger and more complex 
combinations. "Life" generally uses a process quite different -- instead of 
building cells and organisms, it grows them. This leads to seeming paradoxes in 
the epistemology of making -- i.e. to make a cell we need a cell; to make a 
chicken we need a chicken. However, all works fine in the epistemology of 
growing. But the initial bootstrapping is a little tricky. Once the bootstrap 
is done to make life then life can assist much more powerfully in making more 
life, and to vary life, etc. As mentioned before, the Internet is one of these, 
and so is Smalltalk.

In "biologically inspired" architectures one is much more interested in how the 
organism and ecologies of them are to be sustained, and how the dynamical 
systems can be understood, fixed, changed, evolved, reformulated, etc., while 
they are running and with the help of tools that work within them. Just as a 
cell, and especially e.g. a human baby, is not made by hand, we are more 
interested in making growth processes that can be influenced with far more 
ease than direct construction. So, most objects are made by other objects 
acting under conditions that include the dynamic state in which they will 
become part of the ecology.