[fonc] languages

2011-06-04 Thread Julian Leviston
Hi,

Is a language I program in necessarily limiting in its expressibility?

Is there an optimum methodology of expressing algorithms (i.e. nomenclature)? Is 
there a good or bad way of expressing intent? Are there any intent languages in 
existence? Are there any pattern or algorithm languages? Is a programming 
language necessarily these two combined?

These are the questions I've been finding myself pondering lately.

For example, expressing object oriented concepts and patterns in C, while 
possible, proves rather "uncomfortable". Some things are almost impossible 
unless one "builds a world" inside C, but this is essentially building another 
language and using C as the meta-platform for this language, no? This would 
have to do with the fact that this wasn't part of the language's original 
design intent, surely? Is there a way of patterning a language of 
programming such that it can extend itself infinitely, innately? Was Smalltalk 
the first attempt at this? Does it fail by being too "large" in structural 
organisation?

In other words, would a "language" (or exploratory platform for programming) 
inherently require being "ridiculously simple" in terms of its structure in 
order to fully be able to represent any other "language" (or rather than 
language, simply more complicated structures) clearly?

Is OMeta an example of this?

Julian.


Re: [fonc] languages

2011-06-04 Thread Alan Kay
Smalltalk was certainly not the first attempt -- and -- the most versatile 
Smalltalk in this area was the first Smalltalk and also the smallest.


I personally think expressibility is not just semantic, but also syntactic. 
Many 
different styles of programming have been realized in Lisp, but "many to most" 
of them suffer from the tradeoffs of the uniform, parentheses-bound notation 
(there are positive aspects of this also because the uniformity does remove one 
kind of mystery). 


The scheme that Dan Ingalls devised for the later Smalltalks overlapped with 
Lisp's, because Dan wanted a programmer to be able to parse any Smalltalk 
program at sight, no matter how much the semantics had been extended. 
Similarly, 
there was a long debate about whether to put in "normal" precedence for the 
common arithmetic operators. The argument that won was based on the APL 
argument 
that if you have lots of operators then precedence just gets confusing, so just 
associate to the right or left. However, one of the big complaints about 
Smalltalk-80 from the culture that thinks parentheses are a good idea after 
"if", is that it has a non-standard precedence for + and * 

A more interesting tradeoff perhaps is that between Tower of Babel and local 
high expressibility -- for example, when you decide to try lots of DSLs (as in 
STEPS). Each one has had the virtue of being very small and very clear about 
what is being expressed. At the meta level, the OMeta definitions of the syntax 
part are relatively small and relatively clear. But I think a big burden does 
get placed on the poor person from outside who is on the one hand presented 
with 
"not a lot of code that does do a lot", but they have to learn 4 or 5 languages 
to actually understand it.

People of my generation (50 years ago) were used to learning and using many 
syntaxes (e.g. one might learn as many as 20 or more machine code/assembler 
languages, plus 10 or more HLLs, both kinds with more variability in form and 
intent than today). Part of this could have stemmed from the high percentage of 
"math people" involved in computing back then -- part of that deal is learning 
to handle many kinds of mathematical notations, etc. 


Things seem different today for most programmers.

In any case, one of the things we learned from Smalltalk-72 is that even good 
language designers tend to create poor extensions during the heat of 
programming 
and debugging. And that an opportunity for cohesion in an extensible language 
is 
rarely seized. (Consider just how poor is the cohesion in a much smaller part 
of 
all this -- polymorphism -- even though it is of great benefit to everyone to 
have really strong (and few) polymorphisms.)
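
One familiar instance of a strong, widely shared polymorphism in this sense is the 
collection enumeration protocol, where a single selector serves the whole hierarchy 
(a Squeak-style sketch, assuming an open Transcript):

"#do: is understood by Arrays, Sets, Strings, ... -- one enumeration vocabulary for all"
#(1 2 3) do: [:each | Transcript show: each printString; cr].
(Set withAll: #(1 2 3)) do: [:each | Transcript show: each printString; cr].
'abc' do: [:each | Transcript show: each printString; cr].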

Cheers,

Alan









Re: [fonc] languages

2011-06-05 Thread Florin Mateoc
I would object to the claim that complaints about the non-standard precedence 
are somehow characteristic of the culture of if(

The clash is with math, not with another programming language. And it clashes 
with a well-established convention in math, therefore with the 
readability/expressibility of math formulae. Of course, a mathematician can 
agree that these are only conventions, but a mathematician already thinks in a 
highly abstract way, whereas for Smalltalk the argument was made that this 
would 
somehow help kids think more abstractly and better get the concept of 
precedence. But kids do not think abstractly. Furthermore, the precedence of + 
and * is not so much about the behavior of arithmetic operators. Regardless of 
what mathematical structure we define them over, they are the very operators used to 
define the notion of distributivity, closely related to the notion of 
precedence. Saying that "addition distributes over multiplication" instead of 
"multiplication distributes over addition" is not more abstract, it just 
confuses the notions. We might as well start writing Smalltalk with lower caps 
as the first letter in all identifiers followed by all caps. aND cLAIM tHAT 
tHIS 
wILL hELP kIDS  tHINK mORE aBSTRACTLY aND sEE tHAT tHE wAY wE cAPITALIZE iS 
oNLY 
a cONVENTION.

As for the other operators, if they do not have some pre-defined/embedded 
precedence, they might as well use left to right. But the few of them that do 
would have warranted an exception.
I was thinking recently that operators would actually be a perfect use case for 
multimethods in Smalltalk, and that the precedence problem is more a 
consequence 
of the lack of multimethods. Anyway, for numbers, the matter of behavior 
responsibility is also questionable. And since binary selectors are already 
recognized as different in the language, implementing operators as multimethods 
could be done with minimal impact for readability. This approach could even be 
extended to support multikeyword multimethods by having the first keyword start 
with a binary selector character.
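
For contrast, the classic single-dispatch workaround for mixed-operand arithmetic is 
double dispatch rather than multimethods; a minimal sketch of the idiom (the selector 
names below are illustrative, not the actual Squeak coercion protocol):

"The receiver re-dispatches on the argument by sending a second, type-specific message."
Integer >> + aNumber
    ^ aNumber sumFromInteger: self

Integer >> sumFromInteger: anInteger
    "Integer + Integer: both operand types are now known."
    ^ self basicAddInteger: anInteger    "stand-in for the real primitive"

Float >> sumFromInteger: anInteger
    "Integer + Float: coerce the integer and retry as Float + Float."
    ^ anInteger asFloat + self

With true multimethods, the operand-type pair would select the method directly instead 
of being threaded through the second send.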

Best,
Florin






Re: [fonc] languages

2011-06-05 Thread Alan Kay
Check out APL, designed by a very good mathematician, to see why having no 
special precedences has merit in a language with lots of operators.

However, I think that we should have used the standard precedences in 
Smalltalk. 
Not from the math argument, or from a kids argument, but just because most 
conventional routes deposit the conventions on the travelers.

The arguments either way don't have much to do with "consequences of message 
sending" because what can be sent as a canonical form could be the abstract 
syntax packaging. Prolog had an idea -- that we thought about to some extent -- 
of being able to specify right and left precedences, but this was rejected as 
leading to real needless complexities.

Cheers,

Alan






Re: [fonc] languages

2011-06-05 Thread Florin Mateoc
But wasn't APL called a "write-only language", which would make it in a way a 
polar opposite of Smalltalk?

I agree that it is not about "consequences of message sending". And, while I 
also agree that uniformity/simplicity are also virtues, I think it is more 
useful to explicitly state that there are "things" which are truly different. 
Especially in an object system which models the world. Numbers would be in that 
category, they "deserve" to be treated specially. In the same vein, I think 
mathematical operators "deserve" special treatment, and not just from an under 
the covers, optimization point of view.

Thank you,
Florin






Re: [fonc] languages

2011-06-05 Thread Alan Kay
Yep, and yep

Cheers,

Alan






Re: [fonc] languages

2011-06-05 Thread Steve Wart
I like both Smalltalk and APL. I disagree with the assumption that
operator precedence is a big hurdle for people learning Smalltalk. At
least I find mathematical expressions in Smalltalk to be clearer than
their counterparts in Lisp. I like the following example:

[:n :k | (1 to: k) inject: 1 into: [:c :i | c * (n - k + i / i)]]

(defn choose [n k]
  (reduce (fn [c i] (* c (/ (+ (- n k) i) i))) 1 (range 1 (+ k 1))))
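
As a quick sanity check, both versions compute the same binomial coefficient. 
Hypothetical workspace usage of the Smalltalk block (note that it leans on the 
left-to-right rule discussed earlier: n - k + i / i reads as ((n - k) + i) / i):

| choose |
choose := [:n :k | (1 to: k) inject: 1 into: [:c :i | c * (n - k + i / i)]].
choose value: 5 value: 2.    "=> 10, i.e. 5 choose 2"

(choose 5 2) in the Clojure version gives the same 10.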

Okay maybe they're both hard to understand; nobody said math was easy.
Lisp has seen a huge resurgence in popularity thanks to Clojure.
Smalltalk has also seen nice growth, although on a much smaller scale,
and sadly it's not generally considered viable for enterprise software
development anymore (which is generally the kind of code that matters
to me, boring as it is). But math operators are a red herring. No
programming language really does math well (except maybe APL).
Accountants, engineers and scientists have got on well enough using
whatever lets them do their calculations, but by and large these
operations are a very small part of any reasonably-sized program.

After spending the better part of the past year poring over a very
large Smalltalk code base, I think the biggest conceptual barrier is
that understanding Smalltalk code requires tools that leverage the
language metadata to dynamically analyze what's going on (I'm talking
about menu commands to search for senders and implementers of various
methods, and similar beasts). I think these tools offer a mechanism
that will eventually give you a conceptual understanding of what the
code is doing. Maybe that can be formalized or be proved equivalent or
superior to the explicit type information provided in more
conventional programming languages, maybe not.
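
For anyone who has not used those tools, the same queries can also be driven from 
code; a Squeak-style sketch (exact selector names vary across dialects):

"Open a browser on every method that implements a given selector..."
SystemNavigation default browseAllImplementorsOf: #inject:into:.
"...and on every method that sends (calls) it."
SystemNavigation default browseAllCallsOn: #inject:into:.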

Personally I don't think grep and javadoc are better, but the vast
majority of programmers in the world must disagree with me.

Type systems are for reasoning about code, whereas most programs are
written with a computational intent that is generally not formalized
or even formalizable. While it's nice to have programs that can be
formally proved, if you can't prove that your specification is correct
too, there's not much point in it. Ultimately what matters is fitness
for purpose, a big part of which is social utility and communicating
the intent to someone far removed from the original implementation.

In short, it's the libraries and how you can manage the dependencies
amongst your "units" of code that really matter most.

Cheers,
Steve


Re: [fonc] languages

2011-06-05 Thread BGB
> In short, it's the libraries and how you can manage the dependencies
> amongst your "units" of code that really matter most.

ok.

I would personally like to see an IDE which was:
more-or-less language neutral, to what extent this was practical (more like
traditional standalone editors);
not tied to or hard-coded for particular tools or build configurations
(nearly everything would be "actions" tied to high-level scripts, which
would be customizable per-project, and ideally in a readily human-editable
form);
not being tied to a particular operating system;
...




Re: [fonc] languages

2011-06-05 Thread C. Scott Ananian
On Sun, Jun 5, 2011 at 8:35 PM, BGB  wrote:

> I would personally like to see an IDE which was:
> more-or-less language neutral, to what extent this was practical (more like
> traditional standalone editors);
> not tied to or hard-coded for particular tools or build configurations
> (nearly everything would be "actions" tied to high-level scripts, which
> would be customizable per-project, and ideally in a readily human-editable
> form);
> not being tied to a particular operating system;
> ...


This is Eclipse.  Granted, it's an IDE which is designed-by-committee and
hard to love, but it answers all of your requirements.
  --scott

-- 
  ( http://cscott.net )


Re: [fonc] languages

2011-06-06 Thread BGB


I don't believe Eclipse is it, exactly...
it handles multiple languages, yes, and can be used with multiple 
operating systems, and supports multiple compiler backends, ...


however, AFAIK, pretty much all of the logic is written in Java-based 
plugins, which is not ideal (and so, essentially the logic is tied to 
Eclipse itself, and not to the individual projects).



I was imagining something a little different here, such as the project 
control files being more like Makefiles or Bash-scripts, and so would be 
plain-text and attached to the project (along with the source files), 
where it is possible to control things much more precisely per-project. 
More precisely, I had imagined essentially a hybrid of Makefiles and Bash.


also imagined was the possibility of using JavaScript (or similar) as 
the build-control language, just using JS in a manner similar to 
Make+Bash, likely with some special-purpose API functionality (to make 
it more usable for Make-like purposes).


a difficulty with JS though is that, normally, IDEs like things to be 
fairly declarative, and JS code is not declarative, unless the JS is 
split into multiple parts:
info about the project proper is stored in a JSON-based format, and then 
any build logic is JS files attached to the project.


so, the IDE would mostly just manage files and editors, and invoke the 
appropriate scripts as needed, and many IDE actions essentially just 
call functions, and so one causes something to happen by replacing the 
default action functions (such as in a script loaded by the project file).


actually, conceptually I like the JS route more, even if it would likely 
be a little more verbose than a Bash-like syntax.



IMO, the next best alternative is SciTE, so what I was imagining would 
be a more "expanded" version of SciTE.


then there is also CMake, ...

there is also SCons, which is conceptually related to the prior idea, 
but it is based on Python.



but, for the most part, I have mostly just ended up sticking with good 
old text editors and makefiles, as these have served me well, despite 
their drawbacks (the cost of switching to an alternative strategy likely 
being somewhat higher than that of doing nothing and staying with the 
present strategy). IOW, the "if it aint broke, don't fix it" strategy...



or such...



Re: [fonc] languages

2011-06-06 Thread Casey Ransberger
I've heard of an IDE called VisualAge (I think?) that was written in Smalltalk 
but could parse and to a degree reason about other languages, but I've never 
seen it. 

Have you looked for that thing, or was it just not so great?



Re: [fonc] languages

2011-06-06 Thread BGB




not really looked at VisualAge...

was intending here to look some at "Code::Blocks", since I got thinking 
about IDEs some.



personally, I have not been as much into Smalltalk, due mostly to my 
apparent inability to understand it when looking at it (look at syntax 
reference, look at code, feel utterly confused as to just what I am 
looking at...). (like, somehow, my brain can't really parse it or make 
much sense of it).


I have taken some ideas off of both Smalltalk and Self, although in the 
form of a language I can more easily understand though...


granted, there are other languages like this (like my apparent inability 
to really make sense of Haskell either).


although, I can generally read/understand Forth and PostScript 
acceptably well, so I really don't know sometimes.



although, recently when writing documentation for some parts of my 
language, I made an observation:

str="Hello";
s=str;
while(*s)
printf("%c", *s++);
printf("\n");

basically, along the lines of:
"holy crap... my language still retains a fair amount in common with C...".

this was not originally intended (mostly, I was trying to implement 
ECMA-262 and add ActionScript and Java like features), just the 
combination of "little things" (implementing stuff, thinking "oh well, 
this would be nifty...") happens to allow a few C-like constructions to 
be written.


also:
buf=new char[256];
str="Hello";
t=buf; s=str;
while(*t++=*s++);

funny how this works sometimes...


or such...



Re: [fonc] languages

2011-06-06 Thread K. K. Subramaniam
On Sunday 05 Jun 2011 12:16:33 AM Alan Kay wrote:
> People of my generation (50 years ago) were used to learning and using
> many  syntaxes (e.g. one might learn as many as 20 or more machine
> code/assembler languages, plus 10 or more HLLs, both kinds with more
> variability in form and intent than today).
Learning multiple languages didn't stop with your generation ;-). In the 80s, 
the fashion of the day was not only to learn many languages but also invent 
your own! One of the languages was called JOVIAL, an acronym for Jules Own 
Version of Algol!

;-) .. Subbu



Re: [fonc] languages

2011-06-06 Thread Alan Kay
Hi Subbu

Check out when Jules Schwartz actually did JOVIAL. And the acronym was actually 
"Jules' Own Version of the International Algebraic Language"

Cheers,

Alan







Re: [fonc] languages

2011-06-06 Thread K. K. Subramaniam
Alan,

Thanks for the correction. IAL was one of the proposed names for ALGOL, 
wasn't it?

The reason why this name popped up from my grad days was because something as 
complicated as designing a new programming language was considered a fun thing 
to do. It wasn't as much fun for those who had to maintain programs written in 
them years down the line ;-). Gerald Weinberg's parody - Levine, the great 
Tailor - should serve as a lesson even today.

Subbu



Re: [fonc] languages

2011-06-06 Thread Alan Kay
It was ... and is mostly associated with what came to be called Algol 58, but 
not Algol 60.

Another way to look at it is that "almost all systems are difficult to maintain 
down the line" -- partly because they were not designed with this in mind -- 
and 
this is true for most programming languages. However, I don't think this is 
necessary, but more an artifact of incomplete design.

Cheers,

Alan







Re: [fonc] languages

2011-06-06 Thread Casey Ransberger
Inline

On Jun 6, 2011, at 10:48 AM, Alan Kay  wrote:

> It was ... and is mostly associated with what came to be called Algol 58, but 
> not Algol 60.
> 
> Another way to look at it is that "almost all systems are difficult to 
> maintain down the line" -- partly because they were not designed with this in 
> mind -- and this is true for most programming languages. However, I don't 
> think this is necessary, but more an artifact of incomplete design.

This, and design drift, wherein over time various forms of pseudo-arch get 
piled up and end up jutting out at weird angles:)



Re: [fonc] languages

2011-06-06 Thread BGB

On 6/6/2011 10:29 AM, K. K. Subramaniam wrote:
> Alan,
>
> Thanks for the correction. IAL was one of the proposed names for ALGOL,
> wasn't it?
>
> The reason why this name popped up from my grad days was because something as
> complicated as designing a new programming language was considered a fun thing
> to do. It wasn't as much fun for those who had to maintain programs written in
> them years down the line ;-). Gerald Weinberg's parody - Levine, the great
> Tailor - should serve as a lesson even today.


agreed...


but, yeah, designing a language is generally a fun/interesting thing to 
do, and so is working on the compiler/VM, fiddling with stuff, ... it 
gives much more of a sense of "doing stuff" than some other activities, 
and is not necessarily (entirely) busywork or mindless tedium either...


granted, yes, my designs tend to be fairly conservative (and not 
generally "minimalist"), as I don't personally believe in unnecessary 
novelty either.


but, one can simplify things when useful as well (say, when preexisting 
options either don't exist or are not very good, are overly cumbersome, 
or would be a pain to implement, ...).



for example, I was recently (maybe 30 minutes ago) left considering the 
issue of what should be the syntax for type-specialized arrays.


in C, array types are known by context, so are not normally given in the 
syntax (although, IIRC, a C99 compound literal can be used, say: "(char[]){ ... }");

Java does it like this: "new type[] { ... }";
my language declares normal arrays as "[ ... ]";
I couldn't determine if ActionScript had a similar feature.

for now, I opted for this: "[ ... ]suffix" and "[ ... ]:type", 
where the suffix is generally a 1-3 letter shorthand for certain common 
types (usually used for things like literals and similar), whereas the 
latter form would be the "general case".


so, for example, an array of bytes: "[16, 29, 32, 227, 113, 255]UB", as 
well as allowing this, "[3, 4, 5, 19, 340, 0]:byte"


this was because it seemed like the least effort, and it also looks 
less nasty than some other options, say, "[1, 2, 3, 4] as! byte[]", 
which is also slightly semantically misleading, since normally a 
cast like this would fail and raise an exception (note: "as" will return 
null if the cast fails, but "as!" will either cast or blow up; the 
traditional C-style cast syntax was removed as it created syntactic 
ambiguity, hence "as!").
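
as a rough illustration of what a type-specialized array buys, here is a 
small, purely hypothetical Python sketch: the array module's one-letter 
type codes stand in for the literal suffixes above, and the SUFFIXES 
mapping below is made up for illustration (it is not taken from my language):

from array import array

# purely illustrative mapping from short literal suffixes to Python
# array type codes; the actual suffix set is not specified here
SUFFIXES = {
    "UB": "B",   # unsigned byte
    "SB": "b",   # signed byte
    "I":  "i",   # signed int
    "F":  "f",   # 32-bit float
}

def typed_array(elements, suffix):
    # build a type-constrained array from a literal-style element list
    return array(SUFFIXES[suffix], elements)

print(typed_array([16, 29, 32, 227, 113, 255], "UB"))   # roughly [ ... ]UB
print(typed_array([3, 4, 5, 19, 340, 0], "I"))          # roughly [ ... ]:int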



or such...


___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] languages

2011-06-06 Thread David Barbour
On Sat, Jun 4, 2011 at 10:44 AM, Julian Leviston wrote:

> Is a language I program in necessarily limiting in its expressibility?
>

Yes. All communication architectures are necessarily limiting in their
expressiveness (in the sense defined by Matthias Felleisen). For example, one
can't easily introduce reactivity, concurrency, or constraint models into
languages not already designed for them. Even with extensible syntax, you
might be forced to re-design, re-implement, and re-integrate all the
relevant libraries and services from scratch to take advantage of a new
feature. Limitations aren't always bad things, though (e.g. when pursuing
security, scalability, safety, resilience, modularity, extensibility,
optimizations). We can benefit greatly from favoring 'principle of least
power' in our language designs.


>
> Is there an optimum methodology of expressing algorithms (ie nomenclature)?
>


No. From Kolmogorov complexity and pigeon-hole principles, we know that any
given language must make tradeoffs in how efficiently it expresses a given
behavior.  The language HQ9+ shows us that we can (quite trivially) optimize
expression of any given behavior by tweaking the language. Fortunately,
there are a lot of 'useless' algorithms that we'll never need to express.
Good language design is aimed at optimizing, abstracting, and refactoring
expression of useful, common behaviors, even if at some expense to rare or
less useful behaviors.
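
A minimal interpreter sketch makes the HQ9+ point concrete (Python here,
with the "9" lyrics abbreviated; this is illustrative only):

def hq9plus(program):
    # each primitive is "optimal" for exactly one canned behavior
    acc, out = 0, []
    for ch in program:
        if ch == 'H':
            out.append("Hello, world!")
        elif ch == 'Q':
            out.append(program)        # built-in quine: emit the source itself
        elif ch == '9':
            for n in range(99, 0, -1): # abbreviated "99 Bottles of Beer"
                out.append("%d bottles of beer on the wall..." % n)
        elif ch == '+':
            acc += 1                   # increment an accumulator nothing can observe
    return "\n".join(out)

print(hq9plus("HQ+"))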



Is there a good or bad way of expressing intent?


There are effective and ineffective ways of expressing intent.

We certainly want to minimize boiler-plate and noise. If our languages
impose semantic properties (such as ordering of a collection) where we
intend none, we have semantic noise. If our languages impose syntactic
properties (such as  semicolons) where they have no meaning to the
developer, we have syntactic noise. If our languages fail to abstract or
refactor some pattern, we get boiler-plate (and recognizable 'design
patterns').

But we also don't want to sacrifice performance, security, modularity, et
cetera. So sometimes we take a hit on how easily we can express intent.



is there a way of patterning a language of programming such that it can
> extend itself infinitely, innately?


Yes. But you must sacrifice various nice properties (e.g. performance,
securability, modularity, composition) to achieve it.

If you're willing to sacrifice ad-hoc extension of cross-cutting features
(e.g. reactivity, concurrency, failure handling, auditing, resource
management) you can still achieve most of what you want, and embed a few
frameworks and EDSLs (extensible syntax or partial evaluation) to close the
remaining expressiveness gap. If you have a decent concurrency and
reactivity model, you should even be able to abstract and compose IOC
frameworks as though they were normal objects.
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] languages

2011-06-06 Thread BGB

On 6/6/2011 6:05 PM, David Barbour wrote:



On Sat, Jun 4, 2011 at 10:44 AM, Julian Leviston wrote:


Is a language I program in necessarily limiting in its expressibility?


Yes. All communication architectures are necessarily limiting in their 
expressiveness (in the sense defined by Matthias Felleisen). For 
example, one can't easily introduce reactivity, concurrency, or 
constraint models into languages not already designed for them. Even 
with extensible syntax, you might be forced to re-design, 
re-implement, and re-integrate all the relevant libraries and services 
from scratch to take advantage of a new feature. Limitations aren't 
always bad things, though (e.g. when pursuing security, scalability, 
safety, resilience, modularity, extensibility, optimizations). We can 
benefit greatly from favoring 'principle of least power' in our 
language designs.


interesting... the "principle of least power" is something I hadn't 
really thought about previously...





Is there an optimum methodology of expressing algorithms (ie
nomenclature)? 



No. From Kolmogorov complexity and pigeon-hole principles, we know 
that any given language must make tradeoffs in how efficiently it 
expresses a given behavior.  The language HQ9+ shows us that we can 
(quite trivially) optimize expression of any given behavior by 
tweaking the language. Fortunately, there are a lot of 'useless' 
algorithms that we'll never need to express. Good language design is 
aimed at optimizing, abstracting, and refactoring expression of 
useful, common behaviors, even if at some expense to rare or less 
useful behaviors.


yeah...

I think many mainstream languages show this property, as they will often 
be specialized for certain sets of tasks, while more far-reaching features 
(the ability to extend the syntax or core type system, ...) are generally absent.





Is there a good or bad way of expressing intent?


There are effective and ineffective ways of expressing intent.

We certainly want to minimize boiler-plate and noise. If our languages 
impose semantic properties (such as ordering of a collection) where we 
intend none, we have semantic noise. If our languages impose syntactic 
properties (such as  semicolons) where they have no meaning to the 
developer, we have syntactic noise. If our languages fail to abstract 
or refactor some pattern, we get boiler-plate (and recognizable 
'design patterns').


But we also don't want to sacrifice performance, security, modularity, 
et cetera. So sometimes we take a hit on how easily we can express intent.




yeah...

usually with semicolons, it is either semicolons or significant 
line-breaks (or heuristics which try to guess whether a break was intended).

semicolons then are the lesser of the evils.

granted, yes, one wouldn't need either if the syntax were designed in a 
way where statements and expressions were naturally self-terminating, 
however, with common syntax design, this is often not the case, and so 
extra symbols are needed mostly as separators or to indicate the 
syntactic structure.




is there a way of patterning a language of programming such that
it can extend itself infinitely, innately?


Yes. But you must sacrifice various nice properties (e.g. performance, 
securability, modularity, composition) to achieve it.


If you're willing to sacrifice ad-hoc extension of cross-cutting 
features (e.g. reactivity, concurrency, failure handling, auditing, 
resource management) you can still achieve most of what you want, and 
embed a few frameworks and EDSLs (extensible syntax or partial 
evaluation) to close the remaining expressiveness gap. If you have a 
decent concurrency and reactivity model, you should even be able to 
abstract and compose IOC frameworks as though they were normal objects.




yep, and often a lot of this isn't terribly useful in practice.

and, likewise, a lot of "advanced" functionality can be added more narrowly:
API functionality;
special purpose attributes or modifiers;
...


personally, I keep around a few "high power" concepts, but these are far 
fewer than I could support.


for example, I had gotten into arguments with someone before about my 
language's lack of macro facilities or user-defined syntax extensions 
(or, at least, in-language syntax extensions).


this was partly because both would open up additional and somewhat more 
awkward issues; for example, macros (in the Common Lisp sense) could 
risk exposing an uncomfortable number of implementation details. the 
same goes for extensible syntax.


some basic amount of extension is possible, though, mostly by registering 
callbacks, and at most levels of the tower it is possible to register 
new callbacks for new functionality (this is actually how a fair amount 
of the VM itself is implemented).
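
a rough sketch of the callback-registration idea (the operation and type 
names here are made up for illustration, and are not the VM's actual API):

_handlers = {}

def register_handler(op, lhs_type, rhs_type, fn):
    # new functionality is added by registering callbacks,
    # rather than by extending the core syntax or type system
    _handlers[(op, lhs_type, rhs_type)] = fn

def dispatch(op, lhs_type, rhs_type, lhs, rhs):
    # "how do I perform operation X given Y?"
    fn = _handlers.get((op, lhs_type, rhs_type))
    if fn is None:
        raise NotImplementedError("no handler for %s on (%s, %s)" %
                                  (op, lhs_type, rhs_type))
    return fn(lhs, rhs)

register_handler("add", "vec3", "vec3",
                 lambda a, b: tuple(x + y for x, y in zip(a, b)))

print(dispatch("add", "vec3", "vec3", (1, 2, 3), (4, 5, 6)))   # (5, 7, 9)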


most things are generally in a form more like "how do I perform 
operation X given Y?", so lots of registering handlers for various 
operations, and registering p

Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread Casey Ransberger
Looks write-only to me, but I haven't learned it. I've also heard that if you 
really just want to get some math done, APL is as efficient keyboard-wise in 
that domain as Perl is in the domains where Perl excels (like extracting 
information from log files.) Perl is also regularly harangued for being 
write-only, but a lot of people who do read it really love it. I wonder how 
much of this stuff just has to do with terseness. The Bourne shell also has 
some of this kind of thing going on, what with the need to wait for an hour for 
something verbose to play out on the terminal screen over a slow modem being 
pretty much straight out. And the Bourne-style shell is still really popular 
for interacting with servers and doing small things.

One of the Self people, can't remember which, used APL in a class to "cheat" 
and get a language project finished in very little code, probably expecting to 
be flunked for not using the same implementation language as the rest of the 
class, but received an A instead :) which is a great example of using one's 
noggin in spite of what one thinks is in demand.

I ended up really preferring the raw s-expr style for a while, because I never 
had to worry about it that way, and because it made the grammar I was learning 
a lot smaller. I'm pretty sure though, in places like _the office_ where Lisp 
hasn't really penetrated past the other halftime languages-person down the hall 
from me, that it would be easier for me to train adults if order of 
mathematical operations were baked in, because that's probably what most people 
I'd be training will expect, reducing the amount of deprogramming that I would 
have to do.

"Deprogrammatic load" ;)

Another approach I think is really cool is actually just using mathematical 
notation as one representation of what's otherwise basically an s-expr, in 
which case I think one is having some cake and eating it too. I've been playing 
with Mathematica a bit, which seems to do some of that. Kind of like flipping 
between the piano roll and the notation view in a music program. I also think 
their big algorithm library is a really lovely idea... If it doesn't 
necessarily separate meaning from optimization, it at least seems like it could 
help separate math from glue and plumbing  which would just be a godsend in my 
work, where it often feels like I have to read-between-the-lines to find an 
algorithm in a piece of code I'm looking at.
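
A toy sketch of the "same tree, two views" idea (purely illustrative
Python; the expression encoding is made up):

def to_sexpr(e):
    # raw prefix ("piano roll") view
    if isinstance(e, tuple):
        op, *args = e
        return "(" + " ".join([op] + [to_sexpr(a) for a in args]) + ")"
    return str(e)

def to_infix(e):
    # conventional math ("notation") view of the same tree
    if isinstance(e, tuple):
        op, left, right = e
        return "(%s %s %s)" % (to_infix(left), op, to_infix(right))
    return str(e)

expr = ("+", 1, ("*", 2, ("-", "x", 3)))
print(to_sexpr(expr))   # (+ 1 (* 2 (- x 3)))
print(to_infix(expr))   # (1 + (2 * (x - 3)))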

Isn't Nile kind of like a "read-also" APL?

On Jun 5, 2011, at 3:51 PM, Florin Mateoc  wrote:

> But wasn't APL called a "write-only language", which would make it in a way a 
> polar opposite of Smalltalk?
> 
> I agree that it is not about "consequences of message sending". And, while I 
> also agree that uniformity/simplicity are also virtues, I think it is more 
> useful to explicitly state that there are "things" which are truly different. 
> Especially in an object system which models the world. Numbers would be in 
> that category, they "deserve" to be treated specially. In the same vein, I 
> think mathematical operators "deserve" special treatment, and not just from 
> an under the covers, optimization point of view.
> 
> Thank you,
> Florin
> 
> fonc mailing list
> fonc@vpri.org
> http://vpri.org/mailman/listinfo/fonc
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Language in Test (was Re: [fonc] languages)

2011-06-05 Thread Casey Ransberger
I'm actually not talking about the potty mouths:)

APL is up there on my list now, but it hasn't knocked Prolog out of the top 
slot. 

I've done a bunch of test automation. I really enjoy testing because on a good 
day it can approach something reminiscent of science, but OTOH the test code I 
ended up wrangling (often my own code) wound up the worst sort of mess, for a 
few reasons. Not-test code that I've worked on or composed myself has always 
been a lot better, for reasons I don't totally understand yet. 

I can toss some guesses out there:

One is that people who do automation are often outnumbered 5:1 or worse by the 
people making the artifacts under examination, such that there's too much to do 
in order to do anything very well.

Another is, testing often fails to strike decision makers as important enough 
to invest much in, so you end up hacking your way around blocking issues with 
the smallest kludge you can think of, instead of instrumenting proper hooks 
into the thing under test, which usually takes a little longer, and risks 
further regression in the context of a release schedule. 

Things I learned from Smalltalk and Lisp have been really useful for reducing 
the amount of horror in the test code I've worked on, but it's still kind of 
bad. Actually I was inspired to look for an EDSL in my last adventure that 
would cut down on the cruft in the test code there, which was somewhat inspired 
by STEPS, and did seem to actually help quite a bit. 

Use of an EDSL proved very useful, in part just because most "engineering" orgs 
I've been in don't seem to want to let me use Lisp. 

Being able to claim honestly that I'd implemented what I'd done in Ruby seemed 
to help justify the unorthodox approach to my managers. I did go out of my way 
to explain that what I'd done was compose a domain specific language, and this 
did not seem to get the same kind of resistance, just a few raised eyebrows and 
funny looks. 
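
For flavor, a tiny hypothetical sketch of the kind of internal DSL I mean 
(the original was in Ruby; this Python version is illustrative only, and 
the step and scenario names are made up):

class Scenario:
    # a tiny "given / when / then" internal DSL for encoding test invariants
    def __init__(self, name):
        self.name, self.steps = name, []

    def given(self, fn):
        self.steps.append(("given", fn)); return self

    def when(self, fn):
        self.steps.append(("when", fn)); return self

    def then(self, check):
        self.steps.append(("then", check)); return self

    def run(self):
        state = {}
        for kind, fn in self.steps:
            result = fn(state)
            if kind == "then":
                assert result, "%s: invariant failed" % self.name
        print("%s: ok" % self.name)

Scenario("empty password is rejected") \
    .given(lambda s: s.update(user="guest")) \
    .when(lambda s: s.update(logged_in=False)) \
    .then(lambda s: s["logged_in"] is False) \
    .run()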

I keep getting the feeling that the best tool for the job of encoding and 
"running" invariants might be a Prolog, and so this one is currently at the top 
of my list of things to understand deeply.  

Anyone else here ever deal with a bunch of automation? Ever find a way to apply 
what you know about programming languages themselves in the context of software 
verification? Because I would *love* to hear about that!

On Jun 5, 2011, at 7:06 PM, David Leibs  wrote:

> I love APL!  Learning APL is really all about learning the idioms and how to 
> apply them.  This takes quite a lot of training time.   Doing this kind of 
> training will change the way you think.  
> 
> Alan Perlis quote:  "A language that doesn't affect the way you think about 
> programming, is not worth knowing."

I love this quote. Thanks for your 

(snip)

> 
> -David Leibs
> 
> ___
> fonc mailing list
> fonc@vpri.org
> http://vpri.org/mailman/listinfo/fonc

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


[fonc] languages vs. systems [was: Coding Standards]

2008-08-25 Thread Michael FIG
Hi,

"Alejandro F. Reimondo" <[EMAIL PROTECTED]> writes:

> For people that understand "sustainability" as the ability to
>  continue evolving, other tools are available and better
>  than languages.
> The language is not as important as the actions made on
>  the system itself.

So I hear you talking about systems versus languages, and that systems
are more interesting because they are unconstrained.

That seems compatible with what I understand Ian is working on
(building a basis for a system where every limit can be changed at
runtime).  The work that I'm trying to do is to allow multiple systems
to reside in the same CPU and address space.

The languages come secondary, they each are just a means to
communicate with the system.  But they are necessary, for those of us
who want a TTY interface to the system (the 70s-era people like
myself).  Those who want a graphical interface will want to wait for
(or help work on) something like the IDE that I saw Takashi working on
while I visited VPRI in June.  Those with a telephony interface will
want

The point is you are looking at the bootstrap, and though you may find
it repulsive, the ideal is not to make the bootstrap prettier, it is
to get to the point where you too can contribute (i.e. there's a
working "Smalltalk-80" application).

> We can try to define a language for all persons and
>  a law for all systems; or... we can try to work to modify
>  a systems that has booted +30years ago.
> The tools for each work can't be the same, because
>  the objectives and the methods are not the same.

The main difference I see with Smalltalk is that it (or Squeak at
least) uses a virtual machine, which is implemented in C, and cannot
be modified from the Smalltalk portion of the code.  It is reasonable
to consider taking a Squeak image, running it on a COLA-implemented VM
that gives bytecode access to the COLA compiler, and gradually
evolving that image to something that can take full advantage of COLA.

If we did that, and still had a TTY interface to the underlying
COLA-based VM, would you still say that the tools "can't be the same"?

All the best,

-- 
Michael FIG <[EMAIL PROTECTED]> //\
   http://michael.fig.org/\//

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


[fonc] languages vs. systems [was: Coding Standards]

2008-08-25 Thread Kevin Driedger
On Mon, Aug 25, 2008 at 7:33 PM, Michael FIG <[EMAIL PROTECTED]> wrote:

> it repulsive, the ideal is not to make the bootstrap prettier, it is
> to get to the point where you too can contribute (i.e. there's a
> working "Smalltalk-80" application).
>

Does this mean a "Smalltalk-80" application is planned?


> to consider taking a Squeak image, running it on a COLA-implemented VM
> that gives bytecode access to the COLA compiler, and gradually
> evolving that image to something that can take full advantage of COLA.
>
This would be tremendously cool to be able to modify the lower layers from
the higher layers.

-- 
]{evin ])riedger
http://extremedesigners.ca



___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread Steve Wart
On Sun, Jun 5, 2011 at 5:13 PM, Casey Ransberger
 wrote:
>
> Isn't Nile kind of like a "read-also" APL?

I think you're referring to Nial. See also J and K.

http://en.wikipedia.org/wiki/Nial
http://en.wikipedia.org/wiki/J_(programming_language)

> On Jun 5, 2011, at 3:51 PM, Florin Mateoc  wrote:
>
> But wasn't APL called a "write-only language", which would make it in a way
> a polar opposite of Smalltalk?
>
> I agree that it is not about "consequences of message sending". And, while I
> also agree that uniformity/simplicity are also virtues, I think it is more
> useful to explicitly state that there are "things" which are truly
> different. Especially in an object system which models the world. Numbers
> would be in that category, they "deserve" to be treated specially. In the
> same vein, I think mathematical operators "deserve" special treatment, and
> not just from an under the covers, optimization point of view.
>
> Thank you,
> Florin
>
> fonc mailing list
> fonc@vpri.org
> http://vpri.org/mailman/listinfo/fonc
>
> ___
> fonc mailing list
> fonc@vpri.org
> http://vpri.org/mailman/listinfo/fonc
>
>

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread David Leibs
I love APL!  Learning APL is really all about learning the idioms and how to 
apply them.  This takes quite a lot of training time.   Doing this kind of 
training will change the way you think.  

Alan Perlis quote:  "A language that doesn't affect the way you think about 
programming, is not worth knowing."

There is some old analysis out there that indicates that APL is naturally very 
parallel.  Willhoft-1991 claimed that 94 of the 101 primitive operations in 
APL2 could be implemented in parallel and that 40-50% of APL code in real 
applications was naturally parallel. 

R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30 
(1991), no. 4, 498–512.
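
As a rough modern stand-in for what "naturally parallel" means, a small
hypothetical NumPy snippet (NumPy playing the role of APL's whole-array
primitives; the data is made up):

import numpy as np

prices = np.array([3.0, 7.5, 2.25, 10.0])
qty    = np.array([4,   2,   8,    1])

# whole-array operations: every element of the result is independent,
# so an implementation is free to compute them in parallel
revenue = prices * qty
total   = revenue.sum()        # a reduction, also parallel-friendly

print(revenue, total)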


-David Leibs

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread Alan Kay
Hi David

I've always been very fond of APL also -- and a slightly better and more 
readable syntax could be devised these days now that things don't have to be 
squeezed onto an IBM Selectric golfball ...

Cheers,

Alan





From: David Leibs 
To: Fundamentals of New Computing 
Sent: Sun, June 5, 2011 7:06:55 PM
Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

I love APL!  Learning APL is really all about learning the idioms and how to 
apply them.  This takes quite a lot of training time.   Doing this kind of 
training will change the way you think.  

Alan Perlis quote:  "A language that doesn't affect the way you think about 
programming, is not worth knowing."

There is some old analysis out there that indicates that APL is naturally very 
parallel.  Willhoft-1991 claimed that  94 of the 101 primitives operations in 
APL2 could be implemented in parallel and that 40-50% of APL code in real 
applications was naturally parallel. 

R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30 
(1991), no. 4, 498–512.


-David Leibs
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread David Pennell
HP had a version of APL in the early 80's that included "structured"
conditional statements and where performance didn't depend on cramming your
entire program into one line of code.  Between the two, it was possible to
create reasonably readable code.  That version of APL also did some clever
performance optimizations by manipulating array descriptors instead of just
using brute force.

APL was the first language other than Fortran that I learned - very eye
opening.

-david

On Sun, Jun 5, 2011 at 9:13 PM, Alan Kay  wrote:

> Hi David
>
> I've always been very fond of APL also -- and a slightly better and more
> readable syntax could be devised these days now that things don't have to be
> squeezed onto an IBM Selectric golfball ...
>
> Cheers,
>
> Alan
>
> --
> *From:* David Leibs 
> *To:* Fundamentals of New Computing 
> *Sent:* Sun, June 5, 2011 7:06:55 PM
> *Subject:* Re: Terseness, precedence, deprogramming (was Re: [fonc]
> languages)
>
> I love APL!  Learning APL is really all about learning the idioms and how
> to apply them.  This takes quite a lot of training time.   Doing this kind
> of training will change the way you think.
>
> Alan Perlis quote:  "A language that doesn't affect the way you think about
> programming, is not worth knowing."
>
> There is some old analysis out there that indicates that APL is naturally
> very parallel.  Willhoft-1991 claimed that  94 of the 101 primitives
> operations in APL2 could be implemented in parallel and that 40-50% of APL
> code in real applications was naturally parallel.
>
> R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30
> (1991), no. 4, 498–512.
>
>
> -David Leibs
>
>
> ___
> fonc mailing list
> fonc@vpri.org
> http://vpri.org/mailman/listinfo/fonc
>
>
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread Alan Kay
I think this one was derived from Phil Abrams' Stanford (and SLAC) PhD thesis 
on 
dynamic analysis and optimization of APL -- a very nice piece of work! (Maybe 
in 
the early 70s or late 60s?)

Cheers,

Alan





From: David Pennell 
To: Fundamentals of New Computing 
Sent: Sun, June 5, 2011 7:33:40 PM
Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

HP had a version of APL in the early 80's that included "structured" 
conditional 
statements and where performance didn't depend on cramming your entire program 
into one line of code.  Between the two, it was possible to create reasonably 
readable code.  That version of APl also did some clever performance 
optimizations by manipulating array descriptors instead just using brute force.

APL was the first language other than Fortran that I learned - very eye opening.


-david


On Sun, Jun 5, 2011 at 9:13 PM, Alan Kay  wrote:

Hi David
>
>I've always been very fond of APL also -- and a slightly better and more 
>readable syntax could be devised these days now that things don't have to be 
>squeezed onto an IBM Selectric golfball ...
>
>Cheers,
>
>Alan
>
>
>
>

 From: David Leibs 
>To: Fundamentals of New Computing 
>Sent: Sun, June 5, 2011 7:06:55 PM
>Subject: Re:  Terseness, precedence, deprogramming (was Re: [fonc] languages)
>
>
>I love APL!  Learning APL is really all about learning the idioms and how to 
>apply them.  This takes quite a lot of training time.   Doing this kind of 
>training will change the way you think.  
>
>
>Alan Perlis quote:  "A language that doesn't affect the way you think about 
>programming, is not worth knowing."
>
>
>There is some old analysis out there that indicates  that APL is naturally 
>very 
>parallel.  Willhoft-1991 claimed that  94 of the 101 primitives operations in 
>APL2 could be implemented in parallel and that 40-50% of APL code in real 
>applications was naturally parallel. 
>
>
>R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30 
>(1991), no. 4, 498–512.
>
>
>
>
>-David Leibs
>
>
>___
>fonc mailing list
>fonc@vpri.org
>http://vpri.org/mailman/listinfo/fonc
>
>
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread David Harris
Alan-

I expect you lost a few readers there.  I have fond memories of APL on an
IBM 360/145 with APL microcode support and Selectric terminals.

David


On Sun, Jun 5, 2011 at 7:13 PM, Alan Kay  wrote:

> Hi David
>
> I've always been very fond of APL also -- and a slightly better and more
> readable syntax could be devised these days now that things don't have to be
> squeezed onto an IBM Selectric golfball ...
>
> Cheers,
>
> Alan
>
> --
> *From:* David Leibs 
> *To:* Fundamentals of New Computing 
> *Sent:* Sun, June 5, 2011 7:06:55 PM
> *Subject:* Re: Terseness, precedence, deprogramming (was Re: [fonc]
> languages)
>
> I love APL!  Learning APL is really all about learning the idioms and how
> to apply them.  This takes quite a lot of training time.   Doing this kind
> of training will change the way you think.
>
> Alan Perlis quote:  "A language that doesn't affect the way you think about
> programming, is not worth knowing."
>
> There is some old analysis out there that indicates that APL is naturally
> very parallel.  Willhoft-1991 claimed that  94 of the 101 primitives
> operations in APL2 could be implemented in parallel and that 40-50% of APL
> code in real applications was naturally parallel.
>
> R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30
> (1991), no. 4, 498–512.
>
>
> -David Leibs
>
>
> ___
> fonc mailing list
> fonc@vpri.org
> http://vpri.org/mailman/listinfo/fonc
>
>
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread BGB

On 6/5/2011 7:06 PM, David Leibs wrote:
I love APL!  Learning APL is really all about learning the idioms and 
how to apply them.  This takes quite a lot of training time.   Doing 
this kind of training will change the way you think.


Alan Perlis quote:  "A language that doesn't affect the way you think 
about programming, is not worth knowing."




not everyone wants to learn new things though.

very often, people want to get the job done as their main priority, and 
being faced with new ideas or new ways of doing things is a hindrance to 
the maximization of productivity (or, people may see any new/unfamiliar 
things as inherently malevolent).


granted, these are not the sort of people one is likely to find using a 
language like APL...



a lot depends on who one's market or target audience is, and whether or 
not they like the thing in question. if it is the wrong product for the 
wrong person, one isn't going to make a sale (and people neither like 
having something forced on them, nor buying or committing 
resources to something which is not to their liking, as, although 
maybe not immediate, this will breed frustration later...).


it doesn't mean either the product or the potential customer is bad, 
only that things need to be matched up.


it is like, you don't put sugar in the coffee of someone who likes their 
coffee black.



There is some old analysis out there that indicates that APL is 
naturally very parallel.  Willhoft-1991 claimed that  94 of the 101 
primitives operations in APL2 could be implemented in parallel and 
that 40-50% of APL code in real applications was naturally parallel.


R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 
30 (1991), no. 4, 498–512.




I have not personally dealt with APL, so I don't know.

I sort of like having the ability to write code with asynchronous 
operations, but this is a little different. I guess a task for myself 
would be to determine whether or not what I am imagining as 'async' is 
equivalent to the actor model, hmm...


decided to leave out a more complex elaboration.


___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread David Leibs
Alan,
Your memory for great dissertations is amazing.  I don't think the Phil Abrams 
APL machine was ever actually built, but it had some really good techniques for 
making APL efficient, colorfully named "beating" and "drag-along".  

-djl

On Jun 5, 2011, at 7:50 PM, Alan Kay wrote:

> I think this one was derived from Phil Abrams' Stanford (and SLAC) PhD thesis 
> on dynamic analysis and optimization of APL -- a very nice piece of work! 
> (Maybe in the early 70s or late 60s?)
> 
> Cheers,
> 
> Alan
> 
> From: David Pennell 
> To: Fundamentals of New Computing 
> Sent: Sun, June 5, 2011 7:33:40 PM
> Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)
> 
> HP had a version of APL in the early 80's that included "structured" 
> conditional statements and where performance didn't depend on cramming your 
> entire program into one line of code.  Between the two, it was possible to 
> create reasonably readable code.  That version of APl also did some clever 
> performance optimizations by manipulating array descriptors instead just 
> using brute force.
> 
> APL was the first language other than Fortran that I learned - very eye 
> opening.
> 
> -david
> 
> On Sun, Jun 5, 2011 at 9:13 PM, Alan Kay  wrote:
> Hi David
> 
> I've always been very fond of APL also -- and a slightly better and more 
> readable syntax could be devised these days now that things don't have to be 
> squeezed onto an IBM Selectric golfball ...
> 
> Cheers,
> 
> Alan
> 
> From: David Leibs 
> To: Fundamentals of New Computing 
> Sent: Sun, June 5, 2011 7:06:55 PM
> Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)
> 
> I love APL!  Learning APL is really all about learning the idioms and how to 
> apply them.  This takes quite a lot of training time.   Doing this kind of 
> training will change the way you think.  
> 
> Alan Perlis quote:  "A language that doesn't affect the way you think about 
> programming, is not worth knowing."
> 
> There is some old analysis out there that indicates that APL is naturally 
> very parallel.  Willhoft-1991 claimed that  94 of the 101 primitives 
> operations in APL2 could be implemented in parallel and that 40-50% of APL 
> code in real applications was naturally parallel. 
> 
> R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30 
> (1991), no. 4, 498–512.
> 
> 
> -David Leibs
> 
> 
> ___
> fonc mailing list
> fonc@vpri.org
> http://vpri.org/mailman/listinfo/fonc
> 
> 
> ___
> fonc mailing list
> fonc@vpri.org
> http://vpri.org/mailman/listinfo/fonc

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread C. Scott Ananian
On Sun, Jun 5, 2011 at 8:13 PM, Casey Ransberger
wrote:

> Another approach I think is really cool is actually just using mathematical
> notation as one representation of what's otherwise basically an s-expr, in
> which case I think one is having some cake and eating it too. I've been
> playing with Mathematica a bit, which seems to do some of that. Kind of like
> flipping between the piano roll and the notation view in a music program. I
> also think their big algorithm library is a really lovely idea... If it
> doesn't necessarily separate meaning from optimization, it at least seems
> like it could help separate math from glue and plumbing  which would just be
> a godsend in my work, where it often feels like I have to
> read-between-the-lines to find an algorithm in a piece of code I'm looking
> at.
>

You would like Fortress: http://labs.oracle.com/projects/plrg/faq/NAS-CG.pdf
Here is an example of runnable Fortress code:
http://labs.oracle.com/projects/plrg/faq/NAS-CG.pdf
(There is also an ASCII-only syntax for us luddites.)
  --scott
-- 
  ( http://cscott.net )
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread C. Scott Ananian
On Mon, Jun 6, 2011 at 2:13 AM, C. Scott Ananian  wrote:
> You would like Fortress: http://labs.oracle.com/projects/plrg/faq/NAS-CG.pdf

This first link should have been to the Great Wiki:
http://en.wikipedia.org/wiki/Fortress_(programming_language)

A better link to samples of both ASCII and rendered versions of
Fortress is Guy Steele's blog post at:
    
http://projectfortress.sun.com/Projects/Community/blog/ParallelPrefixNotation
You can click on the buttons to see the ASCII versions of any of the
pretty sources.

  --scott
--
      ( http://cscott.net )

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-06 Thread Alan Kay
Yep ...

As Abrams pointed out, "Beating" should be pronounced "Bee-Ating" because it 
was 
a "promotion scheme" that reminded him of the beatification process in the path 
towards sainthood ...

Cheers,

Alan





From: David Leibs 
To: Fundamentals of New Computing 
Sent: Sun, June 5, 2011 9:59:33 PM
Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

Alan,
Your memory for great dissertations is amazing.  I don't think the Phil Abrams 
APL machine was ever actually built but It had some really good techniques for 
making APL efficient colorfully named "beating" and "drag-along".  

-djl


On Jun 5, 2011, at 7:50 PM, Alan Kay wrote:

I think this one was derived from Phil Abrams' Stanford (and SLAC) PhD thesis 
on 
dynamic analysis and optimization of APL -- a very nice piece of work! (Maybe 
in 
the early 70s or late 60s?)
>
>Cheers,
>
>Alan
>
>
>
>

From: David Pennell 
>To: Fundamentals of New Computing 
>Sent: Sun, June 5, 2011 7:33:40 PM
>Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)
>
>HP had a version of APL in the early 80's that included "structured" 
>conditional 
>statements and where performance didn't depend on cramming your entire program 
>into one line of code.  Between the two, it was possible to create reasonably 
>readable code.  That version of APl also did some clever performance 
>optimizations by manipulating array descriptors instead just using brute force.
>
>
>APL was the first language other than Fortran that I learned - very eye 
opening.
>
>
>
>-david
>
>
>On Sun, Jun 5, 2011 at 9:13 PM, Alan Kay  wrote:
>
>Hi David
>>
>>I've always been very fond of APL also -- and a slightly better and more 
>>readable syntax could be devised these days now that things don't have to be 
>>squeezed onto an IBM Selectric golfball ...
>>
>>Cheers,
>>
>>Alan
>>
>>
>>
>>

From: David Leibs 
>>To: Fundamentals of New Computing 
>>Sent: Sun, June 5, 2011 7:06:55 PM
>>Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)
>>
>>
>>I love APL!  Learning APL is really all about learning the idioms and how to 
>>apply them.  This takes quite a lot of training time.   Doing this kind of 
>>training will change the way you think.  
>>
>>
>>Alan Perlis quote:  "A language that doesn't affect the way you think about 
>>programming, is not worth knowing."
>>
>>
>>There is some old analysis out there that indicates that APL is naturally 
>>very 
>>parallel.  Willhoft-1991 claimed that  94 of the 101 primitives operations in 
>>APL2 could be implemented in parallel and that 40-50% of APL code in real 
>>applications was naturally parallel. 
>>
>>
>>R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30 
>>(1991), no. 4, 498–512.
>>
>>
>>
>>
>>-David Leibs
>>
>>
>>___
>>fonc mailing list
>>fonc@vpri.org
>>http://vpri.org/mailman/listinfo/fonc
>>
>>
>___
>fonc mailing list
>fonc@vpri.org
>http://vpri.org/mailman/listinfo/fonc
>
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] languages vs. systems [was: Coding Standards]

2008-08-28 Thread Alejandro F. Reimondo
f thinking about it)


It is reasonable to consider taking a Squeak image,
running it on a COLA-implemented VM
that gives bytecode access to the COLA compiler,
and gradually evolving that image to something
that can take full advantage of COLA.


Yes! we need a binary release to start
(changing it while running).


If we did that, and still had a TTY interface to
the underlying COLA-based VM, would you still
say that the tools "can't be the same"?


Yes! history is on my side.
The tools were never the same for all people.
Diversity is reflected in tools, and granted by people/minds;
putting people outside of the formula (it is only text
or only designed as objects) ensures exclusion and
promotes one-way/"universal" thinking (e.g. no inter-cultural
work can be realized this way).

Thanks for reading my email and for talking about
systems outside language limitations.

all the best for you too,
Ale.


- Original Message - 
From: "Michael FIG" <[EMAIL PROTECTED]>

To: "Fundamentals of New Computing" 
Cc: <[EMAIL PROTECTED]>
Sent: Monday, August 25, 2008 8:33 PM
Subject: [fonc] languages vs. systems [was: Coding Standards]


Hi,

"Alejandro F. Reimondo" <[EMAIL PROTECTED]> writes:


For people that understand "sustainability" as the ability to
 continue evolving, other tools are available and better
 than languages.
The language is not as important as the actions made on
 the system itself.


So I hear you talking about systems versus languages, and that systems
are more interesting because they are unconstrained.

That seems compatible with what I understand Ian is working on
(building a basis for a system where every limit can be changed at
runtime).  The work that I'm trying to do is to allow multiple systems
to reside in the same CPU and address space.

The languages come secondary, they each are just a means to
communicate with the system.  But they are necessary, for those of us
who want a TTY interface to the system (the 70s-era people like
myself).  Those who want a graphical interface will want to wait for
(or help work on) something like the IDE that I saw Takashi working on
while I visited VPRI in June.  Those with a telephony interface will
want

The point is you are looking at the bootstrap, and though you may find
it repulsive, the ideal is not to make the bootstrap prettier, it is
to get to the point where you too can contribute (i.e. there's a
working "Smalltalk-80" application).


We can try to define a language for all persons and
 a law for all systems; or... we can try to work to modify
 a systems that has booted +30years ago.
The tools for each work can't be the same, because
 the objectives and the methods are not the same.


The main difference I see with Smalltalk is that it (or Squeak at
least) uses a virtual machine, which is implemented in C, and cannot
be modified from the Smalltalk portion of the code.  It is reasonable
to consider taking a Squeak image, running it on a COLA-implemented VM
that gives bytecode access to the COLA compiler, and gradually
evolving that image to something that can take full advantage of COLA.

If we did that, and still had a TTY interface to the underlying
COLA-based VM, would you still say that the tools "can't be the same"?

All the best,

--
Michael FIG <[EMAIL PROTECTED]> //\
  http://michael.fig.org/\//

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc



___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc