At 09:49 PM 8/16/00 +0200, Kai Henningsen wrote:
>[EMAIL PROTECTED] (Dan Sugalski) wrote on 15.08.00 in
><[EMAIL PROTECTED]>:
>
> > At 06:04 PM 8/15/00 -0400, John Porter wrote:
> > >Dan Sugalski wrote:
> > > > >Generality good.
> > > >
> > > > For many things, yes. For computers, say. For people, no. Generality
> > > > bad. Specificity and specialization good. People aren't generalists.
> > > > They're a collection of specialists. The distinction is important.
> > >
> > >I'm sorry if I don't find this argument convincing.
> > >This argument suggests that *every* type carry a distinguishing
> > >prefix symbol -- including ones to distinguish between numbers and
> > >strings, say.
> >
> > Numbers and strings really aren't different things, at least not as far as
> > people are concerned.
>
>Bovine excrement. Numbers and strings are completely different things to
>people.
After a certain point, yes, more or less. Up until then, no. And most
numbers are, in most contexts, treated as adjectives by people, where they
have little more meaning than any other adjective. They're just mental
symbols tacked on as a modifier to other symbols.
>Hashes and arrays, OTOH, really aren't different for people. The concept
>of an index needing to be a nonnegative number is a computer concept.
Arrays are essentially sequences of things--first thing, second thing,
third thing, and so on.
Hashes, on the other hand, are named things--the person that's Bob, the
person that's Jim, the person that's Jason.
It's the difference between position and identity. Those are very, very
different concepts.
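The position/identity split shows up directly in Perl syntax. A minimal sketch (the variable names here are invented for illustration):

```perl
# Arrays: position. The index says *where* a thing sits in a sequence.
my @queue = ('first thing', 'second thing', 'third thing');
print $queue[1], "\n";    # position 1 is 'second thing'

# Hashes: identity. The key says *which* thing you mean, by name.
my %person = (
    Bob   => 'the person that is Bob',
    Jim   => 'the person that is Jim',
    Jason => 'the person that is Jason',
);
print $person{Jim}, "\n"; # the thing named 'Jim'
```

Collapsing the two would mean pretending that "the third one" and "the one called Jim" are the same kind of question.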
> >They are for machines, but computer languages
> > ultimately aren't for machines, they're for people.
>
>I agree that computer languages are for people - in fact, that's the sole
>reason they were invented. Otherwise we'd still program in machine
>language.
This statement isn't, strictly speaking, true. Very few computer languages
are actually designed *for* people. Most of them are designed to map over a
particular problem space, be it theory or hardware. C wasn't designed for
people. Neither was Lisp, or Fortran. COBOL, oddly enough, was, though it
shows the limitations of the machines and techniques of the era. People are
generally an afterthought.
>However, I do think your ideas about what does and does not come naturally
>to people are incredibly warped.
Perhaps, but then again perhaps not. I've got two young children. Watching
kids as they grow lends a certain perspective on things. I see what does
and doesn't come naturally.
When designing Perl, a book on cognitive psychology or early childhood
education may be as useful as (if not more useful than) a CS text.
> > I'm not presuming that, though there are plenty of languages already that
> > have no symbols. Perl's not one of them, though.
>
>I presume you mean $@% when you say symbols (and not, for example, the
>stuff that goes into a symbol table, because I certainly don't know any
>language that doesn't have those).
Yep.
>It's an open question if these are a good idea. I'm not convinced.
>Personally, I see them as one of the ugly warts in Perl.
>
> > > > Even assuming highlander types, the punctuation carries a rather
> > > > significant amount of contextual information very compactly.
>
>s/significant/in&/ IMNSHO, even ignoring the "even" part.
A single small picture, and one that can be easily picked out by your
visual cortex, carries an awful lot of contextual information with it, and
carries it quickly. YHO is, I think, incorrect.
> > It's going to always be more difficult. You need to *think* to turn a word
> > into a symbol. = is already a symbol. Less effort's needed.
>
>Maybe *you* have to think to turn a word into a symbol. Most people seem
>to grow up with that concept.
I'll have to check, but I'm not sure 'most' is appropriate here, as a large
portion of the world uses iconographic languages.
Regardless, more mental effort is needed to turn the word "rock" into the
concept of a rock than to turn a picture of a rock into the same concept.
It's not conscious, generally speaking, but more of your brain is involved
in dealing with words than with pictures.
>As for *recognizing* the stuff, ever heard of redundancy? It's the stuff
>that makes recognizing stuff easier.
No, redundancy is the stuff that makes recognizing things more reliable.
>Words, by being longer, are easier to recognize than non-alphabetic
>symbols.
Non-alpha symbols are *faster* to recognize. That's their advantage.
Overloading them with too much meaning is bad, as then you end up needing
to think more.
When you see something like:
$foo
the first thing that happens is that your brain picks out the whole
space-separated token as a single 'thing'. That happens in your visual
cortex; it's fast and takes very little effort. The token then gets stripped
down into pieces. Your brain's already dealing with a Perl context, and $ gets
recognized as the 'single thing' marker. 'foo', meanwhile, is wending its
way through your language centers. That's slower. The thinking part of your
brain ultimately gets the thing in phases--thing, singular thing, singular
thing named 'foo'. If you're just skimming the source, you can stop after
the singular thing part. If you're looking at it more in-depth, the
'singular thing' piece is another associative clue to attach actual meaning
to the $foo symbol.
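A concrete (and contrived; the names are made up) illustration of how much context the sigil alone carries before the language centers ever get involved:

```perl
my $count = 3;                  # $ -> a single thing
my @items = ('a', 'b', 'c');    # @ -> a sequence of things
my %index = (a => 1, b => 2);   # % -> named things

# The sigil also flags what you get back: $items[0] uses $ because
# you're asking @items for one singular element.
print $count,    "\n";
print $items[0], "\n";
print $index{b}, "\n";
```

Skimming, you can stop at the sigil: "singular thing", "list of things", "named things", without ever reading the name after it.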
> > >Sure. But "instinct and inherent capabilities" do not apply here.
> >
> > Yes, they do. People write source. People read source. People are the most
> > important piece of the system. The computer can be made to cope with the
> > syntax, whatever the syntax is. People can't be made to cope nearly as
> > easily, nor to nearly the same degree.
>
>I completely agree with this point. Which is exactly why I disagree with
>most of your other points.
Unfortunately, I think you're somewhat under-informed as to the inherent
capabilities of people's brains.
I'm beginning to think that Scott McCloud's _Understanding Comics_ should
be on the useful reading list.
Dan
--------------------------------------"it's like this"-------------------
Dan Sugalski even samurai
[EMAIL PROTECTED] have teddy bears and even
teddy bears get drunk