Steve,
You may be right. It often happens that someone has a (small) idea, perhaps
very focused, and others chime in and try to apply it more widely, perhaps by
making it more general, and it grows. Over the years, the earlier adopters may
be seen almost as co-creators or even become the lead in the story. Ask who is
most associated with Apple and who actually did much of the technical work. I am
not saying that Jobs did not have vision and marketing talent and an eye for
style and so on. I am not even sure who came up with ideas back then. Another
such person, Bill Gates, did do some programming of BASIC early on and so forth.
So, whatever the history of early Python (and predecessors) was, it may have
begun as some enhancements and improvements of what came before and perhaps new
paradigms. Others who saw it may have seen something that looked easier to
teach. An obvious example, if it was there way back then, was removing lots of
brackets used in other languages (such as {([])} ) and using indentation. Feels
more natural to not-so-mathematical types.
But over the years the brackets have returned and worse. Like many languages,
Python used what symbols it could find on the keyboard and then overloaded them
horribly. Parentheses are used for grouping but also for tuples and to force
invocation of a function and to hold the argument tuple (albeit no trailing
comma is needed for a single argument as in other tuples) and presumably in
other contexts such as within regular expressions. Periods can mean quite a few
things as can asterisk and so on. There are few matched sets of characters on
the keyboard. () is now used for tuples, among other things. [] is used for
lists but also for dictionary access, as in cards[key] versus text[5], and {}
is used for dictionaries (though access is back to []) but also for sets ...
Heck, they ran out of symbols. {} is an empty dictionary, so you have to say
set() for an empty set. (At least () still gives an empty tuple, though you can
spell out tuple() too.)
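To make the overloading concrete, here is a small sketch (the variable names are my own) of the same few bracket pairs wearing all those hats:

```python
point = (3, 4)            # parentheses as a tuple literal
grouped = (3 + 4) * 2     # parentheses as grouping
length = len("hello")     # parentheses forcing a call and holding the arguments
one = (5,)                # a one-element tuple needs the trailing comma...
also_five = (5)           # ...without it, this is just the int 5, grouped

items = [10, 20, 30]      # square brackets as a list literal
third = items[2]          # square brackets as indexing
cards = {"ace": 1}        # braces as a dict literal
ace = cards["ace"]        # but access is back to square brackets
suits = {"hearts", "spades"}  # braces again, now a set literal

empty_dict = {}           # {} is claimed by the empty dict...
empty_set = set()         # ...so the empty set needs the spelled-out name
empty_tuple = ()          # () does still give an empty tuple
```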
<> is not currently used as a matched set, as the angle brackets have many
other uses, like comparisons. Some languages even use <> the same as != or ~=
to mean not equals. (Python 2 itself accepted <> as a synonym for !=.) "" and
'' and even `` are sort of used as if they were matched sets in some languages
(`` is in R, not Python) but the opening and closing marks are typographically
identical, unlike word processors that create distinct left and right quotes.
EBCDIC had a few other symbols, but many languages now restrict themselves to
ASCII and have to combine characters into complex, even non-intuitive,
multi-character operators. Try teaching a child in, say, C++ that:
X++==++Y is valid and even meaningful if read as:
X++ == ++Y
because it compares the current value of X to the NEW value of Y after
incrementing it, yields a Boolean result, and only then increments X. Heck, you
can write (XY) and so on. I have seen languages with
an = and == and even === alongside := and ::= and -> and --> and <- and <-- and
more all to mean variations on a theme.
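Python itself now carries three of these: = in assignment statements, == for comparison, and, since Python 3.8, the := "walrus" operator for assignment inside an expression. A small sketch (the names are mine):

```python
x = 10               # = : plain assignment statement
is_ten = (x == 10)   # == : comparison, yields a Boolean
# := assigns AND yields the value inside an expression (Python 3.8+)
if (n := len("hello")) > 3:
    note = f"a long word of {n} letters"
```

Three spellings, three subtly different meanings, all variations on "make this name mean that value."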
If we had started with many more symbols, in some ways it would be harder but
in other ways easier. Mathematicians borrow lower-case, upper-case and script
letters from languages like Greek and Latin, but also from Hebrew as in, well,
not easy to include in a file containing normal text, but aleph-null (and
aleph-one and infinitely more levels of infinity).
A simple teaching language that uses English words children know might either
be verbose in places or long as in making you spell out DELETE instead of del.
But quite a few languages are simple if you leave out most of the
functionality. Yes, you need to explain why some things must end in a semicolon
or be indented a certain way or require parentheses or ...
What you don't want to teach is a complex language like English. I was teaching
my Dad as he prepared for Citizenship and he balked when told there are at
least seven distinct ways to pronounce OUGH in common English words.
So, perhaps Python can be used to teach basic programming for several reasons
including no need to declare variables and their "types" in advance and having
relatively intelligent basic types that easily get converted in the background
as in various kinds of numbers. But when you use more and more features, it
expands into areas where, frankly, there is no one right answer and choices are
often made by fiat or by a majority. Mathematically, if you view a topic like
multiple inheritance in classes and the kludges made to get around logical
issues, you see what a mess it really is and how hard it is to teach. Look at
the double underscore notation (prefix-only) for variables, which renames them
to be unique within a certain scope to avoid collisions in the mythical
searched namespace.
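Both behaviors are easy to demonstrate. A minimal sketch, with made-up class names, of the background numeric conversions, the order Python actually searches under multiple inheritance (the MRO), and the double-underscore renaming (Python calls it name mangling):

```python
# Numbers convert quietly in the background: int widens to float.
mixed = 3 + 0.5      # int + float gives a float
ratio = 7 / 2        # true division always yields a float

# Multiple inheritance: Python linearizes the search order (the MRO).
class A: pass
class B: pass
class C(A, B): pass
mro_names = [cls.__name__ for cls in C.__mro__]

# Double-underscore (prefix-only) names are renamed per class:
class Deck:
    def __init__(self):
        self.__cards = []          # stored as _Deck__cards

class TrickDeck(Deck):
    def __init__(self):
        super().__init__()
        self.__cards = ["joker"]   # a separate _TrickDeck__cards; no collision

d = TrickDeck()
mangled = sorted(vars(d))          # the renaming shows in the instance dict
```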
I am not against such drift but at times wonder if a one-size-has-to-fit-all
mentality is wise.
I had a thought on an earlier topic by Asad. He wanted to write in Python what
is effectively a member of the UNIX grep family. Specifically, fgrep (or