On Monday, May 11, 2015 at 9:03:43 AM UTC-5, Marko Rauhamaa wrote:
> Antoon Pardon <antoon.par...@rece.vub.ac.be>:
> 
> > The point is that all too often someone wants to defend a specific
> > choice the developers have made and cites some general rule or
> > principle in support, ignoring the fact that python breaks that
> > rule/principle in other areas.
> 
> Granted, but you have set the trap for them by demanding a justification
> when no justification was required. Every language has its cute
> idiosyncrasies and arbitrary design choices.

No.  Here's where I must disagree.  I think one can infer a goal for a particular 
programming language, even if it is subconscious.  For example, with LISP it 
could be "generality".  For C, it could be "staying as close to the machine as 
possible while maximizing usefulness to humans" -- a contradiction that works 
because they've limited their architecture to von Neumann (stackless) machines.

I think the subconscious goal of OOP languages is to create a data ecosystem, 
starting with a unified data model, under the realization that ultimately all 
data relates to other data -- that my database of wind speed and direction from 
2012 is relatable, by some finite number of hops, to your data on population 
growth in Chicago.  Call it the "seven degrees of data", and remember the 
exabytes of data out there.

Python is creating the perfect system for that because it has an interpreter 
environment with which to manipulate objects that could be retrieved from the 
net and sent back out.  It has docstrings, so that your foreign object can 
self-document, and doctests, so that I can be confident that your code works as 
*I* expect.
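
To make that concrete, here's a minimal sketch (the function and its wind 
data are invented for illustration):  the docstring documents the object for 
whoever retrieves it, and the embedded doctests let them verify it behaves 
as *they* expect.

    def dominant_bearing(bearings):
        """Return the most frequent compass bearing in *bearings*.

        The examples below document the function and double as tests:

        >>> dominant_bearing([90, 90, 180])
        90
        >>> dominant_bearing([270])
        270
        """
        from collections import Counter
        return Counter(bearings).most_common(1)[0][0]

    if __name__ == "__main__":
        import doctest
        doctest.testmod()   # silent on success; reports any failed example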

There are reasons to have limits on programming freedom.  Limits put order to 
chaos.  They guide the wily programmer into a particular train of thought.  You 
don't override "True" because you'd be breaking one of the [explicit] goals of 
the language:  readability.  If there were no constraints, life itself could 
not exist.

I don't think allowing the shadowing of built-in types was a deliberate design 
choice; the issue simply never got exercised, because most people handle such 
things, *subconsciously*, the way they would in C.
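
The interpreter certainly lets you do it.  A contrived sketch (Python 3 here; 
Python 2 even let you rebind True itself, which Python 3 forbids by making it 
a keyword):

    # Shadowing a built-in: Python doesn't stop you, it just rebinds the name.
    list = [1, 2, 3]        # "list" now hides the built-in type
    try:
        list("abc")         # fails: our list of ints isn't callable
    except TypeError as err:
        print(err)          # 'list' object is not callable

    del list                # remove the shadow; the built-in reappears
    print(list("abc"))      # ['a', 'b', 'c']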

To Mr. Gatti:  my point was not an insult; it is a theoretical postulate in the 
domain of Computer Science, one that has not really been studied.  OOP is 
still far from its goal, so the field is still answering questions within it.

Mark