In article <mailman.2052.1306303508.9059.python-l...@python.org>, Dennis Lee Bieber <wlfr...@ix.netcom.com> wrote:
> On Tue, 24 May 2011 13:39:02 -0400, "D'Arcy J.M. Cain" <da...@druid.net>
> declaimed the following in gmane.comp.python.general:
>
> > My point was that even proponents of the language can make a
> > significant error based on the way the variable is named. It's like
> > the old Fortran IV that I first learned where the name of the variable
> > determined whether it was an integer or a floating point.
>
> Only if one didn't declare the type ahead of time...
>
> And even then it wasn't that hard to remember (using a non-PC
> mnemonic): Indians are integer (variables starting I to N inclusive
> were integers)

Remembering that I, J, K, L, M, and N were integer was trivial if you came from a math background. And, of course, Fortran was all about math, so that was natural. Those letters are commonly used for integers in formulae. If I write x_i, anybody who knows math would immediately assume that x ranged over the reals and that i ranged over the integers.

-- 
http://mail.python.org/mailman/listinfo/python-list
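For what it's worth, the implicit typing rule being discussed is simple enough to sketch in a few lines of Python. This is just an illustration of the naming convention, not real Fortran semantics; the function name is my own invention:

```python
def implicit_fortran_type(name):
    """Return the Fortran IV implicit type for an undeclared variable.

    Variables whose names start with I through N (inclusive) default
    to INTEGER; all others default to REAL.
    """
    first = name[0].upper()
    return "INTEGER" if "I" <= first <= "N" else "REAL"

print(implicit_fortran_type("INDEX"))  # INTEGER
print(implicit_fortran_type("X"))      # REAL
```

Of course, in real Fortran an explicit declaration (or, later, IMPLICIT NONE) overrides this default, as Dennis notes above.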