Re: More About Unicode in Python 2 and 3

2014-01-06 Thread Mark Janssen
 The argument is that a very important, if small, subset of data manipulation
 becomes very painful in Py3.  Not impossible, and not difficult, but painful
 because the mental model and the contortions needed to get things to work
 don't sync up anymore.

You are confused.  Please see my reply to you on the bytestring type thread.

 Painful because Python is, at heart, a simple and
 elegant language, but with the use-case of embedded ascii in binary data
 that elegance went right out the window.

It went out the window only because the Object model with the
type/class unification was wrong.  It was fine before.

Mark

 It can't be both things. It's either bytes or it's text.

 Of course it can be:

 000: 0372 0106   6100 1d00    .r..a...
 010:          
 020: 4e41 4d45    0043 0100   NAME...C
 030: 1900         
 040: 4147 4500    004e 1a00   AGEN
 050: 0300         
 060: 0d1a 0a  ...

 And there we are, mixed bytes and ascii data.

No, you are printing a debug output which shows both.  That's called CHEATING.
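
For anyone following along, a minimal sketch (Python 3, with a made-up
byte layout rather than the dump above) of how the same mixture looks
from code: the bytes stay bytes until you explicitly decode the ASCII
slice.

    record = b"\x03\x72\x01\x06NAME\x00\x00\x00\x43AGE\x00\x00\x4e\x1a"

    print(record)                        # repr shows printable ASCII inline
    print(record[4:8])                   # b'NAME' -- still bytes
    print(record[4:8].decode("ascii"))   # 'NAME' -- text, after an explicit decode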

Mark
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: More About Unicode in Python 2 and 3

2014-01-06 Thread Mark Janssen
 Chris didn't say bytes and ascii data, he said bytes and TEXT.
 Text != ascii data, and the fact that some people apparently think it
 does is pretty much the heart of the problem.

 The heart of a different problem, not this one.  The problem I refer to is
 that many binary formats have well-defined ascii-encoded text tidbits.

Really?  If people are using binary with well-defined ascii-encoded
tidbits, they're doing something wrong.  Perhaps you think escape
characters \n are well defined tidbits, but YOU WOULD BE WRONG.
The purpose of binary is to keep things raw.  WTF?  You guys are so
strange.
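
As an aside, here is a small sketch of the kind of record the previous
poster presumably means, using a hypothetical fixed-width layout (the
format string and field names are invented for illustration):

    import struct

    # Hypothetical layout: 1-byte version, 2-byte little-endian count,
    # then an 11-byte NUL-padded ASCII field name.
    blob = struct.pack("<BH11s", 3, 42, b"NAME")

    version, count, raw_name = struct.unpack("<BH11s", blob)
    field_name = raw_name.rstrip(b"\x00").decode("ascii")
    print(version, count, field_name)   # 3 42 NAME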


 If you (generic you) don't get that, you'll have a bad time. I mean
 *really*
 get it, deep down in the bone. The long, bad habit of thinking of
 ASCII-encoded bytes as text is the problem here.

I think the whole forking community is confused because of your own
arrogance.  Foo(l)s.

markj
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: the Gravity of Python 2

2014-01-06 Thread Mark Janssen
 http://blog.startifact.com/posts/python-2-gravity.html

 A Way Forward - How to go forward then? I think it makes sense to work as
 hard as possible to lift those Python 2 codebases out of the gravity well.

 I think this is complete nonsense.  There's only been five years since the
 first release of Python 3.  Surely much more time should be made available
 for people using Python 2 to plan for a migration?

What makes no sense is that you've started a whole 'nother thread on
an issue whose gravity is right here, already on the list.  Adding your
new commentary and links to existing threads would be easier, yes?

Mark unLawrence
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: More About Unicode in Python 2 and 3

2014-01-06 Thread Mark Janssen
 Looks like another bad batch, time to change your dealer again.

??? Strange, when the debate hits bottom, accusations about doing
drugs come up.  This is like the third reference (and I don't even
drink alcohol).

mark
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: More About Unicode in Python 2 and 3

2014-01-06 Thread Mark Janssen
 Really?  If people are using binary with well-defined ascii-encoded
 tidbits, they're doing something wrong.  Perhaps you think escape
 characters \n are well defined tidbits, but YOU WOULD BE WRONG.
 The purpose of binary is to keep things raw.  WTF?

 If you want to participate in this discussion, do so.  Calling people
 strange, arrogant, and fools with no technical content is just rude. Typing
 YOU WOULD BE WRONG in all caps doesn't count as technical content.

Ned -- IF YOU'RE A REAL PERSON -- you will see that, several words
prior to that declaration, you'll find (or be able to arrange) the
proposition that escape characters are well-defined tidbits of binary
data, and that proposition is FALSE.

Now that is a technical point that I'm saying is simply the way
things are, coming from the mass of experience held by the OS
community and the C programming community, which are responsible for
much of the world's computer systems.  Do you have an argument against
it, or do you just piss off and argue against anything I say?  Perhaps I
said it too loudly, and I take responsibility for that, but don't
claim I'm not making a technical point which seems to be at the heart
of all the confusion regarding Python 2/Python 3 and str/unicode/bytes.

mark
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: More About Unicode in Python 2 and 3

2014-01-06 Thread Mark Janssen
 I would still point out that Kenneth and Armin are not the whole Python
 community.

 I never said they were the whole community, of course. But they are not
 outliers either.  [...]

 Your whole argument seems to be that a couple revered (!!)
 individuals should see their complaints taken for granted. I am opposed to
 rockstarizing the community.

 I'm not creating rock stars.  I'm acknowledging that these two people are
 listened to by many others.  It sounds like part of your effort to avoid
 rockstars is to ignore any one person's specific feedback?  I must be
 misunderstanding what you mean.

In Ned's defense, it doesn't always work to treat everyone in the
community as equal.  That's not to say that those two examples are the
most important, but some people work on core aspects of the field
which are critical for everything else to work properly.  Without
diving into it, one can't say whether Ned's intuition is wrong or not.

markj
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: More About Unicode in Python 2 and 3

2014-01-05 Thread Mark Janssen
 Most of the complaints about Py3 are it's harder to get something
 started (or port from Py2). My answer is that it's easier to get
 something finished.

 I like all of this logic, it makes sense to me.  But Armin and Kenneth have
 more experience than I do actually writing networking software. They are
 both very smart and very willing to do a ton of work.  And both are unhappy.
 I don't know how to square that with the logic that makes sense to me.

 And no amount of logic about why Python 3 is better is going to solve the
 problem of the two of them being unhappy.  They are speaking from experience
 working with the actual product.

+1, well-said.

I hope you'll see my comments on the thread on the bytestring type.
This issue also goes back to the schism in 2004 from the VPython folks
over floating point.  Again the ***whole*** issue is ignoring the
relationship between your abstractions and your concrete architectural
implementations.  I honestly think Python3 will have to be regressed
despite all the circle jerking about how everyone's moving to Python
3 now.  I see how I was inadequately explaining the whole issue by
using high-level concepts like models of computation, but the
comments on the aforementioned thread go right down to the heart of
the issue.

markj
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: interactive help on the base object

2013-12-08 Thread Mark Janssen
On Sun, Dec 8, 2013 at 2:33 AM, Steven D'Aprano
steve+comp.lang.pyt...@pearwood.info wrote:
 On Sat, 07 Dec 2013 20:21:06 -0800, Mark Janssen wrote:

 Is it just me, or is this basically useless?

 class object
  |  The most *base* type

 [[Terry Reedy:]]
 How about something like.
 The default top *superclass* for all Python classes.

 How 'bout you fools just admit that you didn't realize you've been
 confused this whole time?  (It *is* possible isn't it?)

 Mr. Ewing says base has to be interpreted as an *adjective* because
 otherwise it would mean the BOTTOM (like the BASE of the pyramid), while
 Terry responds that it is the TOP (*super*class).  Earlier, Steven
 D'Aprano wanted to argue that this distinction was irrelevant,

 What are you talking about? Until this very post, I haven't made any
 comments in this thread.

It was a few months ago.  You do know what I'm talking about, because
you just expounded with the exact same argument below.  It's like a
broken record.  (Now if *I* sound like a broken record, it's because
no one seems to see the obvious, but carry on.)

 but obviously it can't very well be both at once now cannit?

 Family trees and other hierarchies, including class inheritance diagrams,
 have a *relative* direction not an absolute direction. We can all agree
 that Fred and Wilma are the parents of Pebbles, but it doesn't really
 matter whether we draw the family tree like this:


    Fred    Wilma     (diagrams best viewed in a fixed-width font
      |       |        like Courier, Monaco or Lucida Typewriter)
      +---+---+
          |
       Pebbles


 (inheritance goes *down* the page from ancestors to descendants)

 or like this:

       Pebbles
          |
      +---+---+
      |       |
    Fred    Wilma


 (inheritance goes *up* the page from ancestors to descendants).

 What matters is the relationships between the entities, not the specific
 direction they are drawn in relative to some imaginary absolute space.
 [yadda, yagni, yadda]

But there IS A DIFFERENCE.  Let me explain the concept of an object
model (or type model, if you prefer).

In a family inheritance tree, there is this difference -- called the
calendar -- which imposes an ordering that can't be countermanded
by flipping your silly chart around.  You made a bullshit example to
simply argue a point and *fooled yourself* into ignoring this.  Yes?

Likewise, WITH A COMPUTER, there is a definite order which can't be
countermanded by simply having this artifice called Object.  If you
FEE(L)s hadn't noticed (no longer using the insult "foos" out of
respect for the sensitivities of the brogrammers), this artifice has
just been *called on the floor* with this little innocent question
that fired up this discussion again (don't hate the messenger).
Again:  people entering the community are pointing out a problem --
that Object is both trying to be the BASE and the SUPERclass of all
objects.

CS554: A type/object *model* has to define the relationship of these
nice abstractions so that they can be mapped to the *actual
concreteness* of the machine.  And there, bro, there is an ordering.
You're not going to magically flip the hierarchy so that your bitless
Object becomes a machine word that is the base of all your types.
You've been fooled by the magic of the Turing Machine.   The modern
computer mollifies you with the illusion of total abstraction where
there are no bits or 1s and 0s involved, but yea, it did not turn out
that way.  (Nota bene: as a comparison, C++ is very UNAMBIGUOUS about
this fact -- all objects inherit from concrete machine types, which is
why it remains important, *despite* being one of the worst languages to
do OOP in.  Its *type model* is probably the clearest of any
object-oriented language.)
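
Whatever one makes of the machine-level argument, for reference this is
what the language itself reports: every class's method resolution order
terminates at object, and object itself has no bases, so it sits at the
root of the hierarchy no matter which way the diagram is drawn.  A
minimal check:

    class Animal: pass
    class Dog(Animal): pass

    print(Dog.__mro__)               # (..., <class 'object'>) -- object is always last
    print(object.__bases__)          # () -- object has no superclass
    print(issubclass(Dog, object))   # True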

 Likewise it doesn't matter whether we draw class hierarchies from the top
 down or the bottom up or even sidewise:

Have you caught it by now, friends:  IT MATTERS TO THE COMPUTER.
With some apologies to Ned for attempting to be neutral.   Apparently
you guys are philosophers more than Computer Engineers.

MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: interactive help on the base object

2013-12-08 Thread Mark Janssen
   help(object)
 Help on class object in module builtins:

 class object
   |  The most base type

 '''The default top superclass for all Python classes.
 Its methods are inherited by all classes unless overridden.
 '''

  The root class for all Python classes. Its methods are inherited by
 all classes unless overridden. 

*sits back*.
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: interactive help on the base object

2013-12-08 Thread Mark Janssen
 What methods, if any does it provide?  Are they all abstract? etc???

 Pretty much nothing useful :-)

 py dir(object)
 [...]


So (prodding the student), why does everything inherit from Object if
it provides no functionality?

Practicality-beats-purity-yours?
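
For reference, what object does provide is the small set of defaults
every instance relies on: identity-based equality and hashing, a repr,
and the attribute machinery.  A quick check:

    print([n for n in dir(object) if not n.startswith('_')])   # [] -- no "ordinary" methods
    print('__eq__' in dir(object), '__hash__' in dir(object))  # True True

    o = object()
    print(o == o, hash(o) == hash(o))   # True True -- identity-based defaults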

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: interactive help on the base object

2013-12-08 Thread Mark Janssen
On Sun, Dec 8, 2013 at 6:44 PM, Chris Angelico ros...@gmail.com wrote:
 On Mon, Dec 9, 2013 at 10:01 AM, Mark Janssen dreamingforw...@gmail.com 
 wrote:
 (Nota bene: as a comparison, C++ is very UNAMBIGUOUS about
 this fact -- all objects inherit from concrete machine types, which is
 why it remains important, *despite* being one of the worst languages to
 do OOP in.  Its *type model* is probably the clearest of any
 object-oriented language.)

 Factually wrong. In C++, it is actually *impossible* to inherit from a
 concrete machine type, by which presumably you mean the classic
 types int/char/float etc.

Wow, you guys trip me out, but I guess I've been working in a
different universe where I was mapping classes into basic types (using
generic programming along with typedef).  I'm going to have to
re-think all this confusion.

But, in any case, if you don't have a way to map your abstract objects
into machine types, you're working on magic, not computer science.

MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: interactive help on the base object

2013-12-07 Thread Mark Janssen
 Is it just me, or is this basically useless?

 class object
  |  The most *base* type

[[Terry Reedy:]]
 How about something like.
 The default top *superclass* for all Python classes.

How 'bout you fools just admit that you didn't realize you've been
confused this whole time?  (It *is* possible isn't it?)

Mr. Ewing says base has to be interpreted as an *adjective* because
otherwise it would mean the BOTTOM (like the BASE of the pyramid),
while Terry responds that it is the TOP (*super*class).  Earlier,
Steven D'Aprano wanted to argue that this distinction was irrelevant,
but obviously it can't very well be both at once now cannit?

Could-the-world-be-so-crazy-confused-and-then-shoot-the-messenger?

Sadly, yes.

MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Extending the 'function' built-in class

2013-12-01 Thread Mark Janssen
 Hi, I can't figure out how I can extend the 'function' built-in class. I 
 tried:
   class test(function):
 def test(self):
   print(test)
 but I get an error. Is it possible ?

It has to do with differing models of computation, and Python isn't
designed for this.  Perhaps you're searching for the ultimate lambda?
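
Concretely, the built-in function type cannot be used as a base class,
and the usual workaround is a class whose instances are callable; a
sketch:

    import types

    try:
        class Test(types.FunctionType):   # 'function' lives in the types module
            pass
    except TypeError as exc:
        print(exc)   # type 'function' is not an acceptable base type

    # The common substitute: an ordinary class with __call__.
    class Callable:
        def __call__(self, *args, **kwargs):
            print("called with", args, kwargs)

    t = Callable()
    t(1, 2, x=3)   # called with (1, 2) {'x': 3}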
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Newbie - Trying to Help a Friend

2013-11-19 Thread Mark Janssen
 Think they just needed a starting point really to be honest, as they can't get 
 their head round it.

Then the problem is that your friend doesn't understand one or more of
the words being used.  Understanding them is a necessary prerequisite
for making an algorithm from a text description.  Perhaps they don't
know what it means to be divisible.
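
In Python, "n is divisible by d" just means the remainder of the integer
division is zero, which is what the % operator gives you:

    def is_divisible(n, d):
        """Return True if n is divisible by d (the remainder n % d is zero)."""
        return n % d == 0

    print(is_divisible(12, 3))   # True
    print(is_divisible(13, 3))   # False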

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: New user's initial thoughts / criticisms of Python

2013-11-11 Thread Mark Janssen
On Mon, Nov 11, 2013 at 3:32 AM, Chris Angelico ros...@gmail.com wrote:
 On Mon, Nov 11, 2013 at 10:17 PM, Steven D'Aprano
 steve+comp.lang.pyt...@pearwood.info wrote:
 On Mon, 11 Nov 2013 21:39:27 +1100, Chris Angelico wrote:
 denormalizes it into a lookup table by creating 70 entries quoting the
 first string, 15 quoting the second, 5, and 10, respectively.

 Ewww :-(

 Imagine having to print out the dict looking for an error in the lookup
 table. Or imagine the case where you have:

 0...2: do this
 20001...890001: do that
 890001...890003: do something else

 Don't get me wrong, it's a clever and reasonable solution for your
 specific use-case. But I'd much rather have a lookup table variant that
 matches on intervals.

 Of course it's Ewww in isolation :) But just imagine there are piles
 and piles of these tables, themselves keyed by keyword, and I want to
 be able to let untrusted people create tables (which means they
 basically have to be data, not code). Also, bear in mind, all the
 tables are based around dice that can be physically rolled, so none
 has more than 100 entries after denormalization. Quite a lot of the
 tables actually have unique entries per value (eg it's a d10 roll,
 with ten unique outputs), so it's simplest to just turn all the tables
 into that format; that way, the main code needs worry about one type
 only, and the preprocessor handles the denormalization.

Hmm, I automatically think of creating a hash function, but then
that's how Python implements keys in dicts, so a dict is a fine
solution.
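
For what it's worth, the "lookup table variant that matches on
intervals" mentioned above is easy to sketch with the standard bisect
module, so the table stays data rather than code (the table contents
here are invented):

    import bisect

    # Each boundary marks the start of an interval; results[i] applies
    # from boundaries[i] up to (but not including) boundaries[i + 1].
    boundaries = [1, 71, 86, 91]          # e.g. a d100 table: 1-70, 71-85, 86-90, 91-100
    results = ["goblin", "orc", "troll", "dragon"]

    def lookup(roll):
        return results[bisect.bisect_right(boundaries, roll) - 1]

    print(lookup(1), lookup(70), lookup(85), lookup(100))   # goblin goblin orc dragon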

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: New user's initial thoughts / criticisms of Python

2013-11-09 Thread Mark Janssen
A little late, but a couple of cents' worth of additional data:

 I've just got a few thoughts I'd like to share and ask about:

 * Why not allow floater=float(int1/int2) - rather than
 floater=float(int1)/float(int2)?

This has to do with evaluation order: the stuff inside the parens gets
evaluated first, resulting in an integer in versions of Python before
3.

 Give me a float (or an error message) from evaluating everything in the
 brackets. Don't make me explicitly convert everything myself (unless I
 want to)

You only have to give one float value:  int1/float(int2).  The
interpreter performs a floating-point operation when either of the
two operands is a float value.  (Try 1/2.0, for example.)
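
To make the evaluation-order point concrete:

    # Python 2:
    #   1 / 2        -> 0    (two ints, so integer division)
    #   1 / 2.0      -> 0.5  (one float operand makes it a float operation)
    #   float(1 / 2) -> 0.0  (too late: the integer division already happened)
    #
    # Python 3:
    #   1 / 2        -> 0.5  (true division by default)
    #   1 // 2       -> 0    (floor division when you really want it)
    print(1 / 2.0, 1 // 2)   # 0.5 0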

 * No sign of a select .. case statement

 Another useful tool in the programmer's toolbox

I agree on this one, though I prefer C's syntax of switch/case.  The
if/elif ladder of Python is a bit cumbersome, but was chosen to
reduce language size -- a choice that gets mixed reviews.
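
The substitute most Python programmers reach for is a dict used as a
dispatch table, which covers a lot of switch/case ground; a sketch:

    def handle_add(x, y): return x + y
    def handle_sub(x, y): return x - y

    dispatch = {"add": handle_add, "sub": handle_sub}

    op = "add"
    print(dispatch.get(op, lambda x, y: None)(2, 3))   # 5, with a default for unknown ops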

 * Call me pedantic, but why do we need a trailing comma for a list of one
 item? Keep it intuitive and allow lstShopping=[] or ["Bread"] or
 ["Bread", "Milk", "Hot Chocolate"]. I don't like ["Bread",]. It bugs me.

This one got answered; it has to do with how the parser deals with
parens.
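
To spell it out, the trailing comma is really a tuple issue, not a list
issue; the parser needs the comma to tell a one-element tuple from an
ordinary parenthesised expression:

    shopping = ["Bread"]      # a one-element list; no trailing comma needed
    grouped = ("Bread")       # just the string: parentheses group, they don't build a tuple
    one_tuple = ("Bread",)    # the trailing comma is what makes this a tuple

    print(type(shopping), type(grouped), type(one_tuple))
    # <class 'list'> <class 'str'> <class 'tuple'>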

 Is everyone happy with the way things are?

No, but Python is still the best language.

 Could anyone recommend a good,
 high level language for CGI work? Not sure if I'm going to be happy with
 Perl (ahhh, get him, he's mentioned Perl and is a heretic!) or Python.

Personally, I wouldn't recommend Python for web scripts.  But I'm
biased and am speaking from where I see the field of computer
languages heading.

MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: New user's initial thoughts / criticisms of Python

2013-11-09 Thread Mark Janssen
 I'd be interested to hear your thoughts on where the field of computer 
 languages is heading, and how that affects the choice of languages for 
 building web sites.

Well, there aren't that many groupings that languages specialize in
(not including embedded or other application-specific
domains).  There's OS scripting, Web scripting, and then the otherwise
general-purpose normative languages in the middle of those two
extremes.  But this view presumes a model of computation which hasn't
settled into wide agreement.
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Languages for different purposes (was Re: New user's initial thoughts / criticisms of Python)

2013-11-09 Thread Mark Janssen
On Sat, Nov 9, 2013 at 2:58 PM, Chris Angelico ros...@gmail.com wrote:
 So, on what basis _would_ you choose a language for some purpose?
 Without speaking specifically of web development here, how do you
 choose a language?

Most generally, you choose a language informed by the language
designer's intentions, usually stated explicitly.  Of
course, if you're in a constrained environment, then that is going to
dictate your decision.   After that, you're left with your own level
of expertise regarding language design (which for many is not much)
and the breadth of the field to examine (usually larger than most are
familiar with).  This is an arena where PhDs are made.

Obviously, languages just designed to [brain]f*ck with you, despite
being theoretically complete, aren't much of a candidate for
evaluation.

 But that would still leave you with a good few choices. When it comes
 down to it, how do you choose between Ruby, Python, Perl, Pike,
 JavaScript, insert language of choice here, etcetera? I can think of
 a few considerations that may or may not be important... and I'm sure
 you can add more.

Among general purpose languages that pretty much offer the same
benefits, the community often informs the decision.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Algorithm that makes maximum compression of completly diffused data.

2013-11-07 Thread Mark Janssen
Well, let me try to explain why it is working; I have implemented one.
I only need to refresh my memory, it was almost 15 years ago.
This is not the solution, but this is why it is working:
65536 = 256^2 = 16^4 = ***4^8*** = 2^16

 All of those values are indeed the same, and yet that is completely
 unrelated to compression.  Did you honestly believe this was actually
 explaining anything?

I think the idea is that you could take any arbitrary input sequence,
view it as a large number, and then find what exponential equation can
produce that result.  The equation becomes the compression.

MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Algorithm that makes maximum compression of completly diffused data.

2013-11-07 Thread Mark Janssen
On Thu, Nov 7, 2013 at 6:17 PM, Chris Angelico ros...@gmail.com wrote:
 On Fri, Nov 8, 2013 at 1:05 PM,  jonas.thornv...@gmail.com wrote:
 I guess what matters is how fast an algorithm can encode and decode a big 
 number, at least if you want to use it for very big sets of random data, or 
 lossless video compression?

 I don't care how fast. I care about the laws of physics :) You can't
 stuff more data into less space without losing some of it.

Technically, the universe could expand temporarily or reconfigure to
allow it; the question is who or what will have to shift out to allow
it?

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Algorithm that makes maximum compression of completly diffused data.

2013-11-07 Thread Mark Janssen
 I am not sure if it is just stupidity or laziness that prevents you from 
 seeing that 4^8=65536.

 I can see that 4^8 = 65536. Now how are you going to render 65537? You
 claimed that you could render *any* number efficiently. What you've
 proven is that a small subset of numbers can be rendered efficiently.

I think the idea would be to find the prime factorization for a given
number, which has been proven to exist (and be unique) for any and
every number.  Most numbers could be compressed with this technique.
Prime numbers, of course, would not be compressed.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Algorithm that makes maximum compression of completly diffused data.

2013-11-03 Thread Mark Janssen
 Congratulations Jonas.  My kill file for this list used to have only one
 name, but now has 2.

 You have more patience than I!  Jonas just made mine seven.  :)

Gosh, don't kill the guy.  It's hardly an obvious thing to anyone
but computer scientists.  It's an easy mistake to make.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Algorithm that makes maximum compression of completly diffused data.

2013-11-03 Thread Mark Janssen
 Note that I *can* make a compression algorithm that takes any
 length-n sequence and compresses all but one sequence by at least one
 bit, and does not ever expand the data.

 00 - 
 01 - 0
 10 - 1
 11 - 00

 This, obviously, is just 'cause the length is an extra piece of data,
 but sometimes you have to store that anyway ;).

And how many bits will you use to store the length?

 So if I have a list of
 N length-Y lists containing only 1s or 0s, I can genuinely compress
 the whole structure by N log2 Y items.

But you cheated by using a piece of information from outside the
system: length.  A generic compression algorithm doesn't have this
information beforehand.
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Algorithm that makes maximum compression of completly diffused data.

2013-11-02 Thread Mark Janssen
 Let me try to get you to understand WHY what you say is impossible.  Let's
 say you do have a function f(x) that can produce a compressed output y for
 any given x, such that y is always smaller than x.  If that were true, then
 I could call f() recursively:
 f(f(...f(f(f(f(f(x)...))
 and eventually the result gets down to a single bit.  I hope it is clear
 that there's no way to restore a single bit back into different source
 texts.

Hey, that's a nice proof!
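
The same point can be phrased as a counting exercise; a quick sketch:

    # Pigeonhole check: there are 2**n distinct n-bit inputs, but only
    # 2**n - 1 outputs of length strictly less than n bits
    # (1 + 2 + 4 + ... + 2**(n-1)), so no lossless compressor can
    # shrink *every* n-bit input.
    n = 16
    inputs = 2 ** n
    shorter_outputs = sum(2 ** k for k in range(n))   # lengths 0 .. n-1
    print(inputs, shorter_outputs, inputs > shorter_outputs)   # 65536 65535 True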

Cheers,

Mark Janssen
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Algorithm that makes maximum compression of completly diffused data.

2013-10-30 Thread Mark Janssen
On Wed, Oct 30, 2013 at 11:21 AM,  jonas.thornv...@gmail.com wrote:
 I am searching for the program or algorithm that makes the best possible 
 compression of completely diffused data (random noise), and wonder what the 
 state-of-the-art compression is.

Is this an April Fool's joke?  A key property of completely random data
is that you *can't* compress it.
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-27 Thread Mark Janssen
 I see the big man stepping in to answer for his homies

After re-reading the discussion, I wish to retract what I'm saying
here and apologize to John who seems like a decent guy.

, but while his
 explanation satisfies their question of "well, why do these magic
 values get used then, if what Mark says is true?", it doesn't address
 the real confusion:  what is the difference between script code
 (like Javascript and visual) made for the screen (where such magic
 values are utilized) and compiled source (made for the machine)?  And
 that is where John, while benevolent, hasn't done the homework of
 computer science.   Ask him.

Otherwise, most of this, while sloppy, still stands.

Mark Janssen
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-26 Thread Mark Janssen
[Getting back to some old comments]

 A language specification in BNF is just syntax. It doesn't say anything
 about semantics. So how could this be used to produce executable C code
 for a program? BNF is used to produce parsers. But a parser isn't
 sufficient.

 A C program is just syntax also.  How does the compiler generate
 executable machine code?  Extrapolate into a Python front-end to C.

 Did you even read the paragraph you quoted above?  The BNF specification
 does NOT completely describe a language, it only defines its syntax.

Computer Science 301 (a.k.a. educating the Python brogrammers who've
been too long using interpreted languages):

C source (blah.c) is broken down into a linear sequence of tokens
fed into a parser.  The BNF definition for C takes those tokens/syntax
and produces a lexical graph of the source -- its grammatical form.
This becomes an abstract syntax *tree* because there is a main
function (without which I don't believe you can call a language
formally Turing Complete because one doesn't know where to begin to
feed the machine (wait for it boom)).  In any case, this *roots*
the abstract lexical graph and forms the basis for compiling into
machine code.
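
For concreteness, the same token-stream-to-tree pipeline can be seen in
miniature with Python's own standard library (a sketch of the idea, not
of the C toolchain):

    import ast
    import io
    import tokenize

    source = "x = 1 + 2\n"

    # Lexing: source text -> token stream
    tokens = tokenize.generate_tokens(io.StringIO(source).readline)
    print([(tokenize.tok_name[t.type], t.string) for t in tokens][:6])
    # [('NAME', 'x'), ('OP', '='), ('NUMBER', '1'), ('OP', '+'), ('NUMBER', '2'), ('NEWLINE', '\n')]

    # Parsing: tokens -> abstract syntax tree
    tree = ast.parse(source)
    print(ast.dump(tree))   # Module(body=[Assign(targets=[Name(id='x', ...)], value=BinOp(...))], ...)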

  So
 if the only thing you knew about C was its BNF, you could certainly not
 write a C compiler.  And neither could anyone else.

Well, now you're confusing everybody, because you're asking, in
essence, "what is the meaning of a symbol to a computer?", and since
there isn't one, you should then wonder: how are you going to get it
to 'do the right thing?'  For that, you'll have to take Mark
Janssen's PHIL 444: Epistemics of Quantity (not offered over the
internet).

Now, please give credit to all the old-timers who paved the way for
all of you to be partying in the easy-land of high-level languages like
Python.  It's they who made the computer SCIENCE.  You guys have been
so spoiled, you're taking it all for granted and confusing people with
all your claptrap about TYPES.  The insane cannot tell that they're
insane.  Remember that.

 Fortunately for the
 C community, the language specification included much more than a BNF
 grammar.  At a minimum, you have to specify both the syntax and the
 semantics.

I will look forward to how you will give a semantic specification for
the C token '{' (the left brace).

Apologies will be accepted on the list.

Mark J
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-26 Thread Mark Janssen
 Apologies will be accepted on the list.

BTW, I can't resist pointing out that you guys are like a cup already
full of (black) coffee -- too full to allow the pure water of clarity
to enter.

(cf. Buddhism)   .. (boom)

MarkJanssen
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-26 Thread Mark Janssen
 What a mess of a discussion.

I see the big man stepping in to answer for his homies, but while his
explanation satisfies their question of "well, why do these magic
values get used then, if what Mark says is true?", it doesn't address
the real confusion:  what is the difference between script code
(like Javascript and visual) made for the screen (where such magic
values are utilized) and compiled source (made for the machine)?  And
that is where John, while benevolent, hasn't done the homework of
computer science.   Ask him.

Mark
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-25 Thread Mark Janssen
On Thu, Oct 24, 2013 at 8:40 PM, Mark Lawrence breamore...@yahoo.co.uk wrote:
 On 22/10/2013 18:37, Oscar Benjamin wrote:
 OTOH why in particular would you want to initialise them with zeros? I
 often initialise arrays to nan which is useful for debugging.

Is this some kind of joke?  What has this list become?

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-25 Thread Mark Janssen
 OTOH why in particular would you want to initialise them with zeros? I
 often initialise arrays to nan which is useful for debugging.

 Is this some kind of joke?  What has this list become?

 It's a useful debugging technique to initialize memory to distinctive values
 that should never occur in real data.

If you're doing this, you're doing something wrong.   Please give me
the hex value for NaN so I can initialize my array with it.
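
For readers wondering what the suggestion looks like in practice, a
sketch with numpy (assuming numpy arrays are what is being initialised):

    import numpy as np

    # Fill a debug array with NaN so any element that never gets real
    # data "poisons" later computations and is easy to spot.
    arr = np.full(8, np.nan)
    arr[:4] = [1.0, 2.0, 3.0, 4.0]   # pretend only half the array is filled in

    print(arr)         # [ 1.  2.  3.  4. nan nan nan nan]
    print(arr.sum())   # nan -- the uninitialised tail makes the bug visible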

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-25 Thread Mark Janssen
 OTOH why in particular would you want to initialise them with zeros? I
 often initialise arrays to nan which is useful for debugging.

 Is this some kind of joke?  What has this list become?

 It's a useful debugging technique to initialize memory to distinctive
 values that should never occur in real data.

 If you're doing this, you're doing something wrong.   Please give me
 the hex value for NaN so I can initialize with my array.

 It is clear that you know as much about debugging as you do about objects
 and message passing [...] can see why the
 BDFL described you as an embarrassment, and if he didn't, he certainly
 should have done.

Clearly the Python list has been taken over by TheKooks.  Notice he
did not respond to the request.  Since we are talking about digital
computers (with digital memory), I'm really curious what the hex value
for NaN is to initialize my arrays.

All hail chairman Meow.  Dismissed.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-25 Thread Mark Janssen
On Fri, Oct 25, 2013 at 11:59 AM, rusi rustompm...@gmail.com wrote:
 On Saturday, October 26, 2013 12:15:43 AM UTC+5:30, zipher wrote:
 Clearly the Python list has been taken over by TheKooks.  Notice he
 did not respond to the request.  Since we are talking about digital
 computers (with digital memory), I'm really curious what the hex value
 for NaN is to initialize my arrays.

 I don't see how that's any more relevant than:
 What's the hex value of the add instruction?

You don't see.  That is correct.  Btw, I believe the hex value for
the add instruction on the (8-bit) Intel 8088 is 0x00.  Now what were
you saying?

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-25 Thread Mark Janssen
 As for the hex value for Nan who really gives a toss?  The whole point is
 that you initialise to something that you do not expect to see.  Do you not
 have a text book that explains this concept?

No, I don't think there is a textbook that explains such a concept of
initializing memory to anything but 0 -- UNLESS you're from Stupid
University.

Thanks for providing fodder...

Mark Janssen, Ph.D.
Tacoma, WA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-25 Thread Mark Janssen
 As for the hex value for Nan who really gives a toss?  The whole point is
 that you initialise to something that you do not expect to see.  Do you
 not have a text book that explains this concept?

 No, I don't think there is a textbook that explains such a concept of
 initializing memory to anything but 0 -- UNLESS you're from Stupid
 University.

 Thanks for providing fodder...

 We've been discussing *DEBUGGING*.

Are you making it LOUD and *clear* that you don't know what you're
talking about?

Input:  Yes/no

MarkJanssen
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-25 Thread Mark Janssen
 We've been discussing *DEBUGGING*.

 Are you making it LOUD and *clear* that you don't know what you're
 talking about?

 Input:  Yes/no

 no

 Now please explain what you do not understand about the data below that's
 been written by Oscar Benjamin, myself and Ned Batchelder, specifically the
 use of the word *DEBUGGING*.  Is this a word that does not appear in your
 text books?

Yes.

And how do I explain what I do NOT understand?

  If that is in fact the case would you like one of the
 experienced practical programmers on this list to explain it to you?

N/A

 Have
 you ever bothered to read The Zen of Python, specifically the bit about
 Practicality beats purity?

Yes, I have.  And if you have read that, you know that preceding it
is the rule "Special cases aren't special enough to break the rules."

You, sir, have broken the rules; you should not be preaching
practicality if you don't know the rules.

Now take your choir boys there and sit down.

Mark

P.S.

 In his book Writing Solid Code Steve Maguire states that he
 initialises with 0xA3 for Macintosh programs, and that Microsoft uses
 0xCC, for exactly the reasons that you describe above.

I will be glad to discuss all these arcane measures, when you aren't
all being asswipes.

 It's a useful debugging technique to initialize memory to distinctive
 values that should never occur in real data.

As I said, you're doing something wrong.

 Python is the second best programming language in the world.
 But the best has yet to be invented.  Christian Tismer

When you're ready to make Python the best programming language in the
world, re-engage.
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-25 Thread Mark Janssen
 But OTOH, it can also be explained away entirely by (as you previously
 noted) the Dunning-Kruger effect, with the same uninformed responses
 trotted out to everything.

 It was rusi who first mentioned this, I merely replied in my normal dead pan
 way.

 Slight aside, I spelt your surname incorrectly a few minutes ago whilst
 replying elsewhere, I do apologise.

What is this?  The circle-jerk list?  I make some points on the last
couple of threads and you all get bent out of shape, then gather
around each other as if you're all in a cancer ward.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-25 Thread Mark Janssen
On Fri, Oct 25, 2013 at 2:07 PM, Ned Batchelder n...@nedbatchelder.com wrote:
 (Offlist)

 Mark, these conversations would go much more smoothly if you would make
 direct statements about technical points.  Your messages are usually
 insinuating questions, or personal insults.

Yes, thank you.  That is correct.

 For example, you said:

 Please give me the hex value for NaN so I can initialize with my array.

 I think what you meant by this was: I don't think there is a hex value that
 represents NaN.  Why not say that?

Why?  Because I know there's not a hex value for NaN, otherwise it
would confuse the abstraction of what a computer is.  Any hex digit
you could attempt to obscure would be translatable as a number, and
therefore a contradiction.  Is that a good enough reason for ya?

 Then we could talk about your claim.

How about we talk about my claim with facts instead of attempts at
creating reality a la NovusOrdoSeclorum?

 You could even go so far as to admit that others might know things you
 don't, and ask, is there a hex value that represents NaN, I didn't realize
 there was?

How sweet.  Do you like makeup?

 We could have a discussion about the concepts involved.   As it is, the
 threads devolve into name calling, topic-changing non-sequiturs, and silly
 sound effects.  You seem to start with the assumption that you are right and
 everyone else is wrong, and begin with snark.

I'm still waiting on the binary-digit lexer, Ned.

 There really are people on the list who know a lot about software and
 computer science, including the people you are currently calling
 known-nothings.

I don't know if you personally qualify for the latter, but I agree
somewhat on the software part.

 These things are true: There are hex values that represent NaNs.

Why don't you follow your own advice?  Instead of "These things are
true:", why don't you say "These things could be true", OR "*I*
believe that hex values could be used to represent NaN"?

Tell us, which hex value is used to represent NaN?  (thoughts to self:
 all-ones wouldn't make a very good magic number for finding errors,
so I wonder what Ned will dream up  (btw:  I'm not gay)).   Note
that, just for the record, I was talking strictly about memory (RAM),
not variable assignments.
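
For the record, IEEE-754 does assign bit patterns to NaN (any value
whose exponent bits are all ones and whose significand is non-zero),
and on a typical IEEE-754 platform the pattern is easy to inspect from
Python:

    import struct

    nan = float("nan")
    print(struct.pack(">d", nan).hex())   # 7ff8000000000000 (a quiet NaN, 64-bit)
    print(struct.pack(">f", nan).hex())   # 7fc00000 (the 32-bit equivalent)

    # And back again from the raw bytes:
    print(struct.unpack(">d", bytes.fromhex("7ff8000000000000"))[0])   # nan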

 Non-Turing-complete languages can be compiled to C.  ASTs don't have enough
 information to compile to machine code.

Please tell us then, what IS enough information to compile to machine
code?   ...rather than just saying that ASTs don't have enough
information to compile to machine code.

  Data on punched cards can be
 tokenized.  All of these things are true.

*rolls eyes*

 You seem to be a seeker of truth.  Why not listen to others?

Yes, now listening

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-22 Thread Mark Janssen
I love it.  Watch this...

[context]
 A language specification in BNF is just syntax. It doesn't say anything
 about semantics. So how could this be used to produce executable C code
 for a program? BNF is used to produce parsers. But a parser isn't
 sufficient.

 A C program is just syntax also.  How does the compiler generate
 executable machine code?  Extrapolate into a Python front-end to C.

[Dave Angel responds:]
 Did you even read the paragraph you quoted above?  The BNF specification
 does NOT completely describe a language, it only defines its syntax.

[Steven D'Aprano responds:]
 Like every other language, C programs are certainly not *just* syntax.
 Here is some syntax:

 foo bar^ :=

Now, I don't know where y'all were taught Computer Science, but BNF
specifies not only syntax (which would be the *tokens* of a language)
but also its *grammar*: how syntax relates to linguistic categories
like keywords, and how tokens relate to each other.

Dave is claiming that BNF only defines the syntax of a language, but
then Steven goes on to supply some syntax that a BNF specification of
the language would not allow (even though Steven calls it "syntax",
which is what, per Dave's claim, BNF parses).

So which of you is confused?  I ask that in the inclusive (not
exclusive OR) sense ;^)  -- face says both.

Mark Janssen
Tacoma, Washington.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-22 Thread Mark Janssen
 So which of you is confused?  I ask that in the inclusive (not
 exclusive OR) sense ;^)  -- face says both.

 Could you please be less snarky?  We're trying to communicate here, and it
 is not at all clear yet who is confused and who is not.  If you are
 interested in discussing technical topics, then discuss them.

Okay.  The purpose of BNF (at least as I envision it) is to
produce/specify a *context-free* grammar.  A lexer parses the tokens
specified in the BNF into an Abstract Syntax Tree.  If one can produce
such a tree for any given source, the language, in theory, can be
compiled by GCC into an executable.

Boom.
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-22 Thread Mark Janssen
 So which of you is confused?  I ask that in the inclusive (not
 exclusive OR) sense ;^)  -- face says both.

 Could you please be less snarky?

 Okay.  The purpose of BNF (at least as I envision it) is to
 produce/specify a *context-free* grammar.  A lexer parses the tokens
 specified in the BNF into an Abstract Syntax Tree.  If one can produce
 such a tree for any given source, the language, in theory, can be
 compiled by GCC into an executable.

 Boom.

 Hmm, I don't hear the boom yet.  An Abstract Syntax Tree is a tree
 representation of a program.  To use my previous example: the program 123
 *!? 456 would become a tree:

 op: *!?
 num: 123
 num: 456

 There's still not enough information to compile this.

Is your language Turing complete?

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-22 Thread Mark Janssen
 Okay.  The purpose of BNF (at least as I envision it) is to
 produce/specify a *context-free* grammar.  A lexer parses the tokens
 specified in the BNF into an Abstract Syntax Tree.  If one can produce
 such a tree for any given source, the language, in theory, can be
 compiled by GCC into an executable.

 Boom.

 But you still need to specify the semantics.

In my world, like writing pseudo-code or flow-charts, the AST *is* the
semantics.  What world are you guys from?
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-22 Thread Mark Janssen
 Is your language Turing complete?


 1) No, it's not.
 2) So what?  That should make it easier to compile to C, if anything.
 3) Don't change the subject.

Well, if your language is not Turing complete, it is not clear that
you will be able to compile it at all.  That's the difference between
a calculator and a computer.

Thank you.  You may be seated.

Mark J
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-21 Thread Mark Janssen
On Mon, Oct 21, 2013 at 12:46 AM, Steven D'Aprano st...@pearwood.info wrote:
 On Sun, 20 Oct 2013 20:35:03 -0700, Mark Janssen wrote:

 [Attribution to the original post has been lost]
 Is a jit implementation of a language (not just python) better than
 traditional ahead of time compilation.

 Not at all.  The value of jit compilation, I believe, is purely for the
 dynamic functionality that it allows.  AOT compilation will never allow
 that, but in return you get massive performance and runtime-size gains

 On the contrary, you have that backwards. An optimizing JIT compiler can
 often produce much more efficient, heavily optimized code than a static
 AOT compiler,

This is horseshit.

 and at the very least they can optimize different things
 than a static compiler can.

Okay, sure.  But now you've watered down your claim so that it's not
saying much of anything.

 This is why very few people think that, in
 the long run, Nuitka can be as fast as PyPy, and why PyPy's ultimate aim
 to be faster than C is not moonbeams:

It is moonbeams, but that's a good thing.  I think you don't
understand how computers work, Steven.
In any event, PyPy is a great project for those who want to experiment
with compiler and language design.

 JIT compilation is really about optimization,

No.

 which is why languages like
 Java and .NET which could easily be compiled to machine code at compile
 time generally use an intermediate bytecode and a JIT compiler instead.
 They're not doing it for dynamism since they aren't dynamic languages.
 It's a way of generating more aggressive (i.e. better but harder)
 optimizations based on information only available at runtime.

This much is true, but not for reasons you understand.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-21 Thread Mark Janssen
On Mon, Oct 21, 2013 at 4:08 AM, Philip Herron
herron.phi...@googlemail.com wrote:
 Thanks, I've been working on this basically on my own; 95% of the compiler is 
 all my code, in my spare time. It's been fairly scary, all of this, for me. I 
 personally find this a real source of interest, to really demystify 
 compilers and what JIT compilation really is under the hood.

So I'm curious, not having looked at your code: are you just
translating Python code into C code to make your front-end to GCC?
Like converting [1,2,3] into a C linked-list data structure and
making 1 an int (or BigNum?)?

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-21 Thread Mark Janssen
 No, it's not like those 'compilers'; I don't really agree with a compiler 
 generating C/C++ and saying it's producing native code. I don't really believe 
 it's truly within the statement. Compilers that do that tend to put in a lot 
 of type-safety code and debugging internals at a high level to get things 
 working in other projects; I am not saying Python compilers here, I haven't 
 analysed enough to say this.

Hmm, well, what I'd personally find interesting from a computer science
point of view is an app that will take a language specification in BNF
(complete with keywords and all) and output C code which is then
compiled to an executable as normal.  This is how a front-end should
be designed.  A middle layer for translating common language elements
like lists, sets, etc., could make it easy.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-21 Thread Mark Janssen
 A language specification in BNF is just syntax. It doesn't say anything
 about semantics. So how could this be used to produce executable C code
 for a program? BNF is used to produce parsers. But a parser isn't
 sufficient.

A C program is just syntax also.  How does the compiler generate
executable machine code?  Extrapolate into a Python front-end to C.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python Front-end to GCC

2013-10-20 Thread Mark Janssen
 Gccpy is an Ahead of time implementation of Python ontop of GCC. So it
 works as you would expect with a traditional compiler such as GCC to
 compile C code. Or G++ to compile C++ etc.

That is amazing.  I was just talking about how someone should make a
front-end to GCC on this list a couple of months ago.  Awesome!

 Documentation can be found http://gcc.gnu.org/wiki/PythonFrontEnd.
 (Although this is sparse, partially on purpose, since I do not want
 people thinking this is by any means ready to compile real Python
 applications)

What's missing?

 I've found some good success with this project in compiling Python,
 though it's largely unknown to the world, simply because I am nervous of
 the compiler and more specifically the Python compiler world.

 But, at least to me, there is an unanswered question in
 current compiler implementations:  AOT vs JIT.

 Is a jit implementation of a language (not just python) better than
 traditional ahead of time compilation.

Not at all.  The value of jit compilation, I believe, is purely for
the dynamic functionality that it allows.  AOT compilation will never
allow that, but in return you get massive performance and runtime-size
gains (that is, you don't need a massive interpreter environment
anymore!)  If your compiler produces an executable program without the
need for the python interpreter environment:  Two major wins.

 What I can say is ahead-of-time at least strips out the crap needed
 for the user's code to be run. As in, people are forgetting the basics
 of how a computer works, in my opinion, when it comes to making code run
 faster.

Agreed.

 I could go into the arguments, but I feel I should let the project
 speak for itself. It's very immature, so you really can't compare it to
 anything like it, but it does compile little bits and bobs fairly well,
 though there is much more work needed.

I wish I had the resources to try it myself, but would love to see
some performance numbers (say factorizations, or bubble-sorts, etc).
Also runtime executable sizes.
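
The kind of micro-benchmark meant here is easy to express with the
standard timeit module; the bubble sort below is just a stand-in
workload, not a claim about gccpy's numbers:

    import random
    import timeit

    def bubble_sort(items):
        items = list(items)
        for end in range(len(items) - 1, 0, -1):
            for i in range(end):
                if items[i] > items[i + 1]:
                    items[i], items[i + 1] = items[i + 1], items[i]
        return items

    data = [random.random() for _ in range(500)]
    seconds = timeit.timeit(lambda: bubble_sort(data), number=20)
    print("20 runs of bubble_sort on 500 floats: %.3f s" % seconds)

The same script, run under CPython and under an AOT-compiled build,
would give the comparison being asked for.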


 I would really like to hear the feedback, good and bad. I can't
 describe how much work I've put into this and how much persistence
 I've had to have in light of recent reddit threads talking about my
 project.

Please reference the threads in question; I would like to see the issues raised.
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python was designed (was Re: Multi-threading in Python vs Java)

2013-10-17 Thread Mark Janssen
Prior to that [the '70s] you have punch cards where there's no meaningful
 definition of parsing because there are no tokens.

 I have no idea what you mean by this. [...]
 You seem drawn to sweeping statements about the current state and history of
 computer science, but then make claims like this about punched cards that
 just make no sense.

It's like this.  No matter how you cut it, you're going to get back to
the computers where you load instructions with switches.  At that
point, I'll be very much looking in anticipation to your binary-digit
lexer.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python was designed (was Re: Multi-threading in Python vs Java)

2013-10-17 Thread Mark Janssen
On Thu, Oct 17, 2013 at 10:32 AM, rusi rustompm...@gmail.com wrote:
 On Wednesday, October 16, 2013 1:56:27 AM UTC+5:30, zipher wrote:
 Yes, well clearly we are not having the same thoughts, yet the
 purpose of the academic establishment is to pin down such terminology
 and not have these sloppy understandings everywhere.  You dig?

 Heh Mark I am really sorry.  I think this is the third or fourth time that I 
 say something to which you reply with such egregious rubbish -- parsing has 
 something to do with card-punches?!?! Yeah like python has something to do 
 with the purple shirt I am wearing -- that a dozen others jump at you with a 
 resounding 'Cut the crap!'

Your feedback is respected.  However, you haven't included in your
analysis that you have a closed group here of Python aficionados.  I
invite you to take a look at
http://c2.com/cgi/wiki?TypeSystemCategoriesInImperativeLanguagesTwo
before you continue to issue insults.

 Likewise here. I certainly 'dig' your passion to clean up the 'sloppy 
 understandings everywhere' and would only wish for you the sanity of more 
 knowledge of the subject before you begin to hold forth.

Talk to me after you've finished your assignment.
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python was designed (was Re: Multi-threading in Python vs Java)

2013-10-17 Thread Mark Janssen
On Thu, Oct 17, 2013 at 3:10 PM, Ethan Furman et...@stoneleaf.us wrote:
 On 10/17/2013 01:57 PM, Ned Batchelder wrote:


 Read and listen more.  Write and say less.


 Mark Janssen has no interest in learning.  From a thread long ago:

 Mark Janssen wrote:

 Ethan Furman wrote:

 Mark Janssen wrote:


 Really?

  -- int = "five"
  -- [int(i) for i in [1,2,3]]

  TypeError: 'str' object is not callable

 Now how are you going to get the original int type back?


Thank you for bringing this back up.  Was it you who suggested that
built-ins are re-assignable?  Because this is a bad idea for the
reasons I just showed.  My error in that example was going into arcane
points that I should have cross-checked in the Python language
definition (whether built-ins were or were *not* assignable); then I
wouldn't have had to make my (otherwise valid) point that there
is no magical stack which will remember your language re-assignment
so that you can get it back.  But then the example should never have
been pushed into existence in the first place.
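
For completeness, what the quoted session shows is a module-level name
shadowing the built-in; the built-in itself is untouched and is visible
again once the shadowing name is removed:

    int = "five"                # shadows the built-in name in this module
    try:
        int("3")
    except TypeError as exc:
        print(exc)              # 'str' object is not callable

    del int                     # remove the shadowing name...
    print(int("3") + 1)         # 4 -- the built-in was never gone

    import builtins
    print(builtins.int is int)  # True: built-ins live in their own namespace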

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python was designed (was Re: Multi-threading in Python vs Java)

2013-10-17 Thread Mark Janssen
 It's like this.  No matter how you cut it, you're going to get back to
 the computers where you load instructions with switches.  At that point,
 I'll be very much looking in anticipation to your binary-digit lexer.

 Why stop there? If you go back far enough, you've got Babbage with his
 Analytical Engine and his laboriously hand-cast analog gears.

And there you bring up the heart of it:  the confusion in computer
science.  Thank you.  Babbage's Analytical Engine is not doing
*computation*, it is doing *physics*.  We must draw a line somewhere,
because the digital realm in the machine is so entirely separate from
the physics (and even the physical hardware), that I could make a
whole other universe that does not conform to it.  It is a whole other
ModelOfComputation.

Q.E.D.  (Who else is going to have to eat a floppy disk here?)

 Relevant:

 http://www.xkcd.com/451/

*winks*.  BTW, all this regarding models of computation and such is
relevant to the discussion only because of one thing:  I like Python.
I will leave that vague response for a later exercise after I get an
invite from a University (MIT?) to head their Computer Engineering
department.

Cheers,

Mark
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python was designed (was Re: Multi-threading in Python vs Java)

2013-10-16 Thread Mark Janssen
On Tue, Oct 15, 2013 at 2:46 PM, Grant Edwards invalid@invalid.invalid wrote:
 On 2013-10-15, Mark Janssen dreamingforw...@gmail.com wrote:

 Yeah, well 40 years ago they didn't have parsers.

 That seems an odd thing to say. People were assembling and compiling
 computer programs long before 1973.

I'm using the word "parser" in the sense of a stand-alone application
that became useful with the growing open-source culture that was
developing in the '70s.  Prior to that you have punch cards, where
there's no meaningful definition of parsing because there are no
tokens.  Would you say you were parsing on an old digital machine
where you input programs with binary switches?

But after the advent of the dumb terminal, parsers started evolving,
and that was the early '70s.  I might be a year or two off, but I only
gave one significant digit there.   ;^)

Cheers,
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python was designed (was Re: Multi-threading in Python vs Java)

2013-10-16 Thread Mark Janssen
 Types on the other hand correspond to our classifications and so are
 things in our minds.

 That is not how a C programmer views it.  They have explicit
 typedefs that make it a thing for the computer.

 Speaking as a C programmer, no.  We have explicit typedefs to create new
 labels for existing types, to make the type-abstraction easier to relate to
 the object-abstraction.

Who uses object abstraction in C?  No one.  That's why C++ was invented.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python was designed (was Re: Multi-threading in Python vs Java)

2013-10-16 Thread Mark Janssen
 Who uses object abstraction in C?  No one.  That's why C++ was invented.

 If not, Linux, how about Python?

 http://hg.python.org/cpython/file/e2a411a429d6/Objects

 Or huge slabs of the OS/2 Presentation Manager, which is entirely
 object oriented and mostly C. It's done with SOM, so it's possible to
 subclass someone else's object using a completely different language.

Now this is the first real objection to my statement: OS/2 and the
Presentation Manager, or windowing system.

But, here it is significant that the user /consumer (i.e. *at the
workstation*, mind you) is *making* the object, because their visual
system turns it into one.  Otherwise, at the C-level, I'm guessing
it's normal C code without objects, only struct-ured data.  That is,
you don't get all the OOP benefits like inheritance, polymorphism and
encapsulation.  C can do two of those, albeit kludgily, but not all
three.  And without all three, it's not at all well-established that
you're doing real OOP.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python was designed (was Re: Multi-threading in Python vs Java)

2013-10-16 Thread Mark Janssen
 And your earlier idea that punched cards didn't have tokens is wildly
 ignorant of the state of software and languages 50 years ago.

Please tell me how you parsed tokens with binary switches 50 years
ago.  Your input is rubbish.
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python was designed (was Re: Multi-threading in Python vs Java)

2013-10-15 Thread Mark Janssen
 Objects in programming languages (or 'values' if one is more functional 
 programming oriented) correspond to things in the world.

One of the things you're saying there is that values correspond to
things in the world.  But you will not get agreement in computer
science on that any more than saying numbers correspond to things in
the world -- they are abstractions that are not supposed to
correspond to things.  (Objects, OTOH, were intended to, so your
statement has mixed truthiness.)

 Types on the other hand correspond to our classifications and so are things 
 in our minds.

That is not how a C programmer views it.  They have explicit
typedefs that make it a thing for the computer.

 So for the world 'to settle' on a single universal type system is about as 
 nonsensical and self contradictory as you and I having the same thoughts.

Yes, well clearly we are not having the same thoughts, yet the
purpose of the academic establishment is to pin down such terminology
and not have these sloppy understandings everywhere.  You dig?

 To see how completely nonsensical a classification system of a so-called 
 alien culture is, please read:
 http://en.wikipedia.org/wiki/Celestial_Emporium_of_Benevolent_Knowledge

 And then reflect that the passage is implying that CONVERSELY our 
 natural/obvious/FACTual classifications would appear similarly nonsensical to 
 them.

 The same in the world of programming languages:

No.  There is one world in which the computer is well-defined.  All
others are suspect.

 Here's an APL session
 $ ./apl
 a perfectly good (and for many of us old-timers a very beautiful) type system
 but completely incompatible with anything designed in the last 40 years!

Yeah, well 40 years ago they didn't have parsers.   The purpose of
having a field of computer science worthy of the name is to advance
the science, not to let this riff-raff dominate the practice.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python was designed (was Re: Multi-threading in Python vs Java)

2013-10-14 Thread Mark Janssen
On Mon, Oct 14, 2013 at 12:18 PM, John Nagle na...@animats.com wrote:
 On 10/12/2013 3:37 PM, Chris Angelico wrote:
 On Sat, Oct 12, 2013 at 7:10 AM, Peter Cacioppi
 peter.cacio...@gmail.com wrote:
 Along with batteries included and we're all adults, I think
 Python needs a pithy phrase summarizing how well thought out it is.
 That is to say, the major design decisions were all carefully
 considered, and as a result things that might appear to be
 problematic are actually not barriers in practice. My suggestion
 for this phrase is Guido was here.

 Designed.

 You simply can't get a good clean design if you just let it grow by
 itself, one feature at a time.

 No, Python went through the usual design screwups.

I hesitate to poke my nose in here, but Python is fine.  No one knows
how to design the perfect language from the start, otherwise it would
be here.   But Python has set the precedent for allowing
backwards-incompatibility to fix language problems and that's what
will keep it from breaking.

 Look at how
 painful the slow transition to Unicode was, from just str to
 Unicode strings, ASCII strings, byte strings, byte arrays,

This is where I wish I could have been involved with the discussion,
but I was outside of civilization at the time, and was not able to
contribute.

 16 and 31 bit character builds, and finally automatic switching
 between rune widths. Old-style classes vs. new-style classes.  Adding a
 boolean type as an afterthought (that was avoidable; C went through
 that painful transition before Python was created).Operator +
 as concatenation for built-in arrays but addition for NumPy
 arrays.

All of this will get fixed, but the problem is that you are stirring
up issues without really understanding the problem.   The problem is
something at the bleeding edge of Computer Science itself: settling on
a theory of types.  I've answered this by creating a unified object
model, but no one has understood why the hell anyone needs one, so I'm
sitting around waiting for a friend...

 Each of those reflects a design error in the type system which
 had to be corrected.

To call it a design error makes it seem like someone made a decision
that resulted in a mistake, but it isn't (wasn't) that simple.

 The type system is now in good shape. The next step is to
 make Python fast.

Whoa, boy.  There's no reason to make an incomplete design faster for
pseudo-problems that no one will care about in 5-10 years.  The field
has yet to realize that it needs an object model, or even what that
is.

 Python objects have dynamic operations suited
 to a naive interpreter like CPython.

Naive, no.

 These make many compile
 time optimizations hard. At any time, any thread can monkey-patch
 any code, object, or variable in any other thread.  The ability
 for anything to use setattr() on anything carries a high
 performance price.  That's part of why Unladen Swallow failed
 and why PyPy development is so slow.

Yes, and all of that is because the world has not settled on some
simple facts.  It needs an understanding of type systems.  It's been
throwing terms around, some of which are well-defined, but others,
not:  there has been enormous cross-breeding that has made mutts out
of everybody and someone's going to have to eat a floppy disk for
feigning authority where there wasn't any.

Mark J
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Python was designed (was Re: Multi-threading in Python vs Java)

2013-10-14 Thread Mark Janssen
 Python objects have dynamic operations suited
 to a naive interpreter like CPython.

 Naive, no.

 Naive, in this instance, means executing code exactly as written,
 without optimizing things (and it's not an insult, btw).

In that case, you're talking about a non-optimizing interpreter, but
then, that is what is supposed to happen.  I don't think it's fair to
call it naive.  An interpreter can't guess what you mean to do in
every circumstance (threading?).  It's better to do it right (i.e.
well-defined), *slowly* than to do it fast, incorrectly.

MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Formal-ity and the Church-Turing thesis

2013-10-08 Thread Mark Janssen
 I don't have an infinite stack to implement
 lambda calculus, but...

 And then

 But this is not a useful formalism.  Any particular Program implements
 a DFA, even as it runs on a TM.  The issue of whether that TM is
 finite or not can be dismissed because a simple calculation can
 usually suffice, or at least establish a range of usefulness so as not
 to run out of memory.

 Having it both ways aren't you?

I'm just speaking from programmer experience and the fact that most
machines are VonNeumann architecture.  That being the case, maxing out
the stack simply happens, and I don't dare do any non-simple
recursion; but otherwise, practically speaking, I can calculate the
memory usage that may grow on the heap, so that is effectively a
non-issue.  This may not be an important distinction for computing,
the art (hello, ultimate lambda friends), but it is significant for
computing, the science.

MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Tail recursion to while iteration in 2 easy steps

2013-10-07 Thread Mark Janssen
 That's fine. My point was: you can't at the same time have full
 dynamicity *and* procedural optimizations (like tail call opt).
 Everybody should be clear about the trade-off.

 You're wrong. Full dynamics is not in contradiction with tail call
 optimisation. Scheme has already done it for years. You can rebind
 names to other functions in scheme and scheme still has working
 tail call optimisations.

Yeah, and this is where two models of computation have been conflated,
creating magical effects, confusing everybody.  I challenge you to get
down to the machine code in scheme and formally describe how it's
doing both.
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Tail recursion to while iteration in 2 easy steps

2013-10-07 Thread Mark Janssen
 Only that you've got a consistent, stable (and therefore,
 formalizable) translation from your language to the machine.  That's
 all.  Everything else is magic.  Do you know that the Warren
 Abstraction Engine used to power the predicate logic in Prolog into
 machine code for a VonNeumann machine is so complex, no one has
 understood it (or perhaps even verified it)?

Sorry, I mean the Warren Abstract Machine (or WAM).  I refer you to
www.cvc.uab.es/shared/teach/a25002/wambook.pdf.

Now, one can easily argue that I've gone too far to say no one has
understood it (obviously), so it's very little tongue-in-cheek, but
really, when one tries to pretend that one model of computation can be
substituted for another (arguing *for* the Church-Turing thesis), they
are getting into troubling territory (it is only a thesis,
remember)

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Tail recursion to while iteration in 2 easy steps

2013-10-07 Thread Mark Janssen
On Mon, Oct 7, 2013 at 4:50 PM, Steven D'Aprano
steve+comp.lang.pyt...@pearwood.info wrote:
 On Mon, 07 Oct 2013 15:47:26 -0700, Mark Janssen wrote:
 I challenge you to get
 down to the machine code in scheme and formally describe how it's doing
 both.

 For which machine?

Right, I should stop assuming a modern implementation of vonNeumann
architecture (even though that, too, is ambiguous) since I'm talking
about theory, but yet it is relevant.  My demarcation point for
arguments between the scheme way and other procedural languages
(which, apart from Pascal variants, I blithely call the C way) gets
down to differing models of computation which shouldn't get conflated,
even though everyone thinks and lumps it all as computation.  They
simply can't get *practically* translated between one and the other,
even though they are *theoretically* translated between each other all
the time.  Humans, of course, know how to translate, but that doesn't
count from the pov of computer *science*.

 Frankly, asking somebody to *formally* describe a machine code
 implementation strikes me as confused. Normally formal descriptions are
 given in terms of abstract operations, often high level operations,
 sometimes *very* high level, and rarely in terms of low-level flip this
 bit, copy this byte machine code operations. I'm not sure how one would
 be expected to generate a formal description of a machine code
 implementation.

It's like this: there *should* be one-to-one mappings from the
various high-level constructs to the machine code, varying only
between different chips (that is the purpose of the compiler after
all), yet for some operations, in languages like scheme, well... I
cannot say what happens...  hence my challenge.

 But even putting that aside, even if somebody wrote such a description,
 it would be reductionism gone mad. What possible light on the problem
 would be shined by a long, long list of machine code operations, even if
 written using assembly mnemonics?

Only that you've got a consistent, stable (and therefore,
formalizable) translation from your language to the machine.  That's
all.  Everything else is magic.  Do you know that the Warren
Abstraction Engine used to power the predicate logic in Prolog into
machine code for a VonNeumann machine is so complex, no one has
understood it (or perhaps even verified it)?   One hardly knows where
these things originate.  But here it gets into dark arts best not
entered into too deeply.  It will turn you mad, like that guy in the
movie pi.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Tail recursion to while iteration in 2 easy steps

2013-10-07 Thread Mark Janssen
 But even putting that aside, even if somebody wrote such a description,
 it would be reductionism gone mad. What possible light on the problem
 would be shined by a long, long list of machine code operations, even
 if written using assembly mnemonics?

 Only that you've got a consistent, stable (and therefore, formalizable)
 translation from your language to the machine.

 You are mistaken to think that there is a single, one-to-one, mapping
 between high-level code and machine code.

It's not mistaken.  Given a stable and formalized language definition,
there should only be continued optimization of the lexical and
procedural constructs into better machine code. In the case of an
interpreted language like Python (which I'll define as a language
which includes a layer of indirection between the user and the
machine, encouraging the nice benefits of interactivity), such
optimization isn't really apropos, because it's not the purpose of
python to be optimal to the machine as much as optimal to the
programmer.  In any case, while such optimization can continue over
time, they generally create new compiler releases to indicate such
changes.  The one-to-one mapping is held by the compiler.

Such determinism *defines* the machine, otherwise you might as well
get rid of the notion of computer *science*.  All else is error, akin
to cosmic rays or magic.  Unless the source code changes, all else
remaining equal, the machine code is supposed to be the same, no
matter how many times it is compiled.

[Only if you use the exact source, compiler, switches, etc.] will the output
be the same.
 And even that is not guaranteed.

Oh, and what would cause such non-determinism?

 Take, for example, the single high-level operation:

 sort(alist)

 What machine code will be executed? Obviously that will depend on the
 sort algorithm used. There are *dozens*. Here are just a few:

Well, since you didn't specify your programming language, you're then
merely stating an English construct.  As such, there can be no single
mapping from English into the machine, which is why there are so many
different languages and experiments that map your [English] concepts
into source code.

 Now sorting is pretty high level, but the same principle applies to even
 simple operations like multiply two numbers. There are often multiple
 algorithms for performing the operation, and even a single algorithm can
 often be implemented in slightly different ways. Expecting all compilers
 to generate the same machine code is simply naive.

You are both over-simplifying and complexifying things at once.  Pick one.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Tail recursion to while iteration in 2 easy steps

2013-10-07 Thread Mark Janssen
 Yeah, and this is where two models of computation have been conflated,
 creating magical effects, confusing everybody.  I challenge you to get
 down to the machine code in scheme and formally describe how it's
 doing both.

 Which two models of computation are you talking about? And what magical
 effects?

Well, I delineate all computation involving predicates (like lambda
calculus) from that using digital logic (like C).  These realms of
computation are so different, they are akin to mixing the complex
numbers with the real.  Yet hardly anyone points it out (I've
concluded that hardly anyone has ever noticed -- the Church-Turing
thesis has lulled the whole field into a shortcut in thinking which
actually doesn't pan out in practice).

 AFAIK there is no magic in computer science, although every sufficiently 
 advanced ...

Ha!  That's very good.  I'm glad you catch the spirit of my rant.
Any sufficiently advanced compiler can be substituted with magic to
the neophyte without a change in output.  A mini Liskov substitution.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Formal-ity and the Church-Turing thesis

2013-10-07 Thread Mark Janssen
 On Tuesday, October 8, 2013 5:54:10 AM UTC+5:30, zipher wrote:
 Now, one can easily argue that I've gone too far to say no one has
 understood it (obviously), so it's very little tongue-in-cheek, but
 really, when one tries to pretend that one model of computation can be
 substituted for another (arguing *for* the Church-Turing thesis), they
 are getting into troubling territory (it is only a thesis,
 remember)

 The CT thesis is scientific and provable in one sense and vague/philosophical 
 in another.
 The Science: Turing computability and lambda-computability are equivalent.
 The proofs just consist of writing interpreters for one in terms of the other.

Ah, good, a fellow theoretician.  Now it's nice that you use language
that makes it seem quite clear, but understand that there's a hidden,
subconscious, *cultural* encoding to your *statement*.  The use of the
term equivalent, for example.  Equivalent for the programmer, or for
the machine?  (etc., et cetera), and further:  writing interpreters
for one in terms of the other, but again, this will change depending
on your pragmatic requirements.  To the theorist, you've accomplished
something, but then that is a self-serving kind of accomplishment.  To
the programmer, operating under different requirements, you haven't
accomplished anything.  I don't have an infinite stack to implement
lambda calculus, but I can treat my computer's memory as a TM that is
*practically* infinite and only rarely hit against the limits of
physicality.  This is just being respectful... ;^)

(For the purposes of discussion, if I make a word in CamelCase, I am
referring to a page on the WikiWikiWeb with the same name:
http://c2.com/cgi/wiki?WikiWikiWeb.)

 The philosophy: *ALL* computational models are turing equivalent (and 
 therefore lambda-equivalent) or weaker.
 The Idea (note not proof) is that for equivalence one can write 
 pair-interpreters like above. For the 'weaker' case, (eg DFA and TMs) one 
 proves that TMs can interpret DFAs and disproves the possibility of the other 
 direction.

 This must remain an idea (aka thesis) and not a proof because one cannot 
 conceive of all possible computational models.

Why not?  I can conceive of all possible integer numbers even if I
have never pictured them.  Is there not an inductive way to conceive of
and define computation?  I mean, I observe that the field seems to
define several ModelsOfComputation.  Intuitively I see two primary
domains...

 It is hard science however for all the models that anyone has so far come up 
 with.

And what of interactive computation?

 As for:

 I challenge you to get down to the machine code in scheme and formally
 describe how it's doing both.

 I can only say how ironic it sounds to someone who is familiar with the 
 history of our field:
 Turing was not a computer scientist (the term did not exist then) but a 
 mathematician.  And his major contribution was to create a form of argument 
 so much more rigorous than what erstwhile mathematicians were used to that he 
 was justified in calling that math as a machine.

Hmm, I'm wondering if my use of the word formally is confusing you.
In mathematics, this word has a subtly different meaning, I think,
than in computer science.  Turing was justified in calling that math
as a machine because he was using a definition (the translation table
+ finite dictionary) such that it remained perfectly deterministic.

And here, again, one can easily get mixed up using the same lexicon
across two different domains:  that of math and that of CS.  I advise
you to look at the dialog at ConfusedComputerScience.

 The irony is that today's generation assumes that 'some-machine' implies its 
 something like 'Intel-machine'.
 To get out of this confusion ask yourself: Is it finite or infinite?

But this only gets us out of the confusion for the mathematicians.
For the programmer and perhaps even the computer scientist (the ones
coming from physics), it is something different.

 If the TM were finite it would be a DFA

But this is not a useful formalism.  Any particular Program implements
 a DFA, even as it runs on a TM.  The issue of whether that TM is
finite or not can be dismissed because a simple calculation can
 usually suffice, or at least establish a range of usefulness so as not
to run out of memory.

 If the Intel-machine (and like) were infinite they would need to exist in a 
 different universe.

Ha, yeah.  Let us dismiss with that.

 And so when you understand that TMs are just a kind of mathematical rewrite 
 system (as is λ calculus as are context free grammars as is school arithmetic 
 etc etc) you will not find the equivalence so surprising

It's not that it's surprising, it's that it's *practically* a problem.
 The translation between one PL and another which assumes a different
model of computation can get intractable.

Maybe that makes sense

MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Tail recursion to while iteration in 2 easy steps

2013-10-02 Thread Mark Janssen
 def fact(n): return 1 if n <= 1 else n * fact(n-1)

 into a tail recursion like
 [...]

 How do you know that either <= or * didn't rebind the name fact to
 something else? I think that's the main reason why python cannot apply
 any procedural optimization (even things like inlining are impossible,
 or possible only under very conservative assumption, that make it
 worthless).

It's called operator precedence.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Tail recursion to while iteration in 2 easy steps

2013-10-02 Thread Mark Janssen
On Wed, Oct 2, 2013 at 1:23 PM, Alain Ketterlin al...@unistra.fr wrote:
 On 10/02/2013 08:59 PM, Mark Janssen wrote:

 def fact(n): return 1 if n <= 1 else n * fact(n-1)

 How do you know that either <= or * didn't rebind the name fact to
 something else? I think that's the main reason why python cannot apply
 any procedural optimization

 It's called operator precedence.

 Operator precedence is totally irrelevant here, you misunderstand what
 bind means.

 Imagine that you call fact(x) where x is an instance of the following class:

 class Strange:
     ...
     def __le__(dummy):
         global fact
         fact = someotherfun  # this is binding
         return False

 i.e., executing n <= 1 calls __le__, which rebinds the name fact to
 something else. Then, there is no recursive call at all. At the time
 fact(x-1) is executed, fact is not the same function any more.

 You cannot prevent this in python.


No, but you can't prevent a lot of bad moves in python.  What you just
did there is a total bonehead (strange?) of an idea.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Tail recursion to while iteration in 2 easy steps

2013-10-02 Thread Mark Janssen
 Part of the reason that Python does not do tail call optimization is
 that turning tail recursion into while iteration is almost trivial, once
 you know the secret of the two easy steps. Here it is.

 That should be a reason it _does_ do it - saying people should rewrite
 their functions with loops means declaring that Python is not really a
 multi-paradigm programming language but rather rejects functional
 programming styles in favor of imperative ones.

Yes, but that's fine.  A language that includes every programming
paradigm would be a total mess, if even possible.  Python has
functional programming where it does not conflict with its overall
design.  The only place I find that this is not the case is with
lambda, but that is now adequately fixed with the addition of the
ternary operator.
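
As a concrete illustration of the transformation named in the subject line,
here is a sketch (not necessarily the quoted author's exact two steps) of the
factorial from this thread going from plain recursion to tail form to a loop:

    def fact(n):                      # the definition quoted in this thread
        return 1 if n <= 1 else n * fact(n - 1)

    def fact_tail(n, acc=1):          # step 1: carry an accumulator so the
        return acc if n <= 1 else fact_tail(n - 1, acc * n)   # call is a tail call

    def fact_iter(n):                 # step 2: the tail call becomes iteration
        acc = 1
        while n > 1:
            n, acc = n - 1, acc * n
        return acc

    assert fact(10) == fact_tail(10) == fact_iter(10) == 3628800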

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: What minimum should a person know before saying I know Python

2013-09-20 Thread Mark Janssen
 I started Python 4 months ago. Largely self-study with use of Python 
 documentation, stackoverflow and google. I was thinking what is the minimum 
 that I must know before I can say that I know Python?

Interesting.  I would say that you must know the keywords, how to make
a Class, how to write a loop.  That covers about 85% of it.
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Language design

2013-09-14 Thread Mark Janssen
 Really?  Are you saying you (and the community at-large) always derive
 from Object as your base class?

 Not directly, that would be silly.

 Silly?  Explicit is better than implicit... right?

 If I'm inheriting from str, I inherit from str explicitly:

 class MyStr(str): ...

 and then str in turn inherits from object explicitly. I certainly do not
 inherit from object and then re-implement all the string methods from
 scratch:

I know that.  Str already inherits from object (due to the language
definition).  Your inheritance from object is implied by your
inheritance from a child class (str), but note there is an implied
directionality:  you don't say str is the parent of object.  But tell
me this:  is str the superclass of object or is it the other way
around?

 class MyStr(object):
 def __new__(cls, value): ...
 def upper(self): ...
 def lower(self): ...
 # and so on...

 That would be ridiculous, and goes against the very idea of inheritance.
 But nor do I feel the need to explicitly list the entire superclass
 hierarchy:

 class MyStr(str, object):
 ...

Now you've lost your marbles.  You are arguing points that a python
programmer would not argue.  Now, since I know you to be a decent
python programmer, I can only conclude that your sanity is in
question.

 which would be silly. Only somebody who doesn't understand how
 inheritance works in Python would do that. There's simply no need for it,
 and in fact it would be actively harmful for larger hierarchies.

Explicitly inheriting from object (class myBase(object): rather than
class myBase():) would not be actively harmful in any way.

 But wait is it the base (at the bottom of the hierarchy) or is it
 the parent at the top?  You see, you, like everyone else has been
 using these terms loosely, confusing yourself.

 Depends on whether I'm standing on my head or not.

 Or more importantly, it depends on whether I visualise my hierarchy
 going top-down or bottom-up. Both are relevant, and both end up with
 the *exact same hierarchy* with only the direction reversed.

 Ha,  only the direction reversed.  That little directionality that
 you're passing by so blithely is the difference between whether you're
 talking about galaxies or atoms.

 It makes no difference whether I write:

 atoms - stars - galaxies

 or

 galaxies - stars - atoms

 nor does it make any difference if I write the chain starting at the top
 and pointing down, or at the bottom and pointing up.

Here again, your sanity is questioned.  You are simply wrong.  Atoms
lie within galaxies, but galaxies do not lie within atoms (poetic
license excluded); i.e. there is a difference, whether you're talking
syntactically, by the parser, or conceptually, by a human being.
Somewhere you have to put yourself in the middle.  And that point
defines how you relate to the machine -- towards abstraction (upwards)
or towards the concrete (to the machine itself).

 The simplicity of Python has seduced you into making an equivocation
 of sorts.  It's subtle and no one in the field has noticed it.  It crept
 in slowly and imperceptibly.

 Ah, and now we come to the heart of the matter -- people have been
 drawing tree-structures with the root at the top of the page for
 centuries, and Mark Janssen is the first person to have realised that
 they've got it all backwards.

I'll be waiting for your apology once you simply grasp the simple
(however inconvenient and unbelievable) truth. ;*)

 By inheriting from sets you get a lot of useful functionality for
 free.  That you don't know how you could use that functionality is a
 failure of your imagination, not of the general idea.

 No you don't. You get a bunch of ill-defined methods that don't make
 sense on dicts.

 They are not necessarily ill-defined.  Keep in mind Python already chose
 (way back in 1.x) to arbitrarily overwrite the values in a key collision.
 So this problem isn't new.  You've simply adapted to this limitation
 without knowing what you were missing.

 No, Python didn't arbitrarily choose this behaviour.

Perhaps you don't recall the discussion.

 It is standard,
 normal behaviour for a key-value mapping, and it is the standard
 behaviour because it is the only behaviour that makes sense for a general
 purpose mapping.

No.  Please don't propagate your limited sense of things as if it were the
only way to do it.

 Python did not invent dicts (although it may have invented the choice of
 name dict).

 If you think of inheritance in the Liskov Substitution sense, then you
 might *consider* building dicts on top of sets. But it doesn't really
 work, because there is no way to sensibly keep set-behaviour for dicts.

There's no need to preserve LSP -- it's just one way to think about
class relations.  In fact, I'll argue that one should not -- because
the field has not perfected the object model adequately, so it would
lead to a suboptimal situation akin to premature optimization.   The
conceptual abstraction is most

Re: Language design

2013-09-13 Thread Mark Janssen
On Fri, Sep 13, 2013 at 4:57 PM, Chris Angelico ros...@gmail.com wrote:
 Evangelical vicar in want of a portable second-hand font. Would
 dispose, for the same, of a portrait, in frame, of the Bishop-elect of
 Vermont.

 I think you could quite easily reconstruct the formatting of that,
 based on its internal structure. Even in poetry, English doesn't
 depend on its formatting nearly as much as Python does;

(Just to dispose of this old argument:)  Both Python and English
depend on both syntactical, material delimiters and whitespace.  While
it may seem that Python depends more on whitespace than English, that
is highly contentious, poetry or not.  Take some literature, remove
all the tabs at paragraph start and CRs at paragraph-end so that it
all runs together and you'll find it impossible to read -- you
just won't be able to enter into the universe that the author is
attempting to build.

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Language design

2013-09-12 Thread Mark Janssen
 Really?  Are you saying you (and the community at-large) always derive
 from Object as your base class?

 Not directly, that would be silly.

Silly?  Explicit is better than implicit... right?

 But wait is it the base (at the bottom of the hierarchy) or is it the
 parent at the top?  You see, you, like everyone else has been using
 these terms loosely, confusing yourself.

 Depends on whether I'm standing on my head or not.

 Or more importantly, it depends on whether I visualise my hierarchy going
 top-down or bottom-up. Both are relevant, and both end up with the
 *exact same hierarchy* with only the direction reversed.

Ha,  only the direction reversed.  That little directionality that
you're passing by so blithely is the difference between whether you're
talking about galaxies or atoms.  Please.

The simplicity of Python has seduced you into making an equivocation
of sorts.  It's subtle and no one in the field has noticed it.  It
crept in slowly and imperceptibly.

 By inheriting from sets you get a lot of useful
 functionality for free.  That you don't know how you could use that
 functionality is a failure of your imagination, not of the general idea.

 No you don't. You get a bunch of ill-defined methods that don't make
 sense on dicts.

They are not necessarily ill-defined.  Keep in mind Python already
chose (way back in 1.x) to arbitrarily overwrite the values in a key
collision.  So this problem isn't new.  You've simply adapted to this
limitation without knowing what you were missing.

 3) It used the set literal for dict, so that there's no obvious way to
 do it.  This didn't get changed in Py3k.

 No, it uses the dict literal for dicts.

 Right.  The dict literal should be {:} -- the one obvious way to do it.

 I don't agree it is obvious. It is as obvious as (,) being the empty tuple
 or [,] being the empty list.

You're just being argumentative.  If there are sets as built-ins, then
{:} is the obvious dict literal, because {} is the obvious one for
set.  You don't need [,] to be the list literal because there is no
simpler list-type.
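
For readers following along, the asymmetry being argued about looks like this
at the interpreter (Python 3 shown):

    print(type({}))        # <class 'dict'> -- empty braces belong to dict
    print(type({1, 2}))    # <class 'set'>  -- non-empty braces can spell a set
    print(type(set()))     # <class 'set'>  -- the only spelling of an empty set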

 And the obvious way to form an empty set is by calling set(), the same
 as str(), int(), list(), float(), tuple(), dict(), ...

 Blah, blah.  Let me know when you got everyone migrated over to
 Python.v3.

 What does this have to do with Python 3? It works fine in Python 2.

I mean, your suggestions are coming from a believer, not someone
wanting to understand the limitations of python or whether v3 has
succeeded at achieving its potential.

 I don't even understand what you are talking about here. [reference]
 variables? What does that mean?

 It's just a tricky point that I will wait to comment on.

 I'm looking forward to an explanation, as I'm intrigued.

Well, we're here at junior high.  It will take some time...
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Language design

2013-09-11 Thread Mark Janssen
 * Imports are fiendishly complex, hidden below deceptively simple
   syntax.

   It's a reasonable expectation that one can import a module from a
   source code file given its path on the filesystem, but this turns out
   to be much more complicated than in many other languages.

Why is this so difficult?  Add a Graph class to the collections module
(networkx is quite good) and simply check for circular imports.  The
remaining difficulty I encounter is because the user hasn't defined
their PYTHONPATH variable.
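
A sketch of the kind of check this amounts to, using networkx and a
hand-built map of who imports whom (the module names below are made up):

    import networkx as nx

    imports = {
        "app":    ["models", "views"],
        "models": ["db"],
        "views":  ["models", "app"],   # views -> app closes a cycle
        "db":     [],
    }

    graph = nx.DiGraph(
        [(src, dst) for src, dsts in imports.items() for dst in dsts]
    )

    try:
        cycle = nx.find_cycle(graph)
        print("circular import:", " -> ".join(u for u, v in cycle))
    except nx.NetworkXNoCycle:
        print("no circular imports")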

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Language design

2013-09-11 Thread Mark Janssen
1) It tried to make Object the parent of every class.  No one's close
enough to God to make that work.
2) It didn't make dicts inherit from sets when they were added to Python.
3) It used the set literal for dict, so that there's no obvious way to
do it.  This didn't get changed in Py3k.
4?) It allowed [reference] variables to be used as dict keys.  This
creates a parsing difficulty for me, mentally.  Keys should be direct,
hashable values, not hidden in a variable name.

A few off the top of the head...

Mark

On Mon, Sep 9, 2013 at 11:09 PM, Steven D'Aprano st...@pearwood.info wrote:
 Some time ago, Tom Christiansen wrote about the Seven Deadly Sins of
 Perl:

 http://www.perl.com/doc/FMTEYEWTK/versus/perl.html


 What design mistakes, traps or gotchas do you think Python has? Gotchas
 are not necessarily a bad thing, there may be good reasons for it, but
 they're surprising.

 To get started, here are a couple of mine:


 - Python is so dynamic, that there is hardly anything at all that can be
 optimized at compile time.

 - The behaviour of mutable default variables is a gotcha.

 - Operators that call dunder methods like __add__ don't use the same
 method resolution rules as regular methods, they bypass the instance and
 go straight to the type, at least for new-style classes.



 --
 Steven
 --
 https://mail.python.org/mailman/listinfo/python-list



-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Language design

2013-09-11 Thread Mark Janssen
 On Tue, 10 Sep 2013, Ben Finney wrote:
   The sooner we replace the erroneous
   “text is ASCII” in the common wisdom with “text is Unicode”, the
   better.

 I'd actually argue that it's better to replace the common wisdom with
 text is binary data, and we should normally look at that text through
 Unicode eyes. A little less catchy, but more accurate ;)

 No, that's inaccurate. A sequence of bytes is binary data. Unicode is
 not binary data.

Well now, this is an area that is not actually well-defined.  I would
say 16-bit Unicode is binary data if you're encoding in base 65,536,
just as 8-bit ascii is binary data if you're encoding in base-256.
Which is to say:  there is no intervening data to suggest a TYPE.
-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Language design

2013-09-11 Thread Mark Janssen
 Why is this so difficult?
 Add a Graph class to the collections module (networkx is quite good)
 and simply check for circular imports.

 Er? That doesn't address the task of importing a module from a source
 code file given its path on the filesystem.

That's true, I guess I was hooked on Python's abstraction mechanism for
making the file system invisible.  But I like the idea of programming
*relative* path addressing, so you can create a sort of name space
for your modules.  So instead of import /path/to/file.py which makes
a system dependency (i.e. *yours*), you could have import
TestPackage.collections.bag (using periods for file path separators
in keeping with the Pythonic Way).

-- 
MarkJ
Tacoma, Washington
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Language design

2013-09-11 Thread Mark Janssen
 Unicode is not 16-bit any more than ASCII is 8-bit. And you used the
 word encod[e], which is the standard way to turn Unicode into bytes
 anyway. No, a Unicode string is a series of codepoints - it's most
 similar to a list of ints than to a stream of bytes.

Okay, now you're in blah, blah land.

--mark
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Language design

2013-09-11 Thread Mark Janssen
 1) It tried to make Object the parent of every class.

 Tried, and succeeded.

Really?  Are you saying you (and the community at-large) always derive
from Object as your base class?

 No one's close enough to God to make that work.

 Non-sequitor. One doesn't need to be close to a deity to have a single
 root of the object hierarchy.

But wait is it the base (at the bottom of the hierarchy) or is it
the parent at the top?  You see, you, like everyone else has been
using these terms loosely, confusing yourself.

 2) It didn't make dicts inherit from sets when they were added to
 Python.

 Why would you want dicts to inherit from sets?

A dict is-a set of {key:object, key:object} pairs bound together with
a colon :.  By inheriting from sets you get a lot of useful
functionality for free.  That you don't know how you could use that
functionality is a failure of your imagination, not of the general
idea.

 3) It used the set literal for dict, so that there's no obvious
 way to do it.  This didn't get changed in Py3k.

 No, it uses the dict literal for dicts.

Right.  The dict literal should be {:} -- the one obvious way to do
it.  Pay me later.

 And the obvious way to form an empty set is by calling set(), the same as
 str(), int(), list(), float(), tuple(), dict(), ...

Blah, blah.  Let me know when you got everyone migrated over to Python.v3.

 4?) It allowed
 [reference] variables to be used as dict keys.  This creates a parsing
 difficulty for me, mentally.  Keys should be direct, hashable values,
 not hidden in a variable name.

 I don't even understand what you are talking about here. [reference]
 variables? What does that mean?

It's just a tricky point that I will wait to comment on.

--mark
-- 
https://mail.python.org/mailman/listinfo/python-list


Casting classes WAS: Documenting builtin methods

2013-07-11 Thread Mark Janssen
A user was wondering why they can't change a docstring in a module's class.

This made me think: why not have a casting operator (reciprocal?) to
transform a bona fide class into a mere carcass of a class which can
then be modified and reanimated back into its own type with the type
function?  Such that type(reciprocal(myClass))==myClass...

reciprocal(myClass) returns a writeable, nonproxy class.__dict__
(perhaps also a list of its bases and name)

Just a thought to consider...
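
A rough sketch (names invented here, not a worked-out proposal) of what such
a reciprocal might look like in today's Python: decompose a class into a
writable (name, bases, dict) triple, edit it, and reanimate it with type():

    def reciprocal(cls):
        ns = {k: v for k, v in vars(cls).items()
              if k not in ("__dict__", "__weakref__")}   # type() recreates these
        return cls.__name__, cls.__bases__, ns

    class Demo:
        """original docstring"""

    name, bases, ns = reciprocal(Demo)
    ns["__doc__"] = "patched docstring"
    Demo = type(name, bases, ns)          # reanimated via the type() constructor
    print(Demo.__doc__)                   # -> patched docstring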

MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Documenting builtin methods

2013-07-10 Thread Mark Janssen
 I have this innocent and simple code:

 from collections import deque
 exhaust_iter = deque(maxlen=0).extend
 exhaust_iter.__doc__ = "Exhaust an iterator efficiently without
 caching any of its yielded values."

 Obviously it does not work. Is there a way to get it to work simply
 and without creating a new scope (which would be a rather inefficient
 a way to set documentation, and would hamper introspection)?

 How about dropping the simply requirement?

I think the canonical way to specialize a class (even if it's only
docstrings or method re-names) is to extend it with a new class.
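
A minimal sketch of that suggestion (the class name is invented), giving the
exhauster a home whose docstring is writable, at the cost of one extra
Python-level call:

    from collections import deque

    class _Exhauster(deque):
        """A deque specialized for throwing away everything fed to it."""

        def __init__(self):
            super().__init__(maxlen=0)

        def extend(self, iterable):
            """Exhaust an iterator efficiently without caching any of its
            yielded values."""
            super().extend(iterable)

    exhaust_iter = _Exhauster().extend
    exhaust_iter(iter(range(10)))      # consumes the iterator, stores nothing
    print(exhaust_iter.__doc__)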

markj
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: how to calculate reputation

2013-07-02 Thread Mark Janssen
 Hi all, this seems to be quite stupid question but I am confused..
 We set the initial value to 0, +1 for up-vote and -1 for down-vote! nice.

 I have a list of bool values True, False (True for up vote, False for
 down-vote).. submitted by users.

 should I take True = +1, False=0  [or] True = +1, False=-1 ?? for adding
 all.

 I am missing something here.. and that's clear.. anyone please help me on
 it?

If False is representing a down-vote, like you say, then you have to
incorporate that information, in which case False = -1: a user did not
merely ignore another user, but marked him/her down.
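
The scoring that implies, in one line (the votes list here is made up):

    votes = [True, False, True, True, False]            # up/down votes
    reputation = sum(1 if up else -1 for up in votes)   # a down-vote subtracts
    print(reputation)                                   # 1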

MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: What is the semantics meaning of 'object'?

2013-06-29 Thread Mark Janssen
 On 26/06/2013 9:19 AM, Mark Janssen wrote:

 Did you ever hear of the Glass Bead Game?

 Which was Hesse's condemnation of the
 pure-academic-understanding-unbound-by-pragmatic-use approach as mental
 masturbation,

It was not.  He was conflicted.  On the one hand he knew the
enterprise was noble, but on the other he saw it could lead to crystal
palaces that were good to nobody.
-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: What is the semantics meaning of 'object'?

2013-06-25 Thread Mark Janssen
 This bothers me as well.  If you look at Raymond Hettinger's super()
 considered super article, he includes the (correct) advice that
 super() needs to be used at every level of the call chain.  At the end
 of the article, he offers this example to show how easy multiple
 inheritance can be:
 [...]
 oc = OrderedCounter('abracadabra')

 Which is pretty cool in its simplicity, but here's the rub (which I
 have previously noted on this list): OrderedDict doesn't use super.
 Counter does, but not cooperatively; it just calls super().__init__()
 with no arguments.  So the fact that this example works at all is
 basically luck.

Ah, and here we see the weakness in the object architecture that has
evolved in the past decade (not just in Python, note).  It hasn't
really ironed out what end is what.   Here's a proposal:  the highest,
most parental, most general object should be in charge, not
subclasses calling specific parent's init methods
(Parent.__init__(myparams)), etc. -- ***THIS IS WHERE WE WENT
WRONG***.

After the type/class unification, python tried to make the most
generic, most useless class be the parent of *all of them*, but
there's been no use whatsoever in that.  It was a good idea in the
beginning, so pure as it was, but it has not panned out in practice.
Sorry...

I'm trying to start a recovery plan at the wikiwikiweb
(http://c2.com/cgi/wiki?WikiWikiWeb) and I don't want to hear any more
smarmy comments about it.  The confusion is deeper than Python.
-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: What is the semantics meaning of 'object'?

2013-06-25 Thread Mark Janssen
 So instead of super(), you would have sub()?  It's an interesting
 concept, but I don't think it changes anything.  You still have to
 design your classes cooperatively if you expect to use them with
 multiple inheritance.

Yes, and let new instances of the child classes automatically ensure
the contracts of the parent classes.  I suppose it could be called
delegation
-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: What is the semantics meaning of 'object'?

2013-06-25 Thread Mark Janssen
 The main problem is getting to the top/end of the call chain. Classic
 example is with __init__, but the same problem can also happen with
 other calls. Just a crazy theory, but would it be possible to
 construct a black-holing object that, for any given method name,
 returns a dummy function that ignores its args? (Other forms of
 attribute lookup aren't going to be a problem, I think, so this can be
 just methods/functions.) Then you just subclass from that all the
 time, instead of from object itself, and you should be able to safely
 call super's methods with whatever kwargs you haven't yourself
 processed. Would that work?

 Caveat: I have not done much with MI in Python, so my idea may be
 complete balderdash.

Here's how it *should* be made:  the most superest, most badassed
object should take care of its children.  New instances should
automatically call up the super chain (and not leave it up to the
subclasses), so that the parent classes can take care of the chil'en.
 When something goes wrong the parent class has to look in and see
what's wrong.

In other words, this habit of specializing a Class to make up for the
weaknesses of its parent are THE WRONG WAY.   Instead, let the
specialization start at the machine types (where it doesn't get more
specialized), and work UPWARDS.

Let the standard library make the grouping (or collection types) to
point to the standard way of data structuring, and then everything
else becomes little mini-apps making a DataEcosystem.
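
For reference, the black-holing base class the quoted message wonders about
is easy to sketch (a toy with made-up names, not a design endorsement):

    class BlackHole:
        """Terminates a cooperative super() chain by swallowing leftovers."""
        def __init__(self, *args, **kwargs):
            pass                              # ignore whatever reaches the top
        def __getattr__(self, name):
            def swallow(*args, **kwargs):     # unknown methods ignore their args
                return None
            return swallow

    class Leaf(BlackHole):
        def __init__(self, colour="green", **rest):
            self.colour = colour
            super().__init__(**rest)          # safe: BlackHole absorbs the rest

    leaf = Leaf(colour="red", texture="waxy")
    print(leaf.colour)                        # red
    print(leaf.prune())                       # None -- swallowed by __getattr__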

--mark
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: What is the semantics meaning of 'object'?

2013-06-25 Thread Mark Janssen
Sorry my last message got sent prematurely.  Retrying...

 So instead of super(), you would have sub()?  It's an interesting
 concept, but I don't think it changes anything.  You still have to
 design your classes cooperatively if you expect to use them with
 multiple inheritance.

Yes, and let new instances of the child classes automatically ensure
the contracts of the parent classes -- managed within the Python
interpreter, not the developer.

As for sub(), I suppose it could be called delegate().

The issue of classes cooperating isn't as big as it seems, because
since you're working now from a useful, agreed-upon common base (the
non-negotiable, but also non-arbitrary) machine types, you're now all
(the python and ideally the *object* community) speaking the same
language.   Who's going to argue about integers (as the atomic type)
and sets (as the most basic grouping type) being THE common set of
bases for everything else?  I mean, it doesn't get any more ideal and
pure than integers and sets.  Combining integers with sets I can make
a Rational class and have infinite-precision arithmetic, for example.

That's a lot of power derived simply from using generic data
structures, not some panzy generic meta-Object that doesn't do
anything but tie people to an implicit type-theology.

-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: What is the semantics meaning of 'object'?

2013-06-25 Thread Mark Janssen
 Combining integers with sets I can make
 a Rational class and have infinite-precision arithmetic, for example.

 Combining two integers lets you make a Rational.

Ah, but what is going to group them together?  You see you've already
gotten seduced.  Python already uses a set to group them together --
it's called a Dict and it's in every Class object.

 Python integers are
 already infinite-precision. Or are you actually talking of using
 machine words and sets as your fundamental?

Probably.  It depends on where we need the flexibility of the
abstraction and where the code is written.

  Also, you need an
 ordered set - is the set {5,3} greater or less than the set {2} when
 you interpret them as rationals?

The ordering (and hence the interpretation) is done WITHIN the Class
(i.e. the SET as I say above).

 One must assume, I suppose, that any
 one-element set represents the integer 1, because any number divided
 by itself is 1.

Ah, very good observation.  But that must remain an improper question.  ;^)

 That's a lot of power derived simply from using generic data
 structures, not some panzy generic meta-Object that doesn't do
 anything but tie people to an implicit type-theology.

 Sure. And if you want assembly language, you know where to find it.

Well you've been spoiled by all the work that came before you.  The
issue now is not to go back to the machine so much as to tear down
and build up again from raw materials, objects of more and more
complexity where very complex meta-objects upon meta-objects can be
built until the whole of human knowledge can be contained.

Did you ever hear of the Glass Bead Game?
-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: What is the semantics meaning of 'object'?

2013-06-25 Thread Mark Janssen
 Here's how it *should* be made:  the most superest, most badassed
 object should take care of its children.  New instances should
 automatically call up the super chain (and not leave it up to the
 subclasses), so that the parent classes can take care of the chil'en.
  When something goes wrong the parent class has to look in and see
 what's wrong.

 So what you're saying is that the first class defined does everything,
 and subclasses _restrict_ what can be done? I disagree strongly:

That's right.  Just as the *machine* restricts what can be done.  It's
an upturning of the purity model and going back to practicality.

 1) That breaks the Liskov Substitution Principle. A subclass of list
 ought to fulfill the contracts of a basic list.

We don't need LSP.  I write about this on the WikiWikiWeb where there
were many arguments documented and many hairs frazzled.  LSP was
derived from AlanKay's abstract idea of Everything is an object.
But no -- there is a *physics* for information, and it ends at the
machine types.

 2) It implies that someone can invent an all-encompassing superclass
 before any subclassing is done.

No, we start with basic types and work upwards.  The
all-encompassing superclass is an all-encompassing data object:  in
mathematics, it's called a SET -- and mathematics has already done the
work to prove that it's the most generic and all-encompassing, a field
of SET THEORY.

 This kinda violates the laws of
 information. Programmers, being creative entities, will be adding to
 the pool of knowledge. Trying to shoehorn everything into one object
 won't work.

No, we don't need programmers adding to the pool of knowledge -- the
internet does that.  We need programmers making data objects that can
present data in new and more interesting ways -- starting from basic
principles.

-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: What is the semantics meaning of 'object'?

2013-06-25 Thread Mark Janssen
 Combining two integers lets you make a Rational.

 Ah, but what is going to group them together?  You see you've already
 gotten seduced.  Python already uses a set to group them together --
 it's called a Dict and it's in every Class object.

 When you inherit a set to make a Rational, you're making the
 statement (to the interpreter, if nothing else) that a Rational is-a
 set.

No you don't *inherit* a set to make a Rational, although you gain a
set to make it.  It's a subtle thing, because at its center lies the
very difference between a piece of data and a
container to hold that data.  Or is the container the data?

C++ already solves this di-lemma.  It made class, which is exactly
like a struct, but hides all its data members.  That critical
distinction makes all the difference.  I don't know how many people on
the list really appreciate it.

-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: What is the semantics meaning of 'object'?

2013-06-24 Thread Mark Janssen
 Mostly I'm saying that super() is badly named.

 What else would you call a function that does lookups on the current
 object's superclasses?

^.  You make a symbol for it.  ^__init__(foo, bar)

-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: n00b question on spacing

2013-06-24 Thread Mark Janssen
On Mon, Jun 24, 2013 at 4:48 PM, alex23 wuwe...@gmail.com wrote:
 On 23/06/2013 3:43 AM, Mark Janssen wrote:

 There was a recent discussion about this (under implicit string
 concatenation).  It seems this is a part of the python language
 specification that was simply undefined.


 It's part of the language reference, not an accidental artifact:
 http://docs.python.org/2/reference/lexical_analysis.html#string-literal-concatenation

When I say specification, I mean specified in the formal notation
(BNF, etc.).  Whitespace is not defined (otherwise there would be a
line in the token list for linefeed and carriage return).

-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: n00b question on spacing

2013-06-22 Thread Mark Janssen
 Also remember when entering long lines of text that strings concatenate
 within parenthesis.
 So,
 ("a", "b", "c"
  "d", "e", "f"
  "g", "h", "i")

 Is the same as ("a", "b", "cd", "e", "fg", "h", "i")

There was a recent discussion about this (under implicit string
concatenation).  It seems this is a part of the python language
specification that was simply undefined.   (A rule probably should be
added to the lexer to make this explicit.)
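
What the quoted advice is describing (the string literals lost their quote
marks above) is adjacent-literal concatenation inside parentheses, done by
the parser at compile time:

    s = ("abc "      # adjacent string literals with no comma between them
         "def "      # are joined into a single string by the parser
         "ghi")
    assert s == "abc def ghi"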

-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Compiling vs interpreting [was Re: A certainl part of an if() structure never gets executed.]

2013-06-16 Thread Mark Janssen
 Whats the difference of interpreting  to compiling ?

 OK, I give up!

 Actually, that's a more subtle question than most people think. Python,
 for example, is a compiled language. (What did you think the c in
 .pyc files stood for? and the compile() function?)

Careful there.  This terminology is not agreed upon universally (that
is, within the realm of academia where the notion of mastery exists),
and unless you are citing an actual reference or publishing one
yourself, you may be adding more confusion than illumination.
For example, I would say that it is an *interpreted language* that
gets compiled at run-time.  Some (*valid*) definitions of compiler
mean a strict mapping from the language syntax and lexical definition
to a sequence of bytes that can be fed to a (hardware, not virtual)
machine architecture to perform what is requested.  The fact that
an extension ends in the letter c is not sufficient evidence, since
file extensions have no strict standard.

 And these days, for many types of hardware, even machine-code is often
 interpreted by a virtual machine on a chip. And even languages which
 compile to machine-code often use an intermediate platform-independent
 form rather than targeting pure machine-code.

Do you have a reference for this?  What language?

 The line between compilers
 and interpreters is quite fuzzy.

It shouldn't be.  What is fuzzy is the definition of interpreter,
however.  The definition of compiler has only become fuzzy with the
advent of the personal computer.

 Probably the best definition I've seen for the difference between a
 modern compiler and interpreter is this one:

 ...the distinguishing feature of interpreted languages is not that they
 are not compiled, but that the compiler is part of the language runtime
 and that, therefore, it is possible (and easy) to execute code generated
 on the fly.

That's reasonable.
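
A small illustration of that last point in CPython (the names here are
just for the example):

source = "def square(n):\n    return n * n\n"
code_object = compile(source, "<generated>", "exec")  # the compiler is part of the runtime

namespace = {}
exec(code_object, namespace)   # execute code generated on the fly
print(namespace["square"](7))  # 49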
-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: My son wants me to teach him Python

2013-06-13 Thread Mark Janssen
 Despite not want to RTFM as you say, you might set him in front of
 VPython, type

I totally forgot PyGame -- another likely source of self-motivated
learning for a teen programmer.
-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Don't rebind built-in names* - it confuses readers

2013-06-12 Thread Mark Janssen
 list = []
 Reading further, one sees that the function works with two lists, a list of
 file names, unfortunately called 'list',

That is very good advice in general:  never choose a variable name
that is a keyword.
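
(An invented snippet showing the kind of breakage that advice avoids:)

list = ["a.txt", "b.txt"]    # rebinds the name of the built-in list type

# ...further down, the built-in is no longer reachable by that name:
numbers = list(range(3))     # TypeError: 'list' object is not callable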
-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Re-using copyrighted code

2013-06-12 Thread Mark Janssen
 At least partially, my confusion seems to be caused by the dichotomy of
 the concepts of copyright and license. How do these relate to each other?

 A license emerges out of the commercial domain and is purely about
 commercial protections.

I should clarify that commercial protections here means *money*,
not other potential legal assets.  As soon as money is exchanged you
entangle yourself with that domain.  Otherwise, as long as you give
credit, you're really quite safe from a Constitutional perspective.

-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Don't rebind built-in names* - it confuses readers

2013-06-12 Thread Mark Janssen
On Wed, Jun 12, 2013 at 7:24 AM, Grant Edwards invalid@invalid.invalid wrote:
 On 2013-06-11, Mark Janssen dreamingforw...@gmail.com wrote:
 list = []
 Reading further, one sees that the function works with two lists, a list of
 file names, unfortunately called 'list',

 That is very good advice in general:  never choose a variable name
 that is a keyword.

 You can't choose a variable name that is a keyword: the compiler won't
 allow it.

 list isn't a keyword.

You're right.  I was being sloppy.
-- 
MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Don't rebind built-in names* - it confuses readers

2013-06-12 Thread Mark Janssen
 You're right.  I was being sloppy.

 ['ArithmeticError', 'AssertionError', 'AttributeError',
 'BaseException', 'BlockingIOError', 'BrokenPipeError', 'BufferError',
 'BytesWarning', 'ChildProcessError', 'ConnectionAbortedError',
 'ConnectionError', 'ConnectionRefusedError', 'ConnectionResetError',
 'DeprecationWarning', 'EOFError', 'Ellipsis', 'EnvironmentError',
 'Exception', 'False', 'FileExistsError', 'FileNotFoundError',
 'FloatingPointError', 'FutureWarning', 'GeneratorExit', 'IOError',
 'ImportError', 'ImportWarning', 'IndentationError', 'IndexError',
 'InterruptedError', 'IsADirectoryError', 'KeyError',
 'KeyboardInterrupt', 'LookupError', 'MemoryError', 'NameError',
 'None', 'NotADirectoryError', 'NotImplemented', 'NotImplementedError',
 'OSError', 'OverflowError', 'PendingDeprecationWarning',
 'PermissionError', 'ProcessLookupError', 'ReferenceError',
 'ResourceWarning', 'RuntimeError', 'RuntimeWarning', 'StopIteration',
 'SyntaxError', 'SyntaxWarning', 'SystemError', 'SystemExit',
 'TabError', 'TimeoutError', 'True', 'TypeError', 'UnboundLocalError',
 'UnicodeDecodeError', 'UnicodeEncodeError', 'UnicodeError',
 'UnicodeTranslateError', 'UnicodeWarning', 'UserWarning',
 'ValueError', 'Warning', 'WindowsError', 'ZeroDivisionError', '_',
 '__build_class__', '__debug__', '__doc__', '__import__', '__name__',
 '__package__', 'abs', 'all', 'any', 'ascii', 'bin', 'bool',
 'bytearray', 'bytes', 'callable', 'chr', 'classmethod', 'compile',
 'complex', 'copyright', 'credits', 'delattr', 'dict', 'dir', 'divmod',
 'enumerate', 'eval', 'exec', 'exit', 'filter', 'float', 'format',
 'frozenset', 'getattr', 'globals', 'hasattr', 'hash', 'help', 'hex',
 'id', 'input', 'int', 'isinstance', 'issubclass', 'iter', 'len',
 'license', 'list', 'locals', 'map', 'max', 'memoryview', 'min',
 'next', 'object', 'oct', 'open', 'ord', 'pow', 'print', 'property',
 'quit', 'range', 'repr', 'reversed', 'round', 'set', 'setattr',
 'slice', 'sorted', 'staticmethod', 'str', 'sum', 'super', 'tuple',
 'type', 'vars', 'zip']

 I think I can safely say that all the names beginning with an
 uppercase letter (exceptions, True/False/None/Ellipsis), and the ones
 beginning with underscores, should not be overridden. Well and good.
 Still leaves 72 builtins. Obviously overriding len, print, range, etc
 would be risky (unless, as mentioned above, you're making a drop-in
 replacement), but there are plenty that you'd never notice (eg if you
 use hash for an encoded password, or input for the string the user
 typed into an HTML form). I would hope, for instance, that an editor
 would not color-highlight 'credits' differently, as it's designed for
 interactive work. There are plenty in the grey area - is it safe to
 use sum as an accumulator or min for a unit of time? What about
 using super to store the amount of retirement money you've put away?
 I'd be inclined to avoid this sort any time I'm aware of them, just
 because it'll make debugging easier on the day when something goes
 wrong.

Okay, now I'm a bit confused.  print is both a keyword (in Python 2)
and a member of the builtins (in the Python 3 listing above).  What
happens then?

And abs(), max(), hex() and such seemed like keywords to my
scientific self (due to never having to include/import them), but
clearly they're not.  And int, list, tuple, dict and such always seemed
like keywords to my CS self because they were included in Python's
type system (like int would be in C).

They are all one step removed from keywords.  And yet, since they are
not accessed through a separate namespace, they should not be used as
variable names.  Perhaps, since they are very different from one
another, they should be put in separate namespaces off of a global,
root namespace...  (math, string, etc.)

Despite that, it seems like PEP 8 should suggest not shadowing these
built-ins, which are visible at global scope.
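
They do live in a namespace of their own, which is what makes shadowing
them recoverable.  A quick sketch, assuming Python 3 (the module is
named __builtin__ in Python 2):

import builtins

list = [1, 2, 3]       # shadows the built-in name in this module
print(list)            # [1, 2, 3]
print(builtins.list)   # <class 'list'> -- still reachable via its own namespace

del list               # drop the shadow...
print(list("abc"))     # ...and lookup falls through to the built-in: ['a', 'b', 'c']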

MarkJ
Tacoma, Washington
-- 
http://mail.python.org/mailman/listinfo/python-list

