On 10/22/2015 1:56 PM, R. David Murray wrote:
On Thu, 22 Oct 2015 17:02:48 -0000, Brett Cannon <br...@python.org> wrote:
On Thu, 22 Oct 2015 at 09:37 Stéphane Wirtel <steph...@wirtel.be> wrote:

Hi all,

When we compile a python script

# test.py
if 0:
         x = 1

python -mdis test.py

There is no byte code for the condition.

The code above is logically equivalent to 'pass'. CPython could compile 'pass' (or code equivalent to it) to the NOP bytecode, but I believe it does not. (The doc implies that NOP is only introduced by the peephole optimizer as a replacement for other statements.)
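This is easy to observe with dis; a minimal sketch (exact instruction names vary across CPython versions, but none of them is NOP):

```python
import dis

def f():
    pass

# Disassemble f: the body contains only the implicit `return None`
# sequence; no NOP is emitted for the `pass` statement.
ops = [instr.opname for instr in dis.get_instructions(f)]
print('NOP' in ops)  # False
```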

So my question is,
>>> the byte code generator removes the unused functions, variables etc…,

It does not do this.

>>> def f(): a=111111111111111

>>> dis(f)
  1           0 LOAD_CONST               1 (111111111111111)
              3 STORE_FAST               0 (a)
              6 LOAD_CONST               0 (None)
              9 RETURN_VALUE

A code checker might report that a is unused, suggesting that you remove the assignment manually, but Python does not do it for you.
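The same thing can be verified programmatically; a small sketch:

```python
import dis

def f():
    a = 111111111111111  # never read afterwards

# The store survives: CPython does not delete the unused assignment.
ops = [instr.opname for instr in dis.get_instructions(f)]
print('STORE_FAST' in ops)  # True
```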

>>> is it right?

It depends on whether one philosophically considers the meaning of Python code to be a particular computational operation or a particular computational result. From the former viewpoint, one might want the useless assignment executed anyway. From the latter viewpoint, in this particular context, deleting the assignment statement would be correct. But this is too sophisticated a judgement for CPython's simple code improvement.

The philosophical question is illustrated by "What is the meaning of '2+2'?". Does it mean 'get 2 and add 2' (and if so, when must this be done)? Does it mean '4'? Current constant folding, at the AST or pre-bytecode stage, presumes that the answer '4' is adequate at runtime.
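Constant folding itself is easy to demonstrate; a minimal sketch:

```python
def f():
    return 2 + 2

# The fold happens at compile time: 4 appears in the code object's
# constants table, and no addition is performed at runtime.
print(4 in f.__code__.co_consts)  # True
print(f())                        # 4
```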

Indeed, whether 'pass' should be compiled to NOP or to nothing depends on one's view of the meaning of pass, and whether it must be executed (by going through the ceval loop once and doing nothing) or not.

>> Technically the peepholer removes the dead branch,
>> but since the peepholer is run on all bytecode you can't avoid it.

> There's an issue (http://bugs.python.org/issue2506) for being able to
> disable all optimizations (that Ned Batchelder, among others, would really

Ned initially just asked for the peephole optimizer to be disabled, which is specific and concrete. He then seemed to ask to be able to disable 'all optimizations'. This is somewhat vague and presupposes a baseline of no optimizations, which is not practical.

> like to see happen :).  Raymond rejected it as not being worthwhile.

He noted that there are multiple costs for implementation, maintenance, documentation, and testing. Also that most things currently done by the peephole optimizer, after code is generated, could be done before or as code is generated, and that one cost might be to make better code generation more difficult.

> I still agree with Ned and others that there should, just on principle,
> be a way to disable all optimizations.

The question is: what counts as an optimization? Generating byte code is not part of the language; it could itself be considered a CPython optimization versus storing the original text or an AST version thereof. Disabling that is not practical.

More realistic: would you have the optimization of no code for 'pass' disabled? Or how about constant folding? Or the optimization of having successive string literals pasted together in the parser, rather than later in the total process? (I believe I have read that this is where it currently happens.) My point here is that optimizations are spread throughout the chain from code to result.
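The literal-pasting case, for instance, is visible in the constants table; a minimal sketch:

```python
def f():
    return 'spam' 'eggs'  # two adjacent literals in the source

# The pasting happens before bytecode is generated: the code object
# carries a single combined constant, not two pieces joined at runtime.
print('spameggs' in f.__code__.co_consts)  # True
print(f())  # spameggs
```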

> Most (all?) compilers have such
> a feature, for debugging reasons if nothing else.

Compilers that make iffy assumptions and do non-equivalent transformations to produce faster code, at the cost of breaking some legal code, must have a means to disable such assumptions and transformations. Or rather, such optimizations should be off by default. Each optimization level defines a subset of the language for which the assumptions and transformations are valid.

For CPython, it has been proposed that there should be optional optimization levels that assume that some dynamic features are not being used: the builtins module is not modified; built-in objects are not masked; modules are not monkey patched. Guido has so far rejected the idea of such language-subsetting optimizations.
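The reason such optimizations would need an opt-in level is that the dynamic behaviour is perfectly legal today; a minimal sketch of the kind of code they would break:

```python
import builtins

# Replacing a builtin is legal, if unwise; an optimizer that assumed
# `len` always means the built-in function would miscompile this.
original_len = builtins.len
builtins.len = lambda seq: 42
print(len([1, 2, 3]))  # 42, not 3

builtins.len = original_len
print(len([1, 2, 3]))  # 3 again
```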

My strong impression is that the -O0 'no optimization' level of modern compilers (for C, for instance) *generally* means 'no potentially dangerous and wrong optimizations' and that they in fact include many safe optimizations relative to compilers of a few decades ago. I am sure that there is no way to disable many of the safe optimizations that are now so taken for granted that they are 'normal' and not seen as 'optimizations'. In other words, I believe that the meaning of 'no optimizations' has evolved. The same should be allowed for CPython.

It is also my impression that current default CPython optimizations are limited to what are default at level 0 in at least some other compilers. I could be wrong, but to be convinced, I would like to see specific comparisons with, say, gcc. Which of the various operations being considered for disablement are optional in gcc?

> We even have a way
> to spell it in the generated byte code files now (opt-0).  But, someone
> would have to champion it and write a patch proposal.

I think there should be a PEP specifying exactly what -X would do (or rather undo). One of Raymond's points was that disabling 'all optimizations' is much more than needed for any sensible purpose. I also claim that 'all optimizations' is rather vague, since it is relative to an unspecified baseline of what constitutes 'no optimizations'.
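The 'opt-0' spelling comes from PEP 488, which encodes the optimization level in the names of cached bytecode files; a minimal sketch:

```python
import importlib.util

# PEP 488: the optimization level is part of the .pyc filename, so
# bytecode compiled at different levels can coexist in __pycache__.
path = importlib.util.cache_from_source('spam.py', optimization=0)
print(path.endswith('.opt-0.pyc'))  # True

# The default level carries no tag at all.
default = importlib.util.cache_from_source('spam.py', optimization='')
print('.opt-' in default)  # False
```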

--
Terry Jan Reedy



_______________________________________________
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev