Roundup Robot added the comment:
New changeset 0f0e9b7d4f1d by Terry Jan Reedy in branch '2.7':
Issue #9974: When untokenizing, use row info to insert backslash+newline.
http://hg.python.org/cpython/rev/0f0e9b7d4f1d
New changeset 24b4cd5695d9 by Terry Jan Reedy in branch '3.3':
Issue #9974:
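The effect of the fix can be checked with a small roundtrip script (a sketch against a Python 3 tokenize that includes this changeset, not the committed patch itself):

```python
import io
import tokenize

# A backslash continuation: NUMBER '2' starts on row 2 with no
# intervening NEWLINE/NL token, which is what used to break untokenize.
source = "x = 1 + \\\n    2\n"
tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))
rebuilt = tokenize.untokenize(tokens)

# With the row-info fix, the rebuilt source re-tokenizes to the same
# token types and strings (exact whitespace is not guaranteed).
old = [(t.type, t.string) for t in tokens]
new = [(t.type, t.string)
       for t in tokenize.generate_tokens(io.StringIO(rebuilt).readline)]
```

Only token types and strings are compared; the trailing space before the backslash is not reconstructed.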
Terry J. Reedy added the comment:
I added 5-tuple mode to roundtrip() in #20750. I solved the ENDMARKER problem
by breaking out of the token loop if and when it appears. Reconstructing
trailing whitespace other than \n is hopeless. The roundtrip test currently
only tests equality of token types and strings.
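The approach can be sketched like this (my own simplified version, not the actual test_tokenize roundtrip() code):

```python
import io
import tokenize

def tokens_roundtrip(source):
    """Untokenize, then retokenize, dropping ENDMARKER first:
    ENDMARKER sits on its own pseudo-line, so keeping it can force a
    spurious continuation at EOF."""
    toks = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        if tok.type == tokenize.ENDMARKER:
            break
        toks.append(tok)
    rebuilt = tokenize.untokenize(toks)
    # Compare only token types and strings; column positions may differ.
    old = [(t.type, t.string)
           for t in tokenize.generate_tokens(io.StringIO(source).readline)]
    new = [(t.type, t.string)
           for t in tokenize.generate_tokens(io.StringIO(rebuilt).readline)]
    return old == new
```

Both sides are retokenized, so ENDMARKER reappears in each list and the comparison stays symmetric.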
Terry J. Reedy added the comment:
The \ continuation bug is one of many covered by #12691 and its patch, but this
came first and it focused on only this bug. With respect to this issue, the
code patches are basically the same; I will use tests to choose between them.
On #12691, Gareth notes
Changes by Terry J. Reedy tjre...@udel.edu:
--
assignee: -> terry.reedy
___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue9974
___
___
Terry J. Reedy added the comment:
One could argue that "The guarantee applies only to the token type and token
string as the spacing between tokens (column positions) may change." covers
merging of lines, but total elimination of needed whitespace is definitely a
bug.
--
nosy:
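The quoted guarantee is about 2-tuple ("compat") mode, where untokenize regenerates spacing itself; the effect can be seen directly (a Python 3 sketch):

```python
import io
import tokenize

source = "x  =  1\n"   # oddly spaced, but legal
pairs = [(t.type, t.string)
         for t in tokenize.generate_tokens(io.StringIO(source).readline)]

# 2-tuple mode: column positions are gone, so untokenize invents its own
# spacing -- the output text changes, but the token stream must survive.
compact = tokenize.untokenize(pairs)
again = [(t.type, t.string)
         for t in tokenize.generate_tokens(io.StringIO(compact).readline)]
```

The rebuilt text differs from the input, yet retokenizing it yields the same (type, string) pairs, which is exactly the guarantee's scope.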
Dwayne Litzenberger added the comment:
@amk: I'd appreciate it if you did. :)
I ran into this bug while writing some code that converts b"..." literals into
"..." literals in PyCrypto's setup.py script (for backward compatibility with
Python 2.5 and below).
--
nosy: +DLitz
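That kind of token-level rewrite can be sketched as follows (a hypothetical strip_bytes_prefix using Python 3 tokenize; the actual PyCrypto setup.py code differs and targeted Python 2):

```python
import io
import tokenize

def strip_bytes_prefix(source):
    """Drop the b prefix from bytes literals at the token level --
    an illustration of the transform described above, not the
    original setup.py code."""
    toks = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        if tok.type == tokenize.STRING and tok.string[:1] in ("b", "B"):
            # TokenInfo is a namedtuple, so _replace swaps the string
            # while keeping the position info untokenize relies on.
            tok = tok._replace(string=tok.string[1:])
        toks.append(tok)
    return tokenize.untokenize(toks)
```

Because only the token string shrinks while positions stay put, full-mode untokenize simply pads with spaces where needed.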
Changes by Meador Inge mead...@gmail.com:
--
nosy: +meador.inge
Changes by Simon Law sfl...@sfllaw.ca:
--
nosy: +sfllaw
Changes by Eric Snow ericsnowcurren...@gmail.com:
--
nosy: +eric.snow
A.M. Kuchling added the comment:
I looked at this a bit and made a revised version of the patch that doesn't add
any line continuations when the token is ENDMARKER. It works on the example
program and a few variations I tried, though I'm not convinced that it'll work
for all possible
Raymond Hettinger rhettin...@users.sourceforge.net added the comment:
My patch handles the described situation, albeit a bit poorly ...
Let us know when you've got a cleaned-up patch and have run the round-trip
tests on a broad selection of files.
For your test case, don't feel compelled to
Brian Bossé pen...@gmail.com added the comment:
Yup, that's related to ENDMARKER being tokenized to its own line, even if EOF
happens at the end of the last line of actual code. I don't know if anything
relies on that behavior so I can't really suggest changing it.
My patch handles the
nick caruso ngv...@gmail.com added the comment:
--
import StringIO
import tokenize

tokens = []

def fnord(*a):
    tokens.append(a)

# Note: the input string has no trailing newline.
tokenize.tokenize(StringIO.StringIO("a = 1").readline, fnord)
tokenize.untokenize(tokens)
--
Generates the
nick caruso ngv...@gmail.com added the comment:
Additionally, substituting "a=1\n" for "a=1" results in no assertion and
successful untokenizing to "a = 1\n"
--
Brian Bossé pen...@gmail.com added the comment:
No idea if I'm getting the patch format right here, but tally ho!
This is keyed from release27-maint
Index: Lib/tokenize.py
===
--- Lib/tokenize.py (revision 85136)
+++
Changes by Raymond Hettinger rhettin...@users.sourceforge.net:
--
assignee: -> rhettinger
nosy: +rhettinger
Kristján Valur Jónsson krist...@ccpgames.com added the comment:
Interesting, is that a separate defect of doctest?
--
New submission from Brian Bossé pen...@gmail.com:
Executing the following code against a py file which contains line
continuations generates an assert:
import tokenize
foofile = open(filename, "r")
tokenize.untokenize(list(tokenize.generate_tokens(foofile.readline)))
(note, the list() is
Changes by Kristján Valur Jónsson krist...@ccpgames.com:
--
nosy: +krisvale