Jared Grubb added the comment: CPython allows \ at EOF, but tokenize does not.
>>> s = 'print 1\\\n'
>>> exec s
1
>>> tokenize.tokenize(StringIO(s).readline)
1,0-1,5:	NAME	'print'
1,6-1,7:	NUMBER	'1'
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/tokenize.py", line 153, in tokenize
    tokenize_loop(readline, tokeneater)
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/tokenize.py", line 159, in tokenize_loop
    for token_info in generate_tokens(readline):
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/tokenize.py", line 283, in generate_tokens
    raise TokenError, ("EOF in multi-line statement", (lnum, 0))
tokenize.TokenError: ('EOF in multi-line statement', (2, 0))

__________________________________
Tracker <[EMAIL PROTECTED]>
<http://bugs.python.org/issue2180>
__________________________________