Bugs item #1447633, was opened at 2006-03-10 23:55
Message generated for change (Tracker Item Submitted) made by Item Submitter
You can respond by visiting: 
https://sourceforge.net/tracker/?func=detail&atid=105470&aid=1447633&group_id=5470

Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: Python Library
Group: Python 2.4
Status: Open
Resolution: None
Priority: 5
Submitted By: Edward C. Jones (edcjones)
Assigned to: Nobody/Anonymous (nobody)
Summary: "reindent.py" exposes bug in tokenize

Initial Comment:
I use up-to-date Debian unstable (i386 port) on a PC with an AMD Athlon 64
3500+ chip. I compile my own copy of Python, which I keep in /usr/local.

Here is a small Python program called "fixnames.py":

#! /usr/bin/env python

"""Rename files that contain unpleasant characters.

Modify this code as needed.
"""
import os, sys, optparse

usage = 'USAGE: ./fixnames.py [-h] <filelist>'
parser = optparse.OptionParser(usage=usage)
options, args = parser.parse_args()
if len(args) != 1:
    parser.print_help()
    sys.exit('an argument is required'))

# The input is a list of files to be renamed.
for name in open(args[0]), 'r'):
    # Modify these as needed.
    newname = name.replace(' ', '_')
    newname = newname.replace('@', '_at_')
    newname = newname.replace('%20', '_')
    newname = newname.replace("'", '')
    os.rename(name, newname)

If I run

python /usr/local/src/Python-2.4.2/Tools/scripts/reindent.py fixnames.py

I get
Traceback (most recent call last):
  File "/usr/local/src/Python-2.4.2/Tools/scripts/reindent.py", line 293, in ?
    main()
  File "/usr/local/src/Python-2.4.2/Tools/scripts/reindent.py", line 83, in main
    check(arg)
  File "/usr/local/src/Python-2.4.2/Tools/scripts/reindent.py", line 108, in 
check
    if r.run():
  File "/usr/local/src/Python-2.4.2/Tools/scripts/reindent.py", line 166, in run
    tokenize.tokenize(self.getline, self.tokeneater)
  File "/usr/local/lib/python2.4/tokenize.py", line 153, in tokenize
    tokenize_loop(readline, tokeneater)
  File "/usr/local/lib/python2.4/tokenize.py", line 159, in tokenize_loop
    for token_info in generate_tokens(readline):
  File "/usr/local/lib/python2.4/tokenize.py", line 236, in generate_tokens
    raise TokenError, ("EOF in multi-line statement", (lnum, 0))
tokenize.TokenError: ('EOF in multi-line statement', (24, 0))
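
As far as I can tell, the TokenError comes from the stray closing
parentheses in fixnames.py (after sys.exit('an argument is required') and
after open(args[0])): tokenize keeps a running bracket count, and when
that count is still non-zero at end of file it raises this generic error
instead of pointing at the offending line. A minimal sketch (my own test
input, not part of the report) that reproduces the same exception under
Python 2.x:

import tokenize
from StringIO import StringIO

# One stray closing parenthesis leaves tokenize's bracket count
# unbalanced, so end-of-file is reported as a "multi-line statement".
src = "sys.exit('oops'))\n"

def eat(toktype, tok, start, end, line):
    pass

tokenize.tokenize(StringIO(src).readline, eat)
# tokenize.TokenError: ('EOF in multi-line statement', (2, 0))

Whether the right fix is a clearer message from tokenize or a guard in
reindent.py is a separate question; a caller-side guard, sketched only
against the structure visible in the traceback (not the actual reindent.py
source, and with illustrative names), might look roughly like:

import tokenize

def check_file(reindenter, name):
    # Hypothetical wrapper around the Reindenter.run() call seen in the
    # traceback above.
    try:
        return reindenter.run()
    except tokenize.TokenError, msg:
        print "%s: tokenize failed: %s" % (name, msg)
        return False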



----------------------------------------------------------------------

You can respond by visiting: 
https://sourceforge.net/tracker/?func=detail&atid=105470&aid=1447633&group_id=5470