[Bug 127023] Re: tokenize.py uses N_TOKENS without prior definition

2008-08-07 Thread Scott Kitterman
Apport no longer uses tokenize, so I think this is fine.

-- 
tokenize.py uses N_TOKENS without prior definition
https://bugs.launchpad.net/bugs/127023
You received this bug notification because you are a member of Ubuntu Bugs, which is subscribed to Ubuntu.
-- 
ubuntu-bugs mailing list

[Bug 127023] Re: tokenize.py uses N_TOKENS without prior definition

2008-08-06 Thread Connor Imes
We are closing this bug report because it lacks the information we need to investigate the problem, as described in the previous comments. Please reopen it if you can give us the missing information, and don't hesitate to submit bug reports in the future. To reopen the bug report you can click on

[Bug 127023] Re: tokenize.py uses N_TOKENS without prior definition

2008-02-05 Thread Martin Pitt
It seems the Python interpreter got royally confused here (apport itself merely imports some Python standard libraries; there is nothing I could fix there). Scott, did you ever get this again? If not, are you OK with blaming sun rays and just ignoring this?

** Changed in: python2.5 (Ubuntu)

[Bug 127023] Re: tokenize.py uses N_TOKENS without prior definition

2008-02-05 Thread Scott Kitterman
I never got it again. I get the impression from doko's comment that he had an idea what the issue was. Up to you how you want to deal with it.

[Bug 127023] Re: tokenize.py uses N_TOKENS without prior definition

2007-09-06 Thread Scott Kitterman
** Changed in: apport (Ubuntu)
   Status: Incomplete => Confirmed

[Bug 127023] Re: tokenize.py uses N_TOKENS without prior definition

2007-08-14 Thread Matthias Klose
no, it's not a bug; see the line before in tokenize.py:

    from token import *

$ python2.5
Python 2.5.1 (r251:54863, Jul 17 2007, 16:02:11)
[GCC 4.1.3 20070629 (prerelease) (Ubuntu 4.1.2-13ubuntu2)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import tokenize
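The point Matthias makes can be checked directly in any stock CPython: `token.py` defines N_TOKENS as a module-level constant, and because `tokenize.py` begins with `from token import *`, the name is already bound in tokenize's namespace before tokenize ever uses it. A minimal sketch (exact integer values of the token constants vary between Python versions, so only the names are checked here):

```python
import token
import tokenize

# token.py defines N_TOKENS as a plain integer constant.
print(isinstance(token.N_TOKENS, int))     # True

# tokenize.py does "from token import *", so the same name is
# visible through the tokenize module without a local definition.
print(isinstance(tokenize.N_TOKENS, int))  # True
```

If N_TOKENS were genuinely undefined, the original traceback would have been a NameError raised on every `import tokenize`, not an intermittent failure on one machine, which is why the report pointed toward interpreter-state corruption rather than a bug in tokenize.py itself.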