Thanks to a lot of help, I've got the outer framework for my tokenizer
down to this:
for line_number, line in enumerate(text):
    output = ''
    for char_number, char in enumerate(line):
        output += char
        print 'At ' + str(line_number) + ', ' + str(char_number) + ': ' \
            + output,
Or do I?
My other tokenizers all worked with array indices. When I came to,
say, an open quote character, I passed control to the get_a_string()
method. That would locate the end of the string, push a string token
onto the stack, and then advance the char array index to point to the
next character in the line.
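Concretely, that looked roughly like this (a minimal sketch from
memory; the token tuple format and the list-as-stack are just
placeholders, only get_a_string() and the index handling match what I
described above):

def get_a_string(chars, i, tokens):
    # chars[i] is the opening quote; scan ahead for the matching close quote
    quote = chars[i]
    j = i + 1
    while j < len(chars) and chars[j] != quote:
        j += 1
    # push a string token onto the stack
    tokens.append(('STRING', ''.join(chars[i + 1:j])))
    # return the index of the character after the closing quote
    return j + 1

tokens = []
chars = list('x = "hello" + y')
i = get_a_string(chars, 4, tokens)   # tokens == [('STRING', 'hello')], i == 11

The caller would then resume its scan at the returned index.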
Is there a Pythonic design I'm overlooking?