Steven D'Aprano wrote:

> Context is all gone, so I'm not sure that I remember what "it" is. I
> think it is the text that you're parsing.

Yes. I'm tokenizing today. Parsing comes after Christmas.

> TEXT = "placeholder"
>
> def parse():
>     while True:
>         token = get_next_token()  # looks at global TEXT
>         yield token

Classic, but I'm not going to go there (at least until I fail
otherwise). My tokenizer returns an array of Token objects. Each Token
includes the text from which it was created, its location in the
original text and, for something like CONSTANT_INTEGER, an intValue
data member.

> # Run as many independent parsers as I need:
> parser1 = parse(open("filename", "r").read())
> parser2 = parse(open("filename2", "r").read())
> parser3 = parse("some text")

Interesting approach, that. Could have a separate parser for each
statement. Hmmm. Maybe my tokenizer should return a list of arrays of
Tokens, one array per statement. Hmmm.

I'm thinking about an OO language implementation that would be very
easy to extend. Tentatively, I'll have Production objects, Statement
objects, etc. I've already got Tokens. Goal is a really simple
language for beginners. Decaf will be to Java as BASIC was to Fortran,
I hope.

-- 
http://mail.python.org/mailman/listinfo/python-list
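For what it's worth, the Token design described above (source text,
location, and an intValue for integer constants) plus the
"one array of Tokens per statement" idea might be sketched roughly
like this. Everything here is illustrative: the field names, the
`tokenize` function, and the newline-per-statement rule are my
assumptions, not Decaf's actual design.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Token:
    """Hypothetical Token: keeps the text it was made from, where it
    came from in the original source, and (for integer constants
    only) a parsed intValue."""
    kind: str                        # e.g. "IDENTIFIER", "CONSTANT_INTEGER"
    text: str                        # the slice of source this token covers
    line: int                        # 1-based line in the original text
    column: int                      # 0-based column of the first character
    intValue: Optional[int] = None   # set only for CONSTANT_INTEGER

def tokenize(source: str) -> list[list[Token]]:
    """Toy tokenizer: returns one list of Tokens per statement,
    treating each non-blank line as a statement and each
    whitespace-separated word as a token."""
    statements = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        tokens = []
        column = 0
        for word in line.split():
            column = line.index(word, column)  # locate word in the line
            if word.isdigit():
                tok = Token("CONSTANT_INTEGER", word, lineno, column,
                            intValue=int(word))
            else:
                tok = Token("IDENTIFIER", word, lineno, column)
            tokens.append(tok)
            column += len(word)
        if tokens:
            statements.append(tokens)
    return statements
```

With this shape, a per-statement parser could simply walk one inner
list at a time, which is the "separate parser for each statement"
idea above.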