Hello me,
Have you tried shlex.py? It is a tokenizer for writing lexical
parsers.
Should be a breeze to whip something up with it.
An example of tokenizing:
py>import shlex
py># fake an open record
py>import cStringIO
py>myfakeRecord = cStringIO.StringIO()
py>myfakeRecord.write("['1','2'] \n 'fdfdfdfd' \n 'dfdfdfdfd' ['1','2']\n")
py>myfakeRecord.seek(0)
py>lexer = shlex.shlex(myfakeRecord)

py>lexer.get_token()
'['
py>lexer.get_token()
'1'
py>lexer.get_token()
','
py>lexer.get_token()
'2'
py>lexer.get_token()
']'
py>lexer.get_token()
'fdfdfdfd'

You can do a lot with it; that is just a teaser.
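For instance, a sketch of what you can tune (assuming the stdlib shlex module's documented `posix` parameter and `whitespace_split` attribute; the sample line is just the fake record from above, passed as a plain string since shlex.shlex accepts strings as well as file-like objects):

```python
import shlex

# The same record line as above, as a plain string.
line = "['1','2'] 'fdfdfdfd' 'dfdfdfdfd'"

# Default mode: punctuation chars become their own tokens and
# quoted words keep their quotes, as in the session above.
tokens = list(shlex.shlex(line))
print(tokens)
# ['[', "'1'", ',', "'2'", ']', "'fdfdfdfd'", "'dfdfdfdfd'"]

# POSIX mode plus whitespace_split breaks only on whitespace
# and strips the quotes; shlex.split() is a shortcut for this.
fields = shlex.split(line)
print(fields)
# ['[1,2]', 'fdfdfdfd', 'dfdfdfdfd']
```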
M.E.Farmer

-- 
http://mail.python.org/mailman/listinfo/python-list