Steven D'Aprano <steve+comp.lang.pyt...@pearwood.info>:

> You're arguing whether or not in the following line of code:
>
>     spam = "abcd" "efgh"
>     # implicitly concatenated to "abcdefgh" at compile time
>
> the right hand side pair of strings counts as a single token or two? Am I
> right, or am I missing something?
>
> If that's all it is, why don't you just run the tokenizer over it and
> see what it says?
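Running the tokenizer is easy to do from Python itself. A quick sketch using the standard-library `tokenize` and `ast` modules (the `src` string here is just the example line from above): the tokenizer reports two separate STRING tokens, while the parser folds them into a single constant.

```python
import ast
import io
import tokenize

src = 'spam = "abcd" "efgh"\n'

# The lexical analyzer sees two distinct STRING tokens.
string_tokens = [
    tok.string
    for tok in tokenize.generate_tokens(io.StringIO(src).readline)
    if tok.type == tokenize.STRING
]
print(string_tokens)  # ['"abcd"', '"efgh"']

# The parser, however, merges them into one constant node.
tree = ast.parse('"abcd" "efgh"', mode="eval")
print(ast.dump(tree.body))  # Constant(value='abcdefgh')
```

So both answers to Steven's question have something going for them: two tokens at the lexical level, one value by the time the parser is done.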
Now, someone *could* write a tokenizer that took care of string
concatenation on the spot--as long as it dealt with comments as well:

    ("abc"  # hello
     "def")

It would even be possible to write a parser that didn't have a separate
lexical analyzer at all.

Arguing about terminology is pretty useless. Both sides in this fight
are correct, namely:

 * string literal concatenation is part of expression syntax

 * what goes on inside an atom stays inside an atom

For example, this expression is illegal:

    "abc" ("def")

Marko
-- 
https://mail.python.org/mailman/listinfo/python-list
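Both points above can be checked directly in CPython (a quick sketch; the exact error message is CPython-specific): inside a parenthesized atom a comment between the pieces is harmless, but across an atom boundary no concatenation happens at all.

```python
# Inside the parenthesized atom, the comment is skipped and the two
# string literals are concatenated as usual.
value = eval('("abc"  # hello\n"def")')
print(value)  # abcdef

# Across an atom boundary there is no concatenation: "abc" ("def")
# parses as a call expression (a string being "called"), so it fails
# when evaluated rather than producing "abcdef".
try:
    eval('"abc" ("def")')
except TypeError as exc:
    print(exc)  # in CPython: 'str' object is not callable
```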