Martin Panter added the comment:

I’m not sure I understand your questions, Brett (Which tokenizer? What 
rebuilding?), but I will try to explain some of the relevant parts.

My main goal is to add a makefile dependency of parsetok.o on $(GRAMMAR_H). In 
Python 3, that is easy, and (I realize now) my problem would be solved there 
without changing any other file. However, my changes in parsetok.c eliminate an 
unnecessary bootstrapping problem, so I still think they are worthwhile.
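
For concreteness, the extra dependency is essentially a one-line addition. A 
minimal sketch, reusing the existing $(GRAMMAR_H) makefile variable (which 
names Include/graminit.h):

# Rebuild parsetok.o whenever the generated grammar header changes.
Parser/parsetok.o: $(GRAMMAR_H)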

For Python 3, the parsetok.c code gets compiled to parsetok_pgen.o for the 
pgen program, and then, after pgen has executed, it also gets compiled to a 
separate parsetok.o file. Rebuilding the code is not a problem here.

In Python 2, the same parsetok.o object is part of both the pgen program and 
the eventual Python program. In order to add my desired makefile dependency, I 
have to duplicate the parsetok.c compilation into two makefile rules 
(parsetok.o and parsetok_pgen.o). So you could say I am breaking a circle _by_ 
rebuilding the code.
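
A rough sketch of what that duplication could look like, using generic $(CC) 
and $(CFLAGS) placeholders rather than the exact compile variables from 
Makefile.pre.in:

# Built for the pgen program, before graminit.h exists, so this rule
# must not depend on $(GRAMMAR_H).
Parser/parsetok_pgen.o: Parser/parsetok.c
	$(CC) -c $(CFLAGS) -o Parser/parsetok_pgen.o Parser/parsetok.c

# Built after pgen has regenerated the grammar header; this is the copy
# that gets linked into python.
Parser/parsetok.o: Parser/parsetok.c $(GRAMMAR_H)
	$(CC) -c $(CFLAGS) -o Parser/parsetok.o Parser/parsetok.c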

Dependencies between files with my patch applied:

Parser/parsetok_pgen.o -> Parser/pgen -> Include/graminit.h -> 
Parser/parsetok.o -> python

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue4347>
_______________________________________