Edsko de Vries wrote:

> On Fri, Mar 02, 2007 at 12:55:29PM +0100, [EMAIL PROTECTED] wrote:
>> On Friday 02 March 2007 12:30, Edsko de Vries wrote:
>>> Hi,
>>>
>>> Is there any way I can make Bison ignore tokens when it can't deal with
>>> them? I'll try to explain. For every completely blank line in the input,
>>> my lexer generates a NOP token. In some situations the parser can deal
>>> with this NOP token; in others it can't.
>>
>> If you need to ignore the blank lines, just make sure your scanner
>> doesn't return a token to the parser.
>
> Well, no, that's the point. Blank lines should be recorded as NOPs where
> possible, so that we know where they are and can unparse them. For
> example, if the user writes
>
>     $a = 5;
>     $b = 6;
>
>     foo($a);
>     bar($b);
>
> (I'm working on phc, which works on PHP), we want to unparse it like
> that, and not like
>
>     $a = 5;
>     $b = 6;
>     foo($a);
>     bar($b);
>
> By inserting the NOP at the blank line, we can avoid this.
>
> Edsko
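For the questioner's original approach (keeping the NOP tokens), one option is to let the grammar itself accept a NOP anywhere a statement may appear, so positions that cannot take one simply never see it. A hedged sketch of such a grammar fragment (token and helper names are hypothetical, not from phc):

    /* Hypothetical Bison fragment: T_NOP is allowed only in
     * statement position, so a blank line between statements
     * becomes a recordable AST node, while other positions
     * are unaffected. */
    statement_list
        : /* empty */
        | statement_list statement
        ;

    statement
        : real_statement
        | T_NOP      { $$ = make_nop_node(); /* hypothetical helper */ }
        ;

This only works where the grammar has a natural "slot" for the NOP; the counting approach below avoids that restriction entirely.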
To me this seems like the same problem as preserving comments in a
parse-process-generate system (which is a pain in the ass). You could
have the lexer tally empty lines and set that count as a property of the
semantic value of the next "real" token. Something like (pseudocode):

foo.l:

    ^\n        ++empty_lines;
    MYTOKEN    {
                   yylval = create_ast_node(t_MYTOKEN, yytext);
                   yylval->empty_lines = empty_lines;
                   empty_lines = 0;
                   return t_MYTOKEN;
               }

foo.y: nothing special -- the empty lines are never seen by the parser at all.

generation:

    void generate_ast_node(FILE* out, ast_node* n)
    {
        while (n->empty_lines-- > 0)
            fputc('\n', out);
        ...  /* other "unparsing" steps */
    }

_______________________________________________
help-bison@gnu.org
http://lists.gnu.org/mailman/listinfo/help-bison