Thanks for the quick reply.  It's helpful to know that this isn't a
known limitation or a new-user error.  For my purposes, I've
determined that the lexer part is working fine, and I've cobbled
together a simple script that does the job based on that.
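
In case it helps anyone else, the script is roughly along these lines.
This is a simplified sketch, not yply's real ylex.py -- the pattern
names and the no-nested-braces shortcut are my own -- but it emits
(type, value, lineno, lexpos) tuples in the same shape as the ylex
output quoted below:

```python
import re

# Sketch only: tokenize the rules section of a yacc grammar with plain
# re, producing (type, value, lineno, lexpos) tuples like ylex does.
TOKEN_RE = re.compile(r"""
      (?P<ID>[A-Za-z_][A-Za-z0-9_]*)   # nonterminal / terminal names
    | (?P<CODE>\{[^{}]*\})             # action blocks (naive: no nested braces)
    | (?P<COLON>:)
    | (?P<BAR>\|)
    | (?P<SEMI>;)
""", re.VERBOSE)

def tokenize(text):
    tokens = []
    for m in TOKEN_RE.finditer(text):
        kind, value = m.lastgroup, m.group()
        if kind in ("COLON", "BAR", "SEMI"):
            kind = value                 # report punctuation by its literal symbol
        lineno = text.count("\n", 0, m.start()) + 1
        tokens.append((kind, value, lineno, m.start()))
    return tokens

for tok in tokenize("prefix : PREFIX filename { prefix_push($2); } ;"):
    print(tok)
```

Good enough for my purposes, though a real version would need to
handle nested braces and comments inside actions.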

I'll see if I can find out what's happening in yply, but I'm
relatively new to parsing techniques, so it might be a while before I
work it out!
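
One thing I'll look at first, since it matches the "regex that's
consuming more than it should" idea below: a greedy pattern for action
blocks would silently swallow everything between the first '{' and the
last '}'.  A tiny stdlib illustration of the difference -- nothing to
do with yply's actual rules:

```python
import re

text = "{ first } rule_in_between { last }"

# Greedy: .* runs to the LAST closing brace, swallowing the rule in between.
greedy = re.search(r"\{.*\}", text).group()
# Non-greedy: .*? stops at the first closing brace.
lazy = re.search(r"\{.*?\}", text).group()

print(greedy)  # { first } rule_in_between { last }
print(lazy)    # { first }
```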

Cheers,

David.

On 23 Nov, 00:45, David Beazley <[EMAIL PROTECTED]> wrote:
> It is extremely unlikely that this is due to a size limitation, as I'm not
> aware of anything like that built into PLY or yply.  However, yply has
> always been a bit on the experimental side.  It's quite likely that this is
> some kind of processing bug in yply that I'll need to track down.  Maybe
> there's a bad state transition or a regex that's consuming more than it
> should.  Or maybe the grammar for yply itself is broken--stopping when it's
> only seen the first part of the file.  I'm probably not going to be able to
> track this down right this second, but if you're willing to do some digging
> and can find the bug, I'll certainly fix it in PLY 2.6.
>
> Cheers,
> Dave
>
> On Sat 22/11/08  5:36 PM , David [EMAIL PROTECTED] sent:
>
> > I have an existing yacc definition for a language I'm trying to parse,
> > and I was hoping to use yply to get me started.
>
> > However, yply seems to stop part-way through the file - and I can't
> > see why.  Things I've tried:
>
> > 1.  Examining the output of ylex.py.  This seems to read the file OK,
> > and at the point where yply stops it is still generating the correct
> > tokens: I can't see any difference between the tokens before the output
> > ends and those afterwards.
>
> > (ID,'rule',251,7668)
> > (:,':',251,7672)
> > (ID,'COMPILE_WITH',252,7675)
> > (ID,'stringvalue',252,7688)
> > (CODE,'{ $$ = $2; }',252,7711)
> > (|,'|',252,7713)
> > (CODE,'{ $$ = NULL; }',253,7743)
> > (;,';',253,7744)
> > (ID,'prefix',255,7747)
> > (:,':',255,7753)
> > (ID,'PREFIX',256,7756)
> > (ID,'filename',256,7763)
> > (CODE,'{ prefix_push($2); }',256,7793)
> > (|,'|',256,7795)
> > (ID,'PREFIX',257,7798)
> > (CODE,'{ prefix_pop(); }',257,7824)
> > (;,';',257,7825)
>
> > ## Output ends here!
>
> > (ID,'dev_defs',262,7871)
> > (:,':',262,7879)
> > (ID,'dev_defs',263,7882)
> > (ID,'dev_def',263,7891)
> > (|,'|',263,7899)
> > (ID,'dev_defs',264,7902)
> > (ID,'ENDFILE',264,7911)
> > (CODE,'{ enddefs(); checkfiles(); }',264,7947)
> > (|,'|',264,7949)
> > (;,';',265,7963)
>
> > 2.  Deleting a few rows from the original gram.y: the output then
> > continues further, but only by roughly the extent of the deleted
> > lines.
>
> > It feels like it's hitting some sort of buffer/memory limitation
> > somewhere, but having inspected yply and yparse, I can't see where.
> > Has anyone else had this sort of problem?  Any hints on how to solve
> > it?
>
> > The file I'm looking to parse is gram.y from the kernel configuration
> > tool in NetBSD.  See link below
>
> > http://cvsweb.netbsd.org/bsdweb.cgi/src/usr.bin/config/gram.y?rev=1.17&content-type=text/x-cvsweb-markup&only_with_tag=MAIN
>
> > Any pointers greatly appreciated.
>
> > David.
>
> > PS - I'm using PLY 2.5 on Python 2.5.2 on Ubuntu 8.10.
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"ply-hack" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/ply-hack?hl=en