It's not really a parser, but it is likely usable as part of a stream parser.

I have converted my pattern to a new module, available here: 
https://github.com/kanongil/node-streamprocess

In addition to the basic processing, I have added an error handler, which defers 
the error until processing of all buffered data is complete.
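For anyone curious how the deferral works in principle, here is a minimal sketch 
of the idea (illustrative only, not the module's actual source): stash the 
'error' and only report it once every chunk that was already buffered has been 
handed to the process function.

function processWithDeferredError(stream, processFn, done) {
  var pendingError = null;

  stream.on('error', function (err) {
    pendingError = err;
    stream.emit('readable');           // wake the loop if it is parked below
  });

  (function next() {
    var chunk = stream.read();
    if (chunk !== null)
      return processFn(chunk, next);   // process, then look for more data
    if (pendingError)
      return done(pendingError);       // buffer drained: surface the error now
    stream.once('readable', next);     // wait for more data
  })();
}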
Hopefully someone will find it usable. I know I will. :-)

On 02/07/2013, at 12.43, Florent JABY <florent.j...@gmail.com> wrote:

> Yes, this is basically a parser. I think it's the way to go.
> You could check out the one I wrote here: http://npmjs.org/package/parser, but it 
> only handles text streams.
> 
> 
> On 2 July 2013 12:32, Gil Pedersen <gp...@gpost.dk> wrote:
> I found a somewhat simple pattern that seems to work for me:
> 
> function processSequential(r, processFn) {
>   function getNext() {
>     var res = r.read();
>     if (res !== null)
>       processFn(res, getNext);        // process this chunk, then pull the next
>     else
>       r.once('readable', getNext);    // nothing buffered; wait for more data
>   }
>   getNext();
> }
> 
> And calling it:
> 
> processSequential(r, function(buf, cb) {
>   // do stuff with buf (possibly async)
>   cb();   // signal completion so the next chunk gets read
> });
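> For example, fed from a file stream (the filename and the fake async work are 
> just for illustration):
> 
> var fs = require('fs');
> 
> var r = fs.createReadStream('input.dat');
> 
> processSequential(r, function(buf, cb) {
>   // stand-in for real async work on each chunk
>   setTimeout(function() {
>     console.log('processed %d bytes', buf.length);
>     cb();
>   }, 10);
> });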
> 
> I'm thinking about putting this into an npm module.
> 
> On 02/07/2013, at 11.38, Gil Pedersen <gp...@gpost.dk> wrote:
> 
>> Thanks, this looks like a somewhat interesting pattern. Looking at it, 
>> things quickly get complicated:
>> 
>> - I need to loop, so I'll wrap it in a method, calling it recursively.
>> - I don't want to call doAsync when bytes === null.
>> - I can't "kick" when bytes === null, since I get an infinite recursive loop.
>> - Depending on internal stream state, kicking will not always trigger a 
>>   'readable' emit, leading to a stall.
>> 
>> I really wish this was simpler :-/
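>> To make those pitfalls concrete, this is roughly the looping wrapper I mean 
>> (a sketch only, reusing the doAsync/withReadable names from the snippet below):
>> 
>> function loop(stream) {
>>   var bytes = stream.read(2);
>>   if (bytes === null) {
>>     // "Kicking" here is the troublesome part: read(0) can make 'readable'
>>     // fire again right away with still no data (infinite recursion), or,
>>     // depending on internal stream state, not fire at all (stall).
>>     return withReadable(stream, function () { loop(stream); });
>>   }
>>   doAsync(bytes, function (err, res) {
>>     loop(stream);   // back to step 1 for the next 2 bytes
>>   });
>> }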
>> 
>> On 02/07/2013, at 09.28, Floby <florent.j...@gmail.com> wrote:
>> 
>>> I'd use some kind of withReadable(stream, fn) helper
>>> 
>>> var bytes = stream.read(2);
>>> doAsync(bytes, function(err, res) {
>>>   withReadable(stream, function () {
>>>     // stream is readable for sure
>>>   });
>>> });
>>> 
>>> function withReadable(stream, callback) {
>>>   stream.once('readable', callback);
>>>   stream.read(0);   // "kick" the stream so it re-evaluates its buffer
>>> }
>>> 
>>> 
>>> On Monday, 1 July 2013 15:23:44 UTC+2, Gil Pedersen wrote:
>>> Hi, 
>>> 
>>> I have a case where I want to consume a Readable, by: 
>>> 
>>> 1. Read n bytes. 
>>> 2. Process buffer using a function with an async callback. 
>>> 3. When callback completes, goto 1. 
>>> 
>>> I fail to see any simple solution to what I feel must be a somewhat common 
>>> use case. The standard "on('readable', function() { this.read(); ... })" 
>>> doesn't apply, as I have no way of deferring future 'readable' emits once I 
>>> have returned from the handler. 
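>>> To illustrate (n and doAsync are placeholders), the naive handler would be 
>>> something like: 
>>> 
>>> stream.on('readable', function () { 
>>>   var buf = this.read(n); 
>>>   if (buf === null) return;          // fewer than n bytes buffered so far 
>>>   doAsync(buf, function (err, res) { 
>>>     // By the time this runs, further 'readable' events may already have 
>>>     // fired and pulled more data, so chunks end up being processed 
>>>     // concurrently instead of strictly one after another. 
>>>   }); 
>>> }); 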
>>> 
>>> Can it really be that the most sensible solution is to fall back to the old 
>>> API, using pause() and resume()? Or have I missed something? 
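>>> (With the old API I imagine something along these lines, although it would 
>>> still need manual buffering to hand out exactly n bytes at a time:) 
>>> 
>>> stream.on('data', function (chunk) { 
>>>   stream.pause();              // hold off further 'data' while we work 
>>>   doAsync(chunk, function (err, res) { 
>>>     stream.resume();           // ask for the next chunk 
>>>   }); 
>>> }); 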
>>> 
>>> Regards, 
>>> Gil
>> 
> 
> 
