So, the official ways to read a Readable are:

1. The old style: 
r.on('data', function (data) { /* consume data */ });
r.on('end', function () { /* stream is done */ });

This works and will be supported (forever?).
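
For completeness, here is a minimal, self-contained version of that pattern; 
the fs source, the variable name r and the file path are only placeholders 
for whatever readable you actually have:

var fs = require('fs');

// any readable will do; a file stream is just an easy example
var r = fs.createReadStream('input.txt');

r.on('data', function (chunk) {
  // chunk is a Buffer (or a string, if you called r.setEncoding)
  console.log('got %d bytes', chunk.length);
});

r.on('end', function () {
  console.log('no more data');
});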

2. The new, kind-of-C-but-not-really style: 
var existingChunk = rs.read();
//do something with existingChunk
rs.on('readable', function () {
    var chunk = rs.read();
    // do something with chunk
});

This feels very uncomfortable to me. You can poll all the time, and the 
'readable' event looks like an invitation to call read(), but at first it 
isn't, because it's only emitted after you've called read() at least once 
and it returned null, right? That's too many 'ifs' for me, and it looks 
like bad API design. Why is it designed this way? More intuitive would be:
- attach to 'readable'
- if there is data already, the listener is called as soon as it's attached 
(the 'newListener' event, right?)
- in the listener you can/should consume via read() as long as the result 
isn't null (better undefined, which suits some edge cases in object mode 
better)
- the 'end' event is emitted when the stream is done (see the sketch below)
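
For reference, the pattern I end up with today looks roughly like this; a 
minimal sketch, assuming rs is the same readable as in the snippet above, 
draining the buffered data in a loop until read() returns null:

rs.on('readable', function () {
  var chunk;
  // drain everything that is currently buffered
  while (null !== (chunk = rs.read())) {
    // do something with chunk
  }
});

rs.on('end', function () {
  // stream is done
});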

3. The new old way: piping to a consumer stream: 
r.pipe(new CustomWritable());

This looks pretty to me and hides away the gotchas of 2, but the data-flow 
control is indirect: it's handled only by calling the callback in _write 
and by tweaking the watermarks. So if you want finer control, e.g. 
unshifting chunks, you probably have to stick with 2, right?
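
For illustration, a minimal sketch of what such a consumer might look like; 
CustomWritable is just the hypothetical class name from above, written in 
the node 0.10 style, and the highWaterMark value is only an example:

var stream = require('stream');
var util = require('util');

function CustomWritable(options) {
  stream.Writable.call(this, options); // options.highWaterMark tunes the buffering
}
util.inherits(CustomWritable, stream.Writable);

CustomWritable.prototype._write = function (chunk, encoding, callback) {
  // consume the chunk here ...
  // calling the callback tells the pipe we're ready for the next chunk;
  // delaying it is what applies backpressure to the source
  callback();
};

r.pipe(new CustomWritable({highWaterMark: 16 * 1024}));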

Trying to resolve my new confusion about Readable.

Greetings,

greelgorke 



On Tuesday, May 7, 2013 at 3:49:57 AM UTC+2, James Hartig wrote:
>
> Just to update this thread for future readers...
> Talked to Isaac on Twitter and he said that "readable" will be fired on 
> the first chunk after creating a Readable stream. But before you add the 
> "readable" listener, you can call read() just to be sure there isn't 
> already data.
>
> So:
> var existingChunk = rs.read();
> //do something with existingChunk
> rs.on('readable', function () {
>     var chunk = rs.read();
>     // do something with chunk
> });
>
> On Monday, May 6, 2013 5:45:09 PM UTC-4, James Hartig wrote:
>>
>> > The 'readable' event fires as soon as *any* data is added to the 
>> > internal buffer, but only if a previous read() call returned null.  If 
>> > you never got a null read, then you haven't exhausted the buffer, so 
>> > there's no need to emit 'readable', since presumably you already know 
>> > it's readable.
>>
>> Is needReadable set to true when you create the socket or do I have to 
>> call read() right after making the socket?
>> I looked through the ReadableStream code and couldn't find a path that 
>> sets it to true. I'm using new fs.ReadStream(null, {fd: 4}).
>>
>> On Monday, May 6, 2013 4:59:39 PM UTC-4, Isaac Schlueter wrote:
>>>
>>> > Basically the loop is because the "readable" event doesn't fire until 
>>> > the buffer is filled up and if you want to get data immediately, then you 
>>> > can't rely on "readable"? 
>>>
>>> The 'readable' event fires as soon as *any* data is added to the 
>>> internal buffer, but only if a previous read() call returned null.  If 
>>> you never got a null read, then you haven't exhausted the buffer, so 
>>> there's no need to emit 'readable', since presumably you already know 
>>> it's readable. 
>>>
>>> > It would seem (from the docs) that read() without any limit returns 
>>> > the whole buffer, so how would there be more data the next time you call 
>>> > it? 
>>>
>>> The length of the returned data from read() is implementation-defined. 
>>>  In objectMode streams, it'll always be one "thing", but in 
>>> binary/string streams it can be any amount of data. 
>>>
>>> If you're using the stream.Readable base class, then yes, read() will 
>>> always return the full buffer, *unless* you're piping, in which case, 
>>> it returns the top chunk in the list, so as to avoid an unnecessary 
>>> copy in the case where there's more than one chunk ready. 
>>>
>>>
>>> On Mon, May 6, 2013 at 11:10 AM, James Hartig <faste...@gmail.com> 
>>> wrote: 
>>> > Sorry to be late to the party... 
>>> > 
>>> > Basically the loop is because the "readable" event doesn't fire until the 
>>> > buffer is filled up and if you want to get data immediately, then you can't 
>>> > rely on "readable"? 
>>> > 
>>> > It would seem (from the docs) that read() without any limit returns the 
>>> > whole buffer, so how would there be more data the next time you call it? 
>>> > 
>>> > 
>>> > On Sunday, April 14, 2013 4:57:49 PM UTC-4, Jorge wrote: 
>>> >> 
>>> >> On 30/03/2013, at 00:56, Isaac Schlueter wrote: 
>>> >> 
>>> >> > ```javascript 
>>> >> > var chunk; 
>>> >> > while (null !== (chunk = rs.read())) { 
>>> >> >  doSomething(chunk); 
>>> >> > } 
>>> >> > ``` 
>>> >> 
>>> >> I used to write code like that too, but it seems it might break. Look: 
>>> >> 
>>> >> <https://bugs.webkit.org/show_bug.cgi?id=114594> 
>>> >> 
>>> >> this works: 
>>> >> 
>>> >> function works (s) { 
>>> >>   var pos; 
>>> >>   var n= 0; 
>>> >>   var t; 
>>> >>   var r= ""; 
>>> >>   var o= "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"; 
>>> >>   var p= "5678901234nopqrstuvwxyzabcdefghijklmNOPQRSTUVWXYZABCDEFGHIJKLM"; 
>>> >>   while (n < s.length) { 
>>> >>     t= s[n]; 
>>> >>     pos= o.indexOf(t); 
>>> >>     r+= (pos >= 0) ? p[pos] : t; 
>>> >>     n++; 
>>> >>   } 
>>> >>   return r; 
>>> >> } 
>>> >> 
>>> >> this doesn't: 
>>> >> 
>>> >> function fails (s) { 
>>> >>   var pos, n = 0, 
>>> >>     t, r = "", 
>>> >>     o = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ", 
>>> >>     p = "5678901234nopqrstuvwxyzabcdefghijklmNOPQRSTUVWXYZABCDEFGHIJKLM", 
>>> >>   while (n < s.length) { 
>>> >>     r += ((pos = o.indexOf(t = s[n++])) >= 0) ? p[pos] : t; 
>>> >>   } 
>>> >>   return r; 
>>> >> } 
>>> >> 
>>> >> -- 
>>> >> ( Jorge )(); 
>>> > 
>>>
>>
