I'd use some kind of withReadable(stream, fn) helper
var bytes = stream.read(2);
doAsync(bytes, function (err, res) {
  withReadable(stream, function () {
    // stream is readable for sure
  });
});

function withReadable(stream, callback) {
  stream.once('readable', callback);
  stream.read(0);
}
On
var drop = false;
doRequestA(handleRequest);
doRequestB(handleRequest);
doRequestC(handleRequest);

function handleRequest(err, response) {
  if (err) {
    return; // ignore failures
  }
  if (drop) {
    return; // a response was already handled
  }
  drop = true; // first successful response wins
  doSomethingWithResponse(response);
}
OR
var requests = [];
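A synchronous sketch of the drop-flag approach above; the simulated callbacks are stand-ins for the doRequestA/B/C calls:

```javascript
var drop = false;
var handled = [];

function handleRequest(err, response) {
  if (err) return;  // ignore failures entirely
  if (drop) return; // a response was already taken
  drop = true;      // first successful response wins
  handled.push(response);
}

// Simulate three callbacks arriving; the first one errored:
handleRequest(new Error('timeout'), null);
handleRequest(null, 'response B');
handleRequest(null, 'response C');
// handled is now ['response B']
```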
`close` event is not part of the stream API. It's generally used by
_readable_ streams that may close some underlying resources, but it's
optional.
On Mon, Jul 1, 2013 at 10:24 PM, Mark Hahn m...@hahnca.com wrote:
I'd wait for the close event to make sure that the file has fully
finished
This looks interesting, but not applicable to my case since I don't want to
require harmony (yet).
On 01/07/2013, at 21.13, tjholowaychuk tjholoway...@gmail.com wrote:
This is what I resorted to:
https://github.com/visionmedia/co/blob/master/examples/streams.js#L42, there
might be a nicer
http://blog.schwink.net/2011/01/http-chunks-and-onreadystatechange.html?m=1
This is how you would do it client-side; there may be a way to do it in a
similar style server-side.
Scott
On 2 July 2013 09:40:33 Gil Pedersen gp...@gpost.dk wrote:
This looks interesting, but not applicable to my case
On Tue, Jul 2, 2013 at 7:53 AM, Rusty Conover write.to.ru...@gmail.com wrote:
Hi,
I have a stream that has many listeners waiting for it to drain before more
data will be written. When it fires a drain event, the first listener who
processes the drain event will generally cause the stream to
On Jul 2, 2013, at 03:47, Stephen Vickers wrote:
On Tuesday, 2 July 2013 01:06:49 UTC+1, ryandesign wrote:
On Jul 1, 2013, at 14:41, Stephen Vickers wrote:
I've just published raw-socket 1.1.8 to npm.
This change does break existing code - hence this announcement - but the
newer
Thanks, this looks like a somewhat interesting pattern. Looking at it, things
quickly get complicated:
- I need to loop, so I'll wrap it in a method, calling it recursively.
- I don't want to call doAsync when bytes === null.
- I can't kick when bytes === null, since I get an infinite recursive loop.
Hi everyone,
Node.js is gaining some nice momentum in New Zealand recently and
we're seeing more and more companies and startups tinker, play and use
Node.js in production. So it's time I re-announced the Node.js NZ
mailing list [1] and also point out that two new MeetUp groups have
recently
I found a somewhat simple pattern that seems to work for me:
function processSequential(r, processFn) {
  function getNext() {
    var res = r.read();
    if (res)
      processFn(res, getNext);
    else
      r.once('readable', getNext);
  }
  getNext();
}
And calling it:
processSequential(r,
Yes, this is basically a parser. I think it's the way to go
You could check out the one I wrote here: http://npmjs.org/package/parser, but
it only handles text streams.
On 2 July 2013 12:32, Gil Pedersen gp...@gpost.dk wrote:
I found a somewhat simple pattern that seems to work for me:
function
On Mon, Jul 01, 2013 at 03:05:20AM -0700, Сергей Коротков wrote:
We saw strata prior to starting of our work and still considering it for
implementation because it should give us persistent indexes, and that should
greatly speed up initial database load time. I believe that selection of
Thanks for the try, but it didn't help. We have found a temporary hotfix,
however the real issue seems to be a bug in the incremental GC itself.
Here is the full documentation of the issue:
https://github.com/LearnBoost/mongoose/issues/1565
Oleg Slobodskoi
twitter.com/oleg008
We really need a standard for Line Delimited JSON.
Please have a look at: http://en.wikipedia.org/wiki/Line_Delimited_JSON and
comment on the talk page.
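Line-delimited JSON is straightforward to consume incrementally. A hedged sketch (the helper name is made up) that buffers partial lines across chunks and parses each complete line:

```javascript
// Minimal line-delimited JSON consumer sketch; illustrative only.
function makeLdjsonParser(onObject) {
  var buffered = '';
  return function write(chunk) {
    buffered += chunk;
    var lines = buffered.split('\n');
    buffered = lines.pop(); // last element is an incomplete line (or '')
    lines.forEach(function (line) {
      if (line.trim()) onObject(JSON.parse(line));
    });
  };
}

var objects = [];
var write = makeLdjsonParser(function (obj) { objects.push(obj); });
write('{"a":1}\n{"b"'); // second object split across chunks
write(':2}\n');
// objects is now [{ a: 1 }, { b: 2 }]
```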
--
--
Job Board: http://jobs.nodejs.org/
Posting guidelines:
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You
*Hello,*
I am looking to stream audio files to clients using WebRTC and Node.js.
It should also be possible to seek in the files.
*Background*
Essentially I'd like to create a 'pseudo radio' where long prerecorded
non-stop DJ mixes are streamed to the clients. (no, I don't want to use
On Tuesday, July 2, 2013 3:59:48 AM UTC-4, Raynos wrote:
`close` event is not part of the stream API. It's generally used by
_readable_ streams that may close some underlying resources, but it's
optional.
Actually, that is incorrect:
Before we go into the merits of such functionality, what is a
plausible multiple writers/single reader scenario? The only one I can
come up is logging, where you probably don't care about the exact
order of writes.
By the way, this 'thundering writers' issue seems like it's
It's not really a parser, but likely usable as part of a stream parser.
I have converted my pattern to a new module, available here:
https://github.com/kanongil/node-streamprocess
In addition to the basic processing, I have added an error handler, which will
defer until processing of all
The Fedora Node.js Special Interest Group is delighted to announce the
release of Fedora 19 (Schrödinger's Cat), the first Fedora release to
include node right from the start. Open the box and take a look for
yourself!
*What's Fedora?*
Fedora is a leading-edge, free and open source operating
Is it time to start using ES6 in Node.js libraries hosted on NPM? Or is ES6
going to be the new CoffeeScript?
--
Alan Gutierrez ~ @bigeasy
My rule is to not use anything that's behind a flag in libraries. But I do
use things like generators in my applications that consume libraries. I
also use generators in my examples and README's because it's so much
cleaner.
I've been using continuables heavily in place of callback-last APIs
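For readers unfamiliar with the term: a continuable is a function that returns a function taking a single node-style callback. A minimal illustrative sketch, not any particular library's API:

```javascript
// Instead of fn(a, b, cb), a continuable-style API is fn(a, b)(cb).
function add(a, b) {
  return function (cb) { // the continuable
    cb(null, a + b);
  };
}

var result;
add(2, 3)(function (err, sum) {
  if (err) throw err;
  result = sum; // 5
});
```

The appeal is that the continuable is a value: you can pass it around, compose it, or attach the callback later.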
http://blog.strongloop.com/new-videos-working-with-node-js-and-the-buffer-api/
If you missed the San Francisco Node.js Meetup at Mozilla’s offices last
month, have no fear, the videos are here. These are technical videos for a
technical audience and are a bit lengthy.
Video #1: Forrest Norvell
I have come to the conclusion that the best way to deal with node streams is
not to deal with them at all.
Wrap them as soon as possible, do not pass them around, do not
build abstractions on top of them.
It is a super complected thing, and capable of what? You can't even do
`send(file)`. Just can't.
There
Hi,
minify seems to be broken on npm :-
C:\npm i minify
npm http GET https://registry.npmjs.org/minify
npm http 304 https://registry.npmjs.org/minify
npm http GET https://registry.npmjs.org/uglify-js/2.2.5
npm http GET https://registry.npmjs.org/clean-css/1.0.1
npm http GET
I didn't notice that `'close'` was added to the stream interface. Nice.
On Tue, Jul 2, 2013 at 6:31 AM, Rusty Conover write.to.ru...@gmail.comwrote:
On Tuesday, July 2, 2013 3:59:48 AM UTC-4, Raynos wrote:
`close` event is not part of the stream API. It's generally used by
_readable_
On Sun, Jun 30, 2013 at 11:35 AM, Jonahss jona...@gmail.com wrote:
I know this error was previously ignored before 0.10, but I'd like to know
what's causing this and how I can fix it/debug it?
Is it a problem on our end (server) closing connections too soon?
Wireshark the connection, its
C:\node minify
module.js:340
throw err;
^
Error: Cannot find module 'C:\Users\Aaron\Projects\MoshierEphemeris\MoshierEphem
eris\minify'
at Function.Module._resolveFilename (module.js:338:15)
at Function.Module._load (module.js:280:25)
at Function.Module.runMain
Unless the ES6 spec for a feature is complete and the v8 implementation is
well tested, it won't, at the least, be going into core. For example, callbacks
called through generators cannot be inlined, which leads to significant
performance degradation. As far as personal projects go, feel free to
On July 2, 2013 at 3:00:59 PM, Aaron Gray (aaronngray.li...@gmail.com) wrote:
C:\node minify
module.js:340
throw err;
^
Error: Cannot find module
'C:\Users\Aaron\Projects\MoshierEphemeris\MoshierEphem
eris\minify'
at Function.Module._resolveFilename (module.js:338:15)
at
Sorry, disregard this - I don't know what I am doing!
For ES6 features I apply the library / application line.
I avoid using something complex in library implementations, that includes
promises, generators, flow control, web frameworks. This is mainly to make
individual libraries as lean as possible, you can still use dependencies to
hide
My simple stream API:
stream.read(len, cb)
stream.write(data, cb)
`read` returns its result through `cb`, just like fs.read
`len` is an optional parameter for read. If you omit it, read just returns
the next chunk.
`read` returns null if you are positioned at the end of stream.
`write`
I'm trying to create a wrapper around child_process.spawn but I'm having
trouble understanding streams2. I know how to do this with the old streams.
Suppose I have a function BlackBox with an external API that looks like
this: (|s are pipes)
inputStream | BlackBox | outputStream
but as
Hi,
I have some decryption code in python that I need ported to node. I cannot
seem to get the decryption right and I am going crazy.
My python code is:
from Crypto.Cipher import AES
mode = AES.MODE_ECB
secret = '9kL8yb/3Tu2czOr5qfiGPgJmx25s+T15'
cipher = AES.new(secret, mode)
DecodeAES =
Here's one way to do it: https://github.com/isaacs/truncating-stream
npm install truncating-stream
var Trunc = require('truncating-stream')
var t = new Trunc({ limit: 100 })
Now no matter how much you t.write() only 100 bytes will come out.
Of course, the thing you're piping your reader INTO
Hi All,
Google search could not find anything about the issue I am after, so I
decided to post here. I have a Node.js server where socket.io works as a
mediator between the client and the server. On the server side I am using
spawn = require('child_process').spawn;
to execute a (shell) command,