Just for comparative reference, here is a generators implementation (
https://gist.github.com/Raynos/56da201abbb50992ad06 )
I refactored the weird part into a `unique` function.
Things that are worth noting are:
- how easy `if` statements are with generators
- you can replace 90% of the
You could also just use Function.bind.
On Aug 21, 2013 6:40 AM, Mike Chen mikalora...@gmail.com wrote:
Nevermind, solved it, had the this wrapped in another function.
Thanks,
-mike
On Wednesday, August 21, 2013 1:30:58 AM UTC-4, Mike Chen wrote:
Hi,
I am doing a bit of code
Same as above.
A callback is meant to be called back after some operations are done. In
node, that means it's probably going to get called after the current stack
has unwound. If you call the callback synchronously instead, it behaves
differently from what callers expect.
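To make the difference concrete, here's a small sketch (the caching function is hypothetical) of how a sometimes-synchronous callback changes the observable ordering:

```javascript
// A hypothetical getter that calls back synchronously on a cache hit
// and asynchronously on a miss -- callers see two different orderings.
var cache = {};

function inconsistentGet(key, cb) {
  if (cache[key]) {
    cb(null, cache[key]);            // cache hit: fires before we return
  } else {
    process.nextTick(function () {   // cache miss: fires after the stack unwinds
      cache[key] = 'value';
      cb(null, cache[key]);
    });
  }
}

var order = [];
inconsistentGet('a', function () { order.push('callback'); });
order.push('after call');
// First (miss) call: order ends up ['after call', 'callback'].
// A second (hit) call would give ['callback', 'after call'] --
// callers can't rely on either ordering.
```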
process.nextTick in most cases executes the
Thanks for the reply.
However, this is the same answer as what I keep coming across that doesn't
really answer my question. What is the downside? I want to see a real-world
example of what happens when I don't follow this contract. I've yet to see an
example of something bad happening in
How do you mean describe? It is described as a class in the documentation:
http://nodejs.org/api/fs.html#fs_class_fs_readstream
On Aug 20, 2013, at 11:06, Dennis Leukhin stone...@gmail.com wrote:
If the constructor is meant, it needs to be described as a
constructor, and not learning
I've recently created something with the same intentions (say goodbye to
the callback hell), but simpler and based on Fibers.
https://github.com/luciotato/waitfor
Teaser...
What if... Fibers and wait.for were part of node core?
then you can deprecate almost half the functions at:
Hello,
I wonder if you can help me. I am new to nodejs and I am trying to write
a little application that will talk to a server written in C (the server
is third-party). The server requires XML messages with a binary string
for the length of the message being sent. The string is meant to
represent
I recall that in the early versions of 0.6.x the default was to compile v8 for 32
bits, and that led to a heap limit of 1.4GB. In 64-bit v8, I know of no
restrictions on heap size.
Have you considered using Buffer objects? Buffers exist outside of v8 and can
hold a gigabyte each.
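A quick sketch of the idea — Buffer memory is allocated outside v8's heap, so a large allocation doesn't count against the heap limit (the size here is arbitrary):

```javascript
// Allocate 64 MB outside the v8 heap (2013-era Buffer constructor).
var big = new Buffer(64 * 1024 * 1024);
big.fill(0);              // zero it; Buffers were uninitialized by default back then
console.log(big.length);  // 67108864 bytes, none of it on the v8 heap
```

On current Node versions the same allocation is spelled `Buffer.alloc(size)`, which also zero-fills for you.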
-Edmond
On Aug
You could do this:
client.on('message', doSomething.bind(client));
function doSomething(args) {
  // do something with args, then re-emit on the bound client
  this.emit('someEvent'); // emit() needs an event name; 'someEvent' is illustrative
}
2013/8/21 Brian Di Palma off...@gmail.com
You could also just use Function.bind.
On Aug 21, 2013 6:40 AM, Mike Chen mikalora...@gmail.com wrote:
Nevermind,
'this' will be bound to the EventEmitter instance. This behaviour is
implemented around
here: https://github.com/joyent/node/blob/master/lib/events.js#L100.
So, the following code works as expected (prints called!).
var SomeClass = function () {
};
util.inherits(SomeClass, EventEmitter);
var
On Wed, Aug 21, 2013 at 2:12 AM, Neil Kandalgaonkar ne...@neilk.net wrote:
On Monday, August 19, 2013 9:50:50 PM UTC-7, Ben Noordhuis wrote:
On Tue, Aug 20, 2013 at 3:36 AM, Neil Kandalgaonkar ne...@neilk.net
wrote:
I'm running node inside a Linux VM, on my Mac OS laptop. I'd like to
debug
Hi!
I'd suggest you try using [Buffer][0] for binary data and send your line
either in chunks:
socket.write('string');
socket.write(binary_buffer);
socket.write('rest');
Or in one buffer (if your server expects data to be in one chunk) by
writing strings and binary data into one buffer,
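For the single-chunk case, something like this (the payload bytes and tag names are made up for illustration):

```javascript
// Combine string parts and a binary payload into one Buffer,
// then write it to the socket as a single chunk.
var binaryPayload = new Buffer([0x00, 0x00, 0x01, 0x2c]); // e.g. a 4-byte length prefix
var message = Buffer.concat([
  new Buffer('<msg>', 'utf8'),
  binaryPayload,
  new Buffer('</msg>', 'utf8')
]);
// socket.write(message);   // one write, one chunk on the wire
console.log(message.length); // 15
```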
On Wed, Aug 21, 2013 at 5:04 AM, Edmond Meinfelder
edmond.meinfel...@gmail.com wrote:
In 64-bit v8, I know of no restrictions on heap size.
It's about 1.9 GB.
--
Job Board: http://jobs.nodejs.org/
Posting guidelines:
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
https://groups.google.com/d/msg/nodejs/FNsM6Ns1MkE/4Ys-l1RI0Q0J
On Wed, Aug 21, 2013 at 4:19 AM, Bryan Donovan brdono...@gmail.com wrote:
Thanks for the reply.
However, this is the same answer as what I keep coming across that doesn't
really answer my question. What is the downside? I
This may be a stupid question, but how about distributing the workload
to multiple node processes?
This should sidestep the memory barrier and also happens to leverage
all cores. IPC is pretty straightforward with node
Maybe I'm missing something obvious, but hunting around the various docs
hasn't revealed a way to get all modules by author.
All I'm looking for is module names, and the input requirements are
flexible.
-Schoon
P.S. Is posting disabled in the NPM mailing list for shmoes like me? I
thought that
The npm web site has pages for users, which list their published modules.
For example, https://npmjs.org/~scott.gonzalez
On Wed, Aug 21, 2013 at 12:46 PM, Michael Schoonmaker
michael.r.schoonma...@gmail.com wrote:
Maybe I'm missing something obvious, but hunting around the various docs
Apparently you can also use /browse/author/ to just see the list of
modules: https://npmjs.org/browse/author/tjholowaychuk
On Wed, Aug 21, 2013 at 12:51 PM, Scott González
scott.gonza...@gmail.com wrote:
The npm web site has pages for users, which list their published modules.
For example,
The npm list is pretty dead, better to ask on github.
On Wednesday, August 21, 2013, Michael Schoonmaker wrote:
Maybe I'm missing something obvious, but hunting around the various docs
hasn't revealed a way to get all modules by author.
All I'm looking for is module names, and the input
If you're looking for a way that a person can look at the list of packages
a specific author has published, the npmjs.org site can show you, as others
have mentioned.
If you're looking for a programmatic way to get a list of packages
published by a specific author, you can use this:
$ curl
Thanks Scott, that thread was very helpful.
I think I get it now, but it does seem to me that executing the callback
synchronously in the particular ways I've been doing it is ok. If I were
following a convention like:
function connect(cb) {
process.nextTick(cb);
return {
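Filled out, that convention might look like this (the returned handle's contents are hypothetical — the original snippet is truncated):

```javascript
var seq = [];

function connect(cb) {
  process.nextTick(cb);   // always defer: callers get a consistent async contract
  return {
    close: function () {} // illustrative handle; not from the original
  };
}

var handle = connect(function () { seq.push('callback'); });
seq.push('returned');
// seq is always ['returned', 'callback'], even when connect()
// could have completed synchronously.
```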
From a little bit of black-box testing, I've deduced that transform (and
maybe duplex) streams will automatically unpipe themselves from their
source when they emit an error event. Is this behavior intended and normal,
and if it is intended, why isn't it documented?
2013.08.21, Version 0.11.6 (Unstable)
* uv: Upgrade to v0.11.8
* v8: upgrade v8 to 3.20.14.1
* build: disable SSLv2 by default (Ben Noordhuis)
* build: don't auto-destroy existing configuration (Ben Noordhuis)
* crypto: add TLS 1.1 and 1.2 to secureProtocol list (Matthias Bartelmeß)
*
One of the many great reasons to run a front end proxy like nginx is
exactly this.
On Wed, Aug 21, 2013 at 4:41 PM, Jose Luis Rivas ghostba...@gmail.com wrote:
Hey guys,
I have site.com which is running Express. Is there a way to have
site.com/subfolder as a different app? Has anyone any
app = express();
your_other_express_app = express();
// ...
app.use('/subfolder', your_other_express_app);
--
Diogo Resende
On Wednesday, August 21, 2013 at 21:41 , Jose Luis Rivas wrote:
Hey guys,
I have site.com which is running Express. Is there a way to
have
This isn't pertinent to core node.js, but I'm hoping someone might be able to
give me a hand. I want to accomplish the equivalent of tar -cf output.tar -C
/tmp/whatever . with Isaac's fstream[1] and tar[2] modules. Unfortunately I
find the documentation really lacking; maybe I'm just missing
You might want to take a look at the npm code that unpacks package
tarballs. The key function is gunzTarPerm, here:
https://github.com/isaacs/npm/blob/master/lib/utils/tar.js#L191
Note that extractEntry gets to futz with each entry as it goes by:
Example with an ugly trick :
https://gist.github.com/kapouer/6301360
Jérémy.
On 22/08/2013 01:10, Martin Cooper wrote:
You might want to take a look at the npm code that unpacks package tarballs.
The key function is gunzTarPerm, here:
2013.08.21, Version 0.10.17 (Stable)
* uv: Upgrade v0.10.14
* http_parser: Do not accept PUN/GEM methods as PUT/GET (Chris Dickinson)
* tls: fix assertion when ssl is destroyed at read (Fedor Indutny)
* stream: Throw on 'error' if listeners removed (isaacs)
* dgram: fix assertion on bad
Well, using buffers is not a good choice for my use case, because in my
project the bottleneck is not the data itself but the data index. I use
v8 objects as a sort of in-memory db.
If I have to use buffers or manipulate the index/hash myself, then I'm likely
to do this in C++ or with Redis. But
I got this error message:
WebSocket connection to 'ws://75.98.171.173:8080/' failed: Error during
WebSocket handshake: Sec-WebSocket-Protocol mismatch
Can anyone please tell me what went wrong? I double-checked everything and
still can't figure out the cause of the problem. I've used
Hi,
My node.js program crashed and I saw the following in the log:
FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory
I reduced the original program to the following tiny self-contained program:
=
...
function foo1()
{
var someBadURI =
Impressive changelog. Congrats team. This reminds me a bit of the early
days of fun.
On Wed, Aug 21, 2013 at 3:30 PM, Timothy J Fontaine tjfonta...@gmail.com wrote:
2013.08.21, Version 0.11.6 (Unstable)
* uv: Upgrade to v0.11.8
* v8: upgrade v8 to 3.20.14.1
* build: disable SSLv2 by
If all you need is a hash table, there are a lot of hash table C/C++
libraries that you could integrate with Node (like Judy:
http://judy.sourceforge.net/), and there are a few that have already been
integrated, like https://npmjs.org/package/hashtable (that specifically
lives outside of v8's
You're never ending / destroying / disposing any of the responses, so
they're just piling up inside the callback closures. setTimeout is kind of
a red herring in this case; you could replicate the same effect faster with
process.nextTick or just having the callback call foo1 directly.
F
On Wed,
as well as quite a few others on npm: https://npmjs.org/search?q=hashtable.
Ok, maybe not quite a few -- most of the other ones are implemented in
Javascript. The only C++ ones I could find were
- https://npmjs.org/package/hashtable (
https://github.com/isaacbwagner/node-hashtable)