I have to scrape thousands of different websites as fast as possible.
With a single node process I can fetch about 10 urls per second, but if
I fork the task across 10 worker processes I can reach 64 reqs/sec.

Why is that?
Why am I limited to 10 reqs/sec in a single process, and why do I have
to spawn workers to reach 64 reqs/sec?
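
For context, here is a stripped-down sketch of what the scraper does.
The url list, worker count, and timing code are placeholders, not my
real code; run it with WORKERS = 1 for the single-process case and
WORKERS = 10 for the forked case.

    // scrape-sketch.js -- simplified sketch, not the actual scraper
    var http = require('http');
    var cluster = require('cluster');

    var URLS = [
      'http://host-0001.example/',
      'http://host-0002.example/'
      // ... thousands more, each on a unique host
    ];
    var WORKERS = 10;

    // fire off every url in the list and report a rate when all finish
    function fetchAll(urls) {
      var remaining = urls.length;
      var start = Date.now();
      urls.forEach(function (url) {
        http.get(url, function (res) {
          res.on('data', function () {});   // drain the body
          res.on('end', finish);
        }).on('error', finish);             // count failures so we still terminate
      });
      function finish() {
        if (--remaining === 0) {
          console.log(urls.length / ((Date.now() - start) / 1000) + ' reqs/sec');
        }
      }
    }

    if (WORKERS > 1 && cluster.isMaster) {
      // forked mode: master just spawns the workers
      for (var i = 0; i < WORKERS; i++) cluster.fork();
    } else if (WORKERS > 1) {
      // each worker takes every Nth url (worker ids start at 1)
      var id = cluster.worker.id - 1;
      fetchAll(URLS.filter(function (u, idx) { return idx % WORKERS === id; }));
    } else {
      // single-process mode
      fetchAll(URLS);
    }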

- I am not hitting the max sockets per host limit (agent.maxSockets):
all urls are on unique hosts.
- I am not hitting the max file descriptors limit (as far as I can
tell): my ulimit -n is 2560, and lsof shows the scraper never uses more
than 20 file descriptors (the snippet after this list shows roughly how
I checked both).
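
A sketch of those checks (the filename is a placeholder; the numbers in
the comments are what I see during my runs, with lsof watched from a
separate shell):

    // limits-check.js -- sketch of how I ruled out the two limits above
    var http = require('http');

    // per-host socket cap on the default agent; since every url is on a
    // different host, this cap shouldn't bind even at a small value
    console.log('agent.maxSockets per host:', http.globalAgent.maxSockets);

    // pid to feed `lsof -p <pid>` from another shell; during a run the
    // descriptor count never goes above ~20, well under ulimit -n (2560)
    console.log('pid:', process.pid);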

Is there any limit I don't know about? I am on Mac OS X.
