jsdom has terrible performance, so consider using "cheerio" instead.
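A minimal sketch of what that swap could look like, assuming the page is fetched with the "request" module (use whatever HTTP client you already have) and then parsed with cheerio's jQuery-like API instead of a full jsdom window:

    // Fetch a page and inspect it with cheerio instead of jsdom.
    var request = require('request');
    var cheerio = require('cheerio');

    request('http://example.com/', function (err, res, body) {
      if (err) throw err;

      // cheerio.load() parses the HTML and returns a jQuery-like API,
      // without building a full DOM or executing scripts.
      var $ = cheerio.load(body);

      $('a').each(function () {
        console.log($(this).attr('href'), $(this).text());
      });
    });

Because cheerio skips script execution and the full DOM implementation, parsing heavy pages is much cheaper than with jsdom.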
As for the CPU-intensiveness, you have plenty of options. The easiest is to increase the number of Node.js processes with the "cluster" module (see the sketch below the quoted message). The better approach is to write a separate Node.js server that does all the blocking work, so your frontend is just a bridge between that server and the user and stays responsive at all times.

On Friday, May 24, 2013 11:17:49 AM UTC+4, Tamil selvan R.S wrote:
>
> Hi,
> This is a post to get some suggestions on DOM manipulation with JSDOM in Node.js.
> We have written a service that crawls a given URL on the fly and feeds it to JSDOM so we can inspect the DOM with jQuery.
> We see that this service consumes 90% of the CPU [expected]. The URLs scraped are quite heavy in their HTML content, and we handle around ~100 req/sec.
> As the task is quite CPU-intensive [and not really async either], we find that when request concurrency increases, our service goes unresponsive.
> Are we missing something?
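A minimal sketch of the cluster approach: fork one worker per CPU core so the jsdom/parsing work is spread across processes instead of blocking a single event loop. The HTTP handler here is just a placeholder for the existing scraping service.

    var cluster = require('cluster');
    var http = require('http');
    var numCPUs = require('os').cpus().length;

    if (cluster.isMaster) {
      // Fork one worker per core and restart any that die.
      for (var i = 0; i < numCPUs; i++) {
        cluster.fork();
      }
      cluster.on('exit', function (worker) {
        console.log('worker ' + worker.process.pid + ' died, restarting');
        cluster.fork();
      });
    } else {
      // Each worker runs its own server; the master distributes
      // incoming connections between workers.
      http.createServer(function (req, res) {
        // ...do the CPU-heavy scraping/DOM work here...
        res.end('ok');
      }).listen(8000);
    }

This only multiplies throughput by the number of cores; if the load keeps growing, the separate worker server mentioned above is the more scalable route.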
