Hello,

Assume I only have a dual-core server with a limited 1 GB of memory, and I
want to build a web robot to crawl 1000 pre-defined web sites.

Can anyone suggest a basic strategy for this task?

Should I create 1000 sessions at the same time, to achieve the maximum
network throughput?
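
To make the question concrete, below is a rough sketch of the alternative
I'm weighing: a bounded worker pool instead of 1000 simultaneous
connections. It only uses the Python standard library; the URL list and
worker count are placeholders, not tuned values.

    # Minimal sketch: crawl a fixed URL list with a bounded thread pool
    # instead of opening 1000 connections at once.
    from concurrent.futures import ThreadPoolExecutor, as_completed
    import urllib.request

    SITES = ["https://example.com/"]  # stand-in for the 1000 pre-defined sites
    MAX_WORKERS = 20                  # placeholder pool size, not a tuned value

    def fetch(url):
        # Fetch one page; a real crawler would also honor robots.txt,
        # throttle per host, and persist the response to disk.
        with urllib.request.urlopen(url, timeout=10) as resp:
            return url, resp.read()

    def crawl(urls):
        with ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
            futures = [pool.submit(fetch, u) for u in urls]
            for fut in as_completed(futures):
                try:
                    url, body = fut.result()
                    print(f"{url}: {len(body)} bytes")
                except Exception as exc:
                    print(f"failed: {exc}")

    if __name__ == "__main__":
        crawl(SITES)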

Thanks.
