Hi,

I am in the process of deciding specs for a crawling machine and a
searching machine (two machines), which will support merging/indexing
and searching operations on a single Lucene index that may scale to
several million pages (at which point the index would be roughly
2-10 GB, assuming it grows linearly with the number of pages).
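
Just to make that sizing assumption concrete, here is the back-of-envelope
calculation I'm working from (the 5 million pages and ~1 KB of index per
page are only illustrative assumptions, not measurements):

public class IndexSizeEstimate {
    public static void main(String[] args) {
        // Assumes the index grows linearly with page count.
        long pages = 5000000L;        // "several million" pages (assumed)
        long bytesPerPage = 1024L;    // assumed average index bytes per page
        double gb = (pages * bytesPerPage) / (1024.0 * 1024.0 * 1024.0);
        System.out.printf("Estimated index size: ~%.1f GB%n", gb);
    }
}

That works out to roughly 5 GB, i.e. within the 2-10 GB range above.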

What is the range of hardware that I should be looking at? Could
anyone share their deployment/hardware specs for a comparably large
index? I'm mainly interested in RAM and CPU considerations.

Also, what is the preferred platform? As I understand it, Java has a
maximum heap allocation of about 4 GB on Solaris and 2 GB on Linux.
Does it make sense to get more RAM than that?
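
For reference, this is the kind of minimal check I would run to see what
heap ceiling a given platform actually grants (MaxHeapCheck is just an
illustrative name; started with an explicit -Xmx, e.g.
"java -Xmx2g MaxHeapCheck"):

public class MaxHeapCheck {
    public static void main(String[] args) {
        // Reports the maximum heap this JVM instance will grow to,
        // as requested via -Xmx and capped by the platform/JVM build.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}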

Thanks!

CW
