Personally I've never seen any single node support 1.5B documents. I advise
biting the bullet and sharding. Even if you do get the simple keyword
search working, the first time you sort I expect it to blow up. Then you'll
try to facet and it'll blow up. Then you'll start using filter queries and
it'll blow up as well.
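For illustration, a minimal sketch of creating a sharded collection through the SolrCloud Collections API (this assumes Solr 4.x running in cloud mode; the host, collection name and shard/replica counts are placeholders, not a sizing recommendation for 1.5B documents):

  curl 'http://localhost:8983/solr/admin/collections?action=CREATE&name=bigcollection&numShards=8&replicationFactor=2'

Documents indexed against the collection are then hashed across the shards by the default router, so each node only has to sort and facet over its own slice.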
It really should be unlimited: this setting has nothing to do with how
much RAM is on the computer.
See http://blog.thetaphi.de/2012/07/use-lucenes-mmapdirectory-on-64bit.html
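For anyone following the thread, a quick sketch of checking and raising the limit, assuming a Linux host and a bash shell (raising it past a hard limit needs root or an entry in /etc/security/limits.conf):

  # run as the OS user that starts Solr
  ulimit -v              # should print "unlimited"
  ulimit -v unlimited    # raise the soft limit for this shell, then start Solr from it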
Mike McCandless
http://blog.mikemccandless.com
On Tue, Feb 26, 2013 at 12:18 PM, zqzuk wrote:
Hi,
sorry, I couldn't do this directly... the way I do this is by subscribing to a
cluster of computers in our organisation and sending the job with the required
memory. It gets randomly allocated to a node (a single server in the cluster)
once executed, and it is not possible to connect to that specific node.
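If you cannot log in to the node, one workaround is to print the limits from inside the job itself so they end up in the job's output log. A rough sketch, assuming the scheduler runs an ordinary shell script and that "java -jar start.jar" is how this Solr instance is launched (adjust to your actual start command):

  # at the top of the job script, before Solr starts
  ulimit -v      # the virtual memory limit the job actually runs under
  ulimit -a      # all limits, for good measure
  java -jar start.jar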
Could you check the virtual memory limit (ulimit -v, check this for the
operating system user that runs Solr)?
It should report "unlimited".
André
From: zqzuk [ziqizh...@hotmail.co.uk]
Sent: Tuesday, 26 February 2013 13:22
To: solr-user@lucene.apache