Latest version is 3.4, and it is fairly compatible with 1.4.1, but you have to
reindex.
A first migration step can be to continue using your 1.4 schema on the new solr.war
(and SolrJ), but I suggest you spend a few hours upgrading your schema and
config as well.
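As a sketch of what the config upgrade involves (the exact value depends on the release you land on), Solr 3.x lets solrconfig.xml declare which Lucene compatibility level the index and analysis chain should target:

```xml
<!-- solrconfig.xml (Solr 3.x, illustrative): pins index/analysis
     back-compat behavior. Even with this set, moving up from 1.4.x
     still requires a full reindex. -->
<luceneMatchVersion>LUCENE_34</luceneMatchVersion>
```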
--
Jan Høydahl, search solution architect
On 10/10/2011 3:39 PM, Jan Høydahl wrote:
> Hi,
> If you have 4Gb on your server total, try giving about 1Gb to Solr, leaving 3Gb
> for OS, OS caching and mem-allocation outside the JVM.
> Also, add 'ulimit -v unlimited' and 'ulimit -s 10240' to /etc/profile to
> increase virtual memory and stack limit.
I will
Hi,
If you have 4Gb on your server total, try giving about 1Gb to Solr, leaving 3Gb
for OS, OS caching and mem-allocation outside the JVM.
Also, add 'ulimit -v unlimited' and 'ulimit -s 10240' to /etc/profile to
increase virtual memory and stack limit.
And you should also consider upgrading to a newer Solr version.
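The sizing and ulimit advice above could be applied roughly like this (the start command and heap size are illustrative, not exact settings from this thread):

```shell
# Illustrative: give the JVM ~1Gb heap out of 4Gb total,
# leaving the rest for the OS and its page cache.
java -Xms1g -Xmx1g -jar start.jar

# Appended to /etc/profile so all login shells get the limits:
ulimit -v unlimited   # virtual memory
ulimit -s 10240       # stack size in Kb
```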
On 10/07/2011 6:21 PM, Jan Høydahl wrote:
> Hi,
> What Solr version?
Solr Implementation Version: 1.4.1 955763M - mark - 2010-06-17 18:06:42.
It's running on a SUSE Linux VM.
> How often do you do commits, or do you use autocommit?
I had been doing commits every 100 documents (the entire set is about 3
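For reference, letting Solr commit automatically instead of committing every N documents is configured in solrconfig.xml; a sketch with illustrative thresholds (not recommendations for this setup):

```xml
<!-- solrconfig.xml: autocommit instead of client-side commits.
     Thresholds below are examples only. -->
<updateHandler class="solr.DirectUpdateHandler2">
  <autoCommit>
    <maxDocs>10000</maxDocs>   <!-- commit after this many added docs -->
    <maxTime>60000</maxTime>   <!-- or after this many milliseconds -->
  </autoCommit>
</updateHandler>
```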
Hi,
What Solr version?
How often do you do commits, or do you use autocommit?
What kind and size of docs?
Do you feed from a Java program? Where is the read timeout occurring? Can you
paste in some logs?
How much RAM on your server, and how much did you give to the JVM?
--
Jan Høydahl, search solution architect
I'm batching documents into Solr using Solr Cell with the 'stream.url'
parameter. Everything is working fine until I get about 5k documents
in, and then it starts issuing 'read timeout 500' errors on every document.
The sysadmin says there's plenty of CPU, memory, and no paging, so it
doesn't appear to be a resource problem.
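A stream.url request of the kind described goes to the ExtractingRequestHandler; a sketch of what each batch item looks like (host, core, document URL, and id are assumptions, not values from this thread):

```shell
# Illustrative: Solr Cell fetches the remote document itself and
# extracts/indexes it; literal.id supplies the unique key.
curl "http://localhost:8983/solr/update/extract?stream.url=http://example.com/doc1.pdf&literal.id=doc1"

# Commit separately (or via autocommit) rather than per document:
curl "http://localhost:8983/solr/update" \
  -H "Content-Type: text/xml" --data-binary "<commit/>"
```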