Fields with stored=false are stored though

2016-11-04 Thread Reinhard Budenstecher
I'm using Solr 6.2.1. The schema is static (schema.xml) and some fields look like and so on. But when querying in the web browser GUI I can see that these fields are stored anyway and their values are returned on query. How can this happen? Looking into the web schema browser I can see fields with
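The inline field definitions were stripped by the list archive, so the following is only an illustrative sketch (field names and types are assumptions, not from the thread). One common way a field with stored="false" still returns values in Solr 6 is docValues: with schema version 1.6, the useDocValuesAsStored attribute defaults to true, so docValues fields appear in query results unless it is switched off:

```xml
<!-- Illustrative schema.xml fragment; names/types are assumed, not from the thread -->
<field name="price" type="tlong" indexed="true" stored="false" docValues="true"/>

<!-- To keep a stored=false docValues field out of results, disable it explicitly -->
<field name="price_hidden" type="tlong" indexed="true" stored="false"
       docValues="true" useDocValuesAsStored="false"/>
```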

Re: Re: Config for massive inserts into Solr master

2016-10-12 Thread Reinhard Budenstecher
> That is not correct as of version 4.0.
>
> The only kind of update I've run into that cannot proceed at the same
> time as an optimize is a deleteByQuery operation. If you do that, then
> it will block until the optimize is done, and I think it will also block
> any update you do after it.
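As a hedged sketch of the two operations being discussed (host, port, and core name are assumptions for illustration, not from the thread):

```
# Kick off a (potentially long-running) optimize on the master
curl 'http://localhost:8983/solr/mycore/update?optimize=true'

# A deleteByQuery issued while the optimize runs will block until it finishes,
# and can hold up subsequent updates as well
curl 'http://localhost:8983/solr/mycore/update?commit=true' \
     -H 'Content-Type: application/json' \
     -d '{"delete":{"query":"category:obsolete"}}'
```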

Re: Re: Config for massive inserts into Solr master

2016-10-12 Thread Reinhard Budenstecher
> That's considerably larger than you initially indicated. In just one
> index, you've got almost 300 million docs taking up well over 200GB.
> About half of them have been deleted, but they are still there. Those
> deleted docs *DO* affect operation and memory usage.
>
> Getting rid of
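Deleted documents remain on disk (and in memory-mapped index files) until the segments containing them are merged away. A hedged sketch of reclaiming that space, assuming a local core named "mycore" (names are illustrative):

```
# Merge away segments' deleted documents without a full optimize
curl 'http://localhost:8983/solr/mycore/update?commit=true&expungeDeletes=true'

# Or force-merge the entire index down to one segment
# (very heavy I/O on an index of this size)
curl 'http://localhost:8983/solr/mycore/update?optimize=true&maxSegments=1'
```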

Re: Re: Re: Config for massive inserts into Solr master

2016-10-10 Thread Reinhard Budenstecher
> Just a sanity check. That directory mentioned, what kind of file system is
> that on? NFS, NAS, RAID?

I'm using Ext4 with options "noatime,nodiratime,barrier=0" on a hardware RAID10 with 4 SSD disks.
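Those mount options would look like this in /etc/fstab (device path and mount point are assumptions for illustration). Note that barrier=0 disables ext4 write barriers, which risks index corruption on power loss unless the RAID controller has a battery-backed write cache — and the original post mentions the controller has no BBU:

```
# Illustrative /etc/fstab entry; device and mount point are assumed
/dev/sda1  /var/solr  ext4  noatime,nodiratime,barrier=0  0  2
```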

Re: Re: Config for massive inserts into Solr master

2016-10-10 Thread Reinhard Budenstecher
> What I have been hoping to see is the exact text of an OutOfMemoryError
> in solr.log so I can tell whether it's happening because of heap space
> or some other problem, like stack space. The stacktrace on such an
> error might be helpful too.

Hi, I did understand what you need, I'm

Re: Re: Config for massive inserts into Solr master

2016-10-09 Thread Reinhard Budenstecher
> That's considerably larger than you initially indicated. In just one
> index, you've got almost 300 million docs taking up well over 200GB.
> About half of them have been deleted, but they are still there. Those
> deleted docs *DO* affect operation and memory usage.

Yes, that's larger

Re: Re: Config for massive inserts into Solr master

2016-10-09 Thread Reinhard Budenstecher
> What version of Solr? How has it been installed and started?

Solr 6.2.1 on Debian Jessie, installed with:

apt-get install openjdk-8-jre-headless openjdk-8-jdk-headless
wget "http://www.eu.apache.org/dist/lucene/solr/6.2.1/solr-6.2.1.tgz" && tar xvfz solr-*.tgz

Config for massive inserts into Solr master

2016-10-09 Thread Reinhard Budenstecher
Hello, I'm not a pro in Solr or Java, so please be patient. We have an ecommerce application with 150 million docs and an index size of 140GB in Solr. We are using the following setup:

Solr "MASTER":
- DELL R530, 1x XEON E5-1650
- 64GB ECC RAM
- 4x 480GB SSD as RAID10 on hardware RAID (but no BBU