OK, then let’s see the indexing code. Make sure you
1> don’t commit after every batch
2> never, never, never optimize.

BTW, you do not want to turn off commits entirely; some internal data 
structures grow between commits. So I might do something like specify 
commitWithin on my adds, set to something like 5 minutes.
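
For what it's worth, here's a rough SolrJ sketch of what I mean. The URL, 
collection name, and field names are just placeholders, adjust them to your setup:

import java.util.ArrayList;
import java.util.List;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class BatchIndexer {

    // commitWithin is in milliseconds; roughly 5 minutes here
    private static final int COMMIT_WITHIN_MS = 5 * 60 * 1000;

    public static void main(String[] args) throws Exception {
        try (SolrClient solr =
                 new HttpSolrClient.Builder("http://localhost:8983/solr").build()) {
            List<SolrInputDocument> batch = new ArrayList<>();
            for (int i = 0; i < 100_000; i++) {
                SolrInputDocument doc = new SolrInputDocument();
                doc.addField("id", Integer.toString(i));
                doc.addField("title_s", "document " + i);
                batch.add(doc);

                // Send reasonably large batches, but let Solr schedule the
                // commits via commitWithin instead of calling commit() here.
                if (batch.size() == 1000) {
                    solr.add("mycollection", batch, COMMIT_WITHIN_MS);
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                solr.add("mycollection", batch, COMMIT_WITHIN_MS);
            }
            // Note: no explicit commit() and no optimize() anywhere.
        }
    }
}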

Best,
Erick

> On Jul 2, 2019, at 6:24 AM, derrick cui <derrickcui...@yahoo.ca.INVALID> 
> wrote:
> 
> I have tested the query separately; executing the query is actually pretty fast. 
> It only took a few minutes to go through all results, including converting the 
> Solr documents to Java objects. So I believe the slowness is on the persistence 
> end.  BTW, I am using a Linux system.
> 
> 
> Sent from Yahoo Mail for iPhone
> 
> 
> On Sunday, June 30, 2019, 4:52 PM, Shawn Heisey <apa...@elyograg.org> wrote:
> 
> On 6/30/2019 2:08 PM, derrick cui wrote:
>> Good point Erick, I will try it today, but I already use cursorMark in my 
>> query for deep pagination.
>> Also, I noticed that my CPU usage is pretty high: with 8 cores, usage is over 
>> 700%. I am not sure whether it would help if I used an SSD disk.
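
On the cursorMark point: the usual SolrJ loop looks roughly like the sketch 
below (URL, collection, and field names are placeholders). The two things that 
matter are that the sort includes the uniqueKey field and that you stop when 
the cursor stops changing:

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrDocument;
import org.apache.solr.common.params.CursorMarkParams;

public class CursorMarkScan {
    public static void main(String[] args) throws Exception {
        try (SolrClient solr =
                 new HttpSolrClient.Builder("http://localhost:8983/solr").build()) {
            SolrQuery q = new SolrQuery("*:*");
            q.setRows(1000);
            // cursorMark requires a sort that includes the uniqueKey field
            q.setSort(SolrQuery.SortClause.asc("id"));

            String cursorMark = CursorMarkParams.CURSOR_MARK_START;
            boolean done = false;
            while (!done) {
                q.set(CursorMarkParams.CURSOR_MARK_PARAM, cursorMark);
                QueryResponse rsp = solr.query("mycollection", q);
                for (SolrDocument doc : rsp.getResults()) {
                    // convert to your Java object / hand off to persistence here
                }
                String next = rsp.getNextCursorMark();
                done = cursorMark.equals(next);
                cursorMark = next;
            }
        }
    }
}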
> 
> That depends on whether the load is caused by iowait or by actual CPU usage.
> 
> If it's caused by iowait, then SSD would help, but additional memory 
> would help more.  Retrieving data from the OS disk cache (which exists 
> in main memory) is faster than SSD.
> 
> If it is actual CPU load, then it will take some additional poking 
> around to figure out which part of your activities causes the load, as 
> Erick mentioned.
> 
> It's normally a little easier to dig this information out of a Unix-like 
> operating system than out of Windows.  What OS are you running Solr on?
> 
> Thanks,
> Shawn
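
On the iowait vs. actual CPU question above: top (the wa figure) or iostat will 
show the split directly. If you'd rather grab it programmatically, here's a 
rough sketch that samples the aggregate cpu line of /proc/stat on Linux (field 
order assumed from the standard /proc/stat layout):

import java.nio.file.Files;
import java.nio.file.Paths;

public class IowaitSample {

    // Reads the aggregate "cpu" line from /proc/stat:
    // cpu user nice system idle iowait irq softirq steal ...
    private static long[] readCpuLine() throws Exception {
        String line = Files.readAllLines(Paths.get("/proc/stat")).get(0);
        String[] parts = line.trim().split("\\s+");
        // keep the first 8 counters; guest time is already folded into user
        int n = Math.min(8, parts.length - 1);
        long[] vals = new long[n];
        for (int i = 0; i < n; i++) {
            vals[i] = Long.parseLong(parts[i + 1]);
        }
        return vals;
    }

    public static void main(String[] args) throws Exception {
        long[] a = readCpuLine();
        Thread.sleep(5000);               // sample over 5 seconds
        long[] b = readCpuLine();

        long total = 0;
        for (int i = 0; i < a.length; i++) {
            total += b[i] - a[i];
        }
        long idle = b[3] - a[3];          // 4th counter: idle
        long iowait = b[4] - a[4];        // 5th counter: iowait
        long busy = total - idle - iowait;

        System.out.printf("busy %.1f%%  iowait %.1f%%  idle %.1f%%%n",
                100.0 * busy / total, 100.0 * iowait / total, 100.0 * idle / total);
    }
}

If iowait turns out to be near zero at 700% CPU, an SSD won't buy you much and 
the time is better spent profiling the indexing/persistence code itself.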
