Hi Shawn,

The GC does appear to be the issue: changing back to CMS worked, and the G1 
docs do note that it doesn’t work well with small heap sizes. We’ll be moving 
to a better-resourced 64-bit VM with more memory later next year with the 
next Ubuntu LTS release, so it should cease to be a problem after that. 
Thanks for the help.
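
For anyone else who hits this, something along these lines in solr.in.sh puts 
8.x back onto CMS. It is only a cut-down sketch of the flags Shawn quotes 
below, and the SOLR_HEAP line is just the value I was already running with, 
not something the GC change requires:

    # solr.in.sh -- override the Solr 8 default (G1) and use CMS instead
    SOLR_HEAP="1024m"
    GC_TUNE="-XX:+UseConcMarkSweepGC \
      -XX:+UseCMSInitiatingOccupancyOnly \
      -XX:CMSInitiatingOccupancyFraction=50 \
      -XX:+CMSParallelRemarkEnabled \
      -XX:+ParallelRefProcEnabled"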

Scott.

> On 11 Jul 2019, at 12:20 pm, Shawn Heisey <apa...@elyograg.org> wrote:
> 
> On 6/19/2019 7:15 PM, Scott Yeadon wrote:
>> I’m running Solr on Ubuntu 18.04 (32-bit) using OpenJDK 10.0.2. Up until 
>> now I have had no problems with Solr (I’ve been running it since 4.x); 
>> however, after upgrading from 7.x to 8.x I am getting serious memory issues.
>> I have a small repository of 30,000 documents, currently using Solr 7.1 for 
>> the search function (for the last two years without issue). I attempted an 
>> upgrade to 8.1.1 and tried to perform a full reindex; however, it manages 
>> about 1,000 documents and then dies from lack of memory (or so it says). I 
>> tried 8.1.0 with the same result. I then tried 8.0.0, which did successfully 
>> manage a full reindex but then died from lack of memory after a couple of 
>> search queries. I then tried 7.7.2, which worked fine. I have now gone back 
>> to my original 7.1 as I can’t risk 8.x in my production system. Has anyone 
>> else had these issues with 8.x?
>> Note that I did increase Xmx to 1024m (previously 512m), but that made no 
>> difference. It may be some resource other than memory, but if it is, nothing 
>> is reporting it as such, and the repository is so small that running out of 
>> memory doesn’t seem to make sense.
> 
> Solr 8 switched the garbage collector from CMS to G1 because CMS is 
> deprecated in newer versions of Java and will be removed in the near future.
> 
> G1 is a more efficient collector, but it does require somewhat more memory 
> beyond the heap than CMS does.  For most users this is not a problem, but 
> with the small heap and total system memory you're using, the extra overhead 
> might be enough to push you over what the machine can provide.
> 
> You could try putting the old 7.x GC_TUNE settings in your include file, 
> normally named solr.in.sh on non-Windows platforms.
> 
>      GC_TUNE=('-XX:NewRatio=3' \
>        '-XX:SurvivorRatio=4' \
>        '-XX:TargetSurvivorRatio=90' \
>        '-XX:MaxTenuringThreshold=8' \
>        '-XX:+UseConcMarkSweepGC' \
>        '-XX:ConcGCThreads=4' '-XX:ParallelGCThreads=4' \
>        '-XX:+CMSScavengeBeforeRemark' \
>        '-XX:PretenureSizeThreshold=64m' \
>        '-XX:+UseCMSInitiatingOccupancyOnly' \
>        '-XX:CMSInitiatingOccupancyFraction=50' \
>        '-XX:CMSMaxAbortablePrecleanTime=6000' \
>        '-XX:+CMSParallelRemarkEnabled' \
>        '-XX:+ParallelRefProcEnabled' \
>        '-XX:-OmitStackTraceInFastThrow')
> 
> I would probably also use Java 8 rather than Java 10.  Java 10 is not an LTS 
> version, and the older version might require a little less memory, which is 
> at a premium on your setup.  Upgrading to Java 11, the next LTS version, 
> would likely require even more memory.
> 
> Why are you running a 32-bit OS with such a small memory size?  It's not 
> possible to use heap sizes much larger than 1.5 GB on a 32-bit OS. There are 
> also some known bugs with running Lucene-based software on 32-bit Java -- and 
> one of them is specifically related to the G1 collector.
> 
> Thanks,
> Shawn
