Re: Solr out of memory exception

2012-03-15 Thread Li Li
It seems you are using a 64-bit JVM (a 32-bit JVM can only allocate about 1.5 GB).
You should enable pointer compression with -XX:+UseCompressedOops.
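
A minimal sketch of how that flag can be passed to Tomcat, assuming Tomcat's standard
setenv hook (bin/setenv.sh on Linux, bin\setenv.bat on Windows), which catalina.sh
and catalina.bat read at startup if present; the 2 GB heap sizes are only examples:

    # bin/setenv.sh -- picked up by catalina.sh at startup
    export CATALINA_OPTS="-Xms2g -Xmx2g -XX:+UseCompressedOops"

    rem bin\setenv.bat -- picked up by catalina.bat at startup
    set CATALINA_OPTS=-Xms2g -Xmx2g -XX:+UseCompressedOops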

On Thu, Mar 15, 2012 at 1:58 PM, Husain, Yavar yhus...@firstam.com wrote:

 Thanks for helping me out.

 I have allocated Xms-2.0GB Xmx-2.0GB

 However, I see Tomcat is still using much less memory, not 2.0 GB.

 Total Memory on my Windows Machine = 4GB.

 With a smaller index size it works perfectly fine. I was thinking of
 increasing the system RAM and the Tomcat heap space allocated, but then how
 come it works fine on a different server with exactly the same system, Solr
 configuration, and memory?


 -Original Message-
 From: Li Li [mailto:fancye...@gmail.com]
 Sent: Thursday, March 15, 2012 11:11 AM
 To: solr-user@lucene.apache.org
 Subject: Re: Solr out of memory exception

 How much memory is allocated to the JVM?

 On Thu, Mar 15, 2012 at 1:27 PM, Husain, Yavar yhus...@firstam.com
 wrote:

  Solr is giving an out of memory exception. Full indexing completed fine.
  Later, while searching, perhaps when it tries to load the results in memory,
  it starts giving this exception. Yet with the same memory allocated to
  Tomcat and exactly the same Solr replica, it works perfectly fine on another
  server. I am working with 64-bit software, including Java and Tomcat, on
  Windows.
  Any help would be appreciated.
 
  Here are the logs:
 
  The server encountered an internal error (Severe errors in solr
  configuration. Check your log files for more detailed information on what
  may be wrong. If you want solr to continue after configuration errors,
   change: <abortOnConfigurationError>false</abortOnConfigurationError> in
  null -
  java.lang.RuntimeException: java.lang.OutOfMemoryError: Java heap space
 at
  org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1068) at
  org.apache.solr.core.SolrCore.init(SolrCore.java:579) at
 
 org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:137)
  at
 
 org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:83)
  at
 
 org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:295)
  at
 
 org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:422)
  at
 
 org.apache.catalina.core.ApplicationFilterConfig.init(ApplicationFilterConfig.java:115)
  at
 
 org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4072)
  at
  org.apache.catalina.core.StandardContext.start(StandardContext.java:4726)
  at
 
 org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
  at
 org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
  at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
 at
  org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943) at
  org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778) at
  org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504) at
  org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317) at
 
 org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
  at
 
 org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
  at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
 at
  org.apache.catalina.core.StandardHost.start(StandardHost.java:840) at
  org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057) at
  org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463) at
  org.apache.catalina.core.StandardService.start(StandardService.java:525)
 at
  org.apache.catalina.core.StandardServer.start(StandardServer.java:754) at
  org.apache.catalina.startup.Catalina.start(Catalina.java:595) at
  sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at
  sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at
  sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at
  java.lang.reflect.Method.invoke(Unknown Source) at
  org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289) at
  org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414) Caused by:
  java.lang.OutOfMemoryError: Java heap space at
 
 org.apache.lucene.index.SegmentTermEnum.termInfo(SegmentTermEnum.java:180)
  at
 org.apache.lucene.index.TermInfosReader.init(TermInfosReader.java:91)
  at
 
 org.apache.lucene.index.SegmentReader$CoreReaders.init(SegmentReader.java:122)
  at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:652) at
  org.apache.lucene.index.SegmentReader.get(SegmentReader.java:613) at
  org.apache.lucene.index.DirectoryReader.init(DirectoryReader.java:104)
 at
 
 org.apache.lucene.index.ReadOnlyDirectoryReader.init(ReadOnlyDirectoryReader.java:27)
  at
  org.apache.lucene.index.DirectoryReader$1.doBody(DirectoryReader.java:74)
  at
 
 org.apache.lucene.index.SegmentInfos$FindSegmentsFile.run(SegmentInfos.java

Re: Solr out of memory exception

2012-03-15 Thread C.Yunqin
Why should I enable pointer compression?


 
 
-- Original --
From: Li Li <fancye...@gmail.com>
Date: Thu, Mar 15, 2012 02:41 PM
To: Husain, Yavar <yhus...@firstam.com>
Cc: solr-user@lucene.apache.org
Subject:  Re: Solr out of memory exception

 
It seems you are using a 64-bit JVM (a 32-bit JVM can only allocate about 1.5 GB).
You should enable pointer compression with -XX:+UseCompressedOops.


Re: Solr out of memory exception

2012-03-15 Thread Li Li
It can reduce memory usage. For small-heap applications (less than 4 GB) it
may also speed things up.
But be careful: for large-heap applications, it depends.
You should run some tests for yourself.
Our application's test result: it reduced memory usage but increased response
time. We use 25 GB of memory.

http://lists.apple.com/archives/java-dev/2010/Apr/msg00157.html
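
One hedged way to run such a test is to turn on GC logging and compare the logs
with and without the flag; these are standard HotSpot options of that era, and
the gc.log path is only an example:

    # appended to the JVM options (e.g. CATALINA_OPTS for Tomcat)
    -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Xloggc:gc.log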

Dyer, James <james.d...@ingrambook.com> via lucene.apache.org
3/18/11
to solr-user
Our tests showed, in our situation, the compressed oops flag caused our
minor (ParNew) generation time to decrease significantly. We're using a
larger heap (22 GB) and our index size is somewhere in the 40s of GB total. I
guess with any of these JVM parameters, it all depends on your situation
and you need to test. In our case, this flag solved a real problem we were
having. Whoever wrote the JRockit book you refer to no doubt had other
scenarios in mind...

On Thu, Mar 15, 2012 at 3:02 PM, C.Yunqin 345804...@qq.com wrote:

 Why should I enable pointer compression?





RE: Solr out of memory exception

2012-03-15 Thread Husain, Yavar
Thanks a ton.

From: Li Li [fancye...@gmail.com]
Sent: Thursday, March 15, 2012 12:11 PM
To: Husain, Yavar
Cc: solr-user@lucene.apache.org
Subject: Re: Solr out of memory exception

It seems you are using a 64-bit JVM (a 32-bit JVM can only allocate about 1.5 GB).
You should enable pointer compression with -XX:+UseCompressedOops.


Re: Solr out of memory exception

2012-03-15 Thread François Schiettecatte
FWIW it looks like this feature has been enabled by default since JDK 6 Update 
23:


http://blog.juma.me.uk/2008/10/14/32-bit-or-64-bit-jvm-how-about-a-hybrid/

François
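
A quick way to check whether the flag is actually in effect on a given JVM;
-XX:+PrintFlagsFinal is available on reasonably recent JDK 6+ builds, and the
grep can be replaced with findstr on Windows:

    java -XX:+PrintFlagsFinal -version | grep UseCompressedOops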

On Mar 15, 2012, at 6:39 AM, Husain, Yavar wrote:

 Thanks a ton.
 
 From: Li Li [fancye...@gmail.com]
 Sent: Thursday, March 15, 2012 12:11 PM
 To: Husain, Yavar
 Cc: solr-user@lucene.apache.org
 Subject: Re: Solr out of memory exception
 
 It seems you are using a 64-bit JVM (a 32-bit JVM can only allocate about 1.5 GB).
 You should enable pointer compression with -XX:+UseCompressedOops.
 

Re: Solr out of memory exception

2012-03-15 Thread Erick Erickson
See: 
http://javahowto.blogspot.com/2006/06/6-common-errors-in-setting-java-heap.html

Your Xmx specification is wrong, I think.
-Xmx2.0GB
-Xmx2.0G
-Xmx-2GB
-Xmx-2G
-Xmx-2.0GB
-Xmx-2.0G
-Xmx=2G

all fail immediately when I try them with raw Java from a command prompt.
Perhaps Tomcat is doing some magic here; I confess I don't speak very fluent
Tomcat, but the above link has a section on how to set memory for Tomcat that
doesn't make me hopeful...

Try
-Xms2G -Xmx2G

No decimal point, no hyphen in front of the 2, no '='.
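
As a sanity check, the corrected form can be tried against a bare java launch
before touching Tomcat (a sketch; -version just makes the JVM parse the flags
and exit):

    java -Xms2g -Xmx2g -version      # accepted
    java -Xmx2.0G -version           # rejected at JVM startup, as described above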


2012/3/15 François Schiettecatte fschietteca...@gmail.com:
 FWIW it looks like this feature has been enabled by default since JDK 6 
 Update 23:

        
 http://blog.juma.me.uk/2008/10/14/32-bit-or-64-bit-jvm-how-about-a-hybrid/

 François

 On Mar 15, 2012, at 6:39 AM, Husain, Yavar wrote:

 Thanks a ton.
 
 From: Li Li [fancye...@gmail.com]
 Sent: Thursday, March 15, 2012 12:11 PM
 To: Husain, Yavar
 Cc: solr-user@lucene.apache.org
 Subject: Re: Solr out of memory exception

 It seems you are using a 64-bit JVM (a 32-bit JVM can only allocate about 1.5 GB).
 You should enable pointer compression with -XX:+UseCompressedOops.

 On Thu, Mar 15, 2012 at 1:58 PM, Husain, Yavar <yhus...@firstam.com> wrote:
 Thanks for helping me out.

 I have allocated Xms-2.0GB Xmx-2.0GB

 However, I see Tomcat is still using much less memory, not 2.0 GB.

 Total Memory on my Windows Machine = 4GB.

 With a smaller index size it works perfectly fine. I was thinking of
 increasing the system RAM and the Tomcat heap space allocated, but then how
 come it works fine on a different server with exactly the same system, Solr
 configuration, and memory?


 -Original Message-
 From: Li Li [mailto:fancye...@gmail.commailto:fancye...@gmail.com]
 Sent: Thursday, March 15, 2012 11:11 AM
 To: solr-user@lucene.apache.orgmailto:solr-user@lucene.apache.org
 Subject: Re: Solr out of memory exception

 how many memory are allocated to JVM?

 On Thu, Mar 15, 2012 at 1:27 PM, Husain, Yavar 
 yhus...@firstam.commailto:yhus...@firstam.com wrote:

 Solr is giving out of memory exception. Full Indexing was completed fine.
 Later while searching maybe when it tries to load the results in memory it
 starts giving this exception. Though with the same memory allocated to
 Tomcat and exactly same solr replica on another server it is working
 perfectly fine. I am working on 64 bit software's including Java  Tomcat
 on Windows.
 Any help would be appreciated.

 Here are the logs:

 The server encountered an internal error (Severe errors in solr
 configuration. Check your log files for more detailed information on what
 may be wrong. If you want solr to continue after configuration errors,
 change: abortOnConfigurationErrorfalse/abortOnConfigurationError in
 null -
 java.lang.RuntimeException: java.lang.OutOfMemoryError: Java heap space at
 org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1068) at
 org.apache.solr.core.SolrCore.init(SolrCore.java:579) at
 org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:137)
 at
 org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:83)
 at
 org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:295)
 at
 org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:422)
 at
 org.apache.catalina.core.ApplicationFilterConfig.init(ApplicationFilterConfig.java:115)
 at
 org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4072)
 at
 org.apache.catalina.core.StandardContext.start(StandardContext.java:4726)
 at
 org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
 at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
 at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601) at
 org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943) at
 org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778) at
 org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504) at
 org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317) at
 org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
 at
 org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
 at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065) at
 org.apache.catalina.core.StandardHost.start(StandardHost.java:840) at
 org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057) at
 org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463) at
 org.apache.catalina.core.StandardService.start(StandardService.java:525) at
 org.apache.catalina.core.StandardServer.start(StandardServer.java:754) at
 org.apache.catalina.startup.Catalina.start(Catalina.java:595

Re: Solr out of memory exception

2012-03-15 Thread Lance Norskog
Run the program under jconsole (or visualgc on some machines). This
connects to your Tomcat and gives a running view of memory use and
garbage-collection activity.
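
For example, on a machine with the JDK installed, the Tomcat JVM can usually be
located and attached to like this (the PID is a placeholder):

    jps -l          # list running JVMs; look for the Tomcat/Bootstrap process
    jconsole 1234   # attach to that PID and watch heap usage and GC activity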

On Thu, Mar 15, 2012 at 10:28 AM, Erick Erickson
erickerick...@gmail.com wrote:
 See: 
 http://javahowto.blogspot.com/2006/06/6-common-errors-in-setting-java-heap.html

 Your Xmx specification is wrong I think.
 -Xmx2.0GB
 -Xmx2.0G
 -Xmx-2GB
 -Xmx-2G
 -Xmx-2.0GB
 -Xmx-2.0G
 -Xmx=2G

 all fail immediately when I try them from a command line on raw Java
 from a command prompt.
 Perhaps Tomcat is doing some magic here, I confess I don't speak very
 fluent Tomcat but
 the above link has a section on how to set memory for Tomcat that
 doesn't make me hopeful...

 Try
  -Xms2G -Xmx2G

 no decimal place. No hyphen in front of the 2. no =.



Re: Solr out of memory exception

2012-03-14 Thread Li Li
How much memory is allocated to the JVM?

On Thu, Mar 15, 2012 at 1:27 PM, Husain, Yavar yhus...@firstam.com wrote:

 Solr is giving an out of memory exception. Full indexing completed fine.
 Later, while searching, perhaps when it tries to load the results in memory,
 it starts giving this exception. Yet with the same memory allocated to Tomcat
 and exactly the same Solr replica, it works perfectly fine on another server.
 I am working with 64-bit software, including Java and Tomcat, on Windows.
 Any help would be appreciated.

 Here are the logs:

 The server encountered an internal error (Severe errors in solr
 configuration. Check your log files for more detailed information on what
 may be wrong. If you want solr to continue after configuration errors,
  change: <abortOnConfigurationError>false</abortOnConfigurationError> in
 null -
 java.lang.RuntimeException: java.lang.OutOfMemoryError: Java heap space at
 org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1068) at
 org.apache.solr.core.SolrCore.init(SolrCore.java:579) at
 org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:137)
 at
 org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:83)
 at
 org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:295)
 at
 org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:422)
 at
 org.apache.catalina.core.ApplicationFilterConfig.init(ApplicationFilterConfig.java:115)
 at
 org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4072)
 at
 org.apache.catalina.core.StandardContext.start(StandardContext.java:4726)
 at
 org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
 at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
 at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601) at
 org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943) at
 org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778) at
 org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504) at
 org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317) at
 org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
 at
 org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
 at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065) at
 org.apache.catalina.core.StandardHost.start(StandardHost.java:840) at
 org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057) at
 org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463) at
 org.apache.catalina.core.StandardService.start(StandardService.java:525) at
 org.apache.catalina.core.StandardServer.start(StandardServer.java:754) at
 org.apache.catalina.startup.Catalina.start(Catalina.java:595) at
 sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at
 sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at
 java.lang.reflect.Method.invoke(Unknown Source) at
 org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289) at
 org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414) Caused by:
 java.lang.OutOfMemoryError: Java heap space at
 org.apache.lucene.index.SegmentTermEnum.termInfo(SegmentTermEnum.java:180)
 at org.apache.lucene.index.TermInfosReader.init(TermInfosReader.java:91)
 at
 org.apache.lucene.index.SegmentReader$CoreReaders.init(SegmentReader.java:122)
 at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:652) at
 org.apache.lucene.index.SegmentReader.get(SegmentReader.java:613) at
 org.apache.lucene.index.DirectoryReader.init(DirectoryReader.java:104) at
 org.apache.lucene.index.ReadOnlyDirectoryReader.init(ReadOnlyDirectoryReader.java:27)
 at
 org.apache.lucene.index.DirectoryReader$1.doBody(DirectoryReader.java:74)
 at
 org.apache.lucene.index.SegmentInfos$FindSegmentsFile.run(SegmentInfos.java:683)
 at org.apache.lucene.index.DirectoryReader.open(DirectoryReader.java:69) at
 org.apache.lucene.index.IndexReader.open(IndexReader.java:476) at
 org.apache.lucene.index.IndexReader.open(IndexReader.java:403) at
 org.apache.solr.core.StandardIndexReaderFactory.newReader(StandardIndexReaderFactory.java:38)
 at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1057) at
 org.apache.solr.core.SolrCore.init(SolrCore.java:579) at
 org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:137)
 at
 org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:83)
 at
 org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:295)
 at
 org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:422)
 at
 org.apache.catalina.core.ApplicationFilterConfig.init(ApplicationFilterConfig.java:115)
 at
 

RE: Solr out of memory exception

2012-03-14 Thread Husain, Yavar
Thanks for helping me out.

I have allocated Xms-2.0GB Xmx-2.0GB

However, I see Tomcat is still using much less memory, not 2.0 GB.

Total Memory on my Windows Machine = 4GB.

With a smaller index size it works perfectly fine. I was thinking of
increasing the system RAM and the Tomcat heap space allocated, but then how
come it works fine on a different server with exactly the same system, Solr
configuration, and memory?


-Original Message-
From: Li Li [mailto:fancye...@gmail.com] 
Sent: Thursday, March 15, 2012 11:11 AM
To: solr-user@lucene.apache.org
Subject: Re: Solr out of memory exception

How much memory is allocated to the JVM?


Re: SOlR -- Out of Memory exception

2011-06-17 Thread jyn7
I did that, but when I split the data into files of 5 million records each,
the first file went through fine; when I started processing the second file,
Solr hit an OOM again:
org.apache.solr.common.SolrException log
SEVERE: java.lang.OutOfMemoryError: Java heap space
at
org.apache.lucene.index.FreqProxTermsWriterPerField$FreqProxPostingsArray.init(FreqProxTermsWriterPerField.java:184)
at
org.apache.lucene.index.FreqProxTermsWriterPerField$FreqProxPostingsArray.newInstance(FreqProxTermsWriterPerField.java:194)
at
org.apache.lucene.index.ParallelPostingsArray.grow(ParallelPostingsArray.java:48)
at
org.apache.lucene.index.TermsHashPerField.growParallelPostingsArray(TermsHashPerField.java:137)
at
org.apache.lucene.index.TermsHashPerField.add(TermsHashPerField.java:440)
at
org.apache.lucene.index.DocInverterPerField.processFields(DocInverterPerField.java:169)
at
org.apache.lucene.index.DocFieldProcessorPerThread.processDocument(DocFieldProcessorPerThread.java:248)

--
View this message in context: 
http://lucene.472066.n3.nabble.com/SOlR-Out-of-Memory-exception-tp3074636p3076610.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: SOlR -- Out of Memory exception

2011-06-17 Thread Yonik Seeley
On Fri, Jun 17, 2011 at 1:30 AM, pravesh suyalprav...@yahoo.com wrote:
 If you are sending whole CSV in a single HTTP request using curl, why not
 consider sending it in smaller chunks?

Smaller chunks should not matter - Solr streams from the input (i.e.
the whole thing is not buffered in memory).

It could be related to autoCommit.  Commits may be stacking up faster
than they can be handled.  I'd recommend getting rid of autoCommit if
possible, or at a minimum getting rid of the maxDocs-based autoCommit.
Incremental updates can use commitWithin to guarantee a time to
visibility, and bulk updates like this CSV upload normally shouldn't
commit until the end.

-Yonik
http://www.lucidimagination.com
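
A hedged sketch of that workflow against a local Solr instance: the host, port,
and file path are placeholders, the /update/csv handler is the standard CSV
loader of this era, and stream.file assumes remote streaming is enabled in
solrconfig.xml (otherwise the file can be POSTed with --data-binary instead):

    # bulk-load the CSV with no intermediate commits
    curl 'http://localhost:8983/solr/update/csv?stream.file=/path/to/data.csv&stream.contentType=text/csv;charset=utf-8'

    # issue a single commit at the end
    curl 'http://localhost:8983/solr/update?commit=true'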


Re: SOlR -- Out of Memory exception

2011-06-17 Thread jyn7
I commented out the autoCommit option and tried uploading a smaller file
(now 5 million records) and hit an OOM again:

Jun 17, 2011 2:32:59 PM org.apache.solr.common.SolrException log
SEVERE: java.lang.OutOfMemoryError: Java heap space

--
View this message in context: 
http://lucene.472066.n3.nabble.com/SOlR-Out-of-Memory-exception-tp3074636p3077812.html
Sent from the Solr - User mailing list archive at Nabble.com.


SOlR -- Out of Memory exception

2011-06-16 Thread jyn7
We just started using Solr. I am trying to load a single file with 20 million
records into Solr using the CSV uploader. I keep getting an out of memory
error after loading 7 million records. Here is the config:

<autoCommit>
  <maxDocs>1</maxDocs>
  <maxTime>6</maxTime>
</autoCommit>
I also encountered a LockObtainFailedException:
org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out:
NativeFSLock@D:\work\solr\.\data\index\write.lock
        at org.apache.lucene.store.Lock.obtain(Lock.java:84)
        at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:1097)

So I changed the lockType to single; now again I am getting an out of
memory exception. I also increased the JVM heap space to 2048M but am still
getting an out of memory error.




--
View this message in context: 
http://lucene.472066.n3.nabble.com/SOlR-Out-of-Memory-exception-tp3074636p3074636.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: SOlR -- Out of Memory exception

2011-06-16 Thread Erick Erickson
Hmmm, are you still getting your OOM after 7M records? Or some larger
number? And how are you using the CSV uploader?

Best
Erick

On Thu, Jun 16, 2011 at 9:14 PM, jyn7 jyotsna.namb...@gmail.com wrote:
 We just started using SOLR. I am trying to load a single file with 20 million
 records into SOLR using the CSV uploader. I keep getting and out of Memory
 after loading 7 million records. Here is the config:

 <autoCommit>
   <maxDocs>1</maxDocs>
   <maxTime>6</maxTime>
 </autoCommit>
 I also  encountered a LockObtainFailedException
 org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out:
 NativeFSLock@D:\work\solr\.\data\index\write.lock
        at org.apache.lucene.store.Lock.obtain(Lock.java:84)
         at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:1097)

 So I changed the  lockType to SIngle, now again I am getting an Out of
 Memory Exception. I also increased the JVM heap space to 2048M but still
 getting an Out of Memory.




 --
 View this message in context: 
 http://lucene.472066.n3.nabble.com/SOlR-Out-of-Memory-exception-tp3074636p3074636.html
 Sent from the Solr - User mailing list archive at Nabble.com.



Re: SOlR -- Out of Memory exception

2011-06-16 Thread jyn7
Yes Erick, after changing the lock type to single, I got an OOM after loading
5.5 million records. I am using the curl command to upload the CSV.

--
View this message in context: 
http://lucene.472066.n3.nabble.com/SOlR-Out-of-Memory-exception-tp3074636p3074765.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: SOlR -- Out of Memory exception

2011-06-16 Thread pravesh
If you are sending the whole CSV in a single HTTP request using curl, why not
consider sending it in smaller chunks?
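
One hedged way to do that on a Unix-like machine is to split the file, keep the
header line on every chunk, and post each chunk separately; file names, chunk
size, and the Solr URL below are placeholders:

    head -1 big.csv > header.csv
    tail -n +2 big.csv | split -l 1000000 - chunk_
    for f in chunk_*; do
      cat header.csv "$f" | curl 'http://localhost:8983/solr/update/csv' \
        -H 'Content-type: text/csv; charset=utf-8' --data-binary @-
    done
    curl 'http://localhost:8983/solr/update?commit=true'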


--
View this message in context: 
http://lucene.472066.n3.nabble.com/SOlR-Out-of-Memory-exception-tp3074636p3075091.html
Sent from the Solr - User mailing list archive at Nabble.com.