It seems that after some time *configuration #1* also proves to be
unstable: the server hung with an OOM.

Restarting it produced the following exception [1], which ultimately
seems to originate somewhere between *Solr* and *Lucene*.

I'm running the server with the following command line, on a server with
2 GB of RAM running only Stanbol:
 java -server -Xmx1536m -jar
~/stanbol/launchers/full/target/org.apache.stanbol.launchers.full-0.9.0-incubating-SNAPSHOT.jar
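For diagnosing the OOM, one option (not something I've tried on this box yet) is to restart with HotSpot's heap-dump flags and then look at which object types dominate the dump; the dump path below is just an example choice:

```shell
# Write a heap dump automatically when the OOM occurs
# (the /tmp path is an arbitrary example, pick any writable location)
java -server -Xmx1536m \
     -XX:+HeapDumpOnOutOfMemoryError \
     -XX:HeapDumpPath=/tmp/stanbol-oom.hprof \
     -jar ~/stanbol/launchers/full/target/org.apache.stanbol.launchers.full-0.9.0-incubating-SNAPSHOT.jar

# While the server is still up, a live histogram of the biggest
# object types can also hint at what is filling the heap:
jmap -histo:live <pid> | head -n 20
```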

[1] Stack Trace:
09.02.2012 21:03:39.236 *ERROR* [FelixStartLevel] org.apache.stanbol.entityhub.yard.solr.impl.SolrYard Exception while checking SolrIndex 'dbpedia' on ManagedSolrServer 'default'!
org.apache.stanbol.entityhub.servicesapi.yard.YardException: Unable to actiate SolrIndex for SolrYard dbpedia default data index
        at org.apache.stanbol.entityhub.yard.solr.impl.SolrYard.checkManagedSolrIndex(SolrYard.java:631)
        at org.apache.stanbol.entityhub.yard.solr.impl.SolrYard.activate(SolrYard.java:448)
        at org.apache.stanbol.entityhub.yard.solr.impl.SolrYard.activate(SolrYard.java:405)
        (...)
Caused by: java.lang.RuntimeException: java.lang.OutOfMemoryError: Java heap space
        at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1091)
        at org.apache.solr.core.SolrCore.<init>(SolrCore.java:585)
        at org.apache.solr.core.CoreContainer.create(CoreContainer.java:463)
        at org.apache.stanbol.commons.solr.SolrServerAdapter.registerCore(SolrServerAdapter.java:352)
        at org.apache.stanbol.commons.solr.managed.impl.ManagedSolrServerImpl.activateCore(ManagedSolrServerImpl.java:837)
        at org.apache.stanbol.commons.solr.managed.impl.ManagedSolrServerImpl.activateIndex(ManagedSolrServerImpl.java:665)
        at org.apache.stanbol.entityhub.yard.solr.impl.SolrYard.checkManagedSolrIndex(SolrYard.java:617)
        ... 30 more
Caused by: java.lang.OutOfMemoryError: Java heap space
        at org.apache.lucene.index.TermInfosReader.<init>(TermInfosReader.java:116)
        at org.apache.lucene.index.SegmentCoreReaders.<init>(SegmentCoreReaders.java:75)
        at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:114)
        at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:92)
        at org.apache.lucene.index.DirectoryReader.<init>(DirectoryReader.java:113)
        at org.apache.lucene.index.ReadOnlyDirectoryReader.<init>(ReadOnlyDirectoryReader.java:29)
        at org.apache.lucene.index.DirectoryReader$1.doBody(DirectoryReader.java:81)
        at org.apache.lucene.index.SegmentInfos$FindSegmentsFile.run(SegmentInfos.java:753)
        at org.apache.lucene.index.DirectoryReader.open(DirectoryReader.java:75)
        at org.apache.lucene.index.IndexReader.open(IndexReader.java:428)
        at org.apache.lucene.index.IndexReader.open(IndexReader.java:371)
        at org.apache.solr.core.StandardIndexReaderFactory.newReader(StandardIndexReaderFactory.java:38)
        at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1080)
        at org.apache.solr.core.SolrCore.<init>(SolrCore.java:585)
        at org.apache.solr.core.CoreContainer.create(CoreContainer.java:463)
        at org.apache.stanbol.commons.solr.SolrServerAdapter.registerCore(SolrServerAdapter.java:352)
        at org.apache.stanbol.commons.solr.managed.impl.ManagedSolrServerImpl.activateCore(ManagedSolrServerImpl.java:837)
        at org.apache.stanbol.commons.solr.managed.impl.ManagedSolrServerImpl.activateIndex(ManagedSolrServerImpl.java:665)
        (...)
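As a cheap way to watch the heap while the enhancement chain runs (not from the original thread; just a minimal, hypothetical sketch using the standard java.lang.management API, which any JVM exposes), something like this could be run alongside requests to see which step drives memory growth before the OOM hits:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

// Hypothetical helper, not part of Stanbol: periodically logs heap usage
// so memory growth can be correlated with specific enhancement requests.
public class HeapWatch {
    public static void main(String[] args) throws InterruptedException {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        for (int i = 0; i < 3; i++) {
            MemoryUsage heap = mem.getHeapMemoryUsage();
            System.out.printf("heap: used=%d MB, committed=%d MB, max=%d MB%n",
                    heap.getUsed() >> 20,       // bytes -> MB
                    heap.getCommitted() >> 20,
                    heap.getMax() >> 20);
            Thread.sleep(1000);                 // sample once per second
        }
    }
}
```

The same MemoryMXBean numbers are also what JConsole/VisualVM plot, so attaching one of those to the running Stanbol process gives the identical picture without code changes.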

On Wed, Feb 8, 2012 at 11:52 AM, David Riccitelli <[email protected]> wrote:

> Dears,
>
> I'm currently running a Stanbol instance with 2 different enhancer
> configurations.
>
> Configuration #1, works fine:
>
>    1. metaxa ( required , MetaxaEngine)
>    2. langid ( required , LangIdEnhancementEngine)
>    3. ner ( required , NamedEntityExtractionEnhancementEngine)
>    4. entityhubLinking ( required , NamedEntityTaggingEngine)
>    5. dbpediaLinking ( required , NamedEntityTaggingEngine)
>    6. refactor ( required , RefactorEnhancementEngine)
>
>
> Configuration #2, raises *java.lang.OutOfMemoryError*:
>
>    1. metaxa ( required , MetaxaEngine)
>    2. langid ( required , LangIdEnhancementEngine)
>    3. ner ( required , NamedEntityExtractionEnhancementEngine)
>    4. entityhubLinking ( required , NamedEntityTaggingEngine)
>    5. dbpediaLinking ( required , NamedEntityTaggingEngine)
>    6. keyword-linking-1 ( required , KeywordLinkingEngine)
>    7. refactor ( required , RefactorEnhancementEngine)
>
>
> The KeywordLinkingEngine allows us to find many more entities, which in
> turn are passed on to the RefactorEnhancementEngine. I think this could
> be the root cause of the OOM.
>
> How can we further diagnose this issue? We need the KeywordLinkingEngine
> to detect entities for unsupported languages (such as Italian
> and Danish).
>
> Thanks,
> David
>
> --
> David Riccitelli
>
>
> ********************************************************************************
> InsideOut10 s.r.l.
> P.IVA: IT-11381771002
> Fax: +39 0110708239
> ---
> LinkedIn: http://it.linkedin.com/in/riccitelli
> Twitter: ziodave
> ---
> Layar Partner Network <http://www.layar.com/publishing/developers/list/?page=1&country=&city=&keyword=insideout10&lpn=1>
>
> ********************************************************************************
>
>


-- 
David Riccitelli

