Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-06-16 Thread Aniket Bhoi
On Tue, Jun 3, 2014 at 6:28 PM, Shawn Heisey  wrote:

> On 6/3/2014 3:04 AM, Aniket Bhoi wrote:
> > I changed the value of removeAbandoned to false; this time the indexing
> > failed due to a different exception:
>
> 
>
> > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Connection
> > reset
>
> This is really the same error, but now the dataimport handler actually
> sees the connection reset, rather than a higher level problem.
>
> Something somewhere closed the connection.  If you have increased
> maxMergeCount to at least 6 and turned off removeAbandoned, then I have
> no idea what is happening.  Perhaps the SQL Server itself has a very low
> inactivity timeout value.
>
> Thanks,
> Shawn
>
>

I changed the indexing query in the data-config to index only the last
6 months of data. That seemed to work. Why does it fail when I index the
whole data load? Is it something to do with the same stale connection being
reused from the connection pool, or are the threads not sufficient?
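
For illustration only, a hypothetical data-config.xml entity restricting the
import to the last six months might look like the sketch below (the table,
column, and connection details are made up; only srch_call comes from the
logs above, and the real query depends on the schema):

<dataConfig>
  <dataSource name="sqlserver" type="JdbcDataSource"
              driver="com.microsoft.sqlserver.jdbc.SQLServerDriver"
              url="jdbc:sqlserver://dbhost;databaseName=calls"
              user="solr_user" password="..."/>
  <document>
    <!-- Hypothetical entity: only rows touched in the last 6 months -->
    <entity name="srch_call" dataSource="sqlserver"
            query="SELECT id, title, body FROM srch_call
                   WHERE last_modified >= DATEADD(month, -6, GETDATE())">
      <field column="id" name="id"/>
      <field column="title" name="title"/>
      <field column="body" name="body"/>
    </entity>
  </document>
</dataConfig>

Comparing how long this restricted import runs against the full import should
show whether the failure tracks total connection lifetime rather than any
particular row.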


Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-06-03 Thread Shawn Heisey
On 6/3/2014 3:04 AM, Aniket Bhoi wrote:
> I changed the value of removeAbandoned to false; this time the indexing
> failed due to a different exception:



> Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Connection
> reset

This is really the same error, but now the dataimport handler actually
sees the connection reset, rather than a higher level problem.

Something somewhere closed the connection.  If you have increased
maxMergeCount to at least 6 and turned off removeAbandoned, then I have
no idea what is happening.  Perhaps the SQL Server itself has a very low
inactivity timeout value.

Thanks,
Shawn



Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-06-03 Thread Aniket Bhoi
On Sun, Jun 1, 2014 at 10:55 PM, Shawn Heisey  wrote:

> On 5/31/2014 1:54 PM, Aniket Bhoi wrote:
> > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: The result
> >>> set is closed.
>
> I still think this is an indication of the source of the problem.
> Something closed the connection to your SQL server before Solr was done
> with it.  That could have been JDBC itself, or it might have been
> something else.  I think I can safely say that it wasn't Solr.
>
> Your stacktrace indicates that Tomcat's database connection pooling is
> active.  Tomcat's connection pooling has a feature for dropping
> abandoned connections.  If you have enabled this feature, this could be
> the problem.
>
> Abandoned connection dropping (when it is enabled) happens by default 60
> seconds after the connection is established.  If you have database
> connections that last longer than 60 seconds (which Solr's dataimport is
> very likely to do), you need to increase removeAbandonedTimeout to
> something larger than the longest time an import is likely to last -- or
> disable removeAbandoned entirely.  The latter is probably a better option.
>
> http://tomcat.apache.org/tomcat-7.0-doc/jdbc-pool.html
>
> Thanks,
> Shawn
>
>

I changed the value of removeAbandoned to false; this time the indexing
failed due to a different exception:

SEVERE: Exception while processing: srch_call document :
null:org.apache.solr.handler.dataimport.DataImportHandlerException: Error
reading data from database
at
org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(Unknown
Source)
at
org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.getARow(Unknown
Source)
at
org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$600(Unknown
Source)
at
org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.next(Unknown
Source)
at
org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.next(Unknown
Source)
at
org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(Unknown
Source)
at
org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(Unknown
Source)
at
org.apache.solr.handler.dataimport.ThreadedEntityProcessorWrapper.nextRow(Unknown
Source)
at
org.apache.solr.handler.dataimport.DocBuilder$EntityRunner.runAThread(Unknown
Source)
at
org.apache.solr.handler.dataimport.DocBuilder$EntityRunner.access$000(Unknown
Source)
at
org.apache.solr.handler.dataimport.DocBuilder$EntityRunner$1.run(Unknown
Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown
Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown
Source)
at java.lang.Thread.run(Unknown Source)
*Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Connection
reset*
at
com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(SQLServerException.java:170)
at
com.microsoft.sqlserver.jdbc.SimpleInputStream.getBytes(SimpleInputStream.java:350)
at
com.microsoft.sqlserver.jdbc.DDC.convertStreamToObject(DDC.java:419)
at
com.microsoft.sqlserver.jdbc.ServerDTVImpl.getValue(dtv.java:2007)
at com.microsoft.sqlserver.jdbc.DTV.getValue(dtv.java:175)
at com.microsoft.sqlserver.jdbc.Column.getValue(Column.java:113)
at
com.microsoft.sqlserver.jdbc.SQLServerResultSet.getValue(SQLServerResultSet.java:1982)
at
com.microsoft.sqlserver.jdbc.SQLServerResultSet.getValue(SQLServerResultSet.java:1967)
at
com.microsoft.sqlserver.jdbc.SQLServerResultSet.getObject(SQLServerResultSet.java:2256)
at
com.microsoft.sqlserver.jdbc.SQLServerResultSet.getObject(SQLServerResultSet.java:2265)
at
org.apache.tomcat.dbcp.dbcp.DelegatingResultSet.getObject(DelegatingResultSet.java:295)
... 13 more


Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-06-01 Thread Shawn Heisey
On 5/31/2014 1:54 PM, Aniket Bhoi wrote:
> Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: The result
>>> set is closed.

I still think this is an indication of the source of the problem.
Something closed the connection to your SQL server before Solr was done
with it.  That could have been JDBC itself, or it might have been
something else.  I think I can safely say that it wasn't Solr.

Your stacktrace indicates that Tomcat's database connection pooling is
active.  Tomcat's connection pooling has a feature for dropping
abandoned connections.  If you have enabled this feature, this could be
the problem.

Abandoned connection dropping (when it is enabled) happens by default 60
seconds after the connection is established.  If you have database
connections that last longer than 60 seconds (which Solr's dataimport is
very likely to do), you need to increase removeAbandonedTimeout to
something larger than the longest time an import is likely to last -- or
disable removeAbandoned entirely.  The latter is probably a better option.

http://tomcat.apache.org/tomcat-7.0-doc/jdbc-pool.html
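
For reference, a sketch of how those settings sit in a Tomcat context.xml
Resource (the JNDI name, driver, and URL below are placeholders; only the
abandoned-connection attributes are the point here):

<Context>
  <!-- Hypothetical pooled datasource: either turn abandoned-connection
       removal off, or raise the timeout well past the longest import. -->
  <Resource name="jdbc/solrdb" auth="Container" type="javax.sql.DataSource"
            driverClassName="com.microsoft.sqlserver.jdbc.SQLServerDriver"
            url="jdbc:sqlserver://dbhost;databaseName=calls"
            username="solr_user" password="..."
            removeAbandoned="false"
            removeAbandonedTimeout="86400"
            validationQuery="select 1"
            testOnBorrow="true"/>
</Context>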

Thanks,
Shawn



Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-05-31 Thread Aniket Bhoi
Still awaiting a response from someone.


On Tue, May 27, 2014 at 1:35 PM, Aniket Bhoi  wrote:

>
>
>
> On Mon, May 26, 2014 at 4:14 PM, Aniket Bhoi 
> wrote:
>
>> Another thing I have noted is that the exception always follows a commit
>> operation. Log excerpt below:
>>
>> INFO: SolrDeletionPolicy.onCommit: commits:num=2
>> commit{dir=/opt/solr/cores/calls/data/index,segFN=segments_2qt,version=1347458723267,generation=3557,filenames=[_3z9.tii,
>> _3z3.fnm, _3z9.nrm, _3za.prx, _3z9.fdt, _3z9.fnm, _3z9.fdx, _3z3.frq,
>> _3za.nrm, segments_2qt, _3z3.fdx, _3z9.prx, _3z3.fdt, _3za.fdx, _3z9.frq,
>> _3z3.prx, _3za.fdt, _3z3.tii, _3za.tis, _3za.fnm, _3z3.nrm, _3z9.tis,
>> _3za.tii, _3za.frq, _3z3.tis]
>>  
>> commit{dir=/opt/solr/cores/calls/data/index,segFN=segments_2qu,version=1347458723269,generation=3558,filenames=[_3zb.fdt,
>> _3z9.tii, _3z3.fnm, _3z9.nrm, _3zb.tii, _3zb.tis, _3zb.fdx, _3za.prx,
>> _3z9.fdt, _3z9.fnm, _3z9.fdx, _3zb.frq, _3z3.frq, _3za.nrm, segments_2qu,
>> _3z3.fdx, _3zb.prx, _3z9.prx, _3zb.fnm, _3z3.fdt, _3za.fdx, _3z9.frq,
>> _3z3.prx, _3za.fdt, _3zb.nrm, _3z3.tii, _3za.tis, _3za.fnm, _3z3.nrm,
>> _3z9.tis, _3za.tii, _3za.frq, _3z3.tis]
>> May 24, 2014 5:49:05 AM org.apache.solr.core.SolrDeletionPolicy
>> updateCommits
>> INFO: newest commit = 1347458723269
>> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher <init>
>> INFO: Opening Searcher@423dbcca main
>> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
>> INFO: autowarming Searcher@423dbcca main from Searcher@19c19869 main
>>
>> fieldValueCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
>> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
>> INFO: autowarming result for Searcher@423dbcca main
>>
>> fieldValueCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
>> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
>> INFO: autowarming Searcher@423dbcca main from Searcher@19c19869 main
>>
>> filterCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
>> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
>> INFO: autowarming result for Searcher@423dbcca main
>>
>> filterCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
>> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
>> INFO: autowarming Searcher@423dbcca main from Searcher@19c19869 main
>>
>> queryResultCache{lookups=1,hits=1,hitratio=1.00,inserts=3,evictions=0,size=3,warmupTime=2,cumulative_lookups=47,cumulative_hits=46,cumulative_hitratio=0.97,cumulative_inserts=1,cumulative_evictions=0}
>> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
>> INFO: autowarming result for Searcher@423dbcca main
>>
>> queryResultCache{lookups=0,hits=0,hitratio=0.00,inserts=3,evictions=0,size=3,warmupTime=2,cumulative_lookups=47,cumulative_hits=46,cumulative_hitratio=0.97,cumulative_inserts=1,cumulative_evictions=0}
>> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
>> INFO: autowarming Searcher@423dbcca main from Searcher@19c19869 main
>>
>> documentCache{lookups=0,hits=0,hitratio=0.00,inserts=40,evictions=0,size=40,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
>> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
>> INFO: autowarming result for Searcher@423dbcca main
>>
>> documentCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
>> May 24, 2014 5:49:05 AM org.apache.solr.core.QuerySenderListener
>> newSearcher
>> INFO: QuerySenderListener sending requests to Searcher@423dbcca main
>> May 24, 2014 5:49:05 AM org.apache.solr.core.SolrCore execute
>> INFO: [calls] webapp=null path=null
>> params={start=0&event=newSearcher&q=*:*&rows=20} hits=40028 status=0
>> QTime=2
>> May 24, 2014 5:49:05 AM org.apache.solr.update.DirectUpdateHandler2 commit
>> INFO: end_commit_flush
>> May 24, 2014 5:49:05 AM org.apache.solr.core.SolrCore execute
>> INFO: [calls] webapp=null path=null
>> params={start=0&event=newSearcher&q=banking&rows=20} hits=636 status=0
>> QTime=3
>> May 24, 2014 5:49:05 AM org.apache.solr.core.QuerySenderListener
>> newSearcher
>> INFO: QuerySenderListener done.
>> May 24, 2014 5:49:05 AM
>> org.apache.solr.handler.component.SpellCheckComponent$SpellCheckerListener
>> newSearcher
>

Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-05-27 Thread Aniket Bhoi
On Mon, May 26, 2014 at 4:14 PM, Aniket Bhoi  wrote:

> Another thing I have noted is that the exception always follows a commit
> operation. Log excerpt below:
>
> INFO: SolrDeletionPolicy.onCommit: commits:num=2
> commit{dir=/opt/solr/cores/calls/data/index,segFN=segments_2qt,version=1347458723267,generation=3557,filenames=[_3z9.tii,
> _3z3.fnm, _3z9.nrm, _3za.prx, _3z9.fdt, _3z9.fnm, _3z9.fdx, _3z3.frq,
> _3za.nrm, segments_2qt, _3z3.fdx, _3z9.prx, _3z3.fdt, _3za.fdx, _3z9.frq,
> _3z3.prx, _3za.fdt, _3z3.tii, _3za.tis, _3za.fnm, _3z3.nrm, _3z9.tis,
> _3za.tii, _3za.frq, _3z3.tis]
>  
> commit{dir=/opt/solr/cores/calls/data/index,segFN=segments_2qu,version=1347458723269,generation=3558,filenames=[_3zb.fdt,
> _3z9.tii, _3z3.fnm, _3z9.nrm, _3zb.tii, _3zb.tis, _3zb.fdx, _3za.prx,
> _3z9.fdt, _3z9.fnm, _3z9.fdx, _3zb.frq, _3z3.frq, _3za.nrm, segments_2qu,
> _3z3.fdx, _3zb.prx, _3z9.prx, _3zb.fnm, _3z3.fdt, _3za.fdx, _3z9.frq,
> _3z3.prx, _3za.fdt, _3zb.nrm, _3z3.tii, _3za.tis, _3za.fnm, _3z3.nrm,
> _3z9.tis, _3za.tii, _3za.frq, _3z3.tis]
> May 24, 2014 5:49:05 AM org.apache.solr.core.SolrDeletionPolicy
> updateCommits
> INFO: newest commit = 1347458723269
> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher <init>
> INFO: Opening Searcher@423dbcca main
> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
> INFO: autowarming Searcher@423dbcca main from Searcher@19c19869 main
>
> fieldValueCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
> INFO: autowarming result for Searcher@423dbcca main
>
> fieldValueCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
> INFO: autowarming Searcher@423dbcca main from Searcher@19c19869 main
>
> filterCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
> INFO: autowarming result for Searcher@423dbcca main
>
> filterCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
> INFO: autowarming Searcher@423dbcca main from Searcher@19c19869 main
>
> queryResultCache{lookups=1,hits=1,hitratio=1.00,inserts=3,evictions=0,size=3,warmupTime=2,cumulative_lookups=47,cumulative_hits=46,cumulative_hitratio=0.97,cumulative_inserts=1,cumulative_evictions=0}
> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
> INFO: autowarming result for Searcher@423dbcca main
>
> queryResultCache{lookups=0,hits=0,hitratio=0.00,inserts=3,evictions=0,size=3,warmupTime=2,cumulative_lookups=47,cumulative_hits=46,cumulative_hitratio=0.97,cumulative_inserts=1,cumulative_evictions=0}
> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
> INFO: autowarming Searcher@423dbcca main from Searcher@19c19869 main
>
> documentCache{lookups=0,hits=0,hitratio=0.00,inserts=40,evictions=0,size=40,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
> May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
> INFO: autowarming result for Searcher@423dbcca main
>
> documentCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
> May 24, 2014 5:49:05 AM org.apache.solr.core.QuerySenderListener
> newSearcher
> INFO: QuerySenderListener sending requests to Searcher@423dbcca main
> May 24, 2014 5:49:05 AM org.apache.solr.core.SolrCore execute
> INFO: [calls] webapp=null path=null
> params={start=0&event=newSearcher&q=*:*&rows=20} hits=40028 status=0
> QTime=2
> May 24, 2014 5:49:05 AM org.apache.solr.update.DirectUpdateHandler2 commit
> INFO: end_commit_flush
> May 24, 2014 5:49:05 AM org.apache.solr.core.SolrCore execute
> INFO: [calls] webapp=null path=null
> params={start=0&event=newSearcher&q=banking&rows=20} hits=636 status=0
> QTime=3
> May 24, 2014 5:49:05 AM org.apache.solr.core.QuerySenderListener
> newSearcher
> INFO: QuerySenderListener done.
> May 24, 2014 5:49:05 AM
> org.apache.solr.handler.component.SpellCheckComponent$SpellCheckerListener
> newSearcher
> INFO: Index is not optimized therefore skipping building spell check index
> for: default
> May 24, 2014 5:49:05 AM org.apache.solr.core.SolrCore registerSearcher
> INFO: [calls

Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-05-26 Thread Aniket Bhoi
Another thing I have noted is that the exception always follows a commit
operation. Log excerpt below:

INFO: SolrDeletionPolicy.onCommit: commits:num=2
commit{dir=/opt/solr/cores/calls/data/index,segFN=segments_2qt,version=1347458723267,generation=3557,filenames=[_3z9.tii,
_3z3.fnm, _3z9.nrm, _3za.prx, _3z9.fdt, _3z9.fnm, _3z9.fdx, _3z3.frq,
_3za.nrm, segments_2qt, _3z3.fdx, _3z9.prx, _3z3.fdt, _3za.fdx, _3z9.frq,
_3z3.prx, _3za.fdt, _3z3.tii, _3za.tis, _3za.fnm, _3z3.nrm, _3z9.tis,
_3za.tii, _3za.frq, _3z3.tis]
commit{dir=/opt/solr/cores/calls/data/index,segFN=segments_2qu,version=1347458723269,generation=3558,filenames=[_3zb.fdt,
_3z9.tii, _3z3.fnm, _3z9.nrm, _3zb.tii, _3zb.tis, _3zb.fdx, _3za.prx,
_3z9.fdt, _3z9.fnm, _3z9.fdx, _3zb.frq, _3z3.frq, _3za.nrm, segments_2qu,
_3z3.fdx, _3zb.prx, _3z9.prx, _3zb.fnm, _3z3.fdt, _3za.fdx, _3z9.frq,
_3z3.prx, _3za.fdt, _3zb.nrm, _3z3.tii, _3za.tis, _3za.fnm, _3z3.nrm,
_3z9.tis, _3za.tii, _3za.frq, _3z3.tis]
May 24, 2014 5:49:05 AM org.apache.solr.core.SolrDeletionPolicy
updateCommits
INFO: newest commit = 1347458723269
May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher <init>
INFO: Opening Searcher@423dbcca main
May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming Searcher@423dbcca main from Searcher@19c19869 main
fieldValueCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming result for Searcher@423dbcca main
fieldValueCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming Searcher@423dbcca main from Searcher@19c19869 main
filterCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming result for Searcher@423dbcca main
filterCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming Searcher@423dbcca main from Searcher@19c19869 main
queryResultCache{lookups=1,hits=1,hitratio=1.00,inserts=3,evictions=0,size=3,warmupTime=2,cumulative_lookups=47,cumulative_hits=46,cumulative_hitratio=0.97,cumulative_inserts=1,cumulative_evictions=0}
May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming result for Searcher@423dbcca main
queryResultCache{lookups=0,hits=0,hitratio=0.00,inserts=3,evictions=0,size=3,warmupTime=2,cumulative_lookups=47,cumulative_hits=46,cumulative_hitratio=0.97,cumulative_inserts=1,cumulative_evictions=0}
May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming Searcher@423dbcca main from Searcher@19c19869 main
documentCache{lookups=0,hits=0,hitratio=0.00,inserts=40,evictions=0,size=40,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher warm
INFO: autowarming result for Searcher@423dbcca main
documentCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
May 24, 2014 5:49:05 AM org.apache.solr.core.QuerySenderListener newSearcher
INFO: QuerySenderListener sending requests to Searcher@423dbcca main
May 24, 2014 5:49:05 AM org.apache.solr.core.SolrCore execute
INFO: [calls] webapp=null path=null
params={start=0&event=newSearcher&q=*:*&rows=20} hits=40028 status=0
QTime=2
May 24, 2014 5:49:05 AM org.apache.solr.update.DirectUpdateHandler2 commit
INFO: end_commit_flush
May 24, 2014 5:49:05 AM org.apache.solr.core.SolrCore execute
INFO: [calls] webapp=null path=null
params={start=0&event=newSearcher&q=banking&rows=20} hits=636 status=0
QTime=3
May 24, 2014 5:49:05 AM org.apache.solr.core.QuerySenderListener newSearcher
INFO: QuerySenderListener done.
May 24, 2014 5:49:05 AM
org.apache.solr.handler.component.SpellCheckComponent$SpellCheckerListener
newSearcher
INFO: Index is not optimized therefore skipping building spell check index
for: default
May 24, 2014 5:49:05 AM org.apache.solr.core.SolrCore registerSearcher
INFO: [calls] Registered new searcher Searcher@423dbcca main
May 24, 2014 5:49:05 AM org.apache.solr.search.SolrIndexSearcher close
INFO: Closing Searcher@19c19869 main
fieldValueCache{lookups=0,hits=0,hitratio=0.00,ins

Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-05-26 Thread Aniket Bhoi
On Thu, May 22, 2014 at 9:31 PM, Shawn Heisey  wrote:

> On 5/22/2014 8:31 AM, Aniket Bhoi wrote:
> > On Thu, May 22, 2014 at 7:13 PM, Shawn Heisey  wrote:
> >
> >> On 5/22/2014 1:53 AM, Aniket Bhoi wrote:
> >>> Details:
> >>>
> >>> *Solr Version:*
> >>> Solr Specification Version: 3.4.0.2012.01.23.14.08.01
> >>> Solr Implementation Version: 3.4
> >>> Lucene Specification Version: 3.4
> >>> Lucene Implementation Version: 3.4
> >>>
> >>> *Tomcat version:*
> >>> Apache Tomcat/6.0.18
> >>>
> >>> *OS details:*
> >>> SUSE Linux Enterprise Server 11 (x86_64)
> >>> VERSION = 11
> >>> PATCHLEVEL = 1
> >>>
> >>> While running indexing on this server, it failed.
> >>>
> >>> Log excerpt:
> >>>
> >>> Caused by:
> org.apache.solr.handler.dataimport.DataImportHandlerException:
> >>> com.microsoft.sqlserver.jdbc.SQLServerException: The result set is
> >> closed.
> >>>
> >>> Our initial hypothesis was that there is a problem with the connection
> >>> thread, so we made changes to the context.xml and added
> >>> validationQuery, testOnBorrow, etc. to make sure the thread doesn't
> >>> time out.
> >>> We also killed a lot of sleeping sessions from the server to the
> >>> database.
> >>> All of the above still didn't work.
> >> I have reduced your log excerpt to what I think is the important part.
> >>
> >> Removing the multithreaded support as others have suggested is a good
> >> idea, but what I think is really happening here is that Solr is engaging
> >> in a multi-tier merge, so it stops indexing for a while ... and
> >> meanwhile, JDBC times out and closes your database connection because of
> >> inactivity.  When the largest merge tier finishes, indexing tries to
> >> resume, which it can't do because the database connection is closed.
> >>
> >> The solution is to allow more simultaneous merges to happen, which
> >> allows indexing to continue while a multi-tier merge is underway.  This
> >> is my indexConfig section from solrconfig.xml:
> >>
> >> <indexConfig>
> >>   <mergePolicy class="org.apache.lucene.index.TieredMergePolicy">
> >>     <int name="maxMergeAtOnce">35</int>
> >>     <int name="segmentsPerTier">35</int>
> >>     <int name="maxMergeAtOnceExplicit">105</int>
> >>   </mergePolicy>
> >>   <mergeScheduler class="org.apache.lucene.index.ConcurrentMergeScheduler">
> >>     <int name="maxThreadCount">1</int>
> >>     <int name="maxMergeCount">6</int>
> >>   </mergeScheduler>
> >>   <ramBufferSizeMB>48</ramBufferSizeMB>
> >>   <infoStream>false</infoStream>
> >> </indexConfig>
> >>
> >> The important part for your purposes is the mergeScheduler config, and
> >> in particular, maxMergeCount.  Increase that to 6.  If you are using
> >> standard spinning hard disks, do not increase maxThreadCount beyond 1.
> >> If you are using SSD, you can safely increase that a small amount, but I
> >> don't think I'd go above 2 or 3.
> >>
> >> Thanks,
> >> Shawn
> >>
> >>
> > I may be missing something, or looking in the wrong place, but I cannot
> > find an indexConfig section or any of the other details mentioned above
> > in the solrconfig.xml file.
>
> Solr will work without one, in which case it will simply use the
> defaults.  With older 3.x versions the mergeScheduler config will
> actually need to go in an indexDefaults section.  The mainIndex and
> indexDefaults sections were deprecated in 3.6 and removed entirely in 4.x.
>
> https://issues.apache.org/jira/browse/SOLR-1052
>
> If you don't have indexDefaults either, you may need to add the config
> as a top-level element under <config>. If you do this, here's what you
> should add:
>
> <indexConfig>
>   <mergeScheduler class="org.apache.lucene.index.ConcurrentMergeScheduler">
>     <int name="maxThreadCount">1</int>
>     <int name="maxMergeCount">6</int>
>   </mergeScheduler>
> </indexConfig>
>
> I think we should probably change the default value that Solr uses for
> maxMergeCount.  This problem comes up fairly often.  As long as
> maxThreadCount is 1, I cannot think of a really good reason to limit
> maxMergeCount at the level that we currently do.
>
> Thanks,
> Shawn
>
>

I changed the solrconfig.xml file and included the changes you
suggested. However, this didn't work out; the indexing still fails. I have
also added the following to the *context.xml* file:

*maxActive="100" minIdle="10" maxWait="1" initialSize="10"
logAbandoned="true" validationQuery="select 1" testOnBorrow="true"
testOnReturn="true" validationQueryTimeout="30" removeAbandoned="true"
removeAbandonedTimeout="3600"*

This didn't work either. The thing to note is that after I changed
solrconfig.xml to add the merge config changes, the indexing failed after 6
hours. Earlier, it used to fail after 1 hour.


Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-05-22 Thread Shalin Shekhar Mangar
Shawn, can you open an issue so we don't forget about this?


On Thu, May 22, 2014 at 9:31 PM, Shawn Heisey  wrote:

> On 5/22/2014 8:31 AM, Aniket Bhoi wrote:
> > On Thu, May 22, 2014 at 7:13 PM, Shawn Heisey  wrote:
> >
> >> On 5/22/2014 1:53 AM, Aniket Bhoi wrote:
> >>> Details:
> >>>
> >>> *Solr Version:*
> >>> Solr Specification Version: 3.4.0.2012.01.23.14.08.01
> >>> Solr Implementation Version: 3.4
> >>> Lucene Specification Version: 3.4
> >>> Lucene Implementation Version: 3.4
> >>>
> >>> *Tomcat version:*
> >>> Apache Tomcat/6.0.18
> >>>
> >>> *OS details:*
> >>> SUSE Linux Enterprise Server 11 (x86_64)
> >>> VERSION = 11
> >>> PATCHLEVEL = 1
> >>>
> >>> While running indexing on this server, it failed.
> >>>
> >>> Log excerpt:
> >>>
> >>> Caused by:
> org.apache.solr.handler.dataimport.DataImportHandlerException:
> >>> com.microsoft.sqlserver.jdbc.SQLServerException: The result set is
> >> closed.
> >>>
> >>> Our initial hypothesis was that there is a problem with the connection
> >>> thread, so we made changes to the context.xml and added
> >>> validationQuery, testOnBorrow, etc. to make sure the thread doesn't
> >>> time out.
> >>> We also killed a lot of sleeping sessions from the server to the
> >>> database.
> >>> All of the above still didn't work.
> >> I have reduced your log excerpt to what I think is the important part.
> >>
> >> Removing the multithreaded support as others have suggested is a good
> >> idea, but what I think is really happening here is that Solr is engaging
> >> in a multi-tier merge, so it stops indexing for a while ... and
> >> meanwhile, JDBC times out and closes your database connection because of
> >> inactivity.  When the largest merge tier finishes, indexing tries to
> >> resume, which it can't do because the database connection is closed.
> >>
> >> The solution is to allow more simultaneous merges to happen, which
> >> allows indexing to continue while a multi-tier merge is underway.  This
> >> is my indexConfig section from solrconfig.xml:
> >>
> >> <indexConfig>
> >>   <mergePolicy class="org.apache.lucene.index.TieredMergePolicy">
> >>     <int name="maxMergeAtOnce">35</int>
> >>     <int name="segmentsPerTier">35</int>
> >>     <int name="maxMergeAtOnceExplicit">105</int>
> >>   </mergePolicy>
> >>   <mergeScheduler class="org.apache.lucene.index.ConcurrentMergeScheduler">
> >>     <int name="maxThreadCount">1</int>
> >>     <int name="maxMergeCount">6</int>
> >>   </mergeScheduler>
> >>   <ramBufferSizeMB>48</ramBufferSizeMB>
> >>   <infoStream>false</infoStream>
> >> </indexConfig>
> >>
> >> The important part for your purposes is the mergeScheduler config, and
> >> in particular, maxMergeCount.  Increase that to 6.  If you are using
> >> standard spinning hard disks, do not increase maxThreadCount beyond 1.
> >> If you are using SSD, you can safely increase that a small amount, but I
> >> don't think I'd go above 2 or 3.
> >>
> >> Thanks,
> >> Shawn
> >>
> >>
> > I may be missing something, or looking in the wrong place, but I cannot
> > find an indexConfig section or any of the other details mentioned above
> > in the solrconfig.xml file.
>
> Solr will work without one, in which case it will simply use the
> defaults.  With older 3.x versions the mergeScheduler config will
> actually need to go in an indexDefaults section.  The mainIndex and
> indexDefaults sections were deprecated in 3.6 and removed entirely in 4.x.
>
> https://issues.apache.org/jira/browse/SOLR-1052
>
> If you don't have indexDefaults either, you may need to add the config
> as a top-level element under <config>. If you do this, here's what you
> should add:
>
> <indexConfig>
>   <mergeScheduler class="org.apache.lucene.index.ConcurrentMergeScheduler">
>     <int name="maxThreadCount">1</int>
>     <int name="maxMergeCount">6</int>
>   </mergeScheduler>
> </indexConfig>
>
> I think we should probably change the default value that Solr uses for
> maxMergeCount.  This problem comes up fairly often.  As long as
> maxThreadCount is 1, I cannot think of a really good reason to limit
> maxMergeCount at the level that we currently do.
>
> Thanks,
> Shawn
>
>


-- 
Regards,
Shalin Shekhar Mangar.


Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-05-22 Thread Shawn Heisey
On 5/22/2014 8:31 AM, Aniket Bhoi wrote:
> On Thu, May 22, 2014 at 7:13 PM, Shawn Heisey  wrote:
>
>> On 5/22/2014 1:53 AM, Aniket Bhoi wrote:
>>> Details:
>>>
>>> *Solr Version:*
>>> Solr Specification Version: 3.4.0.2012.01.23.14.08.01
>>> Solr Implementation Version: 3.4
>>> Lucene Specification Version: 3.4
>>> Lucene Implementation Version: 3.4
>>>
>>> *Tomcat version:*
>>> Apache Tomcat/6.0.18
>>>
>>> *OS details:*
>>> SUSE Linux Enterprise Server 11 (x86_64)
>>> VERSION = 11
>>> PATCHLEVEL = 1
>>>
>>> While running indexing on this server, it failed.
>>>
>>> Log excerpt:
>>>
>>> Caused by: org.apache.solr.handler.dataimport.DataImportHandlerException:
>>> com.microsoft.sqlserver.jdbc.SQLServerException: The result set is
>> closed.
>>>
>>> Our initial hypothesis was that there is a problem with the connection
>>> thread, so we made changes to the context.xml and added
>>> validationQuery, testOnBorrow, etc. to make sure the thread doesn't
>>> time out.
>>> We also killed a lot of sleeping sessions from the server to the
>>> database.
>>> All of the above still didn't work.
>> I have reduced your log excerpt to what I think is the important part.
>>
>> Removing the multithreaded support as others have suggested is a good
>> idea, but what I think is really happening here is that Solr is engaging
>> in a multi-tier merge, so it stops indexing for a while ... and
>> meanwhile, JDBC times out and closes your database connection because of
>> inactivity.  When the largest merge tier finishes, indexing tries to
>> resume, which it can't do because the database connection is closed.
>>
>> The solution is to allow more simultaneous merges to happen, which
>> allows indexing to continue while a multi-tier merge is underway.  This
>> is my indexConfig section from solrconfig.xml:
>>
>> <indexConfig>
>>   <mergePolicy class="org.apache.lucene.index.TieredMergePolicy">
>>     <int name="maxMergeAtOnce">35</int>
>>     <int name="segmentsPerTier">35</int>
>>     <int name="maxMergeAtOnceExplicit">105</int>
>>   </mergePolicy>
>>   <mergeScheduler class="org.apache.lucene.index.ConcurrentMergeScheduler">
>>     <int name="maxThreadCount">1</int>
>>     <int name="maxMergeCount">6</int>
>>   </mergeScheduler>
>>   <ramBufferSizeMB>48</ramBufferSizeMB>
>>   <infoStream>false</infoStream>
>> </indexConfig>
>>
>> The important part for your purposes is the mergeScheduler config, and
>> in particular, maxMergeCount.  Increase that to 6.  If you are using
>> standard spinning hard disks, do not increase maxThreadCount beyond 1.
>> If you are using SSD, you can safely increase that a small amount, but I
>> don't think I'd go above 2 or 3.
>>
>> Thanks,
>> Shawn
>>
>>
> I may be missing something, or looking in the wrong place, but I cannot find
> an indexConfig section or any of the other details mentioned above in the
> solrconfig.xml file.

Solr will work without one, in which case it will simply use the
defaults.  With older 3.x versions the mergeScheduler config will
actually need to go in an indexDefaults section.  The mainIndex and
indexDefaults sections were deprecated in 3.6 and removed entirely in 4.x.

https://issues.apache.org/jira/browse/SOLR-1052

If you don't have indexDefaults either, you may need to add the config
as a top-level element under <config>. If you do this, here's what you
should add:

<indexConfig>
  <mergeScheduler class="org.apache.lucene.index.ConcurrentMergeScheduler">
    <int name="maxThreadCount">1</int>
    <int name="maxMergeCount">6</int>
  </mergeScheduler>
</indexConfig>
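
For the older 3.x case mentioned above, a sketch of the same mergeScheduler
placed inside indexDefaults (per the stock 3.x example solrconfig) would be:

<indexDefaults>
  <mergeScheduler class="org.apache.lucene.index.ConcurrentMergeScheduler">
    <int name="maxThreadCount">1</int>
    <int name="maxMergeCount">6</int>
  </mergeScheduler>
</indexDefaults>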

I think we should probably change the default value that Solr uses for
maxMergeCount.  This problem comes up fairly often.  As long as
maxThreadCount is 1, I cannot think of a really good reason to limit
maxMergeCount at the level that we currently do.

Thanks,
Shawn



Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-05-22 Thread Aniket Bhoi
On Thu, May 22, 2014 at 7:13 PM, Shawn Heisey  wrote:

> On 5/22/2014 1:53 AM, Aniket Bhoi wrote:
> > Details:
> >
> > *Solr Version:*
> > Solr Specification Version: 3.4.0.2012.01.23.14.08.01
> > Solr Implementation Version: 3.4
> > Lucene Specification Version: 3.4
> > Lucene Implementation Version: 3.4
> >
> > *Tomcat version:*
> > Apache Tomcat/6.0.18
> >
> > *OS details:*
> > SUSE Linux Enterprise Server 11 (x86_64)
> > VERSION = 11
> > PATCHLEVEL = 1
> >
> > While running indexing on this server, it failed.
> >
> > Log excerpt:
> >
> > Caused by: org.apache.solr.handler.dataimport.DataImportHandlerException:
> > com.microsoft.sqlserver.jdbc.SQLServerException: The result set is
> closed.
> >
> >
> > Our initial hypothesis was that there is a problem with the connection
> > thread, so we made changes to the context.xml and added
> > validationQuery, testOnBorrow, etc. to make sure the thread doesn't time
> > out.
> >
> > We also killed a lot of sleeping sessions from the server to the
> > database.
> >
> > All of the above still didn't work.
>
> I have reduced your log excerpt to what I think is the important part.
>
> Removing the multithreaded support as others have suggested is a good
> idea, but what I think is really happening here is that Solr is engaging
> in a multi-tier merge, so it stops indexing for a while ... and
> meanwhile, JDBC times out and closes your database connection because of
> inactivity.  When the largest merge tier finishes, indexing tries to
> resume, which it can't do because the database connection is closed.
>
> The solution is to allow more simultaneous merges to happen, which
> allows indexing to continue while a multi-tier merge is underway.  This
> is my indexConfig section from solrconfig.xml:
>
> <indexConfig>
>   <mergePolicy class="org.apache.lucene.index.TieredMergePolicy">
>     <int name="maxMergeAtOnce">35</int>
>     <int name="segmentsPerTier">35</int>
>     <int name="maxMergeAtOnceExplicit">105</int>
>   </mergePolicy>
>   <mergeScheduler class="org.apache.lucene.index.ConcurrentMergeScheduler">
>     <int name="maxThreadCount">1</int>
>     <int name="maxMergeCount">6</int>
>   </mergeScheduler>
>   <ramBufferSizeMB>48</ramBufferSizeMB>
>   <infoStream>false</infoStream>
> </indexConfig>
>
> The important part for your purposes is the mergeScheduler config, and
> in particular, maxMergeCount.  Increase that to 6.  If you are using
> standard spinning hard disks, do not increase maxThreadCount beyond 1.
> If you are using SSD, you can safely increase that a small amount, but I
> don't think I'd go above 2 or 3.
>
> Thanks,
> Shawn
>
>
I may be missing something, or looking in the wrong place, but I cannot find
an indexConfig section or any of the other details mentioned above in the
solrconfig.xml file.


Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-05-22 Thread Shawn Heisey
On 5/22/2014 1:53 AM, Aniket Bhoi wrote:
> Details:
> 
> *Solr Version:*
> Solr Specification Version: 3.4.0.2012.01.23.14.08.01
> Solr Implementation Version: 3.4
> Lucene Specification Version: 3.4
> Lucene Implementation Version: 3.4
> 
> *Tomcat version:*
> Apache Tomcat/6.0.18
> 
> *OS details:*
> SUSE Linux Enterprise Server 11 (x86_64)
> VERSION = 11
> PATCHLEVEL = 1
> 
> While running indexing on this server, it failed.
> 
> Log excerpt:
> 
> Caused by: org.apache.solr.handler.dataimport.DataImportHandlerException:
> com.microsoft.sqlserver.jdbc.SQLServerException: The result set is closed.
> 
> 
> Our initial hypothesis was that there is a problem with the connection
> thread, so we made changes to the context.xml and added
> validationQuery, testOnBorrow, etc. to make sure the thread doesn't time out.
> 
> We also killed a lot of sleeping sessions from the server to the database.
> 
> All of the above still didn't work.

I have reduced your log excerpt to what I think is the important part.

Removing the multithreaded support as others have suggested is a good
idea, but what I think is really happening here is that Solr is engaging
in a multi-tier merge, so it stops indexing for a while ... and
meanwhile, JDBC times out and closes your database connection because of
inactivity.  When the largest merge tier finishes, indexing tries to
resume, which it can't do because the database connection is closed.

The solution is to allow more simultaneous merges to happen, which
allows indexing to continue while a multi-tier merge is underway.  This
is my indexConfig section from solrconfig.xml:

<indexConfig>
  <mergePolicy class="org.apache.lucene.index.TieredMergePolicy">
    <int name="maxMergeAtOnce">35</int>
    <int name="segmentsPerTier">35</int>
    <int name="maxMergeAtOnceExplicit">105</int>
  </mergePolicy>
  <mergeScheduler class="org.apache.lucene.index.ConcurrentMergeScheduler">
    <int name="maxThreadCount">1</int>
    <int name="maxMergeCount">6</int>
  </mergeScheduler>
  <ramBufferSizeMB>48</ramBufferSizeMB>
  <infoStream>false</infoStream>
</indexConfig>

The important part for your purposes is the mergeScheduler config, and
in particular, maxMergeCount.  Increase that to 6.  If you are using
standard spinning hard disks, do not increase maxThreadCount beyond 1.
If you are using SSD, you can safely increase that a small amount, but I
don't think I'd go above 2 or 3.

Thanks,
Shawn



Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-05-22 Thread Mikhail Khludnev
As far as I remember (https://issues.apache.org/jira/browse/SOLR-3011),
threads didn't work at 3.4, and even had some minor issues at 3.6. Try to
run 3.6.1.

No threads in DIH in 4.x anymore
https://issues.apache.org/jira/browse/SOLR-3262


On Thu, May 22, 2014 at 11:58 AM, Shalin Shekhar Mangar <
shalinman...@gmail.com> wrote:

> You are running an ancient version of Solr plus the multi-threaded support
> in DataImportHandler was experimental at best and was removed a few
> versions later.
>
> Why don't you upgrade to a more recent version of Solr? At the very least,
> remove the threads settings from DIH.
>
>
> On Thu, May 22, 2014 at 1:23 PM, Aniket Bhoi 
> wrote:
>
> > I have Apache Solr hosted on my Apache Tomcat server with a SQL Server
> > backend.
> >
> >
> > Details:
> >
> > *Solr Version:*
> > Solr Specification Version: 3.4.0.2012.01.23.14.08.01
> > Solr Implementation Version: 3.4
> > Lucene Specification Version: 3.4
> > Lucene Implementation Version: 3.4
> >
> > *Tomcat version:*
> > Apache Tomcat/6.0.18
> >
> > *OS details:*
> > SUSE Linux Enterprise Server 11 (x86_64)
> > VERSION = 11
> > PATCHLEVEL = 1
> >
> > While running indexing on this server, it failed.
> >
> > Log excerpt:
> >
> > May 19, 2014 9:23:28 AM org.apache.solr.common.SolrException log
> > SEVERE: Full Import failed:java.lang.RuntimeException: Error in
> > multi-threaded import
> > at
> org.apache.solr.handler.dataimport.DocBuilder.doFullDump(Unknown
> > Source)
> > at org.apache.solr.handler.dataimport.DocBuilder.execute(Unknown
> > Source)
> > at
> > org.apache.solr.handler.dataimport.DataImporter.doFullImport(Unknown
> > Source)
> > at org.apache.solr.handler.dataimport.DataImporter.runCmd(Unknown
> > Source)
> > at org.apache.solr.handler.dataimport.DataImporter$1.run(Unknown
> > Source)
> > Caused by: org.apache.solr.handler.dataimport.DataImportHandlerException:
> > com.microsoft.sqlserver.jdbc.SQLServerException: The result set is
> closed.
> > at
> >
> >
> org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(Unknown
> > Source)
> > at
> >
> >
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(Unknown
> > Source)
> > at
> >
> >
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$500(Unknown
> > Source)
> > at
> >
> >
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(Unknown
> > Source)
> > at
> > org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(Unknown
> > Source)
> > at
> > org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(Unknown
> > Source)
> > at
> >
> >
> org.apache.solr.handler.dataimport.ThreadedEntityProcessorWrapper.nextRow(Unknown
> > Source)
> > at
> >
> >
> org.apache.solr.handler.dataimport.DocBuilder$EntityRunner.runAThread(Unknown
> > Source)
> > at
> >
> >
> org.apache.solr.handler.dataimport.DocBuilder$EntityRunner.access$000(Unknown
> > Source)
> > at
> > org.apache.solr.handler.dataimport.DocBuilder$EntityRunner$1.run(Unknown
> > Source)
> > at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown
> > Source)
> > at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown
> > Source)
> > at java.lang.Thread.run(Unknown Source)
> > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: The result
> set
> > is closed.
> > at
> >
> >
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(SQLServerException.java:170)
> > at
> >
> >
> com.microsoft.sqlserver.jdbc.SQLServerResultSet.checkClosed(SQLServerResultSet.java:346)
> > at
> >
> >
> com.microsoft.sqlserver.jdbc.SQLServerResultSet.next(SQLServerResultSet.java:916)
> > at
> >
> >
> org.apache.tomcat.dbcp.dbcp.DelegatingResultSet.next(DelegatingResultSet.java:174)
> > ... 12 more
> >
> >
> > SEVERE: Can not close connection
> > java.sql.SQLException: Already closed.
> > at
> >
> >
> org.apache.tomcat.dbcp.dbcp.PoolableConnection.close(PoolableConnection.java:84)
> > at
> >
> >
> org.apache.tomcat.dbcp.dbcp.PoolingDataSource$PoolGuardConnectionWrapper.close(PoolingDataSource.java:189)
> > at
> >
> >
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.closeResources(Unknown
> > Source)
> > at
> >
> >
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(Unknown
> > Source)
> > at
> >
> >
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$500(Unknown
> > Source)
> > at
> >
> >
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(Unknown
> > Source)
> > at
> > org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(Unknown
> > Source)
> > at
> > org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(Unknown
> > Source)
> > at
> >
> >
> org.apache

Re: Full Indexing fails on Solr-Probable connection issue.HELP!

2014-05-22 Thread Shalin Shekhar Mangar
You are running an ancient version of Solr plus the multi-threaded support
in DataImportHandler was experimental at best and was removed a few
versions later.

Why don't you upgrade to a more recent version of Solr? At the very least,
remove the threads settings from DIH.
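
For clarity, the setting being referred to is the experimental threads
attribute on the DIH entity; a hypothetical before/after sketch (entity and
query details made up):

<!-- Before: experimental multi-threaded import -->
<entity name="srch_call" threads="4"
        query="SELECT id, title, body FROM srch_call">
  <!-- field mappings here -->
</entity>

<!-- After: no threads attribute, so DIH processes the entity single-threaded -->
<entity name="srch_call"
        query="SELECT id, title, body FROM srch_call">
  <!-- field mappings here -->
</entity>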


On Thu, May 22, 2014 at 1:23 PM, Aniket Bhoi  wrote:

> I have Apache Solr hosted on my Apache Tomcat server with a SQL Server
> backend.
>
>
> Details:
>
> *Solr Version:*
> Solr Specification Version: 3.4.0.2012.01.23.14.08.01
> Solr Implementation Version: 3.4
> Lucene Specification Version: 3.4
> Lucene Implementation Version: 3.4
>
> *Tomcat version:*
> Apache Tomcat/6.0.18
>
> *OS details:*
> SUSE Linux Enterprise Server 11 (x86_64)
> VERSION = 11
> PATCHLEVEL = 1
>
> While running indexing on this server, it failed.
>
> Log excerpt:
>
> May 19, 2014 9:23:28 AM org.apache.solr.common.SolrException log
> SEVERE: Full Import failed:java.lang.RuntimeException: Error in
> multi-threaded import
> at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(Unknown
> Source)
> at org.apache.solr.handler.dataimport.DocBuilder.execute(Unknown
> Source)
> at
> org.apache.solr.handler.dataimport.DataImporter.doFullImport(Unknown
> Source)
> at org.apache.solr.handler.dataimport.DataImporter.runCmd(Unknown
> Source)
> at org.apache.solr.handler.dataimport.DataImporter$1.run(Unknown
> Source)
> Caused by: org.apache.solr.handler.dataimport.DataImportHandlerException:
> com.microsoft.sqlserver.jdbc.SQLServerException: The result set is closed.
> at
>
> org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(Unknown
> Source)
> at
>
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(Unknown
> Source)
> at
>
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$500(Unknown
> Source)
> at
>
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(Unknown
> Source)
> at
> org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(Unknown
> Source)
> at
> org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(Unknown
> Source)
> at
>
> org.apache.solr.handler.dataimport.ThreadedEntityProcessorWrapper.nextRow(Unknown
> Source)
> at
>
> org.apache.solr.handler.dataimport.DocBuilder$EntityRunner.runAThread(Unknown
> Source)
> at
>
> org.apache.solr.handler.dataimport.DocBuilder$EntityRunner.access$000(Unknown
> Source)
> at
> org.apache.solr.handler.dataimport.DocBuilder$EntityRunner$1.run(Unknown
> Source)
> at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown
> Source)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown
> Source)
> at java.lang.Thread.run(Unknown Source)
> Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: The result set
> is closed.
> at
>
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(SQLServerException.java:170)
> at
>
> com.microsoft.sqlserver.jdbc.SQLServerResultSet.checkClosed(SQLServerResultSet.java:346)
> at
>
> com.microsoft.sqlserver.jdbc.SQLServerResultSet.next(SQLServerResultSet.java:916)
> at
>
> org.apache.tomcat.dbcp.dbcp.DelegatingResultSet.next(DelegatingResultSet.java:174)
> ... 12 more
>
>
> SEVERE: Can not close connection
> java.sql.SQLException: Already closed.
> at
>
> org.apache.tomcat.dbcp.dbcp.PoolableConnection.close(PoolableConnection.java:84)
> at
>
> org.apache.tomcat.dbcp.dbcp.PoolingDataSource$PoolGuardConnectionWrapper.close(PoolingDataSource.java:189)
> at
>
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.closeResources(Unknown
> Source)
> at
>
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(Unknown
> Source)
> at
>
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$500(Unknown
> Source)
> at
>
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(Unknown
> Source)
> at
> org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(Unknown
> Source)
> at
> org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(Unknown
> Source)
> at
>
> org.apache.solr.handler.dataimport.ThreadedEntityProcessorWrapper.nextRow(Unknown
> Source)
> at
>
> org.apache.solr.handler.dataimport.DocBuilder$EntityRunner.runAThread(Unknown
> Source)
> at
>
> org.apache.solr.handler.dataimport.DocBuilder$EntityRunner.access$000(Unknown
> Source)
> at
> org.apache.solr.handler.dataimport.DocBuilder$EntityRunner$1.run(Unknown
> Source)
> at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown
> Source)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown
> Source)
> at java.lang.Thread.run(Unknown Source)
>
>
> Our initial hypothesis was that there