RE: RequestHandlerBase java.lang.NullPointerException Error
What Solr version? Solr 7.3.1.
What configuration? SolrCloud.
What is the document you are indexing and how do you send it to Solr? We are using middleware to update the data. Also, the issue started happening yesterday.

Thanks & Regards
Piyush

-----Original Message-----
From: Jan Høydahl
Sent: Wednesday, March 13, 2019 1:40 PM
To: solr-user@lucene.apache.org
Subject: [EXT] Re: RequestHandlerBase java.lang.NullPointerException Error

What Solr version? What configuration? What is the document you are indexing and how do you send it to Solr? Think this may be a known bug that is already fixed.

Jan Høydahl

> 13. mar. 2019 kl. 17:21 skrev Rathor, Piyush (US - Philadelphia):
>
> Facing following error suddenly for data update:
>
> null:java.lang.NullPointerException
> at org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.mapValueClassesToFieldType(AddSchemaFieldsUpdateProcessorFactory.java:509)
> at org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:396)
> [...]
Re: RequestHandlerBase java.lang.NullPointerException Error
What Solr version? What configuration? What is the document you are indexing and how do you send it to Solr? Think this may be a known bug that is already fixed.

Jan Høydahl

> 13. mar. 2019 kl. 17:21 skrev Rathor, Piyush (US - Philadelphia):
>
> Facing following error suddenly for data update:
>
> null:java.lang.NullPointerException
> at org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.mapValueClassesToFieldType(AddSchemaFieldsUpdateProcessorFactory.java:509)
> at org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:396)
> [...]
RequestHandlerBase java.lang.NullPointerException Error
Facing the following error suddenly for data update:

null:java.lang.NullPointerException
    at org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.mapValueClassesToFieldType(AddSchemaFieldsUpdateProcessorFactory.java:509)
    at org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:396)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.FieldNameMutatingUpdateProcessorFactory$1.processAdd(FieldNameMutatingUpdateProcessorFactory.java:75)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.AbstractDefaultValueUpdateProcessorFactory$DefaultValueUpdateProcessor.processAdd(AbstractDefaultValueUpdateProcessorFactory.java:92)
    at org.apache.solr.handler.loader.JsonLoader$SingleThreadedJsonLoader.handleAdds(JsonLoader.java:501)
    at org.apache.solr.handler.loader.JsonLoader$SingleThreadedJsonLoader.processUpdate(JsonLoader.java:145)
    at org.apache.solr.handler.loader.JsonLoader$SingleThreadedJsonLoader.load(JsonLoader.java:121)
    at org.apache.solr.handler.loader.JsonLoader.load(JsonLoader.java:84)
    at org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:97)
    at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:68)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:195)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:2503)
    at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:711)
    at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:517)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:384)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:330)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1629)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:533)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:190)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1595)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:188)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1253)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:168)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1564)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:166)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1155)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:219)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:126)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
    at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
    at org.eclipse.jetty.server.Server.handle(Server.java:530)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:347)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:256)
    at org.eclipse
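[Editor's note for the archive] The top frames of this trace point into Solr's schemaless ("add unknown fields") update chain. Since the poster's actual configuration was never shared, the fragment below is only an assumption for context: a sketch of what a typical schemaless chain looks like in solrconfig.xml around Solr 7.x. The AddSchemaFieldsUpdateProcessorFactory at the top of the trace is the processor that maps Java value classes of incoming field values to schema field types via typeMapping entries like these.

```xml
<!-- Sketch of a typical Solr 7.x schemaless update chain; this specific
     deployment's configuration is unknown, so treat names as illustrative. -->
<updateRequestProcessorChain name="add-unknown-fields-to-the-schema"
    default="${update.autoCreateFields:true}"
    processor="field-name-mutating,parse-boolean,parse-long,parse-double,parse-date,add-schema-fields">
  <processor class="solr.LogUpdateProcessorFactory"/>
  <processor class="solr.DistributedUpdateProcessorFactory"/>
  <processor class="solr.RunUpdateProcessorFactory"/>
</updateRequestProcessorChain>

<!-- The processor named in the trace: it resolves each incoming value's
     Java class (java.lang.String, java.lang.Boolean, ...) to a fieldType. -->
<updateProcessor class="solr.AddSchemaFieldsUpdateProcessorFactory" name="add-schema-fields">
  <lst name="typeMapping">
    <str name="valueClass">java.lang.String</str>
    <str name="fieldType">text_general</str>
    <bool name="default">true</bool>
  </lst>
  <lst name="typeMapping">
    <str name="valueClass">java.lang.Boolean</str>
    <str name="fieldType">booleans</str>
  </lst>
</updateProcessor>
```

Under that assumption, one thing worth checking is that every fieldType named in a typeMapping still exists in the managed schema, since mapValueClassesToFieldType has to look those types up; whether that is the same "known bug" Jan mentions is not established in this thread.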
Re: java.lang.NullPointerException at org.apache.solr.handler.component.QueryComponent.mergeIds(QueryComponent.java:421)
Hi, thanks for posting this. I was getting the same error and also had an id field with stored=false. -- Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html
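[Editor's note for the archive] The "stored false ID" both posters mention refers to the uniqueKey field. In distributed (SolrCloud) queries, QueryComponent.mergeIds merges per-shard results by reading each document's uniqueKey value back from the shard responses, so that field must be retrievable; if it is neither stored nor available via docValues, the merge step sees a null value. A minimal sketch of the relevant schema fragment, with the field name `id` assumed rather than taken from either poster's schema:

```xml
<!-- The uniqueKey field must be retrievable for distributed search:
     stored="true", or docValues="true" so the value can be read back. -->
<field name="id" type="string" indexed="true" stored="true"
       required="true" multiValued="false"/>
<uniqueKey>id</uniqueKey>
```

Changing stored="false" to stored="true" on the uniqueKey field requires reindexing for existing documents to become retrievable.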
Re: java.lang.NullPointerException in json facet hll function
Hi, Any updates on this issue? I am using Solr 6.3 and I have hit this same bug... Thanks -- View this message in context: http://lucene.472066.n3.nabble.com/java-lang-NullPointerException-in-json-facet-hll-function-tp4265378p4337877.html Sent from the Solr - User mailing list archive at Nabble.com.
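[Editor's note for the archive] For readers landing here from a search: the hll function in the JSON Facet API computes an approximate distinct-value count using HyperLogLog. A minimal request body using it looks like the sketch below; the field name `user_id` is made up for illustration and is not from the original report, so this exact request is not claimed to reproduce the NPE.

```json
{
  "query": "*:*",
  "limit": 0,
  "facet": {
    "unique_users": "hll(user_id)"
  }
}
```

This body is POSTed to the collection's /query endpoint (Content-Type: application/json).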
Re: Streaming Expressions (/stream) StreamHandler java.lang.NullPointerException
I've not been able to replicate the null pointer exception being seen. I created a new collection called EventsAndDCF with 4 shards and 3 replicas using a simple conf

$> /tmp/solr-go/bin/solr/bin/solr create -p 30001 -c EventsAndDCF -d ../../../test/main/conf/sample -n EventsAndDCF -shards 4 -replicationFactor 3

and then sent the same curl command Peter is sending

$> curl --data-urlencode 'expr=search(EventsAndDCF,q="*:*",fl="AccessPath",sort="AccessPath asc",qt="/export")' "http://localhost:30001/solr/EventsAndDCF/stream"

Because I haven't indexed any data into the collection, I get back an expected error that the sort field cannot be found, but this happens *after* the parsing of the expr param, so I know I cannot replicate the exception you appear to be seeing. I second Joel's question about being in SolrCloud mode - that is a requirement of streaming.

- Dennis

On Sun, Jun 26, 2016 at 8:51 PM, Joel Bernstein wrote:
> The NPE is showing that the expression clause is null. Are you in SolrCloud mode? This is required for Streaming Expressions.
>
> I would try sending the query via your browser also, just to make sure there isn't something we're missing in the curl syntax.
>
> You can call the /stream handler directly and pass the expr parameter. Most browsers will url encode the expression, but to be sure you can url encode the expression before sending it down.
>
> Joel Bernstein
> http://joelsolr.blogspot.com/
>
> On Sat, Jun 25, 2016 at 5:36 PM, Peter Sh wrote:
> > I've got an exception below running
> > curl --data-urlencode 'expr=search(EventsAndDCF,q="*:*",fl="AccessPath",sort="AccessPath asc",qt="/export")' "http://localhost:8983/solr/EventsAndDCF/stream"
> > Solr response:
> > {"result-set":{"docs":[
> > {"EXCEPTION":null,"EOF":true}]}}
> >
> > My collection EventsAndDCF exists, and I can successfully run GET queries like:
> > http://localhost:8983/solr/EventsAndDCF/export?fl=AccessPath&q=*:*&sort=AccessPath desc&wt=json
> >
> > Solr version: 6.0.1. Single node.
> >
> > 2016-06-25 21:15:44.147 ERROR (qtp1514322932-16) [ x:EventsAndDCF] o.a.s.h.StreamHandler java.lang.NullPointerException
> > at org.apache.solr.client.solrj.io.stream.expr.StreamExpressionParser.generateStreamExpression(StreamExpressionParser.java:46)
> > [...]
Re: Streaming Expressions (/stream) StreamHandler java.lang.NullPointerException
The NPE is showing that the expression clause is null. Are you in SolrCloud mode? This is required for Streaming Expressions.

I would try sending the query via your browser also, just to make sure there isn't something we're missing in the curl syntax.

You can call the /stream handler directly and pass the expr parameter. Most browsers will url encode the expression, but to be sure you can url encode the expression before sending it down.

Joel Bernstein
http://joelsolr.blogspot.com/

On Sat, Jun 25, 2016 at 5:36 PM, Peter Sh wrote:
> I've got an exception below running
> curl --data-urlencode 'expr=search(EventsAndDCF,q="*:*",fl="AccessPath",sort="AccessPath asc",qt="/export")' "http://localhost:8983/solr/EventsAndDCF/stream"
> Solr response:
> {"result-set":{"docs":[
> {"EXCEPTION":null,"EOF":true}]}}
>
> My collection EventsAndDCF exists, and I can successfully run GET queries like:
> http://localhost:8983/solr/EventsAndDCF/export?fl=AccessPath&q=*:*&sort=AccessPath desc&wt=json
>
> Solr version: 6.0.1. Single node.
>
> 2016-06-25 21:15:44.147 ERROR (qtp1514322932-16) [ x:EventsAndDCF] o.a.s.h.StreamHandler java.lang.NullPointerException
> at org.apache.solr.client.solrj.io.stream.expr.StreamExpressionParser.generateStreamExpression(StreamExpressionParser.java:46)
> [...]
>
> 2016-06-25 21:15:44.147 INFO (qtp1514322932-16) [ x:EventsAndDCF] o.a.s.c.S.Request [EventsAndDCF] webapp=/solr path=/stream params={'expr=search(EventsAndDCF,q%3D*:*,fl%3DAccessPath,sort%3DAccessPath+asc,qt%3D/export)'} status=0 QTime=2
Streaming Expressions (/stream) StreamHandler java.lang.NullPointerException
I've got an exception below running

curl --data-urlencode 'expr=search(EventsAndDCF,q="*:*",fl="AccessPath",sort="AccessPath asc",qt="/export")' "http://localhost:8983/solr/EventsAndDCF/stream"

Solr response:

{"result-set":{"docs":[
{"EXCEPTION":null,"EOF":true}]}}

My collection EventsAndDCF exists, and I can successfully run GET queries like:

http://localhost:8983/solr/EventsAndDCF/export?fl=AccessPath&q=*:*&sort=AccessPath desc&wt=json

Solr version: 6.0.1. Single node.

2016-06-25 21:15:44.147 ERROR (qtp1514322932-16) [ x:EventsAndDCF] o.a.s.h.StreamHandler java.lang.NullPointerException
    at org.apache.solr.client.solrj.io.stream.expr.StreamExpressionParser.generateStreamExpression(StreamExpressionParser.java:46)
    at org.apache.solr.client.solrj.io.stream.expr.StreamExpressionParser.parse(StreamExpressionParser.java:37)
    at org.apache.solr.client.solrj.io.stream.expr.StreamFactory.constructStream(StreamFactory.java:178)
    at org.apache.solr.handler.StreamHandler.handleRequestBody(StreamHandler.java:164)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:155)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:2053)
    at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:652)
    at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:460)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:229)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:184)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1668)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:581)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1160)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1092)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
    at org.eclipse.jetty.server.Server.handle(Server.java:518)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:308)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:244)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
    at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
    at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceAndRun(ExecuteProduceConsume.java:246)
    at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:156)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:654)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572)
    at java.lang.Thread.run(Unknown Source)

2016-06-25 21:15:44.147 INFO (qtp1514322932-16) [ x:EventsAndDCF] o.a.s.c.S.Request [EventsAndDCF] webapp=/solr path=/stream params={'expr=search(EventsAndDCF,q%3D*:*,fl%3DAccessPath,sort%3DAccessPath+asc,qt%3D/export)'} status=0 QTime=2
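[Editor's note for the archive] Joel's advice in this thread to url encode the expression yourself is worth taking literally. In the INFO log line above, the expr that Solr recorded is search(EventsAndDCF,q%3D*:*,...): the double quotes around the argument values are missing, which suggests the shell (not curl) stripped them before the request was sent, and that may be what the parser trips over. A quick way to check what a correctly encoded expr should look like, sketched in Python; the helper name and URL are illustrative, not from the thread:

```python
from urllib.parse import quote

# Hypothetical helper: build a /stream request URL with the expression
# percent-encoded, so shell quoting cannot mangle the inner double quotes.
def stream_url(base: str, expr: str) -> str:
    return base + "?expr=" + quote(expr, safe="")

expr = 'search(EventsAndDCF,q="*:*",fl="AccessPath",sort="AccessPath asc",qt="/export")'
url = stream_url("http://localhost:8983/solr/EventsAndDCF/stream", expr)

# The inner double quotes survive percent-encoding as %22; if %22 is absent
# from what your client actually sends, Solr never received the quotes.
print(url)
```

Comparing this output against the expr shown in Solr's request log (as in the INFO line above) makes it easy to spot quote-stripping by the shell.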
Re: Parallel SQL Interface returns "java.lang.NullPointerException" after reloading collection
Thanks Joel! :)

On Tue, 3 May 2016, 23:37 Joel Bernstein, wrote:
> Ryan, there is a patch (for the master branch) up on SOLR-9059 that resolves the issue. This will be in 6.1 and 6.0.1 if there is one. Thanks for the bug report!
>
> Joel Bernstein
> http://joelsolr.blogspot.com/
>
> [...]
Re: Parallel SQL Interface returns "java.lang.NullPointerException" after reloading collection
Ryan, there is a patch (for the master branch) up on SOLR-9059 that resolves the issue. This will be in 6.1 and 6.0.1 if there is one. Thanks for the bug report!

Joel Bernstein
http://joelsolr.blogspot.com/

On Tue, May 3, 2016 at 10:41 AM, Joel Bernstein wrote:
> I opened SOLR-9059.
>
> On Tue, May 3, 2016 at 10:31 AM, Joel Bernstein wrote:
>> What I believe is happening is that the core is closing on the reload, which is triggering the closeHook and shutting down all the connections in SolrClientCache.
>>
>> When the core reopens, the connections are all still closed because the SolrClientCache is instantiated statically with the creation of the StreamHandler.
>>
>> So I think the correct fix is to create the SolrClientCache in inform(); that way it will get recreated with each reload. As long as the closeHook has closed the existing SolrClientCache, this shouldn't cause any connection leaks with reloads.
>>
>> On Tue, May 3, 2016 at 10:01 AM, Joel Bernstein wrote:
>>> I'll look into this today.
>>>
>>> On Tue, May 3, 2016 at 9:22 AM, Kevin Risden wrote:
>>>> What I think is happening is that since the CloudSolrClient is from the SolrCache and the collection was reloaded, zkStateReader is actually null, since there was no cloudSolrClient.connect() call after the reload. I think that would cause the NPE on anything that uses the zkStateReader, like getClusterState().
>>>>
>>>> ZkStateReader zkStateReader = cloudSolrClient.getZkStateReader();
>>>> ClusterState clusterState = zkStateReader.getClusterState();
>>>>
>>>> Kevin Risden
>>>> Apache Lucene/Solr Committer
>>>> Hadoop and Search Tech Lead | Avalon Consulting, LLC
>>>>
>>>> On Mon, May 2, 2016 at 9:58 PM, Joel Bernstein wrote:
>>>> > Looks like the loop below is throwing a Null pointer. I suspect the collection has not yet come back online. In theory this should be self healing, and when the collection comes back online it should start working again. If not, then that would be a bug.
>>>> >
>>>> > for(String col : clusterState.getCollections()) {
>>>> >
>>>> > On Mon, May 2, 2016 at 10:06 PM, Ryan Yacyshyn wrote:
>>>> > > Yes, stack trace can be found here:
>>>> > > http://pastie.org/10821638
>>>> > >
>>>> > > On Mon, 2 May 2016 at 01:05 Joel Bernstein wrote:
>>>> > > > Can you post your stack trace? I suspect this has to do with how the Streaming API is interacting with SolrCloud. We can probably also create a jira ticket for this.
>>>> > > >
>>>> > > > Joel Bernstein
>>>> > > > http://
Re: Parallel SQL Interface returns "java.lang.NullPointerException" after reloading collection
I opened SOLR-9059.

Joel Bernstein
http://joelsolr.blogspot.com/

On Tue, May 3, 2016 at 10:31 AM, Joel Bernstein wrote:
> What I believe is happening is that the core is closing on the reload,
> which is triggering the closeHook and shutting down all the connections in
> SolrClientCache.
> [...]
Re: Parallel SQL Interface returns "java.lang.NullPointerException" after reloading collection
What I believe is happening is that the core is closing on the reload, which triggers the closeHook and shuts down all the connections in the SolrClientCache.

When the core reopens, the connections are all still closed, because the SolrClientCache is instantiated statically with the creation of the StreamHandler.

So I think the correct fix is to create the SolrClientCache in inform(); that way it will get recreated with each reload. As long as the closeHook has closed the existing SolrClientCache, this shouldn't cause any connection leaks with reloads.

Joel Bernstein
http://joelsolr.blogspot.com/

On Tue, May 3, 2016 at 10:01 AM, Joel Bernstein wrote:
> I'll look into this today.
> [...]
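The reload lifecycle Joel describes above can be illustrated with a toy model. This is a plain-Java sketch under assumed names — ClientCache, the two handler classes, and the onCoreClose/inform methods are hypothetical stand-ins for SolrClientCache, StreamHandler, the closeHook, and inform(), not Solr's actual code:

```java
public class ReloadLifecycleSketch {

    /** Stand-in for SolrClientCache: a pool of connections that can be closed. */
    public static class ClientCache {
        private boolean closed = false;
        public void close() { closed = true; }   // what the core's closeHook does
        public String query() {
            if (closed) throw new IllegalStateException("connection closed");
            return "ok";
        }
    }

    /** Buggy variant: the cache is created once, when the handler is constructed. */
    public static class StaticCacheHandler {
        private final ClientCache cache = new ClientCache(); // "instantiated statically"
        public void onCoreClose() { cache.close(); }         // closeHook fires on reload
        public void inform() { /* reload re-runs inform(), but the cache is not recreated */ }
        public String handle() { return cache.query(); }
    }

    /** Proposed fix: the cache is (re)created in inform(), so a reload gets a fresh one. */
    public static class InformCacheHandler {
        private ClientCache cache;
        public void onCoreClose() { if (cache != null) cache.close(); }
        public void inform() { cache = new ClientCache(); }  // fresh cache per (re)load
        public String handle() { return cache.query(); }
    }

    public static String simulateReload(boolean fixed) {
        if (fixed) {
            InformCacheHandler h = new InformCacheHandler();
            h.inform();       // initial load
            h.onCoreClose();  // RELOAD: core closes, closeHook closes the cache
            h.inform();       // RELOAD: core reopens, inform() runs again
            return h.handle();
        }
        StaticCacheHandler h = new StaticCacheHandler();
        h.inform();
        h.onCoreClose();
        h.inform();           // cache stays closed -> next request fails
        try {
            return h.handle();
        } catch (IllegalStateException e) {
            return "error: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println("static cache after reload:   " + simulateReload(false));
        System.out.println("inform() cache after reload: " + simulateReload(true));
    }
}
```

Running the sketch shows the static-cache variant failing after the simulated reload while the inform()-based variant recovers, which matches the observed "works again only after a full restart" behavior.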
Re: Parallel SQL Interface returns "java.lang.NullPointerException" after reloading collection
I'll look into this today.

Joel Bernstein
http://joelsolr.blogspot.com/

On Tue, May 3, 2016 at 9:22 AM, Kevin Risden wrote:
> What I think is happening is that since the CloudSolrClient is from the
> SolrClientCache and the collection was reloaded, zkStateReader is actually null,
> since there was no cloudSolrClient.connect() call after the reload.
> [...]
Re: Parallel SQL Interface returns "java.lang.NullPointerException" after reloading collection
What I think is happening is that, since the CloudSolrClient came from the SolrClientCache and the collection was reloaded, zkStateReader is actually null, because there was no cloudSolrClient.connect() call after the reload. I think that would cause the NPE on anything that uses the zkStateReader, like getClusterState().

ZkStateReader zkStateReader = cloudSolrClient.getZkStateReader();
ClusterState clusterState = zkStateReader.getClusterState();

Kevin Risden
Apache Lucene/Solr Committer
Hadoop and Search Tech Lead | Avalon Consulting, LLC <http://www.avalonconsult.com/>
M: 732 213 8417
LinkedIn <http://www.linkedin.com/company/avalon-consulting-llc> | Google+ <http://www.google.com/+AvalonConsultingLLC> | Twitter <https://twitter.com/avalonconsult>

-
This message (including any attachments) contains confidential information intended for a specific individual and purpose, and is protected by law. If you are not the intended recipient, you should delete this message. Any disclosure, copying, or distribution of this message, or the taking of any action based on it, is strictly prohibited.

On Mon, May 2, 2016 at 9:58 PM, Joel Bernstein wrote:
> Looks like the loop below is throwing a Null pointer. I suspect the
> collection has not yet come back online.
> [...]
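Kevin's "never reconnected" theory can be shown with a small simulation. This is a plain-Java sketch with hypothetical stand-in classes (FakeCloudClient, FakeStateReader) rather than SolrJ itself; the point is that a cached client whose connect() was never called after a reload hands back a null state reader, and the NPE surfaces at the first use:

```java
public class NullStateReaderSketch {

    public static class FakeStateReader {
        public String getClusterState() { return "clusterState"; }
    }

    public static class FakeCloudClient {
        private FakeStateReader reader;                        // null until connect()
        public void connect() { reader = new FakeStateReader(); }
        public FakeStateReader getZkStateReader() { return reader; }
    }

    /** Unguarded use: NPE when the cached client was never (re)connected. */
    public static String unguarded(FakeCloudClient client) {
        return client.getZkStateReader().getClusterState();    // throws NPE if not connected
    }

    /** Defensive variant: ensure connect() has run before touching the reader. */
    public static String guarded(FakeCloudClient client) {
        if (client.getZkStateReader() == null) {
            client.connect();
        }
        return client.getZkStateReader().getClusterState();
    }

    public static void main(String[] args) {
        FakeCloudClient stale = new FakeCloudClient();  // e.g. pulled from a cache after reload
        try {
            unguarded(stale);
            System.out.println("unguarded: ok");
        } catch (NullPointerException e) {
            System.out.println("unguarded: NPE");       // the failure mode in this thread
        }
        System.out.println("guarded: " + guarded(new FakeCloudClient()));
    }
}
```

The guard shown here is only a local workaround sketch; the fix that actually landed (SOLR-9059) recreates the cache per reload instead.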
Re: Parallel SQL Interface returns "java.lang.NullPointerException" after reloading collection
Looks like the loop below is throwing a NullPointerException. I suspect the collection has not yet come back online. In theory this should be self-healing: when the collection comes back online, it should start working again. If not, then that would be a bug.

for (String col : clusterState.getCollections()) {

Joel Bernstein
http://joelsolr.blogspot.com/

On Mon, May 2, 2016 at 10:06 PM, Ryan Yacyshyn wrote:
> Yes stack trace can be found here:
> http://pastie.org/10821638
> [...]
Re: Parallel SQL Interface returns "java.lang.NullPointerException" after reloading collection
Yes, the stack trace can be found here:

http://pastie.org/10821638

On Mon, 2 May 2016 at 01:05 Joel Bernstein wrote:
> Can you post your stack trace? I suspect this has to do with how the
> Streaming API is interacting with SolrCloud. We can probably also create a
> jira ticket for this.
> [...]
Re: Parallel SQL Interface returns "java.lang.NullPointerException" after reloading collection
Can you post your stack trace? I suspect this has to do with how the Streaming API is interacting with SolrCloud. We can probably also create a jira ticket for this.

Joel Bernstein
http://joelsolr.blogspot.com/

On Sun, May 1, 2016 at 4:02 AM, Ryan Yacyshyn wrote:
> Hi all,
>
> I'm exploring parallel SQL queries and found something strange after
> reloading the collection: the same query will return a
> java.lang.NullPointerException error.
> [...]
Parallel SQL Interface returns "java.lang.NullPointerException" after reloading collection
Hi all,

I'm exploring parallel SQL queries and found something strange after reloading the collection: the same query will return a java.lang.NullPointerException error. Here are my steps on a fresh install of Solr 6.0.0.

*Start Solr in cloud mode with the example*
bin/solr -e cloud -noprompt

*Index some data*
bin/post -c gettingstarted example/exampledocs/*.xml

*Send a query, which works*
curl --data-urlencode 'stmt=select id,name from gettingstarted where inStock = true limit 2' http://localhost:8983/solr/gettingstarted/sql

*Reload the collection*
curl 'http://localhost:8983/solr/admin/collections?action=RELOAD&name=gettingstarted'

After reloading, running the exact query above will return the null pointer exception error. Any idea why?

If I stop all Solr servers and restart, then it's fine.

*java -version*
java version "1.8.0_25"
Java(TM) SE Runtime Environment (build 1.8.0_25-b17)
Java HotSpot(TM) 64-Bit Server VM (build 25.25-b02, mixed mode)

Thanks,
Ryan
Re: java.lang.NullPointerException in json facet hll function
Nope, a normal query with wt=json; the q parameter is *:*.

The one peculiar thing about this index is that some docs have the field visitor__visitor_id as dynamic type long and others have the field as type string (our indexer tool didn't resolve the type correctly, as a result of a bug that was fixed later).

In fact, if I add q=visitor__visitor_id_l:* to the query, I get no error.

I think the problem is that I have the field "visitor__visitor_id" with _s and _l mixed in the index. But this should not be a problem, because they are two independent fields, should it?

--
/Yago Riveiro

> On Mar 22 2016, at 5:00 pm, Yonik Seeley <ysee...@gmail.com> wrote:
>
> Hmmm, looks like the "hll" value is missing for some reason. It's not clear
> why that would happen... are you running any custom code?
>
> -Yonik
> [...]
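The mixed-type situation above lines up with Yonik's observation that the "hll" value is missing: if some shards indexed the field as _s and others as _l, a shard can contribute a bucket with no hll entry, and the merge NPEs. The following is a plain-Java sketch of that merge-side failure under assumed structures — the Map-based "shard responses" and both merge methods are illustrative, not Solr's actual HLLAgg code:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class HllMergeSketch {

    /** Unguarded merge: assumes every shard contributed an hll count. */
    public static long mergeUnguarded(List<Map<String, Long>> shardBuckets) {
        long total = 0;
        for (Map<String, Long> bucket : shardBuckets) {
            total += bucket.get("hll");   // NPE (unboxing null) if a shard omitted it
        }
        return total;
    }

    /** Guarded merge: skip shards that had no value for the field. */
    public static long mergeGuarded(List<Map<String, Long>> shardBuckets) {
        long total = 0;
        for (Map<String, Long> bucket : shardBuckets) {
            Long v = bucket.get("hll");
            if (v != null) total += v;
        }
        return total;
    }

    /** One shard with docs indexed as _l, one whose docs went in as _s (no hll value). */
    public static List<Map<String, Long>> mixedShards() {
        Map<String, Long> shard1 = new HashMap<>();
        shard1.put("hll", 42L);
        Map<String, Long> shard2 = new HashMap<>();  // no "hll" entry at all
        return Arrays.asList(shard1, shard2);
    }

    public static void main(String[] args) {
        try {
            mergeUnguarded(mixedShards());
            System.out.println("unguarded: ok");
        } catch (NullPointerException e) {
            System.out.println("unguarded: NPE");    // the failure mode in this thread
        }
        System.out.println("guarded total: " + mergeGuarded(mixedShards()));
    }
}
```

This also matches the reported workaround: restricting the query with q=visitor__visitor_id_l:* removes the shards/buckets that contribute no hll value, so the unguarded merge never sees a null.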
Re: java.lang.NullPointerException in json facet hll function
Hmmm, looks like the "hll" value is missing for some reason. It's not clear why that would happen... are you running any custom code?

-Yonik

On Tue, Mar 22, 2016 at 12:54 PM, Yago Riveiro wrote:
> Solr version: 5.3.1
>
> With this query:
>
> {
>   group: {
>     type: terms,
>     limit: -1,
>     field: group,
>     sort: {index: asc},
>     numBuckets: true,
>     facet: {
>       col_1_unique_visitors: 'hll(visitor__visitor_id_l)'
>     }
>   }
> }
>
> visitor__visitor_id_l is a dynamic field.
>
> Running the query described above I'm hitting this exception:
>
> java.lang.NullPointerException
>     at org.apache.solr.search.facet.HLLAgg$Merger.merge(HLLAgg.java:86)
>     [...]
java.lang.NullPointerException in json facet hll function
Solr version: 5.3.1

With this query:

{
  group: {
    type: terms,
    limit: -1,
    field: group,
    sort: { index: asc },
    numBuckets: true,
    facet: {
      col_1_unique_visitors: 'hll(visitor__visitor_id_l)'
    }
  }
}

visitor__visitor_id_l is a dynamic field. Running the query described above I'm hitting this exception.

java.lang.NullPointerException
at org.apache.solr.search.facet.HLLAgg$Merger.merge(HLLAgg.java:86)
at org.apache.solr.search.facet.FacetBucket.mergeBucket(FacetModule.java:410)
at org.apache.solr.search.facet.FacetFieldMerger.mergeBucketList(FacetModule.java:510)
at org.apache.solr.search.facet.FacetFieldMerger.merge(FacetModule.java:488)
at org.apache.solr.search.facet.FacetFieldMerger.merge(FacetModule.java:462)
at org.apache.solr.search.facet.FacetBucket.mergeBucket(FacetModule.java:410)
at org.apache.solr.search.facet.FacetQueryMerger.merge(FacetModule.java:337)
at org.apache.solr.search.facet.FacetModule.handleResponses(FacetModule.java:178)
at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:410)
at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:143)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:2068)
at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:669)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:462)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:214)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:179)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.server.Server.handle(Server.java:499)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:310)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
at java.lang.Thread.run(Thread.java:745)

-
Best regards
--
View this message in context: http://lucene.472066.n3.nabble.com/java-lang-NullPointerException-in-json-facet-hll-function-tp4265378.html
Sent from the Solr - User mailing list archive at Nabble.com.
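For reference, a facet request of that shape is typically sent through Solr's json.facet parameter. A sketch only — host, collection, and handler path are placeholders, and it obviously needs a running Solr to execute:

```shell
# json.facet request carrying the terms facet with an hll() sub-aggregation
# (collection name "collection1" and localhost URL are assumptions)
curl http://localhost:8983/solr/collection1/query -d 'q=*:*&rows=0&json.facet={
  group: {
    type: terms,
    field: group,
    limit: -1,
    sort: { index: asc },
    numBuckets: true,
    facet: { col_1_unique_visitors: "hll(visitor__visitor_id_l)" }
  }
}'
```

In a distributed (sharded) setup, the per-shard hll results are merged in HLLAgg$Merger.merge, which is the frame where the NPE above is thrown.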
RE: java.lang.NullPointerException. I am trying to use CachedSqlEntityProcessor
I have solved this problem and am able to work with CachedSqlEntityProcessor successfully after a very long struggle. I tried this on 4.2. There still seem to be some existing bugs: 1. Whatever field you mention in cacheKey must appear explicitly in the select statement. 2. If I am correct, the field names in cacheKey and in the select statement are case sensitive. 3. We have an ID field in our table, and I tried cacheKey="ID", but that conflicted with the uniqueKey, as the uniqueKey is also "ID". So I wrote it as "SELECT ID AS AID, ..." and used cacheKey="AID". thanks Srini -- View this message in context: http://lucene.472066.n3.nabble.com/java-lang-NullPointerException-I-am-trying-to-use-CachedSqlEntityProcessor-tp4059815p4070059.html Sent from the Solr - User mailing list archive at Nabble.com.
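A minimal data-config.xml sketch of the pattern Srini describes — aliasing ID to AID so the cacheKey does not collide with the schema's uniqueKey. Table, column, and entity names here are illustrative, not taken from his actual config:

```xml
<entity name="article" query="SELECT ID, TITLE FROM ARTICLE">
  <!-- Child entity cached by CachedSqlEntityProcessor. The cacheKey field
       must appear explicitly (and with matching case) in the SELECT, so
       ID is aliased to AID to avoid clashing with uniqueKey "ID". -->
  <entity name="tags"
          processor="CachedSqlEntityProcessor"
          query="SELECT ID AS AID, TAG FROM TAGS"
          cacheKey="AID"
          cacheLookup="article.ID"/>
</entity>
```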
RE: java.lang.NullPointerException. I am trying to use CachedSqlEntityProcessor
If I remember correctly, 3.6 DIH had bugs related to CachedSqlEntityProcessor and some were fixed in 3.6.1, 3.6.2, but some were not fixed until 4.0. You might want to use a 3.5 DIH jar with your 3.6 Solr. Or, post your data-config.xml and maybe someone can figure something out. James Dyer Ingram Content Group (615) 213-4311 -Original Message- From: srinalluri [mailto:nallurisr...@yahoo.com] Sent: Tuesday, April 30, 2013 10:53 AM To: solr-user@lucene.apache.org Subject: RE: java.lang.NullPointerException. I am trying to use CachedSqlEntityProcessor Thanks James for your reply. I have updated to 3.6.2. Now the NullPointerException is gone. But the entities with CachedSqlEntityProcessor don't add anything to solr. And entities without CachedSqlEntityProcessor, are working fine. Why entities with CachedSqlEntityProcessor don't do anything? What is wrong in my entity? -- View this message in context: http://lucene.472066.n3.nabble.com/java-lang-NullPointerException-I-am-trying-to-use-CachedSqlEntityProcessor-tp4059815p4060043.html Sent from the Solr - User mailing list archive at Nabble.com.
RE: java.lang.NullPointerException. I am trying to use CachedSqlEntityProcessor
Thanks James for your reply. I have updated to 3.6.2. Now the NullPointerException is gone. But the entities with CachedSqlEntityProcessor don't add anything to solr. And entities without CachedSqlEntityProcessor, are working fine. Why entities with CachedSqlEntityProcessor don't do anything? What is wrong in my entity? -- View this message in context: http://lucene.472066.n3.nabble.com/java-lang-NullPointerException-I-am-trying-to-use-CachedSqlEntityProcessor-tp4059815p4060043.html Sent from the Solr - User mailing list archive at Nabble.com.
RE: java.lang.NullPointerException. I am trying to use CachedSqlEntityProcessor
This sounds like https://issues.apache.org/jira/browse/SOLR-3791, which was resolved in 3.6.2 / 4.0. James Dyer Ingram Content Group (615) 213-4311 -Original Message- From: srinalluri [mailto:nallurisr...@yahoo.com] Sent: Monday, April 29, 2013 11:41 AM To: solr-user@lucene.apache.org Subject: java.lang.NullPointerException. I am trying to use CachedSqlEntityProcessor I am in Solr 3.6.1. The following entity gives java.lang.NullPointerException. How to debug this? Here I am usingCachedSqlEntityProcessor. Here is the exception message: SEVERE: Exception while processing: vig8-article-mon document : SolrInputDocument[{repository=repository(1.0)={vig8qamon}, native_id=native_id(1.0)={8f2210474fea2310VgnVCM10d1c1a8c0RCRD}, content_type=content_type(1.0)={article}}]:org.apache.solr.handler.dataimport.DataImportHandlerException: java.lang.NullPointerException at org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(DataImportHandlerException.java:64) at org.apache.solr.handler.dataimport.EntityProcessorWrapper.pullRow(EntityProcessorWrapper.java:333) at org.apache.solr.handler.dataimport.EntityProcessorWrapper.nextRow(EntityProcessorWrapper.java:296) at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:683) at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:709) at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:619) at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:327) at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:225) at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:375) at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:445) at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:426) Caused by: java.lang.NullPointerException at java.util.TreeMap.getEntry(TreeMap.java:342) at java.util.TreeMap.get(TreeMap.java:273) at 
org.apache.solr.handler.dataimport.SortedMapBackedCache.add(SortedMapBackedCache.java:57) at org.apache.solr.handler.dataimport.DIHCacheSupport.populateCache(DIHCacheSupport.java:124) at org.apache.solr.handler.dataimport.DIHCacheSupport.getIdCacheData(DIHCacheSupport.java:176) at org.apache.solr.handler.dataimport.DIHCacheSupport.getCacheData(DIHCacheSupport.java:145) at org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:132) at org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:75) at org.apache.solr.handler.dataimport.EntityProcessorWrapper.pullRow(EntityProcessorWrapper.java:330) -- View this message in context: http://lucene.472066.n3.nabble.com/java-lang-NullPointerException-I-am-trying-to-use-CachedSqlEntityProcessor-tp4059815.html Sent from the Solr - User mailing list archive at Nabble.com.
java.lang.NullPointerException. I am trying to use CachedSqlEntityProcessor
I am on Solr 3.6.1. The following entity gives java.lang.NullPointerException. How do I debug this? Here I am using CachedSqlEntityProcessor. Here is the exception message:

SEVERE: Exception while processing: vig8-article-mon document : SolrInputDocument[{repository=repository(1.0)={vig8qamon}, native_id=native_id(1.0)={8f2210474fea2310VgnVCM10d1c1a8c0RCRD}, content_type=content_type(1.0)={article}}]:org.apache.solr.handler.dataimport.DataImportHandlerException: java.lang.NullPointerException
at org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(DataImportHandlerException.java:64)
at org.apache.solr.handler.dataimport.EntityProcessorWrapper.pullRow(EntityProcessorWrapper.java:333)
at org.apache.solr.handler.dataimport.EntityProcessorWrapper.nextRow(EntityProcessorWrapper.java:296)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:683)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:709)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:619)
at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:327)
at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:225)
at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:375)
at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:445)
at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:426)
Caused by: java.lang.NullPointerException
at java.util.TreeMap.getEntry(TreeMap.java:342)
at java.util.TreeMap.get(TreeMap.java:273)
at org.apache.solr.handler.dataimport.SortedMapBackedCache.add(SortedMapBackedCache.java:57)
at org.apache.solr.handler.dataimport.DIHCacheSupport.populateCache(DIHCacheSupport.java:124)
at org.apache.solr.handler.dataimport.DIHCacheSupport.getIdCacheData(DIHCacheSupport.java:176)
at org.apache.solr.handler.dataimport.DIHCacheSupport.getCacheData(DIHCacheSupport.java:145)
at org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:132)
at org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:75)
at org.apache.solr.handler.dataimport.EntityProcessorWrapper.pullRow(EntityProcessorWrapper.java:330)
--
View this message in context: http://lucene.472066.n3.nabble.com/java-lang-NullPointerException-I-am-trying-to-use-CachedSqlEntityProcessor-tp4059815.html
Sent from the Solr - User mailing list archive at Nabble.com.
java.lang.NullPointerException withs stats component and shards
Hi, I have a problem with the Stats component in a sharded environment. Solr throws java.lang.NullPointerException when there are no results and the statistic is computed over a date field.

price ddate true *:* date:[2013-03-23T00:00:00Z TO *] price:[5000 TO *] 10

java.lang.NullPointerException
at org.apache.solr.handler.component.DateStatsValues.updateTypeSpecificStats(StatsValuesFactory.java:340)
at org.apache.solr.handler.component.AbstractStatsValues.accumulate(StatsValuesFactory.java:106)
at org.apache.solr.handler.component.StatsComponent.handleResponses(StatsComponent.java:112)
at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:311)
at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1797)
at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:637)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:343)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:141)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1307)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:453)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:560)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1072)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:382)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1006)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:154)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
at org.eclipse.jetty.server.Server.handle(Server.java:365)
at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:485)
at org.eclipse.jetty.server.BlockingHttpConnection.handleRequest(BlockingHttpConnection.java:53)
at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:926)
at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:988)
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:635)
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
at org.eclipse.jetty.server.BlockingHttpConnection.handle(BlockingHttpConnection.java:72)
at org.eclipse.jetty.server.bio.SocketConnector$ConnectorEndPoint.run(SocketConnector.java:264)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Thread.java:662)
500

The same query without shards acts normally:

price ddate true *:* ddate:[2013-03-23T00:00:00Z TO *] price:[5000 TO *] 10 0 0

I've tested it on Solr 4.0 and then on Solr 4.2 and the problem still exists. Regards Agnieszka Kukałowicz
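The request parameters in the message above lost their XML markup in the archive. Reassembled as a plain query string — a sketch only, using standard StatsComponent parameter names; the host, path, and shard list are placeholders:

```text
http://localhost:8983/solr/select?q=*:*
    &fq=ddate:[2013-03-23T00:00:00Z TO *]
    &fq=price:[5000 TO *]
    &rows=10
    &stats=true
    &stats.field=price
    &stats.field=ddate
    &shards=shard1:8983/solr,shard2:8983/solr
```

With shards set and zero matching documents, the distributed merge in StatsComponent.handleResponses is where the NPE in the trace is thrown; the single-node path never reaches that merge, which is consistent with the unsharded query working.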
RE: solr java.lang.NullPointerException on select queries
Regarding the large number of files, even after optimize, we found that when rebuilding a large, experimental 1.7TB index on Solr 3.5, instead of Solr 1.4.1, there were a ton of index files, thousands, in 3.5, when there used to be just 10 (or 11?) segments worth (as expected with mergeFactor set to 10) in 1.4.1. The apparent cause was a Solr switch to use TieredMergePolicy by default somewhere in the 3.x version series. TieredMergePolicy has a default segment size limit of 5GB, so if your index goes over 50GB, a mergeFactor of 10 effectively gets ignored. We remedied this by explicitly configuring TieredMergePolicy's segment size (and some other things that may or may not be making a difference) in solrconfig.xml: 10 10 3 -- Bryan > -Original Message- > From: avenka [mailto:ave...@gmail.com] > Sent: Tuesday, June 26, 2012 8:46 AM > To: solr-user@lucene.apache.org > Subject: Re: solr java.lang.NullPointerException on select queries > > So, I tried 'optimize', but it failed because of lack of space on the > first > machine. I then moved the whole thing to a different machine where the > index > was pretty much the only thing and was using about 37% of disk, but it > still > failed because of a "No space left on device" IOException. Also, the size > of > the index has since doubled to roughly 74% of the disk on this second > machine now and the number of files has increased from 3289 to 3329. > Actually even the 3289 files on the first machine were after I tried > optimize on it once, so the "original" size must have been even smaller. > > I don't think I can afford any more space and am close to giving up and > reclaiming space on the two machines. A couple more questions before that: > > 1) I am tempted to try editing binary--the "magnetic needle" option. Could > you elaborate on this? Would there be a way to go back to an index that is > the original size from its super-sized current form(s)? > > 2) Will CheckIndex also need more than twice the space? 
Would there be a > way > to bring down the size to the original size without running 'optimize' if > I > try that route? Also how exactly do I run CheckIndex, e.g., the exact URL > I > need to hit? > > > -- > View this message in context: http://lucene.472066.n3.nabble.com/solr- > java-lang-NullPointerException-on-select-queries-tp3989974p3991400.html > Sent from the Solr - User mailing list archive at Nabble.com.
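The three numeric values in Bryan's message lost their XML tags in the archive, so the exact settings he used cannot be recovered. A hedged sketch of the kind of solrconfig.xml block he describes, with element names taken from Lucene's TieredMergePolicy and illustrative values:

```xml
<indexConfig>
  <mergePolicy class="org.apache.lucene.index.TieredMergePolicy">
    <!-- raise the per-segment size cap (default ~5GB) so an index larger
         than ~50GB does not effectively ignore a mergeFactor of 10 -->
    <double name="maxMergedSegmentMB">20480.0</double>
    <int name="maxMergeAtOnce">10</int>
    <double name="segmentsPerTier">10.0</double>
  </mergePolicy>
</indexConfig>
```

The key knob is maxMergedSegmentMB: once segments hit the default cap, TieredMergePolicy stops merging them further, which matches the explosion in file count Bryan observed after the 3.x default policy change.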
Re: solr java.lang.NullPointerException on select queries
Well, you'd have to understand the whole way the index structure is laid out to do binary editing, and I don't know it well enough to even offer a rough idea. There are detailed docs hanging around _somewhere_ that will give you the formats, or you could go at the code. But that's probably pretty hairy. I'm surprised you're getting that many files, something sounds really screwy. I'm not at all surprised that the index tries to double _temporarily_ when optimizing, that's expected behavior. But it should go back down after the optimization/forcemerge is complete. Problem is that now you've got some garbage left around that prevents your original index from being copied while optimizing, and getting rid of the unnecessary files seems fraught. I'm also surprised that you wound up with a space problem when you tried to optimize unless you have compound file format turned on, which I doubt. Could you wipe the index from that disk and re-copy the original and try again? So I'm going to chicken out and leave it to my betters to offer any advice, I don't play in the lower-level file structures. Sorry I can't be more help Erick On Tue, Jun 26, 2012 at 11:46 AM, avenka wrote: > So, I tried 'optimize', but it failed because of lack of space on the first > machine. I then moved the whole thing to a different machine where the index > was pretty much the only thing and was using about 37% of disk, but it still > failed because of a "No space left on device" IOException. Also, the size of > the index has since doubled to roughly 74% of the disk on this second > machine now and the number of files has increased from 3289 to 3329. > Actually even the 3289 files on the first machine were after I tried > optimize on it once, so the "original" size must have been even smaller. > > I don't think I can afford any more space and am close to giving up and > reclaiming space on the two machines. 
A couple more questions before that: > > 1) I am tempted to try editing binary--the "magnetic needle" option. Could > you elaborate on this? Would there be a way to go back to an index that is > the original size from its super-sized current form(s)? > > 2) Will CheckIndex also need more than twice the space? Would there be a way > to bring down the size to the original size without running 'optimize' if I > try that route? Also how exactly do I run CheckIndex, e.g., the exact URL I > need to hit? > > > -- > View this message in context: > http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3991400.html > Sent from the Solr - User mailing list archive at Nabble.com.
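On the CheckIndex question from the quoted message: CheckIndex is a command-line tool shipped in the Lucene core jar, not an HTTP endpoint. A sketch of a typical invocation — the jar version and index path are placeholders, and it should be run against a copy with Solr stopped:

```shell
# read-only inspection of the index directory (no -fix: nothing is modified)
java -cp lucene-core-3.5.0.jar org.apache.lucene.index.CheckIndex /path/to/solr/data/index

# with -fix it will drop unreadable segments -- destructive, back up first
# java -cp lucene-core-3.5.0.jar org.apache.lucene.index.CheckIndex /path/to/solr/data/index -fix
```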
Re: solr java.lang.NullPointerException on select queries
So, I tried 'optimize', but it failed because of lack of space on the first machine. I then moved the whole thing to a different machine where the index was pretty much the only thing and was using about 37% of disk, but it still failed because of a "No space left on device" IOException. Also, the size of the index has since doubled to roughly 74% of the disk on this second machine now and the number of files has increased from 3289 to 3329. Actually even the 3289 files on the first machine were after I tried optimize on it once, so the "original" size must have been even smaller. I don't think I can afford any more space and am close to giving up and reclaiming space on the two machines. A couple more questions before that: 1) I am tempted to try editing binary--the "magnetic needle" option. Could you elaborate on this? Would there be a way to go back to an index that is the original size from its super-sized current form(s)? 2) Will CheckIndex also need more than twice the space? Would there be a way to bring down the size to the original size without running 'optimize' if I try that route? Also how exactly do I run CheckIndex, e.g., the exact URL I need to hit? -- View this message in context: http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3991400.html Sent from the Solr - User mailing list archive at Nabble.com.
Re: solr java.lang.NullPointerException on select queries
Right, if you optimize, at the end maxDocs should == numDocs. Usually the document reclamation stuff is done when segments merge, but that won't happen in this case since this index is becoming static, so a manual optimize is probably indicated. Something like this should also work, either way: http://localhost:8983/solr/update?stream.body= But be prepared to wait for a very long time. I'd copy it somewhere else first just for safety's sake Best Erick On Thu, Jun 21, 2012 at 12:52 PM, avenka wrote: > Erick, much thanks for detailing these options. I am currently trying the > second one as that seems a little easier and quicker to me. > > I successfully deleted documents with IDs after the problem time that I do > know to an accuracy of a couple hours. Now, the stats are: > numDocs : 2132454075 > maxDoc : -2130733352 > The former is nicely below 2^31. But I can't seem to get the latter to > "decrease" and become positive by deleting further. > > Should I just run an optimize at this point? I have never manually run an > optimize and plan to just hit > http:///solr/update?optimize=true > Can you confirm this? > > -- > View this message in context: > http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3990798.html > Sent from the Solr - User mailing list archive at Nabble.com.
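The stream.body value in Erick's URL above was stripped by the archive. Spelled out as XML update messages posted to /solr/update, the sequence being discussed would look roughly like this — the date field name and cutoff are placeholders for whatever marks the post-wraparound documents:

```xml
<!-- 1) delete everything indexed after the suspected wraparound point -->
<delete><query>indexed_at:[2012-06-01T00:00:00Z TO *]</query></delete>

<!-- 2) commit, then merge away the deleted docs so maxDocs drops to numDocs -->
<commit/>
<optimize/>
```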
Re: solr java.lang.NullPointerException on select queries
Erick, much thanks for detailing these options. I am currently trying the second one as that seems a little easier and quicker to me. I successfully deleted documents with IDs after the problem time that I do know to an accuracy of a couple hours. Now, the stats are: numDocs : 2132454075 maxDoc : -2130733352 The former is nicely below 2^31. But I can't seem to get the latter to "decrease" and become positive by deleting further. Should I just run an optimize at this point? I have never manually run an optimize and plan to just hit http:///solr/update?optimize=true Can you confirm this? -- View this message in context: http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3990798.html Sent from the Solr - User mailing list archive at Nabble.com.
Re: solr java.lang.NullPointerException on select queries
Ah, OK, I misunderstood. OK, here's a couple of off-the-top-of-my-head ideas. Make a backup of your index before anything else ...

Split up your current index into two parts by segments. That is, copy the whole directory to another place, and remove some of the segments from each. I.e. when you're done, you'll still have all the segments you used to have, but some of them will be in one directory and some in another. Of course all of the segments files with a common prefix should be in place (e.g. all the _0.* files in the same dir, not split between the two dirs). Now run CheckIndex on them. That'll take a long time, but it _should_ spoof Solr/Lucene into thinking that there are two complete indexes out there. Now your idea of having an archival search should work, but with two places to look, not one.

NOTE: Whether this plays nice with the over 2B docs or deleted documents I can't guarantee. I believe that the deleted docs are per-segment; if so, this should be fine. This won't work if you've recently optimized. When you're done you should have two cores out there (hmmm, these could also be treated as shards?) that you point your Solr at. You might want to optimize in this case when you're done. I suspect you could, with a magnetized needle and a steady hand, edit some of the auxiliary files (segments*) but I would feel more secure letting CheckIndex do the heavy lifting.

Here's another possibility
> Try a delete-by-query from a bit before the date you think things went over 2B to now (really hope you have a date!)
> perhaps you can walk the underlying index in Lucene somehow and make this work if you don't have a date. Since the underlying Lucene IDs are segment_base + local_segment_count this should be safely under 2B but I'm reaching here into areas I don't know much about.
> optimize (and wait. probably a really long time). 
> re-index everything after the date (or whatever) you used above into a new > shard > now treat the big index just as you were talking about. Please understand that the over 2B docs might cause some grief here, but since the underlying index is segment based (i.e. the internal Lucene doc IDs are a base+offset for each segment), this has a decent chance of working (but anyone who really understands, please chime in. I'm reaching). Oh, and if it works, please let us know... Best Erick On Wed, Jun 20, 2012 at 6:37 PM, avenka wrote: > Erick, thanks for the advice, but let me make sure you haven't misunderstood > what I was asking. > > I am not trying to split the huge existing index in install1 into shards. I > am also not trying to make the huge install1 index as one shard of a sharded > solr setup. I plan to use a sharded setup only for future docs. > > I do want to avoid trying to re-index the docs in install1 and think of them > as a slow "tape archive" index server if I ever need to go and query the > past documents. So I was wondering if I could somehow use the existing > segment files to run an isolated (unsharded) solr server that lets me query > roughly the first 2B docs before the wraparound problem happened. If the > "negative" internal doc IDs have pervasively corrupted the segment files, > this would not be possible, but I am not able to imagine an underlying > lucene design that would cause such a problem. Is my only option to re-index > the past 2B docs if I want to be able to query them at this point or is > there any way to use the existing segment files? > > -- > View this message in context: > http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3990615.html > Sent from the Solr - User mailing list archive at Nabble.com.
Re: solr java.lang.NullPointerException on select queries
Erick, thanks for the advice, but let me make sure you haven't misunderstood what I was asking. I am not trying to split the huge existing index in install1 into shards. I am also not trying to make the huge install1 index as one shard of a sharded solr setup. I plan to use a sharded setup only for future docs. I do want to avoid trying to re-index the docs in install1 and think of them as a slow "tape archive" index server if I ever need to go and query the past documents. So I was wondering if I could somehow use the existing segment files to run an isolated (unsharded) solr server that lets me query roughly the first 2B docs before the wraparound problem happened. If the "negative" internal doc IDs have pervasively corrupted the segment files, this would not be possible, but I am not able to imagine an underlying lucene design that would cause such a problem. Is my only option to re-index the past 2B docs if I want to be able to query them at this point or is there any way to use the existing segment files? -- View this message in context: http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3990615.html Sent from the Solr - User mailing list archive at Nabble.com.
Re: solr java.lang.NullPointerException on select queries
Don't even try to do that. First of all, you have to have a reliable way to index the same docs to the same shards. The docs are all mixed up in the segment files and that would lead to chaos. Solr/Lucene report the same doc multiple times if it's in different shards, so if you ever updated a document, you wouldn't know what shard to send it to. Second, the segments are all parts of a single index, and Solr (well, actually Lucene) expects them to be consistent. Putting some on one shard and some on another would probably not allow Solr to start (but I confess I've never tried that). So I really wouldn't even try to go there. Best Erick On Wed, Jun 20, 2012 at 12:35 PM, avenka wrote: > Thanks. Do you know if the tons of index files with names like '_zxt.tis' in > the index/data/ directory have the lucene IDs embedded in the binaries? The > files look good to me and are partly readable even if in binary. I am > wondering if I could just set up a new solr instance and move these index > files there and hope to use them (or most of them) as is without shards? If > so, I will just set up a separate sharded index for the documents indexed > henceforth, but won't bother splitting the huge existing index. > > > -- > View this message in context: > http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3990560.html > Sent from the Solr - User mailing list archive at Nabble.com.
Re: solr java.lang.NullPointerException on select queries
Thanks. Do you know if the tons of index files with names like '_zxt.tis' in the index/data/ directory have the lucene IDs embedded in the binaries? The files look good to me and are partly readable even if in binary. I am wondering if I could just set up a new solr instance and move these index files there and hope to use them (or most of them) as is without shards? If so, I will just set up a separate sharded index for the documents indexed henceforth, but won't bother splitting the huge existing index. -- View this message in context: http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3990560.html Sent from the Solr - User mailing list archive at Nabble.com.
Re: solr java.lang.NullPointerException on select queries
That indeed sucks. But I don't personally know of a good way to try to split apart an existing index into shards. I'm afraid you're going to be stuck with re-indexing. Wish I had a better solution. Erick On Wed, Jun 20, 2012 at 10:45 AM, avenka wrote: > Yes, wonky indeed. > numDocs : -2006905329 > maxDoc : -1993357870 > > And yes, I meant that the holes are in the database auto-increment ID space, > nothing to do with lucene IDs. > > I will set up sharding. But is there any way to retrieve most of the current > index? Currently, all select queries even in ranges in the hundreds of > millions return the NullPointerException. It would suck to lose all of this. > :( > > -- > View this message in context: > http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3990542.html > Sent from the Solr - User mailing list archive at Nabble.com.
Re: solr java.lang.NullPointerException on select queries
Yes, wonky indeed. numDocs : -2006905329 maxDoc : -1993357870 And yes, I meant that the holes are in the database auto-increment ID space, nothing to do with lucene IDs. I will set up sharding. But is there any way to retrieve most of the current index? Currently, all select queries even in ranges in the hundreds of millions return the NullPointerException. It would suck to lose all of this. :( -- View this message in context: http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3990542.html Sent from the Solr - User mailing list archive at Nabble.com.
Re: solr java.lang.NullPointerException on select queries
Let's make sure we're talking about the same thing. Solr happily indexes and stores long (64-bit) values, no problem. What it doesn't do is assign _internal_ document IDs as longs; those are ints. On admin/statistics, look at maxDoc and numDocs. maxDoc + 1 will be the next _internal_ Lucene doc ID assigned, so if that's wonky or > 2B, this is where the rub happens. BTW, the difference between numDocs and maxDoc is the number of documents deleted from your index. If your number of current documents is much smaller than 2B, you can get maxDoc to equal numDocs if you optimize, and get yourself some more headroom. Whether your index will be OK after that, I'm not prepared to guarantee though... But if I'm reading your notes correctly, the "85% holes" applies to a value in your document, and has nothing to do with the internal Lucene ID issue. Internally, though, the int limit isn't robustly enforced, so I'm not surprised that it pops out (if, indeed, this is your problem) in odd places. Best Erick On Wed, Jun 20, 2012 at 10:02 AM, avenka wrote: > Erick, thanks for pointing that out. I was going to say in my original post > that it is almost like some limit on max documents got violated all of a > sudden, but the rest of the symptoms didn't seem to quite match. But now > that I think about it, the problem probably happened at 2B (corresponding > exactly to the size of the signed int space) as my ID space in the database > has roughly 85% holes and the problem probably happened when the ID hit > around 2.4B. > > It is still odd that indexing appears to proceed normally and the select > queries "know" which IDs are used because the error happens only for queries > with non-empty results, e.g., searching for an ID that doesn't exist gives a > valid "0 numResponses" response. Is this because solr uses 'long' or more > for indexing (given that the schema supports long) but not in the querying > modules? 
> > I hadn't used solr sharding because I really needed "rolling" partitions, > where I keep a small index of recent documents and throw the rest into a > slow "archive" index. So maintaining the smaller instance2 (usually < 50M) > and replicating it if needed was my homebrewed sharding approach. But I > guess it is time to shard the archive after all. > > AV > > -- > View this message in context: > http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3990534.html > Sent from the Solr - User mailing list archive at Nabble.com.
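A quick way to see that the negative numDocs quoted earlier in the thread is a wrapped 32-bit counter: reinterpreting it as unsigned gives back a plausible document count. This is a standalone sketch in plain Java (not Solr code; the class name is mine):

```java
public class DocIdOverflow {
    public static void main(String[] args) {
        // Lucene's internal doc IDs are signed 32-bit ints, so counters
        // wrap to large negative values past 2,147,483,647 (2^31 - 1).
        long actualDocs = 2_288_061_967L;   // ~2.29B docs, near avenka's ~2.4B estimate
        int wrapped = (int) actualDocs;     // what a 32-bit counter would show
        System.out.println(wrapped);        // prints -2006905329, the numDocs reported above

        // Going the other way: recover the true count from the wrapped value.
        System.out.println(Integer.toUnsignedLong(wrapped));  // prints 2288061967
    }
}
```

The round trip matching the negative numDocs reported earlier is why crossing ~2.15B documents is the prime suspect here.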
Re: solr java.lang.NullPointerException on select queries
Erick, thanks for pointing that out. I was going to say in my original post that it is almost like some limit on max documents got violated all of a sudden, but the rest of the symptoms didn't seem to quite match. But now that I think about it, the problem probably happened at 2B (corresponding exactly to the size of the signed int space) as my ID space in the database has roughly 85% holes and the problem probably happened when the ID hit around 2.4B. It is still odd that indexing appears to proceed normally and the select queries "know" which IDs are used because the error happens only for queries with non-empty results, e.g., searching for an ID that doesn't exist gives a valid "0 numResponses" response. Is this because solr uses 'long' or more for indexing (given that the schema supports long) but not in the querying modules? I hadn't used solr sharding because I really needed "rolling" partitions, where I keep a small index of recent documents and throw the rest into a slow "archive" index. So maintaining the smaller instance2 (usually < 50M) and replicating it if needed was my homebrewed sharding approach. But I guess it is time to shard the archive after all. AV -- View this message in context: http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3990534.html Sent from the Solr - User mailing list archive at Nabble.com.
Re: solr java.lang.NullPointerException on select queries
Internal Lucene document IDs are signed 32-bit numbers, so having 2.5B docs seems to be just _asking_ for trouble, which could explain the fact that this just came out of thin air: if you kept adding docs to the problem instance, you wouldn't have changed configs etc., just added more docs. I really think it's time to shard. Best Erick On Wed, Jun 20, 2012 at 2:15 AM, avenka wrote: > For the first install, I copied over all files in the directory "example" > into, let's call it, "install1". I did the same for "install2". The two > installs run on different ports, use different jar files, are not really > related to each other in any way as far as I can see. In particular, they > are not "multicore". They have the same access control setup via jetty. I > did a diff on config files and confirmed that only port numbers are > different. > > Both had been running fine in parallel importing from a common database for > several weeks. The documents indexed by install1, the problematic one > currently, is a vastly bigger (~2.5B) superset of those indexed by install2 > (~250M). > > At this point, select queries on install1 incurs the NullPointerException > irrespective of whether install2 is running or not. The log file looks like > it is indexing normally as always though. The index is also growing at the > usual rate each day. Just select queries fail. :( > > -- > View this message in context: > http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3990476.html > Sent from the Solr - User mailing list archive at Nabble.com.
Re: solr java.lang.NullPointerException on select queries
For the first install, I copied over all files in the directory "example" into, let's call it, "install1". I did the same for "install2". The two installs run on different ports, use different jar files, are not really related to each other in any way as far as I can see. In particular, they are not "multicore". They have the same access control setup via jetty. I did a diff on config files and confirmed that only port numbers are different. Both had been running fine in parallel importing from a common database for several weeks. The documents indexed by install1, the problematic one currently, is a vastly bigger (~2.5B) superset of those indexed by install2 (~250M). At this point, select queries on install1 incurs the NullPointerException irrespective of whether install2 is running or not. The log file looks like it is indexing normally as always though. The index is also growing at the usual rate each day. Just select queries fail. :( -- View this message in context: http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974p3990476.html Sent from the Solr - User mailing list archive at Nabble.com.
Re: solr java.lang.NullPointerException on select queries
Can you tell us more about how you ran the second server? Is this an independent installation, or is it the same installation as your first server just starting with a different port? Because this is very strange. It half feels like there are conflicting jars somewhere in your path, but that's a guess. This really shouldn't be happening unless something _did_ change that doesn't seem important but is. And does this only happen when both servers are running? If you turn the second one off and this still occurs, it _really_ sounds like clashing jar files. Best Erick On Sat, Jun 16, 2012 at 3:39 PM, avenka wrote: > I have recently started getting the error pasted below with solr-3.6 on > /select queries. I don't know of anything that changed in the config to > start causing this error. I am also running a second independent solr server > on the same machine, which continues to run fine and has the same > configuration as the first one except for the port number. The first one > seems to be doing dataimport operations fine and updating index files as > usual, but fails on select queries. An example of a failing query (that used > to run fine) is: > http:///solr/select/?q=title%3Afoo&version=2.2&start=0&rows=10&indent=on > > I am stupefied. Any idea? > > HTTP ERROR 500 > > Problem accessing /solr/select/. 
Reason: > > null > java.lang.NullPointerException at > org.apache.solr.handler.component.QueryComponent.process(QueryComponent.java:398) > at > org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:186) > at > org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129) > at org.apache.solr.core.SolrCore.execute(SolrCore.java:1376) at > org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:365) > at > org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:260) > at > org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212) > at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:399) > at > org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216) > at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182) > at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766) > at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450) at > org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230) > at > org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114) > at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152) > at org.mortbay.jetty.Server.handle(Server.java:326) at > org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542) at > org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928) > at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549) at > org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212) at > org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404) at > org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:228) > at > org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582) > > -- > View this message in context: > 
http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974.html > Sent from the Solr - User mailing list archive at Nabble.com.
solr java.lang.NullPointerException on select queries
I have recently started getting the error pasted below with solr-3.6 on /select queries. I don't know of anything that changed in the config to start causing this error. I am also running a second independent solr server on the same machine, which continues to run fine and has the same configuration as the first one except for the port number. The first one seems to be doing dataimport operations fine and updating index files as usual, but fails on select queries. An example of a failing query (that used to run fine) is: http:///solr/select/?q=title%3Afoo&version=2.2&start=0&rows=10&indent=on I am stupefied. Any idea? HTTP ERROR 500 Problem accessing /solr/select/. Reason: null java.lang.NullPointerException at org.apache.solr.handler.component.QueryComponent.process(QueryComponent.java:398) at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:186) at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129) at org.apache.solr.core.SolrCore.execute(SolrCore.java:1376) at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:365) at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:260) at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212) at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:399) at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216) at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182) at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766) at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450) at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230) at org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114) at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152) at org.mortbay.jetty.Server.handle(Server.java:326) at 
org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542) at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928) at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549) at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212) at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404) at org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:228) at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582) -- View this message in context: http://lucene.472066.n3.nabble.com/solr-java-lang-NullPointerException-on-select-queries-tp3989974.html Sent from the Solr - User mailing list archive at Nabble.com.
Re: facet.field :java.lang.NullPointerException
This is https://issues.apache.org/jira/browse/SOLR-2142 — I'll look into it soon. -Yonik http://www.lucidimagination.com On Fri, Oct 15, 2010 at 3:12 PM, Pradeep Singh wrote: > Faceting blows up when the field has no data. And this seems to be random. > Sometimes it will work even with no data, other times not. Sometimes the > error goes away if the field is set to multiValued=true (even though it's > one value every time), other times it doesn't. In all cases setting > facet.method to enum takes care of the problem. If this param is not set, > the default leads to null pointer exception. > > > 09:18:52,218 SEVERE [SolrCore] Exception during facet.field of > xyz:java.lang.NullPointerException > > at java.lang.System.arraycopy(Native Method) > > at org.apache.lucene.util.PagedBytes.copy(PagedBytes.java:247) > > at > org.apache.solr.request.TermIndex$1.setTerm(UnInvertedField.java:1164) > > at > org.apache.solr.request.NumberedTermsEnum.<init>(UnInvertedField.java:960) > > at > org.apache.solr.request.TermIndex$1.<init>(UnInvertedField.java:1151) > > at > org.apache.solr.request.TermIndex.getEnumerator(UnInvertedField.java:1151) > > at > org.apache.solr.request.UnInvertedField.uninvert(UnInvertedField.java:204) > > at > org.apache.solr.request.UnInvertedField.<init>(UnInvertedField.java:188) > > at > org.apache.solr.request.UnInvertedField.getUnInvertedField(UnInvertedField.java:911) > > at > org.apache.solr.request.SimpleFacets.getTermCounts(SimpleFacets.java:298) > > at > org.apache.solr.request.SimpleFacets.getFacetFieldCounts(SimpleFacets.java:354) > > at > org.apache.solr.request.SimpleFacets.getFacetCounts(SimpleFacets.java:190) > > at > org.apache.solr.handler.component.FacetComponent.process(FacetComponent.java:72) > > at > org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:210) > > at > org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131) > > at org.apache.solr.core.SolrCore.execute(SolrCore.java:1323) > > at > 
org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:337) > > at > org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:240) > at >
facet.field :java.lang.NullPointerException
Faceting blows up when the field has no data. And this seems to be random. Sometimes it will work even with no data, other times not. Sometimes the error goes away if the field is set to multiValued=true (even though it's one value every time), other times it doesn't. In all cases setting facet.method to enum takes care of the problem. If this param is not set, the default leads to null pointer exception. 09:18:52,218 SEVERE [SolrCore] Exception during facet.field of xyz:java.lang.NullPointerException at java.lang.System.arraycopy(Native Method) at org.apache.lucene.util.PagedBytes.copy(PagedBytes.java:247) at org.apache.solr.request.TermIndex$1.setTerm(UnInvertedField.java:1164) at org.apache.solr.request.NumberedTermsEnum.<init>(UnInvertedField.java:960) at org.apache.solr.request.TermIndex$1.<init>(UnInvertedField.java:1151) at org.apache.solr.request.TermIndex.getEnumerator(UnInvertedField.java:1151) at org.apache.solr.request.UnInvertedField.uninvert(UnInvertedField.java:204) at org.apache.solr.request.UnInvertedField.<init>(UnInvertedField.java:188) at org.apache.solr.request.UnInvertedField.getUnInvertedField(UnInvertedField.java:911) at org.apache.solr.request.SimpleFacets.getTermCounts(SimpleFacets.java:298) at org.apache.solr.request.SimpleFacets.getFacetFieldCounts(SimpleFacets.java:354) at org.apache.solr.request.SimpleFacets.getFacetCounts(SimpleFacets.java:190) at org.apache.solr.handler.component.FacetComponent.process(FacetComponent.java:72) at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:210) at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131) at org.apache.solr.core.SolrCore.execute(SolrCore.java:1323) at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:337) at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:240) at
java.lang.NullPointerException
Hi again, I changed the search options to decrease my query size and now I get past the "URI too long" error from the other thread. I already added: 819200 819200 to the Jetty config, but now I'm stuck again on: 13/Jul/2010 9:41:38 org.apache.solr.common.SolrException log SEVERE: java.lang.NullPointerException at java.io.StringReader.<init>(Unknown Source) My query string now has about 10 000 chars. Could this be a memory issue? Thank You, Frederico
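For what it's worth, an alternative to raising Jetty's buffer limits is to send the parameters as a POST body, which keeps the request URI short no matter how long the query gets. A sketch under stated assumptions: the helper name is mine, and the handler path is the stock /solr/select from this thread:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class LongQueryPost {
    // Build an application/x-www-form-urlencoded body for /solr/select.
    // POSTing this body sidesteps any URI-length limit, since the
    // (possibly 10,000-char) query never appears in the URL itself.
    static String buildSelectBody(String query) {
        try {
            return "q=" + URLEncoder.encode(query, "UTF-8") + "&rows=10";
        } catch (UnsupportedEncodingException e) {
            throw new AssertionError(e);  // UTF-8 is always available
        }
    }

    public static void main(String[] args) {
        System.out.println(buildSelectBody("title:foo"));  // q=title%3Afoo&rows=10
        // Send with HttpURLConnection: setRequestMethod("POST"),
        // setDoOutput(true), and Content-Type
        // application/x-www-form-urlencoded.
    }
}
```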
Re: java.lang.NullPointerException with MySQL DataImportHandler
On Thu, Feb 4, 2010 at 10:50 AM, Lance Norskog wrote: > I just tested this with a DIH that does not use database input. > > If the DataImportHandler JDBC code does not support a schema that has > optional fields, that is a major weakness. Noble/Shalin, is this true? The problem is obviously not with DIH. DIH blindly passes on all the fields it could obtain from the DB. If some field is missing, DIH does not do anything about it. > > On Tue, Feb 2, 2010 at 8:50 AM, Sascha Szott wrote: >> Hi, >> >> since some of the fields used in your DIH configuration aren't mandatory >> (e.g., keywords and tags are defined as nullable in your db table schema), >> add a default value to all optional fields in your schema configuration >> (e.g., default = ""). Note, that Solr does not understand the db-related >> concept of null values. >> >> Solr's log output >> >> SolrInputDocument[{keywords=keywords(1.0)={Dolce}, name=name(1.0)={Dolce >> & Gabbana D&G Neckties designer Tie for men 543}, >> productID=productID(1.0)={220213}}] >> >> indicates that there aren't any tags or descriptions stored for the item >> with productId 220213. Since no default value is specified, Solr raises an >> error when creating the index document. >> >> -Sascha >> >> Jean-Michel Philippon-Nadeau wrote: >>> >>> Hi, >>> >>> Thanks for the reply. 
>>> On Tue, 2010-02-02 at 16:57 +0100, Sascha Szott wrote: >>>> * the output of MySQL's describe command for all tables/views referenced in your DIH configuration >>> [quoted describe output trimmed; the full listings appear in the original message below]
Re: java.lang.NullPointerException with MySQL DataImportHandler
I just tested this with a DIH that does not use database input. If the DataImportHandler JDBC code does not support a schema that has optional fields, that is a major weakness. Noble/Shalin, is this true? On Tue, Feb 2, 2010 at 8:50 AM, Sascha Szott wrote: > Hi, > > since some of the fields used in your DIH configuration aren't mandatory > (e.g., keywords and tags are defined as nullable in your db table schema), > add a default value to all optional fields in your schema configuration > (e.g., default = ""). Note, that Solr does not understand the db-related > concept of null values. > > Solr's log output > > SolrInputDocument[{keywords=keywords(1.0)={Dolce}, name=name(1.0)={Dolce > & Gabbana D&G Neckties designer Tie for men 543}, > productID=productID(1.0)={220213}}] > > indicates that there aren't any tags or descriptions stored for the item > with productId 220213. Since no default value is specified, Solr raises an > error when creating the index document. > > -Sascha > > Jean-Michel Philippon-Nadeau wrote: >> >> Hi, >> >> Thanks for the reply. 
>> On Tue, 2010-02-02 at 16:57 +0100, Sascha Szott wrote: >>> * the output of MySQL's describe command for all tables/views referenced in your DIH configuration >> [quoted describe output trimmed; the full listings appear in the original message below]
Re: java.lang.NullPointerException with MySQL DataImportHandler
Hi, since some of the fields used in your DIH configuration aren't mandatory (e.g., keywords and tags are defined as nullable in your db table schema), add a default value to all optional fields in your schema configuration (e.g., default = ""). Note, that Solr does not understand the db-related concept of null values. Solr's log output SolrInputDocument[{keywords=keywords(1.0)={Dolce}, name=name(1.0)={Dolce & Gabbana D&G Neckties designer Tie for men 543}, productID=productID(1.0)={220213}}] indicates that there aren't any tags or descriptions stored for the item with productId 220213. Since no default value is specified, Solr raises an error when creating the index document. -Sascha Jean-Michel Philippon-Nadeau wrote: Hi, Thanks for the reply. On Tue, 2010-02-02 at 16:57 +0100, Sascha Szott wrote: * the output of MySQL's describe command for all tables/views referenced in your DIH configuration [quoted describe output trimmed; the full listings appear in the original message below]
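Sascha's advice translates to schema.xml roughly like this. A hypothetical fragment, not taken from the thread: the field names come from the nullable columns of the products table, and the type name is assumed:

```xml
<!-- Nullable DB columns get an empty-string default so DIH never hands
     the index a document with a missing (null) value for these fields -->
<field name="keywords" type="text" indexed="true" stored="true" default=""/>
<field name="tags"     type="text" indexed="true" stored="true" default=""/>
```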
Re: java.lang.NullPointerException with MySQL DataImportHandler
Hi, Thanks for the reply.

On Tue, 2010-02-02 at 16:57 +0100, Sascha Szott wrote:
> * the output of MySQL's describe command for all tables/views referenced
> in your DIH configuration

mysql> describe products;
+----------------+------------------+------+-----+---------+----------------+
| Field          | Type             | Null | Key | Default | Extra          |
+----------------+------------------+------+-----+---------+----------------+
| productID      | int(10) unsigned | NO   | PRI | NULL    | auto_increment |
| skuCode        | varchar(320)     | YES  | MUL | NULL    |                |
| upcCode        | varchar(320)     | YES  | MUL | NULL    |                |
| name           | varchar(320)     | NO   |     | NULL    |                |
| description    | text             | NO   |     | NULL    |                |
| keywords       | text             | YES  |     | NULL    |                |
| disqusThreadID | varchar(50)      | NO   |     | NULL    |                |
| tags           | text             | YES  |     | NULL    |                |
| createdOn      | int(10) unsigned | NO   |     | NULL    |                |
| lastUpdated    | int(10) unsigned | NO   |     | NULL    |                |
| imageURL       | varchar(320)     | YES  |     | NULL    |                |
| inStock        | tinyint(1)       | YES  | MUL | 1       |                |
| active         | tinyint(1)       | YES  |     | 1       |                |
+----------------+------------------+------+-----+---------+----------------+
13 rows in set (0.00 sec)

mysql> describe product_soldby_vendor;
+-----------------+------------------+------+-----+---------+-------+
| Field           | Type             | Null | Key | Default | Extra |
+-----------------+------------------+------+-----+---------+-------+
| productID       | int(10) unsigned | NO   | MUL | NULL    |       |
| productVendorID | int(10) unsigned | NO   | MUL | NULL    |       |
| price           | double           | NO   |     | NULL    |       |
| currency        | varchar(5)       | NO   |     | NULL    |       |
| buyURL          | varchar(320)     | NO   |     | NULL    |       |
+-----------------+------------------+------+-----+---------+-------+
5 rows in set (0.00 sec)

mysql> describe products_vendors_subcategories;
+----------------------------+------------------+------+-----+---------+----------------+
| Field                      | Type             | Null | Key | Default | Extra          |
+----------------------------+------------------+------+-----+---------+----------------+
| productVendorSubcategoryID | int(10) unsigned | NO   | PRI | NULL    | auto_increment |
| productVendorCategoryID    | int(10) unsigned | NO   |     | NULL    |                |
| labelEnglish               | varchar(320)     | NO   |     | NULL    |                |
| labelFrench                | varchar(320)     | NO   |     | NULL    |                |
+----------------------------+------------------+------+-----+---------+----------------+
4 rows in set (0.00 sec)

mysql> describe products_vendors_categories;
+-------------------------+------------------+------+-----+---------+----------------+
| Field                   | Type             | Null | Key | Default | Extra          |
+-------------------------+------------------+------+-----+---------+----------------+
| productVendorCategoryID | int(10) unsigned | NO   | PRI | NULL    | auto_increment |
| labelEnglish            | varchar(320)     | NO   |     | NULL    |                |
| labelFrench             | varchar(320)     | NO   |     | NULL    |                |
+-------------------------+------------------+------+-----+---------+----------------+
3 rows in set (0.00 sec)

mysql> describe product_vendor_in_subcategory;
+-------------------+------------------+------+-----+---------+-------+
| Field             | Type             | Null | Key | Default | Extra |
+-------------------+------------------+------+-----+---------+-------+
| productVendorID   | int(10) unsigned | NO   | MUL | NULL    |       |
| productCategoryID | int(10) unsigned | NO   | MUL | NULL    |       |
+-------------------+------------------+------+-----+---------+-------+
2 rows in set (0.00 sec)

mysql> describe products_vendors_countries;
+------------------------+------------------+------+-----+---------+----------------+
| Field                  | Type             | Null | Key | Default | Extra          |
+------------------------+------------------+------+-----+---------+----------------+
| productVendorCountryID | int(10) unsigned | NO   | PRI | NULL    | auto_increment |
| name                   | varchar(50)      | NO   |     | NULL    |                |
| code                   | varchar(2)       | NO   |     | NULL    |                |
+------------------------+------------------+------+-----+---------+----------------+
3 rows in set (0.00 sec)

mysql> describe product_vendor_shipsto_country;
+------------------------+---------+------+-----+---------+-------+
| Field                  | Type    | Null | Key | Default | Extra |
+------------------------+---------+------+-----+---------+-------+
| productVendorID        | int(11) | NO   | MUL | NULL    |       |
| productVendorCountryID | in
Re: java.lang.NullPointerException with MySQL DataImportHandler
On Tue, Feb 2, 2010 at 8:36 PM, Jean-Michel Philippon-Nadeau < j...@jmpnadeau.ca> wrote: > > I am running into an issue with my MySQL DataImportHandler. I've > followed the quick-start in order to write the necessary config and so > far everything seemed to work. > > However, I am missing some fields in my index. I've switched all fields > to stored="true" temporarily in my schema to troubleshoot the issue. I > only have 3 fields listed in search results while I should have 8. > > Could this be caused by ampersands or illegal entities in my database? > How can I see if DIH is importing correctly all my rows into the index? > > Follows is the warning I have in my catalina.log. > > Thank you very much, > > Jean-Michel > > === > > Feb 2, 2010 12:21:07 AM org.apache.solr.handler.dataimport.SolrWriter > upload > WARNING: Error creating document : > SolrInputDocument[{keywords=keywords(1.0)={Dolce}, name=name(1.0)={Dolce > & Gabbana D&G Neckties designer Tie for men 543}, > productID=productID(1.0)={220213}}] > java.lang.NullPointerException >at > org.apache.lucene.util.StringHelper.intern(StringHelper.java:36) >at org.apache.lucene.document.Field.<init>(Field.java:341) >at org.apache.lucene.document.Field.<init>(Field.java:305) >at > org.apache.solr.schema.FieldType.createField(FieldType.java:210) > That exception indicates that a field name itself was null. Can you post your data-config? -- Regards, Shalin Shekhar Mangar.
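Shalin's reading follows from the first stack frame: StringHelper.intern receives the field name, and interning dereferences it, so a null field name fails exactly there. A toy illustration, using my own class that mimics the effect rather than Lucene's actual implementation:

```java
public class NullFieldName {
    // Interning dereferences the string, so a null "field name" throws
    // NPE here, just like the StringHelper.intern frame in the DIH log.
    static String intern(String s) {
        return s.intern();
    }

    public static void main(String[] args) {
        try {
            intern(null);
        } catch (NullPointerException e) {
            System.out.println("NPE: the field name itself was null");
        }
    }
}
```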
Re: java.lang.NullPointerException with MySQL DataImportHandler
Hi,

can you post
* the output of MySQL's describe command for all tables/views referenced in your DIH configuration
* the DIH configuration file (i.e., data-config.xml)
* the schema definition (i.e., schema.xml)

-Sascha

Jean-Michel Philippon-Nadeau wrote:
> Hi,
>
> It is my first install of Solr. The setup has been pretty straightforward
> and yet, the performance is very impressive.
>
> I am running into an issue with my MySQL DataImportHandler. I've followed
> the quick-start in order to write the necessary config and so far
> everything seemed to work.
>
> However, I am missing some fields in my index. I've switched all fields to
> stored="true" temporarily in my schema to troubleshoot the issue. I only
> have 3 fields listed in search results while I should have 8.
>
> Could this be caused by ampersands or illegal entities in my database? How
> can I see if DIH is importing correctly all my rows into the index?
>
> Follows is the warning I have in my catalina.log.
>
> Thank you very much,
>
> Jean-Michel
>
> ===
>
> Feb 2, 2010 12:21:07 AM org.apache.solr.handler.dataimport.SolrWriter upload
> WARNING: Error creating document :
> SolrInputDocument[{keywords=keywords(1.0)={Dolce}, name=name(1.0)={Dolce
> & Gabbana D&G Neckties designer Tie for men 543},
> productID=productID(1.0)={220213}}]
> java.lang.NullPointerException
>     at org.apache.lucene.util.StringHelper.intern(StringHelper.java:36)
>     at org.apache.lucene.document.Field.<init>(Field.java:341)
>     at org.apache.lucene.document.Field.<init>(Field.java:305)
>     at org.apache.solr.schema.FieldType.createField(FieldType.java:210)
>     at org.apache.solr.schema.SchemaField.createField(SchemaField.java:94)
>     at org.apache.solr.update.DocumentBuilder.toDocument(DocumentBuilder.java:246)
>     at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:60)
>     at org.apache.solr.handler.dataimport.SolrWriter.upload(SolrWriter.java:75)
>     at org.apache.solr.handler.dataimport.DataImportHandler$1.upload(DataImportHandler.java:292)
>     at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:392)
>     at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:242)
>     at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:180)
>     at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:331)
>     at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:389)
>     at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:370)
java.lang.NullPointerException with MySQL DataImportHandler
Hi,

It is my first install of Solr. The setup has been pretty straightforward and yet, the performance is very impressive.

I am running into an issue with my MySQL DataImportHandler. I've followed the quick-start in order to write the necessary config and so far everything seemed to work.

However, I am missing some fields in my index. I've switched all fields to stored="true" temporarily in my schema to troubleshoot the issue. I only have 3 fields listed in search results while I should have 8.

Could this be caused by ampersands or illegal entities in my database? How can I see if DIH is importing correctly all my rows into the index?

Follows is the warning I have in my catalina.log.

Thank you very much,

Jean-Michel

===

Feb 2, 2010 12:21:07 AM org.apache.solr.handler.dataimport.SolrWriter upload
WARNING: Error creating document :
SolrInputDocument[{keywords=keywords(1.0)={Dolce}, name=name(1.0)={Dolce & Gabbana D&G Neckties designer Tie for men 543}, productID=productID(1.0)={220213}}]
java.lang.NullPointerException
    at org.apache.lucene.util.StringHelper.intern(StringHelper.java:36)
    at org.apache.lucene.document.Field.<init>(Field.java:341)
    at org.apache.lucene.document.Field.<init>(Field.java:305)
    at org.apache.solr.schema.FieldType.createField(FieldType.java:210)
    at org.apache.solr.schema.SchemaField.createField(SchemaField.java:94)
    at org.apache.solr.update.DocumentBuilder.toDocument(DocumentBuilder.java:246)
    at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:60)
    at org.apache.solr.handler.dataimport.SolrWriter.upload(SolrWriter.java:75)
    at org.apache.solr.handler.dataimport.DataImportHandler$1.upload(DataImportHandler.java:292)
    at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:392)
    at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:242)
    at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:180)
    at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:331)
    at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:389)
    at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:370)
Re: java.lang.NullPointerException at org.apache.solr.handler.component.QueryComponent.mergeIds(QueryComponent.java:421)
On Tue, Nov 17, 2009 at 06:09:56PM +0200, Eugene Dzhurinsky wrote:
> java.lang.NullPointerException
>     at org.apache.solr.handler.component.QueryComponent.mergeIds(QueryComponent.java:421)

I compared the schema.xml from the Solr installation package with the one I created, and found that my unique key field was not marked as stored. After I made it stored and re-indexed, everything started to work fine. Just recording this for anyone who runs into the same problem.

--
Eugene N Dzhurinsky
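[Archive note: in schema.xml terms, the fix Eugene describes amounts to the sketch below; the field name "id" and type are illustrative, not taken from his schema. The key point is that the uniqueKey field must be stored, since the mergeIds step of distributed search reads the key value back out of each shard's response.]

```xml
<!-- the uniqueKey field must be stored="true" for distributed search to merge results -->
<field name="id" type="string" indexed="true" stored="true" required="true"/>

<uniqueKey>id</uniqueKey>
```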
java.lang.NullPointerException at org.apache.solr.handler.component.QueryComponent.mergeIds(QueryComponent.java:421)
Hi there!

I am trying to test distributed search on 2 servers. I've created a simple application which adds sample documents to 2 different Solr servers (version 1.3.0). While it is possible to search for a certain keyphrase on either of these servers, I am getting a weird error when trying to search across both of them (as described at http://wiki.apache.org/solr/DistributedSearch):

HTTP ERROR: 500
null
java.lang.NullPointerException
    at org.apache.solr.handler.component.QueryComponent.mergeIds(QueryComponent.java:421)
    at org.apache.solr.handler.component.QueryComponent.handleResponses(QueryComponent.java:265)
    at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:264)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:1204)
    at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:303)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:232)
    at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1089)
    at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:365)
    at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
    at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181)
    at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:712)
    at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:405)
    at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:211)
    at org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
    at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:139)
    at org.mortbay.jetty.Server.handle(Server.java:285)
    at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:502)
    at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:821)
    at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:513)
    at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:208)
    at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:378)
    at org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:226)
    at org.mortbay.thread.BoundedThreadPool$PoolThread.run(BoundedThreadPool.java:442)

These servers are using the same configuration. What may cause this error? Thank you in advance.

--
Eugene N Dzhurinsky
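[Archive note: for context, a distributed query per the DistributedSearch wiki page passes the shards parameter to one server, which then merges results from all listed shards; the host names below are placeholders, not Eugene's servers.]

```
http://server1:8983/solr/select?q=neckties&shards=server1:8983/solr,server2:8983/solr
```

As the earlier reply in this archive notes, the uniqueKey field not being stored on the shards is one known way mergeIds can hit a NullPointerException when merging the per-shard responses.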