Re: Significant terms expression giving error "length needs to be >= 1"

2021-02-16 Thread Joel Bernstein
Can you include the stack trace from the logs?


Joel Bernstein
http://joelsolr.blogspot.com/




Significant terms expression giving error "length needs to be >= 1"

2021-02-15 Thread ufuk yılmaz
We have a SolrCloud cluster, version 8.4

At the customer’s site there’s a collection with very few documents, around 12. 
We usually have collections with hundreds of millions of documents, so that 
collection is a bit of an exception.

When I send a significantTerms streaming expression, it immediately gets an 
“IllegalArgumentException("length needs to be >= 1")” from that collection’s 
shard. I took a look at the collection, but it doesn’t seem to be any different 
from our other collections. We also don’t get that exception in our own cluster, 
which is very similar to the customer’s.

I found the exception in 
“lucene-solr\lucene\core\src\java\org\apache\lucene\util\SparseFixedBitSet.java”, 
but I don’t have enough knowledge of the inner workings of the streaming 
expression to interpret it.

What may cause this?

--ufuk

Sent from Mail for Windows 10
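For readers unfamiliar with the expression, a significantTerms call of the kind described above is sent to the collection's /stream handler. A minimal sketch of building such a request in standalone Python — the collection and field names here are invented placeholders, not taken from the thread:

```python
from urllib.parse import urlencode

# Hypothetical collection/field names; the real ones are not in the thread.
expr = 'significantTerms(smallCollection, q="text:solr", field="text", limit=10)'
params = urlencode({"expr": expr})

# The expression is posted to the collection's /stream handler, e.g.:
request = "http://localhost:8983/solr/smallCollection/stream?" + params
```

With only ~12 documents in the collection, any per-segment structure sized from the document count is tiny, which is presumably where the "length needs to be >= 1" check comes into play.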



Error Logs in Solr

2021-02-03 Thread Manisha Rahatadkar
Hi All

I see the following errors logged; I think they are related to the Text 
Suggester. They were reported in the client environment, but I don't know 
what they mean. Can someone explain what might cause these errors?


Regards
Manisha Rahatadkar


java.lang.IllegalArgumentException: no tokens produced by analyzer, or the only 
tokens were empty strings
at 
org.apache.lucene.search.suggest.analyzing.FreeTextSuggester.lookup(FreeTextSuggester.java:467)
at 
org.apache.lucene.search.suggest.analyzing.FreeTextSuggester.lookup(FreeTextSuggester.java:399)
at 
org.apache.lucene.search.suggest.analyzing.FreeTextSuggester.lookup(FreeTextSuggester.java:388)
at 
org.apache.solr.spelling.suggest.SolrSuggester.getSuggestions(SolrSuggester.java:243)
at 
org.apache.solr.handler.component.SuggestComponent.process(SuggestComponent.java:264)
at 
org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:298)
at 
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:199)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:2551)
at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:710)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:516)
at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:395)
at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:341)
at 
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)
at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)
at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1588)
at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1345)
at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1557)
at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
at 
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)
at 
org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:126)
at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at 
org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)
at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at org.eclipse.jetty.server.Server.handle(Server.java:502)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:364)
at 
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260)
at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118)
at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
at 
org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecuto
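The stack trace shows FreeTextSuggester rejecting the lookup because the analyzed query yielded no tokens — typically a blank query, or one the analyzer strips entirely (e.g. only stopwords or punctuation). One common mitigation is a client-side guard that skips the suggest call for blank input. This is a hypothetical helper, not part of any Solr client library, and it does not replicate the server-side analyzer — it only catches the obvious blank-input case:

```python
def should_call_suggester(q: str) -> bool:
    """Skip the suggest request when the query is blank: an analyzer that
    strips everything leaves no tokens, and FreeTextSuggester then raises
    IllegalArgumentException ("no tokens produced by analyzer")."""
    return bool(q and q.strip())

# Blank or whitespace-only input would trigger the error, so skip it:
should_call_suggester("")      # False
should_call_suggester("   ")   # False
should_call_suggester("solr")  # True
```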

Re: Error Adding a Replica to SOLR Cloud 8.2.0

2021-01-26 Thread Joe Lerner
We finally got this fixed by temporarily disabling any updates to the SOLR
index. 



--
Sent from: https://lucene.472066.n3.nabble.com/Solr-User-f472068.html


Re: Getting error "Bad Message 414 reason: URI Too Long"

2021-01-15 Thread Shawn Heisey

On 1/14/2021 2:31 AM, Abhay Kumar wrote:
I am trying to post the below query to Solr but getting the error “Bad 
Message 414 reason: URI Too Long”.


I am sending the query using the SolrNet library. Please suggest how to 
resolve this issue.


*Query :* 
http://localhost:8983/solr/documents/select?q=%22Geisteswissenschaften


If your query is a POST request rather than a GET, then you won't get 
that error.  And if the request is identical to the REALLY long URL that 
you included (which seemed to be incomplete), then it's definitely not a 
POST.  If it were, everything after the ? would be in the request body, 
not on the URL itself.


There is a note on the SolrNET FAQ about this.

https://github.com/SolrNet/SolrNet/blob/master/Documentation/FAQ.md#im-getting-a-uri-too-long-error

If you want more info on that, you'll need to ask SolrNET.  It's a 
completely different project.


Thanks,
Shawn
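To make Shawn's GET-vs-POST distinction concrete: with a POST, the parameters travel in the request body, so URL length limits never apply. A minimal sketch in standalone Python (no SolrNet; the 2000-term query is invented to mimic the thread's long OR list, and the 8 KB figure is Jetty's usual default header limit):

```python
from urllib.parse import urlencode

# An invented long query of OR'd phrases, standing in for the one in the thread.
q = " OR ".join(f'"term {i}"' for i in range(2000))
body = urlencode({"q": q, "wt": "json"})

# As a GET, all of this rides on the URL and easily exceeds a typical
# 8 KB request-line limit, producing "Bad Message 414 ... URI Too Long":
get_url = "http://localhost:8983/solr/documents/select?" + body

# As a POST, the URL stays short and `body` is sent as
# application/x-www-form-urlencoded content instead:
post_url = "http://localhost:8983/solr/documents/select"

assert len(get_url) > 8192   # the GET form would blow past the default limit
assert len(post_url) < 100   # the POST URL itself stays tiny
```

If a library reports 414 despite claiming to POST, that is strong evidence (as Shawn notes) that the parameters are still being appended to the URL.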


Re: Getting error "Bad Message 414 reason: URI Too Long"

2021-01-14 Thread Bernd Fehling

AFAIK, that could be a limit in Jetty and can be raised in jetty.xml.
You might check the Jetty docs and look for something like BufferSize.
At least that was the case for Solr 6.6.x.

Regards
Bernd


Am 14.01.21 um 13:19 schrieb Abhay Kumar:

Thank you Nicolas. Yes, we are making a POST request to Solr using the SolrNet 
library.
The current request length is approx. 32K characters; I tested with a 
10K-character request and it works fine.

Any suggestions for increasing the request length limit in the Solr configuration?

Thanks.
Abhay




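Following up on Bernd's pointer: in the jetty.xml shipped with recent Solr versions, the header limit is controlled by a property along these lines. This is a sketch — the exact file location, property name, and default vary by Solr/Jetty version, so verify against your own install before changing it:

```xml
<!-- In server/etc/jetty.xml (Solr 7/8 layout): raise the maximum request
     header size so that long GET query strings are accepted.
     The value is in bytes; 8192 is the usual default. -->
<Set name="requestHeaderSize">
  <Property name="solr.jetty.request.header.size" default="65536"/>
</Set>
```

That said, switching the client to a genuine POST (as discussed elsewhere in this thread) is generally preferable to raising the header limit.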


RE: Getting error "Bad Message 414 reason: URI Too Long"

2021-01-14 Thread Abhay Kumar
Thank you Nicolas. Yes, we are making a POST request to Solr using the SolrNet 
library.
The current request length is approx. 32K characters; I tested with a 
10K-character request and it works fine.

Any suggestions for increasing the request length limit in the Solr configuration?

Thanks.
Abhay


Re: Getting error "Bad Message 414 reason: URI Too Long"

2021-01-14 Thread Nicolas Franck
Euh, sorry: I did not read your message closely enough.
You did actually use a POST request, with the parameters in the body
(though your example suggests otherwise).


Re: Getting error "Bad Message 414 reason: URI Too Long"

2021-01-14 Thread Nicolas Franck
I believe you can also access this path with an HTTP POST request.
That way you do not hit the URI size limit.

cf. 
https://stackoverflow.com/questions/2997014/can-you-use-post-to-run-a-query-in-solr-select

I think some Solr libraries already use this approach (e.g. WebService::Solr 
in Perl)


Getting error "Bad Message 414 reason: URI Too Long"

2021-01-14 Thread Abhay Kumar
Hello,

I am trying to post the below query to Solr but getting the error "Bad Message 
414 reason: URI Too Long".

I am sending the query using the SolrNet library. Please suggest how to resolve 
this issue.

Query : 
http://localhost:8983/solr/documents/select?q=%22Geisteswissenschaften%22%20OR%20%22Humanities%22%20OR%20%22Art%22%20OR%20%22Arts%22%20OR%20%22Caricatures%22%20OR%20%22Caricature%22%20OR%20%22Cartoon%22%20OR%20%22Engraving%20and%20Engravings%22%20OR%20%22Engravings%20and%20Engraving%22%20OR%20%22Engraving%22%20OR%20%22Engravings%22%20OR%20%22Human%20Body%22%20OR%20%22Human%20Bodies%22%20OR%20%22Human%20Figure%22%20OR%20%22Human%20Figures%22%20OR%20%22menschlicher%20K%C3%B6rper%22%20OR%20%22Menschliche%20Gestalt%22%20OR%20%22Body%20Parts%22%20OR%20%22K%C3%B6rperteile%22%20OR%20%22Body%20Parts%20and%20Fluids%22%20OR%20%22K%C3%B6rperteile%20und%20-fl%C3%BCssigkeiten%22%20OR%20%22Medical%20Illustration%22%20OR%20%22Medical%20Illustrations%22%20OR%20%22medizinische%20Illustration%22%20OR%20%22Anatomy%2C%20Artistic%22%20OR%20%22Artistic%20Anatomy%22%20OR%20%22Artistic%20Anatomies%22%20OR%20%22Medicine%20in%20Art%22%20OR%20%22Medicine%20in%20Arts%22%20OR%20%22Numismatics%22%20OR%20%22M%C3%BCnzkunde%22%20OR%20%22Coins%22%20OR%20%22Coin%22%20OR%20%22M%C3%BCnzen%22%20OR%20%22Medals%22%20OR%20%22Medal%22%20OR%20%22Denkm%C3%BCnzen%22%20OR%20%22Gedenkm%C3%BCnzen%22%20OR%20%22Medaillen%22%20OR%20%22Paintings%22%20OR%20%22Painting%22%20OR%20%22Philately%22%20OR%20%22Philatelies%22%20OR%20%22Postage%20Stamps%22%20OR%20%22Postage%20Stamp%22%20OR%20%22Briefmarken%22%20OR%20%22Portraits%22%20OR%20%22Portrait%22%20OR%20%22Sculpture%22%20OR%20%22Sculptures%22%20OR%20%22Awards%20and%20Prizes%22%20OR%20%22Prizes%20and%20Awards%22%20OR%20%22Awards%22%20OR%20%22Award%22%20OR%20%22Prizes%22%20OR%20%22Prize%22%20OR%20%22Nobel%20Prize%22%20OR%20%22Ethics%22%20OR%20%22Egoism%22%20OR%20%22Ethical%20Issues%22%20OR%20%22Ethical%20Issue%22%20OR%20%22Metaethics%22%20OR%20%22Metaethik%22%20OR%20%22Moral%20Policy%22%20OR%20%22Moral%20Policies%22%20OR%20%22Moralischer%20Grundsatz%22%20OR%20%22Natural%20Law%22%20OR%20%22Natural%20Laws%22%20OR%20%22Naturrecht%22%20OR%20%22Situational%20Ethics%22%20OR%20%
22Bioethical%20Issues%22%20OR%20%22Bioethical%20Issue%22%20OR%20%22Bioethics%22%20OR%20%22Biomedical%20Ethics%22%20OR%20%22Health%20Care%20Ethics%22%20OR%20%22Ethics%2C%20Clinical%22%20OR%20%22Clinical%20Ethics%22%20OR%20%22klinische%20Ethik%22%20OR%20%22Complicity%22%20OR%20%22Mitt%C3%A4terschaft%22%20OR%20%22Moral%20Complicity%22%20OR%20%22Moralische%20Komplizenschaft%22%20OR%20%22Moralische%20Mitt%C3%A4terschaft%22%20OR%20%22Conflict%20of%20Interest%22%20OR%20%22Interest%20Conflict%22%20OR%20%22Interest%20Conflicts%22%20OR%20%22Ethical%20Analysis%22%20OR%20%22Ethical%20Analyses%22%20OR%20%22Casuistry%22%20OR%20%22Retrospective%20Moral%20Judgment%22%20OR%20%22Retrospective%20Moral%20Judgments%22%20OR%20%22retrospektive%20Moralische%20Beurteilung%22%20OR%20%22Wedge%20Argument%22%20OR%20%22Wedge%20Arguments%22%20OR%20%22Slippery%20Slope%20Argument%22%20OR%20%22Slippery%20Slope%20Arguments%22%20OR%20%22Argument%20der%20schiefen%20Ebene%22%20OR%20%22Ethical%20Relativism%22%20OR%20%22Ethical%20Review%22%20OR%20%22Ethikgutachten%22%20OR%20%22Ethics%20Consultation%22%20OR%20%22Ethics%20Consultations%22%20OR%20%22Ethical%20Theory%22%20OR%20%22Ethical%20Theories%22%20OR%20%22Normative%20Ethics%22%20OR%20%22Normative%20Ethic%22%20OR%20%22Consequentialism%22%20OR%20%22Deontological%20Ethics%22%20OR%20%22Deontological%20Ethic%22%20OR%20%22Deontologie%22%20OR%20%22Ethik%20der%20Pflichtenlehre%22%20OR%20%22Teleological%20Ethics%22%20OR%20%22Teleological%20Ethic%22%20OR%20%22Teleologische%20Ethik%22%20OR%20%22Utilitarianism%22%20OR%20%22Utilitarianisms%22%20OR%20%22Utilitarismus%22%20OR%20%22Ethicists%22%20OR%20%22Ethicist%22%20OR%20%22Ethics%20Consultants%22%20OR%20%22Ethics%20Consultant%22%20OR%20%22Bioethicists%22%20OR%20%22Bioethicist%22%20OR%20%22Bioethics%20Consultants%22%20OR%20%22Bioethics%20Consultant%22%20OR%20%22Bioethiker%22%20OR%20%22Clinical%20Ethicists%22%20OR%20%22Clinical%20Ethicist%22%20OR%20%22Ethics%20Committees%22%20OR%20%22Ethics%20Committee%22%20OR%20%22In
stitutional%20Ethics%20Committees%22%20OR%20%22Institutional%20Ethics%20Committee%22%20OR%20%22Institutionalisierte%20Ethikkommission%22%20OR%20%22Regional%20Ethics%20Committees%22%20OR%20%22Regional%20Ethics%20Committee%22%20OR%20%22Regionale%20Ethikkommissionen%22%20OR%20%22Ethics%20Committees%2C%20Clinical%22%20OR%20%22Clinical%20Ethics%20Committees%22%20OR%20%22Clinical%20Ethics%20Committee%22%20OR%20%22Hospital%20Ethics%20Committee%22%20OR%20%22Hospital%20Ethics%20Committees%22%20OR%20%22klinische%20Ethik-Kommissionen%22%20OR%20%22Ethics%20Committees%2C%20Research%22%20OR%20%22Research%20Ethics%20Committees%22%20OR%20%22Research%20Ethics%20Committee%22%20OR%20%22Institutional%20Review%20Board%22%20OR%20%22Institutional%20Review%20Boards%22%20OR%20%22IRB%22%20OR%20

Re: Solr query with space (only) gives error

2021-01-09 Thread vstuart
Cross-posted / addressed (both me), here.

https://stackoverflow.com/questions/65620642/solr-query-with-space-only-q-20-stalls/65638561#65638561







Re: Remote error message: empty String & null:java.lang.NumberFormatException: empty String

2021-01-09 Thread xiefengchang
Why don't you check what it is trying to convert to a number? At least the 
NumberFormatException is quite clear.
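For context on why the message reads "empty String": Java's Integer.parseInt("") throws exactly NumberFormatException: empty String, which is what the remote replica reports when an update supplies an empty value for a numeric field. A quick illustration with a hypothetical client-side guard — Python's int() fails on blank input the same way, raising ValueError where Java raises NumberFormatException:

```python
def parse_or_none(value: str):
    """Hypothetical guard: return None instead of failing on blank or
    non-numeric input, rather than shipping an empty string to a
    numeric field and failing on the Solr side."""
    try:
        return int(value)
    except ValueError:
        return None

parse_or_none("")    # None  -- the failing case from the logs
parse_or_none("42")  # 42
```

In other words, the thing to hunt for is which update sends an empty string where one of the four in-place-updated numeric fields expects a number.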

At 2021-01-08 20:07:24, "Doss"  wrote:
>We have a 12-node SolrCloud cluster with a 3-node ZooKeeper ensemble.
>RAM: 80, CPU: 40, Heap: 16GB, Records: 4 million
>
>We do real-time updates and deletes (by ID), and we use in-place updates
>for 4 fields.
>
>We have one index with 4 shards: 1 shard on 3 nodes.
>
>Often we are getting the following errors
>
>1. *2021-01-08 17:11:14.305 ERROR (qtp1720891078-7429) [c:profilesindex
>s:shard4 r:core_node42 x:profilesindex_shard4_replica_n41]
>o.a.s.s.HttpSolrCall
>null:org.apache.solr.update.processor.DistributedUpdateProcessor$DistributedUpdatesAsyncException:
>Async exception during distributed update: Error from server at
>http://171.0.0.145:8983/solr/profilesindex_shard3_replica_n49/
><http://171.0.0.145:8983/solr/profilesindex_shard3_replica_n49/>: null*
>
>request: http://171.0.0.145:8983/solr/profilesindex_shard3_replica_n49/
>Remote error message: empty String
>at
>org.apache.solr.update.processor.DistributedZkUpdateProcessor.doDistribFinish(DistributedZkUpdateProcessor.java:1193)
>at
>org.apache.solr.update.processor.DistributedUpdateProcessor.finish(DistributedUpdateProcessor.java:1125)
>at
>org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:78)
>at
>org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:214)
>at org.apache.solr.core.SolrCore.execute(SolrCore.java:2606)
>at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:812)
>at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:588)
>at
>org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:415)
>at
>org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:345)
>at
>org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1596)
>at
>org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
>at
>org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
>at
>org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:590)
>at
>org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
>at
>org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
>at
>org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
>at
>org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
>at
>org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
>at
>org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
>at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
>at
>org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
>at
>org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
>at
>org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
>at
>org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
>at
>org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:221)
>at
>org.eclipse.jetty.server.handler.InetAccessHandler.handle(InetAccessHandler.java:177)
>at
>org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
>at
>org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
>at
>org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
>at
>org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
>at org.eclipse.jetty.server.Server.handle(Server.java:500)
>at
>org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
>at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
>at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
>at
>org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
>at
>org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
>at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
>at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
>at
>org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336)
>at
>org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313)
>at
>org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171)
>at
>org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129)
>at
>org.eclipse.jetty.u

Remote error message: empty String & null:java.lang.NumberFormatException: empty String

2021-01-08 Thread Doss
We have 12 node SOLR cloud with 3 zookeeper ensemble
RAM: 80 CPU:40 Heap:16GB Records: 4 Million

We do real-time updates and deletes (by ID), and we use in-place updates
for 4 fields

We have one index with 4 shards: 1 shard in 3 nodes

Often we are getting the following errors

1. *2021-01-08 17:11:14.305 ERROR (qtp1720891078-7429) [c:profilesindex
s:shard4 r:core_node42 x:profilesindex_shard4_replica_n41]
o.a.s.s.HttpSolrCall
null:org.apache.solr.update.processor.DistributedUpdateProcessor$DistributedUpdatesAsyncException:
Async exception during distributed update: Error from server at
http://171.0.0.145:8983/solr/profilesindex_shard3_replica_n49/
<http://171.0.0.145:8983/solr/profilesindex_shard3_replica_n49/>: null*

request: http://171.0.0.145:8983/solr/profilesindex_shard3_replica_n49/
Remote error message: empty String
at
org.apache.solr.update.processor.DistributedZkUpdateProcessor.doDistribFinish(DistributedZkUpdateProcessor.java:1193)
at
org.apache.solr.update.processor.DistributedUpdateProcessor.finish(DistributedUpdateProcessor.java:1125)
at
org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:78)
at
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:214)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:2606)
at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:812)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:588)
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:415)
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:345)
at
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1596)
at
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:590)
at
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
at
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
at
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
at
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
at
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
at
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
at
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
at
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:221)
at
org.eclipse.jetty.server.handler.InetAccessHandler.handle(InetAccessHandler.java:177)
at
org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
at
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at
org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
at
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.Server.handle(Server.java:500)
at
org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
at
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
at
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
at
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336)
at
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313)
at
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171)
at
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129)
at
org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:375)
at
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
at
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
at java.base/java.lang.Thread.run(Thread.java:832)
=
2.  *2021-01-08 17:15:17.849 ERROR (qtp1720891078-7120) [c:profilesindex
s:shard4 r:core_node42 x:profilesindex_shard4_replica_n41]
o.a.s.s.HttpSolrCall null:java.lang.NumberFormatException: empty String*
at
jav

Solr query with space (only) gives error

2021-01-07 Thread vstuart
I have a frontend that uses Ajax to query Solr.

It's working well, but if I enter a single space (nothing else) in the
input/search box, the URL in the browser will show

... index.html#q=%20

In that circumstance I get a 400 error (as there are no parameters in the
request), which is fine, but my web page stalls, waiting for a response.

If, however, I enter a semicolon ( ; ) rather than a space, then the page
immediately refreshes, albeit with no results ("displaying 0 to 0 of 0").
Also fine / expected.

My question is what is triggering the " " (%20) query fault in Solr, and how
do I address (ideally, ignore) it?



--
Sent from: https://lucene.472066.n3.nabble.com/Solr-User-f472068.html
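
One way to avoid both the 400 and the stalled page is to normalize the query client-side before the Ajax call and skip the request entirely when nothing remains after trimming. A sketch of the guard (shown here in Java; the same few lines translate directly to the JavaScript that builds the request):

```java
public class QueryGuard {
    // Returns null for a blank/whitespace-only query so the caller can
    // short-circuit (e.g. render "0 results") instead of sending q=%20
    static String normalize(String raw) {
        if (raw == null) return null;
        String q = raw.trim();
        return q.isEmpty() ? null : q;
    }
}
```

An alternative, if blank input should mean "match everything", is to substitute `*:*` instead of returning null.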


An Error related Apache tika

2020-12-15 Thread fujii.ruka...@mitsubishichem-sys.co.jp
Hi, All

I get this error when I try to get the contents of a Microsoft Office document 
(.xls) using Apache Tika.

Error 500 java.lang.NoSuchMethodError: 
‘org.apache.poi.hssf.record.common.ExtRst 
org.apache.poi.hssf.record.common.UnicodeString.getExtendedRst()’

Apache Tika can extract many document types (.xlsx, .docx, .pptx, .pdf, etc.), but 
this error mainly occurs when extracting .xls documents.
We searched for a solution to this error but couldn't find one.
The versions of the tools we are using are:
Solr 8.6.3
openjdk 11.0.9.1
poi-4.1.1.jar
poi-ooxml-4.1.1.jar

Could you please tell me why this error occurred?

Thanks.
Ruka Fujii
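
No answer was recorded in this thread, but a NoSuchMethodError like this usually means the POI jar actually loaded at runtime differs from the one Tika was compiled against, e.g. a second, older poi jar somewhere on Solr's classpath. As a sketch (the class and method names below are copied from the error message, not verified against the POI 4.1.1 sources), a small reflection check run with the same classpath can confirm whether the loaded class declares the method:

```java
import java.lang.reflect.Method;

public class MethodCheck {
    // True if the named class is on the classpath and declares a public
    // zero-argument method with the given name
    static boolean hasMethod(String className, String methodName) {
        try {
            Method m = Class.forName(className).getMethod(methodName);
            return m != null;
        } catch (ClassNotFoundException | NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // On the failing server, check the pair from the error message;
        // false would confirm a mismatched poi jar is being loaded
        System.out.println(hasMethod(
                "org.apache.poi.hssf.record.common.UnicodeString",
                "getExtendedRst"));
    }
}
```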


Re: SolrCloud crashing due to memory error - 'Cannot allocate memory' (errno=12)

2020-12-10 Thread Walter Underwood
How much RAM do you have on those machines? That message says you ran out.

32 GB is a HUGE heap. Unless you have a specific need for that, run with an 8 GB
heap and see how that works.

wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/  (my blog)

> On Dec 10, 2020, at 7:55 PM, Altamirano, Emmanuel 
>  wrote:
> 
> Hello,
>  
> We have a SolrCloud (8.6) with 3 servers with the same characteristics and 
> configuration. We assigned 32GB of heap memory each, and after a short 
> period of sending 40 concurrent requests to the SolrCloud through a load 
> balancer, we get the following error, which shuts down each Solr server 
> and ZooKeeper:
>  
> OpenJDK 64-Bit Server VM warning: Failed to reserve large pages memory 
> req_addr: 0x bytes: 536870912 (errno = 12).
> OpenJDK 64-Bit Server VM warning: Attempt to deallocate stack guard pages 
> failed.
> OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x7edd4d9da000, 
> 12288, 0) failed; error='Cannot allocate memory' (errno=12)
>  
>  
> 20201201 10:43:29.495 [ERROR] {qtp2051853139-23369} [c:express s:shard1 
> r:core_node6 x:express_shard1_replica_n4] 
> [org.apache.solr.handler.RequestHandlerBase, 148] | 
> org.apache.solr.common.SolrException: Cannot talk to ZooKeeper - Updates are 
> disabled.
> at 
> org.apache.solr.update.processor.DistributedZkUpdateProcessor.zkCheck(DistributedZkUpdateProcessor.java:1245)
> at 
> org.apache.solr.update.processor.DistributedZkUpdateProcessor.setupRequest(DistributedZkUpdateProcessor.java:582)
> at 
> org.apache.solr.update.processor.DistributedZkUpdateProcessor.processAdd(DistributedZkUpdateProcessor.java:239)
>  
> 
>  
> We have one collection with one shard, with almost 400 million documents 
> (~334GB).
>  
> $ sysctl vm.nr_hugepages
> vm.nr_hugepages = 32768
> $ sysctl vm.max_map_count
> vm.max_map_count = 131072
>  
> /etc/security/limits.conf
>  
> * - core unlimited
> * - data unlimited
> * - priority unlimited
> * - fsize unlimited
> * - sigpending 513928
> * - memlock unlimited
> * - nofile 131072
> * - msgqueue 819200
> * - rtprio 0
> * - stack 8192
> * - cpu unlimited
> * - rss unlimited #virtual memory unlimited
> * - locks unlimited
> * soft nproc 65536
> * hard nproc 65536
> * - nofile 131072
>  
>  
>  
> /etc/sysctl.conf
>  
> vm.nr_hugepages =  32768
> vm.max_map_count = 131072
>  
>  
> Could you please provide me some advice to fix this error?
>  
> Thanks,
>  
> Emmanuel Altamirano



SolrCloud crashing due to memory error - 'Cannot allocate memory' (errno=12)

2020-12-10 Thread Altamirano, Emmanuel
Hello,

We have a SolrCloud (8.6) with 3 servers with the same characteristics and 
configuration. We assigned 32GB of heap memory each, and after a short 
period of sending 40 concurrent requests to the SolrCloud through a load 
balancer, we get the following error, which shuts down each Solr server and 
ZooKeeper:

OpenJDK 64-Bit Server VM warning: Failed to reserve large pages memory 
req_addr: 0x bytes: 536870912 (errno = 12).
OpenJDK 64-Bit Server VM warning: Attempt to deallocate stack guard pages 
failed.
OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x7edd4d9da000, 
12288, 0) failed; error='Cannot allocate memory' (errno=12)


20201201 10:43:29.495 [ERROR] {qtp2051853139-23369} [c:express s:shard1 
r:core_node6 x:express_shard1_replica_n4] 
[org.apache.solr.handler.RequestHandlerBase, 148] | 
org.apache.solr.common.SolrException: Cannot talk to ZooKeeper - Updates are 
disabled.
at 
org.apache.solr.update.processor.DistributedZkUpdateProcessor.zkCheck(DistributedZkUpdateProcessor.java:1245)
at 
org.apache.solr.update.processor.DistributedZkUpdateProcessor.setupRequest(DistributedZkUpdateProcessor.java:582)
at 
org.apache.solr.update.processor.DistributedZkUpdateProcessor.processAdd(DistributedZkUpdateProcessor.java:239)


We have one collection with one shard, with almost 400 million documents (~334GB).

$ sysctl vm.nr_hugepages
vm.nr_hugepages = 32768
$ sysctl vm.max_map_count
vm.max_map_count = 131072

/etc/security/limits.conf

* - core unlimited
* - data unlimited
* - priority unlimited
* - fsize unlimited
* - sigpending 513928
* - memlock unlimited
* - nofile 131072
* - msgqueue 819200
* - rtprio 0
* - stack 8192
* - cpu unlimited
* - rss unlimited #virtual memory unlimited
* - locks unlimited
* soft nproc 65536
* hard nproc 65536
* - nofile 131072



/etc/sysctl.conf

vm.nr_hugepages =  32768
vm.max_map_count = 131072


Could you please provide me some advice to fix this error?

Thanks,

Emmanuel Altamirano


Error when restoring Solr

2020-11-19 Thread Gell-Holleron, Daniel
Hello,

I'm trying to restore Solr and I'm getting a timeout error, e.g. Timeout 
occurred when waiting response from server at http://solrserver:8983/solr

It then says 'could not restore core'. There are just under 40 million records 
to restore so I understand this will take some time.

What timeout setting is it that I'd need to increase? My guess is the 
connTimeout and maybe the socketTimeout in the solr.xml?

Thanks,

Daniel
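
No reply is recorded here, but since a core restore through the replication handler is long-running, one option while experimenting is to trigger it with a client whose timeouts you control, rather than relying on whatever defaults the calling tool uses. A sketch (the base URL, core, and backup name are placeholders; the parameters follow the replication handler's `command=restore` form):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class RestoreCall {
    // Build the replication-handler restore URL for a core
    static String restoreUrl(String base, String core, String backup) {
        return base + "/solr/" + core + "/replication?command=restore&name=" + backup;
    }

    // Configure client-side timeouts: fail fast on connect, but never
    // time out while waiting for the (long-running) restore response
    static HttpURLConnection open(String url) {
        try {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setConnectTimeout(10_000); // ms; give up quickly if the server is down
            conn.setReadTimeout(0);         // 0 = no socket read timeout
            return conn;                    // nothing is sent until getResponseCode()
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

Since the restore itself runs asynchronously on the server, another route is to fire the request with ordinary timeouts and then poll `command=restorestatus` until it reports completion.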



Re: httpclient gives error

2020-10-31 Thread Shawn Heisey

On 10/31/2020 12:54 PM, Raivo Rebane wrote:
I am trying to use SolrJ in a web application with Eclipse Tomcat, but I get the 
following errors





Tomcat lib contains following http jars:

-rw-rw-rw- 1 hydra hydra 326724 sept   6 21:33 httpcore-4.4.4.jar
-rw-rw-rw- 1 hydra hydra 736658 sept   6 21:33 httpclient-4.5.2.jar
-rwxrwxr-x 1 hydra hydra  21544 sept   9 11:17 
httpcore5-reactive-5.0.2.jar*

-rwxrwxr-x 1 hydra hydra 809733 sept   9 12:26 httpcore5-5.0.2.jar*
-rwxrwxr-x 1 hydra hydra 225863 sept   9 12:27 httpcore5-h2-5.0.2.jar*
-rwxrwxr-x 1 hydra hydra 145492 sept   9 12:30 httpcore5-testing-5.0.2.jar*
-rwxrwxr-x 1 hydra hydra 775798 okt    3 18:53 httpclient5-5.0.3.jar*
-rwxrwxr-x 1 hydra hydra  24047 okt    3 18:54 
httpclient5-fluent-5.0.3.jar*

-rwxrwxr-x 1 hydra hydra 259199 okt    3 18:54 httpclient5-cache-5.0.3.jar*
-rwxrwxr-x 1 hydra hydra  15576 okt    3 18:54 httpclient5-win-5.0.3.jar*
-rwxrwxr-x 1 hydra hydra  38022 okt    3 18:55 
httpclient5-testing-5.0.3.jar*

-rw-rw-r-- 1 hydra hydra  37068 okt   31 19:50 httpmime-4.3.jar


Version 5 of the apache httpclient is not used by any SolrJ version. 
Newer versions of SolrJ utilize the Jetty httpclient for http/2 support, 
not the apache httpclient.  The older client, using apache httpclient 
4.x, is still present in newer SolrJ versions.  Your message did not 
indicate which version of SolrJ you are using.  One of your previous 
emails to the list mentions version 8.6.3 of SolrJ ... the httpclient 
4.x jars that you have are different versions than the ones that version 
of SolrJ asks for.


Looking over previous emails that you have sent to the mailing list, I 
wonder why you are adding jars manually instead of letting Maven handle 
all of the dependencies.  A common problem when dependency resolution is 
not automatic is that the classpath is missing one or more of the jars 
that exist on the filesystem.


I don't think this problem is directly caused by SolrJ.  It could be 
that the httpclient 4.x jars you have are not new enough, or there might 
be some unknown interaction between the 4.x jars and the 5.x jars.  Or 
maybe your classpath is incomplete -- doesn't include something in your 
file listing above.


Problems like this can also be caused by having multiple copies of the 
same or similar versions of jars on the classpath.  That kind of issue 
could be very hard to track down.  It can easily be caused by utilizing 
a mixture of automatic and manual dependencies.  Choose either all 
automatic (maven, ivy, gradle, etc) or all manual.


Thanks,
Shawn
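
A quick way to act on the "multiple copies of the same jar" suspicion, as a sketch: ask the JVM which jar each suspect class was actually loaded from. This has to run inside the webapp (for example from a scratch servlet), since Tomcat's classloader is what matters; the class name in main is taken from the stack trace in the original mail.

```java
import java.security.CodeSource;

public class JarLocator {
    // Where was this class loaded from? null means the JDK/bootstrap
    // loader; for application classes this names the exact jar used
    static String locationOf(String className) {
        try {
            CodeSource src = Class.forName(className)
                                  .getProtectionDomain().getCodeSource();
            return src == null ? null : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "NOT ON CLASSPATH";
        }
    }

    public static void main(String[] args) {
        // e.g. prints .../httpcore-4.4.4.jar (or whichever copy wins)
        System.out.println(locationOf(
                "org.apache.http.impl.io.DefaultHttpRequestWriterFactory"));
    }
}
```

If the jar printed is not the httpcore version the SolrJ release expects, that is the conflict to remove.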


httpclient gives error

2020-10-31 Thread Raivo Rebane

Hello

I am trying to use SolrJ in a web application with Eclipse Tomcat, but I get the 
following errors


java.lang.NoSuchFieldError: INSTANCE
    at 
org.apache.http.impl.io.DefaultHttpRequestWriterFactory.(DefaultHttpRequestWriterFactory.java:52)
    at 
org.apache.http.impl.io.DefaultHttpRequestWriterFactory.(DefaultHttpRequestWriterFactory.java:56)
    at 
org.apache.http.impl.io.DefaultHttpRequestWriterFactory.(DefaultHttpRequestWriterFactory.java:46)
    at 
org.apache.http.impl.conn.ManagedHttpClientConnectionFactory.(ManagedHttpClientConnectionFactory.java:82)
    at 
org.apache.http.impl.conn.ManagedHttpClientConnectionFactory.(ManagedHttpClientConnectionFactory.java:95)
    at 
org.apache.http.impl.conn.ManagedHttpClientConnectionFactory.(ManagedHttpClientConnectionFactory.java:104)
    at 
org.apache.http.impl.conn.ManagedHttpClientConnectionFactory.(ManagedHttpClientConnectionFactory.java:62)
    at 
org.apache.http.impl.conn.PoolingHttpClientConnectionManager$InternalConnectionFactory.(PoolingHttpClientConnectionManager.java:572)
    at 
org.apache.http.impl.conn.PoolingHttpClientConnectionManager.(PoolingHttpClientConnectionManager.java:174)
    at 
org.apache.http.impl.conn.PoolingHttpClientConnectionManager.(PoolingHttpClientConnectionManager.java:158)
    at 
org.apache.http.impl.conn.PoolingHttpClientConnectionManager.(PoolingHttpClientConnectionManager.java:149)
    at 
org.apache.http.impl.conn.PoolingHttpClientConnectionManager.(PoolingHttpClientConnectionManager.java:125)
    at 
org.apache.solr.client.solrj.impl.HttpClientUtil.createPoolingConnectionManager(HttpClientUtil.java:278)
    at 
org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:273)
    at 
org.apache.solr.client.solrj.impl.HttpSolrClient.(HttpSolrClient.java:204)
    at 
org.apache.solr.client.solrj.impl.HttpSolrClient$Builder.build(HttpSolrClient.java:968)

    at AppServServlet.init(AppServServlet.java:63)


Tomcat lib contains following http jars:

-rw-rw-rw- 1 hydra hydra 326724 sept   6 21:33 httpcore-4.4.4.jar
-rw-rw-rw- 1 hydra hydra 736658 sept   6 21:33 httpclient-4.5.2.jar
-rwxrwxr-x 1 hydra hydra  21544 sept   9 11:17 httpcore5-reactive-5.0.2.jar*
-rwxrwxr-x 1 hydra hydra 809733 sept   9 12:26 httpcore5-5.0.2.jar*
-rwxrwxr-x 1 hydra hydra 225863 sept   9 12:27 httpcore5-h2-5.0.2.jar*
-rwxrwxr-x 1 hydra hydra 145492 sept   9 12:30 httpcore5-testing-5.0.2.jar*
-rwxrwxr-x 1 hydra hydra 775798 okt    3 18:53 httpclient5-5.0.3.jar*
-rwxrwxr-x 1 hydra hydra  24047 okt    3 18:54 httpclient5-fluent-5.0.3.jar*
-rwxrwxr-x 1 hydra hydra 259199 okt    3 18:54 httpclient5-cache-5.0.3.jar*
-rwxrwxr-x 1 hydra hydra  15576 okt    3 18:54 httpclient5-win-5.0.3.jar*
-rwxrwxr-x 1 hydra hydra  38022 okt    3 18:55 
httpclient5-testing-5.0.3.jar*

-rw-rw-r-- 1 hydra hydra  37068 okt   31 19:50 httpmime-4.3.jar

 which are accessible to Tomcat.

What is wrong with my application?

Please somebody help me

Regards

Raivo




Re: Need help in understanding the below error message when running solr-exporter

2020-10-19 Thread yaswanth kumar
Can someone help on the above pls??

On Sat, Oct 17, 2020 at 6:22 AM yaswanth kumar 
wrote:

> Using Solr 8.2; Zoo 3.4; Solr mode: Cloud with multiple collections; Basic
> Authentication: Enabled
>
> I am trying to run the
>
> export JAVA_OPTS="-Djavax.net.ssl.trustStore=etc/solr-keystore.jks
> -Djavax.net.ssl.trustStorePassword=solrssl
> -Dsolr.httpclient.builder.factory=org.apache.solr.client.solrj.impl.PreemptiveBasicAuthClientBuilderFactory
> -Dbasicauth=solrrocks:"
>
> export
> CLASSPATH_PREFIX="../../server/solr-webapp/webapp/WEB-INF/lib/commons-codec-1.11.jar"
>
> /bin/solr-exporter -p 8085 -z localhost:2181/solr -f
> ./conf/solr-exporter-config.xml -n 16
>
> and I am seeing the messages below; on the Grafana Solr dashboard I do see
> panels coming in, but data is not populating on them.
>
> Can someone help me if I am missing something in terms of configuration?
>
> WARN  - 2020-10-17 11:17:59.687; org.apache.solr.prometheus.scraper.Async;
> Error occurred during metrics collection =>
> java.util.concurrent.ExecutionException: java.lang.RuntimeException:
> java.lang.NullPointerException
> at
> java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
> java.util.concurrent.ExecutionException: java.lang.RuntimeException:
> java.lang.NullPointerException
> at
> java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
> ~[?:?]
> at
> java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1999)
> ~[?:?]
> at
> org.apache.solr.prometheus.scraper.Async.lambda$null$1(Async.java:45)
> [solr-prometheus-exporter-8.2.0.jar:8.2.0
> 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe - ivera - 2019-07-19 15:10:57]
> at
> org.apache.solr.prometheus.scraper.Async$$Lambda$190/.accept(Unknown
> Source) [solr-prometheus-exporter-8.2.0.jar:8.2.0
> 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe - ivera - 2019-07-19 15:10:57]
> at
> java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
> [?:?]
> at
> java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
> [?:?]
> at
> java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1654)
> [?:?]
> at
> java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:497) [?:?]
> at
> java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:487)
> [?:?]
> at
> java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
> [?:?]
> at
> java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
> [?:?]
> at
> java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:239) [?:?]
> at
> java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497) [?:?]
> at
> org.apache.solr.prometheus.scraper.Async.lambda$waitForAllSuccessfulResponses$3(Async.java:43)
> [solr-prometheus-exporter-8.2.0.jar:8.2.0
> 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe - ivera - 2019-07-19 15:10:57]
> at
> org.apache.solr.prometheus.scraper.Async$$Lambda$165/.apply(Unknown
> Source) [solr-prometheus-exporter-8.2.0.jar:8.2.0
> 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe - ivera - 2019-07-19 15:10:57]
> at
> java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:986)
> [?:?]
> at
> java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:970)
> [?:?]
> at
> java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
> [?:?]
> at
> java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1705)
> [?:?]
> at
> org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209)
> [solr-solrj-8.2.0.jar:8.2.0 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe -
> ivera - 2019-07-19 15:11:07]
> at
> org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor$$Lambda$142/.run(Unknown
> Source) [solr-solrj-8.2.0.jar:8.2.0
> 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe - ivera - 2019-07-19 15:11:07]
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> [?:?]
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> [?:?]
> at java.lang.Thread.run(Thread.java:834) [?:?]
> Caused by: java.lang.RuntimeException: java.lang.NullPointerException
> at
> org.apache.solr.prometheus.collector.SchedulerMetricsCollector.lambda$collectMetrics$0(SchedulerMetricsCollector.java:92)
> ~[solr-pr

Need help in understanding the below error message when running solr-exporter

2020-10-17 Thread yaswanth kumar
Using Solr 8.2; Zoo 3.4; Solr mode: Cloud with multiple collections; Basic
Authentication: Enabled

I am trying to run the exporter as follows:

export JAVA_OPTS="-Djavax.net.ssl.trustStore=etc/solr-keystore.jks
-Djavax.net.ssl.trustStorePassword=solrssl
-Dsolr.httpclient.builder.factory=org.apache.solr.client.solrj.impl.PreemptiveBasicAuthClientBuilderFactory
-Dbasicauth=solrrocks:"

export
CLASSPATH_PREFIX="../../server/solr-webapp/webapp/WEB-INF/lib/commons-codec-1.11.jar"

/bin/solr-exporter -p 8085 -z localhost:2181/solr -f
./conf/solr-exporter-config.xml -n 16

and I am seeing the messages below; on the Grafana Solr dashboard I do see
panels coming in, but data is not populating on them.

Can someone help me if I am missing something in terms of configuration?

WARN  - 2020-10-17 11:17:59.687; org.apache.solr.prometheus.scraper.Async;
Error occurred during metrics collection =>
java.util.concurrent.ExecutionException: java.lang.RuntimeException:
java.lang.NullPointerException
at
java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
java.util.concurrent.ExecutionException: java.lang.RuntimeException:
java.lang.NullPointerException
at
java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
~[?:?]
at
java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1999)
~[?:?]
at
org.apache.solr.prometheus.scraper.Async.lambda$null$1(Async.java:45)
[solr-prometheus-exporter-8.2.0.jar:8.2.0
31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe - ivera - 2019-07-19 15:10:57]
at
org.apache.solr.prometheus.scraper.Async$$Lambda$190/.accept(Unknown
Source) [solr-prometheus-exporter-8.2.0.jar:8.2.0
31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe - ivera - 2019-07-19 15:10:57]
at
java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
[?:?]
at
java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
[?:?]
at
java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1654)
[?:?]
at
java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:497) [?:?]
at
java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:487)
[?:?]
at
java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
[?:?]
at
java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
[?:?]
at
java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:239) [?:?]
at
java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497) [?:?]
at
org.apache.solr.prometheus.scraper.Async.lambda$waitForAllSuccessfulResponses$3(Async.java:43)
[solr-prometheus-exporter-8.2.0.jar:8.2.0
31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe - ivera - 2019-07-19 15:10:57]
at
org.apache.solr.prometheus.scraper.Async$$Lambda$165/.apply(Unknown
Source) [solr-prometheus-exporter-8.2.0.jar:8.2.0
31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe - ivera - 2019-07-19 15:10:57]
at
java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:986)
[?:?]
at
java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:970)
[?:?]
at
java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
[?:?]
at
java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1705)
[?:?]
at
org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209)
[solr-solrj-8.2.0.jar:8.2.0 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe -
ivera - 2019-07-19 15:11:07]
at
org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor$$Lambda$142/.run(Unknown
Source) [solr-solrj-8.2.0.jar:8.2.0
31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe - ivera - 2019-07-19 15:11:07]
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[?:?]
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[?:?]
at java.lang.Thread.run(Thread.java:834) [?:?]
Caused by: java.lang.RuntimeException: java.lang.NullPointerException
at
org.apache.solr.prometheus.collector.SchedulerMetricsCollector.lambda$collectMetrics$0(SchedulerMetricsCollector.java:92)
~[solr-prometheus-exporter-8.2.0.jar:8.2.0
31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe - ivera - 2019-07-19 15:10:57]
at
org.apache.solr.prometheus.collector.SchedulerMetricsCollector$$Lambda$163/.get(Unknown
Source) ~[?:?]
at
java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
~[?:?]
... 5 more
Caused by: java.lang.NullPointerException
at
org.apache.solr.prometheus.scraper.SolrScraper.request(SolrScraper.java:112)
~[solr-prometheus-exporter-8.2.0.jar:8.2.0
31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe - ivera - 2019-07-

Error false and Error true in Solr logs

2020-10-15 Thread gnandre
Hi,

What do Error false and Error true flags mentioned against Solr errors in
Solr admin UI log mean?


Need help in trying to understand the error

2020-10-13 Thread yaswanth kumar
I am seeing the errors below frequently in the Solr logs. All functionality
seems to be working fine, but I am not sure why so many of these errors
are happening in the backend.

Using: Solr 8.2, ZooKeeper 3.4
We have enabled Solr basic authentication with security.json

2020-10-13 20:37:12.320 ERROR (qtp969996005-4438) [   ]
o.a.s.c.s.i.HttpClientUtil  => org.apache.solr.common.SolrException:
javax.crypto.BadPaddingException: RSA private key operation failed
at org.apache.solr.util.CryptoKeys$RSAKeyPair.encrypt(CryptoKeys.java:325)
org.apache.solr.common.SolrException: javax.crypto.BadPaddingException: RSA
private key operation failed
at org.apache.solr.util.CryptoKeys$RSAKeyPair.encrypt(CryptoKeys.java:325)
~[solr-core-8.2.0.jar:8.2.0 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe -
ivera - 2019-07-19 15:11:04]
at
org.apache.solr.security.PKIAuthenticationPlugin.generateToken(PKIAuthenticationPlugin.java:305)
~[solr-core-8.2.0.jar:8.2.0 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe -
ivera - 2019-07-19 15:11:04]
at
org.apache.solr.security.PKIAuthenticationPlugin.setHeader(PKIAuthenticationPlugin.java:311)
~[solr-core-8.2.0.jar:8.2.0 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe -
ivera - 2019-07-19 15:11:04]
at
org.apache.solr.security.PKIAuthenticationPlugin$HttpHeaderClientInterceptor.process(PKIAuthenticationPlugin.java:271)
~[solr-core-8.2.0.jar:8.2.0 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe -
ivera - 2019-07-19 15:11:04]
at
org.apache.solr.client.solrj.impl.HttpClientUtil$DynamicInterceptor$1.accept(HttpClientUtil.java:179)
~[solr-solrj-8.2.0.jar:8.2.0 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe -
ivera - 2019-07-19 15:11:07]
at
org.apache.solr.client.solrj.impl.HttpClientUtil$DynamicInterceptor$1.accept(HttpClientUtil.java:174)
~[solr-solrj-8.2.0.jar:8.2.0 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe -
ivera - 2019-07-19 15:11:07]
at
java.util.concurrent.CopyOnWriteArrayList.forEach(CopyOnWriteArrayList.java:804)
~[?:?]
at
org.apache.solr.client.solrj.impl.HttpClientUtil$DynamicInterceptor.process(HttpClientUtil.java:174)
~[solr-solrj-8.2.0.jar:8.2.0 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe -
ivera - 2019-07-19 15:11:07]
at
org.apache.http.protocol.ImmutableHttpProcessor.process(ImmutableHttpProcessor.java:133)
~[httpcore-4.4.10.jar:4.4.10]
at
org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:183)
~[httpclient-4.5.6.jar:4.5.6]
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
~[httpclient-4.5.6.jar:4.5.6]
at
org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
~[httpclient-4.5.6.jar:4.5.6]
at
org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
~[httpclient-4.5.6.jar:4.5.6]
at
org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
~[httpclient-4.5.6.jar:4.5.6]
at
org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
~[httpclient-4.5.6.jar:4.5.6]
at org.apache.solr.servlet.HttpSolrCall.remoteQuery(HttpSolrCall.java:688)
~[solr-core-8.2.0.jar:8.2.0 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe -
ivera - 2019-07-19 15:11:04]
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:550)
~[solr-core-8.2.0.jar:8.2.0 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe -
ivera - 2019-07-19 15:11:04]
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:423)
~[solr-core-8.2.0.jar:8.2.0 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe -
ivera - 2019-07-19 15:11:04]
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:350)
~[solr-core-8.2.0.jar:8.2.0 31d7ec7bbfdcd2c4cc61d9d35e962165410b65fe -
ivera - 2019-07-19 15:11:04]
at
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)
~[jetty-servlet-9.4.19.v20190610.jar:9.4.19.v20190610]
at
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)
~[jetty-servlet-9.4.19.v20190610.jar:9.4.19.v20190610]
at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
~[jetty-server-9.4.19.v20190610.jar:9.4.19.v20190610]
at
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
~[jetty-security-9.4.19.v20190610.jar:9.4.19.v20190610]
at
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
~[jetty-server-9.4.19.v20190610.jar:9.4.19.v20190610]
at
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
~[jetty-server-9.4.19.v20190610.jar:9.4.19.v20190610]
at
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1711)
~[jetty-server-9.4.19.v20190610.jar:9.4.19.v20190610]
at
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
~[jetty-server-9.4.19.v20190610.jar:9.4.19.v20190610]
at
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1347)
~[jetty-server-9.4.19.v20190610.jar:9.4.19.v20190610]
at
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
~[jetty-server-9.4

Re: Solr Issue - Logger : UpdateLog Error Message : java.io.EOFException

2020-10-03 Thread Erick Erickson
Very strange things start to happen when GC becomes unstable. The first and 
simplest thing to do would be to bump up your heap, say to 20g (note: try to 
stay under 32g, or be prepared to jump significantly higher. At 32g and above 
the JVM can no longer use compressed object pointers, so every reference takes 
8 bytes instead of 4 and you actually have less usable memory than you think).

The first three warnings indicate that you have both managed-schema and 
schema.xml in your configset _and_ are using the managed schema (enabled in 
solrconfig.xml). This also suggests you're upgrading from a previous version. 
The warning is printed as a courtesy to tell you that schema.xml is no longer 
being used, so you should delete it to avoid confusion. NOTE: if you want to 
keep using schema.xml as before, see the reference guide.
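
For reference, switching back to the classic (non-managed) schema is a 
one-line change in solrconfig.xml. A sketch of the relevant fragment (check 
the reference guide for the exact form in your Solr version):

```
<!-- In solrconfig.xml: read schema.xml instead of managed-schema -->
<schemaFactory class="ClassicIndexSchemaFactory"/>
```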

The fourth warning suggests that Solr was killed without committing and is 
replaying the transaction log. For instance, "kill -9" will do it. If you kill 
Solr like that before a commit completes, updates are replayed from the tlog 
in order to preserve data.

Which leads to your second issue. I'd guess you're either not committing after 
your updates (and, BTW, please just let your autocommit settings handle that) 
or forcefully killing Solr (e.g. kill -9). The latter can happen even with the 
"bin/solr stop" command if shutdown takes too long (3 minutes by default, last 
I knew). A "normal" shutdown that succeeds (i.e. a bin/solr stop that doesn't 
print a message about forcefully killing Solr) will commit on shutdown, BTW. 
Taking over 3 minutes to shut down may itself be a symptom of GC going crazy.

You should try to figure out why you have this kind of memory spike. 
Returning a zillion documents is one possible cause (e.g. a huge rows 
parameter). All the docs have to be assembled in memory, so if you need to 
return lots of rows, use streaming or cursorMark.
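
The cursorMark suggestion boils down to a small loop. A minimal sketch of
that loop in Python, with a toy fake_fetch standing in for the real HTTP
call to /select (the function names and page data here are illustrative,
not from this thread):

```python
def page_all(fetch, rows=2):
    """Collect all docs via cursorMark deep paging: the sort must include
    the uniqueKey, 'start' is never used, and we stop when nextCursorMark
    comes back unchanged."""
    docs, cursor = [], "*"
    while True:
        resp = fetch({"q": "*:*", "sort": "id asc",
                      "rows": rows, "cursorMark": cursor})
        docs.extend(resp["response"]["docs"])
        if resp["nextCursorMark"] == cursor:  # cursor repeated: done
            return docs
        cursor = resp["nextCursorMark"]

# Toy stand-in for a Solr /select call: two pages of results, then a
# final empty page whose cursor repeats.
PAGES = {"*": ("A", [1, 2]), "A": ("B", [3, 4]), "B": ("B", [])}

def fake_fetch(params):
    nxt, ids = PAGES[params["cursorMark"]]
    return {"response": {"docs": [{"id": i} for i in ids]},
            "nextCursorMark": nxt}

print(page_all(fake_fetch))  # -> [{'id': 1}, {'id': 2}, {'id': 3}, {'id': 4}]
```

Against a real server you'd replace fake_fetch with an HTTP request carrying
the same parameters; the key constraints are that the sort includes the
uniqueKey and that start is never set.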

So what I’d do:
1> bump up your heap
2> insure that you shut Solr down gracefully
3> see if any particular query triggers this memory spike and if you’re using 
an anti-pattern.

Best,
Erick

> On Oct 2, 2020, at 7:10 PM, Training By Coding  
> wrote:
> 
> Events:
>   • GC logs showing continuous Full GC events. Log report attached.
>   • Core filling failed , showing less data( Num Docs)  than expected.
>   • following warnings showing on dashboard before error.
> 
> Level Logger  Message
> WARN false ManagedIndexSchemaFactory   The schema has been upgraded to 
> managed, but the non-managed schema schema.xml is still loadable. PLEASE 
> REMOVE THIS FILE.
> WARN false ManagedIndexSchemaFactory   The schema has been upgraded to 
> managed, but the non-managed schema schema.xml is still loadable. PLEASE 
> REMOVE THIS FILE.
> WARN false SolrResourceLoader  Solr loaded a deprecated 
> plugin/analysis class [solr.TrieDateField]. Please consult documentation how 
> to replace it accordingly.
> WARN false ManagedIndexSchemaFactory   The schema has been upgraded to 
> managed, but the non-managed schema schema.xml is still loadable. PLEASE 
> REMOVE THIS FILE.
> WARN false UpdateLog   Starting log replay 
> tlog{file=\data\tlog\tlog.0445482 refcount=2} 
> active=false starting pos=0 inSortedOrder=false
>   • Total data in all cores around 8 GB
>   • Other Configurations:
>   • -XX:+UseG1GC
>   • -XX:+UseStringDeduplication
>   • -XX:MaxGCPauseMillis=500
>   • -Xms15g
>   • -Xmx15g
>   • -Xss256k
>   • OS Environment :
>   • Windows 10,
>   • Filling cores by calling SQL query using jtds-1.3.1 library.
>   • Solr Version 7.5
>   • Runtime: Oracle Corporation OpenJDK 64-Bit Server VM 11.0.2 
> 11.0.2+9
>   • Processors : 48
>   • System Physical Memory : 128 GB
>   • Swap Space : 256GB
>   • solr-spec7.5.0
>   • solr-impl7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - 
> jimczi - 2018-09-18 13:07:55
>   • lucene-spec7.5.0
>   • lucene-impl7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - 
> jimczi - 2018-09-18 13:01:1
> Error Message : 
> java.io.EOFException
> at 
> org.apache.solr.common.util.FastInputStream.readFully(FastInputStream.java:168)
> at org.apache.solr.common.util.JavaBinCodec.readStr(JavaBinCodec.java:863)
> at org.apache.solr.common.util.JavaBinCodec.readStr(JavaBinCodec.java:857)
> at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:266)
> at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:256)
> at 
> org.apache.solr.common.util.JavaBinCodec.readSolrInputDocument(JavaBinCodec.java:603)
> at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:315)
> at org.apache.solr.commo

Solr Issue - Logger : UpdateLog Error Message : java.io.EOFException

2020-10-03 Thread Training By Coding
*Events:*

   1. GC logs showing continuous Full GC events. Log report attached.
   2. *Core filling failed , showing less data( Num Docs)  than expected.*
   3. following warnings showing on dashboard before error.


Level Logger Message
WARN false ManagedIndexSchemaFactory The schema has been upgraded to
managed, but the non-managed schema schema.xml is still loadable. PLEASE
REMOVE THIS FILE.
WARN false ManagedIndexSchemaFactory The schema has been upgraded to
managed, but the non-managed schema schema.xml is still loadable. PLEASE
REMOVE THIS FILE.
WARN false SolrResourceLoader Solr loaded a deprecated plugin/analysis
class [solr.TrieDateField]. Please consult documentation how to replace it
accordingly.
WARN false ManagedIndexSchemaFactory The schema has been upgraded to
managed, but the non-managed schema schema.xml is still loadable. PLEASE
REMOVE THIS FILE.
WARN false UpdateLog Starting log replay
tlog{file=\data\tlog\tlog.0445482 refcount=2}
active=false starting pos=0 inSortedOrder=false

   - Total data in all cores around 8 GB
   - *Other Configurations:*
  - -XX:+UseG1GC
  - -XX:+UseStringDeduplication
  - -XX:MaxGCPauseMillis=500
  - -Xms15g
  - -Xmx15g
  - -Xss256k
   - *OS Environment :*
  - Windows 10,
  - Filling cores by calling SQL query using jtds-1.3.1 library.
  - Solr Version 7.5
  - Runtime: Oracle Corporation OpenJDK 64-Bit Server VM 11.0.2 11.0.2+9
  - Processors : 48
  - System Physical Memory : 128 GB
  - Swap Space : 256GB
   - solr-spec7.5.0
  - solr-impl7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - jimczi -
  2018-09-18 13:07:55
   - lucene-spec7.5.0
  - lucene-impl7.5.0 b5bf70b7e32d7ddd9742cc821d471c5fabd4e3df - jimczi
  - 2018-09-18 13:01:1

*Error Message :*

java.io.EOFException
at
org.apache.solr.common.util.FastInputStream.readFully(FastInputStream.java:168)
at org.apache.solr.common.util.JavaBinCodec.readStr(JavaBinCodec.java:863)
at org.apache.solr.common.util.JavaBinCodec.readStr(JavaBinCodec.java:857)
at
org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:266)
at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:256)
at
org.apache.solr.common.util.JavaBinCodec.readSolrInputDocument(JavaBinCodec.java:603)
at
org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:315)
at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:256)
at org.apache.solr.common.util.JavaBinCodec.readArray(JavaBinCodec.java:747)
at
org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:272)
at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:256)
at
org.apache.solr.update.TransactionLog$LogReader.next(TransactionLog.java:673)
at
org.apache.solr.update.UpdateLog$LogReplayer.doReplay(UpdateLog.java:1832)
at org.apache.solr.update.UpdateLog$LogReplayer.run(UpdateLog.java:1747)
at
java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at
java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at
org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209)
at
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)


Query function error - can not use FieldCache on multivalued field

2020-09-14 Thread Shamik Bandopadhyay
Hi,

  I'm trying to use Solr query function as a boost for term matches in the
title field. Here's my boost function

bf=if(exists(query({!v='title:Import data'})),10,0)

This throws the following error --> can not use FieldCache on multivalued
field: data

The function seems to work only for a single term. The title field is not
multivalued, but it is analyzed. Here's the field definition.



I was under the impression that I would be able to use the query function
to evaluate a regular query field. Am I missing something? If there's a
constraint on this function, can this boost be done in a different way?
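
The error naming "data" as a field hints at one possible cause: inside a
function query, whitespace separates arguments, so the parser may be reading
the second word as a bare field name rather than part of the title query.
If that is what's happening here (an educated guess, not confirmed in this
thread), Solr's parameter dereferencing keeps the phrase intact; the titleq
parameter name below is arbitrary:

```
bf=if(exists(query($titleq)),10,0)
titleq=title:"Import data"
```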

Any pointers will be appreciated.

Thanks,
Shamik


Re: Error on searches containing specific character pattern

2020-09-07 Thread Andy @ BlueFusion

Thanks David, I'll set up the techproducts schema and see what happens.

Kind regards,

Andy

On 4/09/20 4:09 pm, David Smiley wrote:

Hi,

I looked at the code at those line numbers and it seems simply impossible
that an ArrayIndexOutOfBoundsException could be thrown there because it's
guarded by a condition ensuring the array is of length 1.
https://github.com/apache/lucene-solr/blob/2752d50dd1dcf758a32dc573d02967612a2cf1ff/lucene/core/src/java/org/apache/lucene/util/QueryBuilder.java#L653

If you can reproduce this with the "techproducts" schema, please share the
complete query.  If there's a problem here, I suspect the synonyms you have
may be pertinent.

~ David Smiley
Apache Lucene/Solr Search Developer
http://www.linkedin.com/in/davidwsmiley


On Tue, Sep 1, 2020 at 11:50 PM Andy @ BlueFusion 
wrote:


Hi All,

I have an 8.6.0 instance that is working well with one exception.

It returns an error when the search term follows a pattern of numbers &
alpha characters such as:

   * 1a1 aa
   * 1a1 1aa
   * 1a1 11

Similar patterns that don't error

   * 1a1 a
   * 1a1 1
   * 1a11 aa
   * 11a1 aa
   * 1a1aa
   * 11a11 aa

The error is:

|"trace":"java.lang.ArrayIndexOutOfBoundsException: 0\n\t at
org.apache.lucene.util.QueryBuilder.newSynonymQuery(QueryBuilder.java:653)\n\t

at
org.apache.solr.parser.SolrQueryParserBase.newSynonymQuery(SolrQueryParserBase.java:617)\n\t

at
org.apache.lucene.util.QueryBuilder.analyzeGraphBoolean(QueryBuilder.java:533)\n\t

at
org.apache.lucene.util.QueryBuilder.createFieldQuery(QueryBuilder.java:320)\n\t

at
org.apache.lucene.util.QueryBuilder.createFieldQuery(QueryBuilder.java:240)\n\t

at
org.apache.solr.parser.SolrQueryParserBase.newFieldQuery(SolrQueryParserBase.java:524)\n\t

at
org.apache.solr.parser.QueryParser.newFieldQuery(QueryParser.java:62)\n\t
at
org.apache.solr.parser.SolrQueryParserBase.getFieldQuery(SolrQueryParserBase.java:1122)\n\t

at
org.apache.solr.parser.QueryParser.MultiTerm(QueryParser.java:593)\n\t
at org.apache.solr.parser.QueryParser.Query(QueryParser.java:142)\n\t at
org.apache.solr.parser.QueryParser.Clause(QueryParser.java:282)\n\t at
org.apache.solr.parser.QueryParser.Query(QueryParser.java:162)\n\t at
org.apache.solr.parser.QueryParser.Clause(QueryParser.java:282)\n\t at
org.apache.solr.parser.QueryParser.Query(QueryParser.java:162)\n\t at
org.apache.solr.parser.QueryParser.Clause(QueryParser.java:282)\n\t at
org.apache.solr.parser.QueryParser.Query(QueryParser.java:162)\n\t at
org.apache.solr.parser.QueryParser.TopLevelQuery(QueryParser.java:131)\n\t
at
org.apache.solr.parser.SolrQueryParserBase.parse(SolrQueryParserBase.java:260)\n\t

at org.apache.solr.search.LuceneQParser.parse(LuceneQParser.java:49)\n\t
at org.apache.solr.search.QParser.getQuery(QParser.java:174)\n\t at
org.apache.solr.handler.component.QueryComponent.prepare(QueryComponent.java:160)\n\t

at
org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:302)\n\t

at
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)\n\t

at org.apache.solr.core.SolrCore.execute(SolrCore.java:2596)\n\t at
org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:799)\n\t
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:578)\n\t
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)\n\t

at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)\n\t

at
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)\n\t

at
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)\n\t

at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)\n\t

at
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)\n\t

at
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)\n\t

at
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)\n\t

at
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1711)\n\t

at
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)\n\t

at
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1347)\n\t

at
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)\n\t

at
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)\n\t

at
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1678)\n\t

at
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)\n\t

at
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1249)\n\t

at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)\n\t

at
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)\n\t

at
org.eclipse.jetty.server.handler.HandlerColl

Re: Error on searches containing specific character pattern

2020-09-03 Thread David Smiley
Hi,

I looked at the code at those line numbers and it seems simply impossible
that an ArrayIndexOutOfBoundsException could be thrown there because it's
guarded by a condition ensuring the array is of length 1.
https://github.com/apache/lucene-solr/blob/2752d50dd1dcf758a32dc573d02967612a2cf1ff/lucene/core/src/java/org/apache/lucene/util/QueryBuilder.java#L653

If you can reproduce this with the "techproducts" schema, please share the
complete query.  If there's a problem here, I suspect the synonyms you have
may be pertinent.

~ David Smiley
Apache Lucene/Solr Search Developer
http://www.linkedin.com/in/davidwsmiley


On Tue, Sep 1, 2020 at 11:50 PM Andy @ BlueFusion 
wrote:

> Hi All,
>
> I have an 8.6.0 instance that is working well with one exception.
>
> It returns an error when the search term follows a pattern of numbers &
> alpha characters such as:
>
>   * 1a1 aa
>   * 1a1 1aa
>   * 1a1 11
>
> Similar patterns that don't error
>
>   * 1a1 a
>   * 1a1 1
>   * 1a11 aa
>   * 11a1 aa
>   * 1a1aa
>   * 11a11 aa
>
> The error is:
>
> |"trace":"java.lang.ArrayIndexOutOfBoundsException: 0\n\t at
> org.apache.lucene.util.QueryBuilder.newSynonymQuery(QueryBuilder.java:653)\n\t
>
> at
> org.apache.solr.parser.SolrQueryParserBase.newSynonymQuery(SolrQueryParserBase.java:617)\n\t
>
> at
> org.apache.lucene.util.QueryBuilder.analyzeGraphBoolean(QueryBuilder.java:533)\n\t
>
> at
> org.apache.lucene.util.QueryBuilder.createFieldQuery(QueryBuilder.java:320)\n\t
>
> at
> org.apache.lucene.util.QueryBuilder.createFieldQuery(QueryBuilder.java:240)\n\t
>
> at
> org.apache.solr.parser.SolrQueryParserBase.newFieldQuery(SolrQueryParserBase.java:524)\n\t
>
> at
> org.apache.solr.parser.QueryParser.newFieldQuery(QueryParser.java:62)\n\t
> at
> org.apache.solr.parser.SolrQueryParserBase.getFieldQuery(SolrQueryParserBase.java:1122)\n\t
>
> at
> org.apache.solr.parser.QueryParser.MultiTerm(QueryParser.java:593)\n\t
> at org.apache.solr.parser.QueryParser.Query(QueryParser.java:142)\n\t at
> org.apache.solr.parser.QueryParser.Clause(QueryParser.java:282)\n\t at
> org.apache.solr.parser.QueryParser.Query(QueryParser.java:162)\n\t at
> org.apache.solr.parser.QueryParser.Clause(QueryParser.java:282)\n\t at
> org.apache.solr.parser.QueryParser.Query(QueryParser.java:162)\n\t at
> org.apache.solr.parser.QueryParser.Clause(QueryParser.java:282)\n\t at
> org.apache.solr.parser.QueryParser.Query(QueryParser.java:162)\n\t at
> org.apache.solr.parser.QueryParser.TopLevelQuery(QueryParser.java:131)\n\t
> at
> org.apache.solr.parser.SolrQueryParserBase.parse(SolrQueryParserBase.java:260)\n\t
>
> at org.apache.solr.search.LuceneQParser.parse(LuceneQParser.java:49)\n\t
> at org.apache.solr.search.QParser.getQuery(QParser.java:174)\n\t at
> org.apache.solr.handler.component.QueryComponent.prepare(QueryComponent.java:160)\n\t
>
> at
> org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:302)\n\t
>
> at
> org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)\n\t
>
> at org.apache.solr.core.SolrCore.execute(SolrCore.java:2596)\n\t at
> org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:799)\n\t
> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:578)\n\t
> at
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)\n\t
>
> at
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)\n\t
>
> at
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)\n\t
>
> at
> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)\n\t
>
> at
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)\n\t
>
> at
> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)\n\t
>
> at
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)\n\t
>
> at
> org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)\n\t
>
> at
> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1711)\n\t
>
> at
> org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)\n\t
>
> at
> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1347)\n\t
>
> at
> org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)\n\t
>
> at
> org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)\n\t
>
> at
> org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1678)\n\t
>
> at
> o

Error on searches containing specific character pattern

2020-09-01 Thread Andy @ BlueFusion

Hi All,

I have an 8.6.0 instance that is working well with one exception.

It returns an error when the search term follows a pattern of numbers & 
alpha characters such as:


 * 1a1 aa
 * 1a1 1aa
 * 1a1 11

Similar patterns that don't error

 * 1a1 a
 * 1a1 1
 * 1a11 aa
 * 11a1 aa
 * 1a1aa
 * 11a11 aa

The error is:

|"trace":"java.lang.ArrayIndexOutOfBoundsException: 0\n\t at 
org.apache.lucene.util.QueryBuilder.newSynonymQuery(QueryBuilder.java:653)\n\t 
at 
org.apache.solr.parser.SolrQueryParserBase.newSynonymQuery(SolrQueryParserBase.java:617)\n\t 
at 
org.apache.lucene.util.QueryBuilder.analyzeGraphBoolean(QueryBuilder.java:533)\n\t 
at 
org.apache.lucene.util.QueryBuilder.createFieldQuery(QueryBuilder.java:320)\n\t 
at 
org.apache.lucene.util.QueryBuilder.createFieldQuery(QueryBuilder.java:240)\n\t 
at 
org.apache.solr.parser.SolrQueryParserBase.newFieldQuery(SolrQueryParserBase.java:524)\n\t 
at 
org.apache.solr.parser.QueryParser.newFieldQuery(QueryParser.java:62)\n\t 
at 
org.apache.solr.parser.SolrQueryParserBase.getFieldQuery(SolrQueryParserBase.java:1122)\n\t 
at 
org.apache.solr.parser.QueryParser.MultiTerm(QueryParser.java:593)\n\t 
at org.apache.solr.parser.QueryParser.Query(QueryParser.java:142)\n\t at 
org.apache.solr.parser.QueryParser.Clause(QueryParser.java:282)\n\t at 
org.apache.solr.parser.QueryParser.Query(QueryParser.java:162)\n\t at 
org.apache.solr.parser.QueryParser.Clause(QueryParser.java:282)\n\t at 
org.apache.solr.parser.QueryParser.Query(QueryParser.java:162)\n\t at 
org.apache.solr.parser.QueryParser.Clause(QueryParser.java:282)\n\t at 
org.apache.solr.parser.QueryParser.Query(QueryParser.java:162)\n\t at 
org.apache.solr.parser.QueryParser.TopLevelQuery(QueryParser.java:131)\n\t 
at 
org.apache.solr.parser.SolrQueryParserBase.parse(SolrQueryParserBase.java:260)\n\t 
at org.apache.solr.search.LuceneQParser.parse(LuceneQParser.java:49)\n\t 
at org.apache.solr.search.QParser.getQuery(QParser.java:174)\n\t at 
org.apache.solr.handler.component.QueryComponent.prepare(QueryComponent.java:160)\n\t 
at 
org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:302)\n\t 
at 
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)\n\t 
at org.apache.solr.core.SolrCore.execute(SolrCore.java:2596)\n\t at 
org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:799)\n\t 
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:578)\n\t 
at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)\n\t 
at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)\n\t 
at 
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)\n\t 
at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)\n\t 
at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)\n\t 
at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)\n\t 
at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)\n\t 
at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)\n\t 
at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1711)\n\t 
at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)\n\t 
at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1347)\n\t 
at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)\n\t 
at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)\n\t 
at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1678)\n\t 
at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)\n\t 
at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1249)\n\t 
at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)\n\t 
at 
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)\n\t 
at 
org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:152)\n\t 
at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)\n\t 
at 
org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)\n\t 
at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)\n\t 
at org.eclipse.jetty.server.Server.handle(Server.java:505)\n\t at 
org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:370)\n\t at 
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)\n\t 
at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)\n\t 
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)\n\t 
at 
org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEn

Re: Addreplica throwing error when authentication is enabled

2020-09-01 Thread yaswanth kumar
Hi Ben

Thanks for looking, but I am not following the encrypted-file point you 
mentioned. Which file do you mean is encrypted? security.json?

Sent from my iPhone

> On Sep 1, 2020, at 10:56 PM, Ben  wrote:
> 
> It appears the issue is with the encrypted file. Are these files encrypted?
> If yes, you need to decrypt it first.
> 
> moreCaused by: javax.crypto.BadPaddingException: RSA private key operation
> failed
> 
> Best,
> Ben
> 
>> On Tue, Sep 1, 2020, 10:51 PM yaswanth kumar  wrote:
>> 
>> Can someone please help me with the error below?
>> 
>> Solr 8.2; ZooKeeper 3.4
>> 
>> Enabled authentication and authorization, and made sure that the role gets
>> all access.
>> 
>> Now just add a collection with a single replica; once that's done, try to
>> add another replica with the ADDREPLICA Solr API, and it throws the error.
>> Note: this happens only when security.json is enabled with authentication.
>> 
>> Below is the error
>> Collection: test operation: restore
>> failed:org.apache.solr.common.SolrException: ADDREPLICA failed to create
>> replicaCollection: test operation: restore
>> failed:org.apache.solr.common.SolrException: ADDREPLICA failed to create
>> replica at
>> org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler$ShardRequestTracker.processResponses(OverseerCollectionMessageHandler.java:1030)
>> at
>> org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler$ShardRequestTracker.processResponses(OverseerCollectionMessageHandler.java:1013)
>> at
>> org.apache.solr.cloud.api.collections.AddReplicaCmd.lambda$addReplica$1(AddReplicaCmd.java:177)
>> at
>> org.apache.solr.cloud.api.collections.AddReplicaCmd$$Lambda$798/.run(Unknown
>> Source) at
>> org.apache.solr.cloud.api.collections.AddReplicaCmd.addReplica(AddReplicaCmd.java:199)
>> at
>> org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler.addReplica(OverseerCollectionMessageHandler.java:708)
>> at
>> org.apache.solr.cloud.api.collections.RestoreCmd.call(RestoreCmd.java:286)
>> at
>> org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler.processMessage(OverseerCollectionMessageHandler.java:264)
>> at
>> org.apache.solr.cloud.OverseerTaskProcessor$Runner.run(OverseerTaskProcessor.java:505)
>> at
>> org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209)
>> at
>> org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor$$Lambda$142/.run(Unknown
>> Source) at
>> java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>> at
>> java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>> at java.base/java.lang.Thread.run(Thread.java:834)Caused by:
>> org.apache.solr.common.SolrException: javax.crypto.BadPaddingException: RSA
>> private key operation failed at
>> org.apache.solr.util.CryptoKeys$RSAKeyPair.encrypt(CryptoKeys.java:325) at
>> org.apache.solr.security.PKIAuthenticationPlugin.generateToken(PKIAuthenticationPlugin.java:305)
>> at
>> org.apache.solr.security.PKIAuthenticationPlugin.access$200(PKIAuthenticationPlugin.java:61)
>> at
>> org.apache.solr.security.PKIAuthenticationPlugin$2.onQueued(PKIAuthenticationPlugin.java:239)
>> at
>> org.apache.solr.client.solrj.impl.Http2SolrClient.decorateRequest(Http2SolrClient.java:468)
>> at
>> org.apache.solr.client.solrj.impl.Http2SolrClient.makeRequest(Http2SolrClient.java:455)
>> at
>> org.apache.solr.client.solrj.impl.Http2SolrClient.request(Http2SolrClient.java:364)
>> at
>> org.apache.solr.client.solrj.impl.Http2SolrClient.request(Http2SolrClient.java:746)
>> at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1274) at
>> org.apache.solr.handler.component.HttpShardHandler.request(HttpShardHandler.java:238)
>> at
>> org.apache.solr.handler.component.HttpShardHandler.lambda$submit$0(HttpShardHandler.java:199)
>> at
>> org.apache.solr.handler.component.HttpShardHandler$$Lambda$512/.call(Unknown
>> Source) at
>> java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at
>> java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
>> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at
>> com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:181)
>> ... 5 moreCaused by: javax.crypto.BadPaddingException: RSA priva

Re: Addreplica throwing error when authentication is enabled

2020-09-01 Thread Ben
It appears the issue is with an encrypted file. Are these files encrypted?
If so, you need to decrypt them first.

Caused by: javax.crypto.BadPaddingException: RSA private key operation
failed

Best,
Ben

On Tue, Sep 1, 2020, 10:51 PM yaswanth kumar  wrote:

> Can someone please help me with the error below?
>
> Solr 8.2; zookeeper 3.4
>
> Enabled authentication and authorization and made sure that the role gets
> all access
>
> Now just add a collection with a single replica, and once done, try to add
> another replica with the ADDREPLICA Solr API; that throws an error. Note:
> this is happening only when security.json is enabled with
> authentication
>
> Below is the error
> Collection: test operation: restore
> failed:org.apache.solr.common.SolrException: ADDREPLICA failed to create
> replicaCollection: test operation: restore
> failed:org.apache.solr.common.SolrException: ADDREPLICA failed to create
> replica at
> org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler$ShardRequestTracker.processResponses(OverseerCollectionMessageHandler.java:1030)
> at
> org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler$ShardRequestTracker.processResponses(OverseerCollectionMessageHandler.java:1013)
> at
> org.apache.solr.cloud.api.collections.AddReplicaCmd.lambda$addReplica$1(AddReplicaCmd.java:177)
> at
> org.apache.solr.cloud.api.collections.AddReplicaCmd$$Lambda$798/.run(Unknown
> Source) at
> org.apache.solr.cloud.api.collections.AddReplicaCmd.addReplica(AddReplicaCmd.java:199)
> at
> org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler.addReplica(OverseerCollectionMessageHandler.java:708)
> at
> org.apache.solr.cloud.api.collections.RestoreCmd.call(RestoreCmd.java:286)
> at
> org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler.processMessage(OverseerCollectionMessageHandler.java:264)
> at
> org.apache.solr.cloud.OverseerTaskProcessor$Runner.run(OverseerTaskProcessor.java:505)
> at
> org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209)
> at
> org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor$$Lambda$142/.run(Unknown
> Source) at
> java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> at
> java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> at java.base/java.lang.Thread.run(Thread.java:834)Caused by:
> org.apache.solr.common.SolrException: javax.crypto.BadPaddingException: RSA
> private key operation failed at
> org.apache.solr.util.CryptoKeys$RSAKeyPair.encrypt(CryptoKeys.java:325) at
> org.apache.solr.security.PKIAuthenticationPlugin.generateToken(PKIAuthenticationPlugin.java:305)
> at
> org.apache.solr.security.PKIAuthenticationPlugin.access$200(PKIAuthenticationPlugin.java:61)
> at
> org.apache.solr.security.PKIAuthenticationPlugin$2.onQueued(PKIAuthenticationPlugin.java:239)
> at
> org.apache.solr.client.solrj.impl.Http2SolrClient.decorateRequest(Http2SolrClient.java:468)
> at
> org.apache.solr.client.solrj.impl.Http2SolrClient.makeRequest(Http2SolrClient.java:455)
> at
> org.apache.solr.client.solrj.impl.Http2SolrClient.request(Http2SolrClient.java:364)
> at
> org.apache.solr.client.solrj.impl.Http2SolrClient.request(Http2SolrClient.java:746)
> at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1274) at
> org.apache.solr.handler.component.HttpShardHandler.request(HttpShardHandler.java:238)
> at
> org.apache.solr.handler.component.HttpShardHandler.lambda$submit$0(HttpShardHandler.java:199)
> at
> org.apache.solr.handler.component.HttpShardHandler$$Lambda$512/.call(Unknown
> Source) at
> java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at
> java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at
> com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:181)
> ... 5 moreCaused by: javax.crypto.BadPaddingException: RSA private key
> operation failed at
> java.base/sun.security.rsa.NativeRSACore.crtCrypt_Native(NativeRSACore.java:149)
> at java.base/sun.security.rsa.NativeRSACore.rsa(NativeRSACore.java:91) at
> java.base/sun.security.rsa.RSACore.rsa(RSACore.java:149) at
> java.base/com.sun.crypto.provider.RSACipher.doFinal(RSACipher.java:355) at
> java.base/com.sun.crypto.provider.RSACipher.engineDoFinal(RSACipher.java:392)
> at java.base/javax.crypto.Cipher.doFinal(Cipher.java:2260) at
> org.apache.solr.util.CryptoKeys$RSAKeyPair.encrypt(CryptoKeys.java:323) ...
> 20 more
>
> That's the error stack tr

Addreplica throwing error when authentication is enabled

2020-09-01 Thread yaswanth kumar
Can someone please help me with the error below?

Solr 8.2; zookeeper 3.4

Enabled authentication and authorization and made sure that the role gets all
access

Now just add a collection with a single replica, and once done, try to add
another replica with the ADDREPLICA Solr API; that throws an error. Note:
this is happening only when security.json is enabled with authentication.
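For context, the call in question is the Collections API ADDREPLICA action; with basic auth enabled it looks roughly like this (host, shard name, and credentials are placeholders, not the poster's actual values):

```shell
curl -u admin:password \
  "http://localhost:8983/solr/admin/collections?action=ADDREPLICA&collection=test&shard=shard1"
```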

Below is the error
Collection: test operation: restore 
failed:org.apache.solr.common.SolrException: ADDREPLICA failed to create 
replicaCollection: test operation: restore 
failed:org.apache.solr.common.SolrException: ADDREPLICA failed to create 
replica at 
org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler$ShardRequestTracker.processResponses(OverseerCollectionMessageHandler.java:1030)
 at 
org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler$ShardRequestTracker.processResponses(OverseerCollectionMessageHandler.java:1013)
 at 
org.apache.solr.cloud.api.collections.AddReplicaCmd.lambda$addReplica$1(AddReplicaCmd.java:177)
 at 
org.apache.solr.cloud.api.collections.AddReplicaCmd$$Lambda$798/.run(Unknown
 Source) at 
org.apache.solr.cloud.api.collections.AddReplicaCmd.addReplica(AddReplicaCmd.java:199)
 at 
org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler.addReplica(OverseerCollectionMessageHandler.java:708)
 at org.apache.solr.cloud.api.collections.RestoreCmd.call(RestoreCmd.java:286) 
at 
org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler.processMessage(OverseerCollectionMessageHandler.java:264)
 at 
org.apache.solr.cloud.OverseerTaskProcessor$Runner.run(OverseerTaskProcessor.java:505)
 at 
org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209)
 at 
org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor$$Lambda$142/.run(Unknown
 Source) at 
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
 at 
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
 at java.base/java.lang.Thread.run(Thread.java:834)Caused by: 
org.apache.solr.common.SolrException: javax.crypto.BadPaddingException: RSA 
private key operation failed at 
org.apache.solr.util.CryptoKeys$RSAKeyPair.encrypt(CryptoKeys.java:325) at 
org.apache.solr.security.PKIAuthenticationPlugin.generateToken(PKIAuthenticationPlugin.java:305)
 at 
org.apache.solr.security.PKIAuthenticationPlugin.access$200(PKIAuthenticationPlugin.java:61)
 at 
org.apache.solr.security.PKIAuthenticationPlugin$2.onQueued(PKIAuthenticationPlugin.java:239)
 at 
org.apache.solr.client.solrj.impl.Http2SolrClient.decorateRequest(Http2SolrClient.java:468)
 at 
org.apache.solr.client.solrj.impl.Http2SolrClient.makeRequest(Http2SolrClient.java:455)
 at 
org.apache.solr.client.solrj.impl.Http2SolrClient.request(Http2SolrClient.java:364)
 at 
org.apache.solr.client.solrj.impl.Http2SolrClient.request(Http2SolrClient.java:746)
 at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1274) at 
org.apache.solr.handler.component.HttpShardHandler.request(HttpShardHandler.java:238)
 at 
org.apache.solr.handler.component.HttpShardHandler.lambda$submit$0(HttpShardHandler.java:199)
 at 
org.apache.solr.handler.component.HttpShardHandler$$Lambda$512/.call(Unknown
 Source) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) 
at 
java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
 at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at 
com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:181)
 ... 5 moreCaused by: javax.crypto.BadPaddingException: RSA private key 
operation failed at 
java.base/sun.security.rsa.NativeRSACore.crtCrypt_Native(NativeRSACore.java:149)
 at java.base/sun.security.rsa.NativeRSACore.rsa(NativeRSACore.java:91) at 
java.base/sun.security.rsa.RSACore.rsa(RSACore.java:149) at 
java.base/com.sun.crypto.provider.RSACipher.doFinal(RSACipher.java:355) at 
java.base/com.sun.crypto.provider.RSACipher.engineDoFinal(RSACipher.java:392) 
at java.base/javax.crypto.Cipher.doFinal(Cipher.java:2260) at 
org.apache.solr.util.CryptoKeys$RSAKeyPair.encrypt(CryptoKeys.java:323) ... 20 
more
 
That's the error stack trace I am seeing. As soon as I call the restore API I
see the collection "test" with a single core on the cloud, but it's in a down
state.

Number of nodes configured in this SolrCloud: 2
Testing on a single collection with 2 replicas
Here is what my security.json looks like:
{
  "authentication":{
    "class":"solr.BasicAuthPlugin",
    "credentials":{ "admin":"", "dev":""},
    "":{"v":11},
    "blockUnknown":true,
    "forwardCredentials":true},
  "authorization":{
    "cla

Re: Error from server at http://localhost:8983/solr/search: Expected mime type application/octet-stream but got text/html

2020-08-27 Thread Dominique Bejean
Hi,

There were few discussions about similar issues these days. A JIRA issue
was created
https://issues.apache.org/jira/browse/SOLR-14768

Regards

Dominique


On Thu, Aug 27, 2020 at 15:00, Divino I. Ribeiro Jr. <
divinoirj.ib...@gmail.com> wrote:

> Hello everyone!
> When I run a query against the Solr server, it returns the following message:
>
> 2020-08-27 03:24:03,338 ERROR org.dspace.app.rest.utils.DiscoverQueryBuilder 
> @ divinoirj.ib...@gmail.com::Error in Discovery while setting up date facet 
> range:date facet\colon; 
> org.dspace.discovery.configuration.DiscoverySearchFilterFacet@1350bf85
> org.dspace.discovery.SearchServiceException: Error from server at 
> http://localhost:8983/solr/search: Expected mime type 
> application/octet-stream but got text/html.  
> 
> 
> Error 500 java.lang.NoClassDefFoundError: 
> org/eclipse/jetty/server/MultiParts
> 
> HTTP ERROR 500 java.lang.NoClassDefFoundError: 
> org/eclipse/jetty/server/MultiParts
> 
> URI:/solr/search/select
> STATUS:500
>
> Solr instalation cores:
>
> CWD: /opt/solr-8.6.0/server
> Instance: /var/solr/data/search
> Data: /var/solr/data/search/data
> Index: /var/solr/data/search/data/index
> Impl: org.apache.solr.core.NRTCachingDirectoryFactory
>
> Thanks!
>


Re: IndexSchema is not mutable error Solr Cloud 7.7.1

2020-07-23 Thread Shawn Heisey

On 7/23/2020 8:56 AM, Porritt, Ian wrote:
Note: the solrconfig has <schemaFactory class="ClassicIndexSchemaFactory"/> defined.



org.apache.solr.common.SolrException: *This IndexSchema is not mutable*.

     at 
org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:376)


Your config contains an update processor chain using the 
AddSchemaFieldsUpdateProcessorFactory.


This config requires a mutable schema, but you have changed to the 
classic schema factory, which is not mutable.


You'll either have to remove the config for the update processor, or 
change back to the mutable schema.  I would recommend the former.
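The two options above can be sketched as solrconfig.xml fragments (element names follow Solr's default configset; treat this as an illustration, not the poster's exact config):

```xml
<!-- Option 1 (recommended above): keep the classic, immutable schema and
     remove the update processor chain that adds unknown fields, i.e. the
     chain containing:
     <processor class="solr.AddSchemaFieldsUpdateProcessorFactory"> -->
<schemaFactory class="ClassicIndexSchemaFactory"/>

<!-- Option 2: switch back to the mutable managed schema -->
<schemaFactory class="ManagedIndexSchemaFactory">
  <bool name="mutable">true</bool>
  <str name="managedSchemaResourceName">managed-schema</str>
</schemaFactory>
```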


Thanks,
Shawn


IndexSchema is not mutable error Solr Cloud 7.7.1

2020-07-23 Thread Porritt, Ian
Hi All,

 

I made a change to the schema to add new fields
in a collection; this was uploaded to ZooKeeper
via the commands below:

 

For the Schema

solr zk cp file:E:\SolrCloud\server\solr\configsets\COLLECTION\conf\schema.xml zk:/configs/COLLECTION/schema.xml -z SERVERNAME1.uleaf.site

 

For the Solrconfig

solr zk cp file:E:\SolrCloud\server\solr\configsets\COLLECTION\conf\solrconfig.xml zk:/configs/COLLECTION/solrconfig.xml -z SERVERNAME1.uleaf.site

Note: the solrconfig has <schemaFactory class="ClassicIndexSchemaFactory"/> defined.
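As an aside, the whole configset can be uploaded in one step with zk upconfig, followed by a collection RELOAD so running replicas pick up the change (same paths and names as above; a sketch, not the exact commands used here):

```shell
solr zk upconfig -n COLLECTION -d E:\SolrCloud\server\solr\configsets\COLLECTION\conf -z SERVERNAME1.uleaf.site

curl "http://SERVERNAME1.uleaf.site:8983/solr/admin/collections?action=RELOAD&name=COLLECTION"
```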

 

 

When I then go to update a record with the new
field in it, you get the following error:

 

org.apache.solr.common.SolrException: This IndexSchema is not mutable.
    at org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:376)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.FieldNameMutatingUpdateProcessorFactory$1.processAdd(FieldNameMutatingUpdateProcessorFactory.java:75)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
    at org.apache.solr.update.processor.AbstractDefaultValueUpdateProcessorFactory$DefaultValueUpdateProcessor.processAdd(AbstractDefaultValueUpdateProcessorFactory.java:92)
    at org.apache.solr.handler.loader.JavabinLoader$1.update(JavabinLoader.java:110)
    at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readOuterMostDocIterator(JavaBinUpdateRequestCodec.java:327)
    at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readIterator(JavaBinUpdateRequestCodec.java:280)
    at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:333)
    at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:278)
    at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readNamedList(JavaBinUpdateRequestCodec.java:235)
    at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:298)
    at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:278)
    at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:191)
    at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.unmarshal(JavaBinUpdateRequestCodec.java:126)
    at org.apache.solr.handler.loader.JavabinLoader.parseAndLoadDocs(JavabinLoader.java:123)
    at org.apache.solr.handler.loader.JavabinLoader.load(JavabinLoader.java:70)
    at org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:97)
    at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:68)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:199)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:2551)
    at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:710)
    at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:516)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:395)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:341)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)
    at

Re: Solr 6.6.6 - Shards Whitelist Error

2020-06-18 Thread Jan Høydahl
Have you tried starting each node with -Dsolr.disable.shardsWhitelist=true to
revert to the old behavior?

Jan
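For reference, the flag can also be made permanent rather than passed on the command line each time (a sketch, not a tested config):

```shell
# solr.in.sh (solr.in.cmd on Windows): pass the flag on every start
SOLR_OPTS="$SOLR_OPTS -Dsolr.disable.shardsWhitelist=true"
```

Alternatively, instead of disabling the check entirely, the shardHandlerFactory in solr.xml accepts an explicit whitelist of allowed hosts, e.g. <str name="shardsWhitelist">solrcloud-dev.vmdata:8983/solr</str> (host taken from this thread; adjust as needed).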

> 18. jun. 2020 kl. 17:23 skrev Ray W :
> 
> Hi Everyone,
> 
> We upgraded to solr 6.6.6 and a recurring error shows up as "shards
> parameter value contained values not in the shards whitelist". I have
> attached the error log as well as configurations in solr.xml and
> solrconfig.xml. The Solr 6.6 documentation doesn't cover setting up a
> shards whitelist, and it was through the release notes
> <https://lucene.apache.org/solr/news.html> that we found (CVE-2017-3164 -
> configure a host whitelist) was patched in. We've been trying different
> combinations in the configuration files and the latest has been this
> stackoverflow thread
> <https://stackoverflow.com/questions/59036314/configuring-shardswhitelist-in-solr-6-6>
> which
> we know may not be correct. If anyone can point out what we're missing that
> would be appreciated. Thank you for reading, and thanks in advance for the
> help; we're stuck.
> 
> -Ray
> 
> Error:
> org.apache.solr.common.SolrException: The 'shards' parameter value
> 'solrcloud-dev.vmdata:8983/solr/statistics' contained value(s) not on the
> shards whitelist. shardUrl:solrcloud-dev.vmdata:8983/solr/statistics. set
> -Dsolr.disable.shardsWhitelist=true to disable shards whitelist checks
> at
> org.apache.solr.handler.component.HttpShardHandlerFactory$WhitelistHostChecker.lambda$checkWhitelist$1(HttpShardHandlerFactory.java:565)
> at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
> at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
> at
> java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
> at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
> at
> java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
> at
> java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
> at
> java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
> at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
> at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
> at
> org.apache.solr.handler.component.HttpShardHandlerFactory$WhitelistHostChecker.checkWhitelist(HttpShardHandlerFactory.java:548)
> at
> org.apache.solr.handler.component.HttpShardHandler.prepDistributed(HttpShardHandler.java:393)
> at
> org.apache.solr.handler.component.SearchHandler.getAndPrepShardHandler(SearchHandler.java:227)
> at
> org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:265)
> at
> org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:173)
> at org.apache.solr.core.SolrCore.execute(SolrCore.java:2477)
> at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:724)
> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:530)
> at
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:361)
> at
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:305)
> at
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1691)
> at
> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
> at
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
> at
> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
> at
> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
> at
> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
> at
> org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
> at
> org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
> at
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
> at
> org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
> at
> org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)
> at
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
> at
> org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)
> at
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
> at org.eclipse.jetty.server.Server.handle(Server.java:534)
> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:320)
> at
> o

Solr 6.6.6 - Shards Whitelist Error

2020-06-18 Thread Ray W
Hi Everyone,

We upgraded to solr 6.6.6 and a recurring error shows up as "shards
parameter value contained values not in the shards whitelist". I have
attached the error log as well as configurations in solr.xml and
solrconfig.xml. The Solr 6.6 documentation doesn't cover setting up a
shards whitelist, and it was through the release notes
<https://lucene.apache.org/solr/news.html> that we found (CVE-2017-3164 -
configure a host whitelist) was patched in. We've been trying different
combinations in the configuration files and the latest has been this
stackoverflow thread
<https://stackoverflow.com/questions/59036314/configuring-shardswhitelist-in-solr-6-6>
which
we know may not be correct. If anyone can point out what we're missing that
would be appreciated. Thank you for reading, and thanks in advance for the
help; we're stuck.

-Ray

Error:
org.apache.solr.common.SolrException: The 'shards' parameter value
'solrcloud-dev.vmdata:8983/solr/statistics' contained value(s) not on the
shards whitelist. shardUrl:solrcloud-dev.vmdata:8983/solr/statistics. set
-Dsolr.disable.shardsWhitelist=true to disable shards whitelist checks
at
org.apache.solr.handler.component.HttpShardHandlerFactory$WhitelistHostChecker.lambda$checkWhitelist$1(HttpShardHandlerFactory.java:565)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at
java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at
java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at
java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
at
java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
at
org.apache.solr.handler.component.HttpShardHandlerFactory$WhitelistHostChecker.checkWhitelist(HttpShardHandlerFactory.java:548)
at
org.apache.solr.handler.component.HttpShardHandler.prepDistributed(HttpShardHandler.java:393)
at
org.apache.solr.handler.component.SearchHandler.getAndPrepShardHandler(SearchHandler.java:227)
at
org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:265)
at
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:173)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:2477)
at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:724)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:530)
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:361)
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:305)
at
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1691)
at
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
at
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
at
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
at
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
at
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
at
org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)
at
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at
org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)
at
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at org.eclipse.jetty.server.Server.handle(Server.java:534)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:320)
at
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
at
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
at
org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
at
org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
at
org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
at
org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
at
org.eclipse.jetty.uti

Re: Proxy Error when cluster went down

2020-06-16 Thread Vishal Vaibhav
So the entire cluster was down. I'm trying to bring it up node by node. I
restarted the first node; Solr comes up, but the add-replica command fails.
Then I checked the CLUSTERSTATUS API: it showed the shard in active state, no
core active (i.e. all down), and one live node, the one I had restarted. Also,
this all connects to one ZooKeeper ensemble of 3 nodes.

On Tue, 16 Jun 2020 at 11:20 PM, Jörn Franke  wrote:

> Do you have another host with replica alive or are all replicas on the
> host that is down?
>
> Are all SolrCloud hosts in the same ZooKeeper?
>
> > Am 16.06.2020 um 19:29 schrieb Vishal Vaibhav :
> >
> > Hi, thanks. My Solr is running in Kubernetes, so the hostname goes away
> > with the pod:
> > search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local
> > So in my case the pod with this host is gone and the hostname
> > search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local
> > no longer exists. Shouldn't SolrCloud be aware that all the replicas on
> > that Solr host are down, and not proxy requests to that node?
> >
> >> On Tue, 16 Jun 2020 at 5:06 PM, Shawn Heisey 
> wrote:
> >>
> >>> On 6/15/2020 9:04 PM, Vishal Vaibhav wrote:
> >>> I am running on Solr 8.5. For some reason the entire cluster went down.
> >>> When I am trying to bring up the nodes, it's not coming up. My health
> >>> check is on "/solr/rules/admin/system". I tried forcing a leader
> >>> election but it didn't help.
> >>> so when i run the following commands. Why is it trying to proxy when
> >> those
> >>> nodes are down. Am i missing something?
> >>
> >> 
> >>
> >>> java.net.UnknownHostException:
> >>>
> >>
> search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local:
> >>
> >> It is trying to proxy because it's SolrCloud.  SolrCloud has an internal
> >> load balancer that spreads queries across multiple replicas when
> >> possible.  Your cluster must be aware of multiple servers where the
> >> "rules" collection can be queried.
> >>
> >> The underlying problem behind this error message is that the following
> >> hostname is being looked up, and it doesn't exist:
> >>
> >>
> >>
> search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local
> >>
> >> This hostname is most likely coming from /etc/hosts on one of your
> >> systems when that system starts Solr and it registers with the cluster,
> >> and that /etc/hosts file is the ONLY place that the hostname exists, so
> >> when SolrCloud tries to forward the request to that server, it is
> failing.
> >>
> >> Thanks,
> >> Shawn
> >>
>


Re: Proxy Error when cluster went down

2020-06-16 Thread Jörn Franke
Do you have another host with replica alive or are all replicas on the host 
that is down?

Are all SolrCloud hosts in the same ZooKeeper?

> Am 16.06.2020 um 19:29 schrieb Vishal Vaibhav :
> 
> Hi, thanks. My Solr is running in Kubernetes, so the hostname goes away with
> the pod:
> search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local
> So in my case the pod with this host is gone and the hostname
> search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local
> no longer exists. Shouldn't SolrCloud be aware that all the replicas on
> that Solr host are down, and not proxy requests to that node?
> 
>> On Tue, 16 Jun 2020 at 5:06 PM, Shawn Heisey  wrote:
>> 
>>> On 6/15/2020 9:04 PM, Vishal Vaibhav wrote:
>>> I am running on Solr 8.5. For some reason the entire cluster went down.
>>> When I am trying to bring up the nodes, it's not coming up. My health
>>> check is on "/solr/rules/admin/system". I tried forcing a leader election
>>> but it didn't help.
>>> so when i run the following commands. Why is it trying to proxy when
>> those
>>> nodes are down. Am i missing something?
>> 
>> 
>> 
>>> java.net.UnknownHostException:
>>> 
>> search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local:
>> 
>> It is trying to proxy because it's SolrCloud.  SolrCloud has an internal
>> load balancer that spreads queries across multiple replicas when
>> possible.  Your cluster must be aware of multiple servers where the
>> "rules" collection can be queried.
>> 
>> The underlying problem behind this error message is that the following
>> hostname is being looked up, and it doesn't exist:
>> 
>> 
>> search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local
>> 
>> This hostname is most likely coming from /etc/hosts on one of your
>> systems when that system starts Solr and it registers with the cluster,
>> and that /etc/hosts file is the ONLY place that the hostname exists, so
>> when SolrCloud tries to forward the request to that server, it is failing.
>> 
>> Thanks,
>> Shawn
>> 


Re: Proxy Error when cluster went down

2020-06-16 Thread Vishal Vaibhav
Hi, thanks. My Solr is running in Kubernetes, so the hostname goes away with
the pod:
search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local
So in my case the pod with this host is gone and the hostname
search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local
no longer exists. Shouldn't SolrCloud be aware that all the replicas on that
Solr host are down, and not proxy requests to that node?

On Tue, 16 Jun 2020 at 5:06 PM, Shawn Heisey  wrote:

> On 6/15/2020 9:04 PM, Vishal Vaibhav wrote:
> > I am running on Solr 8.5. For some reason the entire cluster went down.
> > When I am trying to bring up the nodes, it's not coming up. My health
> > check is on "/solr/rules/admin/system". I tried forcing a leader election
> > but it didn't help.
> > so when i run the following commands. Why is it trying to proxy when
> those
> > nodes are down. Am i missing something?
>
> 
>
> > java.net.UnknownHostException:
> >
> search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local:
>
> It is trying to proxy because it's SolrCloud.  SolrCloud has an internal
> load balancer that spreads queries across multiple replicas when
> possible.  Your cluster must be aware of multiple servers where the
> "rules" collection can be queried.
>
> The underlying problem behind this error message is that the following
> hostname is being looked up, and it doesn't exist:
>
>
> search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local
>
> This hostname is most likely coming from /etc/hosts on one of your
> systems when that system starts Solr and it registers with the cluster,
> and that /etc/hosts file is the ONLY place that the hostname exists, so
> when SolrCloud tries to forward the request to that server, it is failing.
>
> Thanks,
> Shawn
>


Re: Proxy Error when cluster went down

2020-06-16 Thread Shawn Heisey

On 6/15/2020 9:04 PM, Vishal Vaibhav wrote:

I am running on Solr 8.5. For some reason the entire cluster went down. When
I am trying to bring up the nodes, they're not coming up. My health check is
on "/solr/rules/admin/system". I tried forcing a leader election but it
didn't help. So when I run the following commands, why is it trying to proxy
when those nodes are down? Am I missing something?





java.net.UnknownHostException:
search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local:


It is trying to proxy because it's SolrCloud.  SolrCloud has an internal 
load balancer that spreads queries across multiple replicas when 
possible.  Your cluster must be aware of multiple servers where the 
"rules" collection can be queried.


The underlying problem behind this error message is that the following 
hostname is being looked up, and it doesn't exist:


search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local

This hostname is most likely coming from /etc/hosts on one of your 
systems when that system starts Solr and it registers with the cluster, 
and that /etc/hosts file is the ONLY place that the hostname exists, so 
when SolrCloud tries to forward the request to that server, it is failing.


Thanks,
Shawn

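Shawn's point above — a hostname that is registered in the cluster but no longer resolves — can be checked directly. Below is a minimal Python sketch of that check; the resolver is injected so the example runs without a cluster, and in practice the node list would come from the Collections API (`/solr/admin/collections?action=CLUSTERSTATUS`) or ZooKeeper's `/live_nodes`:

```python
import socket

def unresolvable_hosts(live_nodes, resolve=socket.gethostbyname):
    """Return the hostnames from SolrCloud live_nodes entries
    (formatted like 'host:8983_solr') that fail DNS resolution."""
    bad = []
    for node in live_nodes:
        host = node.split(":", 1)[0]
        try:
            resolve(host)
        except OSError:  # socket.gaierror subclasses OSError
            bad.append(host)
    return bad

if __name__ == "__main__":
    # Stub resolver so the sketch runs anywhere: only 'solr-0' "exists".
    def fake_resolve(host):
        if host != "solr-0":
            raise OSError("Name or service not known")
        return "10.0.0.1"

    nodes = [
        "solr-0:8983_solr",
        "search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local:8983_solr",
    ]
    print(unresolvable_hosts(nodes, fake_resolve))
    # → ['search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local']
```

Running this against the real node list would flag exactly the pod hostname from the stack trace.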

Proxy Error when cluster went down

2020-06-15 Thread Vishal Vaibhav
Hello all,

I am running on Solr 8.5. For some reason the entire cluster went down. When
I am trying to bring up the nodes, they're not coming up. My health check is
on "/solr/rules/admin/system". I tried forcing a leader election but it
didn't help. So when I run the following commands, why is it trying to proxy
when those nodes are down? Am I missing something?


curl "http://localhost:8983/solr/rules/admin/system"
  % Total% Received %
Xferd  Average Speed   TimeTime Time  Current
 Dload  Upload   Total   SpentLeft
 Speed
100  5582  100  55820 0   454k  0 --:--:-- --:--:-- --:--:--
 495k
{
  "error":{
"metadata":[
  "error-class","org.apache.solr.common.SolrException",
  "root-error-class","java.net.UnknownHostException"],
"msg":"Error trying to proxy request for url:
http://search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local:8983/solr/rules/admin/system
",
"trace":"org.apache.solr.common.SolrException: Error trying to proxy
request for url:
http://search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local:8983/solr/rules/admin/system\n\tat
org.apache.solr.servlet.HttpSolrCall.remoteQuery(HttpSolrCall.java:735)\n\tat
org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:562)\n\tat
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)\n\tat
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)\n\tat
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)\n\tat
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)\n\tat
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)\n\tat
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)\n\tat
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)\n\tat
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)\n\tat
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1711)\n\tat
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)\n\tat
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1347)\n\tat
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)\n\tat
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)\n\tat
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1678)\n\tat
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)\n\tat
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1249)\n\tat
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)\n\tat
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)\n\tat
org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:152)\n\tat
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)\n\tat
org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)\n\tat
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)\n\tat
org.eclipse.jetty.server.Server.handle(Server.java:505)\n\tat
org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:370)\n\tat
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)\n\tat
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)\n\tat
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)\n\tat
org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)\n\tat
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)\n\tat
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)\n\tat
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)\n\tat
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)\n\tat
org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)\n\tat
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:781)\n\tat
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:917)\n\tat
java.base/java.lang.Thread.run(Unknown Source)\nCaused by:
java.net.UnknownHostException:
search-rules-solr-v1-2.search-rules-solr-v1.search-digital.svc.cluster.local:
Name or service not known\n\tat
java.base/java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)\n\tat
java.base/java.net.InetAddress$PlatformNameService.lookupAllHostAddr(Unknown
Source)\n\tat
java.base/java.net.InetAddress.getAddressesFromNameService(Unknown
Source)\n\tat
java.base

Re: [EXTERNAL] - SolR OOM error due to query injection

2020-06-11 Thread Michael Gibney
Guilherme,
The answer is likely to be dependent on the query parser, query parser
configuration, and analysis chains. If you post those it could aid in
helping troubleshoot. One thing that jumps to mind is the asterisks
("*") -- if they're interpreted as wildcards, that could be
problematic? More generally, it's of course true that Solr won't parse
this input as SQL, but as Isabelle pointed out, there are still
potentially lots of meta-characters (in addition to quite a few short,
common terms).
Michael
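Since Guilherme cannot escape everything (field queries and dashed IDs must keep working), one possible middle ground is to escape most Lucene metacharacters while whitelisting the few the application depends on. The whitelist below (`:` and `-`) is an assumption drawn from his description, and note that this neutralizes the `*` wildcards Michael flagged:

```python
import re

# Lucene/Solr query metacharacters, minus ':' and '-', which this
# application needs intact for field queries (title:Title) and dashed IDs.
# The whitelist is an assumption from the thread; adjust it as needed.
_SPECIALS = re.compile(r'([+!(){}\[\]^"~*?\\/&|])')

def escape_most(q: str) -> str:
    """Backslash-escape the blacklisted metacharacters, leaving the rest."""
    return _SPECIALS.sub(r'\\\1', q)

print(escape_most('title:Title'))   # → title:Title  (field syntax untouched)
print(escape_most('R-HSA-1234'))    # → R-HSA-1234   (dashes untouched)
print(escape_most('IPP"))) /**/AND'))
```

This is a sketch, not a parser-aware solution; the quoted `&&`/`||` operators are neutralized character by character, which is enough to stop them being interpreted.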


On Thu, Jun 11, 2020 at 7:43 AM Guilherme Viteri  wrote:
>
> Hi Isabelle
> Thanks for your input.
> In fact Solr returns 30 results for these queries. Why does it behave in a 
> way that causes an OOM? Also, the commands are SQL commands; Solr should 
> simply parse them as ordinary characters …
>
> Thanks
>
>
> > On 10 Jun 2020, at 22:50, Isabelle Giguere  
> > wrote:
> >
> > Hi Guilherme;
> >
> > The only thing I can think of right now is the number of non-alphanumeric 
> > characters.
> >
> > In the first 'q' in your examples, after resolving the character escapes, 
> > 1/3 of characters are non-alphanumeric (* / = , etc).
> >
> > Maybe filter out queries that contain too many non-alphanumeric characters 
> > before sending the request to Solr?  Whatever "too many" could be.
> >
> > Isabelle Giguère
> > Computational Linguist & Java Developer
> > Linguiste informaticienne & développeur java
> >
> >
> > 
> > De : Guilherme Viteri 
> > Envoyé : 10 juin 2020 16:57
> > À : solr-user@lucene.apache.org 
> > Objet : [EXTERNAL] - SolR OOM error due to query injection
> >
> > Hi,
> >
> > Environment: SolR 6.6.2, with org.apache.solr.solr-core:6.1.0. This setup 
> > has been running for at least 4 years without having OutOfMemory error. (it 
> > is never too late for an OOM…)
> >
> > This week, our search tool was hit with SQL-injection-like requests, and 
> > that led to an OOM. The requests weren't aggressive enough to stress the 
> > server with an excessive number of hits; however, 5 to 10 requests of this 
> > nature were enough to crash the server.
> >
> > I’ve come across this link 
> > https://urldefense.com/v3/__https://stackoverflow.com/questions/26862474/prevent-from-solr-query-injections-when-using-solrj__;!!Obbck6kTJA!IdbT_RQCp3jXO5KJxMkWNJIRlNU9Hu1hnJsWqCWT_QS3zpZSAxYeFPM_hGWNwp3y$
> >   
> > <https://urldefense.com/v3/__https://stackoverflow.com/questions/26862474/prevent-from-solr-query-injections-when-using-solrj__;!!Obbck6kTJA!IdbT_RQCp3jXO5KJxMkWNJIRlNU9Hu1hnJsWqCWT_QS3zpZSAxYeFPM_hGWNwp3y$
> >  >, however, that’s not what I am after. In our case we do allow Lucene 
> > query syntax and field searches like title:Title, and our IDs contain 
> > dashes; if those get escaped, the search won’t work properly.
> >
> > Does anyone have an idea ?
> >
> > Cheers
> > G
> >
> > Here are some of the requests that appeared in the logs in relation to the 
> > attack (see below: sorry it is messy)
> > query?q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F2%2A%28IF%28%28SELECT%2F%2A%2A%2F%2A%2F%2A%2A%2FFROM%2F%2A%2A%2F%28SELECT%2F%2A%2A%2FCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283235%3D3235%2C1%29%29%29%2C0x717a626271%2C0x78%29%29s%29%2C%2F%2A%2A%2F8446744073709551610%2C%2F%2A%2A%2F8446744073709551610%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22YBXk%22%2F%2A%2A%2FLIKE%2F%2A%2A%2F%22YBXk&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true
> >
> > q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F2%2A%28IF%28%28SELECT%2F%2A%2A%2F%2A%2F%2A%2A%2FFROM%2F%2A%2A%2F%28SELECT%2F%2A%2A%2FCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283235%3D3235%2C1%29%29%29%2C0x717a626271%2C0x78%29%29s%29%2C%2F%2A%2A%2F8446744073709551610%2C%2F%2A%2A%2F8446744073709551610%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22rDmG%22%3D%22rDmG&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true
> >
> > q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F3641%2F%2A%2A%2FFROM%28SELECT%2F%2A%2A%2FCOUNT%28%2A%29%2CCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283641%3D3641%2C1%29%29%29%2C0x717a626271%2CFLOOR%28RAND%280%29%2A2%29%29x%2F%2A%2A%2FFROM%2F%2A%2A%2FINFORMATION_SCHEMA.PLUGINS%2F%2A%2A%2FGROUP%2F%2A%2A%2FBY%2F%2A%2A%2Fx%29a%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22dfkM%22%2F%2A%2A%2FLIKE%2F%2A%2A%2F%22dfkM&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true
> >
> > q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F36

Re: [EXTERNAL] - SolR OOM error due to query injection

2020-06-11 Thread Guilherme Viteri
Hi Isabelle
Thanks for your input.
In fact Solr returns 30 results for these queries. Why does it behave in a 
way that causes an OOM? Also, the commands are SQL commands; Solr should 
simply parse them as ordinary characters …

Thanks


> On 10 Jun 2020, at 22:50, Isabelle Giguere  
> wrote:
> 
> Hi Guilherme;
> 
> The only thing I can think of right now is the number of non-alphanumeric 
> characters.
> 
> In the first 'q' in your examples, after resolving the character escapes, 1/3 
> of characters are non-alphanumeric (* / = , etc).
> 
> Maybe filter out queries that contain too many non-alphanumeric characters 
> before sending the request to Solr?  Whatever "too many" could be.
> 
> Isabelle Giguère
> Computational Linguist & Java Developer
> Linguiste informaticienne & développeur java
> 
> 
> 
> De : Guilherme Viteri 
> Envoyé : 10 juin 2020 16:57
> À : solr-user@lucene.apache.org 
> Objet : [EXTERNAL] - SolR OOM error due to query injection
> 
> Hi,
> 
> Environment: SolR 6.6.2, with org.apache.solr.solr-core:6.1.0. This setup has 
> been running for at least 4 years without having OutOfMemory error. (it is 
> never too late for an OOM…)
> 
> This week, our search tool was hit with SQL-injection-like requests, and 
> that led to an OOM. The requests weren't aggressive enough to stress the 
> server with an excessive number of hits; however, 5 to 10 requests of this 
> nature were enough to crash the server.
> 
> I’ve come across this link 
> https://urldefense.com/v3/__https://stackoverflow.com/questions/26862474/prevent-from-solr-query-injections-when-using-solrj__;!!Obbck6kTJA!IdbT_RQCp3jXO5KJxMkWNJIRlNU9Hu1hnJsWqCWT_QS3zpZSAxYeFPM_hGWNwp3y$
>   
> <https://urldefense.com/v3/__https://stackoverflow.com/questions/26862474/prevent-from-solr-query-injections-when-using-solrj__;!!Obbck6kTJA!IdbT_RQCp3jXO5KJxMkWNJIRlNU9Hu1hnJsWqCWT_QS3zpZSAxYeFPM_hGWNwp3y$
>  >, however, that’s not what I am after. In our case we do allow Lucene 
> query syntax and field searches like title:Title, and our IDs contain 
> dashes; if those get escaped, the search won’t work properly.
> 
> Does anyone have an idea ?
> 
> Cheers
> G
> 
> Here are some of the requests that appeared in the logs in relation to the 
> attack (see below: sorry it is messy)
> query?q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F2%2A%28IF%28%28SELECT%2F%2A%2A%2F%2A%2F%2A%2A%2FFROM%2F%2A%2A%2F%28SELECT%2F%2A%2A%2FCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283235%3D3235%2C1%29%29%29%2C0x717a626271%2C0x78%29%29s%29%2C%2F%2A%2A%2F8446744073709551610%2C%2F%2A%2A%2F8446744073709551610%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22YBXk%22%2F%2A%2A%2FLIKE%2F%2A%2A%2F%22YBXk&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true
> 
> q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F2%2A%28IF%28%28SELECT%2F%2A%2A%2F%2A%2F%2A%2A%2FFROM%2F%2A%2A%2F%28SELECT%2F%2A%2A%2FCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283235%3D3235%2C1%29%29%29%2C0x717a626271%2C0x78%29%29s%29%2C%2F%2A%2A%2F8446744073709551610%2C%2F%2A%2A%2F8446744073709551610%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22rDmG%22%3D%22rDmG&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true
> 
> q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F3641%2F%2A%2A%2FFROM%28SELECT%2F%2A%2A%2FCOUNT%28%2A%29%2CCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283641%3D3641%2C1%29%29%29%2C0x717a626271%2CFLOOR%28RAND%280%29%2A2%29%29x%2F%2A%2A%2FFROM%2F%2A%2A%2FINFORMATION_SCHEMA.PLUGINS%2F%2A%2A%2FGROUP%2F%2A%2A%2FBY%2F%2A%2A%2Fx%29a%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22dfkM%22%2F%2A%2A%2FLIKE%2F%2A%2A%2F%22dfkM&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true
> 
> q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F3641%2F%2A%2A%2FFROM%28SELECT%2F%2A%2A%2FCOUNT%28%2A%29%2CCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283641%3D3641%2C1%29%29%29%2C0x717a626271%2CFLOOR%28RAND%280%29%2A2%29%29x%2F%2A%2A%2FFROM%2F%2A%2A%2FINFORMATION_SCHEMA.PLUGINS%2F%2A%2A%2FGROUP%2F%2A%2A%2FBY%2F%2A%2A%2Fx%29a%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22yBhx%22%3D%22yBhx&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true
> 
> q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F1695%3DCTXSYS.DRITHSX.SN%281695%2C%28CHR%28113%29%7C%7CCHR%28122%29%7C%7CCHR%28112%29%7C%7CCHR%28120%29%7C%7CCHR%28113%29%7C%7C%28SELECT%2F%2A%2A%2F%28CASE%2F%2A%2A%2FWHEN%2F%2A%2A%2F%281695%3D1695%29%2F%2A%2A%2FTHEN%2F%2A%2A%2F1%2F%2A%2A%2FELSE%2F%2A%2A%2F0%2F%2A%2A%2FEND%29%2F%2A%2A%2FFROM%2F%2A%2A%2FDUAL%29%7C%7CCHR%28113%29%7C%7CCHR%28122%29%7C%7CCHR%2898%29%7C%7CCHR%2898%29%7C%7CCHR%28113%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28

Re: [EXTERNAL] - SolR OOM error due to query injection

2020-06-10 Thread Isabelle Giguere
Hi Guilherme;

The only thing I can think of right now is the number of non-alphanumeric 
characters.

In the first 'q' in your examples, after resolving the character escapes, 1/3 
of characters are non-alphanumeric (* / = , etc).

Maybe filter out queries that contain too many non-alphanumeric characters 
before sending the request to Solr?  Whatever "too many" could be.

Isabelle Giguère
Computational Linguist & Java Developer
Linguiste informaticienne & développeur java
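Isabelle's heuristic — rejecting queries whose share of non-alphanumeric characters is too high — can be sketched in a few lines. The 1/3 threshold below matches the ratio she observed in the first example query; it is an arbitrary cut-off to tune, not a recommendation:

```python
def too_many_specials(q, threshold=1/3):
    """Flag a query when more than `threshold` of its non-space
    characters are neither letters nor digits (Isabelle's heuristic;
    the threshold is an assumption, tune it for your own traffic)."""
    chars = [c for c in q if not c.isspace()]
    if not chars:
        return False
    specials = sum(1 for c in chars if not c.isalnum())
    return specials / len(chars) > threshold

print(too_many_specials('title:Title'))                                  # → False
print(too_many_specials('IPP"))) /**/AND/**/(SELECT/**/2*(IF(1=1,1,0)))'))  # → True
```

A legitimate field query passes while the decoded injection payload is rejected; the check would run in the application layer, before the request ever reaches Solr.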



De : Guilherme Viteri 
Envoyé : 10 juin 2020 16:57
À : solr-user@lucene.apache.org 
Objet : [EXTERNAL] - SolR OOM error due to query injection

Hi,

Environment: SolR 6.6.2, with org.apache.solr.solr-core:6.1.0. This setup has 
been running for at least 4 years without having OutOfMemory error. (it is 
never too late for an OOM…)

This week, our search tool was hit with SQL-injection-like requests, and that 
led to an OOM. The requests weren’t aggressive enough to stress the server with 
an excessive number of hits; however, 5 to 10 requests of this nature were 
enough to crash the server.

I’ve come across this link 
https://urldefense.com/v3/__https://stackoverflow.com/questions/26862474/prevent-from-solr-query-injections-when-using-solrj__;!!Obbck6kTJA!IdbT_RQCp3jXO5KJxMkWNJIRlNU9Hu1hnJsWqCWT_QS3zpZSAxYeFPM_hGWNwp3y$
  
<https://urldefense.com/v3/__https://stackoverflow.com/questions/26862474/prevent-from-solr-query-injections-when-using-solrj__;!!Obbck6kTJA!IdbT_RQCp3jXO5KJxMkWNJIRlNU9Hu1hnJsWqCWT_QS3zpZSAxYeFPM_hGWNwp3y$
 >, however, that’s not what I am after. In our case we do allow Lucene query 
syntax and field searches like title:Title, and our IDs contain dashes; if 
those get escaped, the search won’t work properly.

Does anyone have an idea ?

Cheers
G

Here are some of the requests that appeared in the logs in relation to the 
attack (see below: sorry it is messy)
query?q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F2%2A%28IF%28%28SELECT%2F%2A%2A%2F%2A%2F%2A%2A%2FFROM%2F%2A%2A%2F%28SELECT%2F%2A%2A%2FCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283235%3D3235%2C1%29%29%29%2C0x717a626271%2C0x78%29%29s%29%2C%2F%2A%2A%2F8446744073709551610%2C%2F%2A%2A%2F8446744073709551610%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22YBXk%22%2F%2A%2A%2FLIKE%2F%2A%2A%2F%22YBXk&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true

q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F2%2A%28IF%28%28SELECT%2F%2A%2A%2F%2A%2F%2A%2A%2FFROM%2F%2A%2A%2F%28SELECT%2F%2A%2A%2FCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283235%3D3235%2C1%29%29%29%2C0x717a626271%2C0x78%29%29s%29%2C%2F%2A%2A%2F8446744073709551610%2C%2F%2A%2A%2F8446744073709551610%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22rDmG%22%3D%22rDmG&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true

q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F3641%2F%2A%2A%2FFROM%28SELECT%2F%2A%2A%2FCOUNT%28%2A%29%2CCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283641%3D3641%2C1%29%29%29%2C0x717a626271%2CFLOOR%28RAND%280%29%2A2%29%29x%2F%2A%2A%2FFROM%2F%2A%2A%2FINFORMATION_SCHEMA.PLUGINS%2F%2A%2A%2FGROUP%2F%2A%2A%2FBY%2F%2A%2A%2Fx%29a%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22dfkM%22%2F%2A%2A%2FLIKE%2F%2A%2A%2F%22dfkM&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true

q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F3641%2F%2A%2A%2FFROM%28SELECT%2F%2A%2A%2FCOUNT%28%2A%29%2CCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283641%3D3641%2C1%29%29%29%2C0x717a626271%2CFLOOR%28RAND%280%29%2A2%29%29x%2F%2A%2A%2FFROM%2F%2A%2A%2FINFORMATION_SCHEMA.PLUGINS%2F%2A%2A%2FGROUP%2F%2A%2A%2FBY%2F%2A%2A%2Fx%29a%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22yBhx%22%3D%22yBhx&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true

q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F1695%3DCTXSYS.DRITHSX.SN%281695%2C%28CHR%28113%29%7C%7CCHR%28122%29%7C%7CCHR%28112%29%7C%7CCHR%28120%29%7C%7CCHR%28113%29%7C%7C%28SELECT%2F%2A%2A%2F%28CASE%2F%2A%2A%2FWHEN%2F%2A%2A%2F%281695%3D1695%29%2F%2A%2A%2FTHEN%2F%2A%2A%2F1%2F%2A%2A%2FELSE%2F%2A%2A%2F0%2F%2A%2A%2FEND%29%2F%2A%2A%2FFROM%2F%2A%2A%2FDUAL%29%7C%7CCHR%28113%29%7C%7CCHR%28122%29%7C%7CCHR%2898%29%7C%7CCHR%2898%29%7C%7CCHR%28113%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22eEdc%22%2F%2A%2A%2FLIKE%2F%2A%2A%2F%22eEdc&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true

q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F1695%3DCTXSYS.DRITHSX.SN%281695%2C%28CHR%28113%29%7C%7CCHR%28122%29%7C%7CCHR%28112%29%7C%7CCHR%28120%29%7C%7CCHR%28113%29%7C%7C%28SELECT%2F%2A%2A%2F%28CASE%2F%2A%2A%2FWHEN%2F%2A%2A%2F%281695%3D1695%29%2F%2A%2A%2FTHEN%2F%2A%2A%2F1%2F%2A%2A%2FELSE%2F%2A%2A%2F0%2F%2A%2A%2FEND%29%2F%2A%2A%2FFROM%2F%2A%2A%2FDUAL%29%7C%7CCHR%28113%29%7C%7CCHR%28122%29%7C%7CCHR%2898%29%7C%7CCHR%2898%29%7C%7CCHR%28113%29%29%29%2

SolR OOM error due to query injection

2020-06-10 Thread Guilherme Viteri
Hi,

Environment: SolR 6.6.2, with org.apache.solr.solr-core:6.1.0. This setup has 
been running for at least 4 years without having OutOfMemory error. (it is 
never too late for an OOM…)

This week, our search tool was hit with SQL-injection-like requests, and that 
led to an OOM. The requests weren’t aggressive enough to stress the server with 
an excessive number of hits; however, 5 to 10 requests of this nature were 
enough to crash the server.

I’ve come across this link 
https://stackoverflow.com/questions/26862474/prevent-from-solr-query-injections-when-using-solrj
 
<https://stackoverflow.com/questions/26862474/prevent-from-solr-query-injections-when-using-solrj>,
 however, that’s not what I am after. In our case we do allow Lucene query 
syntax and field searches like title:Title, and our IDs contain dashes; if 
those get escaped, the search won’t work properly.

Does anyone have an idea ?

Cheers
G

Here are some of the requests that appeared in the logs in relation to the 
attack (see below: sorry it is messy)
query?q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F2%2A%28IF%28%28SELECT%2F%2A%2A%2F%2A%2F%2A%2A%2FFROM%2F%2A%2A%2F%28SELECT%2F%2A%2A%2FCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283235%3D3235%2C1%29%29%29%2C0x717a626271%2C0x78%29%29s%29%2C%2F%2A%2A%2F8446744073709551610%2C%2F%2A%2A%2F8446744073709551610%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22YBXk%22%2F%2A%2A%2FLIKE%2F%2A%2A%2F%22YBXk&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true

q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F2%2A%28IF%28%28SELECT%2F%2A%2A%2F%2A%2F%2A%2A%2FFROM%2F%2A%2A%2F%28SELECT%2F%2A%2A%2FCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283235%3D3235%2C1%29%29%29%2C0x717a626271%2C0x78%29%29s%29%2C%2F%2A%2A%2F8446744073709551610%2C%2F%2A%2A%2F8446744073709551610%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22rDmG%22%3D%22rDmG&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true

q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F3641%2F%2A%2A%2FFROM%28SELECT%2F%2A%2A%2FCOUNT%28%2A%29%2CCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283641%3D3641%2C1%29%29%29%2C0x717a626271%2CFLOOR%28RAND%280%29%2A2%29%29x%2F%2A%2A%2FFROM%2F%2A%2A%2FINFORMATION_SCHEMA.PLUGINS%2F%2A%2A%2FGROUP%2F%2A%2A%2FBY%2F%2A%2A%2Fx%29a%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22dfkM%22%2F%2A%2A%2FLIKE%2F%2A%2A%2F%22dfkM&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true

q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28SELECT%2F%2A%2A%2F3641%2F%2A%2A%2FFROM%28SELECT%2F%2A%2A%2FCOUNT%28%2A%29%2CCONCAT%280x717a707871%2C%28SELECT%2F%2A%2A%2F%28ELT%283641%3D3641%2C1%29%29%29%2C0x717a626271%2CFLOOR%28RAND%280%29%2A2%29%29x%2F%2A%2A%2FFROM%2F%2A%2A%2FINFORMATION_SCHEMA.PLUGINS%2F%2A%2A%2FGROUP%2F%2A%2A%2FBY%2F%2A%2A%2Fx%29a%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22yBhx%22%3D%22yBhx&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true

q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F1695%3DCTXSYS.DRITHSX.SN%281695%2C%28CHR%28113%29%7C%7CCHR%28122%29%7C%7CCHR%28112%29%7C%7CCHR%28120%29%7C%7CCHR%28113%29%7C%7C%28SELECT%2F%2A%2A%2F%28CASE%2F%2A%2A%2FWHEN%2F%2A%2A%2F%281695%3D1695%29%2F%2A%2A%2FTHEN%2F%2A%2A%2F1%2F%2A%2A%2FELSE%2F%2A%2A%2F0%2F%2A%2A%2FEND%29%2F%2A%2A%2FFROM%2F%2A%2A%2FDUAL%29%7C%7CCHR%28113%29%7C%7CCHR%28122%29%7C%7CCHR%2898%29%7C%7CCHR%2898%29%7C%7CCHR%28113%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22eEdc%22%2F%2A%2A%2FLIKE%2F%2A%2A%2F%22eEdc&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true

q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F1695%3DCTXSYS.DRITHSX.SN%281695%2C%28CHR%28113%29%7C%7CCHR%28122%29%7C%7CCHR%28112%29%7C%7CCHR%28120%29%7C%7CCHR%28113%29%7C%7C%28SELECT%2F%2A%2A%2F%28CASE%2F%2A%2A%2FWHEN%2F%2A%2A%2F%281695%3D1695%29%2F%2A%2A%2FTHEN%2F%2A%2A%2F1%2F%2A%2A%2FELSE%2F%2A%2A%2F0%2F%2A%2A%2FEND%29%2F%2A%2A%2FFROM%2F%2A%2A%2FDUAL%29%7C%7CCHR%28113%29%7C%7CCHR%28122%29%7C%7CCHR%2898%29%7C%7CCHR%2898%29%7C%7CCHR%28113%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22zAUD%22%3D%22zAUD&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true

q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F4144%3DCONVERT%28INT%2C%28SELECT%2F%2A%2A%2FCHAR%28113%29%2BCHAR%28122%29%2BCHAR%28112%29%2BCHAR%28120%29%2BCHAR%28113%29%2B%28SELECT%2F%2A%2A%2F%28CASE%2F%2A%2A%2FWHEN%2F%2A%2A%2F%284144%3D4144%29%2F%2A%2A%2FTHEN%2F%2A%2A%2FCHAR%2849%29%2F%2A%2A%2FELSE%2F%2A%2A%2FCHAR%2848%29%2F%2A%2A%2FEND%29%29%2BCHAR%28113%29%2BCHAR%28122%29%2BCHAR%2898%29%2BCHAR%2898%29%2BCHAR%28113%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F%28%28%28%22ePUW%22%2F%2A%2A%2FLIKE%2F%2A%2A%2F%22ePUW&species=Homo%20sapiens&types=Reaction&types=Pathway&cluster=true

q=IPP%22%29%29%29%2F%2A%2A%2FAND%2F%2A%2A%2F4144%3DCONVERT%28INT%2C%28SELECT%2F%2A%2A%2FCHAR%28113%29%2BCHAR%28122%29%2BCHAR%28112%29%2BCHAR%28120%29%2BCHAR%28113%29%2B%28SELECT%2

Indexing error when using Category Routed Alias

2020-06-09 Thread Tom Evans
Hi all

1. Set up a simple one-node SolrCloud test environment using docker-compose,
solr:8.5.2, zookeeper:3.5.8.
2. Upload a configset
3. Create two collections, one standard collection, one CRA, both
using the same configset

legacy:
action=CREATE&name=products_old&collection.configName=products&autoAddReplicas=true&numShards=1&maxShardsPerNode=-1

CRA:

{
  "create-alias": {
"name": "products_20200609",
"router": {
  "name": "category",
  "field": "date_published.year",
  "maxCardinality": 30,
  "mustMatch": "(199[6-9]|20[0,1,2][0-9])"
},
"create-collection": {
  "config": "products",
  "numShards": 1,
  "nrtReplicas": 1,
  "tlogReplicas": 0,
  "maxShardsPerNode": 1,
  "autoAddReplicas": true
}
  }
}

Post a small selection of docs in JSON format using curl to non-CRA
collection -> OK

> $ docker-compose exec -T solr curl -H 'Content-Type: application/json' 
> -d@/resources/product-json/products-12381742.json 
> http://solr:8983/solr/products_old/update/json/docs
  % Total% Received % Xferd  Average Speed   TimeTime Time  Current
 Dload  Upload   Total   SpentLeft  Speed
100 11.6M  10071  100 11.6M  5   950k  0:00:14  0:00:12  0:00:02  687k
{
  "responseHeader":{
"rf":1,
"status":0,
"QTime":12541}}

The same documents, sent to the CRA -> boom

> $ docker-compose exec -T solr curl -H 'Content-Type: application/json' 
> -d@/resources/product-json/products-12381742.json 
> http://solr:8983/solr/products_20200609/update/json/docs
  % Total    % Received % Xferd  Average Speed   TimeTime Time  Current
 Dload  Upload   Total   SpentLeft  Speed
100 11.6M  100   888  100 11.6M366  4913k  0:00:02  0:00:02 --:--:-- 4914k
{
  "responseHeader":{
"status":400,
"QTime":2422},
  "error":{
"metadata":[
  "error-class","org.apache.solr.common.SolrException",
  "root-error-class","org.apache.solr.common.SolrException",
  
"error-class","org.apache.solr.update.processor.DistributedUpdateProcessor$DistributedUpdatesAsyncException",
  
"root-error-class","org.apache.solr.update.processor.DistributedUpdateProcessor$DistributedUpdatesAsyncException"],
"msg":"Async exception during distributed update: Error from
server at 
http://10.20.36.130:8983/solr/products_20200609__CRA__2005_shard1_replica_n1/:
null\n\n\n\nrequest:
http://10.20.36.130:8983/solr/products_20200609__CRA__2005_shard1_replica_n1/\nRemote
error message: Cannot parse provided JSON: JSON Parse Error:
char=\u0002,position=0 AFTER='\u0002'
BEFORE='��¶ms��2update.contentType0applicat'",
"code":400}}

Repeating the request again to the CRA -> OK

> $ docker-compose exec -T solr curl -H 'Content-Type: application/json' 
> -d@/resources/product-json/products-12381742.json 
> http://solr:8983/solr/products_20200609/update/json/docs
  % Total% Received % Xferd  Average Speed   TimeTime Time  Current
 Dload  Upload   Total   SpentLeft  Speed
100 11.6M  10071  100 11.6M  6  1041k  0:00:11  0:00:11 --:--:--  706k
{
  "responseHeader":{
"rf":1,
"status":0,
"QTime":11446}}

It seems to be related to the point at which a new collection needs to be
created by the CRA.

The relevant logs:

2020-06-09 02:12:56.107 INFO
(OverseerThreadFactory-9-thread-3-processing-n:10.20.36.130:8983_solr)
[   ] o.a.s.c.a.c.CreateCollectionCmd Create collection
products_20200609__CRA__2005
2020-06-09 02:12:56.232 INFO
(OverseerStateUpdate-72169202568593409-10.20.36.130:8983_solr-n_00)[
  ] o.a.s.c.o.SliceMutator createReplica() {
  "operation":"ADDREPLICA",
  "collection":"products_20200609__CRA__2005",
  "shard":"shard1",
  "core":"products_20200609__CRA__2005_shard1_replica_n1",
  "state":"down",
  "base_url":"http://10.20.36.130:8983/solr",
  "node_name":"10.20.36.130:8983_solr",
  "type":"NRT",
  "waitForFinalState":"false"}
2020-06-09 02:12:56.444 INFO  (qtp90045638-25) [
x:products_20200609__CRA__2005_shard1_replica_n1]
o.a.s.h.a.CoreAdminOperation core create command
qt=/admin/cores&coreNodeName=core_node2&collection.configName=products&newCollection=true&name=products_20200609__CRA__2005_shard1_replica_n
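Tom's observation is that the first update forcing creation of a new routed collection fails with a 400 "Cannot parse provided JSON" error, while the immediate retry succeeds. Until the underlying bug is fixed, one pragmatic client-side band-aid is to retry exactly that failure. This is a hedged sketch, not a fix: `post` is an injected callable returning `(status, body)` so the example runs without a cluster (wiring it to a real HTTP client is left out), and the retry condition matches the error text seen in the thread:

```python
def post_with_retry(post, url, payload, retries=1):
    """Retry an update once when it hits the CRA first-write failure:
    a 400 whose body contains 'Cannot parse provided JSON'. Any other
    status or error is returned to the caller unchanged."""
    status, body = post(url, payload)
    while status == 400 and "Cannot parse provided JSON" in body and retries > 0:
        retries -= 1
        status, body = post(url, payload)
    return status, body

if __name__ == "__main__":
    calls = {"n": 0}

    def flaky_post(url, payload):  # stub: fails once, then succeeds
        calls["n"] += 1
        if calls["n"] == 1:
            return 400, "Cannot parse provided JSON: JSON Parse Error"
        return 200, '{"responseHeader":{"status":0}}'

    print(post_with_retry(flaky_post,
                          "/solr/products_20200609/update/json/docs", "{}"))
    # → (200, '{"responseHeader":{"status":0}}')
```

Note this only papers over the symptom; the garbled `BEFORE='...¶ms...'` bytes in the error suggest the forwarded request body and its declared content type disagree on that first write.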

Re: Error when trying to create a core in solr

2020-06-09 Thread Jim Anderson
Hi Erick,

I probably should have included information about the config directory. As
part of the setup, I had copied the config directory as follows:

$ cp -r /usr/share/solr-8.5.1/server/solr/configsets/_default/* .

Note that the copy was from solr-8.5.1 because I could not find a
'_default' directory in solr-7.3.1. Copying from 8.5.1 may well be my
problem.
I will check and see if I can find a 7.3.1 example directory to copy from.
I will report back.

Regards,
Jim

On Tue, Jun 9, 2020 at 10:22 AM Erick Erickson 
wrote:

> You need the entire config directory for a start, not just the schema file.
>
> And there’s no need to copy things around, just path to the nutch-provided
> config directory and you can leave off the “conf” since the upload process
> automatically checks for it and does the right thing.
>
> Best,
> Erick
>
> > On Jun 9, 2020, at 9:50 AM, Jim Anderson 
> wrote:
> >
> > Hi,
> >
> > I am running Solr-7.3.1. I have just untarred the Solr-7.3.1 area and
> > created a 'nutch' directory for the core. I have downloaded
> > nutch-master.zip from
> > https://github.com/apache/nutch, unzipped that file and copied
> schema.xml
> > to .../server/solr/configsets/nutch/conf/schema.xml
> >
> > In the schema file, I modified the lastModified field value to true, with
> > no other changes.
> >
> > I am running the following command:
> >
> > .../bin/solr create -c nutch -d .../server/solr/configsets/nutch/conf/
> >
> > and getting the error message:
> >
> > ERROR: Error CREATEing SolrCore 'nutch': Unable to create core [nutch]
> > Caused by: Illegal pattern component: pp
> >
> > I have done a search for an error message containing: "Illegal pattern
> > component: pp" but I did not find anything useful.
> >
> > Can anyone help explain what this error message means and/or what needs
> to
> > be done to fix this problem?
> >
> > Jim A.
>
>


Re: Error when trying to create a core in solr

2020-06-09 Thread Erick Erickson
You need the entire config directory for a start, not just the schema file.

And there’s no need to copy things around, just path to the nutch-provided
config directory and you can leave off the “conf” since the upload process
automatically checks for it and does the right thing.

Best,
Erick

> On Jun 9, 2020, at 9:50 AM, Jim Anderson  wrote:
> 
> Hi,
> 
> I am running Solr-7.3.1. I have just untarred the Solr-7.3.1 area and
> created a 'nutch' directory for the core. I have downloaded
> nutch-master.zip from
> https://github.com/apache/nutch, unzipped that file and copied schema.xml
> to .../server/solr/configsets/nutch/conf/schema.xml
> 
> In the schema file, I modified the lastModified field value to true, with no
> other changes.
> 
> I am running the following command:
> 
> .../bin/solr create -c nutch -d .../server/solr/configsets/nutch/conf/
> 
> and getting the error message:
> 
> ERROR: Error CREATEing SolrCore 'nutch': Unable to create core [nutch]
> Caused by: Illegal pattern component: pp
> 
> I have done a search for an error message containing: "Illegal pattern
> component: pp" but I did not find anything useful.
> 
> Can anyone help explain what this error message means and/or what needs to
> be done to fix this problem?
> 
> Jim A.



Error when trying to create a core in solr

2020-06-09 Thread Jim Anderson
Hi,

I am running Solr-7.3.1. I have just untarred the Solr-7.3.1 area and
created a 'nutch' directory for the core. I have downloaded
nutch-master.zip from
https://github.com/apache/nutch, unzipped that file and copied schema.xml
to .../server/solr/configsets/nutch/conf/schema.xml

In the schema file, I modified the lastModified field value to true, with no
other changes.

I am running the following command:

.../bin/solr create -c nutch -d .../server/solr/configsets/nutch/conf/

and getting the error message:

ERROR: Error CREATEing SolrCore 'nutch': Unable to create core [nutch]
Caused by: Illegal pattern component: pp

I have done a search for an error message containing: "Illegal pattern
component: pp" but I did not find anything useful.

Can anyone help explain what this error message means and/or what needs to
be done to fix this problem?

Jim A.
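
[Editor's note] Erick's advice in the replies above — pass the entire config directory, not just schema.xml — would look roughly like the sketch below. Paths are illustrative placeholders, not taken from the thread:

```shell
# Sketch only -- adjust paths to your installation.
SOLR_HOME=/opt/solr-7.3.1     # hypothetical Solr install location
NUTCH_CONF=/opt/nutch/conf    # the complete Nutch-provided config directory

# Point -d at the whole directory; the upload process checks for a conf/
# subdirectory automatically, so there is no need to copy files around.
"$SOLR_HOME/bin/solr" create -c nutch -d "$NUTCH_CONF"
```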


Re: Solr admin error message - where are relevant log files?

2020-06-07 Thread Jim Anderson
I cleared the Firefox cache and restarted and things are working ok now.

Jim

On Sun, Jun 7, 2020 at 3:44 PM Jim Anderson 
wrote:

> @Jan
>
> Thanks for the suggestion. I tried opera instead of firefox and it worked.
> I will try clearing the cache on firefox, restart it and see if it works
> there.
>
> Jim
>
> On Sun, Jun 7, 2020 at 3:28 PM Jim Anderson 
> wrote:
>
>> An update.
>>
>> I started over by removing my Solr 7.3.1 installation and untarring again.
>>
>> Then went to the bin root directory and entered:
>>
>> bin/solr -start
>>
>> Next, I brought up the solr admin window and it still gives the same
>> error message and hangs up. As far as I can tell I am running solr straight
>> out of the box.
>>
>> Jim
>>
>> On Sun, Jun 7, 2020 at 3:07 PM Jim Anderson 
>> wrote:
>>
>>> >>> Did you install Solr with the installer script
>>>
>>> I was not aware that there is an install script. I will look for it, but
>>> if you can point me to it, that will help
>>>
>>> >>> or just
>>> >>> start it up after extracting the archive?
>>>
>>> I extracted the files from a tar ball and did a bit of setting up. For
>>> example, I created a core and modified my schema.xml file a bit.
>>>
>>> >> Does the solr/server/logs
>>> >> directory you mentioned contain files with timestamps that are
>>> current?
>>>
>>> The log files were current.
>>>
>>> >>> If you go to the "Logging" tab when the admin UI shows that error
>>>
>>> I cannot go to the "Logging" tab. When the admin UI comes up, it shows
>>> the error message and hangs with the cursor spinning.
>>>
>>> Thanks for the input. Again, if you can provide the install script, that
>>> will likely help. I'm going to go back and start with installing Solr again.
>>>
>>> Jim
>>>
>>>
>>>
>>> On Sun, Jun 7, 2020 at 1:09 PM Shawn Heisey  wrote:
>>>
>>>> On 6/7/2020 10:16 AM, Jim Anderson wrote:
>>>> > The admin pages comes up with:
>>>> >
>>>> > SolrCore Initialization Failures
>>>>
>>>> 
>>>>
> > I look in my .../solr/server/logs directory and cannot find any
>>>> meaningful
>>>> > errors or warnings.
>>>> >
>>>> > Should I be looking elsewhere?
>>>>
>>>> That depends.  Did you install Solr with the installer script, or just
>>>> start it up after extracting the archive?  Does the solr/server/logs
>>>> directory you mentioned contain files with timestamps that are current?
>>>> If not, then the logs are likely going somewhere else.
>>>>
>>>> If you go to the "Logging" tab when the admin UI shows that error, you
>>>> will be able to see any log messages at WARN or higher severity.  Often
>>>> such log entries will need to be expanded by clicking on the little "i"
>>>> icon.  It will close again quickly, so you need to read fast.
>>>>
>>>> Thanks,
>>>> Shawn
>>>>
>>>


Re: Solr admin error message - where are relevant log files?

2020-06-07 Thread Jim Anderson
@Jan

Thanks for the suggestion. I tried opera instead of firefox and it worked.
I will try clearing the cache on firefox, restart it and see if it works
there.

Jim

On Sun, Jun 7, 2020 at 3:28 PM Jim Anderson 
wrote:

> An update.
>
> I started over by removing my Solr 7.3.1 installation and untarring again.
>
> Then went to the bin root directory and entered:
>
> bin/solr -start
>
> Next, I brought up the solr admin window and it still gives the same error
> message and hangs up. As far as I can tell I am running solr straight out
> of the box.
>
> Jim
>
> On Sun, Jun 7, 2020 at 3:07 PM Jim Anderson 
> wrote:
>
>> >>> Did you install Solr with the installer script
>>
>> I was not aware that there is an install script. I will look for it, but
>> if you can point me to it, that will help
>>
>> >>> or just
>> >>> start it up after extracting the archive?
>>
>> I extracted the files from a tar ball and did a bit of setting up. For
>> example, I created a core and modified my schema.xml file a bit.
>>
>> >> Does the solr/server/logs
>> >> directory you mentioned contain files with timestamps that are
>> current?
>>
>> The log files were current.
>>
>> >>> If you go to the "Logging" tab when the admin UI shows that error
>>
>> I cannot go to the "Logging" tab. When the admin UI comes up, it shows
>> the error message and hangs with the cursor spinning.
>>
>> Thanks for the input. Again, if you can provide the install script, that
>> will likely help. I'm going to go back and start with installing Solr again.
>>
>> Jim
>>
>>
>>
>> On Sun, Jun 7, 2020 at 1:09 PM Shawn Heisey  wrote:
>>
>>> On 6/7/2020 10:16 AM, Jim Anderson wrote:
>>> > The admin pages comes up with:
>>> >
>>> > SolrCore Initialization Failures
>>>
>>> 
>>>
>>> > I look in my .../solr/server/logs directory and cannot find any
>>> meaningful
>>> > errors or warnings.
>>> >
>>> > Should I be looking elsewhere?
>>>
>>> That depends.  Did you install Solr with the installer script, or just
>>> start it up after extracting the archive?  Does the solr/server/logs
>>> directory you mentioned contain files with timestamps that are current?
>>> If not, then the logs are likely going somewhere else.
>>>
>>> If you go to the "Logging" tab when the admin UI shows that error, you
>>> will be able to see any log messages at WARN or higher severity.  Often
>>> such log entries will need to be expanded by clicking on the little "i"
>>> icon.  It will close again quickly, so you need to read fast.
>>>
>>> Thanks,
>>> Shawn
>>>
>>


Re: Solr admin error message - where are relevant log files?

2020-06-07 Thread Jim Anderson
An update.

I started over by removing my Solr 7.3.1 installation and untarring again.

Then went to the bin root directory and entered:

bin/solr -start

Next, I brought up the solr admin window and it still gives the same error
message and hangs up. As far as I can tell I am running solr straight out
of the box.

Jim

On Sun, Jun 7, 2020 at 3:07 PM Jim Anderson 
wrote:

> >>> Did you install Solr with the installer script
>
> I was not aware that there is an install script. I will look for it, but
> if you can point me to it, that will help
>
> >>> or just
> >>> start it up after extracting the archive?
>
> I extracted the files from a tar ball and did a bit of setting up. For
> example, I created a core and modified my schema.xml file a bit.
>
> >> Does the solr/server/logs
> >> directory you mentioned contain files with timestamps that are current?
>
> The log files were current.
>
> >>> If you go to the "Logging" tab when the admin UI shows that error
>
> I cannot go to the "Logging" tab. When the admin UI comes up, it shows the
> error message and hangs with the cursor spinning.
>
> Thanks for the input. Again, if you can provide the install script, that
> will likely help. I'm going to go back and start with installing Solr again.
>
> Jim
>
>
>
> On Sun, Jun 7, 2020 at 1:09 PM Shawn Heisey  wrote:
>
>> On 6/7/2020 10:16 AM, Jim Anderson wrote:
>> > The admin pages comes up with:
>> >
>> > SolrCore Initialization Failures
>>
>> 
>>
>> > I look in my .../solr/server/logs directory and cannot find any
>> meaningful
>> > errors or warnings.
>> >
>> > Should I be looking elsewhere?
>>
>> That depends.  Did you install Solr with the installer script, or just
>> start it up after extracting the archive?  Does the solr/server/logs
>> directory you mentioned contain files with timestamps that are current?
>> If not, then the logs are likely going somewhere else.
>>
>> If you go to the "Logging" tab when the admin UI shows that error, you
>> will be able to see any log messages at WARN or higher severity.  Often
>> such log entries will need to be expanded by clicking on the little "i"
>> icon.  It will close again quickly, so you need to read fast.
>>
>> Thanks,
>> Shawn
>>
>


Re: Solr admin error message - where are relevant log files?

2020-06-07 Thread Jan Høydahl
Try force reloading the admin page in your browser a few times. Or try another 
browser?

Jan Høydahl

> 7. jun. 2020 kl. 21:07 skrev Jim Anderson :
> 
> 
>>>> Did you install Solr with the installer script
> 
> I was not aware that there is an install script. I will look for it, but if
> you can point me to it, that will help
> 
>>>> or just
>>>> start it up after extracting the archive?
> 
> I extracted the files from a tar ball and did a bit of setting up. For
> example, I created a core and modified my schema.xml file a bit.
> 
>>> Does the solr/server/logs
>>> directory you mentioned contain files with timestamps that are current?
> 
> The log files were current.
> 
>>>> If you go to the "Logging" tab when the admin UI shows that error
> 
> I cannot go to the "Logging" tab. When the admin UI comes up, it shows the
> error message and hangs with the cursor spinning.
> 
> Thanks for the input. Again, if you can provide the install script, that
> will likely help. I'm going to go back and start with installing Solr again.
> 
> Jim
> 
> 
> 
>>> On Sun, Jun 7, 2020 at 1:09 PM Shawn Heisey  wrote:
>>> On 6/7/2020 10:16 AM, Jim Anderson wrote:
>>> The admin pages comes up with:
>>> SolrCore Initialization Failures
>> 
>>> I look in my .../solr/server/logs directory and cannot find any
>> meaningful
>>> errors or warnings.
>>> Should I be looking elsewhere?
>> That depends.  Did you install Solr with the installer script, or just
>> start it up after extracting the archive?  Does the solr/server/logs
>> directory you mentioned contain files with timestamps that are current?
>> If not, then the logs are likely going somewhere else.
>> If you go to the "Logging" tab when the admin UI shows that error, you
>> will be able to see any log messages at WARN or higher severity.  Often
>> such log entries will need to be expanded by clicking on the little "i"
>> icon.  It will close again quickly, so you need to read fast.
>> Thanks,
>> Shawn


Re: Solr admin error message - where are relevant log files?

2020-06-07 Thread Jim Anderson
 >>> Did you install Solr with the installer script

I was not aware that there is an install script. I will look for it, but if
you can point me to it, that will help

>>> or just
>>> start it up after extracting the archive?

I extracted the files from a tar ball and did a bit of setting up. For
example, I created a core and modified my schema.xml file a bit.

>> Does the solr/server/logs
>> directory you mentioned contain files with timestamps that are current?

The log files were current.

>>> If you go to the "Logging" tab when the admin UI shows that error

I cannot go to the "Logging" tab. When the admin UI comes up, it shows the
error message and hangs with the cursor spinning.

Thanks for the input. Again, if you can provide the install script, that
will likely help. I'm going to go back and start with installing Solr again.

Jim



On Sun, Jun 7, 2020 at 1:09 PM Shawn Heisey  wrote:

> On 6/7/2020 10:16 AM, Jim Anderson wrote:
> > The admin pages comes up with:
> >
> > SolrCore Initialization Failures
>
> 
>
> > I look in my .../solr/server/logs directory and cannot find any
> meaningful
> > errors or warnings.
> >
> > Should I be looking elsewhere?
>
> That depends.  Did you install Solr with the installer script, or just
> start it up after extracting the archive?  Does the solr/server/logs
> directory you mentioned contain files with timestamps that are current?
> If not, then the logs are likely going somewhere else.
>
> If you go to the "Logging" tab when the admin UI shows that error, you
> will be able to see any log messages at WARN or higher severity.  Often
> such log entries will need to be expanded by clicking on the little "i"
> icon.  It will close again quickly, so you need to read fast.
>
> Thanks,
> Shawn
>


Re: Solr admin error message - where are relevant log files?

2020-06-07 Thread Shawn Heisey

On 6/7/2020 10:16 AM, Jim Anderson wrote:

The admin page comes up with:

SolrCore Initialization Failures





I look in my .../solr/server/logs directory and cannot find any meaningful
errors or warnings.

Should I be looking elsewhere?


That depends.  Did you install Solr with the installer script, or just 
start it up after extracting the archive?  Does the solr/server/logs 
directory you mentioned contain files with timestamps that are current? 
If not, then the logs are likely going somewhere else.


If you go to the "Logging" tab when the admin UI shows that error, you 
will be able to see any log messages at WARN or higher severity.  Often 
such log entries will need to be expanded by clicking on the little "i" 
icon.  It will close again quickly, so you need to read fast.


Thanks,
Shawn
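
[Editor's note] When the admin UI hangs before the Logging tab loads, the same information can usually be pulled straight from the log files. A quick check along these lines, run from the extracted Solr directory (filenames assumed to be the defaults):

```shell
# Confirm the log files are actually current, then pull recent problems.
ls -lt server/logs | head -n 5
grep -iE 'ERROR|WARN' server/logs/solr.log | tail -n 20
```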


Solr admin error message - where are relevant log files?

2020-06-07 Thread Jim Anderson
Hi,

I'm a newbie with Solr, and going through tutorials and trying to get Solr
working with Nutch.

Today, I started up Solr and then brought up Solr Admin at:

http://localhost:8983/solr/

The admin page comes up with:

SolrCore Initialization Failures

   - *{{core}}:* {{error}}

Please check your logs for more information


I look in my .../solr/server/logs directory and cannot find any meaningful
errors or warnings.


Should I be looking elsewhere?

Jim A.


Re: Solr 8.5.1 startup error - lengthTag=109, too big.

2020-05-31 Thread Zheng Lin Edwin Yeo
Hi Jan,

Thanks for your reply.

I have found that the issue is due to SOLR_SSL_KEY_STORE_TYPE env default
was set to PKCS12, while in my previous version it was JKS.

Regards,
Edwin
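
[Editor's note] For anyone hitting the same mismatch: a minimal solr.in.sh fragment pinning the type back to JKS might look like this. Paths and filenames are placeholders, not Edwin's actual config:

```shell
# solr.in.sh fragment (sketch) -- adjust paths for your setup.
SOLR_SSL_KEY_STORE=/etc/solr/solr-ssl.keystore.jks
SOLR_SSL_KEY_STORE_TYPE=JKS      # override the PKCS12 default
SOLR_SSL_TRUST_STORE=/etc/solr/solr-ssl.keystore.jks
SOLR_SSL_TRUST_STORE_TYPE=JKS
```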


On Thu, 28 May 2020 at 21:08, Jan Høydahl  wrote:

> I also believe this is due to keystore format confusion.
> How exactly do you generate your keystore, what is the keystore file
> named, and do you specify the SOLR_SSL_KEY_STORE_TYPE env?
>
> Jan
>
> > 28. mai 2020 kl. 04:03 skrev Zheng Lin Edwin Yeo :
> >
> > Hi Mike,
> >
> > Thanks for your reply.
> >
> > Yes, I have SSL enabled in 8.2.1 as well. The error is there even if I
> use
> > the same certificate for 8.2.1, which was working fine there.
> > I have also generated the certificate for both 8.2.1 and 8.5.1 by the
> same
> > method.
> >
> > Are there any changes between these 2 versions that would have affected
> > this? (Eg: there are changes in the way we generate the certificate)
> >
> > Regards,
> > Edwin
> >
> > On Wed, 27 May 2020 at 04:23, Mike Drob  wrote:
> >
> >> Did you have SSL enabled with 8.2.1?
> >>
> >> The error looks common to certificate handling and not specific to Solr.
> >>
> >> I would verify that you have no extra characters in your certificate
> file
> >> (including line endings) and that the keystore type that you specified
> >> matches the file you are presenting (JKS or PKCS12)
> >>
> >> Mike
> >>
> >> On Sat, May 23, 2020 at 10:11 PM Zheng Lin Edwin Yeo <
> edwinye...@gmail.com
> >>>
> >> wrote:
> >>
> >>> Hi,
> >>>
> >>> I'm trying to upgrade from Solr 8.2.1 to Solr 8.5.1, with Solr SSL
> >>> Authentication and Authorization.
> >>>
> >>> However, I get the following error when I enable SSL. The Solr itself
> can
> >>> start up if there is no SSL.  The main error that I see is this
> >>>
> >>>  java.io.IOException: DerInputStream.getLength(): lengthTag=109, too
> >> big.
> >>>
> >>> What could be the reason that causes this?
> >>>
> >>>
> >>> INFO  - 2020-05-24 10:38:20.080;
> >>> org.apache.solr.util.configuration.SSLConfigurations; Setting
> >>> javax.net.ssl.keyStorePassword
> >>> INFO  - 2020-05-24 10:38:20.081;
> >>> org.apache.solr.util.configuration.SSLConfigurations; Setting
> >>> javax.net.ssl.trustStorePassword
> >>> Waiting up to 120 to see Solr running on port 8983
> >>> java.lang.reflect.InvocationTargetException
> >>>at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>>at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >>>at java.lang.reflect.Method.invoke(Unknown Source)
> >>>at org.eclipse.jetty.start.Main.invokeMain(Main.java:218)
> >>>at org.eclipse.jetty.start.Main.start(Main.java:491)
>>>at org.eclipse.jetty.start.Main.main(Main.java:77)
> >>> Caused by: java.security.PrivilegedActionException:
> java.io.IOException:
> >>> DerInputStream.getLength(): lengthTag=109, too big.
> >>>at java.security.AccessController.doPrivileged(Native Method)
> >>>at
> >>> org.eclipse.jetty.xml.XmlConfiguration.main(XmlConfiguration.java:1837)
> >>>... 7 more
> >>> Caused by: java.io.IOException: DerInputStream.getLength():
> >> lengthTag=109,
> >>> too big.
> >>>at sun.security.util.DerInputStream.getLength(Unknown Source)
> >>>at sun.security.util.DerValue.init(Unknown Source)
>>>at sun.security.util.DerValue.<init>(Unknown Source)
>>>at sun.security.util.DerValue.<init>(Unknown Source)
> >>>at sun.security.pkcs12.PKCS12KeyStore.engineLoad(Unknown Source)
> >>>at java.security.KeyStore.load(Unknown Source)
> >>>at
> >>>
> >>>
> >>
> org.eclipse.jetty.util.security.CertificateUtils.getKeyStore(CertificateUtils.java:54)
> >>>at
> >>>
> >>>
> >>
> org.eclipse.jetty.util.ssl.SslContextFactory.loadKeyStore(SslContextFactory.java:1188)
> >>>at
> >>>
> >>>
> >>
> org.eclipse.jetty.util.ssl.SslContextFactory.load(SslContextFactory.java:323)
&

Re: Solr 8.5.1 startup error - lengthTag=109, too big.

2020-05-28 Thread Jan Høydahl
I also believe this is due to keystore format confusion.
How exactly do you generate your keystore, what is the keystore file named, and 
do you specify the SOLR_SSL_KEY_STORE_TYPE env?

Jan

> 28. mai 2020 kl. 04:03 skrev Zheng Lin Edwin Yeo :
> 
> Hi Mike,
> 
> Thanks for your reply.
> 
> Yes, I have SSL enabled in 8.2.1 as well. The error is there even if I use
> the same certificate for 8.2.1, which was working fine there.
> I have also generated the certificate for both 8.2.1 and 8.5.1 by the same
> method.
> 
> Are there any changes between these 2 versions that would have affected
> this? (Eg: there are changes in the way we generate the certificate)
> 
> Regards,
> Edwin
> 
> On Wed, 27 May 2020 at 04:23, Mike Drob  wrote:
> 
>> Did you have SSL enabled with 8.2.1?
>> 
>> The error looks common to certificate handling and not specific to Solr.
>> 
>> I would verify that you have no extra characters in your certificate file
>> (including line endings) and that the keystore type that you specified
>> matches the file you are presenting (JKS or PKCS12)
>> 
>> Mike
>> 
>> On Sat, May 23, 2020 at 10:11 PM Zheng Lin Edwin Yeo >> 
>> wrote:
>> 
>>> Hi,
>>> 
>>> I'm trying to upgrade from Solr 8.2.1 to Solr 8.5.1, with Solr SSL
>>> Authentication and Authorization.
>>> 
>>> However, I get the following error when I enable SSL. The Solr itself can
>>> start up if there is no SSL.  The main error that I see is this
>>> 
>>>  java.io.IOException: DerInputStream.getLength(): lengthTag=109, too
>> big.
>>> 
>>> What could be the reason that causes this?
>>> 
>>> 
>>> INFO  - 2020-05-24 10:38:20.080;
>>> org.apache.solr.util.configuration.SSLConfigurations; Setting
>>> javax.net.ssl.keyStorePassword
>>> INFO  - 2020-05-24 10:38:20.081;
>>> org.apache.solr.util.configuration.SSLConfigurations; Setting
>>> javax.net.ssl.trustStorePassword
>>> Waiting up to 120 to see Solr running on port 8983
>>> java.lang.reflect.InvocationTargetException
>>>at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>>>at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>>>at java.lang.reflect.Method.invoke(Unknown Source)
>>>at org.eclipse.jetty.start.Main.invokeMain(Main.java:218)
>>>at org.eclipse.jetty.start.Main.start(Main.java:491)
>>>at org.eclipse.jetty.start.Main.main(Main.java:77)
>>> Caused by: java.security.PrivilegedActionException: java.io.IOException:
>>> DerInputStream.getLength(): lengthTag=109, too big.
>>>at java.security.AccessController.doPrivileged(Native Method)
>>>at
>>> org.eclipse.jetty.xml.XmlConfiguration.main(XmlConfiguration.java:1837)
>>>... 7 more
>>> Caused by: java.io.IOException: DerInputStream.getLength():
>> lengthTag=109,
>>> too big.
>>>at sun.security.util.DerInputStream.getLength(Unknown Source)
>>>at sun.security.util.DerValue.init(Unknown Source)
>>>at sun.security.util.DerValue.<init>(Unknown Source)
>>>at sun.security.util.DerValue.<init>(Unknown Source)
>>>at sun.security.pkcs12.PKCS12KeyStore.engineLoad(Unknown Source)
>>>at java.security.KeyStore.load(Unknown Source)
>>>at
>>> 
>>> 
>> org.eclipse.jetty.util.security.CertificateUtils.getKeyStore(CertificateUtils.java:54)
>>>at
>>> 
>>> 
>> org.eclipse.jetty.util.ssl.SslContextFactory.loadKeyStore(SslContextFactory.java:1188)
>>>at
>>> 
>>> 
>> org.eclipse.jetty.util.ssl.SslContextFactory.load(SslContextFactory.java:323)
>>>at
>>> 
>>> 
>> org.eclipse.jetty.util.ssl.SslContextFactory.doStart(SslContextFactory.java:245)
>>>at
>>> 
>>> 
>> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:72)
>>>at
>>> 
>>> 
>> org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:169)
>>>at
>>> 
>>> 
>> org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:117)
>>>at
>>> 
>>> 
>> org.eclipse.jetty.server.SslConnectionFactory.doStart(SslConnectionFactory.java:92)
>>>at
>>> 

Re: Solr 8.5.1 startup error - lengthTag=109, too big.

2020-05-27 Thread Zheng Lin Edwin Yeo
Hi Mike,

Thanks for your reply.

Yes, I have SSL enabled in 8.2.1 as well. The error is there even if I use
the same certificate for 8.2.1, which was working fine there.
I have also generated the certificate for both 8.2.1 and 8.5.1 by the same
method.

Are there any changes between these 2 versions that would have affected
this? (Eg: there are changes in the way we generate the certificate)

Regards,
Edwin

On Wed, 27 May 2020 at 04:23, Mike Drob  wrote:

> Did you have SSL enabled with 8.2.1?
>
> The error looks common to certificate handling and not specific to Solr.
>
> I would verify that you have no extra characters in your certificate file
> (including line endings) and that the keystore type that you specified
> matches the file you are presenting (JKS or PKCS12)
>
> Mike
>
> On Sat, May 23, 2020 at 10:11 PM Zheng Lin Edwin Yeo  >
> wrote:
>
> > Hi,
> >
> > I'm trying to upgrade from Solr 8.2.1 to Solr 8.5.1, with Solr SSL
> > Authentication and Authorization.
> >
> > However, I get the following error when I enable SSL. The Solr itself can
> > start up if there is no SSL.  The main error that I see is this
> >
> >   java.io.IOException: DerInputStream.getLength(): lengthTag=109, too
> big.
> >
> > What could be the reason that causes this?
> >
> >
> > INFO  - 2020-05-24 10:38:20.080;
> > org.apache.solr.util.configuration.SSLConfigurations; Setting
> > javax.net.ssl.keyStorePassword
> > INFO  - 2020-05-24 10:38:20.081;
> > org.apache.solr.util.configuration.SSLConfigurations; Setting
> > javax.net.ssl.trustStorePassword
> > Waiting up to 120 to see Solr running on port 8983
> > java.lang.reflect.InvocationTargetException
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> Source)
> > at java.lang.reflect.Method.invoke(Unknown Source)
> > at org.eclipse.jetty.start.Main.invokeMain(Main.java:218)
> > at org.eclipse.jetty.start.Main.start(Main.java:491)
> > at org.eclipse.jetty.start.Main.main(Main.java:77)
> > Caused by: java.security.PrivilegedActionException: java.io.IOException:
> > DerInputStream.getLength(): lengthTag=109, too big.
> > at java.security.AccessController.doPrivileged(Native Method)
> > at
> > org.eclipse.jetty.xml.XmlConfiguration.main(XmlConfiguration.java:1837)
> > ... 7 more
> > Caused by: java.io.IOException: DerInputStream.getLength():
> lengthTag=109,
> > too big.
> > at sun.security.util.DerInputStream.getLength(Unknown Source)
> > at sun.security.util.DerValue.init(Unknown Source)
> > at sun.security.util.DerValue.<init>(Unknown Source)
> > at sun.security.util.DerValue.<init>(Unknown Source)
> > at sun.security.pkcs12.PKCS12KeyStore.engineLoad(Unknown Source)
> > at java.security.KeyStore.load(Unknown Source)
> > at
> >
> >
> org.eclipse.jetty.util.security.CertificateUtils.getKeyStore(CertificateUtils.java:54)
> > at
> >
> >
> org.eclipse.jetty.util.ssl.SslContextFactory.loadKeyStore(SslContextFactory.java:1188)
> > at
> >
> >
> org.eclipse.jetty.util.ssl.SslContextFactory.load(SslContextFactory.java:323)
> > at
> >
> >
> org.eclipse.jetty.util.ssl.SslContextFactory.doStart(SslContextFactory.java:245)
> > at
> >
> >
> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:72)
> > at
> >
> >
> org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:169)
> > at
> >
> >
> org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:117)
> > at
> >
> >
> org.eclipse.jetty.server.SslConnectionFactory.doStart(SslConnectionFactory.java:92)
> > at
> >
> >
> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:72)
> > at
> >
> >
> org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:169)
> > at
> >
> >
> org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:117)
> > at
> >
> >
> org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:320)
> > at
> >
> >
> org.eclipse.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:

Re: Solr 8.5.1 startup error - lengthTag=109, too big.

2020-05-26 Thread Mike Drob
Did you have SSL enabled with 8.2.1?

The error looks common to certificate handling and not specific to Solr.

I would verify that you have no extra characters in your certificate file
(including line endings) and that the keystore type that you specified
matches the file you are presenting (JKS or PKCS12)

Mike
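
[Editor's note] One quick way to check the point above about matching formats: keytool (bundled with the JDK) reports the type it detects. Filename and password here are illustrative:

```shell
# Print the detected keystore type; it should match the type Solr is
# configured to expect (SOLR_SSL_KEY_STORE_TYPE).
keytool -list -keystore solr-ssl.keystore.p12 -storepass secret | head -n 3
# Look for a line like "Keystore type: PKCS12" (or "JKS").
```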

On Sat, May 23, 2020 at 10:11 PM Zheng Lin Edwin Yeo 
wrote:

> Hi,
>
> I'm trying to upgrade from Solr 8.2.1 to Solr 8.5.1, with Solr SSL
> Authentication and Authorization.
>
> However, I get the following error when I enable SSL. The Solr itself can
> start up if there is no SSL.  The main error that I see is this
>
>   java.io.IOException: DerInputStream.getLength(): lengthTag=109, too big.
>
> What could be the reason that causes this?
>
>
> INFO  - 2020-05-24 10:38:20.080;
> org.apache.solr.util.configuration.SSLConfigurations; Setting
> javax.net.ssl.keyStorePassword
> INFO  - 2020-05-24 10:38:20.081;
> org.apache.solr.util.configuration.SSLConfigurations; Setting
> javax.net.ssl.trustStorePassword
> Waiting up to 120 to see Solr running on port 8983
> java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at org.eclipse.jetty.start.Main.invokeMain(Main.java:218)
> at org.eclipse.jetty.start.Main.start(Main.java:491)
> at org.eclipse.jetty.start.Main.main(Main.java:77)
> Caused by: java.security.PrivilegedActionException: java.io.IOException:
> DerInputStream.getLength(): lengthTag=109, too big.
> at java.security.AccessController.doPrivileged(Native Method)
> at
> org.eclipse.jetty.xml.XmlConfiguration.main(XmlConfiguration.java:1837)
> ... 7 more
> Caused by: java.io.IOException: DerInputStream.getLength(): lengthTag=109,
> too big.
> at sun.security.util.DerInputStream.getLength(Unknown Source)
> at sun.security.util.DerValue.init(Unknown Source)
> at sun.security.util.DerValue.<init>(Unknown Source)
> at sun.security.util.DerValue.<init>(Unknown Source)
> at sun.security.pkcs12.PKCS12KeyStore.engineLoad(Unknown Source)
> at java.security.KeyStore.load(Unknown Source)
> at
>
> org.eclipse.jetty.util.security.CertificateUtils.getKeyStore(CertificateUtils.java:54)
> at
>
> org.eclipse.jetty.util.ssl.SslContextFactory.loadKeyStore(SslContextFactory.java:1188)
> at
>
> org.eclipse.jetty.util.ssl.SslContextFactory.load(SslContextFactory.java:323)
> at
>
> org.eclipse.jetty.util.ssl.SslContextFactory.doStart(SslContextFactory.java:245)
> at
>
> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:72)
> at
>
> org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:169)
> at
>
> org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:117)
> at
>
> org.eclipse.jetty.server.SslConnectionFactory.doStart(SslConnectionFactory.java:92)
> at
>
> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:72)
> at
>
> org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:169)
> at
>
> org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:117)
> at
>
> org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:320)
> at
>
> org.eclipse.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:81)
> at
> org.eclipse.jetty.server.ServerConnector.doStart(ServerConnector.java:231)
> at
>
> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:72)
> at org.eclipse.jetty.server.Server.doStart(Server.java:385)
> at
>
> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:72)
> at
>
> org.eclipse.jetty.xml.XmlConfiguration.lambda$main$0(XmlConfiguration.java:1888)
> ... 9 more
> java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at org.eclipse.jetty.start.Main.invokeMain(Main.java:218)
> at org.eclipse.jetty.start.Main.start(Main.java:491)
> at org.eclipse.jetty.

Solr Collection core initialization Error with length mismatch

2020-05-26 Thread Mohamed Sirajudeen Mayitti Ahamed Pillai
Hello team,

We have 4 Solr VMs in SolrCloud 7.4. Only on a specific node's Admin UI are we 
seeing the message:
· cs_signals_shard1_replica_n1: 
org.apache.solr.common.SolrException:org.apache.solr.common.SolrException: 
Error opening new searcher

When restarting Solr, we noticed the error below for the collection in solr.log.

2020-05-26T12:28:29.759-0500 - ERROR 
[coreContainerWorkExecutor-2-thread-1-processing-n:fzcexfsnbepd02:8983_solr:solr.core.CoreContainer@714]
 - {node_name=n:fzcexfsnbepd02:8983_solr} - Error waiting for SolrCore to be 
loaded on startup
org.apache.solr.common.SolrException: Unable to create core 
[cs_signals_shard1_replica_n1]
at 
org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1156)
 ~[solr-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz 
- 2018-06-18 16:55:13]
at 
org.apache.solr.core.CoreContainer.lambda$load$13(CoreContainer.java:681) 
~[solr-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz 
- 2018-06-18 16:55:13]
at 
com.codahale.metrics.InstrumentedExecutorService$InstrumentedCallable.call(InstrumentedExecutorService.java:197)
 ~[metrics-core-3.2.6.jar:3.2.6]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
~[?:1.8.0_212]
at 
org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209)
 ~[solr-solrj-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - 
jpountz - 2018-06-18 16:55:14]
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) 
[?:1.8.0_212]
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) 
[?:1.8.0_212]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_212]
Caused by: org.apache.solr.common.SolrException: Error opening new searcher
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1012) 
~[solr-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz 
- 2018-06-18 16:55:13]
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:867) 
~[solr-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz 
- 2018-06-18 16:55:13]
at 
org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1135)
 ~[solr-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz 
- 2018-06-18 16:55:13]
... 7 more
Caused by: org.apache.solr.common.SolrException: Error opening new searcher
at 
org.apache.solr.core.SolrCore.openNewSearcher(SolrCore.java:2126) 
~[solr-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz 
- 2018-06-18 16:55:13]
at 
org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:2246) 
~[solr-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz 
- 2018-06-18 16:55:13]
at 
org.apache.solr.core.SolrCore.initSearcher(SolrCore.java:1095) 
~[solr-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz 
- 2018-06-18 16:55:13]
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:984) 
~[solr-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz 
- 2018-06-18 16:55:13]
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:867) 
~[solr-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz 
- 2018-06-18 16:55:13]
at 
org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1135)
 ~[solr-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - jpountz 
- 2018-06-18 16:55:13]
... 7 more
Caused by: org.apache.lucene.index.CorruptIndexException: length should be 
5271336964 bytes, but is 5272582148 instead 
(resource=MMapIndexInput(path="/opt/solr-7.4/data/solr/cs_signals_shard1_replica_n1/data/index.20200514060942617/_52b3.cfs"))
at 
org.apache.lucene.codecs.lucene50.Lucene50CompoundReader.<init>(Lucene50CompoundReader.java:91)
 ~[lucene-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - 
jpountz - 2018-06-18 16:51:45]
at 
org.apache.lucene.codecs.lucene50.Lucene50CompoundFormat.getCompoundReader(Lucene50CompoundFormat.java:70)
 ~[lucene-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - 
jpountz - 2018-06-18 16:51:45]
at 
org.apache.lucene.index.IndexWriter.readFieldInfos(IndexWriter.java:960) 
~[lucene-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - 
jpountz - 2018-06-18 16:51:45]
at 
org.apache.lucene.index.IndexWriter.getFieldNumberMap(IndexWriter.java:977) 
~[lucene-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - 
jpountz - 2018-06-18 16:51:45]
at 
org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:869) 
~[lucene-core-7.4.0.jar:7.4.0 9060ac689c270b02143f375de0348b7f626adebc - 
jpountz
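The CorruptIndexException above comes from a simple sanity check: the segment metadata declares the length the .cfs compound file should have, and the reader compares that against the actual on-disk size, so a mismatch means the file was truncated or altered after it was written (a bad copy, restore, or disk problem). A minimal Python analogue of the check — an illustrative sketch, not Lucene's actual code:

```python
import os

def verify_declared_length(path: str, declared_length: int) -> None:
    """Compare a file's declared length against its on-disk size,
    mirroring the compound-file check that raised the exception above."""
    actual = os.path.getsize(path)
    if actual != declared_length:
        raise ValueError(
            f"length should be {declared_length} bytes, "
            f"but is {actual} instead (resource={path})"
        )
```

When the real check fires, the usual remedies are restoring the replica from a healthy copy or, lossily, running Lucene's CheckIndex tool with `-exorcise` to drop the broken segments — not editing the file by hand.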

Solr 8.5.1 startup error - lengthTag=109, too big.

2020-05-23 Thread Zheng Lin Edwin Yeo
Hi,

I'm trying to upgrade from Solr 8.2.1 to Solr 8.5.1, with Solr SSL
Authentication and Authorization.

However, I get the following error when I enable SSL. The Solr itself can
start up if there is no SSL.  The main error that I see is this

  java.io.IOException: DerInputStream.getLength(): lengthTag=109, too big.

What could be the reason that causes this?


INFO  - 2020-05-24 10:38:20.080;
org.apache.solr.util.configuration.SSLConfigurations; Setting
javax.net.ssl.keyStorePassword
INFO  - 2020-05-24 10:38:20.081;
org.apache.solr.util.configuration.SSLConfigurations; Setting
javax.net.ssl.trustStorePassword
Waiting up to 120 to see Solr running on port 8983
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.eclipse.jetty.start.Main.invokeMain(Main.java:218)
at org.eclipse.jetty.start.Main.start(Main.java:491)
at org.eclipse.jetty.start.Main.main(Main.java:77)
Caused by: java.security.PrivilegedActionException: java.io.IOException:
DerInputStream.getLength(): lengthTag=109, too big.
at java.security.AccessController.doPrivileged(Native Method)
at
org.eclipse.jetty.xml.XmlConfiguration.main(XmlConfiguration.java:1837)
... 7 more
Caused by: java.io.IOException: DerInputStream.getLength(): lengthTag=109,
too big.
at sun.security.util.DerInputStream.getLength(Unknown Source)
at sun.security.util.DerValue.init(Unknown Source)
at sun.security.util.DerValue.<init>(Unknown Source)
at sun.security.util.DerValue.<init>(Unknown Source)
at sun.security.pkcs12.PKCS12KeyStore.engineLoad(Unknown Source)
at java.security.KeyStore.load(Unknown Source)
at
org.eclipse.jetty.util.security.CertificateUtils.getKeyStore(CertificateUtils.java:54)
at
org.eclipse.jetty.util.ssl.SslContextFactory.loadKeyStore(SslContextFactory.java:1188)
at
org.eclipse.jetty.util.ssl.SslContextFactory.load(SslContextFactory.java:323)
at
org.eclipse.jetty.util.ssl.SslContextFactory.doStart(SslContextFactory.java:245)
at
org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:72)
at
org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:169)
at
org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:117)
at
org.eclipse.jetty.server.SslConnectionFactory.doStart(SslConnectionFactory.java:92)
at
org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:72)
at
org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:169)
at
org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:117)
at
org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:320)
at
org.eclipse.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:81)
at
org.eclipse.jetty.server.ServerConnector.doStart(ServerConnector.java:231)
at
org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:72)
at org.eclipse.jetty.server.Server.doStart(Server.java:385)
at
org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:72)
at
org.eclipse.jetty.xml.XmlConfiguration.lambda$main$0(XmlConfiguration.java:1888)
... 9 more
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.eclipse.jetty.start.Main.invokeMain(Main.java:218)
at org.eclipse.jetty.start.Main.start(Main.java:491)
at org.eclipse.jetty.start.Main.main(Main.java:77)
Caused by: java.security.PrivilegedActionException: java.io.IOException:
DerInputStream.getLength(): lengthTag=109, too big.
at java.security.AccessController.doPrivileged(Native Method)
at
org.eclipse.jetty.xml.XmlConfiguration.main(XmlConfiguration.java:1837)
... 7 more
Caused by: java.io.IOException: DerInputStream.getLength(): lengthTag=109,
too big.
at sun.security.util.DerInputStream.getLength(Unknown Source)
at sun.security.util.DerValue.init(Unknown Source)
at sun.security.util.DerValue.<init>(Unknown Source)
at sun.security.util.DerValue.<init>(Unknown Source)
at sun.security.pkcs12.PKCS12KeyStore.engineLoad(Unknown Source)
at java.security.KeyStore.load(Unknown Source)
at
org.eclipse.jetty.util.security.CertificateUtils.getKeyStore(CertificateUtils.java:54)
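A `DerInputStream.getLength(): lengthTag=..., too big` failure while loading a keystore usually means the bytes handed to the PKCS12 parser are not DER-encoded PKCS#12 at all — commonly a JKS keystore (or a PEM file) being loaded with the wrong store type, which became easier to hit once Java's default store type changed to PKCS12. A quick, unofficial sanity check of what format a keystore file really is, based only on well-known magic bytes:

```python
def guess_keystore_type(header: bytes) -> str:
    """Best-effort guess at a keystore's on-disk format from its leading bytes:
    JKS stores start with the magic number 0xFEEDFEED, PEM files are text
    starting with '-----BEGIN', and PKCS#12 stores are DER-encoded so they
    begin with an ASN.1 SEQUENCE tag (0x30)."""
    if header.startswith(b"\xfe\xed\xfe\xed"):
        return "JKS"
    if header.startswith(b"-----BEGIN"):
        return "PEM"
    if header[:1] == b"\x30":
        return "PKCS12 (or another DER structure)"
    return "unknown"
```

If the file turns out to be JKS, either tell Solr so (solr.in.sh has `SOLR_SSL_KEY_STORE_TYPE` / `SOLR_SSL_TRUST_STORE_TYPE` settings) or convert it with `keytool -importkeystore ... -deststoretype PKCS12`.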
 

Error checking plugin : => org.apache.solr.common.SolrException: Error loading class 'solr.VelocityResponseWriter'

2020-05-18 Thread Prakhar Kumar
Hello Team,

I am using Solr 8.5.0 and here is the full log for the error which I am
getting:

SolrConfigHandler Error checking plugin :  =>
org.apache.solr.common.SolrException: Error loading class
'solr.VelocityResponseWriter'
@40005ec3702b3710a43c at
org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:570)
@40005ec3702b3710a824 org.apache.solr.common.SolrException: Error
loading class 'solr.VelocityResponseWriter'
@40005ec3702b3710ac0c at
org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:570)
~[?:?]
@40005ec3702b3710f25c at
org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:501)
~[?:?]
@40005ec3702b3710f644 at
org.apache.solr.core.SolrCore.createInstance(SolrCore.java:824) ~[?:?]
@40005ec3702b3710f644 at
org.apache.solr.core.SolrCore.createInitInstance(SolrCore.java:880) ~[?:?]
@40005ec3702b3710fa2c at
org.apache.solr.handler.SolrConfigHandler$Command.verifyClass(SolrConfigHandler.java:601)
~[?:?]
@40005ec3702b371105e4 at
org.apache.solr.handler.SolrConfigHandler$Command.updateNamedPlugin(SolrConfigHandler.java:565)
~[?:?]
@40005ec3702b371105e4 at
org.apache.solr.handler.SolrConfigHandler$Command.handleCommands(SolrConfigHandler.java:502)
~[?:?]
@40005ec3702b3711196c at
org.apache.solr.handler.SolrConfigHandler$Command.handlePOST(SolrConfigHandler.java:363)
~[?:?]
@40005ec3702b3711196c at
org.apache.solr.handler.SolrConfigHandler$Command.access$100(SolrConfigHandler.java:161)
~[?:?]
@40005ec3702b37111d54 at
org.apache.solr.handler.SolrConfigHandler.handleRequestBody(SolrConfigHandler.java:139)
~[?:?]
@40005ec3702b3711213c at
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
~[?:?]
@40005ec3702b3711290c at
org.apache.solr.core.SolrCore.execute(SolrCore.java:2596) ~[?:?]
@40005ec3702b37112cf4 at
org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:802) ~[?:?]
@40005ec3702b37112cf4 at
org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:579) ~[?:?]
@40005ec3702b371130dc at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:420)
~[?:?]
@40005ec3702b37115404 at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:352)
~[?:?]
@40005ec3702b371157ec at
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1596)
~[jetty-servlet-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b371157ec at
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
~[jetty-servlet-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b3711678c at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
~[jetty-server-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b3711678c at
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:590)
~[jetty-security-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b37116b74 at
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
~[jetty-server-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b37117344 at
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
~[jetty-server-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b3711772c at
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1607)
~[jetty-server-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b37117b14 at
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
~[jetty-server-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b371182e4 at
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1297)
~[jetty-server-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b37119284 at
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
~[jetty-server-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b37119284 at
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
~[jetty-servlet-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b3711966c at
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1577)
~[jetty-server-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b3711a224 at
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
~[jetty-server-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b3711a60c at
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1212)
~[jetty-server-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b3711a9f4 at
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
~[jetty-server-9.4.24.v20191120.jar:9.4.24.v20191120]
@40005ec3702b3711b1c4 at
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:221)
~[jetty-server-9.4.24.v20191120.jar:9.4.24.v20191120]
@4000
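In Solr 8.x, `Error loading class 'solr.VelocityResponseWriter'` typically means the Velocity contrib jars are not on the core's classpath, because the writer ships in contrib/velocity rather than in solr-core. A sketch of the solrconfig.xml directives that pull it in — the relative paths assume a default install layout and may need adjusting for your deployment:

```xml
<!-- Assumed default install layout; adjust the solr.install.dir offsets as needed -->
<lib dir="${solr.install.dir:../../../..}/contrib/velocity/lib" regex=".*\.jar"/>
<lib dir="${solr.install.dir:../../../..}/dist/" regex="solr-velocity-\d.*\.jar"/>

<queryResponseWriter name="velocity" class="solr.VelocityResponseWriter" startup="lazy"/>
```

If the writer is not actually needed, simply removing its definition from the configset makes the error go away.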

Re: Solr 8.1.5 Postlogs - Basic Authentication Error

2020-05-15 Thread Joel Bernstein
Right now this is not supported, but it would be fairly easy to add. I'll see if I
can get that in for the next release.


Joel Bernstein
http://joelsolr.blogspot.com/


On Mon, May 11, 2020 at 5:03 PM Waheed, Imran 
wrote:

> Is there a way to use bin/postlogs with basic authentication on? I am
> getting an error if I do not give a username/password
>
> bin/postlogs http://localhost:8983/solr/logs server/logs/<
> http://localhost:8983/solr/logs%20server/logs/> server/logs
>
> Exception in thread "main"
> org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error
> from server at http://localhost:8983/solr/logs: Expected mime type
> application/octet-stream but got text/html. 
> 
> 
> Error 401 require authentication
> 
> HTTP ERROR 401 require authentication
> 
> URI:/solr/logs/update
> STATUS:401
> MESSAGE:require authentication
> SERVLET:default
> 
>
> I get a different error if I try
> bin/postlogs -u user:@password http://localhost:8983/solr/logs
> server/logs/
>
>
> SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> SLF4J: Defaulting to no-operation (NOP) logger implementation
> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further
> details.
> Exception in thread "main" java.lang.NullPointerException
> at
> org.apache.solr.util.SolrLogPostTool.gatherFiles(SolrLogPostTool.java:127)
> at
> org.apache.solr.util.SolrLogPostTool.main(SolrLogPostTool.java:65)
>
> thank you,
> Imran
>
>
> The information in this e-mail is intended only for the person to whom it
> is
> addressed. If you believe this e-mail was sent to you in error and the
> e-mail
> contains patient information, please contact the Partners Compliance
> HelpLine at
> http://www.partners.org/complianceline . If the e-mail was sent to you in
> error
> but does not contain patient information, please contact the sender and
> properly
> dispose of the e-mail.
>
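Until bin/postlogs gains native basic-auth support, one workaround is to parse the logs and post the records with a small authenticated client of your own; the only auth-specific piece is the standard Basic Authorization header (RFC 7617). A sketch — the endpoint and credentials below are placeholders, not values from the thread:

```python
import base64

def basic_auth_header(user: str, password: str) -> dict:
    """Build an HTTP Basic Authorization header (RFC 7617):
    'Basic ' + base64("user:password")."""
    token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
    return {"Authorization": f"Basic {token}"}

# Hypothetical usage with urllib against a Solr update endpoint:
# req = urllib.request.Request(
#     "http://localhost:8983/solr/logs/update/json/docs",
#     data=payload,
#     headers={**basic_auth_header("solr", "SolrRocks"),
#              "Content-Type": "application/json"},
# )
```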


Re: Solr 8.1.5 Postlogs - Basic Authentication Error

2020-05-13 Thread ART GALLERY


Solr 8.1.5 Postlogs - Basic Authentication Error

2020-05-11 Thread Waheed, Imran
Is there a way to use bin/postlogs with basic authentication on? I am getting an 
error if I do not give a username/password

bin/postlogs http://localhost:8983/solr/logs 
server/logs/<http://localhost:8983/solr/logs%20server/logs/> server/logs

Exception in thread "main" 
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error 
from server at http://localhost:8983/solr/logs: Expected mime type 
application/octet-stream but got text/html. 


Error 401 require authentication

HTTP ERROR 401 require authentication

URI:/solr/logs/update
STATUS:401
MESSAGE:require authentication
SERVLET:default


I get a different error if I try
bin/postlogs -u user:@password http://localhost:8983/solr/logs server/logs/


SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.
Exception in thread "main" java.lang.NullPointerException
at 
org.apache.solr.util.SolrLogPostTool.gatherFiles(SolrLogPostTool.java:127)
at 
org.apache.solr.util.SolrLogPostTool.main(SolrLogPostTool.java:65)

thank you,
Imran


The information in this e-mail is intended only for the person to whom it is
addressed. If you believe this e-mail was sent to you in error and the e-mail
contains patient information, please contact the Partners Compliance HelpLine at
http://www.partners.org/complianceline . If the e-mail was sent to you in error
but does not contain patient information, please contact the sender and properly
dispose of the e-mail.


Re: Cause of java.io.IOException: No space left on device Error

2020-04-27 Thread Kayak28
Hello, Community:

Thank you for replying to my email.
Your responses were very helpful.




2020年4月23日(木) 20:35 Erick Erickson :

> In addition to what Dario mentioned, background merges
> happen all the time, optimize is just a special case (and
> very expensive).
>
> You say “one of my Solr cores has 47G”, but segment merging
> can easily occur on multiple cores at once, so that’s not
> definitive.
>
> We usually recommend that people have at least as much free
> space on disk as the aggregate of _all_ the cores/replicas on
> a physical machine.
>
> More than you may want to know about segment merging:
>
> https://lucidworks.com/post/segment-merging-deleted-documents-optimize-may-bad/
>
> Best,
> Erick
>
> > On Apr 23, 2020, at 4:38 AM, Dario Rigolin 
> wrote:
> >
> > When solr starts an optimization of the index you have to have free at
> > least same size (I don't know if 3 times is correct) of the core you are
> > optimizing.
> > Maybe your free space isn't enough to handle the optimization process.
> > Sometimes you have to restart the Solr process to have released more
> space
> > on the filesystem, couple of time solr didn't release all space.
> >
> >
> >
> > Il giorno gio 23 apr 2020 alle ore 10:23 Kayak28  >
> > ha scritto:
> >
> >> Hello, Community:
> >>
> >> I am currently using Solr 5.3.1. on CentOS.
> >> The other day, I faced an error message that shows
> >> " java.io.IOException: No space left on device"
> >>
> >> My disk for Solr has empty space about 35GB
> >> and the total amount of the disk is 581GB.
> >>
> >> I doubted there was no enough space for Linux inode,
> >> but inode still has spaces. (IUse was 1% )
> >>
> >> One of my Solr cores has 47GB of indexes.
> >>
> >> Is there a possibility that the error happens when I do forceMerge on
> the
> >> big core
> >> (which I believe optimize needs temporarily 3 times spaces of
> index-size)?
> >> Or is there any other possibility to cause the error?
> >>
> >>
> >> Any Clues are very helpful.
> >>
> >> --
> >>
> >> Sincerely,
> >> Kaya
> >> github: https://github.com/28kayak
> >>
> >
> >
> > --
> >
> > Dario Rigolin
> > Comperio srl - CTO
> > Mobile: +39 347 7232652 - Office: +39 0425 471482
> > Skype: dario.rigolin
>
>

-- 

Sincerely,
Kaya
github: https://github.com/28kayak
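Erick's rule of thumb above — keep at least as much free disk space as the aggregate size of all cores/replicas on the machine — is easy to monitor with a small script. A sketch; the default factor of 1.0 mirrors that advice, and raising it (e.g. to 3 for forceMerge) is an assumption you should tune to your own merge behaviour:

```python
import os
import shutil

def merge_headroom_ok(index_dirs, factor=1.0):
    """Return True if free disk space is at least `factor` times the
    combined on-disk size of the given index directories."""
    total = 0
    for d in index_dirs:
        for root, _, files in os.walk(d):
            total += sum(os.path.getsize(os.path.join(root, f)) for f in files)
    free = shutil.disk_usage(index_dirs[0]).free
    return free >= factor * total
```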


Re: "Error creating core. No system property or default value specified for X"

2020-04-24 Thread Erick Erickson
Screenshots do not come through, the mail server pretty aggressively strips 
them.

Ok, this is weird. There’s no way you should be getting results from an index 
that
has only a write.lock file. So the fact that you were suggests that you are 
looking
at one thing and Solr is looking at another. How, is a puzzle. One hand-wavy
possibility is that you didn't restart the Solr server (or reload the core) 
after
deleting the data dir and the index files were still being held by open index
searchers, assuming a *nix system and assuming no new searchers were
opened.

As for the jdbc issue, I’m pretty sure core.properties is irrelevant here. These
definitions are usually either sysvars at runtime (i.e. -Dsomething=whatever)
or defined in the solr.in.sh file (again *nix).

I suspect that you are looking at one place and Solr is looking at another,
 i.e. SOLR_HOME, dataDir or other path is defined differently from
where you’re looking.

As to why things ran between Wednesday and now, the only thing I can come up
with is that no indexing was going on and/or the system restarted. Either one 
should
have surfaced the problem earlier.

> On Apr 24, 2020, at 10:01 AM, Teresa McMains  
> wrote:
> 
> Some events as background.
>  
> I have made some changes to schema.xml to define a new field type and have a 
> few string fields use this field type.
> 
> I reloaded the core.
>  
> Upon suggestion from this group, rather than just re-indexing (because it 
> might not refresh all segments appropriately), I deleted the “data” directory 
> for the core and then re-indexed.
> The new data directory contained an “index” directory with a single file 
> called write.lock and nothing else. There was no tlog directory (which seems 
> notable because we used to have one).
>  
> But things were behaving OK – that is searches worked and returned documents 
> -- so I didn’t think much of it.
>  
> This was Wednesday.
>  
> Today, we received errors that the core didn’t exist.  When I connected to 
> the admin interface, I had the error “Error creating core. No system property 
> or default value specified for jdbcdir”.
>  
> Now, our solrconfig.xml does refer to “${jdbcdir}” and core.properties defines 
> this setting as a path and that path exists.  These config files all exist 
> and haven’t been touched in several months. The permissions have not been 
> changed.
> I replaced “${jdbcdir}” with its value from core.properties. Received the 
> same error about two other settings which I also replaced with hard-coded 
> values which are defined in core.properties and I am now able to load the 
> core.
>  
> My question is – what on earth just happened?
>  
> Once the core was reloaded, I re-indexed our data which completed in seconds 
> (usually about 45 minutes) and searches returned no matches (0 documents).
>  
> Then, I replaced the basically empty data directory with a backup I made of 
> the contents as of Wednesday and re-indexed and searches are working again 
> (they return documents).  After doing this, I tried to put back the original 
> solrconfig.xml but it still errored out with the same errors about missing 
> default or system properties.
>  
> Screenshot of a section of the original solrconfig.xml:
>  
> 
>  
> Screenshot of the core.properties:
>  
> 
>  
> Any idea what happened and how I can go back to using my original 
> solrconfig.xml which does not have hard-coded paths and parameters??
> Also any thoughts on what went wrong with the re-index and why it never 
> created any segment files in the index directory?
> Are these two things related?
>  
> Many thanks,
> Teresa
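One more option for the `${jdbcdir}` substitution problem discussed above: Solr's property expansion accepts an inline default with the `${property:default}` syntax, so a core can still load when the property is not supplied via core.properties or -D flags. A hypothetical fragment — the element and paths are illustrative only, since the original solrconfig.xml screenshots were stripped by the list:

```xml
<!-- Illustrative only: fall back to a fixed path when the jdbcdir
     property is not defined in core.properties or as a -D system property -->
<str name="jdbcDriverDir">${jdbcdir:/opt/solr/jdbc-drivers}</str>
```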



"Error creating core. No system property or default value specified for X"

2020-04-24 Thread Teresa McMains
Some events as background.

I have made some changes to schema.xml to define a new field type and have a 
few string fields use this field type.
[cid:image001.png@01D61A1C.F1D06060]
I reloaded the core.

Upon suggestion from this group, rather than just re-indexing (because it might 
not refresh all segments appropriately), I deleted the "data" directory for the 
core and then re-indexed.
The new data directory contained an "index" directory with a single file called 
write.lock and nothing else. There was no tlog directory (which seems notable 
because we used to have one).

But things were behaving OK - that is searches worked and returned documents -- 
so I didn't think much of it.

This was Wednesday.

Today, we received errors that the core didn't exist.  When I connected to the 
admin interface, I had the error "Error creating core. No system property or 
default value specified for jdbcdir".

Now, our solrconfig.xml does refer to "${jdbcdir}" and core.properties defines 
this setting as a path and that path exists.  These config files all exist and 
haven't been touched in several months. The permissions have not been changed.
I replaced "${jdbcdir}" with its value from core.properties. Received the same 
error about two other settings which I also replaced with hard-coded values 
which are defined in core.properties and I am now able to load the core.

My question is - what on earth just happened?

Once the core was reloaded, I re-indexed our data which completed in seconds 
(usually about 45 minutes) and searches returned no matches (0 documents).

Then, I replaced the basically empty data directory with a backup I made of the 
contents as of Wednesday and re-indexed and searches are working again (they 
return documents).  After doing this, I tried to put back the original 
solrconfig.xml but it still errored out with the same errors about missing 
default or system properties.

Screenshot of a section of the original solrconfig.xml:

[cid:image005.png@01D61A1F.468380E0]

Screenshot of the core.properties:

[cid:image006.png@01D61A1F.468380E0]

Any idea what happened and how I can go back to using my original 
solrconfig.xml which does not have hard-coded paths and parameters??
Also any thoughts on what went wrong with the re-index and why it never created 
any segment files in the index directory?
Are these two things related?

Many thanks,
Teresa


Re: Cause of java.io.IOException: No space left on device Error

2020-04-23 Thread Erick Erickson
In addition to what Dario mentioned, background merges
happen all the time, optimize is just a special case (and
very expensive). 

You say “one of my Solr cores has 47G”, but segment merging
can easily occur on multiple cores at once, so that’s not
definitive.

We usually recommend that people have at least as much free
space on disk as the aggregate of _all_ the cores/replicas on
a physical machine.

More than you may want to know about segment merging:
https://lucidworks.com/post/segment-merging-deleted-documents-optimize-may-bad/

Best,
Erick

> On Apr 23, 2020, at 4:38 AM, Dario Rigolin  wrote:
> 
> When solr starts an optimization of the index you have to have free at
> least same size (I don't know if 3 times is correct) of the core you are
> optimizing.
> Maybe your free space isn't enough to handle the optimization process.
> Sometimes you have to restart the Solr process to have released more space
> on the filesystem, couple of time solr didn't release all space.
> 
> 
> 
> Il giorno gio 23 apr 2020 alle ore 10:23 Kayak28 
> ha scritto:
> 
>> Hello, Community:
>> 
>> I am currently using Solr 5.3.1. on CentOS.
>> The other day, I faced an error message that shows
>> " java.io.IOException: No space left on device"
>> 
>> My disk for Solr has empty space about 35GB
>> and the total amount of the disk is 581GB.
>> 
>> I doubted there was no enough space for Linux inode,
>> but inode still has spaces. (IUse was 1% )
>> 
>> One of my Solr cores has 47GB of indexes.
>> 
>> Is there a possibility that the error happens when I do forceMerge on the
>> big core
>> (which I believe optimize needs temporarily 3 times spaces of index-size)?
>> Or is there any other possibility to cause the error?
>> 
>> 
>> Any Clues are very helpful.
>> 
>> --
>> 
>> Sincerely,
>> Kaya
>> github: https://github.com/28kayak
>> 
> 
> 
> -- 
> 
> Dario Rigolin
> Comperio srl - CTO
> Mobile: +39 347 7232652 - Office: +39 0425 471482
> Skype: dario.rigolin



Re: Cause of java.io.IOException: No space left on device Error

2020-04-23 Thread Dario Rigolin
When Solr starts an optimization of the index, you need free space at least
equal to the size of the core you are optimizing (I don't know if 3 times is
correct).
Maybe your free space isn't enough to handle the optimization process.
Sometimes you have to restart the Solr process to release more space
on the filesystem; a couple of times Solr didn't release all the space.



Il giorno gio 23 apr 2020 alle ore 10:23 Kayak28 
ha scritto:

> Hello, Community:
>
> I am currently using Solr 5.3.1. on CentOS.
> The other day, I faced an error message that shows
> " java.io.IOException: No space left on device"
>
> My disk for Solr has empty space about 35GB
> and the total amount of the disk is 581GB.
>
> I doubted there was no enough space for Linux inode,
> but inode still has spaces. (IUse was 1% )
>
> One of my Solr cores has 47GB of indexes.
>
> Is there a possibility that the error happens when I do forceMerge on the
> big core
> (which I believe optimize needs temporarily 3 times spaces of index-size)?
> Or is there any other possibility to cause the error?
>
>
> Any Clues are very helpful.
>
> --
>
> Sincerely,
> Kaya
> github: https://github.com/28kayak
>


-- 

Dario Rigolin
Comperio srl - CTO
Mobile: +39 347 7232652 - Office: +39 0425 471482
Skype: dario.rigolin


Cause of java.io.IOException: No space left on device Error

2020-04-23 Thread Kayak28
Hello, Community:

I am currently using Solr 5.3.1 on CentOS.
The other day, I faced an error message that says
"java.io.IOException: No space left on device"

The disk for Solr has about 35GB of free space,
and the total size of the disk is 581GB.

I suspected there was not enough space for Linux inodes,
but the inode table still has free entries. (IUse% was 1%)

One of my Solr cores has 47GB of indexes.

Is it possible that the error happens when I do a forceMerge on the
big core
(which I believe optimize temporarily needs up to 3 times the index size)?
Or is there any other possible cause of the error?


Any clues would be very helpful.

-- 

Sincerely,
Kaya
github: https://github.com/28kayak


Re: "SolrCore Initialization Failures" error message appears briefly in Solr 8.5.1 Admin UI

2020-04-21 Thread Colvin Cowie
From a (very) brief googling it seems like using the ng-cloak attribute is
the right way to fix this, and it certainly seems to work for me.
https://issues.apache.org/jira/browse/SOLR-14422

On Mon, 20 Apr 2020 at 18:12, Colvin Cowie 
wrote:

> Sorry if this has already been raised, but I didn't see it.
>
> When loading / refreshing the Admin UI in 8.5.1, it briefly but *visibly*
> shows a placeholder for the "SolrCore Initialization Failures" error
> message, with a lot of redness. It looks like there is a real problem.
> Obviously the message then disappears, and it can be ignored.
> However, if I was a first time user, it would not give me confidence that
> everything is okay. In a way, an error message that appears briefly then
> disappears before I can finish reading it is worse than one which just
> stays there.
>
> Here's a screenshot of what I mean
> https://drive.google.com/open?id=1eK4HNprEuEua08_UwtEoDQuRwFgqbGjU
> and a gif:
> https://drive.google.com/open?id=1Rw3z03MzAqFpfZFU4uVv4G158vk66QVx
>
> I assume that this is connected to the UI updates discussed in
> https://issues.apache.org/jira/browse/SOLR-14359
>
> Cheers,
> Colvin
>


"SolrCore Initialization Failures" error message appears briefly in Solr 8.5.1 Admin UI

2020-04-20 Thread Colvin Cowie
Sorry if this has already been raised, but I didn't see it.

When loading / refreshing the Admin UI in 8.5.1, it briefly but *visibly*
shows a placeholder for the "SolrCore Initialization Failures" error
message, with a lot of redness. It looks like there is a real problem.
Obviously the message then disappears, and it can be ignored.
However, if I was a first time user, it would not give me confidence that
everything is okay. In a way, an error message that appears briefly then
disappears before I can finish reading it is worse than one which just
stays there.

Here's a screenshot of what I mean
https://drive.google.com/open?id=1eK4HNprEuEua08_UwtEoDQuRwFgqbGjU
and a gif:
https://drive.google.com/open?id=1Rw3z03MzAqFpfZFU4uVv4G158vk66QVx

I assume that this is connected to the UI updates discussed in
https://issues.apache.org/jira/browse/SOLR-14359

Cheers,
Colvin


Re: expand=true throws error

2020-03-31 Thread Szűcs Roland
Hi Munendra,

Yes, indeed it was the problem. Thank you very much your help. Expand is
just a pure parameter. Now it is working.

Thanks,
Roland

On Tue, Mar 31, 2020 at 5:22 AM, Munendra S N 
wrote:

> > Case 3 let;s extend it with expand=true:
> > { "responseHeader":{ "status":0, "QTime":1, "params":{
> > "q":"author:\"William
> > Shakespeare\"", "fq":"{!collapse field=title}&expand=true", "_":
> > "1585603593269"}},
> >
> I think it is because, expand=true parameter is not passed properly. As you
> can see from the params in the responseHeader section, q , fq are separate
> keys but expand=true is appended to fq value.
>
> If passed correctly, it should look something like this
>
> > { "responseHeader":{ "status":0, "QTime":1, "params":{
> > "q":"author:\"William
> > Shakespeare\"", "fq":"{!collapse field=title}", "expand": "true", "_":
> > "1585603593269"}},
> >
>
> Regards,
> Munendra S N
>
>
>
> On Tue, Mar 31, 2020 at 3:07 AM Szűcs Roland 
> wrote:
>
> > Hi Munendra,
> > Let's see the 3 scenario:
> > 1. Query without collapse
> > 2. Query with collapse
> > 3. Query with collapse and expand
> > I made a mini book database for this:
> > Case 1:
> > { "responseHeader":{ "status":0, "QTime":0, "params":{
> > "q":"author:\"William
> > Shakespeare\"", "_":"1585603593269"}},
> "response":{"numFound":4,"start":0,"
> > docs":[ { "id":"1", "author":"William Shakespeare", "title":"The Taming
> of
> > the Shrew", "format":"ebook", "_version_":1662625767773700096}, {
> "id":"2",
> > "author":"William Shakespeare", "title":"The Taming of the Shrew",
> > "format":
> > "paper", "_version_":1662625790857052160}, { "id":"3", "author":"William
> > Shakespeare", "title":"The Taming of the Shrew", "format":"audiobook", "
> > _version_":1662625809553162240}, { "id":"4", "author":"William
> > Shakespeare",
> > "title":"Much Ado about Nothing", "format":"paper", "_version_":
> > 1662625868323749888}] }}
> > As you can see there are 3 different format from the same book.
> >
> > Case 2:
> > { "responseHeader":{ "status":0, "QTime":2, "params":{
> > "q":"author:\"William
> > Shakespeare\"", "fq":"{!collapse field=title}", "_":"1585603593269"}}, "
> > response":{"numFound":2,"start":0,"docs":[ { "id":"1", "author":"William
> > Shakespeare", "title":"The Taming of the Shrew", "format":"ebook", "
> > _version_":1662625767773700096}, { "id":"4", "author":"William
> > Shakespeare",
> > "title":"Much Ado about Nothing", "format":"paper", "_version_":
> > 1662625868323749888}] }}
> > Collapse post filter worked as I expected.
> > Case 3 let;s extend it with expand=true:
> > { "responseHeader":{ "status":0, "QTime":1, "params":{
> > "q":"author:\"William
> > Shakespeare\"", "fq":"{!collapse field=title}&expand=true", "_":
> > "1585603593269"}}, "response":{"numFound":2,"start":0,"docs":[ {
> "id":"1",
> > "
> > author":"William Shakespeare", "title":"The Taming of the Shrew",
> "format":
> > "ebook", "_version_":1662625767773700096}, { "id":"4", "author":"William
> > Shakespeare", "title":"Much Ado about Nothing", "format":"paper",
> > "_version_
> > ":1662625868323749888}] }}
> >
> > As you can see nothing as changed. There is no additional section of the
> > response.
> >
> > Cheers,
> > Roland
> >
> > Munendra S N  ezt írta (időpont: 2020. márc.
> 30.,
> > H, 17:46):
> >
> > > Please share the complete request. Also, does number of results change
> > with
> > > & without collapse. Usually title would be unique every document. If
> that
> > > is  the case then, there won't be anything to expand right?
> > >
> > > On Mon, Mar 30, 2020, 8:22 PM Szűcs Roland <
> szucs.rol...@bookandwalk.hu>
> > > wrote:
> > >
> > > > Hi Munendra,
> > > > I do not get error . The strange thing is that I get exactly the same
> > > > response with fq={!collapse field=title} versus  fq={!collapse
> > > > field=title}&expand=true.
> > > > Collapse works properly as a standalone fq but expand has no impact.
> > How
> > > > can I have access to the "hidden" documents then?
> > > >
> > > > Roland
> > > >
> > > > Munendra S N  ezt írta (időpont: 2020.
> márc.
> > > 30.,
> > > > H, 16:47):
> > > >
> > > > > Hey,
> > > > > Could you please share the stacktrace or error message you
> received?
> > > > >
> > > > > On Mon, Mar 30, 2020, 7:58 PM Szűcs Roland <
> > > szucs.rol...@bookandwalk.hu>
> > > > > wrote:
> > > > >
> > > > > > Hi All,
> > > > > >
> > > > > > I manage to use edismax queryparser in solr 8.4.1 with collapse
> > > without
> > > > > any
> > > > > > problem. I tested it with the SOLR admin GUI. So fq={!collapse
> > > > > field=title}
> > > > > > worked fine.
> > > > > >
> > > > > > As soon as I use the example from the documentation and use:
> > > > > fq={!collapse
> > > > > > field=title}&expand=true, I did not get back any additional
> output
> > > with
> > > > > > section expanded.
> > > > > >
> > > > > > Any idea?
> > > > > >
> > > > > > Thanks in advance,
> > > > > > Roland
> > > > > >
> > > > >
> > > >
> > >
> >
>


Re: expand=true throws error

2020-03-30 Thread Munendra S N
> Case 3 let;s extend it with expand=true:
> { "responseHeader":{ "status":0, "QTime":1, "params":{
> "q":"author:\"William
> Shakespeare\"", "fq":"{!collapse field=title}&expand=true", "_":
> "1585603593269"}},
>
I think it is because the expand=true parameter is not passed properly. As you
can see from the params in the responseHeader section, q and fq are separate
keys, but expand=true is appended to the fq value.

If passed correctly, it should look something like this

> { "responseHeader":{ "status":0, "QTime":1, "params":{
> "q":"author:\"William
> Shakespeare\"", "fq":"{!collapse field=title}", "expand": "true", "_":
> "1585603593269"}},
>

Regards,
Munendra S N



On Tue, Mar 31, 2020 at 3:07 AM Szűcs Roland 
wrote:

> Hi Munendra,
> Let's see the 3 scenario:
> 1. Query without collapse
> 2. Query with collapse
> 3. Query with collapse and expand
> I made a mini book database for this:
> Case 1:
> { "responseHeader":{ "status":0, "QTime":0, "params":{
> "q":"author:\"William
> Shakespeare\"", "_":"1585603593269"}}, "response":{"numFound":4,"start":0,"
> docs":[ { "id":"1", "author":"William Shakespeare", "title":"The Taming of
> the Shrew", "format":"ebook", "_version_":1662625767773700096}, { "id":"2",
> "author":"William Shakespeare", "title":"The Taming of the Shrew",
> "format":
> "paper", "_version_":1662625790857052160}, { "id":"3", "author":"William
> Shakespeare", "title":"The Taming of the Shrew", "format":"audiobook", "
> _version_":1662625809553162240}, { "id":"4", "author":"William
> Shakespeare",
> "title":"Much Ado about Nothing", "format":"paper", "_version_":
> 1662625868323749888}] }}
> As you can see there are 3 different format from the same book.
>
> Case 2:
> { "responseHeader":{ "status":0, "QTime":2, "params":{
> "q":"author:\"William
> Shakespeare\"", "fq":"{!collapse field=title}", "_":"1585603593269"}}, "
> response":{"numFound":2,"start":0,"docs":[ { "id":"1", "author":"William
> Shakespeare", "title":"The Taming of the Shrew", "format":"ebook", "
> _version_":1662625767773700096}, { "id":"4", "author":"William
> Shakespeare",
> "title":"Much Ado about Nothing", "format":"paper", "_version_":
> 1662625868323749888}] }}
> Collapse post filter worked as I expected.
> Case 3 let;s extend it with expand=true:
> { "responseHeader":{ "status":0, "QTime":1, "params":{
> "q":"author:\"William
> Shakespeare\"", "fq":"{!collapse field=title}&expand=true", "_":
> "1585603593269"}}, "response":{"numFound":2,"start":0,"docs":[ { "id":"1",
> "
> author":"William Shakespeare", "title":"The Taming of the Shrew", "format":
> "ebook", "_version_":1662625767773700096}, { "id":"4", "author":"William
> Shakespeare", "title":"Much Ado about Nothing", "format":"paper",
> "_version_
> ":1662625868323749888}] }}
>
> As you can see nothing as changed. There is no additional section of the
> response.
>
> Cheers,
> Roland
>
> Munendra S N  ezt írta (időpont: 2020. márc. 30.,
> H, 17:46):
>
> > Please share the complete request. Also, does number of results change
> with
> > & without collapse. Usually title would be unique every document. If that
> > is  the case then, there won't be anything to expand right?
> >
> > On Mon, Mar 30, 2020, 8:22 PM Szűcs Roland 
> > wrote:
> >
> > > Hi Munendra,
> > > I do not get error . The strange thing is that I get exactly the same
> > > response with fq={!collapse field=title} versus  fq={!collapse
> > > field=title}&expand=true.
> > > Collapse works properly as a standalone fq but expand has no impact.
> How
> > > can I have access to the "hidden" documents then?
> > >
> > > Roland
> > >
> > > Munendra S N  ezt írta (időpont: 2020. márc.
> > 30.,
> > > H, 16:47):
> > >
> > > > Hey,
> > > > Could you please share the stacktrace or error message you received?
> > > >
> > > > On Mon, Mar 30, 2020, 7:58 PM Szűcs Roland <
> > szucs.rol...@bookandwalk.hu>
> > > > wrote:
> > > >
> > > > > Hi All,
> > > > >
> > > > > I manage to use edismax queryparser in solr 8.4.1 with collapse
> > without
> > > > any
> > > > > problem. I tested it with the SOLR admin GUI. So fq={!collapse
> > > > field=title}
> > > > > worked fine.
> > > > >
> > > > > As soon as I use the example from the documentation and use:
> > > > fq={!collapse
> > > > > field=title}&expand=true, I did not get back any additional output
> > with
> > > > > section expanded.
> > > > >
> > > > > Any idea?
> > > > >
> > > > > Thanks in advance,
> > > > > Roland
> > > > >
> > > >
> > >
> >
>
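
The difference Munendra describes is purely in how the request is encoded:
expand=true has to be its own key in the query string, not text inside the fq
value, where it gets percent-encoded and Solr never sees an expand parameter.
A sketch of the two encodings (host, port, and collection name are
placeholders):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class ExpandParamDemo {
    static String enc(String s) {
        return URLEncoder.encode(s, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String base = "http://localhost:8983/solr/books/select"; // placeholder URL
        // Wrong: "&expand=true" typed inside the fq box is encoded into the fq
        // value ("...%26expand%3Dtrue"), so no expand parameter reaches Solr.
        String wrong = base + "?q=" + enc("author:\"William Shakespeare\"")
                + "&fq=" + enc("{!collapse field=title}&expand=true");
        // Right: expand is a separate top-level parameter.
        String right = base + "?q=" + enc("author:\"William Shakespeare\"")
                + "&fq=" + enc("{!collapse field=title}")
                + "&expand=true";
        System.out.println(wrong);
        System.out.println(right);
    }
}
```

With the second form the response carries the extra "expanded" section; with
the first, Solr sees only a (strange) fq and the output matches plain collapse.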


Re: expand=true throws error

2020-03-30 Thread Szűcs Roland
Hi Munendra,
Let's look at the 3 scenarios:
1. Query without collapse
2. Query with collapse
3. Query with collapse and expand
I made a mini book database for this:
Case 1:
{ "responseHeader":{ "status":0, "QTime":0, "params":{
    "q":"author:\"William Shakespeare\"", "_":"1585603593269"}},
  "response":{"numFound":4,"start":0,"docs":[
    { "id":"1", "author":"William Shakespeare", "title":"The Taming of the Shrew", "format":"ebook", "_version_":1662625767773700096},
    { "id":"2", "author":"William Shakespeare", "title":"The Taming of the Shrew", "format":"paper", "_version_":1662625790857052160},
    { "id":"3", "author":"William Shakespeare", "title":"The Taming of the Shrew", "format":"audiobook", "_version_":1662625809553162240},
    { "id":"4", "author":"William Shakespeare", "title":"Much Ado about Nothing", "format":"paper", "_version_":1662625868323749888}] }}
As you can see, there are 3 different formats of the same book.

Case 2:
{ "responseHeader":{ "status":0, "QTime":2, "params":{
    "q":"author:\"William Shakespeare\"", "fq":"{!collapse field=title}", "_":"1585603593269"}},
  "response":{"numFound":2,"start":0,"docs":[
    { "id":"1", "author":"William Shakespeare", "title":"The Taming of the Shrew", "format":"ebook", "_version_":1662625767773700096},
    { "id":"4", "author":"William Shakespeare", "title":"Much Ado about Nothing", "format":"paper", "_version_":1662625868323749888}] }}
The collapse post filter worked as I expected.
Case 3, let's extend it with expand=true:
{ "responseHeader":{ "status":0, "QTime":1, "params":{
    "q":"author:\"William Shakespeare\"", "fq":"{!collapse field=title}&expand=true", "_":"1585603593269"}},
  "response":{"numFound":2,"start":0,"docs":[
    { "id":"1", "author":"William Shakespeare", "title":"The Taming of the Shrew", "format":"ebook", "_version_":1662625767773700096},
    { "id":"4", "author":"William Shakespeare", "title":"Much Ado about Nothing", "format":"paper", "_version_":1662625868323749888}] }}

As you can see, nothing has changed. There is no additional section in the
response.

Cheers,
Roland

On Mon, Mar 30, 2020 at 5:46 PM, Munendra S N 
wrote:

> Please share the complete request. Also, does number of results change with
> & without collapse. Usually title would be unique every document. If that
> is  the case then, there won't be anything to expand right?
>
> On Mon, Mar 30, 2020, 8:22 PM Szűcs Roland 
> wrote:
>
> > Hi Munendra,
> > I do not get error . The strange thing is that I get exactly the same
> > response with fq={!collapse field=title} versus  fq={!collapse
> > field=title}&expand=true.
> > Collapse works properly as a standalone fq but expand has no impact. How
> > can I have access to the "hidden" documents then?
> >
> > Roland
> >
> > Munendra S N  ezt írta (időpont: 2020. márc.
> 30.,
> > H, 16:47):
> >
> > > Hey,
> > > Could you please share the stacktrace or error message you received?
> > >
> > > On Mon, Mar 30, 2020, 7:58 PM Szűcs Roland <
> szucs.rol...@bookandwalk.hu>
> > > wrote:
> > >
> > > > Hi All,
> > > >
> > > > I manage to use edismax queryparser in solr 8.4.1 with collapse
> without
> > > any
> > > > problem. I tested it with the SOLR admin GUI. So fq={!collapse
> > > field=title}
> > > > worked fine.
> > > >
> > > > As soon as I use the example from the documentation and use:
> > > fq={!collapse
> > > > field=title}&expand=true, I did not get back any additional output
> with
> > > > section expanded.
> > > >
> > > > Any idea?
> > > >
> > > > Thanks in advance,
> > > > Roland
> > > >
> > >
> >
>


Re: expand=true throws error

2020-03-30 Thread Munendra S N
Please share the complete request. Also, does the number of results change
with and without collapse? Usually the title would be unique for every
document. If that is the case, there won't be anything to expand, right?

On Mon, Mar 30, 2020, 8:22 PM Szűcs Roland 
wrote:

> Hi Munendra,
> I do not get error . The strange thing is that I get exactly the same
> response with fq={!collapse field=title} versus  fq={!collapse
> field=title}&expand=true.
> Collapse works properly as a standalone fq but expand has no impact. How
> can I have access to the "hidden" documents then?
>
> Roland
>
> Munendra S N  ezt írta (időpont: 2020. márc. 30.,
> H, 16:47):
>
> > Hey,
> > Could you please share the stacktrace or error message you received?
> >
> > On Mon, Mar 30, 2020, 7:58 PM Szűcs Roland 
> > wrote:
> >
> > > Hi All,
> > >
> > > I manage to use edismax queryparser in solr 8.4.1 with collapse without
> > any
> > > problem. I tested it with the SOLR admin GUI. So fq={!collapse
> > field=title}
> > > worked fine.
> > >
> > > As soon as I use the example from the documentation and use:
> > fq={!collapse
> > > field=title}&expand=true, I did not get back any additional output with
> > > section expanded.
> > >
> > > Any idea?
> > >
> > > Thanks in advance,
> > > Roland
> > >
> >
>


Re: expand=true throws error

2020-03-30 Thread Szűcs Roland
Hi Munendra,
I do not get an error. The strange thing is that I get exactly the same
response with fq={!collapse field=title} versus fq={!collapse
field=title}&expand=true.
Collapse works properly as a standalone fq, but expand has no effect. How
can I get access to the "hidden" documents then?

Roland

On Mon, Mar 30, 2020 at 4:47 PM, Munendra S N 
wrote:

> Hey,
> Could you please share the stacktrace or error message you received?
>
> On Mon, Mar 30, 2020, 7:58 PM Szűcs Roland 
> wrote:
>
> > Hi All,
> >
> > I manage to use edismax queryparser in solr 8.4.1 with collapse without
> any
> > problem. I tested it with the SOLR admin GUI. So fq={!collapse
> field=title}
> > worked fine.
> >
> > As soon as I use the example from the documentation and use:
> fq={!collapse
> > field=title}&expand=true, I did not get back any additional output with
> > section expanded.
> >
> > Any idea?
> >
> > Thanks in advance,
> > Roland
> >
>


Re: expand=true throws error

2020-03-30 Thread Munendra S N
Hey,
Could you please share the stacktrace or error message you received?

On Mon, Mar 30, 2020, 7:58 PM Szűcs Roland 
wrote:

> Hi All,
>
> I manage to use edismax queryparser in solr 8.4.1 with collapse without any
> problem. I tested it with the SOLR admin GUI. So fq={!collapse field=title}
> worked fine.
>
> As soon as I use the example from the documentation and use:  fq={!collapse
> field=title}&expand=true, I did not get back any additional output with
> section expanded.
>
> Any idea?
>
> Thanks in advance,
> Roland
>


expand=true throws error

2020-03-30 Thread Szűcs Roland
Hi All,

I managed to use the edismax query parser in Solr 8.4.1 with collapse without
any problem. I tested it with the Solr Admin UI. So fq={!collapse field=title}
worked fine.

As soon as I use the example from the documentation, i.e. fq={!collapse
field=title}&expand=true, I do not get back any additional output with
the expanded section.

Any idea?

Thanks in advance,
Roland


RE: OutOfMemory error solr 8.4.1

2020-03-09 Thread Srinivas Kashyap
Hi Erick,

Yes, you were right. In my custom jar I'm using HttpSolrClient as below:

HttpSolrClient client = new HttpSolrClient.Builder(
        "http://" + server + ":" + port + "/" + webapp + "/").build();
try {
    client.request(new QueryRequest(params), coreName);
} catch (SolrServerException e1) {
    e1.printStackTrace();
} catch (IOException e1) {
    e1.printStackTrace();
}

Strangely, I was not closing the connection (client.close()). The same code
worked without creating heaps of threads in version 5.2.1. After I added a
finally block and closed the connection, the thread count has stopped
growing.

finally {
    if (client != null) {
        try {
            client.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Thanks and Regards,
Srinivas Kashyap
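
Since HttpSolrClient implements Closeable, the try/finally pattern above can
also be written as try-with-resources, which closes the client even when the
request throws. A sketch using a stand-in client (so it runs without SolrJ on
the classpath; real code would wrap HttpSolrClient the same way):

```java
public class TryWithResourcesDemo {
    /** Stand-in for SolrJ's HttpSolrClient: closeable, and request() can throw. */
    static class FakeClient implements AutoCloseable {
        boolean closed = false;
        void request() { throw new RuntimeException("simulated SolrServerException"); }
        @Override public void close() { closed = true; }
    }

    public static void main(String[] args) {
        FakeClient client = new FakeClient();
        // try-with-resources: close() runs automatically even though request()
        // throws, so no per-request "Connection evictor" thread is left behind.
        try (client) {
            client.request();
        } catch (RuntimeException e) {
            // handle/log the failure; the client is already closed at this point
        }
        System.out.println(client.closed); // prints true
    }
}
```

The explicit finally block works too; try-with-resources just removes the
chance of forgetting it on some code path.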

From: Erick Erickson  
Sent: 09 March 2020 21:13
To: solr-user@lucene.apache.org
Subject: Re: OutOfMemory error solr 8.4.1

I’m 99% certain that something in your custom jar is the culprit, otherwise 
we’d have seen a _lot_ of these. TIMED_WAITING is usually just a listener 
thread, but they shouldn’t be generated when Solr is just sitting there. 

The first thing I’d do is dummy out my custom code or remove it completely and 
see. If you don’t have this thread explosion, then it’s pretty certainly your 
custom code.

Best,
Erick

> On Mar 9, 2020, at 01:29, Srinivas Kashyap  
> wrote:
> 
> Hi Erick,
> 
> I recompiled my custom code with 8.4.1 jars and placed back my jar in the lib 
> folder. Under Solr admin console/Thread Dump, I'm seeing a lot of below 
> threads which are in TIMED_WAITING stage.
> 
> Connection evictor (999)
>java.lang.Thread.sleep​(Native Method)
>
> org.apache.http.impl.client.IdleConnectionEvictor$1.run​(IdleConnectionEvictor.java:66)
>java.lang.Thread.run​(Thread.java:748)
> 
> It's been 15 minutes since I restarted the solr, and I believe already 999 
> threads have started?? And everytime I refresh the console, I'm seeing jump :
> 
> Connection evictor (1106)
> java.lang.Thread.sleep​(Native Method)
> org.apache.http.impl.client.IdleConnectionEvictor$1.run​(IdleConnectionEvictor.java:66)
> java.lang.Thread.run​(Thread.java:748)
> 
> Thanks and Regards,
> Srinivas Kashyap
> 
> -Original Message-
> From: Erick Erickson  
> Sent: 06 March 2020 21:34
> To: solr-user@lucene.apache.org
> Subject: Re: OutOfMemory error solr 8.4.1
> 
> I assume you recompiled the jar file? re-using the same one compiled against 
> 5x is unsupported, nobody will be able to help until you recompile.
> 
> Once you’ve done that, if you still have the problem you need to take a 
> thread dump to see if your custom code is leaking threads, that’s my number 
> one suspect.
> 
> Best,
> Erick
> 
>> On Mar 6, 2020, at 07:36, Srinivas Kashyap  
>> wrote:
>> 
>> Hi Erick,
>> 
>> We have custom code which are schedulers to run delta imports on our cores 
>> and I have added that custom code as a jar and I have placed it on 
>> server/solr-webapp/WEB-INF/lib. Basically we are fetching the JNDI 
>> datasource configured in the jetty.xml(Oracle) and creating connection 
>> object. And after that in the finally block we are closing it too.
>> 
>> Never faced this issue while we were in solr5.2.1 version though. The same 
>> jar was placed there too.
>> 
>> Thanks,
>> Srinivas
>> 
>> On 06-Mar-2020 8:55 pm, Erick Erickson  wrote:
>> This one can be a bit tricky. You’re not running out of overall memory, but 
>> you are running out of memory to allocate stacks. Which implies that, for 
>> some reason, you are creating a zillion threads. Do you have any custom code?
>> 
>> You can take a thread dump and see what your threads are doing, and you 
>> don’t need to wait until you see the error. If you take a thread dump my 
>> guess is you’ll see the number of threads increase over time. If that’s the 
>> case, and if you have no custom code running, we need to see the thread dump.
>> 
>> Best,
>> Erick
>> 
>>>> On Mar 6, 2020, at 05:54, Srinivas Kashyap 
>>>>  wrote:
>>> 
>>> Hi All,
>>> 
>>> I have recently upgraded solr to 8.4.1 and have installed solr as service 
>>> in linux machine. Once I start my service, it will be up for 15-18hours and 
>&

Re: OutOfMemory error solr 8.4.1

2020-03-09 Thread Erick Erickson
I’m 99% certain that something in your custom jar is the culprit; otherwise 
we’d have seen a _lot_ of these. TIMED_WAITING is usually just a listener 
thread, but such threads shouldn’t be generated when Solr is just sitting 
there. 

The first thing I’d do is dummy out my custom code or remove it completely and 
see. If you no longer have this thread explosion, then it’s almost certainly 
your custom code.

Best,
Erick

> On Mar 9, 2020, at 01:29, Srinivas Kashyap  
> wrote:
> 
> Hi Erick,
> 
> I recompiled my custom code with 8.4.1 jars and placed back my jar in the lib 
> folder. Under Solr admin console/Thread Dump, I'm seeing a lot of below 
> threads which are in TIMED_WAITING stage.
> 
> Connection evictor (999)
>java.lang.Thread.sleep​(Native Method)
>
> org.apache.http.impl.client.IdleConnectionEvictor$1.run​(IdleConnectionEvictor.java:66)
>java.lang.Thread.run​(Thread.java:748)
> 
> It's been 15 minutes since I restarted the solr, and I believe already 999 
> threads have started?? And everytime I refresh the console, I'm seeing jump :
> 
> Connection evictor (1106)
> java.lang.Thread.sleep​(Native Method)
> org.apache.http.impl.client.IdleConnectionEvictor$1.run​(IdleConnectionEvictor.java:66)
> java.lang.Thread.run​(Thread.java:748)
> 
> Thanks and Regards,
> Srinivas Kashyap
> 
> -Original Message-
> From: Erick Erickson  
> Sent: 06 March 2020 21:34
> To: solr-user@lucene.apache.org
> Subject: Re: OutOfMemory error solr 8.4.1
> 
> I assume you recompiled the jar file? re-using the same one compiled against 
> 5x is unsupported, nobody will be able to help until you recompile.
> 
> Once you’ve done that, if you still have the problem you need to take a 
> thread dump to see if your custom code is leaking threads, that’s my number 
> one suspect.
> 
> Best,
> Erick
> 
>> On Mar 6, 2020, at 07:36, Srinivas Kashyap  
>> wrote:
>> 
>> Hi Erick,
>> 
>> We have custom code which are schedulers to run delta imports on our cores 
>> and I have added that custom code as a jar and I have placed it on 
>> server/solr-webapp/WEB-INF/lib. Basically we are fetching the JNDI 
>> datasource configured in the jetty.xml(Oracle) and creating connection 
>> object. And after that in the finally block we are closing it too.
>> 
>> Never faced this issue while we were in solr5.2.1 version though. The same 
>> jar was placed there too.
>> 
>> Thanks,
>> Srinivas
>> 
>> On 06-Mar-2020 8:55 pm, Erick Erickson  wrote:
>> This one can be a bit tricky. You’re not running out of overall memory, but 
>> you are running out of memory to allocate stacks. Which implies that, for 
>> some reason, you are creating a zillion threads. Do you have any custom code?
>> 
>> You can take a thread dump and see what your threads are doing, and you 
>> don’t need to wait until you see the error. If you take a thread dump my 
>> guess is you’ll see the number of threads increase over time. If that’s the 
>> case, and if you have no custom code running, we need to see the thread dump.
>> 
>> Best,
>> Erick
>> 
>>>> On Mar 6, 2020, at 05:54, Srinivas Kashyap 
>>>>  wrote:
>>> 
>>> Hi All,
>>> 
>>> I have recently upgraded solr to 8.4.1 and have installed solr as service 
>>> in linux machine. Once I start my service, it will be up for 15-18hours and 
>>> suddenly stops without us shutting down. In solr.log I found below error. 
>>> Can somebody guide me what values should I be increasing in Linux machine?
>>> 
>>> Earlier, open file limit was not set and now I have increased. Below are my 
>>> system configuration for solr:
>>> 
>>> JVM memory: 8GB
>>> RAM: 32GB
>>> Open file descriptor count: 50
>>> 
>>> Ulimit -v - unlimited
>>> Ulimit -m - unlimited
>>> 
>>> 
>>> ERROR STACK TRACE:
>>> 
>>> 2020-03-06 12:08:03.071 ERROR (qtp1691185247-21) [   x:product] 
>>> o.a.s.s.HttpSolrCall null:java.lang.RuntimeException: 
>>> java.lang.OutOfMemoryError: unable to create new native thread
>>>  at 
>>> org.apache.solr.servlet.HttpSolrCall.sendError(HttpSolrCall.java:752)
>>>  at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:603)
>>>  at 
>>> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)
>>>  at 
>>> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)
>>>  at 
>>> org.eclipse.jetty.servlet.Servlet

RE: OutOfMemory error solr 8.4.1

2020-03-09 Thread Srinivas Kashyap
Hi Erick,

I recompiled my custom code with the 8.4.1 jars and placed my jar back in the 
lib folder. Under the Solr admin console's Thread Dump, I'm seeing a lot of 
threads like the one below, which are in the TIMED_WAITING state.

Connection evictor (999)
java.lang.Thread.sleep​(Native Method)

org.apache.http.impl.client.IdleConnectionEvictor$1.run​(IdleConnectionEvictor.java:66)
java.lang.Thread.run​(Thread.java:748)

It's been 15 minutes since I restarted Solr, and already 999 of these threads 
have been started. Every time I refresh the console, I see the count jump:

Connection evictor (1106)
java.lang.Thread.sleep​(Native Method)
org.apache.http.impl.client.IdleConnectionEvictor$1.run​(IdleConnectionEvictor.java:66)
java.lang.Thread.run​(Thread.java:748)

Thanks and Regards,
Srinivas Kashyap
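
The growth Srinivas describes can also be sampled programmatically instead of
by refreshing the admin console, using plain JMX (nothing Solr-specific; the
leaking threads are only simulated here):

```java
import java.lang.management.ManagementFactory;

public class ThreadGrowthProbe {
    /** Live JVM thread count, the same number the admin UI's Thread Dump page shows. */
    static int liveThreads() {
        return ManagementFactory.getThreadMXBean().getThreadCount();
    }

    public static void main(String[] args) throws InterruptedException {
        int before = liveThreads();
        // Simulate the leak: each un-closed HttpSolrClient leaves behind a
        // sleeping "Connection evictor" thread; here we start 5 stand-ins.
        for (int i = 0; i < 5; i++) {
            Thread t = new Thread(() -> {
                try { Thread.sleep(60_000); } catch (InterruptedException ignored) {}
            });
            t.setDaemon(true); // don't keep the JVM alive for the demo
            t.start();
        }
        Thread.sleep(200); // let the new threads register
        System.out.println("before=" + before + " after=" + liveThreads());
        // A delta that keeps rising between samples is the signature of a leak.
    }
}
```

Logging this count once a minute makes the "999 evictors after 15 minutes"
pattern visible without manual refreshing.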

-Original Message-
From: Erick Erickson  
Sent: 06 March 2020 21:34
To: solr-user@lucene.apache.org
Subject: Re: OutOfMemory error solr 8.4.1

I assume you recompiled the jar file? Re-using one compiled against 5.x is 
unsupported; nobody will be able to help until you recompile.

Once you’ve done that, if you still have the problem you need to take a thread 
dump to see if your custom code is leaking threads, that’s my number one 
suspect.

Best,
Erick

> On Mar 6, 2020, at 07:36, Srinivas Kashyap  
> wrote:
> 
> Hi Erick,
> 
Re: OutOfMemory error solr 8.4.1

2020-03-06 Thread Erick Erickson
I assume you recompiled the jar file? Re-using the same one compiled against 5.x 
is unsupported; nobody will be able to help until you recompile.

Once you’ve done that, if you still have the problem you need to take a thread 
dump to see if your custom code is leaking threads, that’s my number one 
suspect.

Best,
Erick
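
Erick's suggestion can be partly automated: capture two jstack dumps a few
minutes apart and diff the thread counts by name prefix. A hedged sketch — it
assumes the standard HotSpot jstack format, where each thread entry starts
with a double-quoted thread name:

```python
import re
from collections import Counter

def thread_names(dump_text: str) -> list[str]:
    # In a HotSpot jstack dump, each thread entry starts with a quoted name.
    return re.findall(r'^"([^"]+)"', dump_text, flags=re.MULTILINE)

def leak_report(earlier: str, later: str, top: int = 5):
    """Compare two dumps taken some time apart; a steadily growing
    name prefix (e.g. "qtp", "Thread") points at the leaking creator."""
    before, after = thread_names(earlier), thread_names(later)
    strip = lambda n: n.rstrip("0123456789-")  # fold "qtp-1", "qtp-2" into "qtp"
    growth = Counter(strip(n) for n in after)
    growth.subtract(strip(n) for n in before)
    return len(before), len(after), growth.most_common(top)
```

Feed it the contents of two files produced with `jstack <pid> > dump-N.txt`.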

> On Mar 6, 2020, at 07:36, Srinivas Kashyap  
> wrote:
> 
> Hi Erick,
> 
> We have custom code which are schedulers to run delta imports on our cores 
> and I have added that custom code as a jar and I have placed it on 
> server/solr-webapp/WEB-INF/lib. Basically we are fetching the JNDI datasource 
> configured in the jetty.xml(Oracle) and creating connection object. And after 
> that in the finally block we are closing it too.
> 
> Never faced this issue while we were in solr5.2.1 version though. The same 
> jar was placed there too.
> 
> Thanks,
> Srinivas
> 
> On 06-Mar-2020 8:55 pm, Erick Erickson  wrote:
> This one can be a bit tricky. You’re not running out of overall memory, but 
> you are running out of memory to allocate stacks. Which implies that, for 
> some reason, you are creating a zillion threads. Do you have any custom code?
> 
> You can take a thread dump and see what your threads are doing, and you don’t 
> need to wait until you see the error. If you take a thread dump my guess is 
> you’ll see the number of threads increase over time. If that’s the case, and 
> if you have no custom code running, we need to see the thread dump.
> 
> Best,
> Erick
> 
>> On Mar 6, 2020, at 05:54, Srinivas Kashyap  
>> wrote:
>> 
>> Hi All,
>> 
>> I have recently upgraded solr to 8.4.1 and have installed solr as service in 
>> linux machine. Once I start my service, it will be up for 15-18hours and 
>> suddenly stops without us shutting down. In solr.log I found below error. 
>> Can somebody guide me what values should I be increasing in Linux machine?
>> 
>> Earlier, open file limit was not set and now I have increased. Below are my 
>> system configuration for solr:
>> 
>> JVM memory: 8GB
>> RAM: 32GB
>> Open file descriptor count: 50
>> 
>> Ulimit -v - unlimited
>> Ulimit -m - unlimited
>> 
>> 
>> ERROR STACK TRACE:
>> 
>> 2020-03-06 12:08:03.071 ERROR (qtp1691185247-21) [   x:product] 
>> o.a.s.s.HttpSolrCall null:java.lang.RuntimeException: 
>> java.lang.OutOfMemoryError: unable to create new native thread
>>   at 
>> org.apache.solr.servlet.HttpSolrCall.sendError(HttpSolrCall.java:752)
>>   at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:603)
>>   at 
>> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)
>>   at 
>> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)
>>   at 
>> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)
>>   at 
>> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)
>>   at 
>> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
>>   at 
>> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
>>   at 
>> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
>>   at 
>> org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
>>   at 
>> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1711)
>>   at 
>> org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
>>   at 
>> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1347)
>>   at 
>> org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
>>   at 
>> org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
>>   at 
>> org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1678)
>>   at 
>> org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
>>   at 
>> org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1249)
>>   at 
>> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
>>   at 
>> org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)
>>   at 
>> org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:152)
>>   at 
>> org.eclipse.jetty.server.handler.HandlerWrapper.handle(Handler

Re: OutOfMemory error solr 8.4.1

2020-03-06 Thread Srinivas Kashyap
Hi Erick,

We have custom code, schedulers that run delta imports on our cores; I added it 
as a jar placed in server/solr-webapp/WEB-INF/lib. Basically, we fetch the JNDI 
datasource configured in jetty.xml (Oracle) and create a connection object, and 
afterwards we close it in the finally block.

We never faced this issue on Solr 5.2.1, even though the same jar was deployed 
there too.

Thanks,
Srinivas

On 06-Mar-2020 8:55 pm, Erick Erickson  wrote:
This one can be a bit tricky. You’re not running out of overall memory, but you 
are running out of memory to allocate stacks. Which implies that, for some 
reason, you are creating a zillion threads. Do you have any custom code?

You can take a thread dump and see what your threads are doing, and you don’t 
need to wait until you see the error. If you take a thread dump my guess is 
you’ll see the number of threads increase over time. If that’s the case, and if 
you have no custom code running, we need to see the thread dump.

Best,
Erick

> On Mar 6, 2020, at 05:54, Srinivas Kashyap  
> wrote:
>
> Hi All,
>
> I have recently upgraded solr to 8.4.1 and have installed solr as service in 
> linux machine. Once I start my service, it will be up for 15-18hours and 
> suddenly stops without us shutting down. In solr.log I found below error. Can 
> somebody guide me what values should I be increasing in Linux machine?
>
> Earlier, open file limit was not set and now I have increased. Below are my 
> system configuration for solr:
>
> JVM memory: 8GB
> RAM: 32GB
> Open file descriptor count: 50
>
> Ulimit -v - unlimited
> Ulimit -m - unlimited
>
>
> ERROR STACK TRACE:
>
> 2020-03-06 12:08:03.071 ERROR (qtp1691185247-21) [   x:product] 
> o.a.s.s.HttpSolrCall null:java.lang.RuntimeException: 
> java.lang.OutOfMemoryError: unable to create new native thread
>at 
> org.apache.solr.servlet.HttpSolrCall.sendError(HttpSolrCall.java:752)
>at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:603)
>at 
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)
>at 
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)
>at 
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)
>at 
> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)
>at 
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
>at 
> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
>at 
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
>at 
> org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
>at 
> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1711)
>at 
> org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
>at 
> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1347)
>at 
> org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
>at 
> org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
>at 
> org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1678)
>at 
> org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
>at 
> org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1249)
>at 
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
>at 
> org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)
>at 
> org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:152)
>at 
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
>at 
> org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)
>at 
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
>at org.eclipse.jetty.server.Server.handle(Server.java:505)
>at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:370)
>at 
> org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)
>at 
> org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
>at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
>at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
>at 
> org.eclipse.jetty.uti

Re: OutOfMemory error solr 8.4.1

2020-03-06 Thread Erick Erickson
This one can be a bit tricky. You’re not running out of overall memory, but you 
are running out of memory to allocate stacks. Which implies that, for some 
reason, you are creating a zillion threads. Do you have any custom code?

You can take a thread dump and see what your threads are doing, and you don’t 
need to wait until you see the error. If you take a thread dump my guess is 
you’ll see the number of threads increase over time. If that’s the case, and if 
you have no custom code running, we need to see the thread dump.

Best,
Erick

> On Mar 6, 2020, at 05:54, Srinivas Kashyap  
> wrote:
> 
> Hi All,
> 
> I have recently upgraded solr to 8.4.1 and have installed solr as service in 
> linux machine. Once I start my service, it will be up for 15-18hours and 
> suddenly stops without us shutting down. In solr.log I found below error. Can 
> somebody guide me what values should I be increasing in Linux machine?
> 
> Earlier, open file limit was not set and now I have increased. Below are my 
> system configuration for solr:
> 
> JVM memory: 8GB
> RAM: 32GB
> Open file descriptor count: 50
> 
> Ulimit -v - unlimited
> Ulimit -m - unlimited
> 
> 
> ERROR STACK TRACE:
> 
> 2020-03-06 12:08:03.071 ERROR (qtp1691185247-21) [   x:product] 
> o.a.s.s.HttpSolrCall null:java.lang.RuntimeException: 
> java.lang.OutOfMemoryError: unable to create new native thread
>at 
> org.apache.solr.servlet.HttpSolrCall.sendError(HttpSolrCall.java:752)
>at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:603)
>at 
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)
>at 
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)
>at 
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)
>at 
> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)
>at 
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
>at 
> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
>at 
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
>at 
> org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
>at 
> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1711)
>at 
> org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
>at 
> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1347)
>at 
> org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
>at 
> org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
>at 
> org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1678)
>at 
> org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
>at 
> org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1249)
>at 
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
>at 
> org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)
>at 
> org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:152)
>at 
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
>at 
> org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)
>at 
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
>at org.eclipse.jetty.server.Server.handle(Server.java:505)
>at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:370)
>at 
> org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)
>at 
> org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
>at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
>at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
>at 
> org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
>at 
> org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
>at 
> org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
>at 
> org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
>at 
> org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
>at 
>

OutOfMemory error solr 8.4.1

2020-03-06 Thread Srinivas Kashyap
Hi All,

I recently upgraded Solr to 8.4.1 and installed it as a service on a Linux 
machine. After I start the service, it stays up for 15-18 hours and then stops 
on its own, without us shutting it down. In solr.log I found the error below. 
Can somebody guide me on which values I should increase on the Linux machine?

Earlier the open file limit was not set; I have now increased it. My system 
configuration for Solr:

JVM memory: 8GB
RAM: 32GB
Open file descriptor count: 50

ulimit -v: unlimited
ulimit -m: unlimited
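
A note on the limits above: "unable to create new native thread" is usually not 
a heap problem. The kernel refuses to clone another thread, most often because 
the process hit its "Max processes" limit (on Linux, threads count against it), 
not because of `ulimit -v`/`-m`. A Linux-only sketch for inspecting the running 
Solr process (finding the Solr pid is left out; pass it in as a string):

```python
import os

def soft_limit(name: str, pid: str = "self") -> str:
    """Read one soft limit (e.g. "Max processes") from /proc/<pid>/limits."""
    with open(f"/proc/{pid}/limits") as f:
        for line in f:
            if line.startswith(name):
                # value is the first field after the limit name
                return line[len(name):].split()[0]
    raise KeyError(name)

def thread_count(pid: str = "self") -> int:
    """Threads currently held by the process; watch this grow over time."""
    return len(os.listdir(f"/proc/{pid}/task"))
```

Compare `thread_count(pid)` sampled over an hour against 
`soft_limit("Max processes", pid)`: if the count climbs steadily toward the 
limit, something is leaking threads.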


ERROR STACK TRACE:

2020-03-06 12:08:03.071 ERROR (qtp1691185247-21) [   x:product] 
o.a.s.s.HttpSolrCall null:java.lang.RuntimeException: 
java.lang.OutOfMemoryError: unable to create new native thread
at org.apache.solr.servlet.HttpSolrCall.sendError(HttpSolrCall.java:752)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:603)
at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)
at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)
at 
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)
at 
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)
at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
at 
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
at 
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1711)
at 
org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
at 
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1347)
at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
at 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
at 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1678)
at 
org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
at 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1249)
at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
at 
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)
at 
org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:152)
at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at 
org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)
at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at org.eclipse.jetty.server.Server.handle(Server.java:505)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:370)
at 
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)
at 
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
at 
org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
at 
org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:781)
at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:917)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:717)
at 
org.apache.solr.handler.dataimport.DataImporter.runAsync(DataImporter.java:466)
at 
org.apache.solr.handler.dataimport.DataImportHandler.handleRequestBody(DataImportHandler.java:205)
at 
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:2596)
at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:799)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:578)
... 36 more




java.lang.OutOfMemoryError: unable to create new native thread
at
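
The causal frames (DataImporter.runAsync reached from the request handler) fit 
Erick's diagnosis: each import request spawns a fresh thread, and if imports 
hang — say, on the Oracle JNDI connection — while a scheduler keeps firing new 
ones, native threads accumulate until the OS refuses to create more. A 
language-agnostic sketch, in Python for brevity, of the guard such a scheduler 
needs; the class and names are hypothetical, not Solr API:

```python
import threading

class DeltaImportScheduler:
    """Skip a tick while the previous import is still in flight,
    instead of stacking one new thread per scheduled run."""

    def __init__(self, import_fn):
        self.import_fn = import_fn
        self._busy = threading.Lock()  # non-reentrant on purpose

    def tick(self) -> bool:
        if not self._busy.acquire(blocking=False):
            return False  # previous import still running: drop this run
        try:
            self.import_fn()
            return True
        finally:
            self._busy.release()
```

The equivalent guard in the Java scheduler (e.g. checking DIH's status before 
issuing another delta-import, or using a bounded executor) keeps the thread 
population flat even when individual imports stall.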

Re: Solr console showing error in 7 .7

2020-02-25 Thread Paras Lehana
Please post the full error, possibly with the stack trace (see the logs).

On Mon, 20 Jan 2020 at 22:29, Rajdeep Sahoo 
wrote:

> When reloading the solr console,it is showing some error in the console
> itself for some small amount of time.
> The error is error reloading/initialising the core.
>


-- 
Regards,

*Paras Lehana* [65871]
Development Engineer, *Auto-Suggest*,
IndiaMART InterMESH Ltd,

11th Floor, Tower 2, Assotech Business Cresterra,
Plot No. 22, Sector 135, Noida, Uttar Pradesh, India 201305

Mob.: +91-9560911996
Work: 0120-4056700 | Extn:
*11096*



SolrCloud Kerberos + SSL Internode Communication Error

2020-02-20 Thread Ecil Ec
Hi,

I have a Kerberos- and SSL-enabled SolrCloud cluster with 2 instances on
different machines. I didn't have any trouble with this setup, but when I
installed one more instance on one of the machines, Solr started throwing
"Unauthorized access" errors for internode communication.

My Environment:
2 Machines, 3 Solr instances:
Java: 1.8.0_241
Solr: 8.4.1
ZK: apache-zookeeper-3.5.7


*Internode exception is :*
2020-02-20 04:50:52.072 ERROR (qtp1226020905-16) [c:test2 s:shard2
r:core_node10 x:test2_shard2_replica_n9] o.a.s.h.RequestHandlerBase
org.apache.solr.update.processor.DistributedUpdateProcessor$DistributedUpdatesAsyncException:
Async exception during distributed update: Error from server at https://XX
-1.XXgx.internal.cloudapp.net:8983/solr/test2_shard1_replica_n2/:
Unauthorized access

Security configs

*SOLR_AUTH_TYPE*="kerberos"

SOLR_AUTHENTICATION_OPTS="-Djava.security.auth.login.config=/home/XX/solr-jaas.conf
-Dsolr.kerberos.cookie.domain=XX.XX.gx.internal.cloudapp.net
-Dsolr.kerberos.cookie.portaware=true -Dsolr.kerberos.principal=HTTP/XX-XX.
xx.gx.internal.cloudapp@yyy.com -Dsolr.kerberos.keytab=/home/XX/XX-XX.ketab
-Dsolr.kerberos.delegation.token.enabled=true"

My server clocks are in sync, and I tried -Dpkiauth.ttl=10 but it didn't
work.

Do you have any suggestions to solve this problem?

Regards,
Eyyub


Re: Re-creating deleted Managed Stopwords lists results in error

2020-02-17 Thread Walter Underwood
Make phrases into single tokens at indexing and query time. Let the engine do
the rest of the work.

For example, “subunits of the army” can become “subunitsofthearmy” or 
“subunits_of_the_army”.
We used patterns to choose phrases, so “word word”, “word glue word”, or “word 
glue glue word”
could become phrases.

Nutch did something like this, but used it for filtering down the candidates 
for matching,
then used regular Lucene scoring for ranking.

The Infoseek Ultra index used these phrase terms but did not store positions.

The idea came from early DNA search engines.

wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/  (my blog)

> On Feb 17, 2020, at 10:53 AM, David Hastings  
> wrote:
> 
> interesting, i cant seem to find anything on Phrase IDF, dont suppose you
> have a link or two i could look at by chance?
> 
> On Mon, Feb 17, 2020 at 1:48 PM Walter Underwood 
> wrote:
> 
>> At Infoseek, we used “glue words” to build phrase tokens. It was really
>> effective.
>> Phrase IDF is powerful stuff.
>> 
>> Luckily for you, the patent on that has expired. :-)
>> 
>> wunder
>> Walter Underwood
>> wun...@wunderwood.org
>> http://observer.wunderwood.org/  (my blog)
>> 
>>> On Feb 17, 2020, at 10:46 AM, David Hastings <
>> hastings.recurs...@gmail.com> wrote:
>>> 
>>> i use stop words for building shingles into "interesting phrases" for my
>>> machine teacher/students, so i wouldnt say theres no reason, however my
>> use
>>> case is very specific.  Otherwise yeah, theyre gone for all practical
>>> reasons/search scenarios.
>>> 
>>> On Mon, Feb 17, 2020 at 1:41 PM Walter Underwood 
>>> wrote:
>>> 
>>>> Why are you using stopwords? I would need a really, really good reason
>> to
>>>> use those.
>>>> 
>>>> Stopwords are an obsolete technique from 16-bit processors. I’ve never
>>>> used them and
>>>> I’ve been a search engineer since 1997.
>>>> 
>>>> wunder
>>>> Walter Underwood
>>>> wun...@wunderwood.org
>>>> http://observer.wunderwood.org/  (my blog)
>>>> 
>>>>> On Feb 17, 2020, at 7:31 AM, Thomas Corthals 
>>>> wrote:
>>>>> 
>>>>> Hi
>>>>> 
>>>>> I've run into an issue with creating a Managed Stopwords list that has
>>>> the
>>>>> same name as a previously deleted list. Going through the same flow
>> with
>>>>> Managed Synonyms doesn't result in this unexpected behaviour. Am I
>>>> missing
>>>>> something or did I discover a bug in Solr?
>>>>> 
>>>>> On a newly started solr with the techproducts core:
>>>>> 
>>>>> curl -X PUT -H 'Content-type:application/json' --data-binary
>>>>> 
>> '{"class":"org.apache.solr.rest.schema.analysis.ManagedWordSetResource"}'
>>>>> 
>>>> 
>> http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
>>>>> curl -X DELETE
>>>>> 
>>>> 
>> http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
>>>>> curl
>>>> http://localhost:8983/solr/admin/cores?action=RELOAD\&core=techproducts
>>>>> curl -X PUT -H 'Content-type:application/json' --data-binary
>>>>> 
>> '{"class":"org.apache.solr.rest.schema.analysis.ManagedWordSetResource"}'
>>>>> 
>>>> 
>> http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
>>>>> 
>>>>> The second PUT request results in a status 500 with error
>>>>> msg "java.util.LinkedHashMap cannot be cast to java.util.List".
>>>>> 
>>>>> Similar requests for synonyms work fine, no matter how many times I
>>>> repeat
>>>>> the CREATE/DELETE/RELOAD cycle:
>>>>> 
>>>>> curl -X PUT -H 'Content-type:application/json' --data-binary
>>>>> 
>>>> 
>> '{"class":"org.apache.solr.rest.schema.analysis.ManagedSynonymGraphFilterFactory$SynonymManager"}'
>>>>> 
>> http://localhost:8983/solr/techproducts/schema/analysis/synonyms/testmap
>>>>> curl -X DELETE
>>>>> 
>> http://localhost:8983/solr/techproducts/schema/analysis/synonyms/testmap
>>>>

Re: Re-creating deleted Managed Stopwords lists results in error

2020-02-17 Thread David Hastings
Interesting, I can't seem to find anything on phrase IDF. Don't suppose you
have a link or two I could look at, by chance?

On Mon, Feb 17, 2020 at 1:48 PM Walter Underwood 
wrote:

> At Infoseek, we used “glue words” to build phrase tokens. It was really
> effective.
> Phrase IDF is powerful stuff.
>
> Luckily for you, the patent on that has expired. :-)
>
> wunder
> Walter Underwood
> wun...@wunderwood.org
> http://observer.wunderwood.org/  (my blog)
>
> > On Feb 17, 2020, at 10:46 AM, David Hastings <
> hastings.recurs...@gmail.com> wrote:
> >
> > i use stop words for building shingles into "interesting phrases" for my
> > machine teacher/students, so i wouldnt say theres no reason, however my
> use
> > case is very specific.  Otherwise yeah, theyre gone for all practical
> > reasons/search scenarios.
> >
> > On Mon, Feb 17, 2020 at 1:41 PM Walter Underwood 
> > wrote:
> >
> >> Why are you using stopwords? I would need a really, really good reason
> to
> >> use those.
> >>
> >> Stopwords are an obsolete technique from 16-bit processors. I’ve never
> >> used them and
> >> I’ve been a search engineer since 1997.
> >>
> >> wunder
> >> Walter Underwood
> >> wun...@wunderwood.org
> >> http://observer.wunderwood.org/  (my blog)
> >>
> >>> On Feb 17, 2020, at 7:31 AM, Thomas Corthals 
> >> wrote:
> >>>
> >>> Hi
> >>>
> >>> I've run into an issue with creating a Managed Stopwords list that has
> >> the
> >>> same name as a previously deleted list. Going through the same flow
> with
> >>> Managed Synonyms doesn't result in this unexpected behaviour. Am I
> >> missing
> >>> something or did I discover a bug in Solr?
> >>>
> >>> On a newly started solr with the techproducts core:
> >>>
> >>> curl -X PUT -H 'Content-type:application/json' --data-binary
> >>>
> '{"class":"org.apache.solr.rest.schema.analysis.ManagedWordSetResource"}'
> >>>
> >>
> http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
> >>> curl -X DELETE
> >>>
> >>
> http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
> >>> curl
> >> http://localhost:8983/solr/admin/cores?action=RELOAD\&core=techproducts
> >>> curl -X PUT -H 'Content-type:application/json' --data-binary
> >>>
> '{"class":"org.apache.solr.rest.schema.analysis.ManagedWordSetResource"}'
> >>>
> >>
> http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
> >>>
> >>> The second PUT request results in a status 500 with error
> >>> msg "java.util.LinkedHashMap cannot be cast to java.util.List".
> >>>
> >>> Similar requests for synonyms work fine, no matter how many times I
> >> repeat
> >>> the CREATE/DELETE/RELOAD cycle:
> >>>
> >>> curl -X PUT -H 'Content-type:application/json' --data-binary
> >>>
> >>
> '{"class":"org.apache.solr.rest.schema.analysis.ManagedSynonymGraphFilterFactory$SynonymManager"}'
> >>>
> http://localhost:8983/solr/techproducts/schema/analysis/synonyms/testmap
> >>> curl -X DELETE
> >>>
> http://localhost:8983/solr/techproducts/schema/analysis/synonyms/testmap
> >>> curl
> >> http://localhost:8983/solr/admin/cores?action=RELOAD\&core=techproducts
> >>> curl -X PUT -H 'Content-type:application/json' --data-binary
> >>>
> >>
> '{"class":"org.apache.solr.rest.schema.analysis.ManagedSynonymGraphFilterFactory$SynonymManager"}'
> >>>
> http://localhost:8983/solr/techproducts/schema/analysis/synonyms/testmap
> >>>
> >>> Reloading after creating the Stopwords list but not after deleting it
> >> works
> >>> without error too on a fresh techproducts core (you'll have to remove
> the
> >>> directory from disk and create the core again after running the
> previous
> >>> commands).
> >>>
> >>> curl -X PUT -H 'Content-type:application/json' --data-binary
> >>>
> '{"class":"org.apache.solr.rest.schema.analysis.ManagedWordSetResource"}'
> >>>
> >>
> http://localhost:8983/solr/techprod

Re: Re-creating deleted Managed Stopwords lists results in error

2020-02-17 Thread Walter Underwood
At Infoseek, we used “glue words” to build phrase tokens. It was really 
effective.
Phrase IDF is powerful stuff.

Luckily for you, the patent on that has expired. :-)

wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/  (my blog)
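
The glue-word idea above can be sketched in a few lines: fold runs of 
"content glue... content" words into one underscore-joined token at both index 
and query time, so the whole phrase gets its own IDF. The glue list and the 
greedy joining rule below are assumptions for illustration, not Infoseek's 
actual implementation:

```python
GLUE = {"of", "the", "a", "an", "and", "for", "in"}  # assumed glue-word list

def phrase_tokens(tokens: list[str]) -> list[str]:
    """Greedily join content-GLUE...-content runs into single phrase tokens."""
    out, i = [], 0
    while i < len(tokens):
        if tokens[i] in GLUE:          # a leading glue word stays as-is
            out.append(tokens[i])
            i += 1
            continue
        phrase, j = [tokens[i]], i + 1
        while j < len(tokens):
            run = []
            while j < len(tokens) and tokens[j] in GLUE:
                run.append(tokens[j])
                j += 1
            if run and j < len(tokens):    # glue must be followed by content
                phrase.extend(run)
                phrase.append(tokens[j])
                j += 1
            else:
                j -= len(run)              # back off over trailing glue
                break
        out.append("_".join(phrase))
        i = j
    return out
```

`phrase_tokens("subunits of the army".split())` yields 
`["subunits_of_the_army"]`; running the same function on both the indexing and 
query sides keeps the tokens in agreement.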

> On Feb 17, 2020, at 10:46 AM, David Hastings  
> wrote:
> 
> i use stop words for building shingles into "interesting phrases" for my
> machine teacher/students, so i wouldnt say theres no reason, however my use
> case is very specific.  Otherwise yeah, theyre gone for all practical
> reasons/search scenarios.
> 
> On Mon, Feb 17, 2020 at 1:41 PM Walter Underwood 
> wrote:
> 
>> Why are you using stopwords? I would need a really, really good reason to
>> use those.
>> 
>> Stopwords are an obsolete technique from 16-bit processors. I’ve never
>> used them and
>> I’ve been a search engineer since 1997.
>> 
>> wunder
>> Walter Underwood
>> wun...@wunderwood.org
>> http://observer.wunderwood.org/  (my blog)
>> 
>>> On Feb 17, 2020, at 7:31 AM, Thomas Corthals wrote:
>>> [quoted message trimmed; the full original appears at the end of this thread]

Re: Re-creating deleted Managed Stopwords lists results in error

2020-02-17 Thread David Hastings
I use stop words for building shingles into "interesting phrases" for my
machine teacher/students, so I wouldn't say there's no reason; however, my
use case is very specific. Otherwise, yeah, they're gone for all practical
reasons/search scenarios.
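As an aside, the shingles-plus-stopwords idea can be sketched outside Solr. This is only an illustration of the general technique, not David's actual analysis chain; the stop word set and the `interesting_shingles` helper are invented for the example:

```python
# Sketch: use stop words to keep only "interesting" shingles. Build word
# n-grams (shingles), then drop any that begin or end with a stop word,
# since phrases like "of the" or "the quick" are rarely useful on their own.
STOPWORDS = {"the", "a", "an", "of", "for", "and", "to", "in"}

def interesting_shingles(text, size=2):
    """Return the word shingles of `text` that are bounded by content words."""
    tokens = text.lower().split()
    shingles = [
        " ".join(tokens[i : i + size]) for i in range(len(tokens) - size + 1)
    ]
    return [
        s for s in shingles
        if s.split()[0] not in STOPWORDS and s.split()[-1] not in STOPWORDS
    ]

# interesting_shingles("the quick brown fox of the forest")
# → ["quick brown", "brown fox"]
```

In Solr itself the analogous effect is usually achieved inside an analysis chain (tokenizer plus stop and shingle filters) rather than in application code.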

On Mon, Feb 17, 2020 at 1:41 PM Walter Underwood 
wrote:

> Why are you using stopwords? I would need a really, really good reason to
> use those.
>
> Stopwords are an obsolete technique from 16-bit processors. I’ve never
> used them and
> I’ve been a search engineer since 1997.
>
> wunder
> Walter Underwood
> wun...@wunderwood.org
> http://observer.wunderwood.org/  (my blog)
>
> > On Feb 17, 2020, at 7:31 AM, Thomas Corthals wrote:
> > [quoted message trimmed; the full original appears at the end of this thread]

Re: Re-creating deleted Managed Stopwords lists results in error

2020-02-17 Thread Walter Underwood
Why are you using stopwords? I would need a really, really good reason to use 
those.

Stopwords are an obsolete technique from 16-bit processors. I’ve never used 
them and
I’ve been a search engineer since 1997.

wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/  (my blog)

> On Feb 17, 2020, at 7:31 AM, Thomas Corthals wrote:
> [quoted message trimmed; the full original appears below]

Re-creating deleted Managed Stopwords lists results in error

2020-02-17 Thread Thomas Corthals
Hi

I've run into an issue with creating a Managed Stopwords list that has the
same name as a previously deleted list. Going through the same flow with
Managed Synonyms doesn't result in this unexpected behaviour. Am I missing
something or did I discover a bug in Solr?

On a newly started Solr with the techproducts core:

curl -X PUT -H 'Content-type:application/json' --data-binary
'{"class":"org.apache.solr.rest.schema.analysis.ManagedWordSetResource"}'
http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
curl -X DELETE
http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
curl http://localhost:8983/solr/admin/cores?action=RELOAD\&core=techproducts
curl -X PUT -H 'Content-type:application/json' --data-binary
'{"class":"org.apache.solr.rest.schema.analysis.ManagedWordSetResource"}'
http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist

The second PUT request results in a status 500 with error
msg "java.util.LinkedHashMap cannot be cast to java.util.List".

Similar requests for synonyms work fine, no matter how many times I repeat
the CREATE/DELETE/RELOAD cycle:

curl -X PUT -H 'Content-type:application/json' --data-binary
'{"class":"org.apache.solr.rest.schema.analysis.ManagedSynonymGraphFilterFactory$SynonymManager"}'
http://localhost:8983/solr/techproducts/schema/analysis/synonyms/testmap
curl -X DELETE
http://localhost:8983/solr/techproducts/schema/analysis/synonyms/testmap
curl http://localhost:8983/solr/admin/cores?action=RELOAD\&core=techproducts
curl -X PUT -H 'Content-type:application/json' --data-binary
'{"class":"org.apache.solr.rest.schema.analysis.ManagedSynonymGraphFilterFactory$SynonymManager"}'
http://localhost:8983/solr/techproducts/schema/analysis/synonyms/testmap

Reloading after creating the Stopwords list but not after deleting it works
without error too on a fresh techproducts core (you'll have to remove the
directory from disk and create the core again after running the previous
commands).

curl -X PUT -H 'Content-type:application/json' --data-binary
'{"class":"org.apache.solr.rest.schema.analysis.ManagedWordSetResource"}'
http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
curl http://localhost:8983/solr/admin/cores?action=RELOAD\&core=techproducts
curl -X DELETE
http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
curl -X PUT -H 'Content-type:application/json' --data-binary
'{"class":"org.apache.solr.rest.schema.analysis.ManagedWordSetResource"}'
http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist

And even curiouser, when doing a CREATE/DELETE for Stopwords, then a
CREATE/DELETE for Synonyms, and only then a RELOAD of the core, the cycle
can be completed twice. (Again, on a freshly created techproducts core.)
Only the third attempt to create a list results in an error. Synonyms can
still be created and deleted repeatedly after this.

curl -X PUT -H 'Content-type:application/json' --data-binary
'{"class":"org.apache.solr.rest.schema.analysis.ManagedWordSetResource"}'
http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
curl -X DELETE
http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
curl -X PUT -H 'Content-type:application/json' --data-binary
'{"class":"org.apache.solr.rest.schema.analysis.ManagedSynonymGraphFilterFactory$SynonymManager"}'
http://localhost:8983/solr/techproducts/schema/analysis/synonyms/testmap
curl -X DELETE
http://localhost:8983/solr/techproducts/schema/analysis/synonyms/testmap
curl http://localhost:8983/solr/admin/cores?action=RELOAD\&core=techproducts
curl -X PUT -H 'Content-type:application/json' --data-binary
'{"class":"org.apache.solr.rest.schema.analysis.ManagedWordSetResource"}'
http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
curl -X DELETE
http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist
curl -X PUT -H 'Content-type:application/json' --data-binary
'{"class":"org.apache.solr.rest.schema.analysis.ManagedSynonymGraphFilterFactory$SynonymManager"}'
http://localhost:8983/solr/techproducts/schema/analysis/synonyms/testmap
curl -X DELETE
http://localhost:8983/solr/techproducts/schema/analysis/synonyms/testmap
curl http://localhost:8983/solr/admin/cores?action=RELOAD\&core=techproducts
curl -X PUT -H 'Content-type:application/json' --data-binary
'{"class":"org.apache.solr.rest.schema.analysis.ManagedWordSetResource"}'
http://localhost:8983/solr/techproducts/schema/analysis/stopwords/testlist

The same successes/errors occur when running each cycle against a different
core if the cores share the same configset.

Any ideas on what might be going wrong?
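For anyone reproducing this, the curl cycle above can be scripted so the failing step stands out. This is a sketch using only the Python standard library; the `plan`/`run` helper names are mine, while the URLs and JSON body are taken verbatim from the steps above. It needs a running local Solr with the techproducts core to actually exercise the bug:

```python
"""Script the CREATE/DELETE/RELOAD/CREATE cycle for a managed stopwords
list and print each response status, so the failing second PUT is obvious.
Assumes a local Solr with the techproducts core on the default port."""
import json
import urllib.error
import urllib.request

BASE = "http://localhost:8983/solr"
LIST_URL = BASE + "/techproducts/schema/analysis/stopwords/testlist"
RELOAD_URL = BASE + "/admin/cores?action=RELOAD&core=techproducts"
CREATE_BODY = json.dumps(
    {"class": "org.apache.solr.rest.schema.analysis.ManagedWordSetResource"}
)

def plan():
    """The request sequence from the thread, as (method, url, body) tuples."""
    return [
        ("PUT", LIST_URL, CREATE_BODY),
        ("DELETE", LIST_URL, None),
        ("GET", RELOAD_URL, None),
        # On the affected 8.x versions this second create returns 500
        # with the LinkedHashMap-to-List cast error.
        ("PUT", LIST_URL, CREATE_BODY),
    ]

def run():
    """Send each request in order and print the status it came back with."""
    for method, url, body in plan():
        req = urllib.request.Request(
            url,
            data=body.encode() if body else None,
            method=method,
            headers={"Content-type": "application/json"} if body else {},
        )
        try:
            with urllib.request.urlopen(req) as resp:
                print(method, url, "->", resp.status)
        except urllib.error.HTTPError as err:
            print(method, url, "->", err.code, err.reason)

if __name__ == "__main__":
    run()
```

On an affected install the last line printed should be the second PUT coming back with status 500; everywhere else all four steps report 200.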

