Re: SolrJ NestableJsonFacet ordering of query facet

2020-10-29 Thread Shivam Jha
Hi folks,

Does anyone have any advice on this issue?

Thanks,
Shivam

On Tue, Oct 27, 2020 at 1:20 PM Shivam Jha  wrote:

> Hi folks,
>
> I am running some faceted queries using the 'json.facet' parameter and SolrJ,
> and processing the results with the SolrJ NestableJsonFacet class;
> basically, queryResponse.getJsonFacetingResponse() returns a NestableJsonFacet
> object.
>
> But I have noticed that it does not preserve the facet-query order given in
> json.facet. Direct queries to Solr do maintain that order, but it is lost once
> the response reaches the Java layer in SolrJ.
>
> Is there a way to make it maintain that order?
> Hopefully the question makes sense; if not, please let me know and I can
> clarify further.
>
> Thanks,
> Shivam
>


-- 
shivamJha
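NestableJsonFacet appears to keep query facets in an unordered map, so its iteration order need not match the request. Until that changes in SolrJ, one workaround is to re-apply the order on the client side, since the caller already knows which facet names it requested. A minimal sketch in plain Java (the collections here stand in for the SolrJ response objects; names and counts are made up for illustration):

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class FacetOrder {

    // Re-orders facet results to match the order the facets were requested in.
    // "results" stands in for what you would pull out of NestableJsonFacet,
    // whose own iteration order is undefined.
    public static LinkedHashMap<String, Object> inRequestOrder(
            List<String> requestedNames, Map<String, Object> results) {
        LinkedHashMap<String, Object> ordered = new LinkedHashMap<>();
        for (String name : requestedNames) {
            if (results.containsKey(name)) {
                ordered.put(name, results.get(name));
            }
        }
        return ordered;
    }

    public static void main(String[] args) {
        List<String> requested = List.of("cheap", "mid", "expensive");
        Map<String, Object> unordered = new HashMap<>();
        unordered.put("expensive", 3L);
        unordered.put("cheap", 10L);
        unordered.put("mid", 7L);
        System.out.println(inRequestOrder(requested, unordered).keySet());
        // [cheap, mid, expensive]
    }
}
```

Because a LinkedHashMap preserves insertion order, iterating the returned map yields the facets in the same order they appeared in json.facet.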


Solr Cert Renewal Issue

2020-10-29 Thread Ritesh Kumar (Avanade)
Hello All,

I need to renew the expiring certificate for Solr in a Windows Solr-ZK ensemble 
with 3 Solr VMs and 3 ZK VMs. As this is a critical application, I am updating 
one Solr VM at a time so that my index stays available.
On a non-leader VM, I placed the new PFX cert at "F:\solr-6.6.3\server\etc" and 
created a JKS file in the same location (overwriting the old one there after 
taking a backup).

The new PFX cert has been installed under Personal\Certificates and copied to 
Trusted Root CA as well as Trusted People.

When I restart the Solr service (I restarted the VM as well), I can see the new 
cert at the Solr URL, but the index does not come online.

This is the Solr error I am receiving:

2020-10-29 22:53:57.212 ERROR 
(recoveryExecutor-3-thread-3-processing-n:IPAddress:8983_solr 
x:z_web_index_rebuild_shard1_replica2 s:shard1 c:z_web_index_rebuild 
r:core_node6) [c:z_web_index_rebuild s:shard1 r:core_node6 
x:z_web_index_rebuild_shard1_replica2] o.a.s.c.RecoveryStrategy Error while 
trying to recover. 
core=z_web_index_rebuild_shard1_replica2:java.util.concurrent.ExecutionException:
 org.apache.solr.client.solrj.SolrServerException: IOException occured when 
talking to server at: https://IPAddress:8983/solr
  at java.util.concurrent.FutureTask.report(Unknown Source)
  at java.util.concurrent.FutureTask.get(Unknown Source)
  at 
org.apache.solr.cloud.RecoveryStrategy.sendPrepRecoveryCmd(RecoveryStrategy.java:678)
  at 
org.apache.solr.cloud.RecoveryStrategy.sendPrepRecoveryCmd(RecoveryStrategy.java:653)
  at 
org.apache.solr.cloud.RecoveryStrategy.doRecovery(RecoveryStrategy.java:413)
  at 
org.apache.solr.cloud.RecoveryStrategy.run(RecoveryStrategy.java:284)
  at 
com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:176)
  at java.util.concurrent.Executors$RunnableAdapter.call(Unknown 
Source)
  at java.util.concurrent.FutureTask.run(Unknown Source)
  at 
org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:229)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown 
Source)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown 
Source)
  at java.lang.Thread.run(Unknown Source)
Caused by: org.apache.solr.client.solrj.SolrServerException: IOException 
occured when talking to server at: https://IPAddress:8983/solr
  at 
org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:626)
  at 
org.apache.solr.client.solrj.impl.HttpSolrClient.lambda$httpUriRequest$0(HttpSolrClient.java:319)
  ... 5 more
Caused by: javax.net.ssl.SSLHandshakeException: 
sun.security.validator.ValidatorException: PKIX path building failed: 
sun.security.provider.certpath.SunCertPathBuilderException: unable to find 
valid certification path to requested target
  at sun.security.ssl.Alerts.getSSLException(Unknown Source)
  at sun.security.ssl.SSLSocketImpl.fatal(Unknown Source)
  at sun.security.ssl.Handshaker.fatalSE(Unknown Source)
  at sun.security.ssl.Handshaker.fatalSE(Unknown Source)
  at sun.security.ssl.ClientHandshaker.serverCertificate(Unknown 
Source)
  at sun.security.ssl.ClientHandshaker.processMessage(Unknown 
Source)
  at sun.security.ssl.Handshaker.processLoop(Unknown Source)
  at sun.security.ssl.Handshaker.process_record(Unknown Source)
  at sun.security.ssl.SSLSocketImpl.readRecord(Unknown Source)
  at sun.security.ssl.SSLSocketImpl.performInitialHandshake(Unknown 
Source)
  at sun.security.ssl.SSLSocketImpl.startHandshake(Unknown Source)
  at sun.security.ssl.SSLSocketImpl.startHandshake(Unknown Source)
  at 
org.apache.http.conn.ssl.SSLSocketFactory.connectSocket(SSLSocketFactory.java:535)
  at 
org.apache.http.conn.ssl.SSLSocketFactory.connectSocket(SSLSocketFactory.java:403)
  at 
org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:177)
  at 
org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:304)
  at 
org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:611)
  at 
org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:446)
  at 
org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
  at 
org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
  at 
org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
  at 
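The "PKIX path building failed" in the trace above generally means the JVM that initiates the connection (here, the recovering replica calling https://IPAddress:8983/solr) cannot build a chain from the presented certificate to a root it trusts — so the truststore on every node, not only the rotated one, needs the new chain. A quick way to inspect what a JKS file actually contains is a few lines of stdlib Java (the path and password below are placeholders, not values from this thread):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.security.KeyStore;
import java.security.cert.Certificate;
import java.security.cert.X509Certificate;
import java.util.Collections;

public class TruststoreCheck {

    // Loads a JKS keystore if the file exists; otherwise returns an empty one
    // (handy for experimenting without a real file on disk).
    public static KeyStore loadOrCreate(String path, char[] password) throws Exception {
        KeyStore ks = KeyStore.getInstance("JKS");
        File f = new File(path);
        if (f.exists()) {
            try (InputStream in = new FileInputStream(f)) {
                ks.load(in, password);
            }
        } else {
            ks.load(null, password);
        }
        return ks;
    }

    public static void main(String[] args) throws Exception {
        // Substitute your real JKS path and password here.
        KeyStore ks = loadOrCreate(args.length > 0 ? args[0] : "solr-ssl.keystore.jks",
                (args.length > 1 ? args[1] : "secret").toCharArray());
        for (String alias : Collections.list(ks.aliases())) {
            Certificate c = ks.getCertificate(alias);
            if (c instanceof X509Certificate) {
                X509Certificate x = (X509Certificate) c;
                // Print the subject and expiry so stale entries stand out.
                System.out.println(alias + ": " + x.getSubjectX500Principal()
                        + " (expires " + x.getNotAfter() + ")");
            } else {
                System.out.println(alias + ": " + (c == null ? "key entry" : c.getType()));
            }
        }
    }
}
```

If the CA chain of the new certificate is missing from the truststore each node's JVM uses, importing it there (for example with keytool) and restarting the nodes is the usual fix.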

Re: Avoiding duplicate entry for a multivalued field

2020-10-29 Thread Walter Underwood
Since you are already taking the performance hit of atomic updates, 
I doubt you’ll see any impact from field types or update request processors.
The extra cost of atomic updates will be much greater than the indexing cost.

wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/  (my blog)

> On Oct 29, 2020, at 3:16 AM, Srinivas Kashyap 
>  wrote:
> 
> Thanks Dwane,
> 
> I have a doubt, according to the java doc, the duplicates still continue to 
> exist in the field. May be during query time, the field returns only unique 
> values? Am I right with my assumption?
> 
> And also, what is the performance overhead for this UniqueFiled*Factory?
> 
> Thanks,
> Srinivas
> 
> From: Dwane Hall 
> Sent: 29 October 2020 14:33
> To: solr-user@lucene.apache.org
> Subject: Re: Avoiding duplicate entry for a multivalued field
> 
> Srinivas this is possible by adding an unique field update processor to the 
> update processor chain you are using to perform your updates (/update, 
> /update/json, /update/json/docs, .../a_custom_one)
> 
> The Java Documents explain its use nicely
> (https://lucene.apache.org/solr/8_6_0//solr-core/org/apache/solr/update/processor/UniqFieldsUpdateProcessorFactory.html)
>  or there are articles on stack overflow addressing this exact problem 
> (https://stackoverflow.com/questions/37005747/how-to-remove-duplicates-from-multivalued-fields-in-solr#37006655)
> 
> Thanks,
> 
> Dwane
> 
> From: Srinivas Kashyap 
> mailto:srini...@bamboorose.com.INVALID>>
> Sent: Thursday, 29 October 2020 3:49 PM
> To: solr-user@lucene.apache.org 
> mailto:solr-user@lucene.apache.org>>
> Subject: Avoiding duplicate entry for a multivalued field
> 
> Hello,
> 
> Say, I have a schema field which is multivalued. Is there a way to maintain 
> distinct values for that field though I continue to add duplicate values 
> through atomic update via solrj?
> 
> Is there some property setting to have only unique values in a multi valued 
> fields?
> 
> Thanks,
> Srinivas
> 
> DISCLAIMER:
> E-mails and attachments from Bamboo Rose, LLC are confidential.
> If you are not the intended recipient, please notify the sender immediately 
> by replying to the e-mail, and then delete it without making copies or using 
> it in any way.
> No representation is made that this email or any attachments are free of 
> viruses. Virus scanning is recommended and is the responsibility of the 
> recipient.
> 
> Disclaimer
> 
> The information contained in this communication from the sender is 
> confidential. It is intended solely for use by the recipient and others 
> authorized to receive it. If you are not the recipient, you are hereby 
> notified that any disclosure, copying, distribution or taking action in 
> relation of the contents of this information is strictly prohibited and may 
> be unlawful.
> 
> This email has been scanned for viruses and malware, and may have been 
> automatically archived by Mimecast Ltd, an innovator in Software as a Service 
> (SaaS) for business. Providing a safer and more useful place for your human 
> generated data. Specializing in; Security, archiving and compliance. To find 
> out more visit the Mimecast website.



Re: Solr LockObtainFailedException and NPEs for CoreAdmin STATUS

2020-10-29 Thread ahubold
I've created a JIRA ticket now:
https://issues.apache.org/jira/browse/SOLR-14969

I'd be really glad if a Solr developer could help or comment on the issue. 

Thank you,
Andreas



--
Sent from: https://lucene.472066.n3.nabble.com/Solr-User-f472068.html


Re: Avoiding duplicate entry for a multivalued field

2020-10-29 Thread Michael Gibney
If I understand correctly what you're trying to do: docValues for a number of
field types are (at least in their multivalued incarnation) backed by
SortedSetDocValues, which inherently deduplicates values per document. In your
case it sounds like you could rely on that behavior as a feature: set
stored=false, docValues=true, useDocValuesAsStored=true, and get the desired
behavior.
Michael
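For reference, a field definition along those lines might look like the following in the schema (the field name "tags" and type "string" here are placeholders, not from the original thread):

```xml
<!-- Multivalued field served from docValues; SortedSetDocValues
     deduplicates (and sorts) the values per document. -->
<field name="tags" type="string" indexed="true" stored="false"
       multiValued="true" docValues="true" useDocValuesAsStored="true"/>
```

One caveat worth noting: values returned this way come back in sorted order, not in the order they were added, which may or may not matter for your use case.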

On Thu, Oct 29, 2020 at 6:17 AM Srinivas Kashyap
 wrote:
>
> Thanks Dwane,
>
> I have a doubt, according to the java doc, the duplicates still continue to 
> exist in the field. May be during query time, the field returns only unique 
> values? Am I right with my assumption?
>
> And also, what is the performance overhead for this UniqueFiled*Factory?
>
> Thanks,
> Srinivas
>
> From: Dwane Hall 
> Sent: 29 October 2020 14:33
> To: solr-user@lucene.apache.org
> Subject: Re: Avoiding duplicate entry for a multivalued field
>
> Srinivas this is possible by adding an unique field update processor to the 
> update processor chain you are using to perform your updates (/update, 
> /update/json, /update/json/docs, .../a_custom_one)
>
> The Java Documents explain its use nicely
> (https://lucene.apache.org/solr/8_6_0//solr-core/org/apache/solr/update/processor/UniqFieldsUpdateProcessorFactory.html)
>  or there are articles on stack overflow addressing this exact problem 
> (https://stackoverflow.com/questions/37005747/how-to-remove-duplicates-from-multivalued-fields-in-solr#37006655)
>
> Thanks,
>
> Dwane
> 
> From: Srinivas Kashyap 
> mailto:srini...@bamboorose.com.INVALID>>
> Sent: Thursday, 29 October 2020 3:49 PM
> To: solr-user@lucene.apache.org 
> mailto:solr-user@lucene.apache.org>>
> Subject: Avoiding duplicate entry for a multivalued field
>
> Hello,
>
> Say, I have a schema field which is multivalued. Is there a way to maintain 
> distinct values for that field though I continue to add duplicate values 
> through atomic update via solrj?
>
> Is there some property setting to have only unique values in a multi valued 
> fields?
>
> Thanks,
> Srinivas
> 


Re: Solr with HDFS configuration example running in production/dev

2020-10-29 Thread Gézapeti
Cloudera's default configuration for the HdfsDirectoryFactory is very similar
to yours in solrconfig.xml. The solr.hdfs.home property is provided as a Java
property during Solr startup, and we haven't seen the ":" issue yet.

Hope it helps
gp

On Wed, Aug 26, 2020 at 9:17 AM Prashant Jyoti  wrote:

> Hi Joe,
> Yes, I made these changes to get HDFS working with Solr. Below are the config
> changes I carried out:
>
> Changes in solr.in.cmd
>
> set SOLR_OPTS=%SOLR_OPTS% -Dsolr.directoryFactory=HdfsDirectoryFactory
> set SOLR_OPTS=%SOLR_OPTS% -Dsolr.lock.type=hdfs
> set SOLR_OPTS=%SOLR_OPTS% -Dsolr.hdfs.home=hdfs://hn1-pjhado.tvbhpqtgh3judk1e5ihrx2k21d.tx.internal.cloudapp.net:8020/user/solr-data
>
>
> Changes in solrconfig.xml
>
> <directoryFactory name="DirectoryFactory" class="solr.HdfsDirectoryFactory">
>   <str name="solr.hdfs.home">hdfs://hn1-pjhado.tvbhpqtgh3judk1e5ihrx2k21d.tx.internal.cloudapp.net:8020/user/solr-data</str>
>   <str name="solr.hdfs.confdir">/etc/hadoop/conf</str>
>   <bool name="solr.hdfs.blockcache.enabled">true</bool>
>   <int name="solr.hdfs.blockcache.slab.count">1</int>
>   <bool name="solr.hdfs.blockcache.direct.memory.allocation">true</bool>
>   <int name="solr.hdfs.blockcache.blocksperbank">16384</int>
>   <bool name="solr.hdfs.blockcache.read.enabled">true</bool>
>   <bool name="solr.hdfs.nrtcachingdirectory.enable">true</bool>
>   <int name="solr.hdfs.nrtcachingdirectory.maxmergesizemb">16</int>
>   <int name="solr.hdfs.nrtcachingdirectory.maxcachedmb">192</int>
> </directoryFactory>
>
> Let me know if you have any comments?
>
> Thanks!
>
> On Mon, Aug 24, 2020 at 10:23 PM Joe Obernberger <
> joseph.obernber...@gmail.com> wrote:
>
> > Are you running with solr.lock.type=hdfs?
> >
> > Have you defined your DirectoryFactory - something like:
> >
> > <directoryFactory name="DirectoryFactory" class="solr.HdfsDirectoryFactory">
> >   <bool name="solr.hdfs.blockcache.enabled">true</bool>
> >   <bool name="solr.hdfs.blockcache.global">true</bool>
> >   <int name="solr.hdfs.blockcache.slab.count">43</int>
> >   <bool name="solr.hdfs.blockcache.direct.memory.allocation">true</bool>
> >   <int name="solr.hdfs.blockcache.blocksperbank">16384</int>
> >   <bool name="solr.hdfs.blockcache.read.enabled">true</bool>
> >   <bool name="solr.hdfs.nrtcachingdirectory.enable">true</bool>
> >   <int name="solr.hdfs.nrtcachingdirectory.maxmergesizemb">128</int>
> >   <int name="solr.hdfs.nrtcachingdirectory.maxcachedmb">1024</int>
> >   <str name="solr.hdfs.home">hdfs://nameservice1:8020/solr8.2.0</str>
> >   <str name="solr.hdfs.confdir">/etc/hadoop/conf.cloudera.hdfs1</str>
> > </directoryFactory>
> >
> > -Joe
> > On 8/20/2020 2:30 AM, Prashant Jyoti wrote:
> >
> > Hi Joe,
> > These are the errors I am running into:
> >
> > org.apache.solr.common.SolrException: Error CREATEing SolrCore
> > 'newcollsolr2_shard1_replica_n1': Unable to create core
> > [newcollsolr2_shard1_replica_n1] Caused by: Illegal char <:> at index 4:
> > hdfs://
> >
> hn1-pjhado.tvbhpqtgh3judk1e5ihrx2k21d.tx.internal.cloudapp.net:8020/user/solr-data/newcollsolr2/core_node3/data\
> 
> > <
> http://hn1-pjhado.tvbhpqtgh3judk1e5ihrx2k21d.tx.internal.cloudapp.net:8020/user/solr-data/newcollsolr2/core_node3/data%5C
> >
> > at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1256)
> > at
> >
> org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:93)
> > at
> >
> org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:362)
> > at
> >
> org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:397)
> > at
> >
> org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:181)
> > at
> >
> org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
> > at
> org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:842)
> > at
> >
> org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:808)
> > at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:559)
> > at
> >
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:420)
> > at
> >
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:352)
> > at
> >
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1596)
> > at
> >
> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
> > at
> >
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
> > at
> >
> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:590)
> > at
> >
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
> > at
> >
> org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
> > at
> >
> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1607)
> > at
> >
> org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
> > at
> >
> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1297)
> > at
> >
> org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
> > at
> > org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
> > at
> >
> org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1577)
> > at
> >
> org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
> > at
> >
> 

The FunctionQuery.AllScorer.Score 0 points problem

2020-10-29 Thread Dawn
Hi:
The FunctionQuery.AllScorer.score() method returns 0 whenever the computed
value is less than 0.

But sometimes you have to compute a negative score, and as the code stands
there can never be negative points.

Could a global setting be added to control whether a negative score is allowed,
so that different requirements can be supported?

public float score() throws IOException {
  float val = vals.floatVal(docID());
  if (val >= 0 == false) { // this covers NaN as well since comparisons with NaN return false
return 0;
  } else {
return boost * val;
  }
}



RE: Avoiding duplicate entry for a multivalued field

2020-10-29 Thread Srinivas Kashyap
Thanks Dwane,

I have a doubt: according to the Javadoc, the duplicates still continue to 
exist in the field. Maybe at query time the field returns only unique values? 
Am I right with my assumption?

And also, what is the performance overhead of this UniqFields*Factory?

Thanks,
Srinivas

From: Dwane Hall 
Sent: 29 October 2020 14:33
To: solr-user@lucene.apache.org
Subject: Re: Avoiding duplicate entry for a multivalued field

Srinivas this is possible by adding an unique field update processor to the 
update processor chain you are using to perform your updates (/update, 
/update/json, /update/json/docs, .../a_custom_one)

The Java Documents explain its use nicely
(https://lucene.apache.org/solr/8_6_0//solr-core/org/apache/solr/update/processor/UniqFieldsUpdateProcessorFactory.html)
 or there are articles on stack overflow addressing this exact problem 
(https://stackoverflow.com/questions/37005747/how-to-remove-duplicates-from-multivalued-fields-in-solr#37006655)

Thanks,

Dwane

From: Srinivas Kashyap 
mailto:srini...@bamboorose.com.INVALID>>
Sent: Thursday, 29 October 2020 3:49 PM
To: solr-user@lucene.apache.org 
mailto:solr-user@lucene.apache.org>>
Subject: Avoiding duplicate entry for a multivalued field

Hello,

Say, I have a schema field which is multivalued. Is there a way to maintain 
distinct values for that field though I continue to add duplicate values 
through atomic update via solrj?

Is there some property setting to have only unique values in a multi valued 
fields?

Thanks,
Srinivas



Re: Avoiding duplicate entry for a multivalued field

2020-10-29 Thread Dwane Hall
Srinivas, this is possible by adding a unique-field update processor to the 
update processor chain you are using to perform your updates (/update, 
/update/json, /update/json/docs, .../a_custom_one).

The Javadocs explain its use nicely
(https://lucene.apache.org/solr/8_6_0//solr-core/org/apache/solr/update/processor/UniqFieldsUpdateProcessorFactory.html),
and there are articles on Stack Overflow addressing this exact problem 
(https://stackoverflow.com/questions/37005747/how-to-remove-duplicates-from-multivalued-fields-in-solr#37006655)

Thanks,

Dwane
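As a sketch of the wiring Dwane describes, a solrconfig.xml chain using UniqFieldsUpdateProcessorFactory might look like the following (the chain name and the target field name are placeholders; see the linked Javadoc for the full set of field-selector options):

```xml
<updateRequestProcessorChain name="uniq-multivalued" default="true">
  <!-- Drop duplicate values from the named multivalued field
       before the document is indexed -->
  <processor class="solr.UniqFieldsUpdateProcessorFactory">
    <str name="fieldName">my_multivalued_field</str>
  </processor>
  <processor class="solr.LogUpdateProcessorFactory"/>
  <processor class="solr.RunUpdateProcessorFactory"/>
</updateRequestProcessorChain>
```

With this chain marked default (or selected via update.chain), atomic updates flowing through /update should have duplicates removed before they reach the index, which addresses the concern about duplicates "continuing to exist in the field".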

From: Srinivas Kashyap 
Sent: Thursday, 29 October 2020 3:49 PM
To: solr-user@lucene.apache.org 
Subject: Avoiding duplicate entry for a multivalued field

Hello,

Say, I have a schema field which is multivalued. Is there a way to maintain 
distinct values for that field though I continue to add duplicate values 
through atomic update via solrj?

Is there some property setting to have only unique values in a multi valued 
fields?

Thanks,
Srinivas
