[jira] [Commented] (SOLR-11981) Multiple kerberos name rules can not be passed with SOLR_AUTHENTICATION_OPTS

2018-03-04 Thread Amrit Sarkar (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-11981?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385723#comment-16385723
 ] 

Amrit Sarkar commented on SOLR-11981:
-

Makes sense. I am not sure whether we should add this bit of information to the 
documentation, as it is one of the odd cases.

> Multiple kerberos name rules can not be passed with SOLR_AUTHENTICATION_OPTS
> 
>
> Key: SOLR-11981
> URL: https://issues.apache.org/jira/browse/SOLR-11981
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: security
>Affects Versions: 5.5.5, 6.6.2, 7.2.1
>Reporter: Olivér Szabó
>Priority: Major
>
> On a secure environment, when multiline (or space-separated) Kerberos name 
> rules are used (in solr.in), those values cannot be passed to the start script 
> properly (using {{org.apache.solr.security.KerberosPlugin}}).
> Example:
> {code:java}
> SOLR_JAAS_FILE=solr.jaas
> SOLR_KERB_KEYTAB=/etc/security/keytabs/solr.keytab
> SOLR_KERB_PRINCIPAL=solr/myhost1@example.com
> SOLR_KERB_NAME_RULES="RULE:[1:$1@$0](.*@ADMIN.EXAMPLE.NET)s/@.*///L 
> RULE:[1:$1@$0](.*@PROD.EXAMPLE.NET)s/@.*///L 
> RULE:[2:$1@$0](s...@admin.example.net)s/.*/solr/"
> SOLR_AUTHENTICATION_CLIENT_CONFIGURER="org.apache.solr.client.solrj.impl.Krb5HttpClientConfigurer"
> SOLR_AUTHENTICATION_OPTS=" 
> -DauthenticationPlugin=org.apache.solr.security.KerberosPlugin 
> -Djava.security.auth.login.config=$SOLR_JAAS_FILE 
> -Dsolr.kerberos.principal=${SOLR_KERB_PRINCIPAL} 
> -Dsolr.kerberos.keytab=${SOLR_KERB_KEYTAB} 
> -Dsolr.kerberos.cookie.domain=${SOLR_HOST}" 
> -Dsolr.kerberos.name.rules=${SOLR_KERB_NAME_RULES}
> {code}
> that will cause:
> {code:java}
> Caused by: 
> org.apache.hadoop.security.authentication.util.KerberosName$NoMatchingRule: 
> No rules applied to solr/host.exam...@admin.example.net 
> at 
> org.apache.hadoop.security.authentication.util.KerberosName.getShortName(KerberosName.java:389)
>  
> at 
> org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler
> {code}
> Probable reason: in the Solr start script there are multiple 
> {{"${SOLR_OPTS[@]}"}}-style expansions (for the auth properties as well), which 
> magically treat the variables as arrays (split on spaces or newlines).
> I have tried to add the {{solr.kerberos.name.rules}} property directly to 
> SOLR_OPTS instead of SOLR_AUTHENTICATION_OPTS, but I could not use 
> spaces/newlines there even with quotes or escape characters.
> We faced this issue before with Ambari 
> (https://issues.apache.org/jira/browse/AMBARI-18898); the quick solution was to 
> patch the start script to use 
> {{-Dsolr.kerberos.name.rules="$SOLR_KERB_NAME_RULES"}} directly where the 
> script starts the Java process.
> You can close this JIRA as invalid if there is already a workaround or a fix 
> for this issue; if not, my proposal is to do something similar 
> (maybe there are better places to put that variable).
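
A minimal bash sketch of the quoting behaviour described above (illustrative 
only: the variable value and the helper function are made up, and this is not 
the actual bin/solr code):

{code:bash}
#!/usr/bin/env bash
# Illustrative only: shows why an unquoted expansion word-splits a multi-rule
# value, while a quoted expansion keeps it as a single java argument.
SOLR_KERB_NAME_RULES='RULE:[1:$1@$0](.*@EXAMPLE.NET)s/@.*//
RULE:[2:$1@$0](solr@EXAMPLE.NET)s/.*/solr/'

show_args() { echo "argc=$#"; for a in "$@"; do echo "  arg: $a"; done; }

# Unquoted: the value is split on whitespace/newlines, so only the first
# fragment reaches -Dsolr.kerberos.name.rules and the remaining rules are lost.
show_args -Dsolr.kerberos.name.rules=$SOLR_KERB_NAME_RULES

# Quoted (the AMBARI-18898 style of workaround): one intact argument.
show_args -Dsolr.kerberos.name.rules="$SOLR_KERB_NAME_RULES"
{code}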



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: [VOTE] Release Lucene/Solr 6.6.3 RC1

2018-03-04 Thread Shalin Shekhar Mangar
+1

SUCCESS! [1:08:00.711198]

On Sat, Mar 3, 2018 at 3:39 AM, Steve Rowe  wrote:

> Please vote for release candidate 1 for Lucene/Solr 6.6.3.
>
> The artifacts can be downloaded from:
>
> https://dist.apache.org/repos/dist/dev/lucene/lucene-solr-6.6.3-RC1-
> revd1e9bbd333ea55cfa0c75d324424606e857a775b
>
> You can run the smoke tester directly with this command:
>
> python3 -u dev-tools/scripts/smokeTestRelease.py \
> https://dist.apache.org/repos/dist/dev/lucene/lucene-solr-6.6.3-RC1-
> revd1e9bbd333ea55cfa0c75d324424606e857a775b
>
> Here's my +1, smoke tester says SUCCESS! [0:34:...] (from memory -
> terminal scrollback is uncooperative...)
>
> --
> Steve
> www.lucidworks.com
>
>
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org
>
>


-- 
Regards,
Shalin Shekhar Mangar.


[jira] [Commented] (SOLR-11598) Export Writer needs to support more than 4 Sort fields - Say 10, ideally it should not be bound at all, but 4 seems to really short sell the StreamRollup capabilities.

2018-03-04 Thread Amrit Sarkar (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-11598?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385722#comment-16385722
 ] 

Amrit Sarkar commented on SOLR-11598:
-

Improved the patch, proving that the sorting takes place at the Nth position. I 
can improve the tests further by adding more documents with diverse field 
values. Let me know if we need to have that in place.

> Export Writer needs to support more than 4 Sort fields - Say 10, ideally it 
> should not be bound at all, but 4 seems to really short sell the StreamRollup 
> capabilities.
> ---
>
> Key: SOLR-11598
> URL: https://issues.apache.org/jira/browse/SOLR-11598
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: streaming expressions
>Affects Versions: 6.6.1, 7.0
>Reporter: Aroop
>Priority: Major
>  Labels: patch
> Attachments: SOLR-11598-6_6-streamtests, SOLR-11598-6_6.patch, 
> SOLR-11598-master.patch, SOLR-11598.patch, SOLR-11598.patch, SOLR-11598.patch
>
>
> I am a user of Streaming and I am currently trying to use rollups on a 
> 10-dimensional document.
> I am unable to get correct results for this query, as I am bound by the 
> limitation of the export handler, which supports only 4 sort fields.
> I do not see why this needs to be the case, as it could very well be 10 or 20.
> My current needs would be satisfied with 10, but one could ask why it can't be 
> any reasonable integer n, beyond which we know performance degrades; even then 
> it should be caveat emptor.
> [~varunthacker] 
> Code Link:
> https://github.com/apache/lucene-solr/blob/19db1df81a18e6eb2cce5be973bf2305d606a9f8/solr/core/src/java/org/apache/solr/handler/ExportWriter.java#L455
> Error
> null:java.io.IOException: A max of 4 sorts can be specified
>   at 
> org.apache.solr.handler.ExportWriter.getSortDoc(ExportWriter.java:452)
>   at org.apache.solr.handler.ExportWriter.writeDocs(ExportWriter.java:228)
>   at 
> org.apache.solr.handler.ExportWriter.lambda$null$1(ExportWriter.java:219)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeIterator(JavaBinCodec.java:664)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeKnownType(JavaBinCodec.java:333)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeVal(JavaBinCodec.java:223)
>   at org.apache.solr.common.util.JavaBinCodec$1.put(JavaBinCodec.java:394)
>   at 
> org.apache.solr.handler.ExportWriter.lambda$null$2(ExportWriter.java:219)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeMap(JavaBinCodec.java:437)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeKnownType(JavaBinCodec.java:354)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeVal(JavaBinCodec.java:223)
>   at org.apache.solr.common.util.JavaBinCodec$1.put(JavaBinCodec.java:394)
>   at 
> org.apache.solr.handler.ExportWriter.lambda$write$3(ExportWriter.java:217)
>   at 
> org.apache.solr.common.util.JavaBinCodec.writeMap(JavaBinCodec.java:437)
>   at org.apache.solr.handler.ExportWriter.write(ExportWriter.java:215)
>   at org.apache.solr.core.SolrCore$3.write(SolrCore.java:2601)
>   at 
> org.apache.solr.response.QueryResponseWriterUtil.writeQueryResponse(QueryResponseWriterUtil.java:49)
>   at 
> org.apache.solr.servlet.HttpSolrCall.writeResponse(HttpSolrCall.java:809)
>   at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:538)
>   at 
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:361)
>   at 
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:305)
>   at 
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1691)
>   at 
> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
>   at 
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
>   at 
> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
>   at 
> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
>   at 
> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
>   at 
> org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
>   at 
> org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
>   at 
> org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
>   at 
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
>   at 
> 

[JENKINS] Lucene-Solr-repro - Build # 193 - Still Unstable

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/193/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.x/165/consoleText

[repro] Revision: 59f67468b7f9f90f2377033d358521d451508f9a

[repro] Ant options: -Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
[repro] Repro line:  ant test  -Dtestcase=TestLargeCluster 
-Dtests.method=testAddNode -Dtests.seed=EC5794D1006C94B6 -Dtests.multiplier=2 
-Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=it-IT -Dtests.timezone=MET -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=TestLargeCluster 
-Dtests.method=testNodeLost -Dtests.seed=EC5794D1006C94B6 -Dtests.multiplier=2 
-Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=it-IT -Dtests.timezone=MET -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=TestLargeCluster 
-Dtests.method=testBasic -Dtests.seed=EC5794D1006C94B6 -Dtests.multiplier=2 
-Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=it-IT -Dtests.timezone=MET -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
6d66fc04b23358f58ae13020a399013e13063b4f
[repro] git fetch
[repro] git checkout 59f67468b7f9f90f2377033d358521d451508f9a

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   TestLargeCluster
[repro] ant compile-test

[...truncated 3310 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=5 
-Dtests.class="*.TestLargeCluster" -Dtests.showOutput=onerror 
-Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.seed=EC5794D1006C94B6 -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=it-IT -Dtests.timezone=MET -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[...truncated 12811 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   3/5 failed: org.apache.solr.cloud.autoscaling.sim.TestLargeCluster
[repro] git checkout 6d66fc04b23358f58ae13020a399013e13063b4f

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 5 lines...]

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[jira] [Commented] (SOLR-7887) Upgrade Solr to use log4j2 -- log4j 1 now officially end of life

2018-03-04 Thread Varun Thacker (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-7887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385703#comment-16385703
 ] 

Varun Thacker commented on SOLR-7887:
-

This might have been introduced with the latest changes made by Erick... It was 
working fine before that. I'll check it out tomorrow morning.

> Upgrade Solr to use log4j2 -- log4j 1 now officially end of life
> 
>
> Key: SOLR-7887
> URL: https://issues.apache.org/jira/browse/SOLR-7887
> Project: Solr
>  Issue Type: Task
>Affects Versions: 5.2.1
>Reporter: Shawn Heisey
>Assignee: Varun Thacker
>Priority: Major
> Attachments: SOLR-7887-WIP.patch, SOLR-7887-eoe-review.patch, 
> SOLR-7887-eoe-review.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch
>
>
> The logging services project has officially announced the EOL of log4j 1:
> https://blogs.apache.org/foundation/entry/apache_logging_services_project_announces
> In the official binary Jetty deployment, we use log4j 1.2 as our final 
> logging destination, so the admin UI has a log watcher that actually uses 
> log4j and java.util.logging classes.  That will need to be extended to add 
> log4j2.  I think that might be the largest pain point to this upgrade.
> There is some crossover between log4j2 and slf4j.  Figuring out exactly which 
> jars need to be in the lib/ext directory will take some research.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-7887) Upgrade Solr to use log4j2 -- log4j 1 now officially end of life

2018-03-04 Thread Shawn Heisey (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-7887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385701#comment-16385701
 ] 

Shawn Heisey commented on SOLR-7887:


Also, why is it using a path relative to solr.solr.home when solr.log.dir is 
defined by the script?

> Upgrade Solr to use log4j2 -- log4j 1 now officially end of life
> 
>
> Key: SOLR-7887
> URL: https://issues.apache.org/jira/browse/SOLR-7887
> Project: Solr
>  Issue Type: Task
>Affects Versions: 5.2.1
>Reporter: Shawn Heisey
>Assignee: Varun Thacker
>Priority: Major
> Attachments: SOLR-7887-WIP.patch, SOLR-7887-eoe-review.patch, 
> SOLR-7887-eoe-review.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch
>
>
> The logging services project has officially announced the EOL of log4j 1:
> https://blogs.apache.org/foundation/entry/apache_logging_services_project_announces
> In the official binary Jetty deployment, we use log4j 1.2 as our final 
> logging destination, so the admin UI has a log watcher that actually uses 
> log4j and java.util.logging classes.  That will need to be extended to add 
> log4j2.  I think that might be the largest pain point to this upgrade.
> There is some crossover between log4j2 and slf4j.  Figuring out exactly which 
> jars need to be in the lib/ext directory will take some research.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-7887) Upgrade Solr to use log4j2 -- log4j 1 now officially end of life

2018-03-04 Thread Shawn Heisey (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-7887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385697#comment-16385697
 ] 

Shawn Heisey commented on SOLR-7887:


Patch didn't want to apply for me on my desktop.  Moved it to a Linux machine, 
and then it worked.  Building the server target and then running "bin/solr 
start", I see the same exception, but before that, the first two errors are 
different.  I trimmed most of the stacktraces out.  It acts like the property 
substitution isn't working properly ... I think the first message came directly 
from Java's own file I/O code.

I couldn't figure out which xml config file was being used.  Ultimately by 
experimenting I learned that it is using the one at 
server/scripts/cloud-scripts/log4j2.xml ... which doesn't seem right.

{noformat}
2018-03-05 00:01:38,054 main ERROR Unable to create file 
${sys:solr.solr.home}/../logs/solr.log java.io.IOException: No such file or 
directory
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.createNewFile(File.java:1012)
at 
org.apache.logging.log4j.core.appender.rolling.RollingFileManager$RollingFileManagerFactory.createManager(RollingFileManager.java:628)
{noformat}

{noformat}
2018-03-05 00:01:38,063 main ERROR Could not create plugin of type class 
org.apache.logging.log4j.core.appender.RollingFileAppender for element 
RollingFile: java.lang.IllegalStateException: ManagerFactory 
[org.apache.logging.log4j.core.appender.rolling.RollingFileManager$RollingFileManagerFactory@545997b1]
 unable to create manager for [${sys:solr.solr.home}/../logs/solr.log] with 
data 
[org.apache.logging.log4j.core.appender.rolling.RollingFileManager$FactoryData@4cf4d528[pattern=${sys:solr.solr.home}/../logs/solr.%i.log.gz,
 append=true, bufferedIO=true, bufferSize=8192, 
policy=CompositeTriggeringPolicy(policies=[OnStartupTriggeringPolicy, 
SizeBasedTriggeringPolicy(size=4194304)]), 
strategy=DefaultRolloverStrategy(min=1, max=7, useMax=true), advertiseURI=null, 
layout=%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p (%t) [%X{collection} %X{shard} 
%X{replica} %X{core}] %c{1.} %m%n, filePermissions=null, fileOwner=null]] 
java.lang.IllegalStateException: ManagerFactory 
[org.apache.logging.log4j.core.appender.rolling.RollingFileManager$RollingFileManagerFactory@545997b1]
 unable to create manager for [${sys:solr.solr.home}/../logs/solr.log] with 
data 
[org.apache.logging.log4j.core.appender.rolling.RollingFileManager$FactoryData@4cf4d528[pattern=${sys:solr.solr.home}/../logs/solr.%i.log.gz,
 append=true, bufferedIO=true, bufferSize=8192, 
policy=CompositeTriggeringPolicy(policies=[OnStartupTriggeringPolicy, 
SizeBasedTriggeringPolicy(size=4194304)]), 
strategy=DefaultRolloverStrategy(min=1, max=7, useMax=true), advertiseURI=null, 
layout=%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p (%t) [%X{collection} %X{shard} 
%X{replica} %X{core}] %c{1.} %m%n, filePermissions=null, fileOwner=null]]
at 
org.apache.logging.log4j.core.appender.AbstractManager.getManager(AbstractManager.java:115)
{noformat}
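
For comparison, a minimal log4j2.xml sketch (an assumed example, not the 
configuration from the attached patch) using a Log4j2 system-property lookup 
with a default value, so the appender path does not depend on 
${sys:solr.solr.home} resolving at startup:

{code:xml}
<!-- Sketch only; not the file under review. ${sys:solr.log.dir:-logs} falls
     back to ./logs when the solr.log.dir system property is not set, instead
     of deriving the path from solr.solr.home. -->
<Configuration>
  <Appenders>
    <RollingFile name="RollingFile"
                 fileName="${sys:solr.log.dir:-logs}/solr.log"
                 filePattern="${sys:solr.log.dir:-logs}/solr.log.%i">
      <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p (%t) %c{1.} %m%n"/>
      <Policies>
        <SizeBasedTriggeringPolicy size="4 MB"/>
      </Policies>
      <DefaultRolloverStrategy max="7"/>
    </RollingFile>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="RollingFile"/>
    </Root>
  </Loggers>
</Configuration>
{code}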


> Upgrade Solr to use log4j2 -- log4j 1 now officially end of life
> 
>
> Key: SOLR-7887
> URL: https://issues.apache.org/jira/browse/SOLR-7887
> Project: Solr
>  Issue Type: Task
>Affects Versions: 5.2.1
>Reporter: Shawn Heisey
>Assignee: Varun Thacker
>Priority: Major
> Attachments: SOLR-7887-WIP.patch, SOLR-7887-eoe-review.patch, 
> SOLR-7887-eoe-review.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch
>
>
> The logging services project has officially announced the EOL of log4j 1:
> https://blogs.apache.org/foundation/entry/apache_logging_services_project_announces
> In the official binary Jetty deployment, we use log4j 1.2 as our final 
> logging destination, so the admin UI has a log watcher that actually uses 
> log4j and java.util.logging classes.  That will need to be extended to add 
> log4j2.  I think that might be the largest pain point to this upgrade.
> There is some crossover between log4j2 and slf4j.  Figuring out exactly which 
> jars need to be in the lib/ext directory will take some research.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-7.x-Linux (64bit/jdk1.8.0_162) - Build # 1467 - Still Unstable!

2018-03-04 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Linux/1467/
Java: 64bit/jdk1.8.0_162 -XX:+UseCompressedOops -XX:+UseParallelGC

2 tests failed.
FAILED:  
junit.framework.TestSuite.org.apache.solr.cloud.TestSolrCloudWithSecureImpersonation

Error Message:
4 threads leaked from SUITE scope at 
org.apache.solr.cloud.TestSolrCloudWithSecureImpersonation: 1) 
Thread[id=24995, name=jetty-launcher-8279-thread-2-SendThread(127.0.0.1:33597), 
state=TIMED_WAITING, group=TGRP-TestSolrCloudWithSecureImpersonation] 
at java.lang.Thread.sleep(Native Method) at 
org.apache.zookeeper.client.StaticHostProvider.next(StaticHostProvider.java:105)
 at 
org.apache.zookeeper.ClientCnxn$SendThread.startConnect(ClientCnxn.java:1000)   
  at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1063)   
 2) Thread[id=24994, name=jetty-launcher-8279-thread-1-EventThread, 
state=WAITING, group=TGRP-TestSolrCloudWithSecureImpersonation] at 
sun.misc.Unsafe.park(Native Method) at 
java.util.concurrent.locks.LockSupport.park(LockSupport.java:175) at 
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
 at 
java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442) 
at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:502)
3) Thread[id=24996, name=jetty-launcher-8279-thread-2-EventThread, 
state=WAITING, group=TGRP-TestSolrCloudWithSecureImpersonation] at 
sun.misc.Unsafe.park(Native Method) at 
java.util.concurrent.locks.LockSupport.park(LockSupport.java:175) at 
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
 at 
java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442) 
at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:502)
4) Thread[id=24993, 
name=jetty-launcher-8279-thread-1-SendThread(127.0.0.1:33597), 
state=TIMED_WAITING, group=TGRP-TestSolrCloudWithSecureImpersonation] 
at java.lang.Thread.sleep(Native Method) at 
org.apache.zookeeper.client.StaticHostProvider.next(StaticHostProvider.java:105)
 at 
org.apache.zookeeper.ClientCnxn$SendThread.startConnect(ClientCnxn.java:1000)   
  at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1063)

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: 4 threads leaked from SUITE 
scope at org.apache.solr.cloud.TestSolrCloudWithSecureImpersonation: 
   1) Thread[id=24995, 
name=jetty-launcher-8279-thread-2-SendThread(127.0.0.1:33597), 
state=TIMED_WAITING, group=TGRP-TestSolrCloudWithSecureImpersonation]
at java.lang.Thread.sleep(Native Method)
at 
org.apache.zookeeper.client.StaticHostProvider.next(StaticHostProvider.java:105)
at 
org.apache.zookeeper.ClientCnxn$SendThread.startConnect(ClientCnxn.java:1000)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1063)
   2) Thread[id=24994, name=jetty-launcher-8279-thread-1-EventThread, 
state=WAITING, group=TGRP-TestSolrCloudWithSecureImpersonation]
at sun.misc.Unsafe.park(Native Method)
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at 
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
at 
java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:502)
   3) Thread[id=24996, name=jetty-launcher-8279-thread-2-EventThread, 
state=WAITING, group=TGRP-TestSolrCloudWithSecureImpersonation]
at sun.misc.Unsafe.park(Native Method)
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at 
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
at 
java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:502)
   4) Thread[id=24993, 
name=jetty-launcher-8279-thread-1-SendThread(127.0.0.1:33597), 
state=TIMED_WAITING, group=TGRP-TestSolrCloudWithSecureImpersonation]
at java.lang.Thread.sleep(Native Method)
at 
org.apache.zookeeper.client.StaticHostProvider.next(StaticHostProvider.java:105)
at 
org.apache.zookeeper.ClientCnxn$SendThread.startConnect(ClientCnxn.java:1000)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1063)
at __randomizedtesting.SeedInfo.seed([2B2B03219481452]:0)


FAILED:  
junit.framework.TestSuite.org.apache.solr.cloud.TestSolrCloudWithSecureImpersonation

Error Message:
There are still zombie threads that couldn't be terminated:1) 
Thread[id=24995, name=jetty-launcher-8279-thread-2-SendThread(127.0.0.1:33597), 
state=TIMED_WAITING, 

[jira] [Commented] (SOLR-11795) Add Solr metrics exporter for Prometheus

2018-03-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-11795?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385685#comment-16385685
 ] 

ASF subversion and git services commented on SOLR-11795:


Commit b07382c6e429fbb0db3a33d6d85044ee730a7fc0 in lucene-solr's branch 
refs/heads/branch_7x from koji
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=b07382c ]

SOLR-11795: Add Solr metrics exporter for Prometheus


> Add Solr metrics exporter for Prometheus
> 
>
> Key: SOLR-11795
> URL: https://issues.apache.org/jira/browse/SOLR-11795
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: metrics
>Affects Versions: 7.2
>Reporter: Minoru Osuka
>Assignee: Koji Sekiguchi
>Priority: Minor
> Fix For: master (8.0), 7.3
>
> Attachments: SOLR-11795-10.patch, SOLR-11795-11.patch, 
> SOLR-11795-2.patch, SOLR-11795-3.patch, SOLR-11795-4.patch, 
> SOLR-11795-5.patch, SOLR-11795-6.patch, SOLR-11795-7.patch, 
> SOLR-11795-8.patch, SOLR-11795-9.patch, SOLR-11795-dev-tools.patch, 
> SOLR-11795.patch, solr-dashboard.png, solr-exporter-diagram.png
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> I'd like to monitor Solr using Prometheus and Grafana.
> I've already created a Solr metrics exporter for Prometheus. I'd like to 
> contribute it to the contrib directory if you don't mind.
> !solr-exporter-diagram.png|thumbnail!
> !solr-dashboard.png|thumbnail!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-11795) Add Solr metrics exporter for Prometheus

2018-03-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-11795?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385658#comment-16385658
 ] 

ASF subversion and git services commented on SOLR-11795:


Commit 6d66fc04b23358f58ae13020a399013e13063b4f in lucene-solr's branch 
refs/heads/master from koji
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=6d66fc0 ]

SOLR-11795: Add Solr metrics exporter for Prometheus


> Add Solr metrics exporter for Prometheus
> 
>
> Key: SOLR-11795
> URL: https://issues.apache.org/jira/browse/SOLR-11795
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: metrics
>Affects Versions: 7.2
>Reporter: Minoru Osuka
>Assignee: Koji Sekiguchi
>Priority: Minor
> Fix For: master (8.0), 7.3
>
> Attachments: SOLR-11795-10.patch, SOLR-11795-11.patch, 
> SOLR-11795-2.patch, SOLR-11795-3.patch, SOLR-11795-4.patch, 
> SOLR-11795-5.patch, SOLR-11795-6.patch, SOLR-11795-7.patch, 
> SOLR-11795-8.patch, SOLR-11795-9.patch, SOLR-11795-dev-tools.patch, 
> SOLR-11795.patch, solr-dashboard.png, solr-exporter-diagram.png
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> I'd like to monitor Solr using Prometheus and Grafana.
> I've already created a Solr metrics exporter for Prometheus. I'd like to 
> contribute it to the contrib directory if you don't mind.
> !solr-exporter-diagram.png|thumbnail!
> !solr-dashboard.png|thumbnail!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8190) Replace dependency on LegacyCell for setting pruneLeafyBranches on RecursivePrefixTreeStrategy

2018-03-04 Thread Ignacio Vera (JIRA)

 [ 
https://issues.apache.org/jira/browse/LUCENE-8190?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ignacio Vera updated LUCENE-8190:
-
Summary: Replace dependency on LegacyCell for setting pruneLeafyBranches on 
RecursivePrefixTreeStrategy  (was: Replace dendency on LegacyCell for setting 
pruneLeafyBranches on RecursivePrefixTreeStrategy)

> Replace dependency on LegacyCell for setting pruneLeafyBranches on 
> RecursivePrefixTreeStrategy
> --
>
> Key: LUCENE-8190
> URL: https://issues.apache.org/jira/browse/LUCENE-8190
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: modules/spatial-extras
>Reporter: Ignacio Vera
>Priority: Major
> Attachments: LUCENE-8190.patch, LUCENE-8190.patch
>
>
> The setting {{pruneLeafyBranches}} on {{RecursivePrefixTreeStrategy}} depends 
> on the abstract class {{LegacyCell}}, and therefore trees like the newly added 
> {{S2PrefixTree}} cannot benefit from this optimization.
> It is proposed to add a new specialized sub-interface of the {{Cell}} interface 
> and make the setting depend on it instead of {{LegacyCell}}.
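
A sketch of what such an interface might look like (the names below are 
hypothetical illustrations, not taken from the attached patches):

{code:java}
import org.apache.lucene.spatial.prefix.tree.Cell;

// Hypothetical sketch: a small capability interface that any prefix-tree cell
// (e.g. one produced by S2PrefixTree) can implement, so that
// RecursivePrefixTreeStrategy can enable pruneLeafyBranches without checking
// for the abstract LegacyCell class.
public interface PrunableCell extends Cell {
  /** Number of sub-cells this cell splits into; used by the pruning heuristic. */
  int getSubCellCount();
}
{code}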



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS-EA] Lucene-Solr-6.6-Linux (64bit/jdk-10-ea+43) - Build # 192 - Unstable!

2018-03-04 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-6.6-Linux/192/
Java: 64bit/jdk-10-ea+43 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC

70 tests failed.
FAILED:  
junit.framework.TestSuite.org.apache.solr.cloud.OverseerCollectionConfigSetProcessorTest

Error Message:
 Mockito cannot mock this class: class org.apache.solr.cloud.OverseerTaskQueue. 
 Mockito can only mock non-private & non-final classes. If you're not sure why 
you're getting this error, please report to the mailing list.   Java
   : 10 JVM vendor name: "Oracle Corporation" JVM vendor version : 10+43 
JVM name   : OpenJDK 64-Bit Server VM JVM version: 10+43 JVM 
info   : mixed mode OS name: Linux OS version : 
4.13.0-36-generic   Underlying exception : 
java.lang.UnsupportedOperationException: Cannot define class using reflection

Stack Trace:
org.mockito.exceptions.base.MockitoException: 
Mockito cannot mock this class: class org.apache.solr.cloud.OverseerTaskQueue.

Mockito can only mock non-private & non-final classes.
If you're not sure why you're getting this error, please report to the mailing 
list.


Java   : 10
JVM vendor name: "Oracle Corporation"
JVM vendor version : 10+43
JVM name   : OpenJDK 64-Bit Server VM
JVM version: 10+43
JVM info   : mixed mode
OS name: Linux
OS version : 4.13.0-36-generic


Underlying exception : java.lang.UnsupportedOperationException: Cannot define 
class using reflection
at __randomizedtesting.SeedInfo.seed([F02FF9CB9F842A66]:0)
at 
org.apache.solr.cloud.OverseerCollectionConfigSetProcessorTest.setUpOnce(OverseerCollectionConfigSetProcessorTest.java:103)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1713)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:847)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:844)
Caused by: java.lang.UnsupportedOperationException: Cannot define class using 
reflection
at 
net.bytebuddy.dynamic.loading.ClassInjector$UsingReflection$Dispatcher$Unavailable.defineClass(ClassInjector.java:819)
at 
net.bytebuddy.dynamic.loading.ClassInjector$UsingReflection.inject(ClassInjector.java:183)
at 
net.bytebuddy.dynamic.loading.ClassLoadingStrategy$Default$InjectionDispatcher.load(ClassLoadingStrategy.java:187)
at 
net.bytebuddy.dynamic.TypeResolutionStrategy$Passive.initialize(TypeResolutionStrategy.java:79)
at 
net.bytebuddy.dynamic.DynamicType$Default$Unloaded.load(DynamicType.java:4352)
at 
org.mockito.internal.creation.bytebuddy.SubclassBytecodeGenerator.mockClass(SubclassBytecodeGenerator.java:94)
at 

[JENKINS] Lucene-Solr-BadApples-Tests-master - Build # 5 - Still Unstable

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-master/5/

4 tests failed.
FAILED:  org.apache.solr.ltr.TestLTRReRankingPipeline.testDifferentTopN

Error Message:
expected:<1.0> but was:<0.0>

Stack Trace:
java.lang.AssertionError: expected:<1.0> but was:<0.0>
at 
__randomizedtesting.SeedInfo.seed([C3F479E35220566C:32550BB3679B9CFE]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.failNotEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:443)
at org.junit.Assert.assertEquals(Assert.java:512)
at 
org.apache.solr.ltr.TestLTRReRankingPipeline.testDifferentTopN(TestLTRReRankingPipeline.java:256)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)


FAILED:  org.apache.solr.cloud.ZkControllerTest.testPublishAndWaitForDownStates

Error Message:
The ZkController.publishAndWaitForDownStates should have timed out but it didn't

Stack Trace:
java.lang.AssertionError: The ZkController.publishAndWaitForDownStates should 
have timed out but it didn't
at 
__randomizedtesting.SeedInfo.seed([8078486235829B49:A777248E9415D476]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at 

[jira] [Commented] (SOLR-7887) Upgrade Solr to use log4j2 -- log4j 1 now officially end of life

2018-03-04 Thread Erick Erickson (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-7887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385637#comment-16385637
 ] 

Erick Erickson commented on SOLR-7887:
--

Unfortunately, I've been concentrating on getting unit tests to run and the 
like. With the current patch, when I try to start a regular Solr instance, I get:


2018-03-04 21:58:00,789 main ERROR Unable to invoke factory method in class 
org.apache.logging.log4j.core.appender.RollingFileAppender for element 
RollingFile: java.lang.IllegalStateException: No factory method found for class 
org.apache.logging.log4j.core.appender.RollingFileAppender 
java.lang.IllegalStateException: No factory method found for class 
org.apache.logging.log4j.core.appender.RollingFileAppender
at 
org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.findFactoryMethod(PluginBuilder.java:229)
at 
org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:134)
at 
org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:958)
at 
org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:898)
at 
org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:890)
at 
org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:513)
at 
org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:237)
at 
org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:249)
at 
org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:545)
at 
org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:617)
at 
org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:634)
at 
org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:229)
at 
org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:153)
at 
org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:45)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:194)
at 
org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:122)
at 
org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:43)
at 
org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:46)
at 
org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:29)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:358)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
at org.apache.solr.util.SolrCLI.(SolrCLI.java:227)

2018-03-04 21:58:00,791 main ERROR Null object returned for RollingFile in 
Appenders.
2018-03-04 21:58:00,799 main ERROR Unable to locate appender "RollingFile" for 
logger config "root"

> Upgrade Solr to use log4j2 -- log4j 1 now officially end of life
> 
>
> Key: SOLR-7887
> URL: https://issues.apache.org/jira/browse/SOLR-7887
> Project: Solr
>  Issue Type: Task
>Affects Versions: 5.2.1
>Reporter: Shawn Heisey
>Assignee: Varun Thacker
>Priority: Major
> Attachments: SOLR-7887-WIP.patch, SOLR-7887-eoe-review.patch, 
> SOLR-7887-eoe-review.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch
>
>
> The logging services project has officially announced the EOL of log4j 1:
> https://blogs.apache.org/foundation/entry/apache_logging_services_project_announces
> In the official binary Jetty deployment, we use log4j 1.2 as our final 
> logging destination, so the admin UI has a log watcher that actually uses 
> log4j and java.util.logging classes.  That will need to be extended to add 
> log4j2.  I think that might be the largest pain point to this upgrade.
> There is some crossover between log4j2 and slf4j.  Figuring out exactly which 
> jars need to be in the lib/ext directory will take some research.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-7887) Upgrade Solr to use log4j2 -- log4j 1 now officially end of life

2018-03-04 Thread Erick Erickson (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-7887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385612#comment-16385612
 ] 

Erick Erickson commented on SOLR-7887:
--

Here's what I have at this point. Everything is fine except one test failure 
(TestLogWatcher). But the odd thing is that it _only_ fails when running 'ant 
test'. I can beast it over and over and over, I can run it individually or with 
test-nocompile etc. etc. etc.

We need to figure this out or BadApple it before committing, I'd prefer the 
former of course.

> Upgrade Solr to use log4j2 -- log4j 1 now officially end of life
> 
>
> Key: SOLR-7887
> URL: https://issues.apache.org/jira/browse/SOLR-7887
> Project: Solr
>  Issue Type: Task
>Affects Versions: 5.2.1
>Reporter: Shawn Heisey
>Assignee: Varun Thacker
>Priority: Major
> Attachments: SOLR-7887-WIP.patch, SOLR-7887-eoe-review.patch, 
> SOLR-7887-eoe-review.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch
>
>
> The logging services project has officially announced the EOL of log4j 1:
> https://blogs.apache.org/foundation/entry/apache_logging_services_project_announces
> In the official binary Jetty deployment, we use log4j 1.2 as our final 
> logging destination, so the admin UI has a log watcher that actually uses 
> log4j and java.util.logging classes.  That will need to be extended to add 
> log4j2.  I think that might be the largest pain point to this upgrade.
> There is some crossover between log4j2 and slf4j.  Figuring out exactly which 
> jars need to be in the lib/ext directory will take some research.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-7887) Upgrade Solr to use log4j2 -- log4j 1 now officially end of life

2018-03-04 Thread Erick Erickson (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-7887?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Erick Erickson updated SOLR-7887:
-
Attachment: SOLR-7887.patch

> Upgrade Solr to use log4j2 -- log4j 1 now officially end of life
> 
>
> Key: SOLR-7887
> URL: https://issues.apache.org/jira/browse/SOLR-7887
> Project: Solr
>  Issue Type: Task
>Affects Versions: 5.2.1
>Reporter: Shawn Heisey
>Assignee: Varun Thacker
>Priority: Major
> Attachments: SOLR-7887-WIP.patch, SOLR-7887-eoe-review.patch, 
> SOLR-7887-eoe-review.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, 
> SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch, SOLR-7887.patch
>
>
> The logging services project has officially announced the EOL of log4j 1:
> https://blogs.apache.org/foundation/entry/apache_logging_services_project_announces
> In the official binary Jetty deployment, we use log4j 1.2 as our final 
> logging destination, so the admin UI has a log watcher that actually uses 
> log4j and java.util.logging classes.  That will need to be extended to add 
> log4j2.  I think that might be the largest pain point to this upgrade.
> There is some crossover between log4j2 and slf4j.  Figuring out exactly which 
> jars need to be in the lib/ext directory will take some research.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-Tests-7.x - Build # 481 - Unstable

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Tests-7.x/481/

3 tests failed.
FAILED:  
org.apache.lucene.codecs.lucene54.TestLucene54DocValuesFormat.testSortedNumericsSingleValuedMissingVsStoredFields

Error Message:
Test abandoned because suite timeout was reached.

Stack Trace:
java.lang.Exception: Test abandoned because suite timeout was reached.
at __randomizedtesting.SeedInfo.seed([C5389FDCE7D3ACCC]:0)


FAILED:  
junit.framework.TestSuite.org.apache.lucene.codecs.lucene54.TestLucene54DocValuesFormat

Error Message:
Suite timeout exceeded (>= 720 msec).

Stack Trace:
java.lang.Exception: Suite timeout exceeded (>= 720 msec).
at __randomizedtesting.SeedInfo.seed([C5389FDCE7D3ACCC]:0)


FAILED:  org.apache.solr.cloud.autoscaling.sim.TestLargeCluster.testNodeLost

Error Message:
/autoscaling/nodeAdded/127.0.0.1:10119_solr

Stack Trace:
java.util.NoSuchElementException: /autoscaling/nodeAdded/127.0.0.1:10119_solr
at 
__randomizedtesting.SeedInfo.seed([B4C91F39E209F767:BDCD1C761E392E1]:0)
at 
org.apache.solr.cloud.autoscaling.sim.SimDistribStateManager$Node.removeChild(SimDistribStateManager.java:163)
at 
org.apache.solr.cloud.autoscaling.sim.SimDistribStateManager$Node.removeEphemeralChildren(SimDistribStateManager.java:195)
at 
org.apache.solr.cloud.autoscaling.sim.SimDistribStateManager$Node.removeEphemeralChildren(SimDistribStateManager.java:197)
at 
org.apache.solr.cloud.autoscaling.sim.SimDistribStateManager$Node.removeEphemeralChildren(SimDistribStateManager.java:197)
at 
org.apache.solr.cloud.autoscaling.sim.SimClusterStateProvider.simRemoveNode(SimClusterStateProvider.java:253)
at 
org.apache.solr.cloud.autoscaling.sim.SimCloudManager.simRemoveNode(SimCloudManager.java:273)
at 
org.apache.solr.cloud.autoscaling.sim.SimCloudManager.simRemoveRandomNodes(SimCloudManager.java:293)
at 
org.apache.solr.cloud.autoscaling.sim.SimSolrCloudTestCase.setUp(SimSolrCloudTestCase.java:161)
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:968)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
  

[jira] [Commented] (SOLR-12008) Settle a location for the "correct" log4j2.xml file.

2018-03-04 Thread David Smiley (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-12008?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385586#comment-16385586
 ] 

David Smiley commented on SOLR-12008:
-

If we want them to be identical, we could keep solr/server/resources and have 
"ant server" ensure the other 2 get copied?

> Settle a location for the "correct" log4j2.xml file.
> 
>
> Key: SOLR-12008
> URL: https://issues.apache.org/jira/browse/SOLR-12008
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: logging
>Reporter: Erick Erickson
>Assignee: Erick Erickson
>Priority: Major
>
> As part of SOLR-11934 I started looking at log4j.properties files. Waaay back 
> in 2015, the %C in "/solr/server/resources/log4j.properties" was changed to 
> use %c, but the file in "solr/example/resources/log4j.properties" was not 
> changed. That got me looking around, and there are a bunch of 
> log4j.properties files:
> ./solr/core/src/test-files/log4j.properties
> ./solr/example/resources/log4j.properties
> ./solr/solrj/src/test-files/log4j.properties
> ./solr/server/resources/log4j.properties
> ./solr/server/scripts/cloud-scripts/log4j.properties
> ./solr/contrib/dataimporthandler/src/test-files/log4j.properties
> ./solr/contrib/clustering/src/test-files/log4j.properties
> ./solr/contrib/ltr/src/test-files/log4j.properties
> ./solr/test-framework/src/test-files/log4j.properties
> Why do we have so many? After the log4j2 ticket gets checked in (SOLR-7887) I 
> propose the logging configuration files get consolidated. The question is 
> "how far"? 
> I at least want to get rid of the one in solr/example, users should use the 
> one in server/resources. Having to maintain these two separately is asking 
> for trouble.
> [~markrmil...@gmail.com] Do you have any wisdom on the properties file in 
> server/scripts/cloud-scripts?
> Anyone else who has a clue about why the other properties files were created, 
> especially the ones in contrib?
> And what about all the ones in various test-files directories? People didn't 
> create them for no reason, and I don't want to rediscover that it's a real 
> pain to try to re-use the one in server/resources for instance.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-SmokeRelease-master - Build # 970 - Still Failing

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-master/970/

No tests ran.

Build Log:
[...truncated 28738 lines...]
prepare-release-no-sign:
[mkdir] Created dir: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/build/smokeTestRelease/dist
 [copy] Copying 491 files to 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/build/smokeTestRelease/dist/lucene
 [copy] Copying 215 files to 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/build/smokeTestRelease/dist/solr
   [smoker] Java 1.8 JAVA_HOME=/home/jenkins/tools/java/latest1.8
   [smoker] Java 9 JAVA_HOME=/home/jenkins/tools/java/latest1.9
   [smoker] NOTE: output encoding is UTF-8
   [smoker] 
   [smoker] Load release URL 
"file:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/build/smokeTestRelease/dist/"...
   [smoker] 
   [smoker] Test Lucene...
   [smoker]   test basics...
   [smoker]   get KEYS
   [smoker] 0.2 MB in 0.02 sec (15.4 MB/sec)
   [smoker]   check changes HTML...
   [smoker]   download lucene-8.0.0-src.tgz...
   [smoker] 30.2 MB in 0.03 sec (865.8 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   download lucene-8.0.0.tgz...
   [smoker] 73.3 MB in 0.09 sec (842.7 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   download lucene-8.0.0.zip...
   [smoker] 83.8 MB in 0.09 sec (892.6 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   unpack lucene-8.0.0.tgz...
   [smoker] verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker] test demo with 1.8...
   [smoker]   got 6251 hits for query "lucene"
   [smoker] checkindex with 1.8...
   [smoker] test demo with 9...
   [smoker]   got 6251 hits for query "lucene"
   [smoker] checkindex with 9...
   [smoker] check Lucene's javadoc JAR
   [smoker]   unpack lucene-8.0.0.zip...
   [smoker] verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker] test demo with 1.8...
   [smoker]   got 6251 hits for query "lucene"
   [smoker] checkindex with 1.8...
   [smoker] test demo with 9...
   [smoker]   got 6251 hits for query "lucene"
   [smoker] checkindex with 9...
   [smoker] check Lucene's javadoc JAR
   [smoker]   unpack lucene-8.0.0-src.tgz...
   [smoker] make sure no JARs/WARs in src dist...
   [smoker] run "ant validate"
   [smoker] run tests w/ Java 8 and testArgs='-Dtests.badapples=false 
-Dtests.slow=false'...
   [smoker] test demo with 1.8...
   [smoker]   got 212 hits for query "lucene"
   [smoker] checkindex with 1.8...
   [smoker] generate javadocs w/ Java 8...
   [smoker] 
   [smoker] Crawl/parse...
   [smoker] 
   [smoker] Verify...
   [smoker] run tests w/ Java 9 and testArgs='-Dtests.badapples=false 
-Dtests.slow=false'...
   [smoker] test demo with 9...
   [smoker]   got 212 hits for query "lucene"
   [smoker] checkindex with 9...
   [smoker]   confirm all releases have coverage in TestBackwardsCompatibility
   [smoker] find all past Lucene releases...
   [smoker] run TestBackwardsCompatibility..
   [smoker] success!
   [smoker] 
   [smoker] Test Solr...
   [smoker]   test basics...
   [smoker]   get KEYS
   [smoker] 0.2 MB in 0.00 sec (255.5 MB/sec)
   [smoker]   check changes HTML...
   [smoker]   download solr-8.0.0-src.tgz...
   [smoker] 52.6 MB in 0.11 sec (500.4 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   download solr-8.0.0.tgz...
   [smoker] 151.0 MB in 0.29 sec (524.7 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   download solr-8.0.0.zip...
   [smoker] 152.0 MB in 1.19 sec (127.3 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   unpack solr-8.0.0.tgz...
   [smoker] verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker] unpack lucene-8.0.0.tgz...
   [smoker]   **WARNING**: skipping check of 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/build/smokeTestRelease/tmp/unpack/solr-8.0.0/contrib/dataimporthandler-extras/lib/javax.mail-1.5.1.jar:
 it has javax.* classes
   [smoker]   **WARNING**: skipping check of 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/build/smokeTestRelease/tmp/unpack/solr-8.0.0/contrib/dataimporthandler-extras/lib/activation-1.1.1.jar:
 it has javax.* classes
   [smoker] copying unpacked distribution for Java 8 ...
   [smoker] test solr example w/ Java 8...
   [smoker]   start Solr instance 
(log=/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-master/lucene/build/smokeTestRelease/tmp/unpack/solr-8.0.0-java8/solr-example.log)...
   [smoker] No process found for Solr node running on port 8983
   [smoker]   Running techproducts example on port 8983 from 

[jira] [Commented] (SOLR-7034) Consider allowing any node to become leader, regardless of their last published state.

2018-03-04 Thread Cao Manh Dat (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-7034?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385579#comment-16385579
 ] 

Cao Manh Dat commented on SOLR-7034:


After SOLR-12011 gets committed, the only thing left for this issue is removing 
the useless FORCEPREPAREFORLEADERSHIP API and related tests.

> Consider allowing any node to become leader, regardless of their last 
> published state.
> --
>
> Key: SOLR-7034
> URL: https://issues.apache.org/jira/browse/SOLR-7034
> Project: Solr
>  Issue Type: Bug
>  Components: SolrCloud
>Reporter: Mark Miller
>Assignee: Mark Miller
>Priority: Major
> Fix For: 5.2, 6.0
>
> Attachments: SOLR-7034.patch, SOLR-7034.patch, SOLR-7034.patch
>
>
> Now that we allow a min replication param for updates, I think it's time to 
> loosen this up. Currently, you can end up in a state where no one in a shard 
> thinks they can be leader, and so you get into a fast, ugly infinite loop 
> trying to pick the leader.
> We should let anyone that is able to properly sync with the available 
> replicas become leader if that process succeeds.
> The previous strategy was to account for the case of not having enough 
> replicas after a machine loss, to ensure you don't lose the data. The idea was 
> that you should stop the cluster to avoid losing data, repair, and get all 
> your replicas involved in a leadership election. Instead, we should favor 
> carrying on, and those that want to ensure they don't lose data due to major 
> replica loss should use the min replication update param.
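
For readers following along: the "min replication update param" referred to above is 
the min_rf request parameter. The sketch below only illustrates the client-side 
contract (the SolrJ builder calls and the "rf" response-header key are assumptions 
based on documented behavior, not something taken from this issue's patch): Solr 
reports how many replicas acknowledged the update, and the client decides what to do 
when that number is too low.

{code:java}
// Hedged SolrJ sketch (not part of this patch): send an update with min_rf so
// Solr reports the achieved replication factor, then enforce the policy on the
// client side. Builder/response details may vary slightly across SolrJ versions.
import org.apache.solr.client.solrj.impl.CloudSolrClient;
import org.apache.solr.client.solrj.request.UpdateRequest;
import org.apache.solr.client.solrj.response.UpdateResponse;
import org.apache.solr.common.SolrInputDocument;

public class MinRfExample {
  public static void main(String[] args) throws Exception {
    try (CloudSolrClient client =
             new CloudSolrClient.Builder().withZkHost("localhost:9983").build()) {
      SolrInputDocument doc = new SolrInputDocument();
      doc.addField("id", "1");

      UpdateRequest req = new UpdateRequest();
      req.add(doc);
      req.setParam("min_rf", "2");  // ask Solr to report the achieved replication factor

      UpdateResponse rsp = req.process(client, "myCollection");
      Object rf = rsp.getResponseHeader().get("rf");  // assumed key, per the docs
      if (rf == null || Integer.parseInt(rf.toString()) < 2) {
        // client-side policy: retry later, alert, or treat the update as at-risk
        System.err.println("Update reached fewer than 2 replicas: rf=" + rf);
      }
    }
  }
}
{code}

With this contract in place, the cluster can keep electing a leader aggressively and 
leave the "don't lose data after major replica loss" guarantee to clients that opt 
into min_rf, which is the trade-off the description above argues for.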



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-7034) Consider allowing any node to become leader, regardless of their last published state.

2018-03-04 Thread Cao Manh Dat (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-7034?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Cao Manh Dat updated SOLR-7034:
---
Attachment: SOLR-7034.patch

> Consider allowing any node to become leader, regardless of their last 
> published state.
> --
>
> Key: SOLR-7034
> URL: https://issues.apache.org/jira/browse/SOLR-7034
> Project: Solr
>  Issue Type: Bug
>  Components: SolrCloud
>Reporter: Mark Miller
>Assignee: Mark Miller
>Priority: Major
> Fix For: 5.2, 6.0
>
> Attachments: SOLR-7034.patch, SOLR-7034.patch, SOLR-7034.patch
>
>
> Now that we allow a min replication param for updates, I think it's time to 
> loosen this up. Currently, you can end up in a state where no one in a shard 
> thinks they can be leader, and so you get into a fast, ugly infinite loop 
> trying to pick the leader.
> We should let anyone that is able to properly sync with the available 
> replicas become leader if that process succeeds.
> The previous strategy was to account for the case of not having enough 
> replicas after a machine loss, to ensure you don't lose the data. The idea was 
> that you should stop the cluster to avoid losing data, repair, and get all 
> your replicas involved in a leadership election. Instead, we should favor 
> carrying on, and those that want to ensure they don't lose data due to major 
> replica loss should use the min replication update param.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS-EA] Lucene-Solr-master-Linux (64bit/jdk-10-ea+43) - Build # 21573 - Still Unstable!

2018-03-04 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/21573/
Java: 64bit/jdk-10-ea+43 -XX:+UseCompressedOops -XX:+UseParallelGC

3 tests failed.
FAILED:  org.apache.solr.cloud.MoveReplicaHDFSTest.testFailedMove

Error Message:


Stack Trace:
java.lang.AssertionError
at 
__randomizedtesting.SeedInfo.seed([F8C80E019B54EAA1:5205DDF32C873F71]:0)
at org.junit.Assert.fail(Assert.java:92)
at org.junit.Assert.assertTrue(Assert.java:43)
at org.junit.Assert.assertFalse(Assert.java:68)
at org.junit.Assert.assertFalse(Assert.java:79)
at 
org.apache.solr.cloud.MoveReplicaTest.testFailedMove(MoveReplicaTest.java:303)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:844)


FAILED:  
org.apache.solr.cloud.TestCloudConsistency.testOutOfSyncReplicasCannotBecomeLeaderAfterRestart

Error Message:
Timeout waiting for active collection null Live Nodes: [127.0.0.1:33725_solr, 
127.0.0.1:38885_solr, 127.0.0.1:44303_solr, 127.0.0.1:46499_solr] Last 

[jira] [Commented] (SOLR-11795) Add Solr metrics exporter for Prometheus

2018-03-04 Thread Koji Sekiguchi (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-11795?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385552#comment-16385552
 ] 

Koji Sekiguchi commented on SOLR-11795:
---

Uwe's suggestion helped us check that this patch works on various platforms 
without causing anyone trouble. In fact, the Java 9 Jenkins run found that the 
SnakeYAML code uses reflection in illegal ways, which we hadn't noticed before 
committing.

... and the results look good so far. I'd like to commit this to master and 
branch_7x soon.

> Add Solr metrics exporter for Prometheus
> 
>
> Key: SOLR-11795
> URL: https://issues.apache.org/jira/browse/SOLR-11795
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: metrics
>Affects Versions: 7.2
>Reporter: Minoru Osuka
>Assignee: Koji Sekiguchi
>Priority: Minor
> Fix For: master (8.0), 7.3
>
> Attachments: SOLR-11795-10.patch, SOLR-11795-11.patch, 
> SOLR-11795-2.patch, SOLR-11795-3.patch, SOLR-11795-4.patch, 
> SOLR-11795-5.patch, SOLR-11795-6.patch, SOLR-11795-7.patch, 
> SOLR-11795-8.patch, SOLR-11795-9.patch, SOLR-11795-dev-tools.patch, 
> SOLR-11795.patch, solr-dashboard.png, solr-exporter-diagram.png
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> I 'd like to monitor Solr using Prometheus and Grafana.
> I've already created Solr metrics exporter for Prometheus. I'd like to 
> contribute to contrib directory if you don't mind.
> !solr-exporter-diagram.png|thumbnail!
> !solr-dashboard.png|thumbnail!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8190) Replace dependency on LegacyCell for setting pruneLeafyBranches on RecursivePrefixTreeStrategy

2018-03-04 Thread David Smiley (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385547#comment-16385547
 ] 

David Smiley commented on LUCENE-8190:
--

+1 to the patch.  Thanks Ignacio!

> Replace dependency on LegacyCell for setting pruneLeafyBranches on 
> RecursivePrefixTreeStrategy
> 
>
> Key: LUCENE-8190
> URL: https://issues.apache.org/jira/browse/LUCENE-8190
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: modules/spatial-extras
>Reporter: Ignacio Vera
>Priority: Major
> Attachments: LUCENE-8190.patch, LUCENE-8190.patch
>
>
> The setting {{pruneLeafyBranches}} on {{RecursivePrefixTreeStrategy}} depends 
> on the abstract class {{LegacyCell}}, and therefore trees like the newly added 
> {{S2PrefixTree}} cannot benefit from this optimization.
> It is proposed to add a new specialized sub-interface of the {{Cell}} 
> interface and make the setting depend on it instead of {{LegacyCell}}.
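
To make the proposal concrete, here is a hedged sketch of what such a specialized 
interface could look like. The name CellCanPrune and the method getSubCellsSize() are 
illustrative assumptions, not necessarily what the attached patch uses; the point is 
only the shape of the change.

{code:java}
// Illustrative only -- the interface and method names are assumptions,
// not taken from the LUCENE-8190 patch.
package org.apache.lucene.spatial.prefix.tree;

/**
 * A Cell whose tree can report how many sub-cells it splits into, which is the
 * piece of information RecursivePrefixTreeStrategy needs in order to decide
 * whether a "leafy" branch can be collapsed (pruneLeafyBranches).
 */
public interface CellCanPrune extends Cell {

  /** Number of sub-cells this cell is divided into at the next level. */
  int getSubCellsSize();
}
{code}

With something like that in place, the pruning code would test for the capability 
interface rather than {{cell instanceof LegacyCell}}, so any tree (including 
{{S2PrefixTree}}) can opt in without extending the legacy abstract class.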



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Assigned] (SOLR-11336) DocBasedVersionConstraintsProcessor should be more extensible and support multiple version fields

2018-03-04 Thread David Smiley (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-11336?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Smiley reassigned SOLR-11336:
---

Assignee: David Smiley

> DocBasedVersionConstraintsProcessor should be more extensible and support 
> multiple version fields
> -
>
> Key: SOLR-11336
> URL: https://issues.apache.org/jira/browse/SOLR-11336
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>Affects Versions: master (8.0)
>Reporter: Michael Braun
>Assignee: David Smiley
>Priority: Minor
> Attachments: SOLR-11336.patch, SOLR-11336.patch, SOLR-11336.patch
>
>
> DocBasedVersionConstraintsProcessor supports allowing document updates only 
> if the new version is greater than the old. However, if any behavior needs to 
> be extended or changed in minor ways, the entire class has to be copied and 
> slightly modified rather than subclassed with only the method in question 
> overridden. 
> It would be nice if DocBasedVersionConstraintsProcessor stood on its own as a 
> non-private class. In addition, certain methods (such as pieces of 
> isVersionNewEnough) should be broken out into separate methods so that 
> someone can extend the processor class and override what it means for a new 
> version to be accepted (allowing equal versions through? What if the new 
> version is a lower, not greater, number?). 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-11336) DocBasedVersionConstraintsProcessor should be more extensible and support multiple version fields

2018-03-04 Thread David Smiley (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-11336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385529#comment-16385529
 ] 

David Smiley commented on SOLR-11336:
-

bq. Regarding versionFields vs splitting versionField, what do you mean?

I only mean it in terms of configuration -- that's all.  Splitting versionField 
by comma is the alternative I suggest.  It's simpler to implement and simpler 
to understand the parameters this URP takes (1 required versus an XOR between 
2).  It just takes a white lie that "versionField" is singular when in fact it 
could be multiple.  I think it's not a big deal considering that the need for 
more than one is uncommon.  That's all... I'm not particularly opinionated 
about this, so if you'd rather keep it the way you've already coded it, that's 
fine.

bq. unless there was a version of the processor (which can now be accomplished 
by subclassing and overriding versionInUpdateIsAcceptable) that accepted 
greater than or equal numbers and passed them along until the last one

Admittedly, I don't have an opinion because I haven't thought through what you 
say; I just want to allow this URP to be more subclass-able to allow more 
customization.  You've thought through this and I trust your judgement.



> DocBasedVersionConstraintsProcessor should be more extensible and support 
> multiple version fields
> -
>
> Key: SOLR-11336
> URL: https://issues.apache.org/jira/browse/SOLR-11336
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>Affects Versions: master (8.0)
>Reporter: Michael Braun
>Priority: Minor
> Attachments: SOLR-11336.patch, SOLR-11336.patch, SOLR-11336.patch
>
>
> DocBasedVersionConstraintsProcessor supports allowing document updates only 
> if the new version is greater than the old. However, if any behavior needs to 
> be extended or changed in minor ways, the entire class has to be copied and 
> slightly modified rather than subclassed with only the method in question 
> overridden. 
> It would be nice if DocBasedVersionConstraintsProcessor stood on its own as a 
> non-private class. In addition, certain methods (such as pieces of 
> isVersionNewEnough) should be broken out into separate methods so that 
> someone can extend the processor class and override what it means for a new 
> version to be accepted (allowing equal versions through? What if the new 
> version is a lower, not greater, number?). 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-7.x-Linux (64bit/jdk1.8.0_162) - Build # 1466 - Unstable!

2018-03-04 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Linux/1466/
Java: 64bit/jdk1.8.0_162 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC

12 tests failed.
FAILED:  org.apache.solr.cloud.LIRRollingUpdatesTest.testNewReplicaOldLeader

Error Message:
Replica core_node44 is not put as DOWN null Live Nodes: [127.0.0.1:33383_solr, 
127.0.0.1:34161_solr, 127.0.0.1:42637_solr] Last available state: 
DocCollection(testNewReplicaOldLeader//collections/testNewReplicaOldLeader/state.json/6)={
   "pullReplicas":"0",   "replicationFactor":"2",   "shards":{"shard1":{   
"range":"8000-7fff",   "state":"active",   "replicas":{ 
"core_node42":{   "core":"testNewReplicaOldLeader_shard1_replica_n41",  
 "base_url":"https://127.0.0.1:42637/solr;,   
"node_name":"127.0.0.1:42637_solr",   "state":"active",   
"type":"NRT",   "leader":"true"}, "core_node44":{   
"core":"testNewReplicaOldLeader_shard1_replica_n43",   
"base_url":"https://127.0.0.1:34161/solr;,   
"node_name":"127.0.0.1:34161_solr",   "state":"recovering",   
"type":"NRT",   "router":{"name":"compositeId"},   "maxShardsPerNode":"1",  
 "autoAddReplicas":"false",   "nrtReplicas":"2",   "tlogReplicas":"0"}

Stack Trace:
java.lang.AssertionError: Replica core_node44 is not put as DOWN
null
Live Nodes: [127.0.0.1:33383_solr, 127.0.0.1:34161_solr, 127.0.0.1:42637_solr]
Last available state: 
DocCollection(testNewReplicaOldLeader//collections/testNewReplicaOldLeader/state.json/6)={
  "pullReplicas":"0",
  "replicationFactor":"2",
  "shards":{"shard1":{
  "range":"8000-7fff",
  "state":"active",
  "replicas":{
"core_node42":{
  "core":"testNewReplicaOldLeader_shard1_replica_n41",
  "base_url":"https://127.0.0.1:42637/solr;,
  "node_name":"127.0.0.1:42637_solr",
  "state":"active",
  "type":"NRT",
  "leader":"true"},
"core_node44":{
  "core":"testNewReplicaOldLeader_shard1_replica_n43",
  "base_url":"https://127.0.0.1:34161/solr;,
  "node_name":"127.0.0.1:34161_solr",
  "state":"recovering",
  "type":"NRT",
  "router":{"name":"compositeId"},
  "maxShardsPerNode":"1",
  "autoAddReplicas":"false",
  "nrtReplicas":"2",
  "tlogReplicas":"0"}
at 
__randomizedtesting.SeedInfo.seed([C24C29A9D03A1BD1:7F1A837730E4650F]:0)
at org.junit.Assert.fail(Assert.java:93)
at 
org.apache.solr.cloud.SolrCloudTestCase.waitForState(SolrCloudTestCase.java:269)
at 
org.apache.solr.cloud.LIRRollingUpdatesTest.testNewReplicaOldLeader(LIRRollingUpdatesTest.java:129)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 

[JENKINS] Lucene-Solr-BadApples-Tests-7.x - Build # 5 - Still Unstable

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-7.x/5/

8 tests failed.
FAILED:  
org.apache.solr.cloud.TestCloudConsistency.testOutOfSyncReplicasCannotBecomeLeaderAfterRestart

Error Message:
Timeout waiting for active collection null Live Nodes: [127.0.0.1:33857_solr, 
127.0.0.1:42997_solr, 127.0.0.1:44180_solr, 127.0.0.1:48103_solr] Last 
available state: 
DocCollection(outOfSyncReplicasCannotBecomeLeader-true//collections/outOfSyncReplicasCannotBecomeLeader-true/state.json/24)={
   "pullReplicas":"0",   "replicationFactor":"3",   "shards":{"shard1":{   
"range":"8000-7fff",   "state":"active",   "replicas":{ 
"core_node62":{   
"core":"outOfSyncReplicasCannotBecomeLeader-true_shard1_replica_n61",   
"base_url":"https://127.0.0.1:48103/solr;,   
"node_name":"127.0.0.1:48103_solr",   "state":"active",   
"type":"NRT",   "leader":"true"}, "core_node64":{   
"core":"outOfSyncReplicasCannotBecomeLeader-true_shard1_replica_n63",   
"base_url":"https://127.0.0.1:33857/solr;,   
"node_name":"127.0.0.1:33857_solr",   "state":"down",   
"type":"NRT"}, "core_node66":{   
"core":"outOfSyncReplicasCannotBecomeLeader-true_shard1_replica_n65",   
"base_url":"https://127.0.0.1:44180/solr;,   
"node_name":"127.0.0.1:44180_solr",   "state":"down",   
"type":"NRT",   "router":{"name":"compositeId"},   "maxShardsPerNode":"1",  
 "autoAddReplicas":"false",   "nrtReplicas":"3",   "tlogReplicas":"0"}

Stack Trace:
java.lang.AssertionError: Timeout waiting for active collection
null
Live Nodes: [127.0.0.1:33857_solr, 127.0.0.1:42997_solr, 127.0.0.1:44180_solr, 
127.0.0.1:48103_solr]
Last available state: 
DocCollection(outOfSyncReplicasCannotBecomeLeader-true//collections/outOfSyncReplicasCannotBecomeLeader-true/state.json/24)={
  "pullReplicas":"0",
  "replicationFactor":"3",
  "shards":{"shard1":{
  "range":"8000-7fff",
  "state":"active",
  "replicas":{
"core_node62":{
  "core":"outOfSyncReplicasCannotBecomeLeader-true_shard1_replica_n61",
  "base_url":"https://127.0.0.1:48103/solr;,
  "node_name":"127.0.0.1:48103_solr",
  "state":"active",
  "type":"NRT",
  "leader":"true"},
"core_node64":{
  "core":"outOfSyncReplicasCannotBecomeLeader-true_shard1_replica_n63",
  "base_url":"https://127.0.0.1:33857/solr;,
  "node_name":"127.0.0.1:33857_solr",
  "state":"down",
  "type":"NRT"},
"core_node66":{
  "core":"outOfSyncReplicasCannotBecomeLeader-true_shard1_replica_n65",
  "base_url":"https://127.0.0.1:44180/solr;,
  "node_name":"127.0.0.1:44180_solr",
  "state":"down",
  "type":"NRT",
  "router":{"name":"compositeId"},
  "maxShardsPerNode":"1",
  "autoAddReplicas":"false",
  "nrtReplicas":"3",
  "tlogReplicas":"0"}
at 
__randomizedtesting.SeedInfo.seed([C35EF6BD84A2BA2E:EBCB64E8572F6B75]:0)
at org.junit.Assert.fail(Assert.java:93)
at 
org.apache.solr.cloud.SolrCloudTestCase.waitForState(SolrCloudTestCase.java:269)
at 
org.apache.solr.cloud.TestCloudConsistency.addDocToWhenOtherReplicasAreDown(TestCloudConsistency.java:164)
at 
org.apache.solr.cloud.TestCloudConsistency.testOutOfSyncReplicasCannotBecomeLeader(TestCloudConsistency.java:122)
at 
org.apache.solr.cloud.TestCloudConsistency.testOutOfSyncReplicasCannotBecomeLeaderAfterRestart(TestCloudConsistency.java:95)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 

[jira] [Resolved] (SOLR-12011) Consistence problem when in-sync replicas are DOWN

2018-03-04 Thread Cao Manh Dat (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-12011?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Cao Manh Dat resolved SOLR-12011.
-
   Resolution: Fixed
Fix Version/s: 7.3
   master (8.0)

> Consistence problem when in-sync replicas are DOWN
> --
>
> Key: SOLR-12011
> URL: https://issues.apache.org/jira/browse/SOLR-12011
> Project: Solr
>  Issue Type: Bug
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: SolrCloud
>Reporter: Cao Manh Dat
>Assignee: Cao Manh Dat
>Priority: Major
> Fix For: master (8.0), 7.3
>
> Attachments: SOLR-12011.patch, SOLR-12011.patch, SOLR-12011.patch, 
> SOLR-12011.patch, SOLR-12011.patch
>
>
> Currently, we can run into a consistency problem when in-sync replicas are 
> DOWN. For example:
>  1. A collection with 1 shard, 1 leader and 2 replicas
>  2. The nodes containing the 2 replicas go down
>  3. The leader receives an update A, success
>  4. The node containing the leader goes down
>  5. The 2 replicas come back
>  6. One of them becomes leader --> But they shouldn't become leader since 
> they missed the update A
> A solution to this issue:
>  * The idea here is that the term value of each replica (SOLR-11702) is 
> enough to tell whether a replica has received the latest updates or not. 
> Therefore only replicas with the highest term can become the leader (a 
> sketch of this check follows below).
>  * There are a couple of things that need to be done on this issue
>  ** When the leader receives the first update, its term should be changed 
> from 0 -> 1, so further replicas added to the same shard won't be able to 
> become leader (their term = 0) until they finish recovery
>  ** For DOWN replicas, the leader also needs to check (in DUP.finish()) that 
> those replicas have a term less than the leader's before returning results 
> to users
>  ** Just looking at the term value of a replica is not enough to tell us 
> whether that replica is in sync with the leader or not, because that replica 
> might not have finished the recovery process. We need to introduce another 
> flag (stored on the shard term node in ZK) to tell us whether a replica has 
> finished recovery or not. It will look like this:
>  *** {"core_node1" : 1, "core_node2" : 0} — (when core_node2 starts recovery) 
> --->
>  *** {"core_node1" : 1, "core_node2" : 1, "core_node2_recovering" : 1} — 
> (when core_node2 finishes recovery) --->
>  *** {"core_node1" : 1, "core_node2" : 1}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12054) ebeAdd and ebeSubtract should support matrix operations

2018-03-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-12054?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385463#comment-16385463
 ] 

ASF subversion and git services commented on SOLR-12054:


Commit 450a3d949006ce884d8d04535330d90581122a6f in lucene-solr's branch 
refs/heads/branch_7x from [~joel.bernstein]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=450a3d9 ]

SOLR-12054: ebeAdd and ebeSubtract should support matrix operations


> ebeAdd and ebeSubtract should support matrix operations
> ---
>
> Key: SOLR-12054
> URL: https://issues.apache.org/jira/browse/SOLR-12054
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Joel Bernstein
>Assignee: Joel Bernstein
>Priority: Major
> Fix For: 7.3
>
> Attachments: SOLR-12054.patch
>
>
> Currently ebeAdd and ebeSubtract perform element-by-element addition and 
> subtraction of vectors. This ticket will allow them to perform 
> element-by-element addition and subtraction of matrices as well.
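
For readers unfamiliar with the terminology, "element-by-element" (ebe) just means 
that entries at the same position are combined. The toy Java sketch below illustrates 
the matrix case this ticket adds; it is plain Java written for this note, not the 
Solr math-expression implementation.

{code:java}
// Toy illustration of element-by-element addition on matrices.
public class EbeAddSketch {

  static double[][] ebeAdd(double[][] a, double[][] b) {
    double[][] out = new double[a.length][a[0].length];
    for (int i = 0; i < a.length; i++) {
      for (int j = 0; j < a[i].length; j++) {
        out[i][j] = a[i][j] + b[i][j];   // same-position entries are summed
      }
    }
    return out;
  }

  public static void main(String[] args) {
    double[][] a = {{1, 2}, {3, 4}};
    double[][] b = {{10, 20}, {30, 40}};
    double[][] c = ebeAdd(a, b);         // {{11, 22}, {33, 44}}
    System.out.println(java.util.Arrays.deepToString(c));
  }
}
{code}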



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12054) ebeAdd and ebeSubtract should support matrix operations

2018-03-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-12054?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385462#comment-16385462
 ] 

ASF subversion and git services commented on SOLR-12054:


Commit dc5db9b2f1050f1d1fc545c33f117ae4ec867983 in lucene-solr's branch 
refs/heads/master from [~joel.bernstein]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=dc5db9b ]

SOLR-12054: ebeAdd and ebeSubtract should support matrix operations


> ebeAdd and ebeSubtract should support matrix operations
> ---
>
> Key: SOLR-12054
> URL: https://issues.apache.org/jira/browse/SOLR-12054
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Joel Bernstein
>Assignee: Joel Bernstein
>Priority: Major
> Fix For: 7.3
>
> Attachments: SOLR-12054.patch
>
>
> Currently ebeAdd and ebeSubtract perform element-by-element addition and 
> subtraction of vectors. This ticket will allow them to perform 
> element-by-element addition and subtraction of matrices as well.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-repro - Build # 192 - Still Unstable

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/192/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-SmokeRelease-7.x/164/consoleText

[repro] Revision: 59f67468b7f9f90f2377033d358521d451508f9a

[repro] Ant options: -DsmokeTestRelease.java9=/home/jenkins/tools/java/latest1.9
[repro] Repro line:  ant test  -Dtestcase=LIRRollingUpdatesTest 
-Dtests.method=testNewLeaderOldReplica -Dtests.seed=101C8F3EDB66CFCC 
-Dtests.multiplier=2 -Dtests.locale=es-CU -Dtests.timezone=Asia/Seoul 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=LIRRollingUpdatesTest 
-Dtests.method=testOldLeaderAndMixedReplicas -Dtests.seed=101C8F3EDB66CFCC 
-Dtests.multiplier=2 -Dtests.locale=es-CU -Dtests.timezone=Asia/Seoul 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=LIRRollingUpdatesTest 
-Dtests.method=testNewReplicaOldLeader -Dtests.seed=101C8F3EDB66CFCC 
-Dtests.multiplier=2 -Dtests.locale=es-CU -Dtests.timezone=Asia/Seoul 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TestCloudConsistency 
-Dtests.method=testOutOfSyncReplicasCannotBecomeLeaderAfterRestart 
-Dtests.seed=101C8F3EDB66CFCC -Dtests.multiplier=2 -Dtests.locale=ca-ES 
-Dtests.timezone=America/Adak -Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=MoveReplicaHDFSTest 
-Dtests.method=testFailedMove -Dtests.seed=101C8F3EDB66CFCC 
-Dtests.multiplier=2 -Dtests.locale=be-BY -Dtests.timezone=America/Boa_Vista 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=MoveReplicaHDFSTest 
-Dtests.method=test -Dtests.seed=101C8F3EDB66CFCC -Dtests.multiplier=2 
-Dtests.locale=be-BY -Dtests.timezone=America/Boa_Vista -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
97299ed00699c248fc38465ee1b0eb0bb1561d3d
[repro] git fetch
[repro] git checkout 59f67468b7f9f90f2377033d358521d451508f9a

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   TestCloudConsistency
[repro]   LIRRollingUpdatesTest
[repro]   MoveReplicaHDFSTest
[repro] ant compile-test

[...truncated 3310 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=15 
-Dtests.class="*.TestCloudConsistency|*.LIRRollingUpdatesTest|*.MoveReplicaHDFSTest"
 -Dtests.showOutput=onerror 
-DsmokeTestRelease.java9=/home/jenkins/tools/java/latest1.9 
-Dtests.seed=101C8F3EDB66CFCC -Dtests.multiplier=2 -Dtests.locale=ca-ES 
-Dtests.timezone=America/Adak -Dtests.asserts=true -Dtests.file.encoding=UTF-8

[...truncated 21422 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   0/5 failed: org.apache.solr.cloud.TestCloudConsistency
[repro]   2/5 failed: org.apache.solr.cloud.LIRRollingUpdatesTest
[repro]   2/5 failed: org.apache.solr.cloud.MoveReplicaHDFSTest
[repro] git checkout 97299ed00699c248fc38465ee1b0eb0bb1561d3d

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 5 lines...]

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[jira] [Commented] (LUCENE-8186) CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385449#comment-16385449
 ] 

Robert Muir commented on LUCENE-8186:
-

I think it would be best to fix this bug here, then let Tim remove any 
duplication, then refactor the API to be safer.

> CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms 
> --
>
> Key: LUCENE-8186
> URL: https://issues.apache.org/jira/browse/LUCENE-8186
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Tim Allison
>Priority: Minor
> Attachments: LUCENE-8186.patch
>
>
> While working on SOLR-12034, a unit test that relied on the 
> LowerCaseTokenizerFactory failed.
> After some digging, I was able to replicate this at the Lucene level.
> Unit test:
> {noformat}
>   @Test
>   public void testLCTokenizerFactoryNormalize() throws Exception {
>     Analyzer analyzer =
>         CustomAnalyzer.builder().withTokenizer(LowerCaseTokenizerFactory.class).build();
>     //fails
>     assertEquals(new BytesRef("hello"), analyzer.normalize("f", "Hello"));
>
>     //now try an integration test with the classic query parser
>     QueryParser p = new QueryParser("f", analyzer);
>     Query q = p.parse("Hello");
>     //passes
>     assertEquals(new TermQuery(new Term("f", "hello")), q);
>     q = p.parse("Hello*");
>     //fails
>     assertEquals(new PrefixQuery(new Term("f", "hello")), q);
>     q = p.parse("Hel*o");
>     //fails
>     assertEquals(new WildcardQuery(new Term("f", "hel*o")), q);
>   }
> {noformat}
> The problem is that the CustomAnalyzer iterates through the tokenfilters, but 
> does not call the tokenizer, which, in the case of the LowerCaseTokenizer, 
> does the filtering work.
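
Until a fix lands, one hedged workaround sketch (not part of this issue's patch) is to 
build the equivalent chain from a plain letter tokenizer plus an explicit lowercase 
token filter: CustomAnalyzer's normalize() does consult the multi-term-aware token 
filters, so the lowercasing then survives normalization.

{code:java}
// Hedged workaround sketch, not the fix for this issue: LowerCaseTokenizer is
// documented as LetterTokenizer + LowerCaseFilter, so this chain indexes the
// same tokens while also lowercasing in normalize().
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.core.LetterTokenizerFactory;
import org.apache.lucene.analysis.core.LowerCaseFilterFactory;
import org.apache.lucene.analysis.custom.CustomAnalyzer;
import org.apache.lucene.util.BytesRef;

public class NormalizeWorkaround {
  public static void main(String[] args) throws Exception {
    Analyzer analyzer = CustomAnalyzer.builder()
        .withTokenizer(LetterTokenizerFactory.class)
        .addTokenFilter(LowerCaseFilterFactory.class)  // multi-term aware, so applied by normalize()
        .build();
    BytesRef normalized = analyzer.normalize("f", "Hello");
    System.out.println(normalized.utf8ToString()); // hello
  }
}
{code}

With that chain, prefix and wildcard terms built through the classic query parser are 
lowercased as the unit test above expects.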



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-11336) DocBasedVersionConstraintsProcessor should be more extensible and support multiple version fields

2018-03-04 Thread Michael Braun (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-11336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385447#comment-16385447
 ] 

Michael Braun commented on SOLR-11336:
--

Whoops, thanks [~dsmiley], I'll have that fixed in the next version of the patch.

Regarding versionFields vs splitting versionField, what do you mean? The case 
we have is a document with multiple version fields - it wouldn't simply be 
enough to have n DocBasedVersionConstraintsProcessors, where n is the number of 
versions, unless there was a version of the processor (which can now be 
accomplished by subclassing and overriding versionInUpdateIsAcceptable) that 
accepted greater than or equal numbers and passed them along until the last 
one, which would accept only greater-than. What do you think? Would that be a cleaner 
solution?
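
To make the chaining idea concrete, here is a purely hypothetical sketch of the 
pattern under discussion. The classes below are stand-ins invented for this example, 
not Solr's actual DocBasedVersionConstraintsProcessor API; they only show why 
overriding a single acceptance method makes a ">= for all but the last processor" 
chain possible.

{code:java}
// Hypothetical stand-ins, not the real Solr classes.
public class VersionPolicySketch {

  /** Base policy: reject an update unless the new version is strictly greater. */
  static class StrictlyGreaterPolicy {
    boolean versionInUpdateIsAcceptable(long newVersion, long oldVersion) {
      return newVersion > oldVersion;
    }
  }

  /** Subclass overriding only the acceptance rule, the kind of extension the patch enables. */
  static class GreaterOrEqualPolicy extends StrictlyGreaterPolicy {
    @Override
    boolean versionInUpdateIsAcceptable(long newVersion, long oldVersion) {
      return newVersion >= oldVersion;   // let equal versions pass through to the next processor
    }
  }

  public static void main(String[] args) {
    System.out.println(new StrictlyGreaterPolicy().versionInUpdateIsAcceptable(5, 5)); // false
    System.out.println(new GreaterOrEqualPolicy().versionInUpdateIsAcceptable(5, 5));  // true
  }
}
{code}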

> DocBasedVersionConstraintsProcessor should be more extensible and support 
> multiple version fields
> -
>
> Key: SOLR-11336
> URL: https://issues.apache.org/jira/browse/SOLR-11336
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>Affects Versions: master (8.0)
>Reporter: Michael Braun
>Priority: Minor
> Attachments: SOLR-11336.patch, SOLR-11336.patch, SOLR-11336.patch
>
>
> DocBasedVersionConstraintsProcessor supports allowing document updates only 
> if the new version is greater than the old. However, if any behavior needs to 
> be extended or changed in minor ways, the entire class has to be copied and 
> slightly modified rather than subclassed with only the method in question 
> overridden. 
> It would be nice if DocBasedVersionConstraintsProcessor stood on its own as a 
> non-private class. In addition, certain methods (such as pieces of 
> isVersionNewEnough) should be broken out into separate methods so that 
> someone can extend the processor class and override what it means for a new 
> version to be accepted (allowing equal versions through? What if the new 
> version is a lower, not greater, number?). 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-SmokeRelease-6.6 - Build # 33 - Failure

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-6.6/33/

No tests ran.

Build Log:
[...truncated 27882 lines...]
prepare-release-no-sign:
[mkdir] Created dir: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-6.6/lucene/build/smokeTestRelease/dist
 [copy] Copying 476 files to 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-6.6/lucene/build/smokeTestRelease/dist/lucene
 [copy] Copying 215 files to 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-6.6/lucene/build/smokeTestRelease/dist/solr
   [smoker] Java 1.8 JAVA_HOME=/home/jenkins/tools/java/latest1.8
   [smoker] NOTE: output encoding is UTF-8
   [smoker] 
   [smoker] Load release URL 
"file:/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-6.6/lucene/build/smokeTestRelease/dist/"...
   [smoker] 
   [smoker] Test Lucene...
   [smoker]   test basics...
   [smoker]   get KEYS
   [smoker] 0.2 MB in 0.01 sec (30.9 MB/sec)
   [smoker]   check changes HTML...
   [smoker]   download lucene-6.6.3-src.tgz...
   [smoker] 30.9 MB in 0.02 sec (1241.4 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   download lucene-6.6.3.tgz...
   [smoker] 67.7 MB in 0.06 sec (1194.6 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   download lucene-6.6.3.zip...
   [smoker] 78.1 MB in 0.06 sec (1205.4 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   unpack lucene-6.6.3.tgz...
   [smoker] verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker] test demo with 1.8...
   [smoker]   got 6252 hits for query "lucene"
   [smoker] checkindex with 1.8...
   [smoker] check Lucene's javadoc JAR
   [smoker]   unpack lucene-6.6.3.zip...
   [smoker] verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker] test demo with 1.8...
   [smoker]   got 6252 hits for query "lucene"
   [smoker] checkindex with 1.8...
   [smoker] check Lucene's javadoc JAR
   [smoker]   unpack lucene-6.6.3-src.tgz...
   [smoker] make sure no JARs/WARs in src dist...
   [smoker] run "ant validate"
   [smoker] run tests w/ Java 8 and testArgs='-Dtests.slow=false'...
   [smoker] test demo with 1.8...
   [smoker]   got 229 hits for query "lucene"
   [smoker] checkindex with 1.8...
   [smoker] generate javadocs w/ Java 8...
   [smoker] 
   [smoker] Crawl/parse...
   [smoker] 
   [smoker] Verify...
   [smoker]   confirm all releases have coverage in TestBackwardsCompatibility
   [smoker] find all past Lucene releases...
   [smoker] run TestBackwardsCompatibility..
   [smoker]   Backcompat testing not required for release 7.2.1 because 
it's not less than 6.6.3
   [smoker]   Backcompat testing not required for release 7.0.1 because 
it's not less than 6.6.3
   [smoker]   Backcompat testing not required for release 7.2.0 because 
it's not less than 6.6.3
   [smoker]   Backcompat testing not required for release 7.1.0 because 
it's not less than 6.6.3
   [smoker]   Backcompat testing not required for release 7.0.0 because 
it's not less than 6.6.3
   [smoker] success!
   [smoker] 
   [smoker] Test Solr...
   [smoker]   test basics...
   [smoker]   get KEYS
   [smoker] 0.2 MB in 0.00 sec (246.4 MB/sec)
   [smoker]   check changes HTML...
   [smoker]   download solr-6.6.3-src.tgz...
   [smoker] 51.8 MB in 0.35 sec (147.9 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   download solr-6.6.3.tgz...
   [smoker] 140.5 MB in 0.16 sec (864.3 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   download solr-6.6.3.zip...
   [smoker] 141.6 MB in 0.16 sec (909.5 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   unpack solr-6.6.3.tgz...
   [smoker] verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker] unpack lucene-6.6.3.tgz...
   [smoker]   **WARNING**: skipping check of 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-6.6/lucene/build/smokeTestRelease/tmp/unpack/solr-6.6.3/contrib/dataimporthandler-extras/lib/javax.mail-1.5.1.jar:
 it has javax.* classes
   [smoker]   **WARNING**: skipping check of 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-6.6/lucene/build/smokeTestRelease/tmp/unpack/solr-6.6.3/contrib/dataimporthandler-extras/lib/activation-1.1.1.jar:
 it has javax.* classes
   [smoker] copying unpacked distribution for Java 8 ...
   [smoker] test solr example w/ Java 8...
   [smoker]   start Solr instance 
(log=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-6.6/lucene/build/smokeTestRelease/tmp/unpack/solr-6.6.3-java8/solr-example.log)...
   [smoker] No process found for Solr node running on port 8983
   [smoker]   Running techproducts example on port 8983 from 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-6.6/lucene/build/smokeTestRelease/tmp/unpack/solr-6.6.3-java8
   [smoker] Creating Solr home directory 

[JENKINS] Lucene-Solr-repro - Build # 191 - Unstable

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/191/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-master/4/consoleText

[repro] Revision: 78f11d05acc8730986d23eda7011b00b213d0fe5

[repro] Repro line:  ant test  -Dtestcase=TestPKIAuthenticationPlugin 
-Dtests.method=test -Dtests.seed=42B4BD78D69635DF -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=es-MX 
-Dtests.timezone=Australia/Victoria -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=TestReplicationHandler 
-Dtests.method=doTestIndexFetchOnMasterRestart -Dtests.seed=42B4BD78D69635DF 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=ko-KR -Dtests.timezone=America/Miquelon -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=TestTriggerIntegration 
-Dtests.method=testNodeMarkersRegistration -Dtests.seed=42B4BD78D69635DF 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=de-AT -Dtests.timezone=UTC -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=ZkControllerTest 
-Dtests.method=testPublishAndWaitForDownStates -Dtests.seed=42B4BD78D69635DF 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=sr-Latn-BA -Dtests.timezone=Asia/Seoul -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=TestJmxIntegration 
-Dtests.method=testJmxOnCoreReload -Dtests.seed=42B4BD78D69635DF 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=ar-MA -Dtests.timezone=America/Port-au-Prince 
-Dtests.asserts=true -Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=TestLTRReRankingPipeline 
-Dtests.method=testDifferentTopN -Dtests.seed=1C72B4201CB89142 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=de-GR -Dtests.timezone=Etc/GMT-3 -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
97299ed00699c248fc38465ee1b0eb0bb1561d3d
[repro] git fetch
[repro] git checkout 78f11d05acc8730986d23eda7011b00b213d0fe5

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   TestReplicationHandler
[repro]   TestJmxIntegration
[repro]   TestPKIAuthenticationPlugin
[repro]   ZkControllerTest
[repro]   TestTriggerIntegration
[repro]solr/contrib/ltr
[repro]   TestLTRReRankingPipeline
[repro] ant compile-test

[...truncated 3292 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=25 
-Dtests.class="*.TestReplicationHandler|*.TestJmxIntegration|*.TestPKIAuthenticationPlugin|*.ZkControllerTest|*.TestTriggerIntegration"
 -Dtests.showOutput=onerror  -Dtests.seed=42B4BD78D69635DF -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=ko-KR 
-Dtests.timezone=America/Miquelon -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[...truncated 50482 lines...]
[repro] Setting last failure code to 256

[repro] ant compile-test

[...truncated 566 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=5 
-Dtests.class="*.TestLTRReRankingPipeline" -Dtests.showOutput=onerror  
-Dtests.seed=1C72B4201CB89142 -Dtests.multiplier=2 -Dtests.slow=true 
-Dtests.badapples=true -Dtests.locale=de-GR -Dtests.timezone=Etc/GMT-3 
-Dtests.asserts=true -Dtests.file.encoding=ISO-8859-1

[...truncated 135 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   0/5 failed: 
org.apache.solr.cloud.autoscaling.sim.TestTriggerIntegration
[repro]   0/5 failed: org.apache.solr.security.TestPKIAuthenticationPlugin
[repro]   5/5 failed: org.apache.solr.cloud.ZkControllerTest
[repro]   5/5 failed: org.apache.solr.core.TestJmxIntegration
[repro]   5/5 failed: org.apache.solr.handler.TestReplicationHandler
[repro]   5/5 failed: org.apache.solr.ltr.TestLTRReRankingPipeline

[repro] Re-testing 100% failures at the tip of master
[repro] ant clean

[...truncated 8 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   TestReplicationHandler
[repro]   TestJmxIntegration
[repro]   ZkControllerTest
[repro]solr/contrib/ltr
[repro]   TestLTRReRankingPipeline
[repro] ant compile-test

[...truncated 3292 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=15 
-Dtests.class="*.TestReplicationHandler|*.TestJmxIntegration|*.ZkControllerTest"
 -Dtests.showOutput=onerror  -Dtests.seed=42B4BD78D69635DF -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=ko-KR 
-Dtests.timezone=America/Miquelon -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[...truncated 50577 lines...]
[repro] 

[JENKINS] Lucene-Solr-repro - Build # 189 - Still Unstable

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/189/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-BadApples-Tests-7.x/4/consoleText

[repro] Revision: 1a3468b87282a9743b7176d9eaeb603e4968

[repro] Repro line:  ant test  -Dtestcase=SoftAutoCommitTest 
-Dtests.method=testHardCommitWithinAndSoftCommitMaxTimeRapidAdds 
-Dtests.seed=786BB0605B78DC00 -Dtests.multiplier=2 -Dtests.slow=true 
-Dtests.badapples=true -Dtests.locale=ar-OM -Dtests.timezone=Africa/Libreville 
-Dtests.asserts=true -Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=TestJmxIntegration 
-Dtests.method=testJmxOnCoreReload -Dtests.seed=786BB0605B78DC00 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=es-DO -Dtests.timezone=America/Boise -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=TestTriggerIntegration 
-Dtests.method=testSearchRate -Dtests.seed=786BB0605B78DC00 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=es-PR -Dtests.timezone=Europe/Vaduz -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=ZkControllerTest 
-Dtests.method=testPublishAndWaitForDownStates -Dtests.seed=786BB0605B78DC00 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=mk-MK -Dtests.timezone=Etc/GMT+11 -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=TestReplicationHandler 
-Dtests.method=doTestIndexFetchOnMasterRestart -Dtests.seed=786BB0605B78DC00 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=it-CH -Dtests.timezone=Europe/Kirov -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=TestTlogReplica 
-Dtests.method=testRecovery -Dtests.seed=786BB0605B78DC00 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=hi 
-Dtests.timezone=Europe/Amsterdam -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=TestLargeCluster 
-Dtests.method=testBasic -Dtests.seed=786BB0605B78DC00 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=ar-SD 
-Dtests.timezone=Africa/Conakry -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=AutoscalingHistoryHandlerTest 
-Dtests.method=testHistory -Dtests.seed=786BB0605B78DC00 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=und 
-Dtests.timezone=Antarctica/Casey -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[repro] Repro line:  ant test  -Dtestcase=TestLTRReRankingPipeline 
-Dtests.method=testDifferentTopN -Dtests.seed=4AD8884F44D954E4 
-Dtests.multiplier=2 -Dtests.slow=true -Dtests.badapples=true 
-Dtests.locale=tr-TR -Dtests.timezone=Asia/Dhaka -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
97299ed00699c248fc38465ee1b0eb0bb1561d3d
[repro] git fetch
[repro] git checkout 1a3468b87282a9743b7176d9eaeb603e4968

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   SoftAutoCommitTest
[repro]   TestTlogReplica
[repro]   AutoscalingHistoryHandlerTest
[repro]   TestLargeCluster
[repro]   TestTriggerIntegration
[repro]   TestJmxIntegration
[repro]   TestReplicationHandler
[repro]   ZkControllerTest
[repro]solr/contrib/ltr
[repro]   TestLTRReRankingPipeline
[repro] ant compile-test

[...truncated 3310 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=40 
-Dtests.class="*.SoftAutoCommitTest|*.TestTlogReplica|*.AutoscalingHistoryHandlerTest|*.TestLargeCluster|*.TestTriggerIntegration|*.TestJmxIntegration|*.TestReplicationHandler|*.ZkControllerTest"
 -Dtests.showOutput=onerror  -Dtests.seed=786BB0605B78DC00 -Dtests.multiplier=2 
-Dtests.slow=true -Dtests.badapples=true -Dtests.locale=ar-OM 
-Dtests.timezone=Africa/Libreville -Dtests.asserts=true 
-Dtests.file.encoding=ISO-8859-1

[...truncated 75712 lines...]
[repro] Setting last failure code to 256

[repro] ant compile-test

[...truncated 566 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=5 
-Dtests.class="*.TestLTRReRankingPipeline" -Dtests.showOutput=onerror  
-Dtests.seed=4AD8884F44D954E4 -Dtests.multiplier=2 -Dtests.slow=true 
-Dtests.badapples=true -Dtests.locale=tr-TR -Dtests.timezone=Asia/Dhaka 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[...truncated 135 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   0/5 failed: org.apache.solr.cloud.TestTlogReplica
[repro]   0/5 failed: 
org.apache.solr.cloud.autoscaling.sim.TestTriggerIntegration
[repro]   0/5 failed: 

[JENKINS] Lucene-Solr-master-Linux (32bit/jdk1.8.0_162) - Build # 21572 - Unstable!

2018-03-04 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/21572/
Java: 32bit/jdk1.8.0_162 -server -XX:+UseParallelGC

1 tests failed.
FAILED:  
org.apache.solr.cloud.TestCloudConsistency.testOutOfSyncReplicasCannotBecomeLeaderAfterRestart

Error Message:
Timeout waiting for active collection null Live Nodes: [127.0.0.1:33427_solr, 
127.0.0.1:36805_solr, 127.0.0.1:37845_solr, 127.0.0.1:3_solr] Last 
available state: 
DocCollection(outOfSyncReplicasCannotBecomeLeader-true//collections/outOfSyncReplicasCannotBecomeLeader-true/state.json/23)={
   "pullReplicas":"0",   "replicationFactor":"3",   "shards":{"shard1":{   
"range":"8000-7fff",   "state":"active",   "replicas":{ 
"core_node62":{   
"core":"outOfSyncReplicasCannotBecomeLeader-true_shard1_replica_n61",   
"base_url":"https://127.0.0.1:37845/solr;,   
"node_name":"127.0.0.1:37845_solr",   "state":"active",   
"type":"NRT",   "leader":"true"}, "core_node64":{   
"core":"outOfSyncReplicasCannotBecomeLeader-true_shard1_replica_n63",   
"base_url":"https://127.0.0.1:33427/solr;,   
"node_name":"127.0.0.1:33427_solr",   "state":"down",   
"type":"NRT"}, "core_node66":{   
"core":"outOfSyncReplicasCannotBecomeLeader-true_shard1_replica_n65",   
"base_url":"https://127.0.0.1:3/solr;,   
"node_name":"127.0.0.1:3_solr",   "state":"down",   
"type":"NRT",   "router":{"name":"compositeId"},   "maxShardsPerNode":"1",  
 "autoAddReplicas":"false",   "nrtReplicas":"3",   "tlogReplicas":"0"}

Stack Trace:
java.lang.AssertionError: Timeout waiting for active collection
null
Live Nodes: [127.0.0.1:33427_solr, 127.0.0.1:36805_solr, 127.0.0.1:37845_solr, 
127.0.0.1:3_solr]
Last available state: 
DocCollection(outOfSyncReplicasCannotBecomeLeader-true//collections/outOfSyncReplicasCannotBecomeLeader-true/state.json/23)={
  "pullReplicas":"0",
  "replicationFactor":"3",
  "shards":{"shard1":{
  "range":"8000-7fff",
  "state":"active",
  "replicas":{
"core_node62":{
  "core":"outOfSyncReplicasCannotBecomeLeader-true_shard1_replica_n61",
  "base_url":"https://127.0.0.1:37845/solr;,
  "node_name":"127.0.0.1:37845_solr",
  "state":"active",
  "type":"NRT",
  "leader":"true"},
"core_node64":{
  "core":"outOfSyncReplicasCannotBecomeLeader-true_shard1_replica_n63",
  "base_url":"https://127.0.0.1:33427/solr;,
  "node_name":"127.0.0.1:33427_solr",
  "state":"down",
  "type":"NRT"},
"core_node66":{
  "core":"outOfSyncReplicasCannotBecomeLeader-true_shard1_replica_n65",
  "base_url":"https://127.0.0.1:3/solr;,
  "node_name":"127.0.0.1:3_solr",
  "state":"down",
  "type":"NRT",
  "router":{"name":"compositeId"},
  "maxShardsPerNode":"1",
  "autoAddReplicas":"false",
  "nrtReplicas":"3",
  "tlogReplicas":"0"}
at 
__randomizedtesting.SeedInfo.seed([7F4947079F714EA4:57DCD5524CFC9FFF]:0)
at org.junit.Assert.fail(Assert.java:93)
at 
org.apache.solr.cloud.SolrCloudTestCase.waitForState(SolrCloudTestCase.java:269)
at 
org.apache.solr.cloud.TestCloudConsistency.addDocToWhenOtherReplicasAreDown(TestCloudConsistency.java:164)
at 
org.apache.solr.cloud.TestCloudConsistency.testOutOfSyncReplicasCannotBecomeLeader(TestCloudConsistency.java:122)
at 
org.apache.solr.cloud.TestCloudConsistency.testOutOfSyncReplicasCannotBecomeLeaderAfterRestart(TestCloudConsistency.java:95)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 

[JENKINS] Lucene-Solr-Tests-master - Build # 2401 - Unstable

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Tests-master/2401/

1 tests failed.
FAILED:  org.apache.solr.cloud.autoscaling.sim.TestLargeCluster.testAddNode

Error Message:
no MOVEREPLICA ops?

Stack Trace:
java.lang.AssertionError: no MOVEREPLICA ops?
at 
__randomizedtesting.SeedInfo.seed([FD2E2C2A99B258F7:5AC1318956FFD7EF]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at 
org.apache.solr.cloud.autoscaling.sim.TestLargeCluster.testAddNode(TestLargeCluster.java:262)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)




Build Log:
[...truncated 12208 lines...]
   [junit4] Suite: org.apache.solr.cloud.autoscaling.sim.TestLargeCluster
   [junit4]   2> Creating dataDir: 
/home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-master/solr/build/solr-core/test/J2/temp/solr.cloud.autoscaling.sim.TestLargeCluster_FD2E2C2A99B258F7-001/init-core-data-001
   [junit4]   2> 359220 WARN  

[jira] [Commented] (LUCENE-8186) CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms

2018-03-04 Thread Uwe Schindler (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385376#comment-16385376
 ] 

Uwe Schindler commented on LUCENE-8186:
---

[~talli...@apache.org] could you explain why this works with Solr's 
TokenizerChain, or was this a new test that you added? Solr has exactly the same 
code in TokenizerChain...

> CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms 
> --
>
> Key: LUCENE-8186
> URL: https://issues.apache.org/jira/browse/LUCENE-8186
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Tim Allison
>Priority: Minor
> Attachments: LUCENE-8186.patch
>
>
> While working on SOLR-12034, a unit test that relied on the 
> LowerCaseTokenizerFactory failed.
> After some digging, I was able to replicate this at the Lucene level.
> Unit test:
> {noformat}
>   @Test
>   public void testLCTokenizerFactoryNormalize() throws Exception {
> Analyzer analyzer =  
> CustomAnalyzer.builder().withTokenizer(LowerCaseTokenizerFactory.class).build();
> //fails
> assertEquals(new BytesRef("hello"), analyzer.normalize("f", "Hello"));
> 
> //now try an integration test with the classic query parser
> QueryParser p = new QueryParser("f", analyzer);
> Query q = p.parse("Hello");
> //passes
> assertEquals(new TermQuery(new Term("f", "hello")), q);
> q = p.parse("Hello*");
> //fails
> assertEquals(new PrefixQuery(new Term("f", "hello")), q);
> q = p.parse("Hel*o");
> //fails
> assertEquals(new WildcardQuery(new Term("f", "hel*o")), q);
>   }
> {noformat}
> The problem is that the CustomAnalyzer iterates through the tokenfilters, but 
> does not call the tokenizer, which, in the case of the LowerCaseTokenizer, 
> does the filtering work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-11884) find/fix inefficiencies in our use of logging

2018-03-04 Thread Erick Erickson (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-11884?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385372#comment-16385372
 ] 

Erick Erickson commented on SOLR-11884:
---

NOTE: SOLR-7887 fixed the %C issues
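
(For anyone skimming the description below: a minimal, self-contained sketch of 
the concatenated vs. parameterized forms under discussion. Illustrative only, not 
from any patch; the class and method names here are assumptions.)

{code:java}
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoggingFormsSketch {
  private static final Logger log = LoggerFactory.getLogger(LoggingFormsSketch.class);

  // Stands in for "method_that_builds_a_string" from the description.
  static String expensiveSummary() {
    return "built at some cost";
  }

  public static void main(String[] args) {
    int someVariable = 42;

    // Concatenated form: the message String is always built, even if INFO is disabled.
    log.info("whatever " + expensiveSummary() + " : " + someVariable);

    // Parameterized form: the message is only formatted if INFO is enabled.
    // (expensiveSummary() itself still runs; only the formatting is deferred.)
    log.info("whatever {} : {}", expensiveSummary(), someVariable);
  }
}
{code}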

> find/fix inefficiencies in our use of logging
> -
>
> Key: SOLR-11884
> URL: https://issues.apache.org/jira/browse/SOLR-11884
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: logging
>Reporter: Erick Erickson
>Assignee: Erick Erickson
>Priority: Major
>
> We've been looking at Solr using Flight Recorder and ran across some 
> interesting things I'd like to discuss. Let's discuss general logging 
> approaches here, then perhaps break out sub-JIRAs when we reach any kind of 
> agreement.
> 1> Every log message generates a new Throwable, presumably to get things like 
> line number, file, class name and the like. On a 2 minute run blasting 
> updates this meant 150,000 (yes, 150K) instances of "new Throwable()".
>  
> See the section "Asynchronous Logging with Caller Location Information" at:
> [https://logging.apache.org/log4j/2.x/performance.html]
> I'm not totally sure changing the layout pattern will fix this in log4j 1.x, 
> but it apparently should in log4j 2.
>  
> The cost of course would be that lots of our log messages would lack some of 
> the information. Exceptions would still contain all the file/class/line 
> information of course.
>  
> Proposal:
> Change the layout pattern to, by default, _NOT_  include information that 
> requires a Throwable to be created. Also include a pattern that could be 
> un-commented to get this information back for troubleshooting.
>  
> 
>  
> We generate strings when we don't need them. Any construct like
> log.info("whatever " + method_that_builds_a_string + " : " + some_variable);
> generates the string (some of which are quite expensive) and then throws it 
> away if the log level is at, say, WARN. The above link also shows that 
> parameterizing this doesn't suffer this problem, so anything like the above 
> should be re-written as:
> log.info("whatever {} : {} ", method_that_builds_a_string, some_variable);
>  
> The alternative is to do something like the following, but let's make use of the 
> built-in capabilities instead:
> if (log.level >= INFO) {
>    log.info("whatever " + method_that_builds_a_string + " : " + 
> some_variable);
> }
> etc.
> This would be a pretty huge thing to fix all-at-once so I suppose we'll have 
> to approach it incrementally. It's also something that, if we get them all 
> out of the code, should be added to precommit failures. In the meantime, if 
> anyone who has the precommit chops could create a target that checked for 
> this it'd be a great help in tracking all of them down, then could be 
> incorporated in the regular precommit checks if/when they're all removed.
> Proposal:
> Use JFR or whatever to identify the egregious violations of this kind of 
> thing (I have a couple I've found) and change them to parameterized form (and 
> prove it works). Then see what we can do to move forward with removing them 
> all through the code base.
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8186) CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms

2018-03-04 Thread Adrien Grand (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385362#comment-16385362
 ] 

Adrien Grand commented on LUCENE-8186:
--

+1 to this patch and +1 to improve type safety of these APIs.

> CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms 
> --
>
> Key: LUCENE-8186
> URL: https://issues.apache.org/jira/browse/LUCENE-8186
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Tim Allison
>Priority: Minor
> Attachments: LUCENE-8186.patch
>
>
> While working on SOLR-12034, a unit test that relied on the 
> LowerCaseTokenizerFactory failed.
> After some digging, I was able to replicate this at the Lucene level.
> Unit test:
> {noformat}
>   @Test
>   public void testLCTokenizerFactoryNormalize() throws Exception {
> Analyzer analyzer =  
> CustomAnalyzer.builder().withTokenizer(LowerCaseTokenizerFactory.class).build();
> //fails
> assertEquals(new BytesRef("hello"), analyzer.normalize("f", "Hello"));
> 
> //now try an integration test with the classic query parser
> QueryParser p = new QueryParser("f", analyzer);
> Query q = p.parse("Hello");
> //passes
> assertEquals(new TermQuery(new Term("f", "hello")), q);
> q = p.parse("Hello*");
> //fails
> assertEquals(new PrefixQuery(new Term("f", "hello")), q);
> q = p.parse("Hel*o");
> //fails
> assertEquals(new WildcardQuery(new Term("f", "hel*o")), q);
>   }
> {noformat}
> The problem is that the CustomAnalyzer iterates through the tokenfilters, but 
> does not call the tokenizer, which, in the case of the LowerCaseTokenizer, 
> does the filtering work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: Review Request 65888: Upgrade Solr to use Log4J2

2018-03-04 Thread Erick Erickson


> On March 4, 2018, 1:28 a.m., Erick Erickson wrote:
> > there's still a reference to log4j.properties in solr/bin/solr.cmd; it's changed 
> > in the patch I'll upload shortly.
> > 
> > NOTE: I've changed everything _except_ the references to log4j 1.2 in 
> > solr/server/ivy.xml and solr/core/ivy.xml in the patch I'll upload shortly.

Done


- Erick


---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/65888/#review198587
---


On March 3, 2018, 2:51 a.m., Varun Thacker wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/65888/
> ---
> 
> (Updated March 3, 2018, 2:51 a.m.)
> 
> 
> Review request for lucene.
> 
> 
> Repository: lucene-solr
> 
> 
> Description
> ---
> 
> Upgrade Solr to use Log4J2
> 
> 
> Diffs
> -
> 
>   lucene/CHANGES.txt e3799bc9d5 
>   lucene/common-build.xml 4fa59ac936 
>   lucene/ivy-versions.properties 5ab36ddfa2 
>   solr/CHANGES.txt 05f4f560bd 
>   solr/bin/install_solr_service.sh b82957144d 
>   solr/bin/solr 4e178de945 
>   solr/bin/solr.cmd dcff0c6af7 
>   solr/bin/solr.in.cmd bfb33e0e9d 
>   solr/bin/solr.in.sh e7478cdf5c 
>   solr/contrib/clustering/src/test-files/log4j.properties b5216db8b2 
>   solr/contrib/clustering/src/test-files/log4j2.xml PRE-CREATION 
>   solr/contrib/dataimporthandler/src/test-files/log4j.properties d3ea4deafc 
>   solr/contrib/dataimporthandler/src/test-files/log4j2.xml PRE-CREATION 
>   solr/contrib/ltr/src/test-files/log4j.properties d86c6988d5 
>   solr/contrib/ltr/src/test-files/log4j2.xml PRE-CREATION 
>   solr/core/ivy.xml ff4fa48679 
>   
> solr/core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java 
> 23a8dc1eb3 
>   solr/core/src/java/org/apache/solr/handler/admin/LoggingHandler.java 
> 122d2cbf8b 
>   solr/core/src/java/org/apache/solr/logging/LogWatcher.java c510590282 
>   solr/core/src/java/org/apache/solr/logging/log4j/EventAppender.java 
> ff2876fb2f 
>   solr/core/src/java/org/apache/solr/logging/log4j/Log4jInfo.java dfd3dde74a 
>   solr/core/src/java/org/apache/solr/logging/log4j/Log4jWatcher.java 
> 04fa5fb1d8 
>   solr/core/src/java/org/apache/solr/logging/log4j/package-info.java 
> f78953385c 
>   solr/core/src/java/org/apache/solr/logging/log4j2/Log4j2Watcher.java 
> PRE-CREATION 
>   solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java 
> 4d944d239b 
>   solr/core/src/java/org/apache/solr/util/SolrCLI.java 266ef3a28a 
>   solr/core/src/java/org/apache/solr/util/SolrLogLayout.java a60ada828b 
>   solr/core/src/java/org/apache/solr/util/StartupLoggingUtils.java c582eff4c0 
>   solr/core/src/test-files/log4j.properties 969439a228 
>   solr/core/src/test-files/log4j2.xml PRE-CREATION 
>   solr/core/src/test/org/apache/solr/handler/RequestLoggingTest.java 
> 4c780ccda4 
>   solr/core/src/test/org/apache/solr/handler/admin/LoggingHandlerTest.java 
> 555c1376a5 
>   solr/core/src/test/org/apache/solr/logging/log4j2/Log4j2WatcherTest.java 
> PRE-CREATION 
>   solr/core/src/test/org/apache/solr/util/TestSolrCLIRunExample.java 
> 89008517f8 
>   solr/example/README.txt 562c256377 
>   solr/example/example-DIH/solr/db/conf/solrconfig.xml 1ffbbe817f 
>   solr/example/example-DIH/solr/mail/conf/solrconfig.xml 770b0fd870 
>   solr/example/example-DIH/solr/solr/conf/solrconfig.xml 3f00141340 
>   solr/example/resources/log4j.properties 02f91c5dae 
>   solr/example/resources/log4j2.xml PRE-CREATION 
>   solr/licenses/audience-annotations-0.7.0.jar.sha1 PRE-CREATION 
>   solr/licenses/audience-annotations-LICENSE-ASL.txt PRE-CREATION 
>   solr/licenses/audience-annotations-NOTICE.txt PRE-CREATION 
>   solr/licenses/disruptor-3.4.0.jar.sha1 PRE-CREATION 
>   solr/licenses/disruptor-LICENSE-ASL.txt PRE-CREATION 
>   solr/licenses/disruptor-NOTICE.txt PRE-CREATION 
>   solr/licenses/log4j-1.2-api-2.10.0.jar.sha1 PRE-CREATION 
>   solr/licenses/log4j-1.2.17.jar.sha1 383110e29f 
>   solr/licenses/log4j-api-2.10.0.jar.sha1 PRE-CREATION 
>   solr/licenses/log4j-api-LICENSE-ASL.txt PRE-CREATION 
>   solr/licenses/log4j-api-NOTICE.txt PRE-CREATION 
>   solr/licenses/log4j-core-2.10.0.jar.sha1 PRE-CREATION 
>   solr/licenses/log4j-core-LICENSE-ASL.txt PRE-CREATION 
>   solr/licenses/log4j-core-NOTICE.txt PRE-CREATION 
>   solr/licenses/log4j-slf4j-LICENSE-ASL.txt PRE-CREATION 
>   solr/licenses/log4j-slf4j-NOTICE.txt PRE-CREATION 
>   solr/licenses/log4j-slf4j-impl-2.10.0.jar.sha1 PRE-CREATION 
>   solr/licenses/slf4j-log4j12-1.7.24.jar.sha1 b8ec050172 
>   solr/server/README.txt 228f4d467b 
>   solr/server/ivy.xml c9b3a73014 
>   solr/server/resources/log4j.properties 9f9c4a0f74 
>   solr/server/resources/log4j2.xml PRE-CREATION 
>   solr/server/scripts/cloud-scripts/log4j.properties 5f2ae18574 
>   

Re: Review Request 65888: Upgrade Solr to use Log4J2

2018-03-04 Thread Erick Erickson


> On March 4, 2018, 2:04 a.m., Erick Erickson wrote:
> > A couple more things.
> > 
> > WDYT about including 
> >  
> >  in the Root?
> >  
> >  
> > And let's just copy the log4j2.xml file from server/resources to 
> > example/resources.
> > 
> > example/resources/log4j2.xml has this in the pattern, where did that come 
> > from?: \u2013 (en-dash?) Looks like an inadvertent change from %c?
> > 
> > %-4r [%t] %-5p %c %x [%X{collection} %X{shard} %X{replica} %X{core}] \u2013 
> > %m%n
> > 
> > These changes are _NOT_ in the patch I'm uploading 
> > (SOLR-7887-eoe-review.patch)
> 
> Shawn Heisey wrote:
> Disclaimer:  I have not looked over the xml config files for the new 
> version, and I do not know what the change you're discussing would actually 
> do.
> 
> Everything I've read that talks about insane performance levels in log4j2 
> has talked about the async loggers.  Therefore, I think we do need to set up 
> async logging, however that's done.

I decided to make this a new JIRA so we can address it separately and give 
people a chance to discuss it. I think it should be done. See SOLR-12055


- Erick


---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/65888/#review198588
---


On March 3, 2018, 2:51 a.m., Varun Thacker wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/65888/
> ---
> 
> (Updated March 3, 2018, 2:51 a.m.)
> 
> 
> Review request for lucene.
> 
> 
> Repository: lucene-solr
> 
> 
> Description
> ---
> 
> Upgrade Solr to use Log4J2
> 
> 
> Diffs
> -
> 
>   lucene/CHANGES.txt e3799bc9d5 
>   lucene/common-build.xml 4fa59ac936 
>   lucene/ivy-versions.properties 5ab36ddfa2 
>   solr/CHANGES.txt 05f4f560bd 
>   solr/bin/install_solr_service.sh b82957144d 
>   solr/bin/solr 4e178de945 
>   solr/bin/solr.cmd dcff0c6af7 
>   solr/bin/solr.in.cmd bfb33e0e9d 
>   solr/bin/solr.in.sh e7478cdf5c 
>   solr/contrib/clustering/src/test-files/log4j.properties b5216db8b2 
>   solr/contrib/clustering/src/test-files/log4j2.xml PRE-CREATION 
>   solr/contrib/dataimporthandler/src/test-files/log4j.properties d3ea4deafc 
>   solr/contrib/dataimporthandler/src/test-files/log4j2.xml PRE-CREATION 
>   solr/contrib/ltr/src/test-files/log4j.properties d86c6988d5 
>   solr/contrib/ltr/src/test-files/log4j2.xml PRE-CREATION 
>   solr/core/ivy.xml ff4fa48679 
>   
> solr/core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java 
> 23a8dc1eb3 
>   solr/core/src/java/org/apache/solr/handler/admin/LoggingHandler.java 
> 122d2cbf8b 
>   solr/core/src/java/org/apache/solr/logging/LogWatcher.java c510590282 
>   solr/core/src/java/org/apache/solr/logging/log4j/EventAppender.java 
> ff2876fb2f 
>   solr/core/src/java/org/apache/solr/logging/log4j/Log4jInfo.java dfd3dde74a 
>   solr/core/src/java/org/apache/solr/logging/log4j/Log4jWatcher.java 
> 04fa5fb1d8 
>   solr/core/src/java/org/apache/solr/logging/log4j/package-info.java 
> f78953385c 
>   solr/core/src/java/org/apache/solr/logging/log4j2/Log4j2Watcher.java 
> PRE-CREATION 
>   solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java 
> 4d944d239b 
>   solr/core/src/java/org/apache/solr/util/SolrCLI.java 266ef3a28a 
>   solr/core/src/java/org/apache/solr/util/SolrLogLayout.java a60ada828b 
>   solr/core/src/java/org/apache/solr/util/StartupLoggingUtils.java c582eff4c0 
>   solr/core/src/test-files/log4j.properties 969439a228 
>   solr/core/src/test-files/log4j2.xml PRE-CREATION 
>   solr/core/src/test/org/apache/solr/handler/RequestLoggingTest.java 
> 4c780ccda4 
>   solr/core/src/test/org/apache/solr/handler/admin/LoggingHandlerTest.java 
> 555c1376a5 
>   solr/core/src/test/org/apache/solr/logging/log4j2/Log4j2WatcherTest.java 
> PRE-CREATION 
>   solr/core/src/test/org/apache/solr/util/TestSolrCLIRunExample.java 
> 89008517f8 
>   solr/example/README.txt 562c256377 
>   solr/example/example-DIH/solr/db/conf/solrconfig.xml 1ffbbe817f 
>   solr/example/example-DIH/solr/mail/conf/solrconfig.xml 770b0fd870 
>   solr/example/example-DIH/solr/solr/conf/solrconfig.xml 3f00141340 
>   solr/example/resources/log4j.properties 02f91c5dae 
>   solr/example/resources/log4j2.xml PRE-CREATION 
>   solr/licenses/audience-annotations-0.7.0.jar.sha1 PRE-CREATION 
>   solr/licenses/audience-annotations-LICENSE-ASL.txt PRE-CREATION 
>   solr/licenses/audience-annotations-NOTICE.txt PRE-CREATION 
>   solr/licenses/disruptor-3.4.0.jar.sha1 PRE-CREATION 
>   solr/licenses/disruptor-LICENSE-ASL.txt PRE-CREATION 
>   solr/licenses/disruptor-NOTICE.txt PRE-CREATION 
>   solr/licenses/log4j-1.2-api-2.10.0.jar.sha1 PRE-CREATION 
>   solr/licenses/log4j-1.2.17.jar.sha1 383110e29f 
>   solr/licenses/log4j-api-2.10.0.jar.sha1 

Re: Review Request 65888: Upgrade Solr to use Log4J2

2018-03-04 Thread Erick Erickson


> On March 3, 2018, 4:41 a.m., Tomás Fernández Löbbe wrote:
> > LGTM. I think you need to add org.apache.logging.log4j.** to forbidden APIs

It turns out not. Thanks to Shawn I now know the shim is used, which means we 
can (and do) still use log4j calls in the code.
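
(Concretely, a rough sketch of what the shim buys us, assuming log4j-1.2-api is on 
the classpath. Illustrative only, not code from the patch: legacy Log4j 1.x-style 
calls compile and run unchanged and are routed to Log4j 2.)

{code:java}
import org.apache.log4j.Logger; // provided by the log4j-1.2-api bridge jar

public class ShimSketch {
  private static final Logger legacy = Logger.getLogger(ShimSketch.class);

  public static void main(String[] args) {
    // No source change needed; the bridge forwards this call to Log4j 2.
    legacy.info("still works through log4j-1.2-api");
  }
}
{code}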


> On March 3, 2018, 4:41 a.m., Tomás Fernández Löbbe wrote:
> > solr/bin/solr.in.sh
> > Line 96 (original), 96 (patched)
> > 
> >
> > xml

Done


> On March 3, 2018, 4:41 a.m., Tomás Fernández Löbbe wrote:
> > solr/core/ivy.xml
> > Line 51 (original), 51 (patched)
> > 
> >
> > What is this new dependency used for?

trying to remove it, we'll see if there are any failures.


> On March 3, 2018, 4:41 a.m., Tomás Fernández Löbbe wrote:
> > solr/core/ivy.xml
> > Lines 57 (patched)
> > 
> >
> > ...and this?

from: https://logging.apache.org/log4j/2.x/manual/async.html

The disruptor jar provides the LMAX Disruptor technology: Asynchronous Loggers 
internally use the Disruptor, a lock-free inter-thread communication library, 
instead of queues, resulting in higher throughput and lower latency.
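
(For reference, a minimal sketch of what all-async logging looks like from the Java 
side, assuming log4j-core 2.10 plus the disruptor jar on the classpath. Illustrative 
only, not part of this patch.)

{code:java}
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class AsyncLoggingSketch {
  public static void main(String[] args) {
    // Must be set before the first LoggerContext is created (normally done with
    // -DLog4jContextSelector=... in the start script rather than in code).
    System.setProperty("Log4jContextSelector",
        "org.apache.logging.log4j.core.async.AsyncLoggerContextSelector");

    Logger log = LogManager.getLogger(AsyncLoggingSketch.class);
    // Application code is unchanged; appenders now run behind the Disruptor ring buffer.
    log.info("async hello {}", 42);
  }
}
{code}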


> On March 3, 2018, 4:41 a.m., Tomás Fernández Löbbe wrote:
> > solr/core/src/java/org/apache/solr/logging/log4j2/Log4j2Watcher.java
> > Lines 154 (patched)
> > 
> >
> > Should this could be parametrized? (applies to other entries in this 
> > file too)

Done


> On March 3, 2018, 4:41 a.m., Tomás Fernández Löbbe wrote:
> > solr/core/src/test/org/apache/solr/logging/log4j2/Log4j2WatcherTest.java
> > Lines 88-89 (patched)
> > 
> >
> > assertTrue?

Done


- Erick


---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/65888/#review198580
---


On March 3, 2018, 2:51 a.m., Varun Thacker wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/65888/
> ---
> 
> (Updated March 3, 2018, 2:51 a.m.)
> 
> 
> Review request for lucene.
> 
> 
> Repository: lucene-solr
> 
> 
> Description
> ---
> 
> Upgrade Solr to use Log4J2
> 
> 
> Diffs
> -
> 
>   lucene/CHANGES.txt e3799bc9d5 
>   lucene/common-build.xml 4fa59ac936 
>   lucene/ivy-versions.properties 5ab36ddfa2 
>   solr/CHANGES.txt 05f4f560bd 
>   solr/bin/install_solr_service.sh b82957144d 
>   solr/bin/solr 4e178de945 
>   solr/bin/solr.cmd dcff0c6af7 
>   solr/bin/solr.in.cmd bfb33e0e9d 
>   solr/bin/solr.in.sh e7478cdf5c 
>   solr/contrib/clustering/src/test-files/log4j.properties b5216db8b2 
>   solr/contrib/clustering/src/test-files/log4j2.xml PRE-CREATION 
>   solr/contrib/dataimporthandler/src/test-files/log4j.properties d3ea4deafc 
>   solr/contrib/dataimporthandler/src/test-files/log4j2.xml PRE-CREATION 
>   solr/contrib/ltr/src/test-files/log4j.properties d86c6988d5 
>   solr/contrib/ltr/src/test-files/log4j2.xml PRE-CREATION 
>   solr/core/ivy.xml ff4fa48679 
>   
> solr/core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java 
> 23a8dc1eb3 
>   solr/core/src/java/org/apache/solr/handler/admin/LoggingHandler.java 
> 122d2cbf8b 
>   solr/core/src/java/org/apache/solr/logging/LogWatcher.java c510590282 
>   solr/core/src/java/org/apache/solr/logging/log4j/EventAppender.java 
> ff2876fb2f 
>   solr/core/src/java/org/apache/solr/logging/log4j/Log4jInfo.java dfd3dde74a 
>   solr/core/src/java/org/apache/solr/logging/log4j/Log4jWatcher.java 
> 04fa5fb1d8 
>   solr/core/src/java/org/apache/solr/logging/log4j/package-info.java 
> f78953385c 
>   solr/core/src/java/org/apache/solr/logging/log4j2/Log4j2Watcher.java 
> PRE-CREATION 
>   solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java 
> 4d944d239b 
>   solr/core/src/java/org/apache/solr/util/SolrCLI.java 266ef3a28a 
>   solr/core/src/java/org/apache/solr/util/SolrLogLayout.java a60ada828b 
>   solr/core/src/java/org/apache/solr/util/StartupLoggingUtils.java c582eff4c0 
>   solr/core/src/test-files/log4j.properties 969439a228 
>   solr/core/src/test-files/log4j2.xml PRE-CREATION 
>   solr/core/src/test/org/apache/solr/handler/RequestLoggingTest.java 
> 4c780ccda4 
>   solr/core/src/test/org/apache/solr/handler/admin/LoggingHandlerTest.java 
> 555c1376a5 
>   solr/core/src/test/org/apache/solr/logging/log4j2/Log4j2WatcherTest.java 
> PRE-CREATION 
>   solr/core/src/test/org/apache/solr/util/TestSolrCLIRunExample.java 
> 89008517f8 
>   solr/example/README.txt 562c256377 
>   solr/example/example-DIH/solr/db/conf/solrconfig.xml 1ffbbe817f 
>   

[jira] [Created] (SOLR-12055) Enable asynch logging by default

2018-03-04 Thread Erick Erickson (JIRA)
Erick Erickson created SOLR-12055:
-

 Summary: Enable asynch logging by default
 Key: SOLR-12055
 URL: https://issues.apache.org/jira/browse/SOLR-12055
 Project: Solr
  Issue Type: Improvement
  Security Level: Public (Default Security Level. Issues are Public)
  Components: logging
Affects Versions: 7.2
Reporter: Erick Erickson
Assignee: Erick Erickson


When SOLR-7887 is done, switching to async logging will be a simple change to 
the config files for log4j2. This will reduce contention and increase 
throughput generally, and logging throughput in particular.

There's a discussion of the pros/cons here: 
https://logging.apache.org/log4j/2.0/manual/async.html

An alternative is to put a note in the Ref Guide about how to enable async 
logging.

I guess even if we enable async by default the ref guide still needs a note 
about how to _disable_ it.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-NightlyTests-master - Build # 1494 - Still Unstable

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-master/1494/

4 tests failed.
FAILED:  
org.apache.solr.cloud.cdcr.CdcrReplicationHandlerTest.testReplicationWithBufferedUpdates

Error Message:
Timeout while trying to assert number of documents @ 
https://127.0.0.1:50859/rqgf/zx/source_collection_shard1_replica_n1/

Stack Trace:
java.lang.AssertionError: Timeout while trying to assert number of documents @ 
https://127.0.0.1:50859/rqgf/zx/source_collection_shard1_replica_n1/
at 
__randomizedtesting.SeedInfo.seed([BBB8DF46830C4269:68B18F58C69FDEFE]:0)
at 
org.apache.solr.cloud.cdcr.CdcrReplicationHandlerTest.assertNumDocs(CdcrReplicationHandlerTest.java:257)
at 
org.apache.solr.cloud.cdcr.CdcrReplicationHandlerTest.testReplicationWithBufferedUpdates(CdcrReplicationHandlerTest.java:237)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:993)
at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:968)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 

[jira] [Updated] (LUCENE-8190) Replace dependency on LegacyCell for setting pruneLeafyBranches on RecursivePrefixTreeStrategy

2018-03-04 Thread Ignacio Vera (JIRA)

 [ 
https://issues.apache.org/jira/browse/LUCENE-8190?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ignacio Vera updated LUCENE-8190:
-
Attachment: LUCENE-8190.patch

> Replace dependency on LegacyCell for setting pruneLeafyBranches on 
> RecursivePrefixTreeStrategy
> 
>
> Key: LUCENE-8190
> URL: https://issues.apache.org/jira/browse/LUCENE-8190
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: modules/spatial-extras
>Reporter: Ignacio Vera
>Priority: Major
> Attachments: LUCENE-8190.patch, LUCENE-8190.patch
>
>
> The setting {{pruneLeafyBranches}} on {{RecursivePrefixTreeStrategy}} depends 
> on abstract class {{LegacyCell}} and therefore trees like the newly added 
> {{S2PrefixTree}} cannot benefit from such an optimization.
> It is proposed to add a new specialized interface for the {{Cell}} interface and 
> make the setting depend on it instead of {{LegacyCell}}.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8190) Replace dependency on LegacyCell for setting pruneLeafyBranches on RecursivePrefixTreeStrategy

2018-03-04 Thread Ignacio Vera (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385324#comment-16385324
 ] 

Ignacio Vera commented on LUCENE-8190:
--

Noted the issue with javadocs.

I think there is some inconsistency in how this property is treated. It seems it 
belongs to the cell, but the RPT strategy treats it as a property of the tree.

I was thinking of a (probably unrealistic) tree with mixed cell types. Those 
cells would not be treated as expected, because iteration stops at the first 
cell that does not implement the new interface.

I am not sure if it is too aggressive, but the logic should visit all cells 
regardless of type and only bail out at prune time, when a cell turns out not 
to be eligible, not before. I attach a patch with the idea.

 

 

> Replace dependency on LegacyCell for setting pruneLeafyBranches on 
> RecursivePrefixTreeStrategy
> 
>
> Key: LUCENE-8190
> URL: https://issues.apache.org/jira/browse/LUCENE-8190
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: modules/spatial-extras
>Reporter: Ignacio Vera
>Priority: Major
> Attachments: LUCENE-8190.patch, LUCENE-8190.patch
>
>
> The setting {{pruneLeafyBranches}} on {{RecursivePrefixTreeStrategy}} depends 
> on abstract class {{LegacyCell}} and therefore trees like the newly added 
> {{S2PrefixTree}} cannot benefit from such an optimization.
> It is proposed to add a new specialized interface for the {{Cell}} interface and 
> make the setting depend on it instead of {{LegacyCell}}.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8186) CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385323#comment-16385323
 ] 

Robert Muir commented on LUCENE-8186:
-

Yeah, the biggest issue i see is the lack of type safety. Currently the method 
is on an interface like this:

{code}
public AbstractAnalysisFactory getMultiTermComponent();
{code}

This means a CharFilterFactory can return a TokenizerFactory or other crazy 
possibilities. Users will get ClassCastException in such cases. This is all 
unrelated to this issue, but it's horrible.

IMO it would be better if the API worked differently, e.g. three methods that 
enforce the correct return type. This would remove the casts and prevent stupid 
stuff from happening in the factories themselves.

{code}
TokenizerFactory:
  public TokenFilterFactory getMultiTermComponent() { return null; }
TokenFilterFactory:
  public TokenFilterFactory getMultiTermComponent() { return null; }
CharFilterFactory:
  public CharFilterFactory getMultiTermComponent() { return null; }
{code}
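
For context, a rough sketch (assumed method shape, not the actual CustomAnalyzer 
source) of how a caller has to consume the current untyped method; the cast below 
is exactly where a ClassCastException can surface:

{code:java}
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.util.AbstractAnalysisFactory;
import org.apache.lucene.analysis.util.MultiTermAwareComponent;
import org.apache.lucene.analysis.util.TokenFilterFactory;

public class MultiTermCastSketch {
  static TokenStream applyMultiTerm(TokenFilterFactory factory, TokenStream in) {
    if (factory instanceof MultiTermAwareComponent) {
      AbstractAnalysisFactory mtc = ((MultiTermAwareComponent) factory).getMultiTermComponent();
      if (mtc != null) {
        // Nothing in the types prevents mtc from being, say, a TokenizerFactory.
        return ((TokenFilterFactory) mtc).create(in);
      }
    }
    return in;
  }
}
{code}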


> CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms 
> --
>
> Key: LUCENE-8186
> URL: https://issues.apache.org/jira/browse/LUCENE-8186
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Tim Allison
>Priority: Minor
> Attachments: LUCENE-8186.patch
>
>
> While working on SOLR-12034, a unit test that relied on the 
> LowerCaseTokenizerFactory failed.
> After some digging, I was able to replicate this at the Lucene level.
> Unit test:
> {noformat}
>   @Test
>   public void testLCTokenizerFactoryNormalize() throws Exception {
> Analyzer analyzer =  
> CustomAnalyzer.builder().withTokenizer(LowerCaseTokenizerFactory.class).build();
> //fails
> assertEquals(new BytesRef("hello"), analyzer.normalize("f", "Hello"));
> 
> //now try an integration test with the classic query parser
> QueryParser p = new QueryParser("f", analyzer);
> Query q = p.parse("Hello");
> //passes
> assertEquals(new TermQuery(new Term("f", "hello")), q);
> q = p.parse("Hello*");
> //fails
> assertEquals(new PrefixQuery(new Term("f", "hello")), q);
> q = p.parse("Hel*o");
> //fails
> assertEquals(new WildcardQuery(new Term("f", "hel*o")), q);
>   }
> {noformat}
> The problem is that the CustomAnalyzer iterates through the tokenfilters, but 
> does not call the tokenizer, which, in the case of the LowerCaseTokenizer, 
> does the filtering work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8186) CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms

2018-03-04 Thread Uwe Schindler (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385321#comment-16385321
 ] 

Uwe Schindler commented on LUCENE-8186:
---

I still don't understand why the Solr TokenizerChain does not do this, although 
the reporter claimed that in Solr it works: 
https://github.com/apache/lucene-solr/blob/e2521b2a8baabdaf43b92192588f51e042d21e97/solr/core/src/java/org/apache/solr/analysis/TokenizerChain.java

> CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms 
> --
>
> Key: LUCENE-8186
> URL: https://issues.apache.org/jira/browse/LUCENE-8186
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Tim Allison
>Priority: Minor
> Attachments: LUCENE-8186.patch
>
>
> While working on SOLR-12034, a unit test that relied on the 
> LowerCaseTokenizerFactory failed.
> After some digging, I was able to replicate this at the Lucene level.
> Unit test:
> {noformat}
>   @Test
>   public void testLCTokenizerFactoryNormalize() throws Exception {
> Analyzer analyzer =  
> CustomAnalyzer.builder().withTokenizer(LowerCaseTokenizerFactory.class).build();
> //fails
> assertEquals(new BytesRef("hello"), analyzer.normalize("f", "Hello"));
> 
> //now try an integration test with the classic query parser
> QueryParser p = new QueryParser("f", analyzer);
> Query q = p.parse("Hello");
> //passes
> assertEquals(new TermQuery(new Term("f", "hello")), q);
> q = p.parse("Hello*");
> //fails
> assertEquals(new PrefixQuery(new Term("f", "hello")), q);
> q = p.parse("Hel*o");
> //fails
> assertEquals(new WildcardQuery(new Term("f", "hel*o")), q);
>   }
> {noformat}
> The problem is that the CustomAnalyzer iterates through the tokenfilters, but 
> does not call the tokenizer, which, in the case of the LowerCaseTokenizer, 
> does the filtering work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8186) CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms

2018-03-04 Thread Uwe Schindler (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385316#comment-16385316
 ] 

Uwe Schindler commented on LUCENE-8186:
---

Ok. It's so horrible. Who invented that?

> CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms 
> --
>
> Key: LUCENE-8186
> URL: https://issues.apache.org/jira/browse/LUCENE-8186
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Tim Allison
>Priority: Minor
> Attachments: LUCENE-8186.patch
>
>
> While working on SOLR-12034, a unit test that relied on the 
> LowerCaseTokenizerFactory failed.
> After some digging, I was able to replicate this at the Lucene level.
> Unit test:
> {noformat}
>   @Test
>   public void testLCTokenizerFactoryNormalize() throws Exception {
> Analyzer analyzer =  
> CustomAnalyzer.builder().withTokenizer(LowerCaseTokenizerFactory.class).build();
> //fails
> assertEquals(new BytesRef("hello"), analyzer.normalize("f", "Hello"));
> 
> //now try an integration test with the classic query parser
> QueryParser p = new QueryParser("f", analyzer);
> Query q = p.parse("Hello");
> //passes
> assertEquals(new TermQuery(new Term("f", "hello")), q);
> q = p.parse("Hello*");
> //fails
> assertEquals(new PrefixQuery(new Term("f", "hello")), q);
> q = p.parse("Hel*o");
> //fails
> assertEquals(new WildcardQuery(new Term("f", "hel*o")), q);
>   }
> {noformat}
> The problem is that the CustomAnalyzer iterates through the tokenfilters, but 
> does not call the tokenizer, which, in the case of the LowerCaseTokenizer, 
> does the filtering work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8186) CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385312#comment-16385312
 ] 

Robert Muir commented on LUCENE-8186:
-

See code for that: 
https://github.com/apache/lucene-solr/blob/master/lucene/analysis/common/src/java/org/apache/lucene/analysis/custom/CustomAnalyzer.java#L125-L134

> CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms 
> --
>
> Key: LUCENE-8186
> URL: https://issues.apache.org/jira/browse/LUCENE-8186
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Tim Allison
>Priority: Minor
> Attachments: LUCENE-8186.patch
>
>
> While working on SOLR-12034, a unit test that relied on the 
> LowerCaseTokenizerFactory failed.
> After some digging, I was able to replicate this at the Lucene level.
> Unit test:
> {noformat}
>   @Test
>   public void testLCTokenizerFactoryNormalize() throws Exception {
> Analyzer analyzer =  
> CustomAnalyzer.builder().withTokenizer(LowerCaseTokenizerFactory.class).build();
> //fails
> assertEquals(new BytesRef("hello"), analyzer.normalize("f", "Hello"));
> 
> //now try an integration test with the classic query parser
> QueryParser p = new QueryParser("f", analyzer);
> Query q = p.parse("Hello");
> //passes
> assertEquals(new TermQuery(new Term("f", "hello")), q);
> q = p.parse("Hello*");
> //fails
> assertEquals(new PrefixQuery(new Term("f", "hello")), q);
> q = p.parse("Hel*o");
> //fails
> assertEquals(new WildcardQuery(new Term("f", "hel*o")), q);
>   }
> {noformat}
> The problem is that the CustomAnalyzer iterates through the tokenfilters, but 
> does not call the tokenizer, which, in the case of the LowerCaseTokenizer, 
> does the filtering work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8186) CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385309#comment-16385309
 ] 

Robert Muir commented on LUCENE-8186:
-

CharFilterFactories can normalize too, but I think CustomAnalyzer already does 
the right thing for them? 

> CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms 
> --
>
> Key: LUCENE-8186
> URL: https://issues.apache.org/jira/browse/LUCENE-8186
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Tim Allison
>Priority: Minor
> Attachments: LUCENE-8186.patch
>
>
> While working on SOLR-12034, a unit test that relied on the 
> LowerCaseTokenizerFactory failed.
> After some digging, I was able to replicate this at the Lucene level.
> Unit test:
> {noformat}
>   @Test
>   public void testLCTokenizerFactoryNormalize() throws Exception {
> Analyzer analyzer =  
> CustomAnalyzer.builder().withTokenizer(LowerCaseTokenizerFactory.class).build();
> //fails
> assertEquals(new BytesRef("hello"), analyzer.normalize("f", "Hello"));
> 
> //now try an integration test with the classic query parser
> QueryParser p = new QueryParser("f", analyzer);
> Query q = p.parse("Hello");
> //passes
> assertEquals(new TermQuery(new Term("f", "hello")), q);
> q = p.parse("Hello*");
> //fails
> assertEquals(new PrefixQuery(new Term("f", "hello")), q);
> q = p.parse("Hel*o");
> //fails
> assertEquals(new WildcardQuery(new Term("f", "hel*o")), q);
>   }
> {noformat}
> The problem is that the CustomAnalyzer iterates through the tokenfilters, but 
> does not call the tokenizer, which, in the case of the LowerCaseTokenizer, 
> does the filtering work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8186) CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms

2018-03-04 Thread Uwe Schindler (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385302#comment-16385302
 ] 

Uwe Schindler commented on LUCENE-8186:
---

Thanks Robert. Looks ok, although horrible.

How about CharFilters? Do they have the same problem?

> CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms 
> --
>
> Key: LUCENE-8186
> URL: https://issues.apache.org/jira/browse/LUCENE-8186
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Tim Allison
>Priority: Minor
> Attachments: LUCENE-8186.patch
>
>
> While working on SOLR-12034, a unit test that relied on the 
> LowerCaseTokenizerFactory failed.
> After some digging, I was able to replicate this at the Lucene level.
> Unit test:
> {noformat}
>   @Test
>   public void testLCTokenizerFactoryNormalize() throws Exception {
> Analyzer analyzer =  
> CustomAnalyzer.builder().withTokenizer(LowerCaseTokenizerFactory.class).build();
> //fails
> assertEquals(new BytesRef("hello"), analyzer.normalize("f", "Hello"));
> 
> //now try an integration test with the classic query parser
> QueryParser p = new QueryParser("f", analyzer);
> Query q = p.parse("Hello");
> //passes
> assertEquals(new TermQuery(new Term("f", "hello")), q);
> q = p.parse("Hello*");
> //fails
> assertEquals(new PrefixQuery(new Term("f", "hello")), q);
> q = p.parse("Hel*o");
> //fails
> assertEquals(new WildcardQuery(new Term("f", "hel*o")), q);
>   }
> {noformat}
> The problem is that the CustomAnalyzer iterates through the tokenfilters, but 
> does not call the tokenizer, which, in the case of the LowerCaseTokenizer, 
> does the filtering work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-12008) Settle a location for the "correct" log4j2.xml file.

2018-03-04 Thread Erick Erickson (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-12008?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Erick Erickson updated SOLR-12008:
--
Summary: Settle a location for the "correct" log4j2.xml file.  (was: Remove 
log4j.properties file in solr/example/resources (and perhaps others))

> Settle a location for the "correct" log4j2.xml file.
> 
>
> Key: SOLR-12008
> URL: https://issues.apache.org/jira/browse/SOLR-12008
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: logging
>Reporter: Erick Erickson
>Assignee: Erick Erickson
>Priority: Major
>
> As part of SOLR-11934 I started looking at log4j.properties files. Waaay back 
> in 2015, the %C in "/solr/server/resources/log4j.properties" was changed to 
> use %c, but the file in "solr/example/resources/log4j.properties" was not 
> changed. That got me to looking around and there are a bunch of 
> log4j.properties files:
> ./solr/core/src/test-files/log4j.properties
> ./solr/example/resources/log4j.properties
> ./solr/solrj/src/test-files/log4j.properties
> ./solr/server/resources/log4j.properties
> ./solr/server/scripts/cloud-scripts/log4j.properties
> ./solr/contrib/dataimporthandler/src/test-files/log4j.properties
> ./solr/contrib/clustering/src/test-files/log4j.properties
> ./solr/contrib/ltr/src/test-files/log4j.properties
> ./solr/test-framework/src/test-files/log4j.properties
> Why do we have so many? After the log4j2 ticket gets checked in (SOLR-7887) I 
> propose the logging configuration files get consolidated. The question is 
> "how far"? 
> I at least want to get rid of the one in solr/example, users should use the 
> one in server/resources. Having to maintain these two separately is asking 
> for trouble.
> [~markrmil...@gmail.com] Do you have any wisdom on the properties file in 
> server/scripts/cloud-scripts?
> Anyone else who has a clue about why the other properties files were created, 
> especially the ones in contrib?
> And what about all the ones in various test-files directories? People didn't 
> create them for no reason, and I don't want to rediscover that it's a real 
> pain to try to re-use the one in server/resources for instance.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-12008) Remove log4j.properties file in solr/example/resources (and perhaps others)

2018-03-04 Thread Erick Erickson (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-12008?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385301#comment-16385301
 ] 

Erick Erickson commented on SOLR-12008:
---

We have no fewer than three log4j2.xml config files (as of SOLR-7887):

solr/example/resources
solr/server/resources
server/scripts/cloud-scripts - this one is referred to in bin/solr

As of SOLR-7887 they are all identical. 

> Remove log4j.properties file in solr/example/resources (and perhaps others)
> ---
>
> Key: SOLR-12008
> URL: https://issues.apache.org/jira/browse/SOLR-12008
> Project: Solr
>  Issue Type: Improvement
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: logging
>Reporter: Erick Erickson
>Assignee: Erick Erickson
>Priority: Major
>
> As part of SOLR-11934 I started looking at log4j.properties files. Waaay back 
> in 2015, the %C in "/solr/server/resources/log4j.properties" was changed to 
> use %c, but the file in "solr/example/resources/log4j.properties" was not 
> changed. That got me to looking around and there are a bunch of 
> log4j.properties files:
> ./solr/core/src/test-files/log4j.properties
> ./solr/example/resources/log4j.properties
> ./solr/solrj/src/test-files/log4j.properties
> ./solr/server/resources/log4j.properties
> ./solr/server/scripts/cloud-scripts/log4j.properties
> ./solr/contrib/dataimporthandler/src/test-files/log4j.properties
> ./solr/contrib/clustering/src/test-files/log4j.properties
> ./solr/contrib/ltr/src/test-files/log4j.properties
> ./solr/test-framework/src/test-files/log4j.properties
> Why do we have so many? After the log4j2 ticket gets checked in (SOLR-7887) I 
> propose the logging configuration files get consolidated. The question is 
> "how far"? 
> I at least want to get rid of the one in solr/example, users should use the 
> one in server/resources. Having to maintain these two separately is asking 
> for trouble.
> [~markrmil...@gmail.com] Do you have any wisdom on the properties file in 
> server/scripts/cloud-scripts?
> Anyone else who has a clue about why the other properties files were created, 
> especially the ones in contrib?
> And what about all the ones in various test-files directories? People didn't 
> create them for no reason, and I don't want to rediscover that it's a real 
> pain to try to re-use the one in server/resources for instance.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8186) CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms

2018-03-04 Thread Robert Muir (JIRA)

 [ 
https://issues.apache.org/jira/browse/LUCENE-8186?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Muir updated LUCENE-8186:

Attachment: LUCENE-8186.patch

> CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms 
> --
>
> Key: LUCENE-8186
> URL: https://issues.apache.org/jira/browse/LUCENE-8186
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Tim Allison
>Priority: Minor
> Attachments: LUCENE-8186.patch
>
>
> While working on SOLR-12034, a unit test that relied on the 
> LowerCaseTokenizerFactory failed.
> After some digging, I was able to replicate this at the Lucene level.
> Unit test:
> {noformat}
>   @Test
>   public void testLCTokenizerFactoryNormalize() throws Exception {
> Analyzer analyzer =  
> CustomAnalyzer.builder().withTokenizer(LowerCaseTokenizerFactory.class).build();
> //fails
> assertEquals(new BytesRef("hello"), analyzer.normalize("f", "Hello"));
> 
> //now try an integration test with the classic query parser
> QueryParser p = new QueryParser("f", analyzer);
> Query q = p.parse("Hello");
> //passes
> assertEquals(new TermQuery(new Term("f", "hello")), q);
> q = p.parse("Hello*");
> //fails
> assertEquals(new PrefixQuery(new Term("f", "hello")), q);
> q = p.parse("Hel*o");
> //fails
> assertEquals(new WildcardQuery(new Term("f", "hel*o")), q);
>   }
> {noformat}
> The problem is that the CustomAnalyzer iterates through the tokenfilters, but 
> does not call the tokenizer, which, in the case of the LowerCaseTokenizer, 
> does the filtering work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8186) CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385297#comment-16385297
 ] 

Robert Muir commented on LUCENE-8186:
-

Uwe: I agree with you. For "normalize" the tokenization is implicitly keyword. 
Tokenizers can't concern themselves with the syntax of wildcards or regex, 
sorry. They work on real text.

I also agree that LowerCaseTokenizer is stupid :) But its factory does the 
correct thing and returns a LowerCase*Filter* for these purposes!

https://github.com/apache/lucene-solr/blob/master/lucene/analysis/common/src/java/org/apache/lucene/analysis/core/LowerCaseTokenizerFactory.java#L70-L74

So I think CustomAnalyzer forgets to call this method on the TokenizerFactory, 
just in case the tokenizer is doing something like this. It seems like this is 
really easy to fix in CustomAnalyzer.normalize? And yeah, separately, we should 
deprecate LowerCaseTokenizer.
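
Until that happens, a minimal workaround sketch (assuming lucene-analyzers-common is on 
the classpath; LowerCaseTokenizer is documented as equivalent to LetterTokenizer plus 
LowerCaseFilter): building the chain from LetterTokenizerFactory and LowerCaseFilterFactory 
puts the lowercasing into a token filter factory that normalize() already consults, so 
multi-term normalization behaves as expected. The class name below is made up.

{code:java}
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.core.LetterTokenizerFactory;
import org.apache.lucene.analysis.core.LowerCaseFilterFactory;
import org.apache.lucene.analysis.custom.CustomAnalyzer;
import org.apache.lucene.util.BytesRef;

public class LowerCaseNormalizeWorkaround {
  public static void main(String[] args) throws Exception {
    // Equivalent chain to LowerCaseTokenizer, but the lowercasing lives in a
    // TokenFilterFactory, so CustomAnalyzer.normalize() applies it to query terms.
    Analyzer analyzer = CustomAnalyzer.builder()
        .withTokenizer(LetterTokenizerFactory.class)
        .addTokenFilter(LowerCaseFilterFactory.class)
        .build();
    BytesRef norm = analyzer.normalize("f", "Hello");
    System.out.println(norm.utf8ToString()); // expected: hello
  }
}
{code}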

> CustomAnalyzer with a LowerCaseTokenizerFactory fails to normalize multiterms 
> --
>
> Key: LUCENE-8186
> URL: https://issues.apache.org/jira/browse/LUCENE-8186
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Tim Allison
>Priority: Minor
>
> While working on SOLR-12034, a unit test that relied on the 
> LowerCaseTokenizerFactory failed.
> After some digging, I was able to replicate this at the Lucene level.
> Unit test:
> {noformat}
>   @Test
>   public void testLCTokenizerFactoryNormalize() throws Exception {
> Analyzer analyzer =  
> CustomAnalyzer.builder().withTokenizer(LowerCaseTokenizerFactory.class).build();
> //fails
> assertEquals(new BytesRef("hello"), analyzer.normalize("f", "Hello"));
> 
> //now try an integration test with the classic query parser
> QueryParser p = new QueryParser("f", analyzer);
> Query q = p.parse("Hello");
> //passes
> assertEquals(new TermQuery(new Term("f", "hello")), q);
> q = p.parse("Hello*");
> //fails
> assertEquals(new PrefixQuery(new Term("f", "hello")), q);
> q = p.parse("Hel*o");
> //fails
> assertEquals(new WildcardQuery(new Term("f", "hel*o")), q);
>   }
> {noformat}
> The problem is that the CustomAnalyzer iterates through the tokenfilters, but 
> does not call the tokenizer, which, in the case of the LowerCaseTokenizer, 
> does the filtering work.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS-EA] Lucene-Solr-6.6-Linux (64bit/jdk-10-ea+43) - Build # 190 - Still Unstable!

2018-03-04 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-6.6-Linux/190/
Java: 64bit/jdk-10-ea+43 -XX:+UseCompressedOops -XX:+UseSerialGC

69 tests failed.
FAILED:  
junit.framework.TestSuite.org.apache.solr.cloud.OverseerCollectionConfigSetProcessorTest

Error Message:
 Mockito cannot mock this class: class org.apache.solr.cloud.OverseerTaskQueue. 
 Mockito can only mock non-private & non-final classes. If you're not sure why 
you're getting this error, please report to the mailing list.   Java
   : 10 JVM vendor name: "Oracle Corporation" JVM vendor version : 10+43 
JVM name   : OpenJDK 64-Bit Server VM JVM version: 10+43 JVM 
info   : mixed mode OS name: Linux OS version : 
4.13.0-36-generic   Underlying exception : 
java.lang.UnsupportedOperationException: Cannot define class using reflection

Stack Trace:
org.mockito.exceptions.base.MockitoException: 
Mockito cannot mock this class: class org.apache.solr.cloud.OverseerTaskQueue.

Mockito can only mock non-private & non-final classes.
If you're not sure why you're getting this error, please report to the mailing 
list.


Java   : 10
JVM vendor name: "Oracle Corporation"
JVM vendor version : 10+43
JVM name   : OpenJDK 64-Bit Server VM
JVM version: 10+43
JVM info   : mixed mode
OS name: Linux
OS version : 4.13.0-36-generic


Underlying exception : java.lang.UnsupportedOperationException: Cannot define 
class using reflection
at __randomizedtesting.SeedInfo.seed([2387225AEFB7286]:0)
at 
org.apache.solr.cloud.OverseerCollectionConfigSetProcessorTest.setUpOnce(OverseerCollectionConfigSetProcessorTest.java:103)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1713)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:847)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:844)
Caused by: java.lang.UnsupportedOperationException: Cannot define class using 
reflection
at 
net.bytebuddy.dynamic.loading.ClassInjector$UsingReflection$Dispatcher$Unavailable.defineClass(ClassInjector.java:819)
at 
net.bytebuddy.dynamic.loading.ClassInjector$UsingReflection.inject(ClassInjector.java:183)
at 
net.bytebuddy.dynamic.loading.ClassLoadingStrategy$Default$InjectionDispatcher.load(ClassLoadingStrategy.java:187)
at 
net.bytebuddy.dynamic.TypeResolutionStrategy$Passive.initialize(TypeResolutionStrategy.java:79)
at 
net.bytebuddy.dynamic.DynamicType$Default$Unloaded.load(DynamicType.java:4352)
at 
org.mockito.internal.creation.bytebuddy.SubclassBytecodeGenerator.mockClass(SubclassBytecodeGenerator.java:94)
at 

[jira] [Updated] (SOLR-12054) ebeAdd and ebeSubtract should support matrix operations

2018-03-04 Thread Joel Bernstein (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-12054?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Joel Bernstein updated SOLR-12054:
--
Attachment: SOLR-12054.patch

> ebeAdd and ebeSubtract should support matrix operations
> ---
>
> Key: SOLR-12054
> URL: https://issues.apache.org/jira/browse/SOLR-12054
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Joel Bernstein
>Assignee: Joel Bernstein
>Priority: Major
> Fix For: 7.3
>
> Attachments: SOLR-12054.patch
>
>
> Currently ebeAdd and ebeSubtract perform element-by-element addition and 
> subtraction of vectors. This ticket will allow them to perform 
> element-by-element addition and subtraction of matrices as well.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-12054) ebeAdd and ebeSubtract should support matrix operations

2018-03-04 Thread Joel Bernstein (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-12054?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Joel Bernstein updated SOLR-12054:
--
Fix Version/s: 7.3

> ebeAdd and ebeSubtract should support matrix operations
> ---
>
> Key: SOLR-12054
> URL: https://issues.apache.org/jira/browse/SOLR-12054
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Joel Bernstein
>Assignee: Joel Bernstein
>Priority: Major
> Fix For: 7.3
>
>
> Currently ebeAdd and ebeSubtract perform element-by-element addition and 
> subtraction of vectors. This ticket will allow them to perform 
> element-by-element addition and subtraction of matrices as well.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Assigned] (SOLR-12054) ebeAdd and ebeSubtract should support matrix operations

2018-03-04 Thread Joel Bernstein (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-12054?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Joel Bernstein reassigned SOLR-12054:
-

Assignee: Joel Bernstein

> ebeAdd and ebeSubtract should support matrix operations
> ---
>
> Key: SOLR-12054
> URL: https://issues.apache.org/jira/browse/SOLR-12054
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>Reporter: Joel Bernstein
>Assignee: Joel Bernstein
>Priority: Major
> Fix For: 7.3
>
>
> Currently ebeAdd and ebeSubtract perform element-by-element addition and 
> subtraction of vectors. This ticket will allow them to perform 
> element-by-element addition and subtraction of matrices as well.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Created] (SOLR-12054) ebeAdd and ebeSubtract should support matrix operations

2018-03-04 Thread Joel Bernstein (JIRA)
Joel Bernstein created SOLR-12054:
-

 Summary: ebeAdd and ebeSubtract should support matrix operations
 Key: SOLR-12054
 URL: https://issues.apache.org/jira/browse/SOLR-12054
 Project: Solr
  Issue Type: New Feature
  Security Level: Public (Default Security Level. Issues are Public)
Reporter: Joel Bernstein


Currently ebeAdd and ebeSubtract perform element-by-element addition and 
subtraction of vectors. This ticket will allow them to perform 
element-by-element addition and subtraction of matrices as well.
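
For readers unfamiliar with the operation, element-by-element addition over a matrix is 
just the vector case applied row by row; below is a plain-Java sketch of the semantics 
(illustrative only, not the Solr streaming expression implementation).

{code:java}
public final class EbeSketch {
  // Element-by-element addition of two equally sized matrices; subtraction is
  // identical with "-" in place of "+".
  static double[][] ebeAdd(double[][] a, double[][] b) {
    double[][] out = new double[a.length][];
    for (int i = 0; i < a.length; i++) {
      out[i] = new double[a[i].length];
      for (int j = 0; j < a[i].length; j++) {
        out[i][j] = a[i][j] + b[i][j];
      }
    }
    return out;
  }
}
{code}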



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (SOLR-11947) 7.3 Streaming Expression Documentation

2018-03-04 Thread Joel Bernstein (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-11947?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Joel Bernstein updated SOLR-11947:
--
Attachment: SOLR-11947.patch

> 7.3 Streaming Expression Documentation
> --
>
> Key: SOLR-11947
> URL: https://issues.apache.org/jira/browse/SOLR-11947
> Project: Solr
>  Issue Type: New Feature
>  Security Level: Public(Default Security Level. Issues are Public) 
>  Components: documentation, streaming expressions
>Reporter: Joel Bernstein
>Priority: Major
> Attachments: SOLR-11947.patch, SOLR-11947.patch
>
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8159) Add a copy constructor in AutomatonQuery to copy directly the compiled automaton

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385277#comment-16385277
 ] 

Robert Muir commented on LUCENE-8159:
-

I don't think we should add this CompiledAutomaton stuff to the public API. It's 
an internal implementation detail.

It also doesn't match what the user expects since it does strange things for 
low-level efficiency such as conversion to UTF-8.

> Add a copy constructor in AutomatonQuery to copy directly the compiled 
> automaton
> 
>
> Key: LUCENE-8159
> URL: https://issues.apache.org/jira/browse/LUCENE-8159
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: core/search
>Affects Versions: trunk
>Reporter: Bruno Roustant
>Assignee: David Smiley
>Priority: Major
> Attachments: 
> 0001-Add-a-copy-constructor-in-AutomatonQuery-to-copy-dir.patch, 
> LUCENE-8159.patch
>
>
> When the query is composed of multiple AutomatonQuery with the same automaton 
> and which target different fields, it is much more efficient to reuse the 
> already compiled automaton by copying it directly and just changing the 
> target field.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8159) Add a copy constructor in AutomatonQuery to copy directly the compiled automaton

2018-03-04 Thread David Smiley (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385272#comment-16385272
 ] 

David Smiley commented on LUCENE-8159:
--

So the path forward is:
 * Add a constructor to AutomatonQuery that accepts a CompiledAutomaton.
 * Add a getter for this CompiledAutomaton to AutomatonQuery

No changes to AQ subclasses.
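
For context, a rough sketch of the multi-field pattern the issue describes (the field 
list and the regexp are illustrative, not from the report): each AutomatonQuery below 
compiles the shared Automaton again, which is the cost the proposed constructor/getter 
would avoid by letting callers hand over the compiled form once.

{code:java}
import java.util.List;
import org.apache.lucene.index.Term;
import org.apache.lucene.search.AutomatonQuery;
import org.apache.lucene.search.BooleanClause;
import org.apache.lucene.search.BooleanQuery;
import org.apache.lucene.search.Query;
import org.apache.lucene.util.automaton.Automaton;
import org.apache.lucene.util.automaton.RegExp;

public class MultiFieldAutomatonSketch {
  // Builds the same automaton-based query against several fields; today each
  // AutomatonQuery instance compiles the Automaton independently.
  static Query acrossFields(List<String> fields) {
    Automaton automaton = new RegExp("foo.*bar").toAutomaton();
    BooleanQuery.Builder builder = new BooleanQuery.Builder();
    for (String field : fields) {
      builder.add(new AutomatonQuery(new Term(field, "foo.*bar"), automaton),
          BooleanClause.Occur.SHOULD);
    }
    return builder.build();
  }
}
{code}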

> Add a copy constructor in AutomatonQuery to copy directly the compiled 
> automaton
> 
>
> Key: LUCENE-8159
> URL: https://issues.apache.org/jira/browse/LUCENE-8159
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: core/search
>Affects Versions: trunk
>Reporter: Bruno Roustant
>Assignee: David Smiley
>Priority: Major
> Attachments: 
> 0001-Add-a-copy-constructor-in-AutomatonQuery-to-copy-dir.patch, 
> LUCENE-8159.patch
>
>
> When the query is composed of multiple AutomatonQuery with the same automaton 
> and which target different fields, it is much more efficient to reuse the 
> already compiled automaton by copying it directly and just changing the 
> target field.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-repro - Build # 188 - Still Unstable

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/188/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-NightlyTests-master/1493/consoleText

[repro] Revision: 18edca0fb25ec7cb79cef8110852e6e0c4c89e8d

[repro] Ant options: -Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt
[repro] Repro line:  ant test  -Dtestcase=ShardSplitTest 
-Dtests.method=testSplitAfterFailedSplit -Dtests.seed=CA8C715A4ADC7334 
-Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.locale=lv -Dtests.timezone=PNT -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=ShardSplitTest 
-Dtests.method=testSplitShardWithRule -Dtests.seed=CA8C715A4ADC7334 
-Dtests.multiplier=2 -Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.locale=lv -Dtests.timezone=PNT -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=HdfsChaosMonkeySafeLeaderTest 
-Dtests.method=test -Dtests.seed=CA8C715A4ADC7334 -Dtests.multiplier=2 
-Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.locale=pl -Dtests.timezone=America/Kentucky/Louisville 
-Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=HdfsRestartWhileUpdatingTest 
-Dtests.method=test -Dtests.seed=CA8C715A4ADC7334 -Dtests.multiplier=2 
-Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.locale=en -Dtests.timezone=US/Arizona -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=HdfsRestartWhileUpdatingTest 
-Dtests.seed=CA8C715A4ADC7334 -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.locale=en -Dtests.timezone=US/Arizona -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
97299ed00699c248fc38465ee1b0eb0bb1561d3d
[repro] git fetch
[repro] git checkout 18edca0fb25ec7cb79cef8110852e6e0c4c89e8d

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   HdfsRestartWhileUpdatingTest
[repro]   HdfsChaosMonkeySafeLeaderTest
[repro]   ShardSplitTest
[repro] ant compile-test

[...truncated 3292 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=15 
-Dtests.class="*.HdfsRestartWhileUpdatingTest|*.HdfsChaosMonkeySafeLeaderTest|*.ShardSplitTest"
 -Dtests.showOutput=onerror -Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.seed=CA8C715A4ADC7334 -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-master/test-data/enwiki.random.lines.txt
 -Dtests.locale=en -Dtests.timezone=US/Arizona -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[...truncated 85802 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   0/5 failed: org.apache.solr.cloud.api.collections.ShardSplitTest
[repro]   1/5 failed: org.apache.solr.cloud.hdfs.HdfsChaosMonkeySafeLeaderTest
[repro]   2/5 failed: org.apache.solr.cloud.hdfs.HdfsRestartWhileUpdatingTest
[repro] git checkout 97299ed00699c248fc38465ee1b0eb0bb1561d3d

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 5 lines...]

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[jira] [Commented] (LUCENE-8159) Add a copy constructor in AutomatonQuery to copy directly the compiled automaton

2018-03-04 Thread Adrien Grand (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385268#comment-16385268
 ] 

Adrien Grand commented on LUCENE-8159:
--

I understand why it helps in your case but searching over lots of fields is a 
bad practice so I would rather not design APIs specifically to address this 
problem. I don't like the idea of adding expert APIs to high-level queries like 
PrefixQuery, WildcardQuery or TermRangeQuery: there should be a single way to 
construct those queries and it should be simple.

> Add a copy constructor in AutomatonQuery to copy directly the compiled 
> automaton
> 
>
> Key: LUCENE-8159
> URL: https://issues.apache.org/jira/browse/LUCENE-8159
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: core/search
>Affects Versions: trunk
>Reporter: Bruno Roustant
>Assignee: David Smiley
>Priority: Major
> Attachments: 
> 0001-Add-a-copy-constructor-in-AutomatonQuery-to-copy-dir.patch, 
> LUCENE-8159.patch
>
>
> When the query is composed of multiple AutomatonQuery with the same automaton 
> and which target different fields, it is much more efficient to reuse the 
> already compiled automaton by copying it directly and just changing the 
> target field.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: Review Request 65888: Upgrade Solr to use Log4J2

2018-03-04 Thread Erick Erickson


> On March 4, 2018, 1:28 a.m., Erick Erickson wrote:
> > solr/core/ivy.xml
> > Lines 56 (patched)
> > 
> >
> > Should this still be 1.2?
> 
> Shawn Heisey wrote:
> This is the compatibility shim.  It is part of log4j2.  If a program just 
> logs to log4j and doesn't access really deep log4j internals, this shim will 
> usually allow drop-in upgrade to log4j2 without code changes.
> 
> The zookeeper server code does import log4j classes, so we do need this 
> shim.  There may be other dependencies we include that use log4j 1.2 directly.
> 
> I think that our log4j watcher code accesses internals too deeply for 
> this to work on Solr, though.

Thanks!
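
For anyone following along, a minimal illustration of the drop-in scenario Shawn 
describes (the class name is made up; assumes log4j-1.2-api plus log4j-api and 
log4j-core are on the classpath): code written against the old log4j 1.2 Logger API 
keeps compiling, and its output is routed through log4j2.

{code:java}
// Logs through the log4j 1.2 API; with the log4j-1.2-api bridge on the
// classpath the call is handled by log4j2, with no code changes required.
import org.apache.log4j.Logger;

public class Log4j1BridgeExample {
  private static final Logger log = Logger.getLogger(Log4j1BridgeExample.class);

  public static void main(String[] args) {
    log.info("routed to log4j2 via the log4j-1.2-api compatibility shim");
  }
}
{code}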


- Erick


---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/65888/#review198587
---


On March 3, 2018, 2:51 a.m., Varun Thacker wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/65888/
> ---
> 
> (Updated March 3, 2018, 2:51 a.m.)
> 
> 
> Review request for lucene.
> 
> 
> Repository: lucene-solr
> 
> 
> Description
> ---
> 
> Upgrade Solr to use Log4J2
> 
> 
> Diffs
> -
> 
>   lucene/CHANGES.txt e3799bc9d5 
>   lucene/common-build.xml 4fa59ac936 
>   lucene/ivy-versions.properties 5ab36ddfa2 
>   solr/CHANGES.txt 05f4f560bd 
>   solr/bin/install_solr_service.sh b82957144d 
>   solr/bin/solr 4e178de945 
>   solr/bin/solr.cmd dcff0c6af7 
>   solr/bin/solr.in.cmd bfb33e0e9d 
>   solr/bin/solr.in.sh e7478cdf5c 
>   solr/contrib/clustering/src/test-files/log4j.properties b5216db8b2 
>   solr/contrib/clustering/src/test-files/log4j2.xml PRE-CREATION 
>   solr/contrib/dataimporthandler/src/test-files/log4j.properties d3ea4deafc 
>   solr/contrib/dataimporthandler/src/test-files/log4j2.xml PRE-CREATION 
>   solr/contrib/ltr/src/test-files/log4j.properties d86c6988d5 
>   solr/contrib/ltr/src/test-files/log4j2.xml PRE-CREATION 
>   solr/core/ivy.xml ff4fa48679 
>   
> solr/core/src/java/org/apache/solr/client/solrj/embedded/JettySolrRunner.java 
> 23a8dc1eb3 
>   solr/core/src/java/org/apache/solr/handler/admin/LoggingHandler.java 
> 122d2cbf8b 
>   solr/core/src/java/org/apache/solr/logging/LogWatcher.java c510590282 
>   solr/core/src/java/org/apache/solr/logging/log4j/EventAppender.java 
> ff2876fb2f 
>   solr/core/src/java/org/apache/solr/logging/log4j/Log4jInfo.java dfd3dde74a 
>   solr/core/src/java/org/apache/solr/logging/log4j/Log4jWatcher.java 
> 04fa5fb1d8 
>   solr/core/src/java/org/apache/solr/logging/log4j/package-info.java 
> f78953385c 
>   solr/core/src/java/org/apache/solr/logging/log4j2/Log4j2Watcher.java 
> PRE-CREATION 
>   solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java 
> 4d944d239b 
>   solr/core/src/java/org/apache/solr/util/SolrCLI.java 266ef3a28a 
>   solr/core/src/java/org/apache/solr/util/SolrLogLayout.java a60ada828b 
>   solr/core/src/java/org/apache/solr/util/StartupLoggingUtils.java c582eff4c0 
>   solr/core/src/test-files/log4j.properties 969439a228 
>   solr/core/src/test-files/log4j2.xml PRE-CREATION 
>   solr/core/src/test/org/apache/solr/handler/RequestLoggingTest.java 
> 4c780ccda4 
>   solr/core/src/test/org/apache/solr/handler/admin/LoggingHandlerTest.java 
> 555c1376a5 
>   solr/core/src/test/org/apache/solr/logging/log4j2/Log4j2WatcherTest.java 
> PRE-CREATION 
>   solr/core/src/test/org/apache/solr/util/TestSolrCLIRunExample.java 
> 89008517f8 
>   solr/example/README.txt 562c256377 
>   solr/example/example-DIH/solr/db/conf/solrconfig.xml 1ffbbe817f 
>   solr/example/example-DIH/solr/mail/conf/solrconfig.xml 770b0fd870 
>   solr/example/example-DIH/solr/solr/conf/solrconfig.xml 3f00141340 
>   solr/example/resources/log4j.properties 02f91c5dae 
>   solr/example/resources/log4j2.xml PRE-CREATION 
>   solr/licenses/audience-annotations-0.7.0.jar.sha1 PRE-CREATION 
>   solr/licenses/audience-annotations-LICENSE-ASL.txt PRE-CREATION 
>   solr/licenses/audience-annotations-NOTICE.txt PRE-CREATION 
>   solr/licenses/disruptor-3.4.0.jar.sha1 PRE-CREATION 
>   solr/licenses/disruptor-LICENSE-ASL.txt PRE-CREATION 
>   solr/licenses/disruptor-NOTICE.txt PRE-CREATION 
>   solr/licenses/log4j-1.2-api-2.10.0.jar.sha1 PRE-CREATION 
>   solr/licenses/log4j-1.2.17.jar.sha1 383110e29f 
>   solr/licenses/log4j-api-2.10.0.jar.sha1 PRE-CREATION 
>   solr/licenses/log4j-api-LICENSE-ASL.txt PRE-CREATION 
>   solr/licenses/log4j-api-NOTICE.txt PRE-CREATION 
>   solr/licenses/log4j-core-2.10.0.jar.sha1 PRE-CREATION 
>   solr/licenses/log4j-core-LICENSE-ASL.txt PRE-CREATION 
>   solr/licenses/log4j-core-NOTICE.txt PRE-CREATION 
>   solr/licenses/log4j-slf4j-LICENSE-ASL.txt PRE-CREATION 
>   

Re: [VOTE] Release Lucene/Solr 6.6.3 RC1

2018-03-04 Thread Dawid Weiss
SUCCESS! [1:08:13.795357]

+1.

Thank you Steve.

Dawid


On Fri, Mar 2, 2018 at 11:09 PM, Steve Rowe  wrote:
> Please vote for release candidate 1 for Lucene/Solr 6.6.3.
>
> The artifacts can be downloaded from:
>
> https://dist.apache.org/repos/dist/dev/lucene/lucene-solr-6.6.3-RC1-revd1e9bbd333ea55cfa0c75d324424606e857a775b
>
> You can run the smoke tester directly with this command:
>
> python3 -u dev-tools/scripts/smokeTestRelease.py \
> https://dist.apache.org/repos/dist/dev/lucene/lucene-solr-6.6.3-RC1-revd1e9bbd333ea55cfa0c75d324424606e857a775b
>
> Here's my +1, smoke tester says SUCCESS! [0:34:...] (from memory - terminal 
> scrollback is uncooperative...)
>
> --
> Steve
> www.lucidworks.com
>
>
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org
>

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8190) Replace dependency on LegacyCell for setting pruneLeafyBranches on RecursivePrefixTreeStrategy

2018-03-04 Thread David Smiley (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385255#comment-16385255
 ] 

David Smiley commented on LUCENE-8190:
--

Perhaps instead of RPT throwing IllegalStateException, we do this pruning on a 
best effort basis.  The exception is quirky/unfriendly.  At the point this is 
thrown, we don't have a cell yet to check if it's an instanceof CellCanPrune.  
Hmm.  Maybe RPT's constructor could grab the world cell to check for 
initializing pruneLeafyBranches appropriately.  What do you think?  
recursiveTraverseAndPrune could also check to see if the cell is _not_ an 
instanceof CellCanPrune and simply return false.

Your patch has javadocs that reference classes/methods plainly instead of 
using {{\{@link ...\}}} syntax.  Please correct them.  If/when they get 
renamed, IDEs will detect these.  IDEs also see them in find-usages.  At least 
this is true with IntelliJ. 

The docs refer to "They will be eligible for prune bunchy leaves" but "bunchy" 
isn't the setting, it's "leafy".  Still, it'd be nicer to @-link the particular 
setting as mentioned above.  This approach would have avoided the misnomer as 
well (another reason to use {{\{@link ...\}}} in javadocs).

> Replace dependency on LegacyCell for setting pruneLeafyBranches on 
> RecursivePrefixTreeStrategy
> 
>
> Key: LUCENE-8190
> URL: https://issues.apache.org/jira/browse/LUCENE-8190
> Project: Lucene - Core
>  Issue Type: Improvement
>  Components: modules/spatial-extras
>Reporter: Ignacio Vera
>Priority: Major
> Attachments: LUCENE-8190.patch
>
>
> The setting {{pruneLeafyBranches}} on {{RecursivePrefixTreeStrategy}} depends 
> on the abstract class {{LegacyCell}}, and therefore trees like the newly added 
> {{S2PrefixTree}} cannot benefit from this optimization.
> It is proposed to add a new specialized interface alongside the {{Cell}} 
> interface and make the setting depend on it instead of {{LegacyCell}}.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8192) Remove offsetsAreCorrect from BaseTokenStreamTestCase

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8192?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385254#comment-16385254
 ] 

Robert Muir commented on LUCENE-8192:
-

I was wrong about the posinc/poslen checks: they weren't really "under" the 
boolean, but it was difficult to see that. 

I moved them in the latest patch to make this more obvious, but it doesn't 
change the logic.

> Remove offsetsAreCorrect from BaseTokenStreamTestCase
> -
>
> Key: LUCENE-8192
> URL: https://issues.apache.org/jira/browse/LUCENE-8192
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Attachments: LUCENE-8192.patch, LUCENE-8192_prototype.patch, 
> LUCENE-8192_take_two.patch
>
>
> Similar to LUCENE-8191, now that indexwriter checks the offsets, this boolean 
> is useless: if offsets are broken it will still fail.
> We should just remove the boolean.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8192) Remove offsetsAreCorrect from BaseTokenStreamTestCase

2018-03-04 Thread Robert Muir (JIRA)

 [ 
https://issues.apache.org/jira/browse/LUCENE-8192?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Muir updated LUCENE-8192:

Attachment: LUCENE-8192.patch

> Remove offsetsAreCorrect from BaseTokenStreamTestCase
> -
>
> Key: LUCENE-8192
> URL: https://issues.apache.org/jira/browse/LUCENE-8192
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Attachments: LUCENE-8192.patch, LUCENE-8192_prototype.patch, 
> LUCENE-8192_take_two.patch
>
>
> Similar to LUCENE-8191, now that indexwriter checks the offsets, this boolean 
> is useless: if offsets are broken it will still fail.
> We should just remove the boolean.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8192) Remove offsetsAreCorrect from BaseTokenStreamTestCase

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8192?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385253#comment-16385253
 ] 

Robert Muir commented on LUCENE-8192:
-

This boolean is also guarding some posInc checks that IndexWriter will do. 
I'll update the patch.

> Remove offsetsAreCorrect from BaseTokenStreamTestCase
> -
>
> Key: LUCENE-8192
> URL: https://issues.apache.org/jira/browse/LUCENE-8192
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Attachments: LUCENE-8192_prototype.patch, LUCENE-8192_take_two.patch
>
>
> Similar to LUCENE-8191, now that indexwriter checks the offsets, this boolean 
> is useless: if offsets are broken it will still fail.
> We should just remove the boolean.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-NightlyTests-7.x - Build # 165 - Still Unstable

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.x/165/

3 tests failed.
FAILED:  org.apache.solr.cloud.autoscaling.sim.TestLargeCluster.testAddNode

Error Message:
no MOVEREPLICA ops?

Stack Trace:
java.lang.AssertionError: no MOVEREPLICA ops?
at 
__randomizedtesting.SeedInfo.seed([EC5794D1006C94B6:4BB88972CF211BAE]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at 
org.apache.solr.cloud.autoscaling.sim.TestLargeCluster.testAddNode(TestLargeCluster.java:262)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.lang.Thread.run(Thread.java:748)


FAILED:  org.apache.solr.cloud.autoscaling.sim.TestLargeCluster.testNodeLost

Error Message:


Stack Trace:
java.util.concurrent.TimeoutException
at 
__randomizedtesting.SeedInfo.seed([EC5794D1006C94B6:53425A2F8386F130]:0)
at 
org.apache.solr.cloud.autoscaling.sim.SimSolrCloudTestCase.waitForState(SimSolrCloudTestCase.java:261)
at 

[jira] [Commented] (LUCENE-8191) merge TestRandomChains "brokenConstructors" list with "brokenOffsetsConstructors"

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8191?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385252#comment-16385252
 ] 

Robert Muir commented on LUCENE-8191:
-

As noted in LUCENE-8192, this boolean actually mixed two concerns (offsets and 
graph-offsets). 

So it's possible we could go back to two lists if we really want. I'm inclined 
not to do this, to keep the test simpler with just one blacklist. If the 
tokenfilter has bugs (be they offsets bugs OR graph offsets bugs), then it 
can't be tested. 


> merge TestRandomChains "brokenConstructors" list with 
> "brokenOffsetsConstructors"
> -
>
> Key: LUCENE-8191
> URL: https://issues.apache.org/jira/browse/LUCENE-8191
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Fix For: master (8.0), 7.3
>
> Attachments: LUCENE-8191.patch, LUCENE-8191.patch
>
>
> Now that indexwriter checks offsets (LUCENE-7626), there is no difference 
> between the two: A tokenstream that has brokenoffsets will fail regardless, 
> only in a harder-to-debug way (e.g. some low level exception from 
> indexwriter).
> So I think we should just merge the two lists to reflect that: if it produces 
> brokenOffsets, its broken. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



RE: lucene-solr:master: re-enable test, there is nothing wrong with this test, its not a goddamn bad apple. its just great at finding bugs

2018-03-04 Thread Uwe Schindler
Thanks Robert!

I did not notice this otherwise I would have complained already!

This test is there to actually find bugs; if it fails we have to take care ASAP 
and open an issue! We are happy if it fails. 

It is not buggy like the Solr tests that fail from time to time because of a 
broken test setup, e.g., with timeouts that depend on CPU speed or similar 
horrible stuff.

Uwe

-
Uwe Schindler
Achterdiek 19, D-28357 Bremen
http://www.thetaphi.de
eMail: u...@thetaphi.de

> -Original Message-
> From: rm...@apache.org [mailto:rm...@apache.org]
> Sent: Sunday, March 4, 2018 2:45 PM
> To: comm...@lucene.apache.org
> Subject: lucene-solr:master: re-enable test, there is nothing wrong with this
> test, its not a goddamn bad apple. its just great at finding bugs
> 
> Repository: lucene-solr
> Updated Branches:
>   refs/heads/master 9de4225e9 -> b83d20337
> 
> 
> re-enable test, there is nothing wrong with this test, its not a goddamn bad
> apple. its just great at finding bugs
> 
> 
> Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
> Commit: http://git-wip-us.apache.org/repos/asf/lucene-
> solr/commit/b83d2033
> Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/b83d2033
> Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/b83d2033
> 
> Branch: refs/heads/master
> Commit: b83d20337279fcb90dbfddb176635616bec97bb5
> Parents: 9de4225
> Author: Robert Muir 
> Authored: Sun Mar 4 08:44:10 2018 -0500
> Committer: Robert Muir 
> Committed: Sun Mar 4 08:44:10 2018 -0500
> 
> --
>  .../src/test/org/apache/lucene/analysis/core/TestRandomChains.java  | 1 -
>  1 file changed, 1 deletion(-)
> --
> 
> 
> http://git-wip-us.apache.org/repos/asf/lucene-
> solr/blob/b83d2033/lucene/analysis/common/src/test/org/apache/lucene/a
> nalysis/core/TestRandomChains.java
> --
> diff --git
> a/lucene/analysis/common/src/test/org/apache/lucene/analysis/core/TestR
> andomChains.java
> b/lucene/analysis/common/src/test/org/apache/lucene/analysis/core/TestR
> andomChains.java
> index 3ef50fb..406addf 100644
> ---
> a/lucene/analysis/common/src/test/org/apache/lucene/analysis/core/TestR
> andomChains.java
> +++
> b/lucene/analysis/common/src/test/org/apache/lucene/analysis/core/TestR
> andomChains.java
> @@ -843,7 +843,6 @@ public class TestRandomChains extends
> BaseTokenStreamTestCase {
>  String toString;
>}
> 
> -  @BadApple(bugUrl="https://issues.apache.org/jira/browse/SOLR-12028;)
>public void testRandomChains() throws Throwable {
>  int numIterations = TEST_NIGHTLY ? atLeast(20) : 3;
>  Random random = random();


-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8192) Remove offsetsAreCorrect from BaseTokenStreamTestCase

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8192?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385242#comment-16385242
 ] 

Robert Muir commented on LUCENE-8192:
-

Second, less aggressive patch: it *changes the boolean* from 
{{offsetsAreCorrect}} to {{graphOffsetsAreCorrect}} and always enables the 
checks consistent with what indexwriter will do (e.g. offsets don't go 
backwards).
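
For reference, a standalone sketch of the offset invariants in question (this is not 
the BaseTokenStreamTestCase code, just an illustration of what IndexWriter now rejects: 
negative offsets, an end before its start, and start offsets that go backwards). The 
class and method names are made up.

{code:java}
import java.io.IOException;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.OffsetAttribute;

public final class OffsetInvariantCheck {
  // Walks a TokenStream and fails on the same offset problems IndexWriter rejects.
  public static void check(TokenStream ts) throws IOException {
    OffsetAttribute offsetAtt = ts.addAttribute(OffsetAttribute.class);
    ts.reset();
    int lastStart = 0;
    while (ts.incrementToken()) {
      int start = offsetAtt.startOffset();
      int end = offsetAtt.endOffset();
      if (start < 0 || end < start || start < lastStart) {
        throw new IllegalStateException("broken offsets: start=" + start + " end=" + end);
      }
      lastStart = start;
    }
    ts.end();
    ts.close();
  }
}
{code}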

I ran tests a couple times and nothing failed... probably deserves some 
beasting but I think this is a good step? It removes some useless leniency.

> Remove offsetsAreCorrect from BaseTokenStreamTestCase
> -
>
> Key: LUCENE-8192
> URL: https://issues.apache.org/jira/browse/LUCENE-8192
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Attachments: LUCENE-8192_prototype.patch, LUCENE-8192_take_two.patch
>
>
> Similar to LUCENE-8191, now that indexwriter checks the offsets, this boolean 
> is useless: if offsets are broken it will still fail.
> We should just remove the boolean.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8192) Remove offsetsAreCorrect from BaseTokenStreamTestCase

2018-03-04 Thread Robert Muir (JIRA)

 [ 
https://issues.apache.org/jira/browse/LUCENE-8192?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Muir updated LUCENE-8192:

Attachment: LUCENE-8192_take_two.patch

> Remove offsetsAreCorrect from BaseTokenStreamTestCase
> -
>
> Key: LUCENE-8192
> URL: https://issues.apache.org/jira/browse/LUCENE-8192
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Attachments: LUCENE-8192_prototype.patch, LUCENE-8192_take_two.patch
>
>
> Similar to LUCENE-8191, now that indexwriter checks the offsets, this boolean 
> is useless: if offsets are broken it will still fail.
> We should just remove the boolean.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[GitHub] lucene-solr pull request #331: SOLR-11722 combine api

2018-03-04 Thread nsoft
GitHub user nsoft opened a pull request:

https://github.com/apache/lucene-solr/pull/331

SOLR-11722 combine api

combines CREATEROUTEDALIAS with CREATEALIAS including documentation and an 
additional unit test

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/nsoft/lucene-solr SOLR-11722-combine-api

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/lucene-solr/pull/331.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #331


commit a8cb9d65c792acef06a246a09db9d0af18708f91
Author: Gus Heck 
Date:   2018-03-03T14:04:37Z

first shot at combine api, seems to work want to test a little more

commit 180d96da9d5e6b5ec055d4c9b6f2c97f227539e5
Author: gus 
Date:   2018-03-04T16:28:55Z

Additional unit test

commit 8e7ae5c588c279bd595dfedc290c08c9fc378901
Author: gus 
Date:   2018-03-04T16:57:40Z

Merge branch 'master' of https://github.com/apache/lucene-solr into 
SOLR-11722-combine-api

# Conflicts:
#   solr/solr-ref-guide/src/collections-api.adoc




---

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8192) Remove offsetsAreCorrect from BaseTokenStreamTestCase

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8192?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385233#comment-16385233
 ] 

Robert Muir commented on LUCENE-8192:
-

I wrote a patch to do this, but it causes many tests to fail.

I think this boolean currently mixes up two concerns:

* "correct offsets" as far as what IndexWriter will check. This is the useless 
boolean, its mandatory that the tokenstream behave correctly here or its 
basically broke.
* "graph offsets checks". This seems to be a higher bar, and even tests for 
filters that claim to support graphs (SynonymGraphFilter) screw this up? 

Just at a glance, it seems like we want to separate these concerns. The first 
one should not be optional.

> Remove offsetsAreCorrect from BaseTokenStreamTestCase
> -
>
> Key: LUCENE-8192
> URL: https://issues.apache.org/jira/browse/LUCENE-8192
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Attachments: LUCENE-8192_prototype.patch
>
>
> Similar to LUCENE-8191, now that indexwriter checks the offsets, this boolean 
> is useless: if offsets are broken it will still fail.
> We should just remove the boolean.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8192) Remove offsetsAreCorrect from BaseTokenStreamTestCase

2018-03-04 Thread Robert Muir (JIRA)

 [ 
https://issues.apache.org/jira/browse/LUCENE-8192?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Muir updated LUCENE-8192:

Attachment: LUCENE-8192_prototype.patch

> Remove offsetsAreCorrect from BaseTokenStreamTestCase
> -
>
> Key: LUCENE-8192
> URL: https://issues.apache.org/jira/browse/LUCENE-8192
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Attachments: LUCENE-8192_prototype.patch
>
>
> Similar to LUCENE-8191, now that indexwriter checks the offsets, this boolean 
> is useless: if offsets are broken it will still fail.
> We should just remove the boolean.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8191) merge TestRandomChains "brokenConstructors" list with "brokenOffsetsConstructors"

2018-03-04 Thread Adrien Grand (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8191?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385232#comment-16385232
 ] 

Adrien Grand commented on LUCENE-8191:
--

+1

> merge TestRandomChains "brokenConstructors" list with 
> "brokenOffsetsConstructors"
> -
>
> Key: LUCENE-8191
> URL: https://issues.apache.org/jira/browse/LUCENE-8191
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Fix For: master (8.0), 7.3
>
> Attachments: LUCENE-8191.patch, LUCENE-8191.patch
>
>
> Now that indexwriter checks offsets (LUCENE-7626), there is no difference 
> between the two: A tokenstream that has brokenoffsets will fail regardless, 
> only in a harder-to-debug way (e.g. some low level exception from 
> indexwriter).
> So I think we should just merge the two lists to reflect that: if it produces 
> brokenOffsets, its broken. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Resolved] (LUCENE-8191) merge TestRandomChains "brokenConstructors" list with "brokenOffsetsConstructors"

2018-03-04 Thread Robert Muir (JIRA)

 [ 
https://issues.apache.org/jira/browse/LUCENE-8191?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Muir resolved LUCENE-8191.
-
   Resolution: Fixed
Fix Version/s: 7.3
   master (8.0)

> merge TestRandomChains "brokenConstructors" list with 
> "brokenOffsetsConstructors"
> -
>
> Key: LUCENE-8191
> URL: https://issues.apache.org/jira/browse/LUCENE-8191
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Fix For: master (8.0), 7.3
>
> Attachments: LUCENE-8191.patch, LUCENE-8191.patch
>
>
> Now that indexwriter checks offsets (LUCENE-7626), there is no difference 
> between the two: A tokenstream that has brokenoffsets will fail regardless, 
> only in a harder-to-debug way (e.g. some low level exception from 
> indexwriter).
> So I think we should just merge the two lists to reflect that: if it produces 
> brokenOffsets, its broken. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8191) merge TestRandomChains "brokenConstructors" list with "brokenOffsetsConstructors"

2018-03-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8191?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385225#comment-16385225
 ] 

ASF subversion and git services commented on LUCENE-8191:
-

Commit 96cd9c5d6279d23da8e86c241310d8aaf69bdf12 in lucene-solr's branch 
refs/heads/branch_7x from [~rcmuir]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=96cd9c5 ]

LUCENE-8191: if a tokenstream has broken offsets, its broken. IndexWriter 
always checks, so a separate whitelist can't work


> merge TestRandomChains "brokenConstructors" list with 
> "brokenOffsetsConstructors"
> -
>
> Key: LUCENE-8191
> URL: https://issues.apache.org/jira/browse/LUCENE-8191
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Attachments: LUCENE-8191.patch, LUCENE-8191.patch
>
>
> Now that indexwriter checks offsets (LUCENE-7626), there is no difference 
> between the two: A tokenstream that has brokenoffsets will fail regardless, 
> only in a harder-to-debug way (e.g. some low level exception from 
> indexwriter).
> So I think we should just merge the two lists to reflect that: if it produces 
> brokenOffsets, its broken. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8191) merge TestRandomChains "brokenConstructors" list with "brokenOffsetsConstructors"

2018-03-04 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8191?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385223#comment-16385223
 ] 

ASF subversion and git services commented on LUCENE-8191:
-

Commit 97299ed00699c248fc38465ee1b0eb0bb1561d3d in lucene-solr's branch 
refs/heads/master from [~rcmuir]
[ https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=97299ed ]

LUCENE-8191: if a tokenstream has broken offsets, its broken. IndexWriter 
always checks, so a separate whitelist can't work


> merge TestRandomChains "brokenConstructors" list with 
> "brokenOffsetsConstructors"
> -
>
> Key: LUCENE-8191
> URL: https://issues.apache.org/jira/browse/LUCENE-8191
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Attachments: LUCENE-8191.patch, LUCENE-8191.patch
>
>
> Now that indexwriter checks offsets (LUCENE-7626), there is no difference 
> between the two: A tokenstream that has brokenoffsets will fail regardless, 
> only in a harder-to-debug way (e.g. some low level exception from 
> indexwriter).
> So I think we should just merge the two lists to reflect that: if it produces 
> brokenOffsets, its broken. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Created] (LUCENE-8192) Remove offsetsAreCorrect from BaseTokenStreamTestCase

2018-03-04 Thread Robert Muir (JIRA)
Robert Muir created LUCENE-8192:
---

 Summary: Remove offsetsAreCorrect from BaseTokenStreamTestCase
 Key: LUCENE-8192
 URL: https://issues.apache.org/jira/browse/LUCENE-8192
 Project: Lucene - Core
  Issue Type: Bug
Reporter: Robert Muir


Similar to LUCENE-8191, now that indexwriter checks the offsets, this boolean 
is useless: if offsets are broken it will still fail.

We should just remove the boolean.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8191) merge TestRandomChains "brokenConstructors" list with "brokenOffsetsConstructors"

2018-03-04 Thread Robert Muir (JIRA)

 [ 
https://issues.apache.org/jira/browse/LUCENE-8191?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Muir updated LUCENE-8191:

Attachment: LUCENE-8191.patch

> merge TestRandomChains "brokenConstructors" list with 
> "brokenOffsetsConstructors"
> -
>
> Key: LUCENE-8191
> URL: https://issues.apache.org/jira/browse/LUCENE-8191
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Attachments: LUCENE-8191.patch, LUCENE-8191.patch
>
>
> Now that indexwriter checks offsets (LUCENE-7626), there is no difference 
> between the two: A tokenstream that has brokenoffsets will fail regardless, 
> only in a harder-to-debug way (e.g. some low level exception from 
> indexwriter).
> So I think we should just merge the two lists to reflect that: if it produces 
> brokenOffsets, its broken. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8092) TestRandomChains failure

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8092?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385204#comment-16385204
 ] 

Robert Muir commented on LUCENE-8092:
-

CJKBigramFilter was already in the offsetsAreBroken list for this test, so it 
previously did not fail in such cases. However, IndexWriter now always checks 
the offsets, so you'll get an exception from IndexWriter regardless. Hence, 
that list no longer works: if offsets are broken, the tokenfilter is broken. 
See LUCENE-8191.

This doesn't fix the fact that the filter is buggy; we know that. It just 
reflects the changes made in LUCENE-7626. 

> TestRandomChains failure
> 
>
> Key: LUCENE-8092
> URL: https://issues.apache.org/jira/browse/LUCENE-8092
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Alan Woodward
>Priority: Major
>
> https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.2/1/
> ant test  -Dtestcase=TestRandomChains -Dtests.method=testRandomChains 
> -Dtests.seed=C006DAD2E1FC77AF -Dtests.multiplier=2 -Dtests.nightly=true 
> -Dtests.slow=true 
> -Dtests.linedocsfile=/Users/romseygeek/projects/lucene-test-data/enwiki.random.lines.txt
>  -Dtests.locale=tr -Dtests.timezone=Europe/Simferopol -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
> Reproduces locally on 7.2



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-8191) merge TestRandomChains "brokenConstructors" list with "brokenOffsetsConstructors"

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8191?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385202#comment-16385202
 ] 

Robert Muir commented on LUCENE-8191:
-

Attached is a patch. I tested it with {{ant beast -Dtestcase=TestRandomChains 
-Dbeasts.iters=100}} and plan to commit it soon.

It makes the test easier to understand: there is only one broken list. If the 
test finds a bug in a tokenstream, add it to that list until the tokenstream 
is fixed.

It will also silence the recent failures about offsets coming from IndexWriter 
with the output "offsetsAreCorrect=false"; again, that list makes no sense 
anymore post-LUCENE-7626.
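
To illustrate the shape of the change, here is a minimal sketch of a single 
"broken" registry, assuming a map keyed by constructor with an argument 
predicate in the spirit of the test's existing lists; the names are 
hypothetical, not the actual TestRandomChains code:

{code:java}
import java.lang.reflect.Constructor;
import java.util.IdentityHashMap;
import java.util.Map;
import java.util.function.Predicate;

// Hypothetical sketch, not the attached patch: one registry of known-broken
// analysis constructors replaces the separate "brokenOffsets" list.
final class BrokenComponents {
  /** Marks a constructor as broken for every argument combination. */
  static final Predicate<Object[]> ALWAYS = args -> true;

  private final Map<Constructor<?>, Predicate<Object[]>> broken = new IdentityHashMap<>();

  void markBroken(Constructor<?> ctor, Predicate<Object[]> when) {
    broken.put(ctor, when);
  }

  /** The random chain builder would skip any constructor for which this returns true. */
  boolean isBroken(Constructor<?> ctor, Object[] args) {
    Predicate<Object[]> p = broken.get(ctor);
    return p != null && p.test(args);
  }
}
{code}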

> merge TestRandomChains "brokenConstructors" list with 
> "brokenOffsetsConstructors"
> -
>
> Key: LUCENE-8191
> URL: https://issues.apache.org/jira/browse/LUCENE-8191
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Attachments: LUCENE-8191.patch
>
>
> Now that indexwriter checks offsets (LUCENE-7626), there is no difference 
> between the two: A tokenstream that has brokenoffsets will fail regardless, 
> only in a harder-to-debug way (e.g. some low level exception from 
> indexwriter).
> So I think we should just merge the two lists to reflect that: if it produces 
> brokenOffsets, its broken. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Updated] (LUCENE-8191) merge TestRandomChains "brokenConstructors" list with "brokenOffsetsConstructors"

2018-03-04 Thread Robert Muir (JIRA)

 [ 
https://issues.apache.org/jira/browse/LUCENE-8191?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Muir updated LUCENE-8191:

Attachment: LUCENE-8191.patch

> merge TestRandomChains "brokenConstructors" list with 
> "brokenOffsetsConstructors"
> -
>
> Key: LUCENE-8191
> URL: https://issues.apache.org/jira/browse/LUCENE-8191
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Robert Muir
>Priority: Major
> Attachments: LUCENE-8191.patch
>
>
> Now that indexwriter checks offsets (LUCENE-7626), there is no difference 
> between the two: A tokenstream that has brokenoffsets will fail regardless, 
> only in a harder-to-debug way (e.g. some low level exception from 
> indexwriter).
> So I think we should just merge the two lists to reflect that: if it produces 
> brokenOffsets, its broken. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Created] (LUCENE-8191) merge TestRandomChains "brokenConstructors" list with "brokenOffsetsConstructors"

2018-03-04 Thread Robert Muir (JIRA)
Robert Muir created LUCENE-8191:
---

 Summary: merge TestRandomChains "brokenConstructors" list with 
"brokenOffsetsConstructors"
 Key: LUCENE-8191
 URL: https://issues.apache.org/jira/browse/LUCENE-8191
 Project: Lucene - Core
  Issue Type: Bug
Reporter: Robert Muir


Now that IndexWriter checks offsets (LUCENE-7626), there is no difference 
between the two: a tokenstream that has broken offsets will fail regardless, 
only in a harder-to-debug way (e.g. some low-level exception from IndexWriter).

So I think we should just merge the two lists to reflect that: if it produces 
broken offsets, it's broken. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



Re: [jira] [Comment Edited] (LUCENE-8159) Add a copy constructor in AutomatonQuery to copy directly the compiled automaton

2018-03-04 Thread Michael Sokolov
Perhaps Robert is a fan of Object.clone()

On Feb 28, 2018 9:59 AM, "Bruno Roustant (JIRA)"  wrote:

>
> [ https://issues.apache.org/jira/browse/LUCENE-8159?page=
> com.atlassian.jira.plugin.system.issuetabpanels:comment-
> tabpanel=16380407#comment-16380407 ]
>
> Bruno Roustant edited comment on LUCENE-8159 at 2/28/18 2:58 PM:
> -
>
> [~rcmuir] could you be a little more explicit?
>
> Without context I don't understand why a copy constructor is bad in Java
> in general.
>
> Do you mean you prefer a copy method?
>
> PrefixQuery copy(String field)
>
>
> was (Author: bruno.roustant):
> [~rcmuir] could you be a little more explicit?
>
> Without context I don't understand why a copy constructor is bad in Java
> in general.
>
> > Add a copy constructor in AutomatonQuery to copy directly the compiled
> automaton
> > 
> 
> >
> > Key: LUCENE-8159
> > URL: https://issues.apache.org/jira/browse/LUCENE-8159
> > Project: Lucene - Core
> >  Issue Type: Improvement
> >  Components: core/search
> >Affects Versions: trunk
> >Reporter: Bruno Roustant
> >Assignee: David Smiley
> >Priority: Major
> > Attachments: 0001-Add-a-copy-constructor-
> in-AutomatonQuery-to-copy-dir.patch, LUCENE-8159.patch
> >
> >
> > When the query is composed of multiple AutomatonQuery with the same
> automaton and which target different fields, it is much more efficient to
> reuse the already compiled automaton by copying it directly and just
> changing the target field.
>
>
>
> --
> This message was sent by Atlassian JIRA
> (v7.6.3#76005)
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org
>
>
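
The question in the quoted comment is copy constructor vs. copy method; either 
way, the goal is to reuse the already-compiled automaton rather than rebuild it 
per field. A generic sketch of that pattern, with illustrative names only (not 
Lucene's actual AutomatonQuery API):

{code:java}
// Copy-constructor sketch: share the expensive, immutable compiled artifact
// instead of recompiling it for every target field. Names are hypothetical.
final class CompiledPattern {
  // stands in for a compiled automaton: expensive to build, immutable, reusable
}

final class PatternFieldQuery {
  private final String field;
  private final CompiledPattern compiled;

  PatternFieldQuery(String field, String pattern) {
    this.field = field;
    this.compiled = compile(pattern);   // expensive step, done once
  }

  /** Copy constructor: reuse the compiled artifact, change only the field. */
  PatternFieldQuery(PatternFieldQuery other, String newField) {
    this.field = newField;
    this.compiled = other.compiled;     // no recompilation
  }

  String field() {
    return field;
  }

  private static CompiledPattern compile(String pattern) {
    return new CompiledPattern();
  }
}
{code}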


[jira] [Commented] (LUCENE-8092) TestRandomChains failure

2018-03-04 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-8092?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16385188#comment-16385188
 ] 

Robert Muir commented on LUCENE-8092:
-

CJKBigramFilter isn't really prepared to handle an arbitrary input graph (or 
maybe even synonyms): it's looking for a flat stream of tokens that may include 
some CJK.

It already has a ridiculously complex job. It's like a shinglefilter but with 
crazy custom logic, yet it does manage that to support the use-case across 
different tokenizer variants (StandardTokenizer, UAXURLTokenizer, 
ICUTokenizer).

Maybe it should throw a clear exception if it encounters posinc=0 or poslen > 1? 
That would at least make it totally clear that it won't work, rather than the 
user getting a vaguer exception from IndexWriter. Ideally this would be detected 
earlier, at construction of the chain. Unfortunately it's not so easy to simply 
require that its input is a tokenizer; see the CJKAnalyzer use-case where the 
width filter comes before, because that impacts bigramming.
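
A minimal sketch of such a guard, assuming it lives in a TokenFilter wrapper; 
the class name is hypothetical and this is not a committed fix:

{code:java}
import java.io.IOException;
import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.PositionIncrementAttribute;
import org.apache.lucene.analysis.tokenattributes.PositionLengthAttribute;

// Illustrative only: fail fast with a clear message on graph input
// (posinc=0 or poslen>1) instead of letting IndexWriter fail later.
final class RejectGraphInputFilter extends TokenFilter {
  private final PositionIncrementAttribute posIncAtt = addAttribute(PositionIncrementAttribute.class);
  private final PositionLengthAttribute posLenAtt = addAttribute(PositionLengthAttribute.class);

  RejectGraphInputFilter(TokenStream input) {
    super(input);
  }

  @Override
  public boolean incrementToken() throws IOException {
    if (!input.incrementToken()) {
      return false;
    }
    if (posIncAtt.getPositionIncrement() == 0 || posLenAtt.getPositionLength() > 1) {
      throw new IllegalArgumentException(
          "this filter requires a flat token stream: posinc=0 / poslen>1 is not supported");
    }
    return true;
  }
}
{code}

Whether the check belongs in CJKBigramFilter itself or earlier, at chain 
construction, is exactly the open question above.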

> TestRandomChains failure
> 
>
> Key: LUCENE-8092
> URL: https://issues.apache.org/jira/browse/LUCENE-8092
> Project: Lucene - Core
>  Issue Type: Bug
>Reporter: Alan Woodward
>Priority: Major
>
> https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.2/1/
> ant test  -Dtestcase=TestRandomChains -Dtests.method=testRandomChains 
> -Dtests.seed=C006DAD2E1FC77AF -Dtests.multiplier=2 -Dtests.nightly=true 
> -Dtests.slow=true 
> -Dtests.linedocsfile=/Users/romseygeek/projects/lucene-test-data/enwiki.random.lines.txt
>  -Dtests.locale=tr -Dtests.timezone=Europe/Simferopol -Dtests.asserts=true 
> -Dtests.file.encoding=UTF-8
> Reproduces locally on 7.2



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS-EA] Lucene-Solr-7.x-Linux (64bit/jdk-10-ea+43) - Build # 1464 - Still Unstable!

2018-03-04 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-7.x-Linux/1464/
Java: 64bit/jdk-10-ea+43 -XX:-UseCompressedOops -XX:+UseSerialGC

8 tests failed.
FAILED:  org.apache.solr.cloud.MoveReplicaHDFSTest.testFailedMove

Error Message:
No live SolrServers available to handle this 
request:[http://127.0.0.1:34409/solr/MoveReplicaHDFSTest_failed_coll_true, 
http://127.0.0.1:33177/solr/MoveReplicaHDFSTest_failed_coll_true]

Stack Trace:
org.apache.solr.client.solrj.SolrServerException: No live SolrServers available 
to handle this 
request:[http://127.0.0.1:34409/solr/MoveReplicaHDFSTest_failed_coll_true, 
http://127.0.0.1:33177/solr/MoveReplicaHDFSTest_failed_coll_true]
at 
__randomizedtesting.SeedInfo.seed([9DC7432E8661A3B4:370A90DC31B27664]:0)
at 
org.apache.solr.client.solrj.impl.LBHttpSolrClient.request(LBHttpSolrClient.java:462)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1104)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:884)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:991)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:817)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:194)
at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:942)
at 
org.apache.solr.cloud.MoveReplicaTest.testFailedMove(MoveReplicaTest.java:307)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
   

[JENKINS] Lucene-Solr-6.6-Linux (64bit/jdk1.8.0_162) - Build # 189 - Unstable!

2018-03-04 Thread Policeman Jenkins Server
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-6.6-Linux/189/
Java: 64bit/jdk1.8.0_162 -XX:+UseCompressedOops -XX:+UseSerialGC

2 tests failed.
FAILED:  
org.apache.solr.cloud.CollectionsAPIDistributedZkTest.testCollectionsAPI

Error Message:
Error from server at 
http://127.0.0.1:34931/solr/awhollynewcollection_0_shard4_replica2: 
ClusterState says we are the leader 
(http://127.0.0.1:34931/solr/awhollynewcollection_0_shard4_replica2), but 
locally we don't think so. Request came from null

Stack Trace:
org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from 
server at http://127.0.0.1:34931/solr/awhollynewcollection_0_shard4_replica2: 
ClusterState says we are the leader 
(http://127.0.0.1:34931/solr/awhollynewcollection_0_shard4_replica2), but 
locally we don't think so. Request came from null
at 
__randomizedtesting.SeedInfo.seed([736D190BDC9112F2:3B186DBFDAA23D67]:0)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:819)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1263)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:1134)
at 
org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:1073)
at 
org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:160)
at 
org.apache.solr.client.solrj.request.UpdateRequest.commit(UpdateRequest.java:233)
at 
org.apache.solr.cloud.CollectionsAPIDistributedZkTest.testCollectionsAPI(CollectionsAPIDistributedZkTest.java:521)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1713)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:907)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:957)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:916)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:802)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:852)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 

[JENKINS] Lucene-Solr-SmokeRelease-7.x - Build # 164 - Still Failing

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-SmokeRelease-7.x/164/

No tests ran.

Build Log:
[...truncated 28782 lines...]
prepare-release-no-sign:
[mkdir] Created dir: 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/dist
 [copy] Copying 491 files to 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/dist/lucene
 [copy] Copying 215 files to 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/dist/solr
   [smoker] Java 1.8 JAVA_HOME=/home/jenkins/tools/java/latest1.8
   [smoker] Java 9 JAVA_HOME=/home/jenkins/tools/java/latest1.9
   [smoker] NOTE: output encoding is UTF-8
   [smoker] 
   [smoker] Load release URL 
"file:/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/dist/"...
   [smoker] 
   [smoker] Test Lucene...
   [smoker]   test basics...
   [smoker]   get KEYS
   [smoker] 0.2 MB in 0.01 sec (18.5 MB/sec)
   [smoker]   check changes HTML...
   [smoker]   download lucene-7.3.0-src.tgz...
   [smoker] 31.7 MB in 0.04 sec (732.6 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   download lucene-7.3.0.tgz...
   [smoker] 73.4 MB in 0.10 sec (735.6 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   download lucene-7.3.0.zip...
   [smoker] 83.9 MB in 0.12 sec (719.6 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   unpack lucene-7.3.0.tgz...
   [smoker] verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker] test demo with 1.8...
   [smoker]   got 6298 hits for query "lucene"
   [smoker] checkindex with 1.8...
   [smoker] test demo with 9...
   [smoker]   got 6298 hits for query "lucene"
   [smoker] checkindex with 9...
   [smoker] check Lucene's javadoc JAR
   [smoker]   unpack lucene-7.3.0.zip...
   [smoker] verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker] test demo with 1.8...
   [smoker]   got 6298 hits for query "lucene"
   [smoker] checkindex with 1.8...
   [smoker] test demo with 9...
   [smoker]   got 6298 hits for query "lucene"
   [smoker] checkindex with 9...
   [smoker] check Lucene's javadoc JAR
   [smoker]   unpack lucene-7.3.0-src.tgz...
   [smoker] make sure no JARs/WARs in src dist...
   [smoker] run "ant validate"
   [smoker] run tests w/ Java 8 and testArgs='-Dtests.badapples=false 
-Dtests.slow=false'...
   [smoker] test demo with 1.8...
   [smoker]   got 217 hits for query "lucene"
   [smoker] checkindex with 1.8...
   [smoker] generate javadocs w/ Java 8...
   [smoker] 
   [smoker] Crawl/parse...
   [smoker] 
   [smoker] Verify...
   [smoker] run tests w/ Java 9 and testArgs='-Dtests.badapples=false 
-Dtests.slow=false'...
   [smoker] test demo with 9...
   [smoker]   got 217 hits for query "lucene"
   [smoker] checkindex with 9...
   [smoker]   confirm all releases have coverage in TestBackwardsCompatibility
   [smoker] find all past Lucene releases...
   [smoker] run TestBackwardsCompatibility..
   [smoker] success!
   [smoker] 
   [smoker] Test Solr...
   [smoker]   test basics...
   [smoker]   get KEYS
   [smoker] 0.2 MB in 0.00 sec (239.2 MB/sec)
   [smoker]   check changes HTML...
   [smoker]   download solr-7.3.0-src.tgz...
   [smoker] 54.1 MB in 0.07 sec (791.3 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   download solr-7.3.0.tgz...
   [smoker] 151.0 MB in 0.19 sec (779.2 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   download solr-7.3.0.zip...
   [smoker] 152.0 MB in 0.19 sec (787.1 MB/sec)
   [smoker] verify md5/sha1 digests
   [smoker]   unpack solr-7.3.0.tgz...
   [smoker] verify JAR metadata/identity/no javax.* or java.* classes...
   [smoker] unpack lucene-7.3.0.tgz...
   [smoker]   **WARNING**: skipping check of 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/tmp/unpack/solr-7.3.0/contrib/dataimporthandler-extras/lib/javax.mail-1.5.1.jar:
 it has javax.* classes
   [smoker]   **WARNING**: skipping check of 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/tmp/unpack/solr-7.3.0/contrib/dataimporthandler-extras/lib/activation-1.1.1.jar:
 it has javax.* classes
   [smoker] copying unpacked distribution for Java 8 ...
   [smoker] test solr example w/ Java 8...
   [smoker]   start Solr instance 
(log=/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/tmp/unpack/solr-7.3.0-java8/solr-example.log)...
   [smoker] No process found for Solr node running on port 8983
   [smoker]   Running techproducts example on port 8983 from 
/x1/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-7.x/lucene/build/smokeTestRelease/tmp/unpack/solr-7.3.0-java8
   [smoker] 

[JENKINS] Lucene-Solr-NightlyTests-6.6 - Build # 34 - Failure

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-6.6/34/

5 tests failed.
FAILED:  
org.apache.solr.cloud.hdfs.HdfsCollectionsAPIDistributedZkTest.testCollectionsAPI

Error Message:
Something is broken in the assert for no shards using the same indexDir - 
probably something was changed in the attributes published in the MBean of 
SolrCore : {}

Stack Trace:
java.lang.AssertionError: Something is broken in the assert for no shards using 
the same indexDir - probably something was changed in the attributes published 
in the MBean of SolrCore : {}
at 
__randomizedtesting.SeedInfo.seed([78CE862CA70D20DD:30BBF298A13E0F48]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at 
org.apache.solr.cloud.CollectionsAPIDistributedZkTest.checkNoTwoShardsUseTheSameIndexDir(CollectionsAPIDistributedZkTest.java:646)
at 
org.apache.solr.cloud.CollectionsAPIDistributedZkTest.testCollectionsAPI(CollectionsAPIDistributedZkTest.java:524)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1713)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:907)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:943)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:957)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:916)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:802)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:852)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 

[JENKINS] Lucene-Solr-repro - Build # 187 - Still Unstable

2018-03-04 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-repro/187/

[...truncated 28 lines...]
[repro] Jenkins log URL: 
https://builds.apache.org/job/Lucene-Solr-NightlyTests-7.x/164/consoleText

[repro] Revision: 00a0e146be299be03e51356b511abf45a2ab326b

[repro] Ant options: -Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
[repro] Repro line:  ant test  -Dtestcase=ChaosMonkeySafeLeaderTest 
-Dtests.method=test -Dtests.seed=AA6039F26C94F633 -Dtests.multiplier=2 
-Dtests.nightly=true -Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=ar-TN -Dtests.timezone=ART -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  
-Dtestcase=ChaosMonkeySafeLeaderWithPullReplicasTest -Dtests.method=test 
-Dtests.seed=AA6039F26C94F633 -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=no -Dtests.timezone=US/Alaska -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[repro] git rev-parse --abbrev-ref HEAD
[repro] git rev-parse HEAD
[repro] Initial local git branch/revision: 
9de4225e9a54ba987c2c7c9d4510bea3e4f9de97
[repro] git fetch
[repro] git checkout 00a0e146be299be03e51356b511abf45a2ab326b

[...truncated 2 lines...]
[repro] git merge --ff-only

[...truncated 1 lines...]
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   ChaosMonkeySafeLeaderTest
[repro]   ChaosMonkeySafeLeaderWithPullReplicasTest
[repro] ant compile-test

[...truncated 3310 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=10 
-Dtests.class="*.ChaosMonkeySafeLeaderTest|*.ChaosMonkeySafeLeaderWithPullReplicasTest"
 -Dtests.showOutput=onerror -Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.seed=AA6039F26C94F633 -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=ar-TN -Dtests.timezone=ART -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[...truncated 237171 lines...]
[repro] Setting last failure code to 256

[repro] Failures:
[repro]   4/5 failed: org.apache.solr.cloud.ChaosMonkeySafeLeaderTest
[repro]   5/5 failed: 
org.apache.solr.cloud.ChaosMonkeySafeLeaderWithPullReplicasTest

[repro] Re-testing 100% failures at the tip of branch_7x
[repro] ant clean

[...truncated 8 lines...]
[repro] Test suites by module:
[repro]solr/core
[repro]   ChaosMonkeySafeLeaderWithPullReplicasTest
[repro] ant compile-test

[...truncated 3310 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=5 
-Dtests.class="*.ChaosMonkeySafeLeaderWithPullReplicasTest" 
-Dtests.showOutput=onerror -Dtests.multiplier=2 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.seed=AA6039F26C94F633 -Dtests.multiplier=2 -Dtests.nightly=true 
-Dtests.slow=true 
-Dtests.linedocsfile=/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-7.x/test-data/enwiki.random.lines.txt
 -Dtests.locale=no -Dtests.timezone=US/Alaska -Dtests.asserts=true 
-Dtests.file.encoding=UTF-8

[...truncated 54987 lines...]
[repro] Setting last failure code to 256

[repro] Failures at the tip of branch_7x:
[repro]   2/5 failed: 
org.apache.solr.cloud.ChaosMonkeySafeLeaderWithPullReplicasTest
[repro] git checkout 9de4225e9a54ba987c2c7c9d4510bea3e4f9de97

[...truncated 2 lines...]
[repro] Exiting with code 256

[...truncated 5 lines...]

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org
