[JENKINS] Lucene-Solr-trunk-Windows (64bit/jdk1.8.0_20) - Build # 4284 - Failure!

2014-08-31 Thread Policeman Jenkins Server
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-Windows/4284/
Java: 64bit/jdk1.8.0_20 -XX:-UseCompressedOops -XX:+UseParallelGC

All tests passed

Build Log:
[...truncated 56111 lines...]
changes-to-html:
[mkdir] Created dir: 
C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\lucene\build\docs\changes
  [get] Getting: https://issues.apache.org/jira/rest/api/2/project/LUCENE
  [get] To: 
C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\lucene\build\docs\changes\jiraVersionList.json
 [exec] Bareword found where operator expected at (eval 1) line 3, near 
"System"
 [exec] (Missing operator before System?)
 [exec] Having no space between pattern and following word is deprecated at 
(eval 1) line 8.
 [exec] ERROR eval'ing munged JSON string ||
 [exec]   
 [exec] System Maintenance
 [exec]   
 [exec]   
 [exec] Maintenance in progress
 [exec] This system is currently down for maintenance. More details 
may be
 [exec] available from the 
 [exec] ASF Public Network Status page.
 [exec]   
 [exec] 
 [exec] ||: Can't find string terminator "'" anywhere before EOF at (eval 
1) line 8.
 [exec] 

BUILD FAILED
C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\build.xml:492: The 
following error occurred while executing this line:
C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\build.xml:78: The 
following error occurred while executing this line:
C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\lucene\build.xml:212: 
The following error occurred while executing this line:
C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\lucene\build.xml:548: 
The following error occurred while executing this line:
C:\Users\JenkinsSlave\workspace\Lucene-Solr-trunk-Windows\lucene\common-build.xml:2443:
 exec returned: 255

Total time: 160 minutes 32 seconds
Build step 'Invoke Ant' marked build as failure
[description-setter] Description set: Java: 64bit/jdk1.8.0_20 
-XX:-UseCompressedOops -XX:+UseParallelGC
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org

[jira] [Commented] (SOLR-5894) Speed up high-cardinality facets with sparse counters

2014-08-31 Thread Toke Eskildsen (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-5894?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14116786#comment-14116786
 ] 

Toke Eskildsen commented on SOLR-5894:
--

Status update

The code seems to be in working state for Solr 4.8 and 4.9 for Field Cache 
faceting (single- and multi-value) and DocValues faceting (single- and 
multi-value) for Strings. It really needs testing by someone other than me, so 
that the validity of the responses and the speed-up can be verified. I am 
willing to port this to trunk if a committer is willing to work on getting the 
patch into Solr, but until then I will stick to stable versions, as that is what 
we use locally.

Sparse faceting introduces some architectural changes that need to be 
addressed.

1) The core sparse counting itself is surprisingly non-invasive. It could be a 
standalone patch. However, this only works really well in a non-distributed 
setting and has less effect in a distributed one.
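The sparse counting principle in point 1 can be sketched roughly as follows. This is a minimal illustration, not code from the actual SOLR-5894 patch; all names are hypothetical. The idea: keep the usual dense counts array, but also record which ordinals were actually touched, so the extraction phase can skip the zero counters.

```java
// Hypothetical sketch of sparse counting: a dense counter array plus a list
// of the ordinals that were actually hit. If too many distinct ordinals are
// touched, the tracker overflows and extraction must fall back to a dense scan.
public class SparseCounter {
    final int[] counts;        // one counter per unique term ordinal
    final int[] touched;       // ordinals that have a non-zero count
    int touchedCount = 0;
    private boolean exceeded = false;

    public SparseCounter(int numOrdinals, int maxTracked) {
        counts = new int[numOrdinals];
        touched = new int[maxTracked];
    }

    public void increment(int ordinal) {
        if (counts[ordinal]++ == 0) {              // first hit for this ordinal
            if (touchedCount < touched.length) {
                touched[touchedCount++] = ordinal;
            } else {
                exceeded = true;                   // tracker no longer exhaustive
            }
        }
    }

    /** True while the touched list still covers every non-zero counter. */
    public boolean isSparse() {
        return !exceeded;
    }
}
```

Extraction then iterates touched[0..touchedCount) instead of all counts.length counters, which is where the win for high-cardinality fields with relatively few distinct hits would come from.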

2) Avoiding re-allocation of counter structures by maintaining a pool of 
structures lowers both the minimum faceting time and the need for GC. The GC 
need is lowered quite a lot, I might add, as faceting is normally one of the big 
GC triggers, so I would strongly recommend this feature.
Such a pool is very much like a cache and as such must play nice with index 
updates and other caching structures in Solr. The current state of the counter 
pool is that it works well, but it is implemented independently of the usual 
caches. It would probably be better to latch on to the standard caching mechanisms in 
Solr, but I have no experience with that part of the code base. The code for 
the pool itself is fairly simple and could ship as an experimental 
feature by setting the default pool size to 0.
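The pooling idea in point 2 might look roughly like this. It is an illustrative stdlib-only sketch, not the patch's actual implementation; the class and method names are made up. A pool size of 0 disables the feature, matching the experimental default described above.

```java
import java.util.ArrayDeque;
import java.util.Arrays;

// Hypothetical counter pool: reuse cleared counter arrays across facet
// requests instead of allocating (and later garbage-collecting) a fresh
// array of numOrdinals ints for every request.
public class CounterPool {
    private final ArrayDeque<int[]> free = new ArrayDeque<>();
    private final int maxPooled;     // 0 disables pooling entirely
    private final int numOrdinals;

    public CounterPool(int maxPooled, int numOrdinals) {
        this.maxPooled = maxPooled;
        this.numOrdinals = numOrdinals;
    }

    public int[] acquire() {
        int[] counters = free.poll();
        return counters != null ? counters : new int[numOrdinals];
    }

    public void release(int[] counters) {
        if (free.size() < maxPooled) {
            Arrays.fill(counters, 0);   // clear before the next request reuses it
            free.add(counters);
        }                               // else just drop it for the GC
    }
}
```

In a genuinely sparse setup the clear step could use the touched-ordinals list rather than a full Arrays.fill, making release cost proportional to the number of non-zero counters.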

3) Fast distributed faceting is about speeding up the secondary phase of 
distributed faceting: the fine count for specific terms. This is achieved by 
replacing the normal search-for-each-requested-term approach with a full faceted 
count, where the counts for all terms are calculated but only the 
counts for the specified terms are returned. This requires either a separation 
of counting and extraction in stock Solr code, or an extension to the 
count-extract code block so that it can handle both the case where the top-X terms 
are wanted and the case where specific terms are requested. Currently the sparse code 
uses the latter approach, but the former is a much cleaner architecture. Both 
solutions require some refactoring and are a bit tricky to get right.
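The count-everything-then-pick refinement described in point 3 can be sketched as follows: run the same full count as phase one, then extract only the terms the coordinating node asked to refine. The helper below is hypothetical, not stock Solr code.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class FineCount {
    /**
     * Given full per-ordinal counts (the same work as the first faceting
     * phase) and the term->ordinal dictionary, return counts for just the
     * terms requested in the refinement (second) phase, instead of running
     * one search per requested term.
     */
    static Map<String, Long> fineCount(long[] countsForAllOrdinals,
                                       Map<String, Integer> termToOrdinal,
                                       List<String> requestedTerms) {
        Map<String, Long> result = new LinkedHashMap<>();
        for (String term : requestedTerms) {
            Integer ord = termToOrdinal.get(term);
            result.put(term, ord == null ? 0L : countsForAllOrdinals[ord]);
        }
        return result;
    }
}
```

Since the counting part of this second phase is identical to the first phase, it is also the piece that a counter cache (point 4 below) could skip entirely on a hit.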

4) A further speed up for distributed faceting can be achieved by upgrading the 
sparse pools to real caches, so that they contain a mix of empty counters (to 
avoid GC) and filled counters (caching of counts). As the counting part of the 
second phase is always exactly the same as for the first phase, cache hit-rate 
should be very high with distribution. This is not currently implemented and is 
a bit tricky to get to work well.
I recommend that this improvement is postponed, in order to keep the current 
patch manageable.

The sparse principle would also work for Field Cache per Segment faceting of 
Strings, but no code has been written for this yet. I recommend that this is 
also postponed, to get things moving.

> Speed up high-cardinality facets with sparse counters
> -
>
> Key: SOLR-5894
> URL: https://issues.apache.org/jira/browse/SOLR-5894
> Project: Solr
>  Issue Type: Improvement
>  Components: SearchComponents - other
>Affects Versions: 4.7.1
>Reporter: Toke Eskildsen
>Priority: Minor
>  Labels: faceted-search, faceting, memory, performance
> Attachments: SOLR-5894.patch, SOLR-5894.patch, SOLR-5894.patch, 
> SOLR-5894.patch, SOLR-5894.patch, SOLR-5894.patch, SOLR-5894.patch, 
> SOLR-5894.patch, SOLR-5894.patch, SOLR-5894_test.zip, SOLR-5894_test.zip, 
> SOLR-5894_test.zip, SOLR-5894_test.zip, SOLR-5894_test.zip, 
> author_7M_tags_1852_logged_queries_warmed.png, 
> sparse_200docs_fc_cutoff_20140403-145412.png, 
> sparse_500docs_20140331-151918_multi.png, 
> sparse_500docs_20140331-151918_single.png, 
> sparse_5051docs_20140328-152807.png
>
>
> Field based faceting in Solr has two phases: Collecting counts for tags in 
> facets and extracting the requested tags.
> The execution time for the collecting phase is approximately linear to the 
> number of hits and the number of references from hits to tags. This phase is 
> not the focus here.
> The extraction time scales with the number of unique tags in the search 
> result, but is also heavily influenced by the total number of unique tags in 
> the facet as every counter, 0 or not, is visited by the extractor (at least 
> for count order). For fields with millions of unique tag values this means 
> 10s of millisecond

Re: Testing a custom distributed component

2014-08-31 Thread Mark Miller
bq.  so helper methods aren't there

Yeah, it would be nice to pull those out so they can easily be used in both
cases.

-- 
- Mark

http://about.me/markrmiller

On Sat, Aug 30, 2014 at 8:11 PM, Steve Davids  wrote:

> If you don't want to use the BaseDistributedSearchTestCase, you
> can utilize the newly introduced MiniSolrCloudCluster (
> http://lucene.apache.org/solr/4_9_0/solr-test-framework/org/apache/solr/cloud/MiniSolrCloudCluster.html);
> it works rather well. This class doesn't extend the base Solr test case, so
> helper methods aren't there; instead you can use that class to spin up a
> CloudSolrServer to index/query to your liking within the test.
>
> Sent from my iPhone
>
> On Aug 30, 2014, at 5:17 PM, Yonatan Nakar  wrote:
>
> I'm trying to write unit tests for a search component of my own. My
> component is intended to run in a distributed setting only. The problem is
> that it seems like Solr's testing framework doesn't make it easy to write
> unit tests for distributed test components. What is the right way to test
> such a component?
>
> More details about my problem here:
> http://stackoverflow.com/questions/25586021/testing-a-solr-distributed-component
>
>


[jira] [Commented] (SOLR-4919) Allow setting ResponseParser and RequestWriter on LBHttpSolrServer

2014-08-31 Thread David Smiley (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-4919?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14116807#comment-14116807
 ] 

David Smiley commented on SOLR-4919:


The parser & writer methods *are* there now.  Apparently it got in as part of 
SOLR-3249 which landed in Solr 4.5.  Time to close this issue?

> Allow setting ResponseParser and RequestWriter on LBHttpSolrServer
> --
>
> Key: SOLR-4919
> URL: https://issues.apache.org/jira/browse/SOLR-4919
> Project: Solr
>  Issue Type: Improvement
>  Components: clients - java
>Affects Versions: 4.3
>Reporter: Shawn Heisey
>Assignee: Shawn Heisey
>Priority: Minor
> Fix For: 4.9, 5.0
>
> Attachments: SOLR-4919.patch, SOLR-4919.patch, SOLR-4919.patch, 
> SOLR-4919.patch, SOLR-4919.patch, SOLR-4919.patch, SOLR-4919.patch, 
> SOLR-4919.patch, SOLR-4919.patch, SOLR-4919.patch, 
> SolrExampleJettyTest-testfail.txt, 
> SolrExampleStreamingTest-failure-linux.txt, 
> TestReplicationHandler-testfail.txt
>
>
> Patch to allow setting parser/writer on LBHttpSolrServer.  Will only work if 
> no server objects exist within.  Part of larger issue SOLR-4715.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)




[jira] [Created] (SOLR-6456) SolrServer: add setRequestWriter and setParser

2014-08-31 Thread David Smiley (JIRA)
David Smiley created SOLR-6456:
--

 Summary: SolrServer: add setRequestWriter and setParser
 Key: SOLR-6456
 URL: https://issues.apache.org/jira/browse/SOLR-6456
 Project: Solr
  Issue Type: Improvement
  Components: clients - java
Reporter: David Smiley


Nearly every subclass of SolrServer supports setRequestWriter & setParser.  
It's crazy that you have to cast your SolrServer to its particular subclass 
implementation.  I want to have code like this without the cast:
{code:java}
if (useXml) {
  solrServer.setRequestWriter(new RequestWriter());
  solrServer.setParser(new XMLResponseParser());
} else {//javabin
  solrServer.setRequestWriter(new BinaryRequestWriter());
  solrServer.setParser(new BinaryResponseParser());
}
{code}
EmbeddedSolrServer could simply log a warning... treating matters like this as 
a hint.






[jira] [Created] (SOLR-6457) load balancing will not work when count > Integer.MAX_VALUE

2014-08-31 Thread longkeyy (JIRA)
longkeyy created SOLR-6457:
--

 Summary: load balancing will not work when count > Integer.MAX_VALUE
 Key: SOLR-6457
 URL: https://issues.apache.org/jira/browse/SOLR-6457
 Project: Solr
  Issue Type: Bug
  Components: clients - java
Affects Versions: 4.9, 4.8.1, 4.8, 4.7.2, 4.7.1, 4.7, 4.6.1, 4.6, 4.5.1, 
4.5, 4.4, 4.3.1, 4.3, 4.2.1, 4.2, 4.1, 4.0
Reporter: longkeyy


org.apache.solr.client.solrj.impl.LBHttpSolrServer
line 442
  int count = counter.incrementAndGet();  
  ServerWrapper wrapper = serverList[count % serverList.length];

When the counter overflows past Integer.MAX_VALUE, count becomes negative, and 
count % serverList.length will loop through 0, -1, 0, -1, ...

Suggested fix, e.g.:
// keep count non-negative
int count = counter.incrementAndGet() & 0x7FFFFFFF;
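A minimal demonstration of the reported overflow, assuming the suggested mask is the full sign-bit mask 0x7FFFFFFF: AtomicInteger.incrementAndGet() wraps past Integer.MAX_VALUE to a negative value, and in Java a negative dividend makes % yield a non-positive result, so serverList[count % length] would index out of bounds.

```java
public class RoundRobinOverflow {
    /** Round-robin index that stays valid after the counter wraps:
     *  clearing the sign bit keeps the dividend non-negative. */
    public static int pick(int count, int serverCount) {
        return (count & 0x7FFFFFFF) % serverCount;
    }

    public static void main(String[] args) {
        int wrapped = Integer.MIN_VALUE + 1;   // counter value after overflow
        System.out.println(wrapped % 3);       // -1: the broken, negative index
        System.out.println(pick(wrapped, 3));  // 1: always in [0, serverCount)
    }
}
```

An alternative with the same effect is Math.floorMod(count, serverCount), which is defined to return a non-negative result for a positive modulus.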






[JENKINS] Lucene-Solr-trunk-MacOSX (64bit/jdk1.8.0) - Build # 1803 - Failure!

2014-08-31 Thread Policeman Jenkins Server
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-MacOSX/1803/
Java: 64bit/jdk1.8.0 -XX:+UseCompressedOops -XX:+UseSerialGC

1 tests failed.
REGRESSION:  org.apache.solr.cloud.OverseerStatusTest.testDistribSearch

Error Message:
reloadcollection the collection time out:180s

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrServer$RemoteSolrException: 
reloadcollection the collection time out:180s
at 
__randomizedtesting.SeedInfo.seed([5A92393B9AB95AF1:DB74B723EDE63ACD]:0)
at 
org.apache.solr.client.solrj.impl.HttpSolrServer.executeMethod(HttpSolrServer.java:550)
at 
org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:210)
at 
org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:206)
at 
org.apache.solr.cloud.AbstractFullDistribZkTestBase.invokeCollectionApi(AbstractFullDistribZkTestBase.java:1744)
at 
org.apache.solr.cloud.OverseerStatusTest.doTest(OverseerStatusTest.java:103)
at 
org.apache.solr.BaseDistributedSearchTestCase.testDistribSearch(BaseDistributedSearchTestCase.java:869)
at sun.reflect.GeneratedMethodAccessor36.invoke(Unknown Source)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1618)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:827)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:877)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesInvariantRule$1.evaluate(SystemPropertiesInvariantRule.java:55)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:836)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:738)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:772)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:783)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesInvariantRule$1.evaluate(SystemPropertiesInvariantRule.java:55)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:43)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
 

[jira] [Created] (SOLR-6458) SolrCellMorphlineTest.testSolrCellDocumentTypes2 fail: ignored_creation_date expected:<[2007-10-01T16:13:56Z]> but was:<[1464-10-01T16:13:56Z]>

2014-08-31 Thread Mark Miller (JIRA)
Mark Miller created SOLR-6458:
-

 Summary: SolrCellMorphlineTest.testSolrCellDocumentTypes2 fail: 
ignored_creation_date expected:<[2007-10-01T16:13:56Z]> but 
was:<[1464-10-01T16:13:56Z]>
 Key: SOLR-6458
 URL: https://issues.apache.org/jira/browse/SOLR-6458
 Project: Solr
  Issue Type: Test
  Components: Tests
Reporter: Mark Miller
Assignee: Mark Miller
Priority: Minor









[jira] [Commented] (SOLR-6458) SolrCellMorphlineTest.testSolrCellDocumentTypes2 fail: ignored_creation_date expected:<[2007-10-01T16:13:56Z]> but was:<[1464-10-01T16:13:56Z]>

2014-08-31 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-6458?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14116824#comment-14116824
 ] 

ASF subversion and git services commented on SOLR-6458:
---

Commit 1621610 from [~markrmil...@gmail.com] in branch 'dev/branches/branch_4x'
[ https://svn.apache.org/r1621610 ]

SOLR-6458: Hard coded ENGLISH locale for AbstractSolrMorphlineTestBase tests.

> SolrCellMorphlineTest.testSolrCellDocumentTypes2 fail: ignored_creation_date 
> expected:<[2007-10-01T16:13:56Z]> but was:<[1464-10-01T16:13:56Z]>
> ---
>
> Key: SOLR-6458
> URL: https://issues.apache.org/jira/browse/SOLR-6458
> Project: Solr
>  Issue Type: Test
>  Components: Tests
>Reporter: Mark Miller
>Assignee: Mark Miller
>Priority: Minor
>







[jira] [Commented] (SOLR-6458) SolrCellMorphlineTest.testSolrCellDocumentTypes2 fail: ignored_creation_date expected:<[2007-10-01T16:13:56Z]> but was:<[1464-10-01T16:13:56Z]>

2014-08-31 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-6458?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14116823#comment-14116823
 ] 

ASF subversion and git services commented on SOLR-6458:
---

Commit 1621609 from [~markrmil...@gmail.com] in branch 'dev/trunk'
[ https://svn.apache.org/r1621609 ]

SOLR-6458: Hard coded ENGLISH locale for AbstractSolrMorphlineTestBase tests.

> SolrCellMorphlineTest.testSolrCellDocumentTypes2 fail: ignored_creation_date 
> expected:<[2007-10-01T16:13:56Z]> but was:<[1464-10-01T16:13:56Z]>
> ---
>
> Key: SOLR-6458
> URL: https://issues.apache.org/jira/browse/SOLR-6458
> Project: Solr
>  Issue Type: Test
>  Components: Tests
>Reporter: Mark Miller
>Assignee: Mark Miller
>Priority: Minor
>







[jira] [Closed] (SOLR-5926) Add ComplexPhraseQParserPlugin to Ref Guide (cwiki)

2014-08-31 Thread Erick Erickson (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-5926?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Erick Erickson closed SOLR-5926.

   Resolution: Fixed
Fix Version/s: (was: 4.9)
   4.10

Thanks Chris! I added a little (mostly took the existing Wiki page and edited). 
I think it's enough for now at least.

> Add ComplexPhraseQParserPlugin to Ref Guide (cwiki)
> ---
>
> Key: SOLR-5926
> URL: https://issues.apache.org/jira/browse/SOLR-5926
> Project: Solr
>  Issue Type: Improvement
>  Components: documentation
>Affects Versions: 4.7
>Reporter: Ahmet Arslan
>Assignee: Erick Erickson
>Priority: Minor
>  Labels: documentation
> Fix For: 5.0, 4.10
>
>
> Documentation of http://wiki.apache.org/solr/ComplexPhraseQueryParser
> in the ref guide, "Other Parsers" section. 
> https://cwiki.apache.org/confluence/display/solr/Other+Parsers 






[jira] [Commented] (SOLR-4351) JSON QParser integration

2014-08-31 Thread Manuel Lenormand (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-4351?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14116827#comment-14116827
 ] 

Manuel Lenormand commented on SOLR-4351:


Is there any way to convert a Query object into a JSON representation instead 
of building it on my own? Is there any jsonQueryBuilder?

Has anyone found bugs or unimplemented functionality in this JIRA before I 
take it into a test environment?

> JSON QParser integration
> 
>
> Key: SOLR-4351
> URL: https://issues.apache.org/jira/browse/SOLR-4351
> Project: Solr
>  Issue Type: New Feature
>Reporter: Yonik Seeley
> Attachments: SOLR-4351.patch
>
>
> Our QParser framework currently gets parameters from localParams.  JSON 
> integration would allow specifying parameters to the parsers in JSON.






[jira] [Resolved] (SOLR-6458) SolrCellMorphlineTest.testSolrCellDocumentTypes2 fail: ignored_creation_date expected:<[2007-10-01T16:13:56Z]> but was:<[1464-10-01T16:13:56Z]>

2014-08-31 Thread Mark Miller (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-6458?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Miller resolved SOLR-6458.
---
   Resolution: Fixed
Fix Version/s: 4.11
   5.0

> SolrCellMorphlineTest.testSolrCellDocumentTypes2 fail: ignored_creation_date 
> expected:<[2007-10-01T16:13:56Z]> but was:<[1464-10-01T16:13:56Z]>
> ---
>
> Key: SOLR-6458
> URL: https://issues.apache.org/jira/browse/SOLR-6458
> Project: Solr
>  Issue Type: Test
>  Components: Tests
>Reporter: Mark Miller
>Assignee: Mark Miller
>Priority: Minor
> Fix For: 5.0, 4.11
>
>







[jira] [Closed] (SOLR-5896) Create and edit a CWiki page that describes UpdateRequestProcessors, especially FieldMutatingUpdateProcessors

2014-08-31 Thread Erick Erickson (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-5896?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Erick Erickson closed SOLR-5896.

   Resolution: Fixed
Fix Version/s: 4.10

Short descriptions have been added to the Reference Guide. There's room for any 
supporting detail we need, but at least it lets users know these exist.

> Create and edit a CWiki page that describes UpdateRequestProcessors, 
> especially FieldMutatingUpdateProcessors
> -
>
> Key: SOLR-5896
> URL: https://issues.apache.org/jira/browse/SOLR-5896
> Project: Solr
>  Issue Type: Improvement
>  Components: documentation
>Affects Versions: 4.8, 5.0
>Reporter: Erick Erickson
>Assignee: Erick Erickson
> Fix For: 4.10
>
>
> The capabilities here aren't really documented as a group anywhere I could 
> see in the official pages; there are a couple of references to them, but 
> nothing that really draws attention to them. These need to be documented.
> Where does it make sense to put this? It doesn't really fit under 
> "Understanding Analyzers, Tokenizers, and Filters", except kinda since they 
> can be used to alter how data gets indexed, think of the 
> Parse[Date|Int|Float..] factories.
> Straw-man: add child pages to "Understanding Analyzers, Tokenizers, and 
> Filters" for "What is an UpdateRequestProcessor", "UpdateRequestProcessors", 
> and probably something like "How to configure your UpdateRequestProcessor". 
> Or???






Re: Odd test failures

2014-08-31 Thread Dawid Weiss
It's because the exception is triggered in a static class,
FaultyIndexInput (initialized in a static context,
TestFieldsReader#beforeClass). When you ask for -Dtests.iters, only
the tests (methods) are duplicated; the static context remains the
same. So the "count" variable in FaultyIndexInput is actually
propagated from test to test, and each repetition is not really atomic/
isolated from the others (search for one of the recent e-mails to Ryan; it
contains deeper information on why and how this works).
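A minimal illustration of the static-state behaviour described above: state initialised in a static (per-suite) context survives across -Dtests.iters repetitions of a test method, so iteration #0 can pass while later iterations inherit the mutated counter. The class below is illustrative, not actual Lucene test code.

```java
public class StaticStateExample {
    // Initialised once per "suite", like FaultyIndexInput's counter;
    // NOT reset between repeated runs of the same test method.
    static int count = 0;

    /** Stand-in for one repetition of a test method that mutates shared state. */
    static boolean repetitionPasses() {
        count++;                // each repetition bumps the shared counter
        return count == 1;      // only the first repetition sees a clean slate
    }
}
```

Running repetitionPasses() twice mimics -Dtests.iters=2: the first call succeeds, the second fails because the static counter carried over.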

You can repeat the failure if you repeat exactly the same seed for
each repetition (including test methods):

ant test  -Dtestcase=TestFieldsReader
-Dtests.seed=DFB0B84C4D087DFD:1DE75618D1B7C867 -Dtests.slow=true
-Dtests.locale=sr_ME -Dtests.timezone=Asia/Kashgar
-Dtests.file.encoding=UTF-8 -Dtests.iters=10

This yields:

Tests with failures:
  - org.apache.lucene.index.TestFieldsReader.testExceptions {#1
seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
  - org.apache.lucene.index.TestFieldsReader.testExceptions {#2
seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
  - org.apache.lucene.index.TestFieldsReader.testExceptions {#3
seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
  - org.apache.lucene.index.TestFieldsReader.testExceptions {#4
seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
  - org.apache.lucene.index.TestFieldsReader.testExceptions {#5
seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
  - org.apache.lucene.index.TestFieldsReader.testExceptions {#6
seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
  - org.apache.lucene.index.TestFieldsReader.testExceptions {#7
seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
  - org.apache.lucene.index.TestFieldsReader.testExceptions {#8
seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
  - org.apache.lucene.index.TestFieldsReader.testExceptions {#9
seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}

Note I included per-method seed in the -Dtests.seed. Also, #0
iteration *does pass*; the remaining ones fail because of what I said
above.

Dawid

On Sun, Aug 31, 2014 at 6:20 AM, Erick Erickson  wrote:
> I'm seeing the fairly easily-reproducible error below on trunk.
> Unfortunately it doesn't reproduce with the seed. I'm wondering if anyone
> has an inkling what's going on here?
>
> I did manage to notice that I screwed up the command I was _intending_ and
> actually issued the command below, although I have a hard time imagining
> that's the problem, unless it's something like running tests.iters on the
> full suite makes this happen. No wonder -Dtests.iters=100 didn't finish...
> Siii.
>
> ant -Dtestcasae=TestDistributedSearch -Dtests.iters=10 test
>
> Note I spelled 'testcase' as 'testcasae'...
>
>
> Stack trace:
>
>[junit4] Suite: org.apache.lucene.index.TestFieldsReader
>
>[junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestFieldsReader
> -Dtests.method=testExceptions -Dtests.seed=DFB0B84C4D087DFD
> -Dtests.slow=true -Dtests.locale=sr_ME -Dtests.timezone=Asia/Kashgar
> -Dtests.file.encoding=UTF-8
>
>[junit4] ERROR   0.10s J2 | TestFieldsReader.testExceptions {#1
> seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]} <<<
>
>[junit4]> Throwable #1: java.io.IOException: Simulated network outage
>
>[junit4]> at
> __randomizedtesting.SeedInfo.seed([DFB0B84C4D087DFD:1DE75618D1B7C867]:0)
>
>[junit4]> at
> org.apache.lucene.index.TestFieldsReader$FaultyIndexInput.simOutage(TestFieldsReader.java:156)
>
>[junit4]> at
> org.apache.lucene.index.TestFieldsReader$FaultyIndexInput.readInternal(TestFieldsReader.java:161)
>
>[junit4]> at
> org.apache.lucene.store.BufferedIndexInput.refill(BufferedIndexInput.java:342)
>
>[junit4]> at
> org.apache.lucene.store.BufferedIndexInput.readBytes(BufferedIndexInput.java:140)
>
>[junit4]> at
> org.apache.lucene.store.BufferedIndexInput.readBytes(BufferedIndexInput.java:116)
>
>[junit4]> at
> org.apache.lucene.store.BufferedChecksumIndexInput.readBytes(BufferedChecksumIndexInput.java:49)
>
>[junit4]> at
> org.apache.lucene.store.DataInput.readString(DataInput.java:234)
>
>[junit4]> at
> org.apache.lucene.store.DataInput.readStringStringMap(DataInput.java:263)
>
>[junit4]> at
> org.apache.lucene.codecs.lucene46.Lucene46FieldInfosReader.read(Lucene46FieldInfosReader.java:93)
>
>[junit4]> at
> org.apache.lucene.index.SegmentReader.readFieldInfos(SegmentReader.java:216)
>
>[junit4]> at
> org.apache.lucene.index.SegmentReader.(SegmentReader.java:97)
>
>[junit4]> at
> org.apache.lucene.index.StandardDirectoryReader$1.doBody(StandardDirectoryReader.java:59)
>
>[junit4]> at
> org.apache.lucene.index.SegmentInfos$FindSegmentsFile.run(SegmentInfos.java:795)
>
>[junit4]> at
> org.apache.lucene.index.StandardDirectoryReader.open(StandardDirectoryReader.java:50)
>
>[junit4]> at
> org.apache.lucene.index.DirectoryReader.open(DirectoryReader.java:64)
>
>[junit4]> at
> org.apache.lucene.index.TestFieldsReader.testExceptions(TestFieldsReader.java:209)
>
>[junit4]   

Re: Odd test failures

2014-08-31 Thread Erick Erickson
Thanks Dawid! So my take-away is that this is just pilot error on my part,
not something intrinsic to the code.

Erick


On Sun, Aug 31, 2014 at 12:46 PM, Dawid Weiss 
wrote:

> It's because the exception is triggered in a static class,
> FaultyIndexInput (initialized in a static context,
> TestFieldsReader#beforeClass). When you ask for -Dtests.iters, only
> the tests (methods) are duplicated; the static context remains the
> same. So the "count" variable in FaultyIndexInput is actually
> propagated from test to test, and each repetition is not really atomic/
> isolated from the others (search for one of the recent e-mails to Ryan; it
> contains deeper information on why and how this works).
>
> You can repeat the failure if you repeat exactly the same seed for
> each repetition (including test methods):
>
> ant test  -Dtestcase=TestFieldsReader
> -Dtests.seed=DFB0B84C4D087DFD:1DE75618D1B7C867 -Dtests.slow=true
> -Dtests.locale=sr_ME -Dtests.timezone=Asia/Kashgar
> -Dtests.file.encoding=UTF-8 -Dtests.iters=10
>
> This yields:
>
> Tests with failures:
>   - org.apache.lucene.index.TestFieldsReader.testExceptions {#1
> seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
>   - org.apache.lucene.index.TestFieldsReader.testExceptions {#2
> seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
>   - org.apache.lucene.index.TestFieldsReader.testExceptions {#3
> seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
>   - org.apache.lucene.index.TestFieldsReader.testExceptions {#4
> seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
>   - org.apache.lucene.index.TestFieldsReader.testExceptions {#5
> seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
>   - org.apache.lucene.index.TestFieldsReader.testExceptions {#6
> seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
>   - org.apache.lucene.index.TestFieldsReader.testExceptions {#7
> seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
>   - org.apache.lucene.index.TestFieldsReader.testExceptions {#8
> seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
>   - org.apache.lucene.index.TestFieldsReader.testExceptions {#9
> seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]}
>
> Note I included per-method seed in the -Dtests.seed. Also, #0
> iteration *does pass*; the remaining ones fail because of what I said
> above.
>
> Dawid
>
> On Sun, Aug 31, 2014 at 6:20 AM, Erick Erickson 
> wrote:
> > I'm seeing the fairly easily-reproducible error below on trunk.
> > Unfortunately it doesn't reproduce with the seed. I'm wondering if anyone
> > has an inkling what's going on here?
> >
> > I did manage to notice that I screwed up the command I was _intending_ and
> > actually issued the command below, although I have a hard time imagining
> > that's the problem, unless it's something like running tests.iters on the
> > full suite makes this happen. No wonder -Dtests.iters=100 didn't finish...
> > Sigh.
> >
> > ant -Dtestcasae=TestDistributedSearch -Dtests.iters=10 test
> >
> > Note I spelled 'testcase' as 'testcasae'...
> >
> >
> > Stack trace:
> >
> >[junit4] Suite: org.apache.lucene.index.TestFieldsReader
> >
> >[junit4]   2> NOTE: reproduce with: ant test
> -Dtestcase=TestFieldsReader
> > -Dtests.method=testExceptions -Dtests.seed=DFB0B84C4D087DFD
> > -Dtests.slow=true -Dtests.locale=sr_ME -Dtests.timezone=Asia/Kashgar
> > -Dtests.file.encoding=UTF-8
> >
> >[junit4] ERROR   0.10s J2 | TestFieldsReader.testExceptions {#1
> > seed=[DFB0B84C4D087DFD:1DE75618D1B7C867]} <<<
> >
> >[junit4]> Throwable #1: java.io.IOException: Simulated network
> outage
> >
> >[junit4]> at
> > __randomizedtesting.SeedInfo.seed([DFB0B84C4D087DFD:1DE75618D1B7C867]:0)
> >
> >[junit4]> at
> >
> org.apache.lucene.index.TestFieldsReader$FaultyIndexInput.simOutage(TestFieldsReader.java:156)
> >
> >[junit4]> at
> >
> org.apache.lucene.index.TestFieldsReader$FaultyIndexInput.readInternal(TestFieldsReader.java:161)
> >
> >[junit4]> at
> >
> org.apache.lucene.store.BufferedIndexInput.refill(BufferedIndexInput.java:342)
> >
> >[junit4]> at
> >
> org.apache.lucene.store.BufferedIndexInput.readBytes(BufferedIndexInput.java:140)
> >
> >[junit4]> at
> >
> org.apache.lucene.store.BufferedIndexInput.readBytes(BufferedIndexInput.java:116)
> >
> >[junit4]> at
> >
> org.apache.lucene.store.BufferedChecksumIndexInput.readBytes(BufferedChecksumIndexInput.java:49)
> >
> >[junit4]> at
> > org.apache.lucene.store.DataInput.readString(DataInput.java:234)
> >
> >[junit4]> at
> > org.apache.lucene.store.DataInput.readStringStringMap(DataInput.java:263)
> >
> >[junit4]> at
> >
> org.apache.lucene.codecs.lucene46.Lucene46FieldInfosReader.read(Lucene46FieldInfosReader.java:93)
> >
> >[junit4]> at
> >
> org.apache.lucene.index.SegmentReader.readFieldInfos(SegmentReader.java:216)
> >
> >[junit4]> at
> > org.apache.lucene.index.SegmentReader.(SegmentReader.java:97)
> >
> >[junit4]> at
> >
> org.apache.lucene.index.StandardDirectoryReader$1.doBody(StandardDirectoryReader.java:59)
> >
> >[junit4]

[jira] [Updated] (SOLR-6024) StatsComponent does not work for docValues enabled multiValued fields

2014-08-31 Thread Vitaliy Zhovtyuk (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-6024?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vitaliy Zhovtyuk updated SOLR-6024:
---
Attachment: SOLR-6024-trunk.patch

Patch based on the latest trunk patch.
Added stats calculation tests for docValues and multiValued fields of float and 
integer numeric types, added distinct count calculation, and added a stats.facet 
query on a docValues field (which leads to a field type exception).

> StatsComponent does not work for docValues enabled multiValued fields
> -
>
> Key: SOLR-6024
> URL: https://issues.apache.org/jira/browse/SOLR-6024
> Project: Solr
>  Issue Type: Bug
>  Components: SearchComponents - other
>Affects Versions: 4.8
> Environment: java version "1.7.0_45"
> Mac OS X Version 10.7.5
>Reporter: Ahmet Arslan
>  Labels: StatsComponent, docValues, multiValued
> Fix For: 4.9
>
> Attachments: SOLR-6024-trunk.patch, SOLR-6024-trunk.patch, 
> SOLR-6024-trunk.patch, SOLR-6024-trunk.patch, SOLR-6024-trunk.patch, 
> SOLR-6024.patch, SOLR-6024.patch
>
>
> Harish Agarwal reported this on the solr-user mailing list: 
> http://search-lucene.com/m/QTPaoTJXV1
> It is easy to reproduce with the default example Solr setup. The following field 
> types are added to the example schema.xml, and the exampledocs are indexed.
> {code:xml}
> <field ... docValues="true" multiValued="true"/>
> <field ... docValues="true" multiValued="true"/>
> {code}
> When {{docValues="true"}} *and* {{multiValued="true"}} are used at the same 
> time, StatsComponent throws :
> {noformat}
> ERROR org.apache.solr.core.SolrCore  – org.apache.solr.common.SolrException: 
> Type mismatch: popularity was indexed as SORTED_SET
>   at 
> org.apache.solr.request.UnInvertedField.(UnInvertedField.java:193)
>   at 
> org.apache.solr.request.UnInvertedField.getUnInvertedField(UnInvertedField.java:699)
>   at 
> org.apache.solr.handler.component.SimpleStats.getStatsFields(StatsComponent.java:319)
>   at 
> org.apache.solr.handler.component.SimpleStats.getStatsCounts(StatsComponent.java:290)
>   at 
> org.apache.solr.handler.component.StatsComponent.process(StatsComponent.java:78)
>   at 
> org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:221)
>   at 
> org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
>   at org.apache.solr.core.SolrCore.execute(SolrCore.java:1964)
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (LUCENE-5914) More options for stored fields compression

2014-08-31 Thread Adrien Grand (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-5914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14116904#comment-14116904
 ] 

Adrien Grand commented on LUCENE-5914:
--

[~erickerickson] [~elyograg] Do you have pointers to emails/irc logs describing 
such issues? Maybe that is something that can be solved without disabling 
compression? By the way, the current patch already improves CPU usage in two 
cases: when doing random access since you can decompress a single document at a 
time, and also for sequential access (typically used if you export your current 
dataset using stored fields, or internally by Lucene when merging mixed 
codecs), so maybe that would already help.

[~rcmuir] I will change this.
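The speed-vs-ratio trade-off under discussion can be illustrated with the JDK's own deflate levels (a standalone sketch using java.util.zip, not Lucene's stored-fields code; the class and method names here are made up for illustration):

```java
import java.util.zip.Deflater;

// Hypothetical illustration: compress the same bytes at different deflate
// levels to show the kind of speed/ratio trade-off LUCENE-5914 is about.
// BEST_SPEED is roughly zlib level 1, BEST_COMPRESSION is level 9.
public class DeflateLevels {
    static int compressedSize(byte[] input, int level) {
        Deflater d = new Deflater(level);
        d.setInput(input);
        d.finish();
        byte[] out = new byte[input.length * 2 + 64]; // large enough for worst case here
        int n = 0;
        while (!d.finished()) {
            n += d.deflate(out, n, out.length - n);
        }
        d.end();
        return n;
    }

    public static void main(String[] args) {
        // Highly repetitive data, standing in for compressible stored fields.
        byte[] doc = new byte[8192];
        for (int i = 0; i < doc.length; i++) doc[i] = (byte) ('a' + (i % 4));
        System.out.println("BEST_SPEED size:       " + compressedSize(doc, Deflater.BEST_SPEED));
        System.out.println("BEST_COMPRESSION size: " + compressedSize(doc, Deflater.BEST_COMPRESSION));
    }
}
```

The higher level buys a smaller output at the cost of more CPU per compressed block, which is exactly the knob users on the two sides of this issue want to turn in opposite directions.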

> More options for stored fields compression
> --
>
> Key: LUCENE-5914
> URL: https://issues.apache.org/jira/browse/LUCENE-5914
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Adrien Grand
>Assignee: Adrien Grand
> Fix For: 4.11
>
> Attachments: LUCENE-5914.patch
>
>
> Since we added codec-level compression in Lucene 4.1, I think I have gotten 
> about the same number of users complaining that compression was too aggressive 
> as complaining that it was too light.
> I think it is due to the fact that we have users that are doing very 
> different things with Lucene. For example if you have a small index that fits 
> in the filesystem cache (or is close to), then you might never pay for actual 
> disk seeks, and in such a case the fact that the current stored fields format 
> needs to over-decompress data can noticeably slow search down on cheap queries.
> On the other hand, it is more and more common to use Lucene for things like 
> log analytics, and in that case you have huge amounts of data for which you 
> don't care much about stored fields performance. However it is very 
> frustrating to notice that the data that you store takes several times less 
> space when you gzip it compared to your index although Lucene claims to 
> compress stored fields.
> For that reason, I think it would be nice to have some kind of option that 
> would allow trading speed for compression in the default codec.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-4.x-Linux (32bit/jdk1.8.0_20) - Build # 11017 - Failure!

2014-08-31 Thread Policeman Jenkins Server
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-4.x-Linux/11017/
Java: 32bit/jdk1.8.0_20 -client -XX:+UseSerialGC

6 tests failed.
REGRESSION:  
org.apache.solr.client.solrj.SolrExampleBinaryTest.testChildDoctransformer

Error Message:
Expected mime type application/octet-stream but got text/html.

Error 500 Server Error
HTTP ERROR: 500
Problem accessing /solr/collection1/select. Reason: Server Error
Powered by Jetty://
Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrServer$RemoteSolrException: Expected 
mime type application/octet-stream but got text/html. 


Error 500 Server Error
HTTP ERROR: 500
Problem accessing /solr/collection1/select. Reason: Server Error
Powered by Jetty://

at 
__randomizedtesting.SeedInfo.seed([EA23CFBE5BD580A8:99F9D024D7CDF7AE]:0)
at 
org.apache.solr.client.solrj.impl.HttpSolrServer.executeMethod(HttpSolrServer.java:512)
at 
org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:210)
at 
org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:206)
at 
org.apache.solr.client.solrj.request.QueryRequest.process(QueryRequest.java:91)
at org.apache.solr.client.solrj.SolrServer.query(SolrServer.java:301)
at 
org.apache.solr.client.solrj.SolrExampleTests.testChildDoctransformer(SolrExampleTests.java:1373)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1618)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:827)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:877)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
at 
org.apache.lucene.util.TestRuleFieldCacheSanity$1.evaluate(TestRuleFieldCacheSanity.java:51)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesInvariantRule$1.evaluate(SystemPropertiesInvariantRule.java:55)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
at 
com.carrot

[jira] [Commented] (LUCENE-5914) More options for stored fields compression

2014-08-31 Thread Robert Muir (JIRA)

[ 
https://issues.apache.org/jira/browse/LUCENE-5914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14116926#comment-14116926
 ] 

Robert Muir commented on LUCENE-5914:
-

[~jpountz] well, I was just curious about the motivation behind it. 

There are advantages and disadvantages to each way, but as separate formats 
each use case would really be able to proceed in its own direction in the 
future. 

With the current patch, BEST_COMPRESSION = Deflate, but what if we wanted to 
replace it with bzip later and still support the old indexes, etc. (which I 
think is a goal of this issue)? 

So I think it's better to have separate formats (if they want to share some 
code behind the scenes at the moment, that's OK). If we want to provide back 
compat on both options, then that's something we can decide to do here (IMO, 
there should be a "price" for the added complexity, such as moving all ancient 
codecs out of lucene/core, dropping 3.x index support -- something to keep it 
all manageable).


> More options for stored fields compression
> --
>
> Key: LUCENE-5914
> URL: https://issues.apache.org/jira/browse/LUCENE-5914
> Project: Lucene - Core
>  Issue Type: Improvement
>Reporter: Adrien Grand
>Assignee: Adrien Grand
> Fix For: 4.11
>
> Attachments: LUCENE-5914.patch
>
>
> Since we added codec-level compression in Lucene 4.1, I think I have gotten 
> about the same number of users complaining that compression was too aggressive 
> as complaining that it was too light.
> I think it is due to the fact that we have users that are doing very 
> different things with Lucene. For example if you have a small index that fits 
> in the filesystem cache (or is close to), then you might never pay for actual 
> disk seeks, and in such a case the fact that the current stored fields format 
> needs to over-decompress data can noticeably slow search down on cheap queries.
> On the other hand, it is more and more common to use Lucene for things like 
> log analytics, and in that case you have huge amounts of data for which you 
> don't care much about stored fields performance. However it is very 
> frustrating to notice that the data that you store takes several times less 
> space when you gzip it compared to your index although Lucene claims to 
> compress stored fields.
> For that reason, I think it would be nice to have some kind of option that 
> would allow trading speed for compression in the default codec.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[JENKINS] Lucene-Solr-trunk-Windows (32bit/jdk1.8.0_20) - Build # 4285 - Still Failing!

2014-08-31 Thread Policeman Jenkins Server
Build: http://jenkins.thetaphi.de/job/Lucene-Solr-trunk-Windows/4285/
Java: 32bit/jdk1.8.0_20 -server -XX:+UseG1GC

1 tests failed.
REGRESSION:  
org.apache.solr.client.solrj.impl.ConcurrentUpdateSolrServerTest.testConcurrentUpdate

Error Message:
Expected no errors but got 1, due to:  org.apache.http.NoHttpResponseException: 
The target server failed to respond

Stack Trace:
java.lang.AssertionError: Expected no errors but got 1, due to:  
org.apache.http.NoHttpResponseException: The target server failed to respond
at 
__randomizedtesting.SeedInfo.seed([30B2572329F37260:C8B40ED8FDCB172B]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at 
org.apache.solr.client.solrj.impl.ConcurrentUpdateSolrServerTest.testConcurrentUpdate(ConcurrentUpdateSolrServerTest.java:189)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1618)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:827)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:877)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesInvariantRule$1.evaluate(SystemPropertiesInvariantRule.java:55)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:836)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:738)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:772)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:783)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesInvariantRule$1.evaluate(SystemPropertiesInvariantRule.java:55)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:43)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRu

[JENKINS] Lucene-Solr-NightlyTests-trunk - Build # 613 - Still Failing

2014-08-31 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-trunk/613/

1 tests failed.
REGRESSION:  org.apache.lucene.index.TestIndexWriterOutOfMemory.testBasics

Error Message:
TEST-TestIndexWriterOutOfMemory.testBasics-seed#[C4C9D7B4B9D5D290]: RefCount is 
0 pre-decrement for file "_2_2_Lucene410_0.dvd"

Stack Trace:
java.lang.AssertionError: 
TEST-TestIndexWriterOutOfMemory.testBasics-seed#[C4C9D7B4B9D5D290]: RefCount is 
0 pre-decrement for file "_2_2_Lucene410_0.dvd"
at 
__randomizedtesting.SeedInfo.seed([C4C9D7B4B9D5D290:F9117998813B8CE0]:0)
at 
org.apache.lucene.index.IndexFileDeleter$RefCount.DecRef(IndexFileDeleter.java:725)
at 
org.apache.lucene.index.IndexFileDeleter.decRef(IndexFileDeleter.java:611)
at 
org.apache.lucene.index.IndexFileDeleter.decRef(IndexFileDeleter.java:599)
at 
org.apache.lucene.index.IndexWriter.prepareCommitInternal(IndexWriter.java:2851)
at 
org.apache.lucene.index.IndexWriter.commitInternal(IndexWriter.java:2952)
at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:2919)
at 
org.apache.lucene.index.TestIndexWriterOutOfMemory.testBasics(TestIndexWriterOutOfMemory.java:215)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1618)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:827)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:877)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesInvariantRule$1.evaluate(SystemPropertiesInvariantRule.java:55)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:836)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:738)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:772)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:783)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesInvariantRule$1.evaluate(SystemPropertiesInvariantRule.java:55)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:43)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(Th

[JENKINS] Lucene-Solr-Tests-trunk-Java7 - Build # 4822 - Failure

2014-08-31 Thread Apache Jenkins Server
Build: https://builds.apache.org/job/Lucene-Solr-Tests-trunk-Java7/4822/

1 tests failed.
REGRESSION:  org.apache.solr.client.solrj.TestLBHttpSolrServer.testReliability

Error Message:
No live SolrServers available to handle this request

Stack Trace:
org.apache.solr.client.solrj.SolrServerException: No live SolrServers available 
to handle this request
at 
__randomizedtesting.SeedInfo.seed([AEC23B5D5F18357D:6F0AE61BFE7EE4D4]:0)
at 
org.apache.solr.client.solrj.impl.LBHttpSolrServer.request(LBHttpSolrServer.java:525)
at 
org.apache.solr.client.solrj.request.QueryRequest.process(QueryRequest.java:91)
at org.apache.solr.client.solrj.SolrServer.query(SolrServer.java:301)
at 
org.apache.solr.client.solrj.TestLBHttpSolrServer.testReliability(TestLBHttpSolrServer.java:222)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1618)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:827)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:877)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
at 
org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesInvariantRule$1.evaluate(SystemPropertiesInvariantRule.java:55)
at 
org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:836)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:738)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:772)
at 
com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:783)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:53)
at 
org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
at 
org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
at 
com.carrotsearch.randomizedtesting.rules.SystemPropertiesInvariantRule$1.evaluate(SystemPropertiesInvariantRule.java:55)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:43)
at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
at 
org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:

Re: Odd test failures

2014-08-31 Thread Dawid Weiss
> Thanks Dawid! So my take-away is that this is just pilot error on my part,
> not something intrinsic to the code.

I don't know enough about the code to say for sure, but it seems to me
that FaultyIndexInput's count field should be reset before each test
(it shouldn't propagate from test to test, effectively making each test
rely on the number of tests run before it). As for the exception itself,
I've no idea -- I didn't look at the code; I think it may be assuming
only one iteration.

Dawid
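A minimal, self-contained sketch of the failure mode described above (hypothetical class and names; the real state lives in TestFieldsReader$FaultyIndexInput, set up in @BeforeClass). Static state is initialized once for the whole suite and survives across -Dtests.iters repetitions, so only the first repetition sees a fresh counter:

```java
// Hypothetical sketch: a suite-level static counter that is never reset,
// mimicking how FaultyIndexInput's "count" leaks across -Dtests.iters runs.
public class StaticStateDemo {
    static int count = 0;  // initialized once, like state created in @BeforeClass

    // Stand-in for one repetition of the same test method.
    static boolean testExceptions() {
        count++;            // mutates suite-level state shared by all repetitions
        return count == 1;  // only repetition #0 sees a fresh counter
    }

    public static void main(String[] args) {
        for (int i = 0; i < 3; i++) {
            System.out.println("repetition #" + i + " passes: " + testExceptions());
        }
        // The fix suggested here: reset the static state before each test,
        // e.g. "count = 0;" in an @Before method, so repetitions are isolated.
    }
}
```

Running it shows repetition #0 passing and every later repetition failing, which matches the pattern in the thread: iteration #0 passes while #1 through #9 fail.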
