-1 because of SPARK-16181, which is a correctness regression from 1.6. It looks 
like the patch is ready, though: https://github.com/apache/spark/pull/13884. It 
would be ideal for that patch to make it into the release.

-Matt Cheah

From: Nick Pentreath <nick.pentre...@gmail.com>
Date: Friday, June 24, 2016 at 4:37 AM
To: "dev@spark.apache.org" <dev@spark.apache.org>
Subject: Re: [VOTE] Release Apache Spark 2.0.0 (RC1)

I'm getting the following when trying to run ./dev/run-tests from the extracted 
source tar (it doesn't happen on master). Is anyone else seeing this?

error: Could not access 'fc0a1475ef'
**********************************************************************
File "./dev/run-tests.py", line 69, in __main__.identify_changed_files_from_git_commits
Failed example:
    [x.name for x in determine_modules_for_files(
        identify_changed_files_from_git_commits("fc0a1475ef", target_ref="5da21f07"))]
Exception raised:
    Traceback (most recent call last):
      File "/Users/nick/miniconda2/lib/python2.7/doctest.py", line 1315, in __run
        compileflags, 1) in test.globs
      File "<doctest __main__.identify_changed_files_from_git_commits[0]>", line 1, in <module>
        [x.name for x in determine_modules_for_files(
            identify_changed_files_from_git_commits("fc0a1475ef", target_ref="5da21f07"))]
      File "./dev/run-tests.py", line 86, in identify_changed_files_from_git_commits
        universal_newlines=True)
      File "/Users/nick/miniconda2/lib/python2.7/subprocess.py", line 573, in check_output
        raise CalledProcessError(retcode, cmd, output=output)
    CalledProcessError: Command '['git', 'diff', '--name-only', 'fc0a1475ef', '5da21f07']' returned non-zero exit status 1
error: Could not access '50a0496a43'
**********************************************************************
File "./dev/run-tests.py", line 71, in __main__.identify_changed_files_from_git_commits
Failed example:
    'root' in [x.name for x in determine_modules_for_files(
        identify_changed_files_from_git_commits("50a0496a43", target_ref="6765ef9"))]
Exception raised:
    Traceback (most recent call last):
      File "/Users/nick/miniconda2/lib/python2.7/doctest.py", line 1315, in __run
        compileflags, 1) in test.globs
      File "<doctest __main__.identify_changed_files_from_git_commits[1]>", line 1, in <module>
        'root' in [x.name for x in determine_modules_for_files(
            identify_changed_files_from_git_commits("50a0496a43", target_ref="6765ef9"))]
      File "./dev/run-tests.py", line 86, in identify_changed_files_from_git_commits
        universal_newlines=True)
      File "/Users/nick/miniconda2/lib/python2.7/subprocess.py", line 573, in check_output
        raise CalledProcessError(retcode, cmd, output=output)
    CalledProcessError: Command '['git', 'diff', '--name-only', '50a0496a43', '6765ef9']' returned non-zero exit status 1
**********************************************************************
1 items had failures:
   2 of   2 in __main__.identify_changed_files_from_git_commits
***Test Failed*** 2 failures.
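
For what it's worth, the failing doctests just shell out to git diff against two 
hard-coded commits. A minimal sketch of that call (my own reconstruction from the 
traceback above, not the actual code in dev/run-tests.py) reproduces the non-zero 
exit when run from the extracted tar, since those commits aren't reachable there:

# Rough reconstruction of what identify_changed_files_from_git_commits does,
# based only on the traceback above; the command and hashes come from the failure.
import subprocess

def changed_files(patch_sha, target_ref):
    # Same command as in the CalledProcessError; if either commit is not
    # reachable (e.g. in a source tarball without full git history), git
    # exits non-zero and check_output raises.
    raw = subprocess.check_output(
        ["git", "diff", "--name-only", patch_sha, target_ref],
        universal_newlines=True)
    return [f for f in raw.split("\n") if f]

# Works from a full clone; from the RC1 source tar it raises the
# CalledProcessError shown above.
print(changed_files("fc0a1475ef", "5da21f07"))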



On Fri, 24 Jun 2016 at 06:59 Yin Huai <yh...@databricks.com> wrote:
-1 because of https://issues.apache.org/jira/browse/SPARK-16121.

This JIRA was resolved after 2.0.0-RC1 was cut. Without the fix, Spark SQL 
effectively uses only the driver to list files when loading datasets, and that 
driver-side file listing is very slow for datasets with many files and 
partitions. Since this bug causes a serious performance regression, I am giving 
-1.
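
To illustrate the shape of the affected workload (purely a sketch; the path and 
partition layout below are made up), this is the kind of load where the 
driver-only file listing dominates:

# Hypothetical example: reading a Parquet dataset split across many partition
# directories and files. Without the SPARK-16121 fix, listing all of those
# files happens only on the driver, which is what makes the load so slow.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("file-listing-check").getOrCreate()

# Any heavily partitioned dataset shows the slowdown; the path is illustrative.
df = spark.read.parquet("hdfs:///warehouse/events")
df.limit(10).show()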

On Thu, Jun 23, 2016 at 1:25 AM, Pete Robbins <robbin...@gmail.com> wrote:
I'm also seeing some of these same failures:

- spilling with compression *** FAILED ***
I have seen this occasionally.

- to UTC timestamp *** FAILED ***
This was fixed yesterday in branch-2.0 
(https://issues.apache.org/jira/browse/SPARK-16078).

- offset recovery *** FAILED ***
I haven't seen this for a while and thought the flaky test was fixed, but it 
popped up again in one of our builds.

StateStoreSuite:
- maintenance *** FAILED ***
Just seen this has been failing for last 2 days on one build machine (linux 
amd64)


On 23 June 2016 at 08:51, Sean Owen <so...@cloudera.com> wrote:
First pass of feedback on the RC: all the sigs, hashes, etc. are fine.
Licensing is up to date to the best of my knowledge.

I'm hitting test failures, some of which may be spurious. Just putting
them out there to see if they ring bells. This is Java 8 on Ubuntu 16.


- spilling with compression *** FAILED ***
  java.lang.Exception: Test failed with compression using codec org.apache.spark.io.SnappyCompressionCodec:
assertion failed: expected cogroup to spill, but did not
  at scala.Predef$.assert(Predef.scala:170)
  at org.apache.spark.TestUtils$.assertSpilled(TestUtils.scala:170)
  at org.apache.spark.util.collection.ExternalAppendOnlyMapSuite.org$apache$spark$util$collection$ExternalAppendOnlyMapSuite$$testSimpleSpilling(ExternalAppendOnlyMapSuite.scala:263)
...

I feel like I've seen this before, and I see some possibly relevant
fixes, but they're already in 2.0.0:
https://github.com/apache/spark/pull/10990
Is this something where a native library needs to be installed, or something like that?
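
If it is, a quick local sanity check along these lines (just a sketch of mine,
not the failing test itself; it only exercises the codec, not the spill path)
should surface a codec-loading error rather than the missing-spill assertion:

# Force a shuffle with the Snappy codec to see whether the codec itself
# loads and compresses data in this environment. This is only a rough
# environment check, not a reproduction of ExternalAppendOnlyMapSuite.
from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setMaster("local[2]")
        .setAppName("snappy-check")
        .set("spark.io.compression.codec", "snappy"))
sc = SparkContext(conf=conf)

# groupByKey forces a shuffle, and shuffle output is compressed with the
# configured codec by default, so a broken snappy install would fail here.
print(sc.parallelize([(i % 10, i) for i in range(1000)]).groupByKey().count())
sc.stop()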


- to UTC timestamp *** FAILED ***
  "2016-03-13 [02]:00:00.0" did not equal "2016-03-13 [10]:00:00.0"
(DateTimeUtilsSuite.scala:506)

I know we talked about this for the 1.6.2 RC, but I reproduced it
locally too. I will investigate; it could still be spurious.
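
One observation that may or may not be related: 2016-03-13 is the US DST
spring-forward date, and 02 vs 10 is exactly the 8-hour PST offset. A tiny
standalone sketch (it uses pytz, nothing to do with Spark's DateTimeUtils;
purely illustrative) shows why wall-clock times around 02:00 local on that
day are awkward:

# 02:00 local on 2016-03-13 does not exist in America/Los_Angeles, because
# clocks jump from 02:00 straight to 03:00 that morning.
from datetime import datetime
import pytz

la = pytz.timezone("America/Los_Angeles")
try:
    # is_dst=None makes pytz refuse ambiguous or non-existent wall-clock times
    la.localize(datetime(2016, 3, 13, 2, 0, 0), is_dst=None)
except pytz.exceptions.NonExistentTimeError:
    print("02:00 local does not exist on the 2016-03-13 spring-forward day")

Interpreting that wall-clock time as PST (-08:00) would put it at 10:00 UTC,
which matches the gap in the failure above, so it smells like a DST-boundary
issue rather than anything environment-specific. Could be wrong, though.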


StateStoreSuite:
- maintenance *** FAILED ***
  The code passed to eventually never returned normally. Attempted 627 times over 10.000180116 seconds. Last failure message: StateStoreSuite.this.fileExists(provider, 1L, false) was true earliest file not deleted. (StateStoreSuite.scala:395)

No idea.


- offset recovery *** FAILED ***
  The code passed to eventually never returned normally. Attempted 197 times over 10.040864806 seconds. Last failure message: strings.forall({
    ((x$1: Any) => DirectKafkaStreamSuite.collectedData.contains(x$1))
  }) was false. (DirectKafkaStreamSuite.scala:250)

This is also something that was possibly fixed already for 2.0.0 and that I
just back-ported into 1.6. It could be just a very similar failure.

On Wed, Jun 22, 2016 at 2:26 AM, Reynold Xin <r...@databricks.com> wrote:
> Please vote on releasing the following candidate as Apache Spark version
> 2.0.0. The vote is open until Friday, June 24, 2016 at 19:00 PDT and passes
> if a majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.0.0
> [ ] -1 Do not release this package because ...
>
>
> The tag to be voted on is v2.0.0-rc1
> (0c66ca41afade6db73c9aeddd5aed6e5dcea90df).
>
> This release candidate resolves ~2400 issues:
> https://s.apache.org/spark-2.0.0-rc1-jira
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc1-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1187/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc1-docs/
>
>
> =======================================
> == How can I help test this release? ==
> =======================================
> If you are a Spark user, you can help us test this release by taking an
> existing Spark workload and running it on this release candidate, then
> reporting any regressions from 1.x.
>
> ================================================
> == What justifies a -1 vote for this release? ==
> ================================================
> Critical bugs impacting major functionality.
>
> Bugs already present in 1.x, missing features, or bugs related to new
> features will not necessarily block this release. Note that historically
> Spark documentation has been published on the website separately from the
> main release, so we do not need to block the release due to documentation
> errors either.
>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org


