-1

This breaks existing applications that use Script Transformation in Spark 
SQL: the default record/column delimiter class changed because we no longer 
get the default conf value from HiveConf; see SPARK-16515.

This is a regression.
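For context, the affected queries look roughly like the following sketch. The table `src`, its columns, and the `cat` script are illustrative, not from the report; the point is that spelling out the delimiters with ROW FORMAT DELIMITED avoids depending on the changed defaults:

```sql
-- Hypothetical table `src` with columns (key, value).
-- Explicit delimiters sidestep the changed HiveConf-derived defaults.
SELECT TRANSFORM (key, value)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  USING 'cat'
  AS (k, v)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
FROM src;
```

Applications that omitted the ROW FORMAT clauses and relied on Hive's default record reader/writer behavior are the ones exposed to the regression.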


From: Reynold Xin [mailto:r...@databricks.com]
Sent: Friday, July 15, 2016 7:26 AM
To: dev@spark.apache.org
Subject: Re: [VOTE] Release Apache Spark 2.0.0 (RC4)

Updated documentation at 
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc4-docs-updated/



On Thu, Jul 14, 2016 at 11:59 AM, Reynold Xin 
<r...@databricks.com> wrote:
Please vote on releasing the following candidate as Apache Spark version 2.0.0. 
The vote is open until Sunday, July 17, 2016 at 12:00 PDT and passes if a 
majority of at least 3 +1 PMC votes are cast.

[ ] +1 Release this package as Apache Spark 2.0.0
[ ] -1 Do not release this package because ...


The tag to be voted on is v2.0.0-rc4 (e5f8c1117e0c48499f54d62b556bc693435afae0).

This release candidate resolves ~2500 issues: 
https://s.apache.org/spark-2.0.0-jira

The release files, including signatures, digests, etc. can be found at:
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc4-bin/

Release artifacts are signed with the following key:
https://people.apache.org/keys/committer/pwendell.asc

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1192/

The documentation corresponding to this release can be found at:
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc4-docs/


=================================
How can I help test this release?
=================================
If you are a Spark user, you can help us test this release by taking an 
existing Spark workload, running it on this release candidate, and reporting 
any regressions from 1.x.

==========================================
What justifies a -1 vote for this release?
==========================================
Critical bugs impacting major functionality.

Bugs already present in 1.x, missing features, or bugs related to new features 
will not necessarily block this release. Note that, historically, Spark 
documentation has been published on the website separately from the main 
release, so we do not need to block the release due to documentation errors 
either.


Note: There was a mistake made during "rc3" preparation, and as a result there 
is no "rc3", but only "rc4".

