Hi, Xingbo. PySpark seems to have failed to build: for pyspark there is only a `.sha512` file in the RC bin directory, with no corresponding tarball or signature.
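(Aside, for anyone checking the other artifacts by hand: below is a minimal, illustrative sketch of computing a local tarball's SHA-512 to compare against its published `.sha512` file. This is not the official ASF verification procedure, and the file name is a hypothetical local path. As the listing that follows shows, pyspark has only the `.sha512` itself, so there is nothing to verify for it yet.)

```python
# Illustrative only: stream a downloaded tarball and print its SHA-512 hex
# digest so it can be compared by eye against the published .sha512 file.
import hashlib

def sha512_hex(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in chunks so large tarballs need not fit in memory."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    # Hypothetical local file name, downloaded from v3.0.0-preview-rc1-bin/.
    print(sha512_hex("spark-3.0.0-preview.tgz"))
```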
SparkR_3.0.0-preview.tar.gz
SparkR_3.0.0-preview.tar.gz.asc
SparkR_3.0.0-preview.tar.gz.sha512
*pyspark-3.0.0.preview.tar.gz.sha512*
spark-3.0.0-preview-bin-hadoop2.7.tgz
spark-3.0.0-preview-bin-hadoop2.7.tgz.asc
spark-3.0.0-preview-bin-hadoop2.7.tgz.sha512
spark-3.0.0-preview-bin-hadoop3.2.tgz
spark-3.0.0-preview-bin-hadoop3.2.tgz.asc
spark-3.0.0-preview-bin-hadoop3.2.tgz.sha512
spark-3.0.0-preview-bin-without-hadoop.tgz
spark-3.0.0-preview-bin-without-hadoop.tgz.asc
spark-3.0.0-preview-bin-without-hadoop.tgz.sha512
spark-3.0.0-preview.tgz
spark-3.0.0-preview.tgz.asc
spark-3.0.0-preview.tgz.sha512

Bests,
Dongjoon.

On Tue, Oct 29, 2019 at 7:18 PM Xingbo Jiang <jiangxb1...@gmail.com> wrote:

> Thanks for the correction; we shall remove the statement:
>>
>> Everything else please retarget to an appropriate release.
>>
>
> Reynold Xin <r...@databricks.com> wrote on Tue, Oct 29, 2019 at 7:09 PM:
>
>> Does the description make sense? This is a preview release, so there is no
>> need to retarget versions.
>>
>> On Tue, Oct 29, 2019 at 7:01 PM Xingbo Jiang <jiangxb1...@gmail.com>
>> wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark version
>>> 3.0.0-preview.
>>>
>>> The vote is open until November 2 PST and passes if a majority of +1 PMC
>>> votes are cast, with a minimum of 3 +1 votes.
>>>
>>> [ ] +1 Release this package as Apache Spark 3.0.0-preview
>>> [ ] -1 Do not release this package because ...
>>>
>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>
>>> The tag to be voted on is v3.0.0-preview-rc1 (commit
>>> 5eddbb5f1d9789696927f435c55df887e50a1389):
>>> https://github.com/apache/spark/tree/v3.0.0-preview-rc1
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-preview-rc1-bin/
>>>
>>> Signatures used for Spark RCs can be found in this file:
>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapachespark-1334/
>>>
>>> The documentation corresponding to this release can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-preview-rc1-docs/
>>>
>>> The list of bug fixes going into 3.0.0 can be found at the following URL:
>>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>>
>>> FAQ
>>>
>>> =========================
>>> How can I help test this release?
>>> =========================
>>>
>>> If you are a Spark user, you can help us test this release by taking
>>> an existing Spark workload, running it on this release candidate, and
>>> reporting any regressions.
>>>
>>> If you're working in PySpark, you can set up a virtual env, install
>>> the current RC, and see if anything important breaks. In Java/Scala,
>>> you can add the staging repository to your project's resolvers and test
>>> with the RC (make sure to clean up the artifact cache before/after so
>>> you don't end up building with an out-of-date RC going forward).
>>>
>>> ===========================================
>>> What should happen to JIRA tickets still targeting 3.0.0?
>>> ===========================================
>>>
>>> The current list of open tickets targeted at 3.0.0 can be found at
>>> https://issues.apache.org/jira/projects/SPARK by searching for "Target
>>> Version/s" = 3.0.0.
>>>
>>> Committers should look at those and triage.
>>> Extremely important bug fixes, documentation, and API tweaks that
>>> impact compatibility should be worked on immediately. Everything else
>>> please retarget to an appropriate release.
>>>
>>> ==================
>>> But my bug isn't fixed?
>>> ==================
>>>
>>> In order to make timely releases, we will typically not hold the
>>> release unless the bug in question is a regression from the previous
>>> release. That being said, if there is something which is a regression
>>> that has not been correctly targeted, please ping me or a committer to
>>> help target the issue.
>>
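Following up on the "How can I help test this release?" section above: a minimal PySpark smoke test one might run after installing the RC into a virtual env. The install step and the toy workload are illustrative assumptions only (the pyspark tarball is exactly the artifact missing from this RC), not part of the official release or vote process.

```python
# Illustrative smoke test, assuming the preview pyspark tarball has been
# pip-installed into a fresh virtual env once it is available, e.g.:
#   pip install pyspark-3.0.0.preview.tar.gz   (hypothetical local file)
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .master("local[2]")
         .appName("rc1-smoke-test")
         .getOrCreate())

# Tiny sanity-check workload; a real test would rerun an existing workload.
df = spark.createDataFrame([(1, "a"), (2, "b"), (3, "c")], ["id", "letter"])
assert df.count() == 3
assert df.filter(df.id > 1).count() == 2

print("Tested against Spark", spark.version)  # expect a 3.0.0-preview string
spark.stop()
```

For Java/Scala, the equivalent check would be adding the staging repository (https://repository.apache.org/content/repositories/orgapachespark-1334/) as a resolver and rebuilding an existing project against the 3.0.0-preview artifacts, as described in the FAQ above.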