[ https://issues.apache.org/jira/browse/SPARK-23716?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16412423#comment-16412423 ]
Nicholas Chammas edited comment on SPARK-23716 at 3/24/18 5:13 AM:
-------------------------------------------------------------------

For my use case, there is no value in updating the Spark release code if we're not going to also update the release hashes for all prior releases, which it sounds like we don't want to do. I wrote my own code to convert the GPG-style hashes to shasum-style hashes, and that satisfies [my need|https://github.com/nchammas/flintrock/issues/238], which is focused on syncing Spark releases from the Apache archive to an S3 bucket.

Closing this as "Won't Fix".

was (Author: nchammas):

For my use case, there is no value in updating the Spark release code if we're not going to also update the release hashes for all prior releases, which it sounds like we don't want to do. I wrote my own code to convert the GPG-style hashes to shasum-style hashes, and that satisfies my need. I am syncing Spark releases from the Apache distribution archive to a personal S3 bucket and need a way to verify the integrity of the files.

> Change SHA512 style in release artifacts to play nicely with shasum utility
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-23716
>                 URL: https://issues.apache.org/jira/browse/SPARK-23716
>             Project: Spark
>          Issue Type: Improvement
>      Components: Project Infra
>    Affects Versions: 2.3.0
>            Reporter: Nicholas Chammas
>            Priority: Minor
>
> As [discussed here|http://apache-spark-developers-list.1001551.n3.nabble.com/Changing-how-we-compute-release-hashes-td23599.html].

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
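The conversion the comment describes can be sketched as follows. This is a minimal, hypothetical illustration, not the actual flintrock code: it assumes the input is in the `gpg --print-md SHA512` style (a `filename:` prefix followed by uppercase hex digit groups, possibly wrapped across lines) and produces the `<hash>  <filename>` line that `shasum -a 512 -c` expects. The function name and sample filename are made up for the example.

```python
def gpg_to_shasum(gpg_text: str) -> str:
    """Convert a GPG-style SHA512 digest line, e.g.

        spark-x.y.z-bin.tgz: 258683B2 ABCDEF01 ...
         11112222 33334444 ...

    into the shasum checkfile format: '<lowercase hash>  <filename>'.
    """
    # Split off the 'filename:' prefix that gpg --print-md emits.
    filename, _, digest_part = gpg_text.partition(":")
    # Drop all whitespace (spaces and line wraps) and lowercase the hex.
    digest = "".join(digest_part.split()).lower()
    # shasum's checkfile format uses two spaces between hash and name.
    return f"{digest}  {filename.strip()}"
```

The resulting line can be written to a `.sha512` file and verified with `shasum -a 512 -c file.sha512`, which is the workflow the original GPG-style output did not support.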