I don't think Spark would ever be distributed except through the ASF and
mainstream channels like Maven Central, but you can redistribute the bits
as-is as you like. This would be in line with the terms of the Apache
license.

On Thu, Sep 28, 2017 at 6:17 AM Marco Vermeulen <vermeulen...@gmail.com>
wrote:

> Hi all,
>
>
>
> My name is Marco and I am the project lead of SDKMAN. For those of you who
> are not familiar with the project, it is a FLOSS SDK management tool which
> allows you to install and switch seamlessly between multiple versions of
> the same SDK when using UNIX shells. You can read more about it on our
> website[1].
>
>
>
> I’ve started using Spark myself on a project and was thinking that it
> would be a very good candidate to be hosted on SDKMAN. This becomes
> especially apparent when needing to switch between versions of Spark while
> developing.
>
>
>
> The reason I’m writing here is that our tool has an API that allows SDK
> providers to push their own releases to our service. We don’t host the
> actual binaries; the API merely lets our tool point to your new release
> archives and allows for super easy installation. This can be done with a
> few simple REST calls as part of your release process, or automated by
> using our Maven release plugin.
>
>
>
> Would the Spark dev community be open to something like this? A recent
> poll on Twitter shows a good appetite for Spark on SDKMAN by our users[2].
> Also, we already have many teams pushing to us in this manner including
> Groovy, Kotlin, Ceylon, OpenJDK, Gradle, and SBT, to name a few. Having
> included would be really great.
>
>
>
> Apologies in advance if this is not the correct forum for release-related
> posts.
>
> Many thanks,
>
> Marco.
>
>
>
> [1] http://sdkman.io
>
> [2] https://twitter.com/sdkman_/status/907698363877003264
>
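The "push a release" flow Marco describes can be sketched as a couple of dry-run curl commands. Note that the host, endpoint paths, header names, and JSON fields below are assumptions for illustration only, not SDKMAN's documented vendor API:

```shell
#!/bin/sh
# Dry-run sketch of notifying an SDK manager about a new release.
# ASSUMPTIONS: the API host, the /release and /default paths, and the
# Consumer-Key / Consumer-Token headers are hypothetical placeholders.

API="https://vendors.example-sdkman.io"   # hypothetical vendor endpoint
VERSION="2.2.0"

# 1. Announce the new Spark release and point the tool at the archive URL.
release_payload="{\"candidate\": \"spark\", \"version\": \"$VERSION\", \"url\": \"https://archive.apache.org/dist/spark/spark-$VERSION/spark-$VERSION-bin-hadoop2.7.tgz\"}"
echo "curl -X POST -H 'Consumer-Key: \$KEY' -H 'Consumer-Token: \$TOKEN' -d '$release_payload' $API/release"

# 2. Optionally mark this version as the default for fresh installs.
default_payload="{\"candidate\": \"spark\", \"version\": \"$VERSION\"}"
echo "curl -X PUT -H 'Consumer-Key: \$KEY' -H 'Consumer-Token: \$TOKEN' -d '$default_payload' $API/default"
```

Since SDKMAN does not host binaries, the first call only registers metadata (candidate name, version, archive URL); the actual download still comes from the project's own mirrors.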
