On Thu, Mar 17, 2016 at 12:01 PM, Cody Koeninger <c...@koeninger.org> wrote:
> i.  An ASF project can clearly decide that some of its code is no
> longer worth maintaining and delete it.  This isn't really any
> different. It's still apache licensed so ultimately whoever wants the
> code can get it.

Absolutely. But I don't remember this being discussed either way. Was
the intention, as you mention later, just to decouple the release of
those components from the main Spark release, or to completely disown
that code?

If the latter, is the ASF ok with the code still retaining the current
package and artifact names? Changing those would break backwards
compatibility, which is why I believe that keeping them as a
sub-project, even if their release cadence is much slower, would be a
better solution for both developers and users.

> ii.  I think part of the rationale is to not tie release management to
> Spark, so it can proceed on a schedule that makes sense.  I'm fine
> with helping out with release management for the Kafka subproject, for
> instance.  I agree that practical governance questions need to be
> worked out.
>
> iii.  How is this any different from how python users get access to
> any other third party Spark package?

True, but that requires the modules to be published somewhere, not
just to live as a bunch of .py files in a GitHub repo. Basically, I'm
worried that there's work to be done to keep those modules working in
this new environment: figuring out how to build, test, and publish
them, and removing potential uses of internal Spark APIs, to name a
few things.
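(As a rough sketch of what "publishing somewhere" might involve, assuming these modules were repackaged as ordinary Python distributions: all names below are hypothetical, not real artifacts, but the mechanics would be a minimal pyproject.toml plus the standard build tools.)

```toml
# Hypothetical packaging metadata for an externalized connector module.
# The project name, version, and description are illustrative only.
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "spark-streaming-kafka-python"   # hypothetical package name
version = "0.1.0"
description = "Python bindings for an externalized Spark connector (illustrative)"
requires-python = ">=3.7"
```

A release could then be built and uploaded with the standard `python -m build` and `twine upload dist/*` commands; the harder work noted above (CI setup and avoiding internal Spark APIs) sits on top of that.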

-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
