Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-07 Thread Shivaram Venkataraman
As far as I know the process is just to copy docs/_site from the build
to the appropriate location in the SVN repo (i.e.
site/docs/2.0.0-preview).
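Roughly something like this, as a sketch (the SVN URL and local paths here are
assumptions, so verify them against the release process notes before committing):

    # check out the website repo (URL assumed, not verified)
    svn co https://svn.apache.org/repos/asf/spark/site spark-site
    # copy the Jekyll output from the Spark build into the site tree
    cp -r spark/docs/_site spark-site/docs/2.0.0-preview
    cd spark-site
    svn add docs/2.0.0-preview
    svn commit -m "Add Spark 2.0.0-preview docs"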

Thanks
Shivaram

On Tue, Jun 7, 2016 at 8:14 AM, Sean Owen  wrote:
> As a stop-gap, I can edit that page to have a small section about
> preview releases and point to the nightly docs.
>
> Not sure who has the power to push 2.0.0-preview to site/docs, but, if
> that's done then we can symlink "preview" in that dir to it and be
> done, and update this section about preview docs accordingly.
>
> On Tue, Jun 7, 2016 at 4:10 PM, Tom Graves  wrote:
>> Thanks Sean, you were right, hard refresh made it show up.
>>
>> Seems like we should at least link to the preview docs from
>> http://spark.apache.org/documentation.html.
>>
>> Tom
>>
>>
>> On Tuesday, June 7, 2016 10:04 AM, Sean Owen  wrote:
>>
>>
>> It's there (refresh maybe?). See the end of the downloads dropdown.
>>
>> For the moment you can see the docs in the nightly docs build:
>> https://home.apache.org/~pwendell/spark-nightly/spark-branch-2.0-docs/latest/
>>
>> I don't know, what's the best way to put this into the main site?
>> under a /preview root? I am not sure how that process works.
>>
>> On Tue, Jun 7, 2016 at 4:01 PM, Tom Graves  wrote:
>>> I just checked and I don't see the 2.0 preview release at all anymore on
>>> http://spark.apache.org/downloads.html. Is it in transition? The only
>>> place I can see it is at
>>> http://spark.apache.org/news/spark-2.0.0-preview.html
>>>
>>>
>>> I would like to see docs there too.  My opinion is it should be as easy to
>>> use/try out as any other Spark release.
>>>
>>> Tom
>>
>>>
>>
>




Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-07 Thread Tom Graves
Thanks Sean, you were right, hard refresh made it show up.
Seems like we should at least link to the preview docs from 
http://spark.apache.org/documentation.html.
Tom 

On Tuesday, June 7, 2016 10:04 AM, Sean Owen  wrote:
 

 It's there (refresh maybe?). See the end of the downloads dropdown.

For the moment you can see the docs in the nightly docs build:
https://home.apache.org/~pwendell/spark-nightly/spark-branch-2.0-docs/latest/

I don't know, what's the best way to put this into the main site?
under a /preview root? I am not sure how that process works.

On Tue, Jun 7, 2016 at 4:01 PM, Tom Graves  wrote:
> I just checked and I don't see the 2.0 preview release at all anymore on
> http://spark.apache.org/downloads.html. Is it in transition? The only
> place I can see it is at
> http://spark.apache.org/news/spark-2.0.0-preview.html
>
>
> I would like to see docs there too.  My opinion is it should be as easy to
> use/try out as any other Spark release.
>
> Tom
>




  

Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-07 Thread Sean Owen
As a stop-gap, I can edit that page to have a small section about
preview releases and point to the nightly docs.

Not sure who has the power to push 2.0.0-preview to site/docs, but, if
that's done then we can symlink "preview" in that dir to it and be
done, and update this section about preview docs accordingly.
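For the symlink, something like the following should work (a sketch, assuming a
checkout of the website SVN repo with the 2.0.0-preview docs already under docs/):

    cd site/docs
    ln -s 2.0.0-preview preview
    svn add preview    # Subversion stores the symlink via the svn:special property
    svn commit -m "Point docs/preview at the 2.0.0-preview docs"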

On Tue, Jun 7, 2016 at 4:10 PM, Tom Graves  wrote:
> Thanks Sean, you were right, hard refresh made it show up.
>
> Seems like we should at least link to the preview docs from
> http://spark.apache.org/documentation.html.
>
> Tom
>
>
> On Tuesday, June 7, 2016 10:04 AM, Sean Owen  wrote:
>
>
> It's there (refresh maybe?). See the end of the downloads dropdown.
>
> For the moment you can see the docs in the nightly docs build:
> https://home.apache.org/~pwendell/spark-nightly/spark-branch-2.0-docs/latest/
>
> I don't know, what's the best way to put this into the main site?
> under a /preview root? I am not sure how that process works.
>
> On Tue, Jun 7, 2016 at 4:01 PM, Tom Graves  wrote:
>> I just checked and I don't see the 2.0 preview release at all anymore on
>> http://spark.apache.org/downloads.html. Is it in transition? The only
>> place I can see it is at
>> http://spark.apache.org/news/spark-2.0.0-preview.html
>>
>>
>> I would like to see docs there too.  My opinion is it should be as easy to
>> use/try out as any other Spark release.
>>
>> Tom
>
>>
>




Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-07 Thread Sean Owen
It's there (refresh maybe?). See the end of the downloads dropdown.

For the moment you can see the docs in the nightly docs build:
https://home.apache.org/~pwendell/spark-nightly/spark-branch-2.0-docs/latest/

I don't know, what's the best way to put this into the main site?
under a /preview root? I am not sure how that process works.

On Tue, Jun 7, 2016 at 4:01 PM, Tom Graves  wrote:
> I just checked and I don't see the 2.0 preview release at all anymore on
> http://spark.apache.org/downloads.html. Is it in transition? The only
> place I can see it is at
> http://spark.apache.org/news/spark-2.0.0-preview.html
>
>
> I would like to see docs there too.  My opinion is it should be as easy to
> use/try out as any other Spark release.
>
> Tom
>




Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-06 Thread Imran Rashid
I've been a bit on the fence on this, but I agree that Luciano makes a
compelling case for why we really should publish things to Maven
Central.  Sure, we slightly increase the risk that somebody refers to the preview
release too late, but really that is their own fault.

And I also agree with comments from Sean and Mark that this is *not* a
"Databricks vs. The World" scenario at all.

On Mon, Jun 6, 2016 at 2:13 PM, Luciano Resende 
wrote:

>
>
> On Mon, Jun 6, 2016 at 12:05 PM, Reynold Xin  wrote:
>
>> The bahir one was a good argument actually. I just clicked the button to
>> push it into Maven central.
>>
>>
> Thank You !!!
>
>


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-06 Thread Luciano Resende
On Mon, Jun 6, 2016 at 12:05 PM, Reynold Xin  wrote:

> The bahir one was a good argument actually. I just clicked the button to
> push it into Maven central.
>
>
Thank You !!!


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-06 Thread Reynold Xin
The bahir one was a good argument actually. I just clicked the button to
push it into Maven central.


On Mon, Jun 6, 2016 at 12:00 PM, Mark Hamstra 
wrote:

> Fine.  I don't feel strongly enough about it to continue to argue against
> putting the artifacts on Maven Central.
>
> On Mon, Jun 6, 2016 at 11:48 AM, Sean Owen  wrote:
>
>> Artifacts can't be removed from Maven in any normal circumstance, but,
>> it's no problem.
>>
>> The argument that people might keep using it goes for any older
>> release. Why would anyone use 1.6.0 when 1.6.1 exists? yet we keep
>> 1.6.0 just for the record and to not break builds. It may be that
>> Foobar 3.0-beta depends on 2.0.0-preview and 3.0 will shortly depend
>> on 2.0.0, but, killing the -preview artifact breaks that other
>> historical release/branch.
>>
>> I agree that "-alpha-1" would have been better. But we're talking
>> about working around pretty bone-headed behavior, to not notice what
>> version of Spark they build against, or not understand what
>> 2.0.0-preview vs 2.0.0 means in a world of semver.
>>
>> BTW Maven sorts 2.0.0-preview before 2.0.0, so 2.0.0 would show up as
>> the latest, when released, in tools like mvn
>> versions:display-dependency-updates. You could exclude the preview
>> release by requiring version [2.0.0,).
>>
>> On Mon, Jun 6, 2016 at 7:19 PM, Mark Hamstra 
>> wrote:
>> > Precisely because the naming of the preview artifacts has to fall
>> outside of
>> > the normal versioning, I can easily see incautious Maven users a few
>> months
>> > from now mistaking the preview artifacts as spark-2.0-something-special
>> > instead of spark-2.0-something-stale.
>>


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-06 Thread Sean Owen
Artifacts can't be removed from Maven in any normal circumstance, but,
it's no problem.

The argument that people might keep using it goes for any older
release. Why would anyone use 1.6.0 when 1.6.1 exists? yet we keep
1.6.0 just for the record and to not break builds. It may be that
Foobar 3.0-beta depends on 2.0.0-preview and 3.0 will shortly depend
on 2.0.0, but, killing the -preview artifact breaks that other
historical release/branch.

I agree that "-alpha-1" would have been better. But we're talking
about working around pretty bone-headed behavior, to not notice what
version of Spark they build against, or not understand what
2.0.0-preview vs 2.0.0 means in a world of semver.

BTW Maven sorts 2.0.0-preview before 2.0.0, so 2.0.0 would show up as
the latest, when released, in tools like mvn
versions:display-dependency-updates. You could exclude the preview
release by requiring version [2.0.0,).
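In pom.xml terms, that suggestion would look something like this (a sketch; the
artifact is only an example, and whether the range really filters out the preview
rests on the version ordering described above):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <!-- open-ended range starting at 2.0.0; per the sorting above,
           2.0.0-preview falls outside it -->
      <version>[2.0.0,)</version>
    </dependency>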

On Mon, Jun 6, 2016 at 7:19 PM, Mark Hamstra  wrote:
> Precisely because the naming of the preview artifacts has to fall outside of
> the normal versioning, I can easily see incautious Maven users a few months
> from now mistaking the preview artifacts as spark-2.0-something-special
> instead of spark-2.0-something-stale.




Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-06 Thread Ovidiu-Cristian MARCU
+1 for moving this discussion to a proactive new (alpha/beta) release of Apache 
Spark 2.0!

> On 06 Jun 2016, at 20:25, Ovidiu Cristian Marcu  wrote:
> 
> Any chance to start preparing a new alpha/beta release for 2.0 this month, or
> will the preview be pushed to Maven and considered an alpha?
> 
> On Jun 6, 2016, at 20:12, Matei Zaharia  wrote:
> Is there any way to remove artifacts from Maven Central? Maybe that would 
> help clean these things up long-term, though it would create problems for 
> users who for some reason decide to rely on these previews. 
> 
> In any case, if people are *really* concerned about this, we should just put 
> it there. My thought was that it's better for users to do something special 
> to link to this release (e.g. add a reference to the staging repo) so that 
> they are more likely to know that it's a special, unstable thing. Same thing 
> they do to use snapshots. 
> 
> Matei
> 
> On Mon, Jun 6, 2016 at 10:49 AM, Luciano Resende  wrote:
> 
> 
> On Mon, Jun 6, 2016 at 10:08 AM, Mark Hamstra  wrote:
> I still don't know where this "severely compromised builds of limited 
> usefulness" thing comes from? what's so bad? You didn't veto its 
> release, after all.
> 
> I simply mean that it was released with the knowledge that there are still 
> significant bugs in the preview that definitely would warrant a veto if this 
> were intended to be on a par with other releases.  There have been repeated 
> announcements to that effect, but developers finding the preview artifacts on 
> Maven Central months from now may well not also see those announcements and 
> related discussion.  The artifacts will be very stale and no longer useful 
> for their limited testing purpose, but will persist in the repository.  
> 
> 
> A few months from now, why would a developer choose a preview, alpha, beta 
> compared to the GA 2.0 release ? 
> 
> As for the being stale part, this is true for every release anyone put out 
> there. 
> 
> 
> -- 
> Luciano Resende 
> http://twitter.com/lresende1975  
> http://lresende.blogspot.com/ 



Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-06 Thread Luciano Resende
On Mon, Jun 6, 2016 at 11:12 AM, Matei Zaharia 
wrote:

> Is there any way to remove artifacts from Maven Central? Maybe that would
> help clean these things up long-term, though it would create problems for
> users who for some reason decide to rely on these previews.
>
> In any case, if people are *really* concerned about this, we should just
> put it there. My thought was that it's better for users to do something
> special to link to this release (e.g. add a reference to the staging repo)
> so that they are more likely to know that it's a special, unstable thing.
> Same thing they do to use snapshots.
>
> Matei
>
>
So, consider this thread started on another project :
https://www.mail-archive.com/dev@bahir.apache.org/msg00038.html

What would be your recommendation?
   - Start a release based on the Apache Spark 2.0.0-preview staging repo? I
would reject that...
   - Start a release on a set of artifacts that are going to be deleted? I
would also reject that.

To me, if companies are using the release on their products, and other
projects are relying on the release to provide a way for users to test,
this should be considered as any other release, published permanently,
which at some point will become obsolete and users will move on to more
stable releases.

Thanks



-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-06 Thread Matei Zaharia
Is there any way to remove artifacts from Maven Central? Maybe that would
help clean these things up long-term, though it would create problems for
users who for some reason decide to rely on these previews.

In any case, if people are *really* concerned about this, we should just
put it there. My thought was that it's better for users to do something
special to link to this release (e.g. add a reference to the staging repo)
so that they are more likely to know that it's a special, unstable thing.
Same thing they do to use snapshots.

Matei

On Mon, Jun 6, 2016 at 10:49 AM, Luciano Resende 
wrote:

>
>
> On Mon, Jun 6, 2016 at 10:08 AM, Mark Hamstra 
> wrote:
>
>> I still don't know where this "severely compromised builds of limited
>>> usefulness" thing comes from? what's so bad? You didn't veto its
>>> release, after all.
>>
>>
>> I simply mean that it was released with the knowledge that there are
>> still significant bugs in the preview that definitely would warrant a veto
>> if this were intended to be on a par with other releases.  There have been
>> repeated announcements to that effect, but developers finding the preview
>> artifacts on Maven Central months from now may well not also see those
>> announcements and related discussion.  The artifacts will be very stale and
>> no longer useful for their limited testing purpose, but will persist in the
>> repository.
>>
>>
> A few months from now, why would a developer choose a preview, alpha, beta
> compared to the GA 2.0 release ?
>
> As for the being stale part, this is true for every release anyone put out
> there.
>
>
> --
> Luciano Resende
> http://twitter.com/lresende1975
> http://lresende.blogspot.com/
>


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-06 Thread Luciano Resende
On Mon, Jun 6, 2016 at 10:08 AM, Mark Hamstra 
wrote:

> I still don't know where this "severely compromised builds of limited
>> usefulness" thing comes from? what's so bad? You didn't veto its
>> release, after all.
>
>
> I simply mean that it was released with the knowledge that there are still
> significant bugs in the preview that definitely would warrant a veto if
> this were intended to be on a par with other releases.  There have been
> repeated announcements to that effect, but developers finding the preview
> artifacts on Maven Central months from now may well not also see those
> announcements and related discussion.  The artifacts will be very stale and
> no longer useful for their limited testing purpose, but will persist in the
> repository.
>
>
A few months from now, why would a developer choose a preview, alpha, beta
compared to the GA 2.0 release ?

As for the being stale part, this is true for every release anyone put out
there.


-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-06 Thread Luciano Resende
On Mon, Jun 6, 2016 at 9:51 AM, Sean Owen  wrote:

> I still don't know where this "severely compromised builds of limited
> usefulness" thing comes from? what's so bad? You didn't veto its
> release, after all. And rightly so: a release doesn't mean "definitely
> works"; it means it was created the right way. It's OK to say it's
> buggy alpha software; this isn't an argument to not really release it.
>
> But aside from that: if it should be used by someone, then who did you
> have in mind?
>
> It would be coherent at least to decide not to make alpha-like
> release, but, we agreed to, which is why this argument sort of
> surprises me.
>
> I share some concerns about piling on Databricks. Nothing here is by
> nature about an organization. However, this release really began in
> response to a thread (which not everyone here can see) about
> Databricks releasing a "2.0.0 preview" option in their product before
> it existed. I presume employees of that company sort of endorse this,
> which has put this same release into the hands of not just developers
> or admins but end users -- even with caveats and warnings.
>
> (And I think that's right!)
>
>

In this case, I would only expect the 2.0.0 preview to be treated as just
any other release, period.


-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-06 Thread Mark Hamstra
>
> I still don't know where this "severely compromised builds of limited
> usefulness" thing comes from? what's so bad? You didn't veto its
> release, after all.


I simply mean that it was released with the knowledge that there are still
significant bugs in the preview that definitely would warrant a veto if
this were intended to be on a par with other releases.  There have been
repeated announcements to that effect, but developers finding the preview
artifacts on Maven Central months from now may well not also see those
announcements and related discussion.  The artifacts will be very stale and
no longer useful for their limited testing purpose, but will persist in the
repository.

On Mon, Jun 6, 2016 at 9:51 AM, Sean Owen  wrote:

> I still don't know where this "severely compromised builds of limited
> usefulness" thing comes from? what's so bad? You didn't veto its
> release, after all. And rightly so: a release doesn't mean "definitely
> works"; it means it was created the right way. It's OK to say it's
> buggy alpha software; this isn't an argument to not really release it.
>
> But aside from that: if it should be used by someone, then who did you
> have in mind?
>
> It would be coherent at least to decide not to make alpha-like
> release, but, we agreed to, which is why this argument sort of
> surprises me.
>
> I share some concerns about piling on Databricks. Nothing here is by
> nature about an organization. However, this release really began in
> response to a thread (which not everyone here can see) about
> Databricks releasing a "2.0.0 preview" option in their product before
> it existed. I presume employees of that company sort of endorse this,
> which has put this same release into the hands of not just developers
> or admins but end users -- even with caveats and warnings.
>
> (And I think that's right!)
>
> While I'd like to see your reasons before I'd agree with you Mark,
> yours is a feasible position; I'm not as sure how people who work for
> Databricks can argue at the same time however that this should be
> carefully guarded as an ASF release -- even with caveats and warnings.
>
> We don't need to assume bad faith -- I don't. The appearance alone is
> enough to act to make this consistent.
>
> But, I think the resolution is simple: it's not 'dangerous' to release
> this and I don't think people who say they think this really do. So
> just finish this release normally, and we're done. Even if you think
> there's an argument against it, weigh vs the problems above.
>
>
> On Mon, Jun 6, 2016 at 4:00 PM, Mark Hamstra 
> wrote:
> > This is not a Databricks vs. The World situation, and the fact that some
> > persist in forcing every issue into that frame is getting annoying.
> There
> > are good engineering and project-management reasons not to populate the
> > long-term, canonical repository of Maven artifacts with what are known
> to be
> > severely compromised builds of limited usefulness, particularly over
> time.
> > It is a legitimate dispute over whether these preview artifacts should be
> > deployed to Maven Central, not one that must be seen as Databricks
> seeking
> > improper advantage.
> >
>


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-06 Thread Sean Owen
I still don't know where this "severely compromised builds of limited
usefulness" thing comes from? what's so bad? You didn't veto its
release, after all. And rightly so: a release doesn't mean "definitely
works"; it means it was created the right way. It's OK to say it's
buggy alpha software; this isn't an argument to not really release it.

But aside from that: if it should be used by someone, then who did you
have in mind?

It would be coherent at least to decide not to make alpha-like
release, but, we agreed to, which is why this argument sort of
surprises me.

I share some concerns about piling on Databricks. Nothing here is by
nature about an organization. However, this release really began in
response to a thread (which not everyone here can see) about
Databricks releasing a "2.0.0 preview" option in their product before
it existed. I presume employees of that company sort of endorse this,
which has put this same release into the hands of not just developers
or admins but end users -- even with caveats and warnings.

(And I think that's right!)

While I'd like to see your reasons before I'd agree with you Mark,
yours is a feasible position; I'm not as sure how people who work for
Databricks can argue at the same time however that this should be
carefully guarded as an ASF release -- even with caveats and warnings.

We don't need to assume bad faith -- I don't. The appearance alone is
enough to act to make this consistent.

But, I think the resolution is simple: it's not 'dangerous' to release
this and I don't think people who say they think this really do. So
just finish this release normally, and we're done. Even if you think
there's an argument against it, weigh vs the problems above.


On Mon, Jun 6, 2016 at 4:00 PM, Mark Hamstra  wrote:
> This is not a Databricks vs. The World situation, and the fact that some
> persist in forcing every issue into that frame is getting annoying.  There
> are good engineering and project-management reasons not to populate the
> long-term, canonical repository of Maven artifacts with what are known to be
> severely compromised builds of limited usefulness, particularly over time.
> It is a legitimate dispute over whether these preview artifacts should be
> deployed to Maven Central, not one that must be seen as Databricks seeking
> improper advantage.
>




Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-06 Thread Nicholas Chammas
+1 to what Mark said. I've been following this discussion and I don't
understand where the sudden "Databricks vs. everybody else" narrative came
from.

On Mon, Jun 6, 2016 at 11:00 AM Mark Hamstra 
wrote:

> This is not a Databricks vs. The World situation, and the fact that some
> persist in forcing every issue into that frame is getting annoying.  There
> are good engineering and project-management reasons not to populate the
> long-term, canonical repository of Maven artifacts with what are known to
> be severely compromised builds of limited usefulness, particularly over
> time.  It is a legitimate dispute over whether these preview artifacts
> should be deployed to Maven Central, not one that must be seen as
> Databricks seeking improper advantage.
>
> On Mon, Jun 6, 2016 at 5:34 AM, Shane Curcuru 
> wrote:
>
>>
>>
>> On 2016-06-04 18:42 (-0400), Sean Owen  wrote:
>> ...
>> > The question is, can you just not fully release it? I don't think so,
>> > even as a matter of process, and don't see a good reason not to.
>> >
>> > To Reynold's quote, I think that's suggesting that not all projects
>> > will release to a repo at all (e.g. OpenOffice?). I don't think it
>> > means you're free to not release some things to Maven, if that's
>> > appropriate and common for the type of project.
>> >
>> > Regarding risk, remember that the audience for Maven artifacts are
>> > developers, not admins or end users. I understand that developers can
>> > temporarily change their build to use a different resolver if they
>> > care, but, why? (and, where would someone figure this out?)
>> >
>> > Regardless: the 2.0.0-preview docs aren't published to go along with
>> > the source/binary releases. Those need be released to the project
>> > site, though probably under a different /preview/ path or something.
>> > If they are, is it weird that someone wouldn't find the release in the
>> > usual place in Maven then?
>> >
>> > Given that the driver of this was concern over wide access to
>> > 2.0.0-preview, I think it's best to err on the side openness vs some
>> > theoretical problem.
>>
>> The mere fact that there continues to be repeated pushback from PMC
>> members employed by DataBricks to such a reasonable and easy question to
>> answer and take action on for the benefit of all the project's users
>> raises red flags for me.
>>
>> Immaterial of the actual motivations of individual PMC members, this
>> still gives the *appearance* that DataBricks as an organization
>> effectively exercises a more than healthy amount of control over how the
>> project operates in simple, day-to-day manners.
>>
>> I strongly urge everyone participating in Apache Spark development to
>> read and take to heart this required policy for Apache projects:
>>
>>   http://community.apache.org/projectIndependence
>>
>> - Shane, speaking as an individual
>>
>> (If I were speaking in other roles I hold, I wouldn't be as polite)
>>


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-06 Thread Mark Hamstra
This is not a Databricks vs. The World situation, and the fact that some
persist in forcing every issue into that frame is getting annoying.  There
are good engineering and project-management reasons not to populate the
long-term, canonical repository of Maven artifacts with what are known to
be severely compromised builds of limited usefulness, particularly over
time.  It is a legitimate dispute over whether these preview artifacts
should be deployed to Maven Central, not one that must be seen as
Databricks seeking improper advantage.

On Mon, Jun 6, 2016 at 5:34 AM, Shane Curcuru  wrote:

>
>
> On 2016-06-04 18:42 (-0400), Sean Owen  wrote:
> ...
> > The question is, can you just not fully release it? I don't think so,
> > even as a matter of process, and don't see a good reason not to.
> >
> > To Reynold's quote, I think that's suggesting that not all projects
> > will release to a repo at all (e.g. OpenOffice?). I don't think it
> > means you're free to not release some things to Maven, if that's
> > appropriate and common for the type of project.
> >
> > Regarding risk, remember that the audience for Maven artifacts are
> > developers, not admins or end users. I understand that developers can
> > temporarily change their build to use a different resolver if they
> > care, but, why? (and, where would someone figure this out?)
> >
> > Regardless: the 2.0.0-preview docs aren't published to go along with
> > the source/binary releases. Those need be released to the project
> > site, though probably under a different /preview/ path or something.
> > If they are, is it weird that someone wouldn't find the release in the
> > usual place in Maven then?
> >
> > Given that the driver of this was concern over wide access to
> > 2.0.0-preview, I think it's best to err on the side openness vs some
> > theoretical problem.
>
> The mere fact that there continues to be repeated pushback from PMC
> members employed by DataBricks to such a reasonable and easy question to
> answer and take action on for the benefit of all the project's users
> raises red flags for me.
>
> Immaterial of the actual motivations of individual PMC members, this
> still gives the *appearance* that DataBricks as an organization
> effectively exercises a more than healthy amount of control over how the
> project operates in simple, day-to-day manners.
>
> I strongly urge everyone participating in Apache Spark development to
> read and take to heart this required policy for Apache projects:
>
>   http://community.apache.org/projectIndependence
>
> - Shane, speaking as an individual
>
> (If I were speaking in other roles I hold, I wouldn't be as polite)
>


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-05 Thread Marcin Tustin
+1, agree that right now the problem is theoretical, esp. if the preview label is
in the version coordinates, as it should be.

On Saturday, June 4, 2016, Sean Owen  wrote:

> Artifacts that are not for public consumption shouldn't be in a public
> release; this is instead what nightlies are for. However, this was a
> normal public release.
>
> I am not even sure why it's viewed as particularly unsafe, but, unsafe
> alpha and beta releases are just releases, and their name and
> documentation clarify their status for those who care. These are
> regularly released by other projects.
>
> That is, the question is not, is this a beta? Everyone agrees it
> probably is, and is documented as such.
>
> The question is, can you just not fully release it? I don't think so,
> even as a matter of process, and don't see a good reason not to.
>
> To Reynold's quote, I think that's suggesting that not all projects
> will release to a repo at all (e.g. OpenOffice?). I don't think it
> means you're free to not release some things to Maven, if that's
> appropriate and common for the type of project.
>
> Regarding risk, remember that the audience for Maven artifacts are
> developers, not admins or end users. I understand that developers can
> temporarily change their build to use a different resolver if they
> care, but, why? (and, where would someone figure this out?)
>
> Regardless: the 2.0.0-preview docs aren't published to go along with
> the source/binary releases. Those need be released to the project
> site, though probably under a different /preview/ path or something.
> If they are, is it weird that someone wouldn't find the release in the
> usual place in Maven then?
>
> Given that the driver of this was concern over wide access to
> 2.0.0-preview, I think it's best to err on the side openness vs some
> theoretical problem.
>
> On Sat, Jun 4, 2016 at 11:24 PM, Matei Zaharia  wrote:
> > Personally I'd just put them on the staging repo and link to that on the
> > downloads page. It will create less confusion for people browsing Maven
> > Central later and wondering which releases are safe to use.
> >
> > Matei
> >
> > On Jun 3, 2016, at 8:22 AM, Mark Hamstra  wrote:
> >
> > It's not a question of whether the preview artifacts can be made
> available
> > on Maven central, but rather whether they must be or should be.  I've
> got no
> > problems leaving these unstable, transitory artifacts out of the more
> > permanent, canonical repository.
>



Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-05 Thread Ovidiu-Cristian MARCU
Hi all

IMHO the preview ‘release’ is good as it is now, so no further changes are required.
For me the preview was a trigger for what will be the next Spark 2.0; I really
appreciate the effort the team made to describe it and market it :)

I’d appreciate it if the Apache Spark team would start a vote for a new alpha/beta
release and point out the current status of the project. Since the preview was
released there have been numerous updates.

Best,
Ovidiu 
 
> On 05 Jun 2016, at 00:42, Sean Owen  wrote:
> 
> Artifacts that are not for public consumption shouldn't be in a public
> release; this is instead what nightlies are for. However, this was a
> normal public release.
> 
> I am not even sure why it's viewed as particularly unsafe, but, unsafe
> alpha and beta releases are just releases, and their name and
> documentation clarify their status for those who care. These are
> regularly released by other projects.
> 
> That is, the question is not, is this a beta? Everyone agrees it
> probably is, and is documented as such.
> 
> The question is, can you just not fully release it? I don't think so,
> even as a matter of process, and don't see a good reason not to.
> 
> To Reynold's quote, I think that's suggesting that not all projects
> will release to a repo at all (e.g. OpenOffice?). I don't think it
> means you're free to not release some things to Maven, if that's
> appropriate and common for the type of project.
> 
> Regarding risk, remember that the audience for Maven artifacts are
> developers, not admins or end users. I understand that developers can
> temporarily change their build to use a different resolver if they
> care, but, why? (and, where would someone figure this out?)
> 
> Regardless: the 2.0.0-preview docs aren't published to go along with
> the source/binary releases. Those need be released to the project
> site, though probably under a different /preview/ path or something.
> If they are, is it weird that someone wouldn't find the release in the
> usual place in Maven then?
> 
> Given that the driver of this was concern over wide access to
> 2.0.0-preview, I think it's best to err on the side openness vs some
> theoretical problem.
> 
> On Sat, Jun 4, 2016 at 11:24 PM, Matei Zaharia  
> wrote:
>> Personally I'd just put them on the staging repo and link to that on the
>> downloads page. It will create less confusion for people browsing Maven
>> Central later and wondering which releases are safe to use.
>> 
>> Matei
>> 
>> On Jun 3, 2016, at 8:22 AM, Mark Hamstra  wrote:
>> 
>> It's not a question of whether the preview artifacts can be made available
>> on Maven central, but rather whether they must be or should be.  I've got no
>> problems leaving these unstable, transitory artifacts out of the more
>> permanent, canonical repository.
> 





Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-04 Thread Sean Owen
Artifacts that are not for public consumption shouldn't be in a public
release; this is instead what nightlies are for. However, this was a
normal public release.

I am not even sure why it's viewed as particularly unsafe, but, unsafe
alpha and beta releases are just releases, and their name and
documentation clarify their status for those who care. These are
regularly released by other projects.

That is, the question is not, is this a beta? Everyone agrees it
probably is, and is documented as such.

The question is, can you just not fully release it? I don't think so,
even as a matter of process, and don't see a good reason not to.

To Reynold's quote, I think that's suggesting that not all projects
will release to a repo at all (e.g. OpenOffice?). I don't think it
means you're free to not release some things to Maven, if that's
appropriate and common for the type of project.

Regarding risk, remember that the audience for Maven artifacts are
developers, not admins or end users. I understand that developers can
temporarily change their build to use a different resolver if they
care, but, why? (and, where would someone figure this out?)

Regardless: the 2.0.0-preview docs aren't published to go along with
the source/binary releases. Those need be released to the project
site, though probably under a different /preview/ path or something.
If they are, is it weird that someone wouldn't find the release in the
usual place in Maven then?

Given that the driver of this was concern over wide access to
2.0.0-preview, I think it's best to err on the side openness vs some
theoretical problem.

On Sat, Jun 4, 2016 at 11:24 PM, Matei Zaharia  wrote:
> Personally I'd just put them on the staging repo and link to that on the
> downloads page. It will create less confusion for people browsing Maven
> Central later and wondering which releases are safe to use.
>
> Matei
>
> On Jun 3, 2016, at 8:22 AM, Mark Hamstra  wrote:
>
> It's not a question of whether the preview artifacts can be made available
> on Maven central, but rather whether they must be or should be.  I've got no
> problems leaving these unstable, transitory artifacts out of the more
> permanent, canonical repository.




Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-04 Thread Matei Zaharia
Personally I'd just put them on the staging repo and link to that on the 
downloads page. It will create less confusion for people browsing Maven Central 
later and wondering which releases are safe to use.

Matei

> On Jun 3, 2016, at 8:22 AM, Mark Hamstra  wrote:
> 
> It's not a question of whether the preview artifacts can be made available on 
> Maven central, but rather whether they must be or should be.  I've got no 
> problems leaving these unstable, transitory artifacts out of the more 
> permanent, canonical repository.
> 
> On Fri, Jun 3, 2016 at 1:53 AM, Steve Loughran  wrote:
> 
> It's been voted on by the project, so can go up on central
> 
> There's already some JIRAs being filed against it, this is a metric of 
> success as pre-beta of the artifacts.
> 
> The risk of exercising the m2 central option is that people may get 
> expectations that they can point their code at the 2.0.0-preview and then, 
> when a release comes out, simply
> update their dependency; this may/may not be the case. But is it harmful if 
> people do start building and testing against the preview? If it finds 
> problems early, it can only be a good thing
> 
> 
> > On 1 Jun 2016, at 23:10, Sean Owen  wrote:
> >
> > I'll be more specific about the issue that I think trumps all this,
> > which I realize maybe not everyone was aware of.
> >
> > There was a long and contentious discussion on the PMC about, among
> > other things, advertising a "Spark 2.0 preview" from Databricks, such
> > as at 
> > https://databricks.com/blog/2016/05/11/apache-spark-2-0-technical-preview-easier-faster-and-smarter.html
> >  
> > 
> >
> > That post has already been updated/fixed from an earlier version, but
> > part of the resolution was to make a full "2.0.0 preview" release in
> > order to continue to be able to advertise it as such. Without it, I
> > believe the PMC's conclusion remains that this blog post / product
> > announcement is not allowed by ASF policy. Hence, either the product
> > announcements need to be taken down and a bunch of wording changed in
> > the Databricks product, or, this needs to be a normal release.
> >
> > Obviously, it seems far easier to just finish the release per usual. I
> > actually didn't realize this had not been offered for download at
> > http://spark.apache.org/downloads.html either. It needs to be
> > accessible there too.
> >
> >
> > We can get back in the weeds about what a "preview" release means,
> > but, normal voted releases can and even should be alpha/beta
> > (http://www.apache.org/dev/release.html) The culture is, in theory, to
> > release early and often. I don't buy an argument that it's too old, at
> > 2 weeks, when the alternative is having nothing at all to test
> > against.
> >
> > On Wed, Jun 1, 2016 at 5:02 PM, Michael Armbrust  wrote:
> >>> I'd think we want less effort, not more, to let people test it? for
> >>> example, right now I can't easily try my product build against
> >>> 2.0.0-preview.
> >>
> >>
> >> I don't feel super strongly one way or the other, so if we need to publish
> >> it permanently we can.
> >>
> >> However, either way you can still test against this release.  You just need
> >> to add a resolver as well (which is how I have always tested packages
> >> against RCs).  One concern with making it permanent is this preview release
> >> is already fairly far behind branch-2.0, so many of the issues that people
> >> might report have already been fixed and that might continue even after the
> >> release is made.  I'd rather be able to force upgrades eventually when we
> >> vote on the final 2.0 release.
> >>
> >



Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-03 Thread Mark Hamstra
It's not a question of whether the preview artifacts can be made available
on Maven central, but rather whether they must be or should be.  I've got
no problems leaving these unstable, transitory artifacts out of the more
permanent, canonical repository.

On Fri, Jun 3, 2016 at 1:53 AM, Steve Loughran 
wrote:

>
> It's been voted on by the project, so can go up on central
>
> There's already some JIRAs being filed against it, this is a metric of
> success as pre-beta of the artifacts.
>
> The risk of exercising the m2 central option is that people may get
> expectations that they can point their code at the 2.0.0-preview and then,
> when a release comes out, simply
> update their dependency; this may/may not be the case. But is it harmful
> if people do start building and testing against the preview? If it finds
> problems early, it can only be a good thing
>
>
> > On 1 Jun 2016, at 23:10, Sean Owen  wrote:
> >
> > I'll be more specific about the issue that I think trumps all this,
> > which I realize maybe not everyone was aware of.
> >
> > There was a long and contentious discussion on the PMC about, among
> > other things, advertising a "Spark 2.0 preview" from Databricks, such
> > as at
> https://databricks.com/blog/2016/05/11/apache-spark-2-0-technical-preview-easier-faster-and-smarter.html
> >
> > That post has already been updated/fixed from an earlier version, but
> > part of the resolution was to make a full "2.0.0 preview" release in
> > order to continue to be able to advertise it as such. Without it, I
> > believe the PMC's conclusion remains that this blog post / product
> > announcement is not allowed by ASF policy. Hence, either the product
> > announcements need to be taken down and a bunch of wording changed in
> > the Databricks product, or, this needs to be a normal release.
> >
> > Obviously, it seems far easier to just finish the release per usual. I
> > actually didn't realize this had not been offered for download at
> > http://spark.apache.org/downloads.html either. It needs to be
> > accessible there too.
> >
> >
> > We can get back in the weeds about what a "preview" release means,
> > but, normal voted releases can and even should be alpha/beta
> > (http://www.apache.org/dev/release.html) The culture is, in theory, to
> > release early and often. I don't buy an argument that it's too old, at
> > 2 weeks, when the alternative is having nothing at all to test
> > against.
> >
> > On Wed, Jun 1, 2016 at 5:02 PM, Michael Armbrust 
> wrote:
> >>> I'd think we want less effort, not more, to let people test it? for
> >>> example, right now I can't easily try my product build against
> >>> 2.0.0-preview.
> >>
> >>
> >> I don't feel super strongly one way or the other, so if we need to
> publish
> >> it permanently we can.
> >>
> >> However, either way you can still test against this release.  You just
> need
> >> to add a resolver as well (which is how I have always tested packages
> >> against RCs).  One concern with making it permanent is this preview
> release
> >> is already fairly far behind branch-2.0, so many of the issues that
> people
> >> might report have already been fixed and that might continue even after
> the
> >> release is made.  I'd rather be able to force upgrades eventually when
> we
> >> vote on the final 2.0 release.
> >>
> >


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-03 Thread Steve Loughran

It's been voted on by the project, so can go up on central

There's already some JIRAs being filed against it, this is a metric of success 
as pre-beta of the artifacts.

The risk of exercising the m2 central option is that people may get 
expectations that they can point their code at the 2.0.0-preview and then, when 
a release comes out, simply
update their dependency; this may/may not be the case. But is it harmful if 
people do start building and testing against the preview? If it finds problems 
early, it can only be a good thing
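Concretely, a build pointed at the preview might look something like this (a
sketch; the repository element is only needed while the artifacts live on a
staging repo rather than Maven Central, and the URL below is a placeholder for
whatever repository the release vote thread points at):

    <repositories>
      <repository>
        <id>apache-staging</id>
        <url>https://repository.apache.org/content/repositories/staging/</url>
      </repository>
    </repositories>

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.0.0-preview</version>
    </dependency>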


> On 1 Jun 2016, at 23:10, Sean Owen  wrote:
> 
> I'll be more specific about the issue that I think trumps all this,
> which I realize maybe not everyone was aware of.
> 
> There was a long and contentious discussion on the PMC about, among
> other things, advertising a "Spark 2.0 preview" from Databricks, such
> as at 
> https://databricks.com/blog/2016/05/11/apache-spark-2-0-technical-preview-easier-faster-and-smarter.html
> 
> That post has already been updated/fixed from an earlier version, but
> part of the resolution was to make a full "2.0.0 preview" release in
> order to continue to be able to advertise it as such. Without it, I
> believe the PMC's conclusion remains that this blog post / product
> announcement is not allowed by ASF policy. Hence, either the product
> announcements need to be taken down and a bunch of wording changed in
> the Databricks product, or, this needs to be a normal release.
> 
> Obviously, it seems far easier to just finish the release per usual. I
> actually didn't realize this had not been offered for download at
> http://spark.apache.org/downloads.html either. It needs to be
> accessible there too.
> 
> 
> We can get back in the weeds about what a "preview" release means,
> but, normal voted releases can and even should be alpha/beta
> (http://www.apache.org/dev/release.html) The culture is, in theory, to
> release early and often. I don't buy an argument that it's too old, at
> 2 weeks, when the alternative is having nothing at all to test
> against.
> 
> On Wed, Jun 1, 2016 at 5:02 PM, Michael Armbrust  
> wrote:
>>> I'd think we want less effort, not more, to let people test it? for
>>> example, right now I can't easily try my product build against
>>> 2.0.0-preview.
>> 
>> 
>> I don't feel super strongly one way or the other, so if we need to publish
>> it permanently we can.
>> 
>> However, either way you can still test against this release.  You just need
>> to add a resolver as well (which is how I have always tested packages
>> against RCs).  One concern with making it permanent is this preview release
>> is already fairly far behind branch-2.0, so many of the issues that people
>> might report have already been fixed and that might continue even after the
>> release is made.  I'd rather be able to force upgrades eventually when we
>> vote on the final 2.0 release.
>> 
> 



Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-02 Thread Reynold Xin
One thing we can do is monthly milestone releases, similar to other
projects (e.g. Scala).

So we can have Apache Spark 2.1.0-M1, Apache Spark 2.1.0-M2.




On Thu, Jun 2, 2016 at 12:42 PM, Tom Graves  wrote:

> The documentation for the preview release also seems to be missing?
>
> Also, what happens if we want to do a second preview release?  The naming
> doesn't seem to allow that unless we call it preview 2.
>
> Tom
>
>
> On Wednesday, June 1, 2016 6:27 PM, Sean Owen  wrote:
>
>
> On Wed, Jun 1, 2016 at 5:58 PM, Reynold Xin  wrote:
> > The preview release is available here:
> > http://spark.apache.org/downloads.html (there is an entire section
> dedicated
> > to it and also there is a news link to it on the right).
>
> Oops, it is indeed down there at the bottom, before nightlies. I
> honestly missed it below the fold. I'd advocate for making it a (non
> default?) option in the main downloads dropdown, but this then becomes
> a minor issue. The core source/binary artifacts _are_ publicly
> available.
>
>
> > "In addition to the distribution directory, project that use Maven or a
> > related build tool sometimes place their releases on
> repository.apache.org
> > beside some convenience binaries. The distribution directory is required,
> > while the repository system is an optional convenience."
>
> Agree. The question is what makes this release special? because other
> releases have been published to Maven. I think the argument is that
> it's a buggy alpha/beta/preview release, but so were 0.x releases.
> Reasonable people could make up different policies, so here I'm
> appealing to guidance: http://www.apache.org/dev/release.html
>
> "Releases are packages that have been approved for general public
> release, with varying degrees of caveat regarding their perceived
> quality or potential for change. Releases that are intended for
> everyday usage by non-developers are usually referred to as "stable"
> or "general availability (GA)" releases. Releases that are believed to
> be usable by testers and developers outside the project, but perhaps
> not yet stable in terms of features or functionality, are usually
> referred to as "beta" or "unstable". Releases that only represent a
> project milestone and are intended only for bleeding-edge developers
> working outside the project are called "alpha"."
>
> I don't think releases are defined by whether they're stable or buggy,
> but by whether they were produced by a sanctioned process that
> protects contributors under the ASF umbrella, etc etc. Compare to a
> nightly build which we don't want everyone to consume, not so much
> because it might be buggier, but because these protections don't
> apply.
>
> Certainly, it's vital to communicate how to interpret the stability of
> the releases, but -preview releases are still normal releases to the
> public.
>
> I don't think bugginess therefore is the question. Any Spark dev knows
> that x.y.0 Spark releases have gone out with even Critical and in the
> past Blocker issues unresolved, and the world failed to fall apart.
> (We're better about this now.) I actually think the -preview release
> idea is worth repeating for this reason -- .0-preview is the new .0.
> It'd be more accurate IMHO and better for all.
>
>
> > I think it'd be pretty bad if preview releases in anyway become "default
> > version", because they are unstable and contain a lot of blocker bugs.
>
> Why would this happen? releases happen ~3 months and could happen
> faster if this is a concern. 2.0.0 final is, I'd wager, coming in <1
> month.
>
>
> > 2. On the download page, have two sections. One listing the normal
> releases,
> > and the other listing preview releases.
>
> +1, that puts it above the fold and easily findable to anyone willing
> to consume such a thing.
>
>
> > 3. Everywhere we mention preview releases, include the proper disclaimer
> > e.g. "This preview is not a stable release in terms of either API or
> > functionality, but it is meant to give the community early access to try
> the
> > code that will become Spark 2.0."
>
> Can't hurt to overcommunicate this for -preview releases in general.
>
>
> > 4. Publish normal releases to maven central, and preview releases only to
> > the staging maven repo. But of course we should include the temporary
> maven
> > repo for preview releases on the download page.
>
> This is the only thing I disagree with. AFAIK other ASF projects
> readily publish alpha and beta releases, under varying naming
> conventions (alpha, beta, RC1, etc) It's not something that needs to
> be hidden like a nightly.
>
> The audience for Maven artifacts are developers, not admins or users.
> Compare the risk of a developer somehow not understanding what they're
> getting, to the friction caused by making developers add a repo to get
>
> at it.
>
>
> I get it, that seems minor. But given the recent concern about making
> sure "2.0.0 preview" is available as an ASF release, I'd advise us to
> make sure this release is not any harder to get at than others, to
> really put that to bed.

Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-02 Thread Tom Graves
The documentation for the preview release also seems to be missing?
Also, what happens if we want to do a second preview release?  The naming
doesn't seem to allow that unless we call it preview 2.
Tom 

On Wednesday, June 1, 2016 6:27 PM, Sean Owen  wrote:
 

 On Wed, Jun 1, 2016 at 5:58 PM, Reynold Xin  wrote:
> The preview release is available here:
> http://spark.apache.org/downloads.html (there is an entire section dedicated
> to it and also there is a news link to it on the right).

Oops, it is indeed down there at the bottom, before nightlies. I
honestly missed it below the fold. I'd advocate for making it a (non
default?) option in the main downloads dropdown, but this then becomes
a minor issue. The core source/binary artifacts _are_ publicly
available.


> "In addition to the distribution directory, project that use Maven or a
> related build tool sometimes place their releases on repository.apache.org
> beside some convenience binaries. The distribution directory is required,
> while the repository system is an optional convenience."

Agree. The question is what makes this release special? because other
releases have been published to Maven. I think the argument is that
it's a buggy alpha/beta/preview release, but so were 0.x releases.
Reasonable people could make up different policies, so here I'm
appealing to guidance: http://www.apache.org/dev/release.html

"Releases are packages that have been approved for general public
release, with varying degrees of caveat regarding their perceived
quality or potential for change. Releases that are intended for
everyday usage by non-developers are usually referred to as "stable"
or "general availability (GA)" releases. Releases that are believed to
be usable by testers and developers outside the project, but perhaps
not yet stable in terms of features or functionality, are usually
referred to as "beta" or "unstable". Releases that only represent a
project milestone and are intended only for bleeding-edge developers
working outside the project are called "alpha"."

I don't think releases are defined by whether they're stable or buggy,
but by whether they were produced by a sanctioned process that
protects contributors under the ASF umbrella, etc etc. Compare to a
nightly build which we don't want everyone to consume, not so much
because it might be buggier, but because these protections don't
apply.

Certainly, it's vital to communicate how to interpret the stability of
the releases, but -preview releases are still normal releases to the
public.

I don't think bugginess therefore is the question. Any Spark dev knows
that x.y.0 Spark releases have gone out with even Critical and in the
past Blocker issues unresolved, and the world failed to fall apart.
(We're better about this now.) I actually think the -preview release
idea is worth repeating for this reason -- .0-preview is the new .0.
It'd be more accurate IMHO and better for all.


> I think it'd be pretty bad if preview releases in any way become the "default
> version", because they are unstable and contain a lot of blocker bugs.

Why would this happen? Releases happen every ~3 months and could happen
faster if this is a concern. 2.0.0 final is, I'd wager, coming in <1
month.


> 2. On the download page, have two sections. One listing the normal releases,
> and the other listing preview releases.

+1, that puts it above the fold and easily findable to anyone willing
to consume such a thing.


> 3. Everywhere we mention preview releases, include the proper disclaimer
> e.g. "This preview is not a stable release in terms of either API or
> functionality, but it is meant to give the community early access to try the
> code that will become Spark 2.0."

Can't hurt to overcommunicate this for -preview releases in general.


> 4. Publish normal releases to maven central, and preview releases only to
> the staging maven repo. But of course we should include the temporary maven
> repo for preview releases on the download page.

This is the only thing I disagree with. AFAIK other ASF projects
readily publish alpha and beta releases, under varying naming
conventions (alpha, beta, RC1, etc.). It's not something that needs to
be hidden like a nightly.

The audience for Maven artifacts is developers, not admins or users.
Compare the risk of a developer somehow not understanding what they're
getting, to the friction caused by making developers add a repo to get
at it.

I get it, that seems minor. But given the recent concern about making
sure "2.0.0 preview" is available as an ASF release, I'd advise us to
make sure this release is not any harder to get at than others, to
really put that to bed.

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



  

Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-01 Thread Sean Owen
On Wed, Jun 1, 2016 at 5:58 PM, Reynold Xin  wrote:
> The preview release is available here:
> http://spark.apache.org/downloads.html (there is an entire section dedicated
> to it and also there is a news link to it on the right).

Oops, it is indeed down there at the bottom, before nightlies. I
honestly missed it below the fold. I'd advocate for making it a (non
default?) option in the main downloads dropdown, but this then becomes
a minor issue. The core source/binary artifacts _are_ publicly
available.


> "In addition to the distribution directory, project that use Maven or a
> related build tool sometimes place their releases on repository.apache.org
> beside some convenience binaries. The distribution directory is required,
> while the repository system is an optional convenience."

Agree. The question is: what makes this release special? Because other
releases have been published to Maven. I think the argument is that
it's a buggy alpha/beta/preview release, but so were 0.x releases.
Reasonable people could make up different policies, so here I'm
appealing to guidance: http://www.apache.org/dev/release.html

"Releases are packages that have been approved for general public
release, with varying degrees of caveat regarding their perceived
quality or potential for change. Releases that are intended for
everyday usage by non-developers are usually referred to as "stable"
or "general availability (GA)" releases. Releases that are believed to
be usable by testers and developers outside the project, but perhaps
not yet stable in terms of features or functionality, are usually
referred to as "beta" or "unstable". Releases that only represent a
project milestone and are intended only for bleeding-edge developers
working outside the project are called "alpha"."

I don't think releases are defined by whether they're stable or buggy,
but by whether they were produced by a sanctioned process that
protects contributors under the ASF umbrella, etc etc. Compare to a
nightly build which we don't want everyone to consume, not so much
because it might be buggier, but because these protections don't
apply.

Certainly, it's vital to communicate how to interpret the stability of
the releases, but -preview releases are still normal releases to the
public.

I don't think bugginess therefore is the question. Any Spark dev knows
that x.y.0 Spark releases have gone out with even Critical and in the
past Blocker issues unresolved, and the world failed to fall apart.
(We're better about this now.) I actually think the -preview release
idea is worth repeating for this reason -- .0-preview is the new .0.
It'd be more accurate IMHO and better for all.


> I think it'd be pretty bad if preview releases in any way become the "default
> version", because they are unstable and contain a lot of blocker bugs.

Why would this happen? Releases happen every ~3 months and could happen
faster if this is a concern. 2.0.0 final is, I'd wager, coming in <1
month.


> 2. On the download page, have two sections. One listing the normal releases,
> and the other listing preview releases.

+1, that puts it above the fold and easily findable to anyone willing
to consume such a thing.


> 3. Everywhere we mention preview releases, include the proper disclaimer
> e.g. "This preview is not a stable release in terms of either API or
> functionality, but it is meant to give the community early access to try the
> code that will become Spark 2.0."

Can't hurt to overcommunicate this for -preview releases in general.


> 4. Publish normal releases to maven central, and preview releases only to
> the staging maven repo. But of course we should include the temporary maven
> repo for preview releases on the download page.

This is the only thing I disagree with. AFAIK other ASF projects
readily publish alpha and beta releases, under varying naming
conventions (alpha, beta, RC1, etc.). It's not something that needs to
be hidden like a nightly.

The audience for Maven artifacts is developers, not admins or users.
Compare the risk of a developer somehow not understanding what they're
getting, to the friction caused by making developers add a repo to get
at it.

I get it, that seems minor. But given the recent concern about making
sure "2.0.0 preview" is available as an ASF release, I'd advise us to
make sure this release is not any harder to get at than others, to
really put that to bed.

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-01 Thread Reynold Xin
Hi Sean,

(writing this email with my Apache hat on only and not Databricks hat)

The preview release is available here:
http://spark.apache.org/downloads.html (there is an entire section
dedicated to it and also there is a news link to it on the right).

Again, I think this is a good opportunity to define what a release should
contain. Based on
http://www.apache.org/dev/release.html#where-do-releases-go

"In addition to the distribution directory, project that use Maven or a
related build tool sometimes place their releases on repository.apache.org
beside some convenience binaries. The distribution directory is required,
while the repository system is an optional convenience."

So I'm reading it as saying that Maven publication is not required. My
understanding is that the general community (beyond those who follow the dev
list) should understand that the preview is not a stable release, and we as the
PMC should set expectations accordingly. Developers who can test the
preview releases tend to be more savvy and are comfortable on the bleeding
edge. It is actually fairly easy for them to add a maven repo. Now, reading
the page, I realized that nowhere on it did we mention the temporary maven
repo. I will fix that.
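
For example, something along these lines in a project's pom.xml is all it
takes. (A rough sketch only: the URL is the temporary staging repo mentioned
earlier in the thread, and the exact artifact coordinates, e.g. the Scala
2.11 suffix, are my assumption.)

  <!-- Sketch: resolve the 2.0.0-preview artifacts from the temporary staging repo -->
  <repositories>
    <repository>
      <id>apache-spark-staging</id>
      <url>https://repository.apache.org/content/repositories/orgapachespark-1182/</url>
    </repository>
  </repositories>

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <!-- Artifact id / Scala version suffix assumed for illustration -->
      <artifactId>spark-core_2.11</artifactId>
      <version>2.0.0-preview</version>
    </dependency>
  </dependencies>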

I think it'd be pretty bad if preview releases in any way become the "default
version", because they are unstable and contain a lot of blocker bugs.

So my concrete proposal is:

1. Separate (officially voted) releases into normal and preview.

2. On the download page, have two sections. One listing the normal
releases, and the other listing preview releases.

3. Everywhere we mention preview releases, include the proper disclaimer
e.g. "This preview is not a stable release in terms of either API or
functionality, but it is meant to give the community early access to try
the code that will become Spark 2.0."

4. Publish normal releases to maven central, and preview releases only to
the staging maven repo. But of course we should include the temporary maven
repo for preview releases on the download page.






On Wed, Jun 1, 2016 at 3:10 PM, Sean Owen  wrote:

> I'll be more specific about the issue that I think trumps all this,
> which I realize maybe not everyone was aware of.
>
> There was a long and contentious discussion on the PMC about, among
> other things, advertising a "Spark 2.0 preview" from Databricks, such
> as at
> https://databricks.com/blog/2016/05/11/apache-spark-2-0-technical-preview-easier-faster-and-smarter.html
>
> That post has already been updated/fixed from an earlier version, but
> part of the resolution was to make a full "2.0.0 preview" release in
> order to continue to be able to advertise it as such. Without it, I
> believe the PMC's conclusion remains that this blog post / product
> announcement is not allowed by ASF policy. Hence, either the product
> announcements need to be taken down and a bunch of wording changed in
> the Databricks product, or, this needs to be a normal release.
>
> Obviously, it seems far easier to just finish the release per usual. I
> actually didn't realize this had not been offered for download at
> http://spark.apache.org/downloads.html either. It needs to be
> accessible there too.
>
>
> We can get back in the weeds about what a "preview" release means,
> but, normal voted releases can and even should be alpha/beta
> (http://www.apache.org/dev/release.html) The culture is, in theory, to
> release early and often. I don't buy an argument that it's too old, at
> 2 weeks, when the alternative is having nothing at all to test
> against.
>
> On Wed, Jun 1, 2016 at 5:02 PM, Michael Armbrust 
> wrote:
> >> I'd think we want less effort, not more, to let people test it? for
> >> example, right now I can't easily try my product build against
> >> 2.0.0-preview.
> >
> >
> > I don't feel super strongly one way or the other, so if we need to
> publish
> > it permanently we can.
> >
> > However, either way you can still test against this release.  You just
> need
> > to add a resolver as well (which is how I have always tested packages
> > against RCs).  One concern with making it permanent is this preview
> release
> > is already fairly far behind branch-2.0, so many of the issues that
> people
> > might report have already been fixed and that might continue even after
> the
> > release is made.  I'd rather be able to force upgrades eventually when we
> > vote on the final 2.0 release.
> >
>


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-01 Thread Sean Owen
I'll be more specific about the issue that I think trumps all this,
which I realize maybe not everyone was aware of.

There was a long and contentious discussion on the PMC about, among
other things, advertising a "Spark 2.0 preview" from Databricks, such
as at 
https://databricks.com/blog/2016/05/11/apache-spark-2-0-technical-preview-easier-faster-and-smarter.html

That post has already been updated/fixed from an earlier version, but
part of the resolution was to make a full "2.0.0 preview" release in
order to continue to be able to advertise it as such. Without it, I
believe the PMC's conclusion remains that this blog post / product
announcement is not allowed by ASF policy. Hence, either the product
announcements need to be taken down and a bunch of wording changed in
the Databricks product, or, this needs to be a normal release.

Obviously, it seems far easier to just finish the release per usual. I
actually didn't realize this had not been offered for download at
http://spark.apache.org/downloads.html either. It needs to be
accessible there too.


We can get back in the weeds about what a "preview" release means,
but, normal voted releases can and even should be alpha/beta
(http://www.apache.org/dev/release.html) The culture is, in theory, to
release early and often. I don't buy an argument that it's too old, at
2 weeks, when the alternative is having nothing at all to test
against.

On Wed, Jun 1, 2016 at 5:02 PM, Michael Armbrust  wrote:
>> I'd think we want less effort, not more, to let people test it? for
>> example, right now I can't easily try my product build against
>> 2.0.0-preview.
>
>
> I don't feel super strongly one way or the other, so if we need to publish
> it permanently we can.
>
> However, either way you can still test against this release.  You just need
> to add a resolver as well (which is how I have always tested packages
> against RCs).  One concern with making it permanent is this preview release
> is already fairly far behind branch-2.0, so many of the issues that people
> might report have already been fixed and that might continue even after the
> release is made.  I'd rather be able to force upgrades eventually when we
> vote on the final 2.0 release.
>

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-01 Thread Michael Armbrust
>
> I'd think we want less effort, not more, to let people test it? for
> example, right now I can't easily try my product build against
> 2.0.0-preview.


I don't feel super strongly one way or the other, so if we need to publish
it permanently we can.

However, either way you can still test against this release.  You just need
to add a resolver as well (which is how I have always tested packages
against RCs).  One concern with making it permanent is that this preview release
is already fairly far behind branch-2.0, so many of the issues that people
might report have already been fixed and that might continue even after the
release is made.  I'd rather be able to force upgrades eventually when we
vote on the final 2.0 release.
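
For reference, "add a resolver" amounts to something like this in build.sbt.
(Just a sketch: the URL is the temporary staging repo Reynold linked earlier
in the thread, and the exact module name and scope are assumptions on my
part.)

  // Sketch: point sbt at the temporary staging repo and pull the preview version
  resolvers += ("spark-preview-staging" at
    "https://repository.apache.org/content/repositories/orgapachespark-1182/")

  // Module name assumed; "provided" because the cluster usually supplies Spark itself
  libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0-preview" % "provided"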


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-01 Thread Marcelo Vanzin
On Wed, Jun 1, 2016 at 2:51 PM, Sean Owen  wrote:
> I'd think we want less effort, not more, to let people test it? for
> example, right now I can't easily try my product build against
> 2.0.0-preview.

While I understand your point of view, I like the extra effort to get
to these artifacts because it prevents people from easily building
their applications on top of what is known to be an unstable release
(either API-wise or quality wise).

I see this preview release as more like a snapshot release that was voted
on for wide testing, instead of a proper release that we want to
encourage people to build on. And like snapshots, I like that to use
it on your application you have to go out of your way and add a
separate repository instead of just changing a version string or
command line argument.
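
To make that concrete, in a Gradle build the deliberate extra step is the
extra repository entry; bumping a version string alone would not resolve
anything. (A sketch only: the URL is the staging repo mentioned earlier in
the thread, and the artifact coordinates are assumed.)

  // Sketch: the maven { url ... } line is the extra, deliberate step
  repositories {
      mavenCentral()
      maven { url 'https://repository.apache.org/content/repositories/orgapachespark-1182/' }
  }

  dependencies {
      // Coordinates assumed for illustration
      compile 'org.apache.spark:spark-core_2.11:2.0.0-preview'
  }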

My 2 bits.

-- 
Marcelo

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-01 Thread Sean Owen
An RC is something that gets voted on, and the final one is turned
into a blessed release. I agree that RCs don't get published to Maven
Central, but releases do of course.

This was certainly to be an official release, right? A beta or alpha
can still be an official, published release. The proximate motivation
was to solve a problem of advertising "Apache Spark 2.0.0 preview" in
a product, when no such release existed from the ASF. Hence the point
was to produce a full regular release, and I think that needs to
include the usual Maven artifacts.

I'd think we want less effort, not more, to let people test it? for
example, right now I can't easily try my product build against
2.0.0-preview.

On Wed, Jun 1, 2016 at 3:53 PM, Marcelo Vanzin  wrote:
> So are RCs, aren't they?
>
> Personally I'm fine with not releasing to maven central. Any extra
> effort needed by regular users to use a preview / RC is good with me.
>
> On Wed, Jun 1, 2016 at 1:50 PM, Reynold Xin  wrote:
>> To play devil's advocate, previews are technically not RCs. They are
>> actually voted releases.
>>
>> On Wed, Jun 1, 2016 at 1:46 PM, Michael Armbrust 
>> wrote:
>>>
>>> Yeah, we don't usually publish RCs to central, right?
>>>
>>> On Wed, Jun 1, 2016 at 1:06 PM, Reynold Xin  wrote:

>>>> They are here ain't they?
>>>>
>>>> https://repository.apache.org/content/repositories/orgapachespark-1182/
>>>>
>>>> Did you mean publishing them to maven central? My understanding is that
>>>> publishing to maven central isn't a required step of doing these. This
>>>> might be a good opportunity to discuss that. My thought is that it is since
>>>> Maven central is immutable, and the purposes of the preview releases are to
>>>> get people to test it early on in preparation for the actual release, it
>>>> might be better to not publish preview releases to maven central. Users
>>>> testing with preview releases can just use the temporary repository above.
>>>>
>>>>
>>>>
>>>>
>>>> On Wed, Jun 1, 2016 at 11:36 AM, Sean Owen  wrote:
>>>>>
>>>>> Just checked and they are still not published this week. Can these be
>>>>> published ASAP to complete the 2.0.0-preview release?
>>>>
>>>>
>>>
>>
>
>
>
> --
> Marcelo

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-01 Thread Jonathan Kelly
I think what Reynold probably means is that previews are releases for which
a vote *passed*.

~ Jonathan

On Wed, Jun 1, 2016 at 1:53 PM Marcelo Vanzin  wrote:

> So are RCs, aren't they?
>
> Personally I'm fine with not releasing to maven central. Any extra
> effort needed by regular users to use a preview / RC is good with me.
>
> On Wed, Jun 1, 2016 at 1:50 PM, Reynold Xin  wrote:
> > To play devil's advocate, previews are technically not RCs. They are
> > actually voted releases.
> >
> > On Wed, Jun 1, 2016 at 1:46 PM, Michael Armbrust  >
> > wrote:
> >>
> >> Yeah, we don't usually publish RCs to central, right?
> >>
> >> On Wed, Jun 1, 2016 at 1:06 PM, Reynold Xin 
> wrote:
> >>>
> >>> They are here ain't they?
> >>>
> >>>
> https://repository.apache.org/content/repositories/orgapachespark-1182/
> >>>
> >>> Did you mean publishing them to maven central? My understanding is that
> >>> publishing to maven central isn't a required step of doing these. This
> >>> might be a good opportunity to discuss that. My thought is that it is
> since
> >>> Maven central is immutable, and the purposes of the preview releases
> are to
> >>> get people to test it early on in preparation for the actual release,
> it
> >>> might be better to not publish preview releases to maven central. Users
> >>> testing with preview releases can just use the temporary repository
> above.
> >>>
> >>>
> >>>
> >>>
> >>> On Wed, Jun 1, 2016 at 11:36 AM, Sean Owen  wrote:
> >>>>
> >>>> Just checked and they are still not published this week. Can these be
> >>>> published ASAP to complete the 2.0.0-preview release?
> >>>
> >>>
> >>
> >
>
>
>
> --
> Marcelo
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
>
>


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-01 Thread Marcelo Vanzin
So are RCs, aren't they?

Personally I'm fine with not releasing to maven central. Any extra
effort needed by regular users to use a preview / RC is good with me.

On Wed, Jun 1, 2016 at 1:50 PM, Reynold Xin  wrote:
> To play devil's advocate, previews are technically not RCs. They are
> actually voted releases.
>
> On Wed, Jun 1, 2016 at 1:46 PM, Michael Armbrust 
> wrote:
>>
>> Yeah, we don't usually publish RCs to central, right?
>>
>> On Wed, Jun 1, 2016 at 1:06 PM, Reynold Xin  wrote:
>>>
>>> They are here ain't they?
>>>
>>> https://repository.apache.org/content/repositories/orgapachespark-1182/
>>>
>>> Did you mean publishing them to maven central? My understanding is that
>>> publishing to maven central isn't a required step of doing these. This
>>> might be a good opportunity to discuss that. My thought is that it is since
>>> Maven central is immutable, and the purposes of the preview releases are to
>>> get people to test it early on in preparation for the actual release, it
>>> might be better to not publish preview releases to maven central. Users
>>> testing with preview releases can just use the temporary repository above.
>>>
>>>
>>>
>>>
>>> On Wed, Jun 1, 2016 at 11:36 AM, Sean Owen  wrote:

>>>> Just checked and they are still not published this week. Can these be
>>>> published ASAP to complete the 2.0.0-preview release?
>>>
>>>
>>
>



-- 
Marcelo

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-01 Thread Reynold Xin
To play devil's advocate, previews are technically not RCs. They are
actually voted releases.

On Wed, Jun 1, 2016 at 1:46 PM, Michael Armbrust 
wrote:

> Yeah, we don't usually publish RCs to central, right?
>
> On Wed, Jun 1, 2016 at 1:06 PM, Reynold Xin  wrote:
>
>> They are here ain't they?
>>
>> https://repository.apache.org/content/repositories/orgapachespark-1182/
>>
>> Did you mean publishing them to maven central? My understanding is that
>> publishing to maven central isn't a required step of doing these. This
>> might be a good opportunity to discuss that. My thought is that it is since
>> Maven central is immutable, and the purposes of the preview releases are to
>> get people to test it early on in preparation for the actual release, it
>> might be better to not publish preview releases to maven central. Users
>> testing with preview releases can just use the temporary repository above.
>>
>>
>>
>>
>> On Wed, Jun 1, 2016 at 11:36 AM, Sean Owen  wrote:
>>
>>> Just checked and they are still not published this week. Can these be
>>> published ASAP to complete the 2.0.0-preview release?
>>>
>>
>>
>


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-01 Thread Michael Armbrust
Yeah, we don't usually publish RCs to central, right?

On Wed, Jun 1, 2016 at 1:06 PM, Reynold Xin  wrote:

> They are here ain't they?
>
> https://repository.apache.org/content/repositories/orgapachespark-1182/
>
> Did you mean publishing them to maven central? My understanding is that
> publishing to maven central isn't a required step of doing these. This
> might be a good opportunity to discuss that. My thought is that it is since
> Maven central is immutable, and the purposes of the preview releases are to
> get people to test it early on in preparation for the actual release, it
> might be better to not publish preview releases to maven central. Users
> testing with preview releases can just use the temporary repository above.
>
>
>
>
> On Wed, Jun 1, 2016 at 11:36 AM, Sean Owen  wrote:
>
>> Just checked and they are still not published this week. Can these be
>> published ASAP to complete the 2.0.0-preview release?
>>
>
>


Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-01 Thread Reynold Xin
They are here ain't they?

https://repository.apache.org/content/repositories/orgapachespark-1182/

Did you mean publishing them to maven central? My understanding is that
publishing to maven central isn't a required step of doing these releases. This
might be a good opportunity to discuss that. My thought is that, since
Maven central is immutable, and the purpose of the preview releases is to
get people to test them early on in preparation for the actual release, it
might be better not to publish preview releases to maven central. Users
testing with preview releases can just use the temporary repository above.




On Wed, Jun 1, 2016 at 11:36 AM, Sean Owen  wrote:

> Just checked and they are still not published this week. Can these be
> published ASAP to complete the 2.0.0-preview release?
>


Spark 2.0.0-preview artifacts still not available in Maven

2016-06-01 Thread Sean Owen
Just checked and they are still not published this week. Can these be
published ASAP to complete the 2.0.0-preview release?

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org