As far as I know the process is just to copy docs/_site from the build
to the appropriate location in the SVN repo (i.e.
site/docs/2.0.0-preview).
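The copy-and-symlink process described in this thread might look like the following sketch, with temp directories standing in for the real checkouts (the actual SVN layout and paths here are assumptions, and the "preview" symlink is the alias idea raised later in the thread):

```shell
set -e
WORK=$(mktemp -d)
SITE="$WORK/spark-website"          # stands in for the site SVN checkout
mkdir -p "$WORK/docs/_site" "$SITE/site/docs"
echo '<html>2.0.0-preview docs</html>' > "$WORK/docs/_site/index.html"

# 1. Copy the generated docs/_site into site/docs/<version>
cp -r "$WORK/docs/_site" "$SITE/site/docs/2.0.0-preview"

# 2. Keep a stable "preview" symlink pointing at the current preview version
ln -s 2.0.0-preview "$SITE/site/docs/preview"

# In the real checkout this would be followed by something like
#   svn add site/docs/2.0.0-preview site/docs/preview && svn commit
ls "$SITE/site/docs"
```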
Thanks
Shivaram
On Tue, Jun 7, 2016 at 8:14 AM, Sean Owen wrote:
> As a stop-gap, I can edit that page to have a small section
Thanks Sean, you were right, hard refresh made it show up.
Seems like we should at least link to the preview docs from
http://spark.apache.org/documentation.html.
Tom
On Tuesday, June 7, 2016 10:04 AM, Sean Owen wrote:
It's there (refresh maybe?). See the end of the
As a stop-gap, I can edit that page to have a small section about
preview releases and point to the nightly docs.
Not sure who has the power to push 2.0.0-preview to site/docs, but, if
that's done then we can symlink "preview" in that dir to it and be
done, and update this section about preview
It's there (refresh maybe?). See the end of the downloads dropdown.
For the moment you can see the docs in the nightly docs build:
https://home.apache.org/~pwendell/spark-nightly/spark-branch-2.0-docs/latest/
I don't know, what's the best way to put this into the main site?
under a /preview
I've been a bit on the fence on this, but I agree that Luciano makes a
compelling case for why we really should publish things to Maven
Central. Sure, we slightly increase the risk that somebody refers to the preview
release too late, but really that is their own fault.
And I also agree with
On Mon, Jun 6, 2016 at 12:05 PM, Reynold Xin wrote:
> The bahir one was a good argument actually. I just clicked the button to
> push it into Maven central.
>
>
Thank You !!!
The bahir one was a good argument actually. I just clicked the button to
push it into Maven central.
On Mon, Jun 6, 2016 at 12:00 PM, Mark Hamstra
wrote:
> Fine. I don't feel strongly enough about it to continue to argue against
> putting the artifacts on Maven
Artifacts can't be removed from Maven in any normal circumstance, but
it's no problem.
The argument that people might keep using it goes for any older
release. Why would anyone use 1.6.0 when 1.6.1 exists? Yet we keep
1.6.0 just for the record and to not break builds. It may be that
Foobar
+1 for moving this discussion to a proactive new (alpha/beta) release of Apache
Spark 2.0!
> On 06 Jun 2016, at 20:25, Ovidiu Cristian Marcu wrote:
>
> Any chance to start preparing a new alpha/beta release for 2.0 this month or
> the preview will be pushed to maven and
On Mon, Jun 6, 2016 at 11:12 AM, Matei Zaharia
wrote:
> Is there any way to remove artifacts from Maven Central? Maybe that would
> help clean these things up long-term, though it would create problems for
> users who for some reason decide to rely on these previews.
>
>
Is there any way to remove artifacts from Maven Central? Maybe that would
help clean these things up long-term, though it would create problems for
users who for some reason decide to rely on these previews.
In any case, if people are *really* concerned about this, we should just
put it there. My
On Mon, Jun 6, 2016 at 10:08 AM, Mark Hamstra
wrote:
> I still don't know where this "severely compromised builds of limited
>> usefulness" thing comes from? what's so bad? You didn't veto its
>> release, after all.
>
>
> I simply mean that it was released with the
On Mon, Jun 6, 2016 at 9:51 AM, Sean Owen wrote:
> I still don't know where this "severely compromised builds of limited
> usefulness" thing comes from? what's so bad? You didn't veto its
> release, after all. And rightly so: a release doesn't mean "definitely
> works"; it
>
> I still don't know where this "severely compromised builds of limited
> usefulness" thing comes from? what's so bad? You didn't veto its
> release, after all.
I simply mean that it was released with the knowledge that there are still
significant bugs in the preview that definitely would
I still don't know where this "severely compromised builds of limited
usefulness" thing comes from? what's so bad? You didn't veto its
release, after all. And rightly so: a release doesn't mean "definitely
works"; it means it was created the right way. It's OK to say it's
buggy alpha software;
+1 to what Mark said. I've been following this discussion and I don't
understand where the sudden "Databricks vs. everybody else" narrative came
from.
On Mon, Jun 6, 2016 at 11:00 AM Mark Hamstra
wrote:
> This is not a Databricks vs. The World situation, and the fact
This is not a Databricks vs. The World situation, and the fact that some
persist in forcing every issue into that frame is getting annoying. There
are good engineering and project-management reasons not to populate the
long-term, canonical repository of Maven artifacts with what are known to
be
+1, agree that right now the problem is theoretical, esp. if the preview label is
in the version coordinates, as it should be.
On Saturday, June 4, 2016, Sean Owen wrote:
> Artifacts that are not for public consumption shouldn't be in a public
> release; this is instead what
Hi all,
IMHO the preview ‘release’ is good as it is now, so no further changes are required.
For me the preview was a teaser for what will be the next Spark 2.0; I really
appreciate the effort you guys made to describe it and market it :)
I’d appreciate it if the Apache Spark team would start a vote for a new
Artifacts that are not for public consumption shouldn't be in a public
release; this is instead what nightlies are for. However, this was a
normal public release.
I am not even sure why it's viewed as particularly unsafe, but unsafe
alpha and beta releases are just releases, and their name and
Personally I'd just put them on the staging repo and link to that on the
downloads page. It will create less confusion for people browsing Maven Central
later and wondering which releases are safe to use.
Matei
> On Jun 3, 2016, at 8:22 AM, Mark Hamstra wrote:
>
>
It's not a question of whether the preview artifacts can be made available
on Maven central, but rather whether they must be or should be. I've got
no problems leaving these unstable, transitory artifacts out of the more
permanent, canonical repository.
On Fri, Jun 3, 2016 at 1:53 AM, Steve
It's been voted on by the project, so it can go up on Central.
There are already some JIRAs being filed against it; that's a metric of success
for a pre-beta of the artifacts.
The risk of exercising the m2 central option is that people may get
expectations that they can point their code at the
One thing we can do is to do monthly milestone releases, similar to other
projects (e.g. Scala).
So we can have Apache Spark 2.1.0-M1, Apache Spark 2.1.0-M2.
On Thu, Jun 2, 2016 at 12:42 PM, Tom Graves wrote:
> The documentation for the preview release also seems to be
The documentation for the preview release also seems to be missing?
Also, what happens if we want to do a second preview release? The naming
doesn't seem to allow that unless we call it preview 2.
Tom
On Wednesday, June 1, 2016 6:27 PM, Sean Owen wrote:
On Wed, Jun
On Wed, Jun 1, 2016 at 5:58 PM, Reynold Xin wrote:
> The preview release is available here:
> http://spark.apache.org/downloads.html (there is an entire section dedicated
> to it and also there is a news link to it on the right).
Oops, it is indeed down there at the bottom,
Hi Sean,
(writing this email with my Apache hat on only and not Databricks hat)
The preview release is available here:
http://spark.apache.org/downloads.html (there is an entire section
dedicated to it and also there is a news link to it on the right).
Again, I think this is a good opportunity
I'll be more specific about the issue that I think trumps all this,
which I realize maybe not everyone was aware of.
There was a long and contentious discussion on the PMC about, among
other things, advertising a "Spark 2.0 preview" from Databricks, such
as at
>
> I'd think we want less effort, not more, to let people test it? for
> example, right now I can't easily try my product build against
> 2.0.0-preview.
I don't feel super strongly one way or the other, so if we need to publish
it permanently we can.
However, either way you can still test
On Wed, Jun 1, 2016 at 2:51 PM, Sean Owen wrote:
> I'd think we want less effort, not more, to let people test it? for
> example, right now I can't easily try my product build against
> 2.0.0-preview.
While I understand your point of view, I like the extra effort to get
to
An RC is something that gets voted on, and the final one is turned
into a blessed release. I agree that RCs don't get published to Maven
Central, but releases do of course.
This was certainly to be an official release, right? A beta or alpha
can still be an official, published release. The
I think what Reynold probably means is that previews are releases for which
a vote *passed*.
~ Jonathan
On Wed, Jun 1, 2016 at 1:53 PM Marcelo Vanzin wrote:
> So are RCs, aren't they?
>
> Personally I'm fine with not releasing to maven central. Any extra
> effort needed by
So are RCs, aren't they?
Personally I'm fine with not releasing to maven central. Any extra
effort needed by regular users to use a preview / RC is good with me.
On Wed, Jun 1, 2016 at 1:50 PM, Reynold Xin wrote:
> To play devil's advocate, previews are technically not RCs.
To play devil's advocate, previews are technically not RCs. They are
actually voted releases.
On Wed, Jun 1, 2016 at 1:46 PM, Michael Armbrust
wrote:
> Yeah, we don't usually publish RCs to central, right?
>
> On Wed, Jun 1, 2016 at 1:06 PM, Reynold Xin
Yeah, we don't usually publish RCs to central, right?
On Wed, Jun 1, 2016 at 1:06 PM, Reynold Xin wrote:
> They are here ain't they?
>
> https://repository.apache.org/content/repositories/orgapachespark-1182/
>
> Did you mean publishing them to maven central? My
They are here ain't they?
https://repository.apache.org/content/repositories/orgapachespark-1182/
Did you mean publishing them to maven central? My understanding is that
publishing to maven central isn't a required step of doing these. This
might be a good opportunity to discuss that. My
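For anyone who wants to test a build against that staging repository in the meantime, a Maven project can point at it with a `<repositories>` entry like the following sketch (the repository `id` is arbitrary, chosen here for illustration):

```xml
<!-- pom.xml fragment (sketch): resolve the 2.0.0-preview artifacts from the
     Apache staging repository rather than Maven Central -->
<repositories>
  <repository>
    <id>apache-spark-staging</id>
    <url>https://repository.apache.org/content/repositories/orgapachespark-1182/</url>
  </repository>
</repositories>
```

Note that staging repositories are transient; once the artifacts are promoted to Maven Central (or the staging repo is dropped), this entry would no longer be needed or valid.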
Just checked and they are still not published this week. Can these be
published ASAP to complete the 2.0.0-preview release?