Nightlies are built from master and made available in the ASF snapshot
repo. This is noted at the bottom of the downloads page, and at
https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-NightlyBuilds
. This hasn't changed for as long as I can recall.

Nightlies are not blessed, and are not for consumption by anyone other
than developers. That is, you shouldn't bundle them in a release, and
shouldn't release a product based on a "2.0.1 snapshot", for example,
because no such ASF release exists. This info isn't meant to be secret,
but it is not made obvious to casual end users for this reason. Yes, it's
for developers who want to test other products in advance.

So-called preview releases are really just normal releases and are
made available in the usual way. They just have a different name. I
don't know if another one of those will happen; maybe for 3.0.

The published master snapshot would give you 2.1.0-SNAPSHOT at the
moment. Other branches don't have nightlies, but are likely to be of
less interest.
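
For reference, wiring the ASF snapshot repo into a pom.xml looks roughly
like the sketch below. The repository URL and the spark-core_2.11
coordinates are my recollection rather than gospel, so verify them against
the wiki page above:

  <!-- Sketch: ASF snapshot repo plus a dependency on the nightly artifact.
       URL and coordinates are assumptions; check the wiki page. -->
  <repositories>
    <repository>
      <id>apache-snapshots</id>
      <url>https://repository.apache.org/snapshots/</url>
      <releases><enabled>false</enabled></releases>
      <snapshots><enabled>true</enabled></snapshots>
    </repository>
  </repositories>

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.1.0-SNAPSHOT</version>
    </dependency>
  </dependencies>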

You can always run "mvn -DskipTests install" from a checkout of any
branch to make that branch's SNAPSHOT available in your local Maven repo,
or even publish it to your private repo.
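
For example, roughly (the branch name below is just illustrative):

  # Sketch: build a branch and install its SNAPSHOT artifacts into ~/.m2
  git clone https://github.com/apache/spark.git
  cd spark
  git checkout branch-2.0
  mvn -DskipTests install
  # Publishing to a private repo would instead be "mvn deploy" with that
  # repo configured in distributionManagement, or the equivalent.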

On Tue, Aug 9, 2016 at 7:32 PM, Chris Fregly <ch...@fregly.com> wrote:
> this is a valid question.  there are many people building products and
> tooling on top of spark who would like access to the latest snapshots and
> such.  today's ink is yesterday's news to these people - including myself.
>
> what is the best way to get snapshot releases including nightly and
> specially-blessed "preview" releases so that we, too, can say "try the
> latest release in our product"?
>
> there was a lot of chatter during the 2.0.0/2.0.1 release that i largely
> ignored because of conflicting/confusing/changing responses.  and i'd rather
> not dig through jenkins builds to figure this out as i'll likely get it
> wrong.
>
> please provide the relevant snapshot/preview/nightly/whatever repos (or
> equivalent) that we need to include in our builds to have access to the
> absolute latest build assets for every major and minor release.
>
> thanks!
>
> -chris
>
>
> On Tue, Aug 9, 2016 at 10:00 AM, Mich Talebzadeh <mich.talebza...@gmail.com>
> wrote:
>>
>> LOL
>>
>> Ink has not dried on Spark 2 yet so to speak :)
>>
>> Dr Mich Talebzadeh
>>
>>
>>
>> LinkedIn
>> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>
>>
>>
>> http://talebzadehmich.wordpress.com
>>
>>
>> Disclaimer: Use it at your own risk. Any and all responsibility for any
>> loss, damage or destruction of data or any other property which may arise
>> from relying on this email's technical content is explicitly disclaimed. The
>> author will in no case be liable for any monetary damages arising from such
>> loss, damage or destruction.
>>
>>
>>
>>
>> On 9 August 2016 at 17:56, Mark Hamstra <m...@clearstorydata.com> wrote:
>>>
>>> What are you expecting to find?  There currently are no releases beyond
>>> Spark 2.0.0.
>>>
>>> On Tue, Aug 9, 2016 at 9:55 AM, Jestin Ma <jestinwith.a...@gmail.com>
>>> wrote:
>>>>
>>>> If we want to use versions of Spark beyond the official 2.0.0 release,
>>>> specifically on Maven + Java, what steps should we take to upgrade? I can't
>>>> find the newer versions on Maven central.
>>>>
>>>> Thank you!
>>>> Jestin
>>>
>>>
>>
>
>
>
> --
> Chris Fregly
> Research Scientist @ PipelineIO
> San Francisco, CA
> pipeline.io
> advancedspark.com
>

