Re: Spark 2.0.1 / 2.1.0 on Maven

2016-08-14 Thread Jacek Laskowski
Hi Jestin,

You can find the docs of the latest and greatest Spark at
http://people.apache.org/~pwendell/spark-nightly/spark-master-docs/latest/.

The jars are at the ASF SNAPSHOT repo at
http://repository.apache.org/snapshots/.
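For a Maven + Java build that usually means adding that snapshot repo to
your pom.xml. A minimal sketch, assuming spark-core_2.11 and the
2.1.0-SNAPSHOT version currently on master (verify the exact coordinates
against the repo before relying on them):

    <!-- sketch only: ASF snapshot repo plus a SNAPSHOT dependency (coordinates assumed) -->
    <repositories>
      <repository>
        <id>apache-snapshots</id>
        <url>http://repository.apache.org/snapshots/</url>
        <releases><enabled>false</enabled></releases>
        <snapshots><enabled>true</enabled></snapshots>
      </repository>
    </repositories>

    <dependencies>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.0-SNAPSHOT</version>
      </dependency>
    </dependencies>

Maven then resolves the latest timestamped snapshot from that repo at
build time.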

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Tue, Aug 9, 2016 at 9:55 AM, Jestin Ma  wrote:
> If we want to use versions of Spark beyond the official 2.0.0 release,
> specifically on Maven + Java, what steps should we take to upgrade? I can't
> find the newer versions on Maven central.
>
> Thank you!
> Jestin




Re: Spark 2.0.1 / 2.1.0 on Maven

2016-08-09 Thread Chris Fregly
alrighty then!

bcc'ing user list.  cc'ing dev list.

@user list people:  do not read any further or you will be in violation of
ASF policies!

On Tue, Aug 9, 2016 at 11:50 AM, Mark Hamstra 
wrote:

> That's not going to happen on the user list, since that is against ASF
> policy (http://www.apache.org/dev/release.html):
>
> During the process of developing software and preparing a release, various
>> packages are made available to the developer community for testing
>> purposes. Do not include any links on the project website that might
>> encourage non-developers to download and use nightly builds, snapshots,
>> release candidates, or any other similar package. The only people who
>> are supposed to know about such packages are the people following the dev
>> list (or searching its archives) and thus aware of the conditions placed on
>> the package. If you find that the general public are downloading such test
>> packages, then remove them.
>>
>
> On Tue, Aug 9, 2016 at 11:32 AM, Chris Fregly  wrote:
>
>> this is a valid question.  there are many people building products and
>> tooling on top of spark and would like access to the latest snapshots and
>> such.  today's ink is yesterday's news to these people - including myself.
>>
>> what is the best way to get snapshot releases including nightly and
>> specially-blessed "preview" releases so that we, too, can say "try the
>> latest release in our product"?
>>
>> there was a lot of chatter during the 2.0.0/2.0.1 release that i largely
>> ignored because of conflicting/confusing/changing responses.  and i'd
>> rather not dig through jenkins builds to figure this out as i'll likely get
>> it wrong.
>>
>> please provide the relevant snapshot/preview/nightly/whatever repos (or
>> equivalent) that we need to include in our builds to have access to the
>> absolute latest build assets for every major and minor release.
>>
>> thanks!
>>
>> -chris
>>
>>
>> On Tue, Aug 9, 2016 at 10:00 AM, Mich Talebzadeh <
>> mich.talebza...@gmail.com> wrote:
>>
>>> LOL
>>>
>>> Ink has not dried on Spark 2 yet so to speak :)
>>>
>>> Dr Mich Talebzadeh
>>>
>>>
>>>
>>> LinkedIn
>>> https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>>
>>>
>>>
>>> http://talebzadehmich.wordpress.com
>>>
>>>
>>> Disclaimer: Use it at your own risk. Any and all responsibility for
>>> any loss, damage or destruction of data or any other property which may
>>> arise from relying on this email's technical content is explicitly
>>> disclaimed. The author will in no case be liable for any monetary damages
>>> arising from such loss, damage or destruction.
>>>
>>>
>>>
>>> On 9 August 2016 at 17:56, Mark Hamstra  wrote:
>>>
 What are you expecting to find?  There currently are no releases beyond
 Spark 2.0.0.

 On Tue, Aug 9, 2016 at 9:55 AM, Jestin Ma 
 wrote:

> If we want to use versions of Spark beyond the official 2.0.0 release,
> specifically on Maven + Java, what steps should we take to upgrade? I 
> can't
> find the newer versions on Maven central.
>
> Thank you!
> Jestin
>


>>>
>>
>>
>> --
>> Chris Fregly
>> Research Scientist @ PipelineIO
>> San Francisco, CA
>> pipeline.io
>> advancedspark.com
>>
>>
>


-- 
Chris Fregly
Research Scientist @ PipelineIO
San Francisco, CA
pipeline.io
advancedspark.com


Re: Spark 2.0.1 / 2.1.0 on Maven

2016-08-09 Thread Sean Owen
Nightlies are built from master and made available in the ASF
snapshot repo. This is noted at the bottom of the downloads page, and at
https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-NightlyBuilds
. This hasn't changed for as long as I can recall.

Nightlies are not blessed, and are not for consumption other than by
developers. That is, you shouldn't bundle them in a release, and you
shouldn't release a product based on a "2.0.1 snapshot", for example,
because no such ASF release exists. This info isn't meant to be
secret, but it is not made obvious to casual end users for this
reason. Yes, it's for developers who want to test other products in
advance.

So-called preview releases are really just normal releases and are
made available in the usual way. They just have a different name. I
don't know if another one of those will happen; maybe for 3.0.
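For instance, assuming the 2.0.0-preview artifacts from earlier this year
are the kind of thing meant here, they resolve from Maven Central like any
other release (coordinates are my assumption; check Central for the exact
artifacts):

    <!-- sketch: preview releases are ordinary released artifacts -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.0.0-preview</version>
    </dependency>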

The published master snapshot would give you 2.1.0-SNAPSHOT at the
moment. Other branches don't have nightlies, but are likely to be of
less interest.

You can always "mvn -DskipTests install" from a checkout of any branch
to make the branch's SNAPSHOT available in your local Maven repo, or
even publish it to your private repo.
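After such a local install the SNAPSHOT resolves from your local ~/.m2
repository first, so no extra <repositories> entry is needed. A sketch of
the downstream dependency, assuming a master checkout whose poms declare
2.1.0-SNAPSHOT, with spark-sql_2.11 picked purely as an example module:

    <!-- sketch: consume a locally installed branch SNAPSHOT -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>2.1.0-SNAPSHOT</version>
    </dependency>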

On Tue, Aug 9, 2016 at 7:32 PM, Chris Fregly  wrote:
> this is a valid question.  there are many people building products and
> tooling on top of spark and would like access to the latest snapshots and
> such.  today's ink is yesterday's news to these people - including myself.
>
> what is the best way to get snapshot releases including nightly and
> specially-blessed "preview" releases so that we, too, can say "try the
> latest release in our product"?
>
> there was a lot of chatter during the 2.0.0/2.0.1 release that i largely
> ignored because of conflicting/confusing/changing responses.  and i'd rather
> not dig through jenkins builds to figure this out as i'll likely get it
> wrong.
>
> please provide the relevant snapshot/preview/nightly/whatever repos (or
> equivalent) that we need to include in our builds to have access to the
> absolute latest build assets for every major and minor release.
>
> thanks!
>
> -chris
>
>
> On Tue, Aug 9, 2016 at 10:00 AM, Mich Talebzadeh 
> wrote:
>>
>> LOL
>>
>> Ink has not dried on Spark 2 yet so to speak :)
>>
>> Dr Mich Talebzadeh
>>
>>
>>
>> LinkedIn
>> https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>
>>
>>
>> http://talebzadehmich.wordpress.com
>>
>>
>> Disclaimer: Use it at your own risk. Any and all responsibility for any
>> loss, damage or destruction of data or any other property which may arise
>> from relying on this email's technical content is explicitly disclaimed. The
>> author will in no case be liable for any monetary damages arising from such
>> loss, damage or destruction.
>>
>>
>>
>>
>> On 9 August 2016 at 17:56, Mark Hamstra  wrote:
>>>
>>> What are you expecting to find?  There currently are no releases beyond
>>> Spark 2.0.0.
>>>
>>> On Tue, Aug 9, 2016 at 9:55 AM, Jestin Ma 
>>> wrote:

 If we want to use versions of Spark beyond the official 2.0.0 release,
 specifically on Maven + Java, what steps should we take to upgrade? I can't
 find the newer versions on Maven central.

 Thank you!
 Jestin
>>>
>>>
>>
>
>
>
> --
> Chris Fregly
> Research Scientist @ PipelineIO
> San Francisco, CA
> pipeline.io
> advancedspark.com
>




Re: Spark 2.0.1 / 2.1.0 on Maven

2016-08-09 Thread Mark Hamstra
That's not going to happen on the user list, since that is against ASF
policy (http://www.apache.org/dev/release.html):

During the process of developing software and preparing a release, various
> packages are made available to the developer community for testing
> purposes. Do not include any links on the project website that might
> encourage non-developers to download and use nightly builds, snapshots,
> release candidates, or any other similar package. The only people who are
> supposed to know about such packages are the people following the dev list
> (or searching its archives) and thus aware of the conditions placed on the
> package. If you find that the general public are downloading such test
> packages, then remove them.
>

On Tue, Aug 9, 2016 at 11:32 AM, Chris Fregly  wrote:

> this is a valid question.  there are many people building products and
> tooling on top of spark and would like access to the latest snapshots and
> such.  today's ink is yesterday's news to these people - including myself.
>
> what is the best way to get snapshot releases including nightly and
> specially-blessed "preview" releases so that we, too, can say "try the
> latest release in our product"?
>
> there was a lot of chatter during the 2.0.0/2.0.1 release that i largely
> ignored because of conflicting/confusing/changing responses.  and i'd
> rather not dig through jenkins builds to figure this out as i'll likely get
> it wrong.
>
> please provide the relevant snapshot/preview/nightly/whatever repos (or
> equivalent) that we need to include in our builds to have access to the
> absolute latest build assets for every major and minor release.
>
> thanks!
>
> -chris
>
>
> On Tue, Aug 9, 2016 at 10:00 AM, Mich Talebzadeh <
> mich.talebza...@gmail.com> wrote:
>
>> LOL
>>
>> Ink has not dried on Spark 2 yet so to speak :)
>>
>> Dr Mich Talebzadeh
>>
>>
>>
>> LinkedIn
>> https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>
>>
>>
>> http://talebzadehmich.wordpress.com
>>
>>
>> Disclaimer: Use it at your own risk. Any and all responsibility for
>> any loss, damage or destruction of data or any other property which may
>> arise from relying on this email's technical content is explicitly
>> disclaimed. The author will in no case be liable for any monetary damages
>> arising from such loss, damage or destruction.
>>
>>
>>
>> On 9 August 2016 at 17:56, Mark Hamstra  wrote:
>>
>>> What are you expecting to find?  There currently are no releases beyond
>>> Spark 2.0.0.
>>>
>>> On Tue, Aug 9, 2016 at 9:55 AM, Jestin Ma 
>>> wrote:
>>>
 If we want to use versions of Spark beyond the official 2.0.0 release,
 specifically on Maven + Java, what steps should we take to upgrade? I can't
 find the newer versions on Maven central.

 Thank you!
 Jestin

>>>
>>>
>>
>
>
> --
> Chris Fregly
> Research Scientist @ PipelineIO
> San Francisco, CA
> pipeline.io
> advancedspark.com
>
>


Re: Spark 2.0.1 / 2.1.0 on Maven

2016-08-09 Thread Chris Fregly
this is a valid question.  there are many people building products and
tooling on top of spark who would like access to the latest snapshots and
such.  today's ink is yesterday's news to these people - including myself.

what is the best way to get snapshot releases including nightly and
specially-blessed "preview" releases so that we, too, can say "try the
latest release in our product"?

there was a lot of chatter during the 2.0.0/2.0.1 release that i largely
ignored because of conflicting/confusing/changing responses.  and i'd
rather not dig through jenkins builds to figure this out as i'll likely get
it wrong.

please provide the relevant snapshot/preview/nightly/whatever repos (or
equivalent) that we need to include in our builds to have access to the
absolute latest build assets for every major and minor release.

thanks!

-chris


On Tue, Aug 9, 2016 at 10:00 AM, Mich Talebzadeh 
wrote:

> LOL
>
> Ink has not dried on Spark 2 yet so to speak :)
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn
> https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
> Disclaimer: Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
> On 9 August 2016 at 17:56, Mark Hamstra  wrote:
>
>> What are you expecting to find?  There currently are no releases beyond
>> Spark 2.0.0.
>>
>> On Tue, Aug 9, 2016 at 9:55 AM, Jestin Ma 
>> wrote:
>>
>>> If we want to use versions of Spark beyond the official 2.0.0 release,
>>> specifically on Maven + Java, what steps should we take to upgrade? I can't
>>> find the newer versions on Maven central.
>>>
>>> Thank you!
>>> Jestin
>>>
>>
>>
>


-- 
Chris Fregly
Research Scientist @ PipelineIO
San Francisco, CA
pipeline.io
advancedspark.com


Re: Spark 2.0.1 / 2.1.0 on Maven

2016-08-09 Thread Mich Talebzadeh
LOL

The ink has not dried on Spark 2 yet, so to speak :)

Dr Mich Talebzadeh



LinkedIn
https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 9 August 2016 at 17:56, Mark Hamstra  wrote:

> What are you expecting to find?  There currently are no releases beyond
> Spark 2.0.0.
>
> On Tue, Aug 9, 2016 at 9:55 AM, Jestin Ma 
> wrote:
>
>> If we want to use versions of Spark beyond the official 2.0.0 release,
>> specifically on Maven + Java, what steps should we take to upgrade? I can't
>> find the newer versions on Maven central.
>>
>> Thank you!
>> Jestin
>>
>
>


Re: Spark 2.0.1 / 2.1.0 on Maven

2016-08-09 Thread Mark Hamstra
What are you expecting to find?  There currently are no releases beyond
Spark 2.0.0.

On Tue, Aug 9, 2016 at 9:55 AM, Jestin Ma  wrote:

> If we want to use versions of Spark beyond the official 2.0.0 release,
> specifically on Maven + Java, what steps should we take to upgrade? I can't
> find the newer versions on Maven central.
>
> Thank you!
> Jestin
>