Re: With 2.2.0 PySpark is now available for pip install from PyPI :)

2017-07-12 Thread Jeff Zhang
Awesome!

On Thu, Jul 13, 2017 at 8:48 AM, Hyukjin Kwon <gurwls...@gmail.com> wrote:

> Cool!


Re: With 2.2.0 PySpark is now available for pip install from PyPI :)

2017-07-12 Thread Hyukjin Kwon
Cool!

On Thu, Jul 13, 2017 at 9:43 GMT+09:00, Denny Lee <denny.g@gmail.com> wrote:

> This is amazingly awesome! :)


Re: With 2.2.0 PySpark is now available for pip install from PyPI :)

2017-07-12 Thread Denny Lee
This is amazingly awesome! :)

On Wed, Jul 12, 2017 at 13:23, lucas.g...@gmail.com wrote:

> That's great!


Re: With 2.2.0 PySpark is now available for pip install from PyPI :)

2017-07-12 Thread lucas.g...@gmail.com
That's great!

On 12 July 2017 at 12:41, Felix Cheung <felixcheun...@hotmail.com> wrote:

> Awesome! Congrats!!


Re: With 2.2.0 PySpark is now available for pip install from PyPI :)

2017-07-12 Thread Felix Cheung
Awesome! Congrats!!


From: holden.ka...@gmail.com <holden.ka...@gmail.com> on behalf of Holden Karau <hol...@pigscanfly.ca>
Sent: Wednesday, July 12, 2017 12:26:00 PM
To: user@spark.apache.org
Subject: With 2.2.0 PySpark is now available for pip install from PyPI :)



With 2.2.0 PySpark is now available for pip install from PyPI :)

2017-07-12 Thread Holden Karau
Hi wonderful Python + Spark folks,

I'm excited to announce that with Spark 2.2.0 we finally have PySpark
published on PyPI (see https://pypi.python.org/pypi/pyspark /
https://twitter.com/holdenkarau/status/885207416173756417). This has been a
long time coming (previous releases included pip-installable artifacts that,
for a variety of reasons, couldn't be published to PyPI). So if you (or your
friends) want to work with PySpark locally on your laptop, you've got an
easier path to getting started (pip install pyspark).
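
For instance, once it's installed, a local session is only a few lines of
Python. This is a minimal sketch, not gospel: it assumes Java is available
on the machine (Spark still runs on the JVM), and the app name is made up.

    # pip install pyspark
    from pyspark.sql import SparkSession

    # Spin up a throwaway local "cluster" using all cores on the laptop.
    spark = (SparkSession.builder
             .master("local[*]")
             .appName("pip-installed-pyspark")   # any name works here
             .getOrCreate())

    # Quick smoke test: a tiny DataFrame, just to prove the install works.
    df = spark.createDataFrame([(1, "spark"), (2, "pyspark")], ["id", "name"])
    df.show()

    spark.stop()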

If you are setting up a standalone cluster, your cluster will still need the
"full" Spark packaging, but the pip-installed PySpark should be able to
work with YARN or an existing standalone cluster installation (of the same
version).
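
The idea looks something like the sketch below; the master URL is a
placeholder for your own standalone master, and for YARN you'd instead use
.master("yarn") with HADOOP_CONF_DIR (or YARN_CONF_DIR) pointing at your
cluster's configs.

    from pyspark.sql import SparkSession

    # "spark://master-host:7077" is a placeholder standalone master URL.
    spark = (SparkSession.builder
             .master("spark://master-host:7077")
             .appName("pip-pyspark-on-cluster")
             .getOrCreate())

    # The driver-side Spark version; it should match the cluster's.
    print(spark.version)

    spark.stop()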

Happy Sparking y'all!

Holden :)


-- 
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau