Re: With 2.2.0 PySpark is now available for pip install from PyPI :)
Awesome!

On Thu, Jul 13, 2017 at 8:48 AM, Hyukjin Kwon wrote:
> Cool!
Re: With 2.2.0 PySpark is now available for pip install from PyPI :)
Cool!

2017-07-13 9:43 GMT+09:00 Denny Lee:
> This is amazingly awesome! :)
Re: With 2.2.0 PySpark is now available for pip install from PyPI :)
This is amazingly awesome! :)

On Wed, Jul 12, 2017 at 13:23, lucas.g...@gmail.com wrote:
> That's great!
Re: With 2.2.0 PySpark is now available for pip install from PyPI :)
That's great!

On 12 July 2017 at 12:41, Felix Cheung wrote:
> Awesome! Congrats!!
Re: With 2.2.0 PySpark is now available for pip install from PyPI :)
Awesome! Congrats!!

From: holden.ka...@gmail.com on behalf of Holden Karau
Sent: Wednesday, July 12, 2017 12:26:00 PM
To: user@spark.apache.org
Subject: With 2.2.0 PySpark is now available for pip install from PyPI :)

Hi wonderful Python + Spark folks,

I'm excited to announce that with Spark 2.2.0 we finally have PySpark published on PyPI (see https://pypi.python.org/pypi/pyspark / https://twitter.com/holdenkarau/status/885207416173756417). This has been a long time coming (previous releases included pip-installable artifacts that, for a variety of reasons, couldn't be published to PyPI). So if you (or your friends) want to work with PySpark locally on your laptop, you now have an easier path to getting started: pip install pyspark.

If you are setting up a standalone cluster, the cluster will still need the "full" Spark packaging, but the pip-installed PySpark should be able to work with YARN or an existing standalone cluster installation (of the same version).

Happy Sparking, y'all!

Holden :)

--
Cell: 425-233-8271
Twitter: https://twitter.com/holdenkarau
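For anyone trying this out, here is a minimal sketch of the new workflow. The package name and pip workflow come straight from the announcement; the session setup uses the standard PySpark 2.x SparkSession API, and the app name and master host below are hypothetical placeholders. First install from PyPI:

    $ pip install pyspark

Then, in Python:

    from pyspark.sql import SparkSession

    # Local mode: runs on the Spark bundled with the pip package,
    # no cluster required.
    spark = (SparkSession.builder
             .master("local[*]")            # use all local cores
             .appName("pyspark-pip-demo")   # hypothetical app name
             .getOrCreate())

    # Quick smoke test: count a generated range of 100 rows.
    print(spark.range(100).count())  # prints 100

    # To run against an existing standalone cluster of the *same* Spark
    # version, point master at it instead (hypothetical host/port):
    #   .master("spark://my-master-host:7077")

    spark.stop()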