+1, excited for 2.0!

On Wed, May 18, 2016 at 10:06 AM, Krishna Sankar <ksanka...@gmail.com>
wrote:

> +1. Looks Good.
> The mllib results are in line with 1.6.1, with deprecation messages. I will
> convert to ml and test later in the day.
> Also will try the GraphX exercises for our Strata London Tutorial.
>
> Quick Notes:
>
>    1. pyspark env variables need to be changed:
>       - IPYTHON and IPYTHON_OPTS are removed.
>       - This works:
>            PYSPARK_DRIVER_PYTHON=ipython \
>            PYSPARK_DRIVER_PYTHON_OPTS="notebook" \
>            ~/Downloads/spark-2.0.0-preview/bin/pyspark --packages \
>            com.databricks:spark-csv_2.10:1.4.0
>    2. Maven 3.3.9 is required. (I was running 3.3.3.)
>    3. Tons of interesting warnings and deprecations.
>       - The messages look descriptive and very helpful (Thanks. This will
>         help migration to 2.0, mllib -> ml et al; a short sketch follows
>         below these notes). Will dig deeper.
>    4. Compiled OK on OS X 10.10 (Yosemite). Total time: 31:28 min.
>            mvn clean package -Pyarn -Phadoop-2.6 -DskipTests
>       - Spark version is 2.0.0-preview.
>       - Tested pyspark, mllib (IPython 4.2.0).
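>
>    A minimal sketch of the kind of mllib -> ml change noted in 3 above
>    (the classifier choice and the labeled_points_rdd / training_df names
>    are placeholders, not from my actual tests):
>
>       # old RDD-based API (pyspark.mllib), source of the deprecation messages
>       from pyspark.mllib.classification import LogisticRegressionWithLBFGS
>       old_model = LogisticRegressionWithLBFGS.train(labeled_points_rdd)
>
>       # new DataFrame-based API (pyspark.ml)
>       from pyspark.ml.classification import LogisticRegression
>       lr = LogisticRegression(maxIter=10, regParam=0.01)
>       new_model = lr.fit(training_df)  # DataFrame with "label"/"features" columns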
>
> Cheers & Good work folks
> <k/>
>
> On Wed, May 18, 2016 at 7:28 AM, Sean Owen <so...@cloudera.com> wrote:
>
>> I think it's a good idea. Although releases have previously been preceded
>> by release candidates for developers, it would be good to get a formal
>> preview/beta release ratified for public consumption ahead of a new
>> major release. Better to have a little more testing in the wild to
>> identify problems before 2.0.0 is finalized.
>>
>> +1 to the release. License, sigs, etc. check out. On Ubuntu 16 + Java
>> 8, compilation and tests succeed for "-Pyarn -Phive
>> -Phive-thriftserver -Phadoop-2.6".
>>
>> On Wed, May 18, 2016 at 6:40 AM, Reynold Xin <r...@apache.org> wrote:
>> > Hi,
>> >
>> > In the past, the Apache Spark community has created preview packages
>> > (not official releases) and used them as opportunities to ask community
>> > members to test upcoming versions of Apache Spark. Several people in
>> > the Apache community have suggested we conduct votes for these preview
>> > packages and turn them into formal releases that meet the Apache
>> > Foundation's standards. Preview releases are not meant to be fully
>> > functional or stable, i.e. they can and highly likely will contain
>> > critical bugs or documentation errors, but we will be able to post them
>> > on the project's website to get wider feedback. They should satisfy the
>> > legal requirements of Apache's release policy
>> > (http://www.apache.org/dev/release.html), such as having proper
>> > licenses.
>> >
>> >
>> > Please vote on releasing the following candidate as Apache Spark version
>> > 2.0.0-preview. The vote is open until Friday, May 20, 2016 at 11:00 PM
>> > PDT and passes if a majority of at least 3 +1 PMC votes are cast.
>> >
>> > [ ] +1 Release this package as Apache Spark 2.0.0-preview
>> > [ ] -1 Do not release this package because ...
>> >
>> > To learn more about Apache Spark, please see http://spark.apache.org/
>> >
>> > The tag to be voted on is 2.0.0-preview
>> > (8f5a04b6299e3a47aca13cbb40e72344c0114860)
>> >
>> > The release files, including signatures, digests, etc., can be found at:
>> > http://home.apache.org/~pwendell/spark-releases/spark-2.0.0-preview-bin/
>> >
>> > Release artifacts are signed with the following key:
>> > https://people.apache.org/keys/committer/pwendell.asc
>> >
>> > The documentation corresponding to this release can be found at:
>> > http://home.apache.org/~pwendell/spark-releases/spark-2.0.0-preview-docs/
>> >
>> > The list of resolved issues is at:
>> > https://issues.apache.org/jira/browse/SPARK-15351?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.0.0
>> >
>> >
>> > If you are a Spark user, you can help us test this release by taking an
>> > existing Apache Spark workload, running it on this candidate, and then
>> > reporting any regressions.
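>> >
>> > For example, a minimal smoke test of this kind might look like the
>> > sketch below (the input path and job are placeholders; substitute your
>> > own workload and run it with the preview's bin/spark-submit):
>> >
>> >     # wordcount_check.py -- run with:
>> >     #   spark-2.0.0-preview/bin/spark-submit wordcount_check.py
>> >     from pyspark import SparkContext
>> >
>> >     sc = SparkContext(appName="preview-smoke-test")
>> >     print("Spark version: " + sc.version)  # expect 2.0.0-preview
>> >     counts = (sc.textFile("data/sample.txt")  # placeholder input file
>> >               .flatMap(lambda line: line.split())
>> >               .map(lambda w: (w, 1))
>> >               .reduceByKey(lambda a, b: a + b))
>> >     print(counts.take(5))  # compare against results from your current version
>> >     sc.stop()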
>> >
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>> For additional commands, e-mail: dev-h...@spark.apache.org
>>
>>
>
