Re: Welcoming Tejas Patil as a Spark committer

2017-10-03 Thread Takuya UESHIN
Congratulations!


On Tue, Oct 3, 2017 at 2:47 AM, Tejas Patil 
wrote:

> Thanks everyone !!! It's a great privilege to be part of the Spark
> community.
>
> ~tejasp
>
> On Sat, Sep 30, 2017 at 2:27 PM, Jacek Laskowski  wrote:
>
>> Hi,
>>
>> Oh, yeah. Seen Tejas here and there in the commits. Well deserved.
>>
>> Jacek
>>
>> On 29 Sep 2017 9:58 pm, "Matei Zaharia"  wrote:
>>
>> Hi all,
>>
>> The Spark PMC recently added Tejas Patil as a committer on the
>> project. Tejas has been contributing across several areas of Spark for
>> a while, focusing especially on scalability issues and SQL. Please
>> join me in welcoming Tejas!
>>
>> Matei
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>
>>
>


-- 
Takuya UESHIN
Tokyo, Japan

http://twitter.com/ueshin


Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Marcelo Vanzin
Maybe you're running as root (or the admin account on your OS)?
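
For context, a minimal shell sketch of why running as root breaks that test
(paths are hypothetical; the suite creates an unreadable log directory and
expects the history provider to skip it):

# as an ordinary user, a directory with mode 000 cannot be listed
mkdir -p /tmp/spark-events/noread && chmod 000 /tmp/spark-events/noread
ls /tmp/spark-events/noread        # expected: "Permission denied"
# as root the same listing succeeds, so the provider sees one extra log
# directory and the suite's count check (2 instead of 1) fails
sudo ls /tmp/spark-events/noread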

On Tue, Oct 3, 2017 at 12:12 PM, Nick Pentreath
 wrote:
> Hmm I'm consistently getting this error in core tests:
>
> - SPARK-3697: ignore directories that cannot be read. *** FAILED ***
>   2 was not equal to 1 (FsHistoryProviderSuite.scala:146)
>
>
> Anyone else? Any insight? Perhaps it's my setup.
>
>>>
>>>
>>> On Tue, Oct 3, 2017 at 7:24 AM Holden Karau  wrote:

 Please vote on releasing the following candidate as Apache Spark version
 2.1.2. The vote is open until Saturday October 7th at 9:00 PST and passes if
 a majority of at least 3 +1 PMC votes are cast.

 [ ] +1 Release this package as Apache Spark 2.1.2
 [ ] -1 Do not release this package because ...


 To learn more about Apache Spark, please see https://spark.apache.org/

 The tag to be voted on is v2.1.2-rc4
 (2abaea9e40fce81cd4626498e0f5c28a70917499)

 List of JIRA tickets resolved in this release can be found with this
 filter.

 The release files, including signatures, digests, etc. can be found at:
 https://home.apache.org/~holden/spark-2.1.2-rc4-bin/

 Release artifacts are signed with a key from:
 https://people.apache.org/~holden/holdens_keys.asc

 The staging repository for this release can be found at:
 https://repository.apache.org/content/repositories/orgapachespark-1252

 The documentation corresponding to this release can be found at:
 https://people.apache.org/~holden/spark-2.1.2-rc4-docs/


 FAQ

 How can I help test this release?

 If you are a Spark user, you can help us test this release by taking an
 existing Spark workload and running on this release candidate, then
 reporting any regressions.

 If you're working in PySpark you can set up a virtual env and install
 the current RC and see if anything important breaks; in Java/Scala you
 can add the staging repository to your project's resolvers and test with the
 RC (make sure to clean up the artifact cache before/after so you don't end
 up building with an out-of-date RC going forward).

 What should happen to JIRA tickets still targeting 2.1.2?

 Committers should look at those and triage. Extremely important bug
 fixes, documentation, and API tweaks that impact compatibility should be
 worked on immediately. Everything else please retarget to 2.1.3.

 But my bug isn't fixed!??!

 In order to make timely releases, we will typically not hold the release
 unless the bug in question is a regression from 2.1.1. That being said if
 there is something which is a regression from 2.1.1 that has not been
 correctly targeted please ping a committer to help target the issue (you can
 see the open issues listed as impacting Spark 2.1.1 & 2.1.2)

 What are the unresolved issues targeted for 2.1.2?

 At this time there are no open unresolved issues.

 Is there anything different about this release?

 This is the first release in a while not built on the AMPLab Jenkins.
 This is good because it means future releases can more easily be built and
 signed securely (and I've been updating the documentation in
 https://github.com/apache/spark-website/pull/66 as I progress), however the
 chances of a mistake are higher with any change like this. If there is
 something you normally take for granted as correct when checking a release,
 please double check this time :)

 Should I be committing code to branch-2.1?

 Thanks for asking! Please treat this stage in the RC process as "code
 freeze" so bug fixes only. If you're uncertain if something should be back
 ported please reach out. If you do commit to branch-2.1 please tag your JIRA
 issue fix version for 2.1.3 and if we cut another RC I'll move the 2.1.3
 fixes into 2.1.2 as appropriate.

 What happened to RC3?

 Some R+zinc interactions kept it from getting out the door.
 --
 Twitter: https://twitter.com/holdenkarau
>>
>>
>



-- 
Marcelo

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Nick Pentreath
Hmm I'm consistently getting this error in core tests:

- SPARK-3697: ignore directories that cannot be read. *** FAILED ***
  2 was not equal to 1 (FsHistoryProviderSuite.scala:146)


Anyone else? Any insight? Perhaps it's my setup.


>>
>> On Tue, Oct 3, 2017 at 7:24 AM Holden Karau  wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark
>>> version 2.1.2. The vote is open until Saturday October 7th at 9:00
>>> PST and passes if a majority of at least 3 +1 PMC votes are cast.
>>>
>>> [ ] +1 Release this package as Apache Spark 2.1.2
>>> [ ] -1 Do not release this package because ...
>>>
>>>
>>> To learn more about Apache Spark, please see https://spark.apache.org/
>>>
>>> The tag to be voted on is v2.1.2-rc4
>>>  (
>>> 2abaea9e40fce81cd4626498e0f5c28a70917499)
>>>
>>> List of JIRA tickets resolved in this release can be found with this
>>> filter.
>>> 
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> https://home.apache.org/~holden/spark-2.1.2-rc4-bin/
>>>
>>> Release artifacts are signed with a key from:
>>> https://people.apache.org/~holden/holdens_keys.asc
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapachespark-1252
>>>
>>> The documentation corresponding to this release can be found at:
>>> https://people.apache.org/~holden/spark-2.1.2-rc4-docs/
>>>
>>>
>>> *FAQ*
>>>
>>> *How can I help test this release?*
>>>
>>> If you are a Spark user, you can help us test this release by taking an
>>> existing Spark workload and running on this release candidate, then
>>> reporting any regressions.
>>>
>>> If you're working in PySpark you can set up a virtual env and install
>>> the current RC and see if anything important breaks; in Java/Scala
>>> you can add the staging repository to your project's resolvers and test with
>>> the RC (make sure to clean up the artifact cache before/after so you
>>> don't end up building with an out-of-date RC going forward).
>>>
>>> *What should happen to JIRA tickets still targeting 2.1.2?*
>>>
>>> Committers should look at those and triage. Extremely important bug
>>> fixes, documentation, and API tweaks that impact compatibility should be
>>> worked on immediately. Everything else please retarget to 2.1.3.
>>>
>>> *But my bug isn't fixed!??!*
>>>
>>> In order to make timely releases, we will typically not hold the release
>>> unless the bug in question is a regression from 2.1.1. That being said
>>> if there is something which is a regression from 2.1.1 that has not
>>> been correctly targeted please ping a committer to help target the issue
>>> (you can see the open issues listed as impacting Spark 2.1.1 & 2.1.2
>>> 
>>> )
>>>
>>> *What are the unresolved* issues targeted for 2.1.2
>>> 
>>> ?
>>>
>>> At this time there are no open unresolved issues.
>>>
>>> *Is there anything different about this release?*
>>>
>>> This is the first release in a while not built on the AMPLab Jenkins.
>>> This is good because it means future releases can more easily be built and
>>> signed securely (and I've been updating the documentation in
>>> https://github.com/apache/spark-website/pull/66 as I progress), however
>>> the chances of a mistake are higher with any change like this. If there is
>>> something you normally take for granted as correct when checking a release,
>>> please double check this time :)
>>>
>>> *Should I be committing code to branch-2.1?*
>>>
>>> Thanks for asking! Please treat this stage in the RC process as "code
>>> freeze" so bug fixes only. If you're uncertain if something should be back
>>> ported please reach out. If you do commit to branch-2.1 please tag your
>>> JIRA issue fix version for 2.1.3 and if we cut another RC I'll move the
>>> 2.1.3 fixes into 2.1.2 as appropriate.
>>>
>>> *What happened to RC3?*
>>>
>>> Some R+zinc interactions kept it from getting out the door.
>>> --
>>> Twitter: https://twitter.com/holdenkarau
>>>
>>
>


Re: Welcoming Tejas Patil as a Spark committer

2017-10-03 Thread Dilip Biswal
Congratulations, Tejas!
 
-- Dilip
 
 
- Original message -
From: Suresh Thalamati
To: "dev@spark.apache.org"
Cc:
Subject: Re: Welcoming Tejas Patil as a Spark committer
Date: Tue, Oct 3, 2017 12:01 PM

Congratulations, Tejas!

-suresh

> On Sep 29, 2017, at 12:58 PM, Matei Zaharia  wrote:
>
> Hi all,
>
> The Spark PMC recently added Tejas Patil as a committer on the
> project. Tejas has been contributing across several areas of Spark for
> a while, focusing especially on scalability issues and SQL. Please
> join me in welcoming Tejas!
>
> Matei
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
 


-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: Welcoming Tejas Patil as a Spark committer

2017-10-03 Thread Suresh Thalamati
Congratulations, Tejas!

-suresh

> On Sep 29, 2017, at 12:58 PM, Matei Zaharia  wrote:
> 
> Hi all,
> 
> The Spark PMC recently added Tejas Patil as a committer on the
> project. Tejas has been contributing across several areas of Spark for
> a while, focusing especially on scalability issues and SQL. Please
> join me in welcoming Tejas!
> 
> Matei
> 
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> 


-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Ryan Blue
+1

Verified checksums and signatures for the archives in home.apache.org, spot
checked the same for artifacts in Nexus.
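
For anyone repeating that check, a minimal sketch (the archive name below is
illustrative; use whichever file you downloaded, and compare against the
digest files published in the same directory):

# import the release manager's public keys
curl -s https://people.apache.org/~holden/holdens_keys.asc | gpg --import
# verify the detached signature of a downloaded archive
wget https://home.apache.org/~holden/spark-2.1.2-rc4-bin/spark-2.1.2-bin-hadoop2.7.tgz
wget https://home.apache.org/~holden/spark-2.1.2-rc4-bin/spark-2.1.2-bin-hadoop2.7.tgz.asc
gpg --verify spark-2.1.2-bin-hadoop2.7.tgz.asc spark-2.1.2-bin-hadoop2.7.tgz
# recompute a checksum locally and compare it with the published digest
shasum -a 512 spark-2.1.2-bin-hadoop2.7.tgz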

On Tue, Oct 3, 2017 at 8:06 AM, Wenchen Fan  wrote:

> +1
>
> On Tue, Oct 3, 2017 at 11:00 PM, Kazuaki Ishizaki 
> wrote:
>
>> +1 (non-binding)
>>
>> I tested it on Ubuntu 16.04 and OpenJDK8 on ppc64le. All of the tests for
>> core/sql-core/sql-catalyst/mllib/mllib-local have passed.
>>
>> $ java -version
>> openjdk version "1.8.0_131"
>> OpenJDK Runtime Environment (build 1.8.0_131-8u131-b11-2ubuntu1.16.04.3-b11)
>> OpenJDK 64-Bit Server VM (build 25.131-b11, mixed mode)
>>
>> % build/mvn -DskipTests -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 -T
>> 24 clean package install
>> % build/mvn -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 test -pl core
>> -pl 'sql/core' -pl 'sql/catalyst' -pl mllib -pl mllib-local
>> ...
>> Run completed in 12 minutes, 19 seconds.
>> Total number of tests run: 1035
>> Suites: completed 166, aborted 0
>> Tests: succeeded 1035, failed 0, canceled 0, ignored 5, pending 0
>> All tests passed.
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Reactor Summary:
>> [INFO]
>> [INFO] Spark Project Core ................................. SUCCESS [17:13 min]
>> [INFO] Spark Project ML Local Library ..................... SUCCESS [  5.759 s]
>> [INFO] Spark Project Catalyst ............................. SUCCESS [09:48 min]
>> [INFO] Spark Project SQL .................................. SUCCESS [12:01 min]
>> [INFO] Spark Project ML Library ........................... SUCCESS [15:16 min]
>> [INFO] ------------------------------------------------------------------------
>> [INFO] BUILD SUCCESS
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Total time: 54:28 min
>> [INFO] Finished at: 2017-10-03T23:53:33+09:00
>> [INFO] Final Memory: 112M/322M
>> [INFO] ------------------------------------------------------------------------
>> [WARNING] The requested profile "hive" could not be activated because it
>> does not exist.
>>
>> Kazuaki Ishizaki
>>
>>
>>
>>
>> From:Dongjoon Hyun 
>> To:Spark dev list 
>> Date:2017/10/03 23:23
>> Subject:Re: [VOTE] Spark 2.1.2 (RC4)
>> --
>>
>>
>>
>> +1 (non-binding)
>>
>> Dongjoon.
>>
>> On Tue, Oct 3, 2017 at 5:13 AM, Herman van Hövell tot Westerflier <
>> *hvanhov...@databricks.com* > wrote:
>> +1
>>
>> On Tue, Oct 3, 2017 at 1:32 PM, Sean Owen <*so...@cloudera.com*
>> > wrote:
>> +1 same as last RC. Tests pass, sigs and hashes are OK.
>>
>> On Tue, Oct 3, 2017 at 7:24 AM Holden Karau <*hol...@pigscanfly.ca*
>> > wrote:
>> Please vote on releasing the following candidate as Apache Spark
>> version 2.1.2. The vote is open until Saturday October 7th at 9:00 PST and
>> passes if a majority of at least 3 +1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 2.1.2
>> [ ] -1 Do not release this package because ...
>>
>>
>> To learn more about Apache Spark, please see *https://spark.apache.org/*
>> 
>>
>> The tag to be voted on is *v2.1.2-rc4*
>> 
>>  (2abaea9e40fce81cd4626498e0f5c28a70917499)
>>
>> List of JIRA tickets resolved in this release can be found *with this
>> filter.*
>> 
>>
>> The release files, including signatures, digests, etc. can be found at:
>> *https://home.apache.org/~holden/spark-2.1.2-rc4-bin/*
>> 
>>
>> Release artifacts are signed with a key from:
>> *https://people.apache.org/~holden/holdens_keys.asc*
>> 

Re: Configuration docs pages are broken

2017-10-03 Thread Sean Owen
I think this was fixed in https://issues.apache.org/jira/browse/SPARK-21593 but
the docs only go out with releases. It will be fixed when 2.2.1 goes out.

On Tue, Oct 3, 2017 at 5:53 PM Nick Dimiduk  wrote:

> Heya,
>
> Looks like the Configuration sections of your docs, both latest [0], and
> 2.1 [1] are broken. The last couple sections are smashed into a single
> unrendered paragraph of markdown at the bottom.
>
> Thanks,
> Nick
>
> [0]: https://spark.apache.org/docs/latest/configuration.html
> [1]: https://spark.apache.org/docs/2.1.0/configuration.html
>


Re: Configuration docs pages are broken

2017-10-03 Thread Reynold Xin
Interested in submitting a patch to fix them?
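
(For reference, that page is generated from docs/configuration.md in the main
repo; a minimal sketch of previewing a fix locally, assuming Ruby and Jekyll
are installed:)

# hypothetical local preview of the docs build
git clone https://github.com/apache/spark.git && cd spark/docs
# edit configuration.md, then build without the API docs for a quick check
SKIP_API=1 jekyll serve --watch
# open http://localhost:4000/configuration.html and confirm the last sections render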

On Tue, Oct 3, 2017 at 9:53 AM Nick Dimiduk  wrote:

> Heya,
>
> Looks like the Configuration sections of your docs, both latest [0], and
> 2.1 [1] are broken. The last couple sections are smashed into a single
> unrendered paragraph of markdown at the bottom.
>
> Thanks,
> Nick
>
> [0]: https://spark.apache.org/docs/latest/configuration.html
> [1]: https://spark.apache.org/docs/2.1.0/configuration.html
>


Configuration docs pages are broken

2017-10-03 Thread Nick Dimiduk
Heya,

Looks like the Configuration sections of your docs, both latest [0], and
2.1 [1] are broken. The last couple sections are smashed into a single
unrendered paragraph of markdown at the bottom.

Thanks,
Nick

[0]: https://spark.apache.org/docs/latest/configuration.html
[1]: https://spark.apache.org/docs/2.1.0/configuration.html


Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Wenchen Fan
+1

On Tue, Oct 3, 2017 at 11:00 PM, Kazuaki Ishizaki 
wrote:

> +1 (non-binding)
>
> I tested it on Ubuntu 16.04 and OpenJDK8 on ppc64le. All of the tests for
> core/sql-core/sql-catalyst/mllib/mllib-local have passed.
>
> $ java -version
> openjdk version "1.8.0_131"
> OpenJDK Runtime Environment (build 1.8.0_131-8u131-b11-2ubuntu1.16.04.3-b11)
> OpenJDK 64-Bit Server VM (build 25.131-b11, mixed mode)
>
> % build/mvn -DskipTests -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 -T
> 24 clean package install
> % build/mvn -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 test -pl core
> -pl 'sql/core' -pl 'sql/catalyst' -pl mllib -pl mllib-local
> ...
> Run completed in 12 minutes, 19 seconds.
> Total number of tests run: 1035
> Suites: completed 166, aborted 0
> Tests: succeeded 1035, failed 0, canceled 0, ignored 5, pending 0
> All tests passed.
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Spark Project Core ................................. SUCCESS [17:13 min]
> [INFO] Spark Project ML Local Library ..................... SUCCESS [  5.759 s]
> [INFO] Spark Project Catalyst ............................. SUCCESS [09:48 min]
> [INFO] Spark Project SQL .................................. SUCCESS [12:01 min]
> [INFO] Spark Project ML Library ........................... SUCCESS [15:16 min]
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD SUCCESS
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 54:28 min
> [INFO] Finished at: 2017-10-03T23:53:33+09:00
> [INFO] Final Memory: 112M/322M
> [INFO] ------------------------------------------------------------------------
> [WARNING] The requested profile "hive" could not be activated because it
> does not exist.
>
> Kazuaki Ishizaki
>
>
>
>
> From:Dongjoon Hyun 
> To:Spark dev list 
> Date:2017/10/03 23:23
> Subject:Re: [VOTE] Spark 2.1.2 (RC4)
> --
>
>
>
> +1 (non-binding)
>
> Dongjoon.
>
> On Tue, Oct 3, 2017 at 5:13 AM, Herman van Hövell tot Westerflier <
> *hvanhov...@databricks.com* > wrote:
> +1
>
> On Tue, Oct 3, 2017 at 1:32 PM, Sean Owen <*so...@cloudera.com*
> > wrote:
> +1 same as last RC. Tests pass, sigs and hashes are OK.
>
> On Tue, Oct 3, 2017 at 7:24 AM Holden Karau <*hol...@pigscanfly.ca*
> > wrote:
> Please vote on releasing the following candidate as Apache Spark
> version 2.1.2. The vote is open until Saturday October 7th at 9:00 PST and
> passes if a majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.1.2
> [ ] -1 Do not release this package because ...
>
>
> To learn more about Apache Spark, please see *https://spark.apache.org/*
> 
>
> The tag to be voted on is *v2.1.2-rc4*
> 
>  (2abaea9e40fce81cd4626498e0f5c28a70917499)
>
> List of JIRA tickets resolved in this release can be found *with this
> filter.*
> 
>
> The release files, including signatures, digests, etc. can be found at:
> *https://home.apache.org/~holden/spark-2.1.2-rc4-bin/*
> 
>
> Release artifacts are signed with a key from:
> *https://people.apache.org/~holden/holdens_keys.asc*
> 
>
> The staging repository for this release can be found at:
> *https://repository.apache.org/content/repositories/orgapachespark-1252*
> 

Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Kazuaki Ishizaki
+1 (non-binding)

I tested it on Ubuntu 16.04 and OpenJDK8 on ppc64le. All of the tests for 
core/sql-core/sql-catalyst/mllib/mllib-local have passed.

$ java -version
openjdk version "1.8.0_131"
OpenJDK Runtime Environment (build 
1.8.0_131-8u131-b11-2ubuntu1.16.04.3-b11)
OpenJDK 64-Bit Server VM (build 25.131-b11, mixed mode)

% build/mvn -DskipTests -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 -T 
24 clean package install
% build/mvn -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 test -pl core 
-pl 'sql/core' -pl 'sql/catalyst' -pl mllib -pl mllib-local
...
Run completed in 12 minutes, 19 seconds.
Total number of tests run: 1035
Suites: completed 166, aborted 0
Tests: succeeded 1035, failed 0, canceled 0, ignored 5, pending 0
All tests passed.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Core ................................. SUCCESS [17:13 min]
[INFO] Spark Project ML Local Library ..................... SUCCESS [  5.759 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [09:48 min]
[INFO] Spark Project SQL .................................. SUCCESS [12:01 min]
[INFO] Spark Project ML Library ........................... SUCCESS [15:16 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 54:28 min
[INFO] Finished at: 2017-10-03T23:53:33+09:00
[INFO] Final Memory: 112M/322M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "hive" could not be activated because it 
does not exist.

Kazuaki Ishizaki




From:   Dongjoon Hyun 
To: Spark dev list 
Date:   2017/10/03 23:23
Subject:Re: [VOTE] Spark 2.1.2 (RC4)



+1 (non-binding)

Dongjoon.

On Tue, Oct 3, 2017 at 5:13 AM, Herman van Hövell tot Westerflier <
hvanhov...@databricks.com> wrote:
+1

On Tue, Oct 3, 2017 at 1:32 PM, Sean Owen  wrote:
+1 same as last RC. Tests pass, sigs and hashes are OK.

On Tue, Oct 3, 2017 at 7:24 AM Holden Karau  wrote:
Please vote on releasing the following candidate as Apache Spark 
version 2.1.2. The vote is open until Saturday October 7th at 9:00 PST and 
passes if a majority of at least 3 +1 PMC votes are cast.

[ ] +1 Release this package as Apache Spark 2.1.2
[ ] -1 Do not release this package because ...


To learn more about Apache Spark, please see https://spark.apache.org/

The tag to be voted on is v2.1.2-rc4 (
2abaea9e40fce81cd4626498e0f5c28a70917499)

List of JIRA tickets resolved in this release can be found with this 
filter.

The release files, including signatures, digests, etc. can be found at:
https://home.apache.org/~holden/spark-2.1.2-rc4-bin/

Release artifacts are signed with a key from:
https://people.apache.org/~holden/holdens_keys.asc

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1252

The documentation corresponding to this release can be found at:
https://people.apache.org/~holden/spark-2.1.2-rc4-docs/


FAQ

How can I help test this release?

If you are a Spark user, you can help us test this release by taking an 
existing Spark workload and running on this release candidate, then 
reporting any regressions.

If you're working in PySpark you can set up a virtual env and install the 
current RC and see if anything important breaks; in Java/Scala you can 
add the staging repository to your project's resolvers and test with 
the RC (make sure to clean up the artifact cache before/after so you don't 
end up building with an out-of-date RC going forward).

What should happen to JIRA tickets still targeting 2.1.2?

Committers should look at those and triage. Extremely important bug fixes, 
documentation, and API tweaks that impact compatibility should be worked 
on immediately. Everything else please retarget to 2.1.3.

But my bug isn't fixed!??!

In order to make timely releases, we will typically not hold the release 
unless the bug in question is a regression from 2.1.1. That being said if 
there is something which is a regression from 2.1.1 that has not been 
correctly targeted please ping a committer to help target the issue (you 
can see the open issues listed as impacting Spark 2.1.1 & 2.1.2)

What are the unresolved issues targeted for 2.1.2?

At this time there are no open unresolved issues.

Is there anything different about this release?

This is the first release in a while not built on the AMPLab Jenkins. This 
is good because it means future releases can more easily be built and 
signed securely (and I've been updating the documentation in 
https://github.com/apache/spark-website/pull/66 as I progress), however 
the chances of a mistake are higher with any change like this. If there 

Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Dongjoon Hyun
+1 (non-binding)

Dongjoon.

On Tue, Oct 3, 2017 at 5:13 AM, Herman van Hövell tot Westerflier <
hvanhov...@databricks.com> wrote:

> +1
>
> On Tue, Oct 3, 2017 at 1:32 PM, Sean Owen  wrote:
>
>> +1 same as last RC. Tests pass, sigs and hashes are OK.
>>
>> On Tue, Oct 3, 2017 at 7:24 AM Holden Karau  wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark
>>> version 2.1.2. The vote is open until Saturday October 7th at 9:00
>>> PST and passes if a majority of at least 3 +1 PMC votes are cast.
>>>
>>> [ ] +1 Release this package as Apache Spark 2.1.2
>>> [ ] -1 Do not release this package because ...
>>>
>>>
>>> To learn more about Apache Spark, please see https://spark.apache.org/
>>>
>>> The tag to be voted on is v2.1.2-rc4
>>>  (2abaea9e40fce81cd4626498e0f5c28a70917499)
>>>
>>> List of JIRA tickets resolved in this release can be found with this
>>> filter.
>>> 
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> https://home.apache.org/~holden/spark-2.1.2-rc4-bin/
>>>
>>> Release artifacts are signed with a key from:
>>> https://people.apache.org/~holden/holdens_keys.asc
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapachespark-1252
>>>
>>> The documentation corresponding to this release can be found at:
>>> https://people.apache.org/~holden/spark-2.1.2-rc4-docs/
>>>
>>>
>>> *FAQ*
>>>
>>> *How can I help test this release?*
>>>
>>> If you are a Spark user, you can help us test this release by taking an
>>> existing Spark workload and running on this release candidate, then
>>> reporting any regressions.
>>>
>>> If you're working in PySpark you can set up a virtual env and install
>>> the current RC and see if anything important breaks; in Java/Scala
>>> you can add the staging repository to your project's resolvers and test with
>>> the RC (make sure to clean up the artifact cache before/after so you
>>> don't end up building with an out-of-date RC going forward).
>>>
>>> *What should happen to JIRA tickets still targeting 2.1.2?*
>>>
>>> Committers should look at those and triage. Extremely important bug
>>> fixes, documentation, and API tweaks that impact compatibility should be
>>> worked on immediately. Everything else please retarget to 2.1.3.
>>>
>>> *But my bug isn't fixed!??!*
>>>
>>> In order to make timely releases, we will typically not hold the release
>>> unless the bug in question is a regression from 2.1.1. That being said
>>> if there is something which is a regression from 2.1.1 that has not
>>> been correctly targeted please ping a committer to help target the issue
>>> (you can see the open issues listed as impacting Spark 2.1.1 & 2.1.2
>>> 
>>> )
>>>
>>> *What are the unresolved* issues targeted for 2.1.2
>>> 
>>> ?
>>>
>>> At this time there are no open unresolved issues.
>>>
>>> *Is there anything different about this release?*
>>>
>>> This is the first release in a while not built on the AMPLab Jenkins.
>>> This is good because it means future releases can more easily be built and
>>> signed securely (and I've been updating the documentation in
>>> https://github.com/apache/spark-website/pull/66 as I progress), however
>>> the chances of a mistake are higher with any change like this. If there is
>>> something you normally take for granted as correct when checking a release,
>>> please double check this time :)
>>>
>>> *Should I be committing code to branch-2.1?*
>>>
>>> Thanks for asking! Please treat this stage in the RC process as "code
>>> freeze" so bug fixes only. If you're uncertain if something should be back
>>> ported please reach out. If you do commit to branch-2.1 please tag your
>>> JIRA issue fix version for 2.1.3 and if we cut another RC I'll move the
>>> 2.1.3 fixes into 2.1.2 as appropriate.
>>>
>>> *What happened to RC3?*
>>>
>>> Some R+zinc interactions kept it from getting out the door.
>>> --
>>> Twitter: https://twitter.com/holdenkarau
>>>
>>
>


Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Herman van Hövell tot Westerflier
+1

On Tue, Oct 3, 2017 at 1:32 PM, Sean Owen  wrote:

> +1 same as last RC. Tests pass, sigs and hashes are OK.
>
> On Tue, Oct 3, 2017 at 7:24 AM Holden Karau  wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 2.1.2. The vote is open until Saturday October 7th at 9:00 PST and
>> passes if a majority of at least 3 +1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 2.1.2
>> [ ] -1 Do not release this package because ...
>>
>>
>> To learn more about Apache Spark, please see https://spark.apache.org/
>>
>> The tag to be voted on is v2.1.2-rc4
>>  (2abaea9e40fce81cd4626498e0f5c28a70917499)
>>
>> List of JIRA tickets resolved in this release can be found with this
>> filter.
>> 
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://home.apache.org/~holden/spark-2.1.2-rc4-bin/
>>
>> Release artifacts are signed with a key from:
>> https://people.apache.org/~holden/holdens_keys.asc
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1252
>>
>> The documentation corresponding to this release can be found at:
>> https://people.apache.org/~holden/spark-2.1.2-rc4-docs/
>>
>>
>> *FAQ*
>>
>> *How can I help test this release?*
>>
>> If you are a Spark user, you can help us test this release by taking an
>> existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install the
>> current RC and see if anything important breaks; in Java/Scala you
>> can add the staging repository to your project's resolvers and test with the
>> RC (make sure to clean up the artifact cache before/after so you don't
>> end up building with an out-of-date RC going forward).
>>
>> *What should happen to JIRA tickets still targeting 2.1.2?*
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should be
>> worked on immediately. Everything else please retarget to 2.1.3.
>>
>> *But my bug isn't fixed!??!*
>>
>> In order to make timely releases, we will typically not hold the release
>> unless the bug in question is a regression from 2.1.1. That being said
>> if there is something which is a regression from 2.1.1 that has not been
>> correctly targeted please ping a committer to help target the issue (you
>> can see the open issues listed as impacting Spark 2.1.1 & 2.1.2
>> 
>> )
>>
>> *What are the unresolved* issues targeted for 2.1.2
>> 
>> ?
>>
>> At this time there are no open unresolved issues.
>>
>> *Is there anything different about this release?*
>>
>> This is the first release in a while not built on the AMPLab Jenkins. This
>> is good because it means future releases can more easily be built and
>> signed securely (and I've been updating the documentation in
>> https://github.com/apache/spark-website/pull/66 as I progress), however
>> the chances of a mistake are higher with any change like this. If there is
>> something you normally take for granted as correct when checking a release,
>> please double check this time :)
>>
>> *Should I be committing code to branch-2.1?*
>>
>> Thanks for asking! Please treat this stage in the RC process as "code
>> freeze" so bug fixes only. If you're uncertain if something should be back
>> ported please reach out. If you do commit to branch-2.1 please tag your
>> JIRA issue fix version for 2.1.3 and if we cut another RC I'll move the 2.1.3
>> fixes into 2.1.2 as appropriate.
>>
>> *What happened to RC3?*
>>
>> Some R+zinc interactions kept it from getting out the door.
>> --
>> Twitter: https://twitter.com/holdenkarau
>>
>


Re: [VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Sean Owen
+1 same as last RC. Tests pass, sigs and hashes are OK.

On Tue, Oct 3, 2017 at 7:24 AM Holden Karau  wrote:

> Please vote on releasing the following candidate as Apache Spark version
> 2.1.2. The vote is open until Saturday October 7th at 9:00 PST and passes
> if a majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.1.2
> [ ] -1 Do not release this package because ...
>
>
> To learn more about Apache Spark, please see https://spark.apache.org/
>
> The tag to be voted on is v2.1.2-rc4
>  (
> 2abaea9e40fce81cd4626498e0f5c28a70917499)
>
> List of JIRA tickets resolved in this release can be found with this
> filter.
> 
>
> The release files, including signatures, digests, etc. can be found at:
> https://home.apache.org/~holden/spark-2.1.2-rc4-bin/
>
> Release artifacts are signed with a key from:
> https://people.apache.org/~holden/holdens_keys.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1252
>
> The documentation corresponding to this release can be found at:
> https://people.apache.org/~holden/spark-2.1.2-rc4-docs/
>
>
> *FAQ*
>
> *How can I help test this release?*
>
> If you are a Spark user, you can help us test this release by taking an
> existing Spark workload and running on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and install the
> current RC and see if anything important breaks; in Java/Scala you
> can add the staging repository to your project's resolvers and test with the
> RC (make sure to clean up the artifact cache before/after so you don't
> end up building with an out-of-date RC going forward).
>
> *What should happen to JIRA tickets still targeting 2.1.2?*
>
> Committers should look at those and triage. Extremely important bug fixes,
> documentation, and API tweaks that impact compatibility should be worked on
> immediately. Everything else please retarget to 2.1.3.
>
> *But my bug isn't fixed!??!*
>
> In order to make timely releases, we will typically not hold the release
> unless the bug in question is a regression from 2.1.1. That being said if
> there is something which is a regression from 2.1.1 that has not been
> correctly targeted please ping a committer to help target the issue (you
> can see the open issues listed as impacting Spark 2.1.1 & 2.1.2
> 
> )
>
> *What are the unresolved* issues targeted for 2.1.2
> 
> ?
>
> At this time there are no open unresolved issues.
>
> *Is there anything different about this release?*
>
> This is the first release in a while not built on the AMPLab Jenkins. This
> is good because it means future releases can more easily be built and
> signed securely (and I've been updating the documentation in
> https://github.com/apache/spark-website/pull/66 as I progress), however
> the chances of a mistake are higher with any change like this. If there is
> something you normally take for granted as correct when checking a release,
> please double check this time :)
>
> *Should I be committing code to branch-2.1?*
>
> Thanks for asking! Please treat this stage in the RC process as "code
> freeze" so bug fixes only. If you're uncertain if something should be back
> ported please reach out. If you do commit to branch-2.1 please tag your
> JIRA issue fix version for 2.1.3 and if we cut another RC I'll move the 2.1.3
> fixes into 2.1.2 as appropriate.
>
> *What happened to RC3?*
>
> Some R+zinc interactions kept it from getting out the door.
> --
> Twitter: https://twitter.com/holdenkarau
>


Re: Welcoming Tejas Patil as a Spark committer

2017-10-03 Thread Vishal Verma
Congrats Tejas!!

On Tue, Oct 3, 2017 at 4:55 AM, Xiao Li  wrote:

> Congratulations!
>
> Xiao
>
> 2017-10-02 10:47 GMT-07:00 Tejas Patil :
>
>> Thanks everyone !!! It's a great privilege to be part of the Spark
>> community.
>>
>> ~tejasp
>>
>> On Sat, Sep 30, 2017 at 2:27 PM, Jacek Laskowski  wrote:
>>
>>> Hi,
>>>
>>> Oh, yeah. Seen Tejas here and there in the commits. Well deserved.
>>>
>>> Jacek
>>>
>>> On 29 Sep 2017 9:58 pm, "Matei Zaharia"  wrote:
>>>
>>> Hi all,
>>>
>>> The Spark PMC recently added Tejas Patil as a committer on the
>>> project. Tejas has been contributing across several areas of Spark for
>>> a while, focusing especially on scalability issues and SQL. Please
>>> join me in welcoming Tejas!
>>>
>>> Matei
>>>
>>> -
>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>>
>>>
>>>
>>
>


-- 
Thanks & Regards,

*Vishal Verma*
*BigData Engineer* | Exadatum Software Services Pvt. Ltd.
www.exadatum.com
Mob: +91- 967-331-3735
Pune | India

-- 

*DISCLAIMER:*
All the content in email is intended for the recipient and not to be 
published elsewhere without Exadatum's consent, and attachments shall be sent 
only if required and with ownership of the sender. This message contains 
confidential information and is intended only for the individual named. If 
you are not the named addressee, you should not disseminate, distribute or 
copy this email. Please notify the sender immediately by email if you have 
received this email by mistake and delete this email from your system. 
Email transmission cannot be guaranteed to be secure or error-free, as 
information could be intercepted, corrupted, lost, destroyed, arrive late 
or incomplete, or contain viruses. The sender, therefore, does not accept 
liability for any errors or omissions in the contents of this message which 
arise as a result of email transmission. If verification is required, 
please request a hard-copy version.


[VOTE] Spark 2.1.2 (RC4)

2017-10-03 Thread Holden Karau
Please vote on releasing the following candidate as Apache Spark version
2.1.2. The vote is open until Saturday October 7th at 9:00 PST and passes if
a majority of at least 3 +1 PMC votes are cast.

[ ] +1 Release this package as Apache Spark 2.1.2
[ ] -1 Do not release this package because ...


To learn more about Apache Spark, please see https://spark.apache.org/

The tag to be voted on is v2.1.2-rc4
 (
2abaea9e40fce81cd4626498e0f5c28a70917499)

List of JIRA tickets resolved in this release can be found with this filter.


The release files, including signatures, digests, etc. can be found at:
https://home.apache.org/~holden/spark-2.1.2-rc4-bin/

Release artifacts are signed with a key from:
https://people.apache.org/~holden/holdens_keys.asc

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1252

The documentation corresponding to this release can be found at:
https://people.apache.org/~holden/spark-2.1.2-rc4-docs/


*FAQ*

*How can I help test this release?*

If you are a Spark user, you can help us test this release by taking an
existing Spark workload and running on this release candidate, then
reporting any regressions.

If you're working in PySpark you can set up a virtual env and install the
current RC and see if anything important breaks; in Java/Scala you can
add the staging repository to your project's resolvers and test with
the RC (make sure to clean up the artifact cache before/after so you don't end up
building with an out-of-date RC going forward).
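
For example, a minimal sketch of that PySpark check (directory and file names
below are illustrative, not the exact artifact names):

# isolate the test environment
virtualenv rc4-env && source rc4-env/bin/activate
# point at the unpacked RC binary distribution and run a tiny job
export SPARK_HOME=$PWD/spark-2.1.2-bin-hadoop2.7
$SPARK_HOME/bin/spark-submit --version
echo "print(sc.parallelize(range(100)).count())" | $SPARK_HOME/bin/pyspark
# if a pyspark tarball is staged alongside the binaries, "pip install" of that
# tarball inside the virtualenv exercises the pip path as well
deactivate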

*What should happen to JIRA tickets still targeting 2.1.2?*

Committers should look at those and triage. Extremely important bug fixes,
documentation, and API tweaks that impact compatibility should be worked on
immediately. Everything else please retarget to 2.1.3.

*But my bug isn't fixed!??!*

In order to make timely releases, we will typically not hold the release
unless the bug in question is a regression from 2.1.1. That being said if
there is something which is a regression from 2.1.1 that has not been
correctly targeted please ping a committer to help target the issue (you
can see the open issues listed as impacting Spark 2.1.1 & 2.1.2

)

*What are the unresolved* issues targeted for 2.1.2

?

At this time there are no open unresolved issues.

*Is there anything different about this release?*

This is the first release in a while not built on the AMPLab Jenkins. This
is good because it means future releases can more easily be built and
signed securely (and I've been updating the documentation in
https://github.com/apache/spark-website/pull/66 as I progress), however the
chances of a mistake are higher with any change like this. If there is
something you normally take for granted as correct when checking a release,
please double check this time :)

*Should I be committing code to branch-2.1?*

Thanks for asking! Please treat this stage in the RC process as "code
freeze" so bug fixes only. If you're uncertain if something should be back
ported please reach out. If you do commit to branch-2.1 please tag your
JIRA issue fix version for 2.1.3 and if we cut another RC I'll move the 2.1.3
fixes into 2.1.2 as appropriate.

*What happened to RC3?*

Some R+zinc interactions kept it from getting out the door.
-- 
Twitter: https://twitter.com/holdenkarau