Re: [RESULT] [VOTE] Release Apache Spark 1.2.2

2015-04-17 Thread Sree V
Cleaned up ~/.m2 and ~/.zinc and received the exact same error again. So, -1 
from me.

[INFO] 
[INFO] Building Spark Project External Flume 1.2.2
[INFO] 
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ 
spark-streaming-flume_2.10 ---
[INFO] Deleting /root/sources/github/spark/external/flume/target
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @ 
spark-streaming-flume_2.10 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) @ 
spark-streaming-flume_2.10 ---
[INFO] Source directory: 
/root/sources/github/spark/external/flume/src/main/scala added.
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ 
spark-streaming-flume_2.10 ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ 
spark-streaming-flume_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 
/root/sources/github/spark/external/flume/src/main/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ 
spark-streaming-flume_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal 
incremental compile
[INFO] Using incremental compilation
[INFO] compiler plugin: 
BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[INFO] Compiling 6 Scala sources and 1 Java source to 
/root/sources/github/spark/external/flume/target/scala-2.10/classes...
[ERROR] 
/root/sources/github/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeBatchFetcher.scala:22:
 object Throwables is not a member of package com.google.common.base
[ERROR] import com.google.common.base.Throwables
[ERROR]    ^
[ERROR] 
/root/sources/github/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeBatchFetcher.scala:59:
 not found: value Throwables
[ERROR]   Throwables.getRootCause(e) match {
[ERROR]   ^
[ERROR] 
/root/sources/github/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:26:
 object util is not a member of package com.google.common
[ERROR] import com.google.common.util.concurrent.ThreadFactoryBuilder
[ERROR]  ^
[ERROR] 
/root/sources/github/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:69:
 not found: type ThreadFactoryBuilder
[ERROR] Executors.newCachedThreadPool(new 
ThreadFactoryBuilder().setDaemon(true).
[ERROR]   ^
[ERROR] 
/root/sources/github/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:76:
 not found: type ThreadFactoryBuilder
[ERROR] new ThreadFactoryBuilder().setDaemon(true).setNameFormat("Flume 
Receiver Thread - %d").build())
[ERROR] ^
[ERROR] 5 errors found
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM ... SUCCESS [01:44 min]
[INFO] Spark Project Networking ... SUCCESS [ 49.128 s]
[INFO] Spark Project Shuffle Streaming Service  SUCCESS [  8.503 s]
[INFO] Spark Project Core . SUCCESS [05:22 min]
[INFO] Spark Project Bagel  SUCCESS [ 25.647 s]
[INFO] Spark Project GraphX ... SUCCESS [01:13 min]
[INFO] Spark Project Streaming  SUCCESS [01:29 min]
[INFO] Spark Project Catalyst . SUCCESS [01:51 min]
[INFO] Spark Project SQL .. SUCCESS [01:57 min]
[INFO] Spark Project ML Library ... SUCCESS [02:25 min]
[INFO] Spark Project Tools  SUCCESS [ 16.665 s]
[INFO] Spark Project Hive . SUCCESS [02:03 min]
[INFO] Spark Project REPL . SUCCESS [ 50.294 s]
[INFO] Spark Project YARN Parent POM .. SUCCESS [  5.777 s]
[INFO] Spark Project YARN Stable API .. SUCCESS [ 53.803 s]
[INFO] Spark Project Assembly . SUCCESS [ 59.515 s]
[INFO] Spark Project External Twitter . SUCCESS [ 40.038 s]
[INFO] Spark Project External Flume Sink .. SUCCESS [ 32.779 s]
[INFO] Spark Project External Flume ... FAILURE [  7.936 s]
[INFO] Spark Project External MQTT  SKIPPED
[INFO] Spark Project External ZeroMQ .. SKIPPED
[INFO] Spark Project External Kafka ... SKIPPED
[INFO] Spark Project Examples . SKIPPED
[INFO] Spark Project YARN Shuffle Service ...
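
A minimal sketch of the clean-and-rebuild sequence described above, assuming a
Linux shell. The directory choices and the build flags (copied from commands
quoted later in this thread) are assumptions, not steps taken verbatim from
this message:

# Wipe the local Maven artifact cache and Zinc's state, then rebuild.
# Removing only ~/.m2/repository (rather than all of ~/.m2) preserves
# settings.xml; whether that matters here is an assumption.
rm -rf ~/.m2/repository ~/.zinc

# Profiles and flags as quoted elsewhere in this thread.
mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0 \
    -DskipTests clean package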

Re: [RESULT] [VOTE] Release Apache Spark 1.2.2

2015-04-17 Thread Sree V
Hi Sean,
This is from the build log. I made a master-branch build earlier on this 
machine. Do you think it needs a clean-up of the .m2 folder, as you suggested 
once earlier? Giving it another try while you take a look at this.

[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ 
spark-streaming-flume_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal 
incremental compile
[INFO] Using incremental compilation
[INFO] compiler plugin: 
BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[INFO] Compiling 6 Scala sources and 1 Java source to 
/root/sources/github/spark/external/flume/target/scala-2.10/classes...
[ERROR] 
/root/sources/github/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeBatchFetcher.scala:22:
 object Throwables is not a member of package com.google.common.base
[ERROR] import com.google.common.base.Throwables
[ERROR]    ^
[ERROR] 
/root/sources/github/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeBatchFetcher.scala:59:
 not found: value Throwables
[ERROR]   Throwables.getRootCause(e) match {
[ERROR]   ^
[ERROR] 
/root/sources/github/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:26:
 object util is not a member of package com.google.common
[ERROR] import com.google.common.util.concurrent.ThreadFactoryBuilder
[ERROR]  ^
[ERROR] 
/root/sources/github/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:69:
 not found: type ThreadFactoryBuilder
[ERROR] Executors.newCachedThreadPool(new 
ThreadFactoryBuilder().setDaemon(true).
[ERROR]   ^
[ERROR] 
/root/sources/github/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:76:
 not found: type ThreadFactoryBuilder
[ERROR] new ThreadFactoryBuilder().setDaemon(true).setNameFormat("Flume 
Receiver Thread - %d").build())
[ERROR] ^
[ERROR] 5 errors found
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM ... SUCCESS [ 15.894 s]
[INFO] Spark Project Networking ... SUCCESS [ 20.801 s]
[INFO] Spark Project Shuffle Streaming Service  SUCCESS [ 18.111 s]
[INFO] Spark Project Core . SUCCESS [08:09 min]
[INFO] Spark Project Bagel  SUCCESS [ 43.592 s]
[INFO] Spark Project GraphX ... SUCCESS [01:55 min]
[INFO] Spark Project Streaming  SUCCESS [03:02 min]
[INFO] Spark Project Catalyst . SUCCESS [02:59 min]
[INFO] Spark Project SQL .. SUCCESS [03:09 min]
[INFO] Spark Project ML Library ... SUCCESS [03:24 min]
[INFO] Spark Project Tools  SUCCESS [ 24.816 s]
[INFO] Spark Project Hive . SUCCESS [02:14 min]
[INFO] Spark Project REPL . SUCCESS [01:12 min]
[INFO] Spark Project YARN Parent POM .. SUCCESS [  6.080 s]
[INFO] Spark Project YARN Stable API .. SUCCESS [01:27 min]
[INFO] Spark Project Assembly . SUCCESS [01:22 min]
[INFO] Spark Project External Twitter . SUCCESS [ 35.881 s]
[INFO] Spark Project External Flume Sink .. SUCCESS [ 39.561 s]
[INFO] Spark Project External Flume ... FAILURE [ 11.212 s]
[INFO] Spark Project External MQTT  SKIPPED
[INFO] Spark Project External ZeroMQ .. SKIPPED
[INFO] Spark Project External Kafka ... SKIPPED
[INFO] Spark Project Examples . SKIPPED
[INFO] Spark Project YARN Shuffle Service . SKIPPED
[INFO] 
[INFO] BUILD FAILURE
[INFO] 
[INFO] Total time: 32:36 min
[INFO] Finished at: 2015-04-16T23:02:18-07:00
[INFO] Final Memory: 91M/2043M
[INFO] 
[ERROR] Failed to execute goal 
net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on 
project spark-streaming-flume_2.10: Execution scala-compile-first of goal 
net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. 
CompileFailed -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1
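
All five compile errors above point at missing Guava classes
(com.google.common.base.Throwables and
com.google.common.util.concurrent.ThreadFactoryBuilder). One way to see which
Guava artifact Maven actually resolves for the Flume module is a dependency
tree query; this is a diagnostic sketch under the thread's build flags, not a
step anyone reported running:

# Run from the repository root; inspect the spark-streaming-flume_2.10
# section of the output for the resolved com.google.guava entries.
mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0 \
    dependency:tree -Dincludes=com.google.guava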

Re: [RESULT] [VOTE] Release Apache Spark 1.2.2

2015-04-17 Thread Sean Owen
Sree, that doesn't show any error, so it doesn't help. I built with the
same flags when I tested and it succeeded.

On Fri, Apr 17, 2015 at 8:53 AM, Sree V  wrote:
> Sorry, I couldn't catch up before the voting closed. If it still counts, mvn 
> package fails (1), and I didn't run the tests (2). So, -1.
> 1. mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0 -DskipTests clean package
> 2. mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0 test
> Error:
> [INFO] Spark Project External Flume Sink .. SUCCESS [ 39.561 
> s]
> [INFO] Spark Project External Flume ... FAILURE [ 11.212 
> s]
> [INFO] Spark Project External MQTT  SKIPPED
> [INFO] Spark Project External ZeroMQ .. SKIPPED
> [INFO] Spark Project External Kafka ... SKIPPED
> [INFO] Spark Project Examples . SKIPPED
> [INFO] Spark Project YARN Shuffle Service . SKIPPED
>
>
> Thanking you.
>
> With Regards
> Sree
>
>
>  On Thursday, April 16, 2015 3:42 PM, Patrick Wendell 
>  wrote:
>
>
>  I'm gonna go ahead and close this now - thanks everyone for voting!
>
> This vote passes with 7 +1 votes (6 binding) and no 0 or -1 votes.
>
> +1:
> Mark Hamstra*
> Reynold Xin
> Krishna Sankar
> Sean Owen*
> Tom Graves*
> Joseph Bradley*
> Sean McNamara*
>
> 0:
>
> -1:
>
> Thanks!
> - Patrick
>
> On Thu, Apr 16, 2015 at 3:27 PM, Sean Owen  wrote:
>> No, of course Jenkins runs tests. The way some of the tests work, they
>> need the build artifacts to have been created first. So it runs "mvn
>> ... -DskipTests package" then "mvn ... test"
>>
>> On Thu, Apr 16, 2015 at 11:09 PM, Sree V  wrote:
>>> In my effort to vote on this release, I found the following along the way:
>>>
>>> This is from jenkins.  It uses "-DskipTests".
>>>
>>> [centos] $
>>> /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.0.5/bin/mvn
>>> -Dhadoop.version=2.0.0-mr1-cdh4.1.2 -Dlabel=centos -DskipTests clean package
>>>
>>> We build on our local machines / servers using the same flag.
>>>
>>>
>>> Usually, for releases, we also build with all the tests running, and with
>>> some level of code coverage.
>>>
>>> Are we bypassing that?
>>>
>>>
>>>
>>> Thanking you.
>>>
>>> With Regards
>>> Sree
>>>
>>>
>>>
>>> On Wednesday, April 15, 2015 3:32 PM, Sean McNamara
>>>  wrote:
>>>
>>>
>>> Ran tests on OS X
>>>
>>> +1
>>>
>>> Sean
>>>
>>>
 On Apr 14, 2015, at 10:59 PM, Patrick Wendell  wrote:

 I'd like to close this vote to coincide with the 1.3.1 release,
 however, it would be great to have more people test this release
 first. I'll leave it open for a bit longer and see if others can give
 a +1.

 On Tue, Apr 14, 2015 at 9:55 PM, Patrick Wendell 
 wrote:
> +1 from me as well.
>
> On Tue, Apr 7, 2015 at 4:36 AM, Sean Owen  wrote:
>> I think that's close enough for a +1:
>>
>> Signatures and hashes are good.
>> LICENSE, NOTICE still check out.
>> Compiles for a Hadoop 2.6 + YARN + Hive profile.
>>
>> JIRAs with target version = 1.2.x look legitimate; no blockers.
>>
>> I still observe several Hive test failures with:
>> mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0
>> -DskipTests clean package; mvn -Phadoop-2.4 -Pyarn -Phive
>> -Phive-0.13.1 -Dhadoop.version=2.6.0 test
>> .. though again I think these are not regressions but known issues in
>> older branches.
>>
>> FYI there are 16 Critical issues still open for 1.2.x:
>>
>> SPARK-6209,ExecutorClassLoader can leak connections after failing to
>> load classes from the REPL class server,Josh Rosen,In Progress,4/5/15
>> SPARK-5098,Number of running tasks become negative after tasks
>> lost,,Open,1/14/15
>> SPARK-4888,"Spark EC2 doesn't mount local disks for i2.8xlarge
>> instances",,Open,1/27/15
>> SPARK-4879,Missing output partitions after job completes with
>> speculative execution,Josh Rosen,Open,3/5/15
>> SPARK-4568,Publish release candidates under $VERSION-RCX instead of
>> $VERSION,Patrick Wendell,Open,11/24/14
>> SPARK-4520,SparkSQL exception when reading certain columns from a
>> parquet file,sadhan sood,Open,1/21/15
>> SPARK-4514,SparkContext localProperties does not inherit property
>> updates across thread reuse,Josh Rosen,Open,3/31/15
>> SPARK-4454,Race condition in DAGScheduler,Josh Rosen,Reopened,2/18/15
>> SPARK-4452,Shuffle data structures can starve others on the same
>> thread for memory,Tianshuo Deng,Open,1/24/15
>> SPARK-4356,Test Scala 2.11 on Jenkins,Patrick Wendell,Open,11/12/14
>> SPARK-4258,NPE with new Parquet Filters,Cheng Lian,Reopened,4/3/15
>> SPARK-4194,Exceptions thrown during SparkContext or SparkEnv
>> construction might lead to resource leaks or corrupted global
>> state,,In Progress,4/2/15
>> SPARK-4159,"Maven build doesn't run JUnit test 

Re: [RESULT] [VOTE] Release Apache Spark 1.2.2

2015-04-17 Thread Sree V
Sorry, I couldn't catch up before the voting closed. If it still counts, mvn 
package fails (1), and I didn't run the tests (2). So, -1.
1. mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0 -DskipTests clean package
2. mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0 test
Error:
[INFO] Spark Project External Flume Sink .. SUCCESS [ 39.561 s]
[INFO] Spark Project External Flume ... FAILURE [ 11.212 s]
[INFO] Spark Project External MQTT  SKIPPED
[INFO] Spark Project External ZeroMQ .. SKIPPED
[INFO] Spark Project External Kafka ... SKIPPED
[INFO] Spark Project Examples . SKIPPED
[INFO] Spark Project YARN Shuffle Service . SKIPPED


Thanking you.

With Regards
Sree 
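
A possible shortcut for iterating on the failure above, sketched here as a
suggestion rather than something from the thread: rebuild only the failing
Flume module and the modules it depends on, instead of the whole reactor. The
-pl/-am module selection is an assumption; external/flume is the module
directory shown in the error paths.

# Compile just spark-streaming-flume_2.10 plus its prerequisites, with the
# same profiles and flags used in command (1) above.
mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0 \
    -DskipTests -pl external/flume -am clean package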


 On Thursday, April 16, 2015 3:42 PM, Patrick Wendell  
wrote:
   

 I'm gonna go ahead and close this now - thanks everyone for voting!

This vote passes with 7 +1 votes (6 binding) and no 0 or -1 votes.

+1:
Mark Hamstra*
Reynold Xin
Krishna Sankar
Sean Owen*
Tom Graves*
Joseph Bradley*
Sean McNamara*

0:

-1:

Thanks!
- Patrick

On Thu, Apr 16, 2015 at 3:27 PM, Sean Owen  wrote:
> No, of course Jenkins runs tests. The way some of the tests work, they
> need the build artifacts to have been created first. So it runs "mvn
> ... -DskipTests package" then "mvn ... test"
>
> On Thu, Apr 16, 2015 at 11:09 PM, Sree V  wrote:
>> In my effort to vote on this release, I found the following along the way:
>>
>> This is from jenkins.  It uses "-DskipTests".
>>
>> [centos] $
>> /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.0.5/bin/mvn
>> -Dhadoop.version=2.0.0-mr1-cdh4.1.2 -Dlabel=centos -DskipTests clean package
>>
>> We build on our local machines / servers using the same flag.
>>
>>
>> Usually, for releases, we also build with all the tests running, and with
>> some level of code coverage.
>>
>> Are we bypassing that?
>>
>>
>>
>> Thanking you.
>>
>> With Regards
>> Sree
>>
>>
>>
>> On Wednesday, April 15, 2015 3:32 PM, Sean McNamara
>>  wrote:
>>
>>
>> Ran tests on OS X
>>
>> +1
>>
>> Sean
>>
>>
>>> On Apr 14, 2015, at 10:59 PM, Patrick Wendell  wrote:
>>>
>>> I'd like to close this vote to coincide with the 1.3.1 release,
>>> however, it would be great to have more people test this release
>>> first. I'll leave it open for a bit longer and see if others can give
>>> a +1.
>>>
>>> On Tue, Apr 14, 2015 at 9:55 PM, Patrick Wendell 
>>> wrote:
 +1 from me as well.

 On Tue, Apr 7, 2015 at 4:36 AM, Sean Owen  wrote:
> I think that's close enough for a +1:
>
> Signatures and hashes are good.
> LICENSE, NOTICE still check out.
> Compiles for a Hadoop 2.6 + YARN + Hive profile.
>
> JIRAs with target version = 1.2.x look legitimate; no blockers.
>
> I still observe several Hive test failures with:
> mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0
> -DskipTests clean package; mvn -Phadoop-2.4 -Pyarn -Phive
> -Phive-0.13.1 -Dhadoop.version=2.6.0 test
> .. though again I think these are not regressions but known issues in
> older branches.
>
> FYI there are 16 Critical issues still open for 1.2.x:
>
> SPARK-6209,ExecutorClassLoader can leak connections after failing to
> load classes from the REPL class server,Josh Rosen,In Progress,4/5/15
> SPARK-5098,Number of running tasks become negative after tasks
> lost,,Open,1/14/15
> SPARK-4888,"Spark EC2 doesn't mount local disks for i2.8xlarge
> instances",,Open,1/27/15
> SPARK-4879,Missing output partitions after job completes with
> speculative execution,Josh Rosen,Open,3/5/15
> SPARK-4568,Publish release candidates under $VERSION-RCX instead of
> $VERSION,Patrick Wendell,Open,11/24/14
> SPARK-4520,SparkSQL exception when reading certain columns from a
> parquet file,sadhan sood,Open,1/21/15
> SPARK-4514,SparkContext localProperties does not inherit property
> updates across thread reuse,Josh Rosen,Open,3/31/15
> SPARK-4454,Race condition in DAGScheduler,Josh Rosen,Reopened,2/18/15
> SPARK-4452,Shuffle data structures can starve others on the same
> thread for memory,Tianshuo Deng,Open,1/24/15
> SPARK-4356,Test Scala 2.11 on Jenkins,Patrick Wendell,Open,11/12/14
> SPARK-4258,NPE with new Parquet Filters,Cheng Lian,Reopened,4/3/15
> SPARK-4194,Exceptions thrown during SparkContext or SparkEnv
> construction might lead to resource leaks or corrupted global
> state,,In Progress,4/2/15
> SPARK-4159,"Maven build doesn't run JUnit test suites",Sean
> Owen,Open,1/11/15
> SPARK-4106,Shuffle write and spill to disk metrics are
> incorrect,,Open,10/28/14
> SPARK-3492,Clean up Yarn integration code,Andrew Or,Open,9/12/14
> SPARK-3461,Support external groupByKey using
> repartitionAndSortWithinPartitions,Sandy Ryza,Open,11/10/14
> SPARK-2984,F

[RESULT] [VOTE] Release Apache Spark 1.2.2

2015-04-16 Thread Patrick Wendell
I'm gonna go ahead and close this now - thanks everyone for voting!

This vote passes with 7 +1 votes (6 binding) and no 0 or -1 votes.

+1:
Mark Hamstra*
Reynold Xin
Krishna Sankar
Sean Owen*
Tom Graves*
Joseph Bradley*
Sean McNamara*

0:

-1:

Thanks!
- Patrick

On Thu, Apr 16, 2015 at 3:27 PM, Sean Owen  wrote:
> No, of course Jenkins runs tests. The way some of the tests work, they
> need the build artifacts to have been created first. So it runs "mvn
> ... -DskipTests package" then "mvn ... test"
>
> On Thu, Apr 16, 2015 at 11:09 PM, Sree V  wrote:
>> In my effort to vote on this release, I found the following along the way:
>>
>> This is from jenkins.  It uses "-DskipTests".
>>
>> [centos] $
>> /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.0.5/bin/mvn
>> -Dhadoop.version=2.0.0-mr1-cdh4.1.2 -Dlabel=centos -DskipTests clean package
>>
>> We build on our local machines / servers using the same flag.
>>
>>
>> Usually, for releases, we also build with all the tests running, and with
>> some level of code coverage.
>>
>> Are we bypassing that?
>>
>>
>>
>> Thanking you.
>>
>> With Regards
>> Sree
>>
>>
>>
>> On Wednesday, April 15, 2015 3:32 PM, Sean McNamara
>>  wrote:
>>
>>
>> Ran tests on OS X
>>
>> +1
>>
>> Sean
>>
>>
>>> On Apr 14, 2015, at 10:59 PM, Patrick Wendell  wrote:
>>>
>>> I'd like to close this vote to coincide with the 1.3.1 release,
>>> however, it would be great to have more people test this release
>>> first. I'll leave it open for a bit longer and see if others can give
>>> a +1.
>>>
>>> On Tue, Apr 14, 2015 at 9:55 PM, Patrick Wendell 
>>> wrote:
 +1 from me as well.

 On Tue, Apr 7, 2015 at 4:36 AM, Sean Owen  wrote:
> I think that's close enough for a +1:
>
> Signatures and hashes are good.
> LICENSE, NOTICE still check out.
> Compiles for a Hadoop 2.6 + YARN + Hive profile.
>
> JIRAs with target version = 1.2.x look legitimate; no blockers.
>
> I still observe several Hive test failures with:
> mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0
> -DskipTests clean package; mvn -Phadoop-2.4 -Pyarn -Phive
> -Phive-0.13.1 -Dhadoop.version=2.6.0 test
> .. though again I think these are not regressions but known issues in
> older branches.
>
> FYI there are 16 Critical issues still open for 1.2.x:
>
> SPARK-6209,ExecutorClassLoader can leak connections after failing to
> load classes from the REPL class server,Josh Rosen,In Progress,4/5/15
> SPARK-5098,Number of running tasks become negative after tasks
> lost,,Open,1/14/15
> SPARK-4888,"Spark EC2 doesn't mount local disks for i2.8xlarge
> instances",,Open,1/27/15
> SPARK-4879,Missing output partitions after job completes with
> speculative execution,Josh Rosen,Open,3/5/15
> SPARK-4568,Publish release candidates under $VERSION-RCX instead of
> $VERSION,Patrick Wendell,Open,11/24/14
> SPARK-4520,SparkSQL exception when reading certain columns from a
> parquet file,sadhan sood,Open,1/21/15
> SPARK-4514,SparkContext localProperties does not inherit property
> updates across thread reuse,Josh Rosen,Open,3/31/15
> SPARK-4454,Race condition in DAGScheduler,Josh Rosen,Reopened,2/18/15
> SPARK-4452,Shuffle data structures can starve others on the same
> thread for memory,Tianshuo Deng,Open,1/24/15
> SPARK-4356,Test Scala 2.11 on Jenkins,Patrick Wendell,Open,11/12/14
> SPARK-4258,NPE with new Parquet Filters,Cheng Lian,Reopened,4/3/15
> SPARK-4194,Exceptions thrown during SparkContext or SparkEnv
> construction might lead to resource leaks or corrupted global
> state,,In Progress,4/2/15
> SPARK-4159,"Maven build doesn't run JUnit test suites",Sean
> Owen,Open,1/11/15
> SPARK-4106,Shuffle write and spill to disk metrics are
> incorrect,,Open,10/28/14
> SPARK-3492,Clean up Yarn integration code,Andrew Or,Open,9/12/14
> SPARK-3461,Support external groupByKey using
> repartitionAndSortWithinPartitions,Sandy Ryza,Open,11/10/14
> SPARK-2984,FileNotFoundException on _temporary directory,,Open,12/11/14
> SPARK-2532,Fix issues with consolidated shuffle,,Open,3/26/15
> SPARK-1312,Batch should read based on the batch interval provided in
> the StreamingContext,Tathagata Das,Open,12/24/14
>
> On Sun, Apr 5, 2015 at 7:24 PM, Patrick Wendell 
> wrote:
>> Please vote on releasing the following candidate as Apache Spark
>> version 1.2.2!
>>
>> The tag to be voted on is v1.2.2-rc1 (commit 7531b50):
>>
>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=7531b50e406ee2e3301b009ceea7c684272b2e27
>>
>> The list of fixes present in this release can be found at:
>> http://bit.ly/1DCNddt
>>
>> The release files, including signatures, digests, etc. can be found at:
>> http://people.apache.org/~pwendell/spark-1.2.2-rc1/
>>
>> Release artifacts are signed with the following