Sree, that doesn't show any error, so it doesn't help. I built with the
same flags when I tested, and the build succeeded.
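For reference, the build-then-test sequence discussed in this thread (package the artifacts with tests skipped, then run the suites against them) can be sketched as a dry run. The profile flags are the ones quoted below; this only prints the commands, since running them would require a Spark 1.2.2 source checkout:

```shell
# Dry-run sketch of the two-phase sequence from this thread.
# It only echoes the commands; actually running them needs a Spark
# source tree and Maven on the PATH.
MVN_FLAGS="-Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0"

# Phase 1: build the artifacts that some test suites depend on.
echo "mvn $MVN_FLAGS -DskipTests clean package"

# Phase 2: run the test suites against the packaged artifacts.
echo "mvn $MVN_FLAGS test"
```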

On Fri, Apr 17, 2015 at 8:53 AM, Sree V <sree_at_ch...@yahoo.com.invalid> wrote:
> Sorry, I couldn't catch up before the voting closed. If it still counts:
> (1) mvn package fails, and (2) the tests didn't run. So, -1.
>
> 1. mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0 -DskipTests clean package
> 2. mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0 test
> Error:
> [INFO] Spark Project External Flume Sink .................. SUCCESS [ 39.561 
> s]
> [INFO] Spark Project External Flume ....................... FAILURE [ 11.212 
> s]
> [INFO] Spark Project External MQTT ........................ SKIPPED
> [INFO] Spark Project External ZeroMQ ...................... SKIPPED
> [INFO] Spark Project External Kafka ....................... SKIPPED
> [INFO] Spark Project Examples ............................. SKIPPED
> [INFO] Spark Project YARN Shuffle Service ................. SKIPPED
>
>
> Thanking you.
>
> With Regards
> Sree
>
>
>      On Thursday, April 16, 2015 3:42 PM, Patrick Wendell 
> <pwend...@gmail.com> wrote:
>
>
>  I'm gonna go ahead and close this now - thanks everyone for voting!
>
> This vote passes with 7 +1 votes (6 binding) and no 0 or -1 votes.
>
> +1:
> Mark Hamstra*
> Reynold Xin
> Krishna Sankar
> Sean Owen*
> Tom Graves*
> Joseph Bradley*
> Sean McNamara*
>
> 0:
>
> -1:
>
> Thanks!
> - Patrick
>
> On Thu, Apr 16, 2015 at 3:27 PM, Sean Owen <so...@cloudera.com> wrote:
>> No, of course Jenkins runs tests. The way some of the tests work, they
>> need the build artifacts to have been created first. So it runs "mvn
>> ... -DskipTests package" then "mvn ... test"
>>
>> On Thu, Apr 16, 2015 at 11:09 PM, Sree V <sree_at_ch...@yahoo.com> wrote:
>>> In my effort to vote for this release, I found the following along the way:
>>>
>>> This is from jenkins.  It uses "-DskipTests".
>>>
>>> [centos] $
>>> /home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.0.5/bin/mvn
>>> -Dhadoop.version=2.0.0-mr1-cdh4.1.2 -Dlabel=centos -DskipTests clean package
>>>
>>> We build on our local machines / servers using the same flag.
>>>
>>>
>>> Usually, for releases, we also build with all the tests running, and with
>>> some level of code coverage.
>>>
>>> Are we bypassing that?
>>>
>>>
>>>
>>> Thanking you.
>>>
>>> With Regards
>>> Sree
>>>
>>>
>>>
>>> On Wednesday, April 15, 2015 3:32 PM, Sean McNamara
>>> <sean.mcnam...@webtrends.com> wrote:
>>>
>>>
>>> Ran tests on OS X
>>>
>>> +1
>>>
>>> Sean
>>>
>>>
>>>> On Apr 14, 2015, at 10:59 PM, Patrick Wendell <pwend...@gmail.com> wrote:
>>>>
>>>> I'd like to close this vote to coincide with the 1.3.1 release,
>>>> however, it would be great to have more people test this release
>>>> first. I'll leave it open for a bit longer and see if others can give
>>>> a +1.
>>>>
>>>> On Tue, Apr 14, 2015 at 9:55 PM, Patrick Wendell <pwend...@gmail.com>
>>>> wrote:
>>>>> +1 from me as well.
>>>>>
>>>>> On Tue, Apr 7, 2015 at 4:36 AM, Sean Owen <so...@cloudera.com> wrote:
>>>>>> I think that's close enough for a +1:
>>>>>>
>>>>>> Signatures and hashes are good.
>>>>>> LICENSE, NOTICE still check out.
>>>>>> Compiles for a Hadoop 2.6 + YARN + Hive profile.
>>>>>>
>>>>>> JIRAs with target version = 1.2.x look legitimate; no blockers.
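The "signatures and hashes are good" check above can be sketched as follows. The artifact and digest filenames here are hypothetical; to keep the sketch self-contained it creates a stand-in file and its digest rather than downloading the real release files:

```shell
# Sketch of the digest check behind "signatures and hashes are good".
# Filenames are hypothetical stand-ins; a real check would download the
# artifact and its published .sha512 file from the release staging area.
set -e
printf 'stand-in artifact bytes' > spark-1.2.2-rc1.tgz
sha512sum spark-1.2.2-rc1.tgz > spark-1.2.2-rc1.tgz.sha512

# The check a voter runs against the published digest file:
sha512sum -c spark-1.2.2-rc1.tgz.sha512

# The signature check additionally needs the real detached .asc
# signature and the release manager's public key:
#   gpg --import pwendell.asc
#   gpg --verify spark-1.2.2-rc1.tgz.asc spark-1.2.2-rc1.tgz
```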
>>>>>>
>>>>>> I still observe several Hive test failures with:
>>>>>> mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0
>>>>>> -DskipTests clean package; mvn -Phadoop-2.4 -Pyarn -Phive
>>>>>> -Phive-0.13.1 -Dhadoop.version=2.6.0 test
>>>>>> .. though again I think these are not regressions but known issues in
>>>>>> older branches.
>>>>>>
>>>>>> FYI there are 16 Critical issues still open for 1.2.x:
>>>>>>
>>>>>> SPARK-6209,ExecutorClassLoader can leak connections after failing to
>>>>>> load classes from the REPL class server,Josh Rosen,In Progress,4/5/15
>>>>>> SPARK-5098,Number of running tasks become negative after tasks
>>>>>> lost,,Open,1/14/15
>>>>>> SPARK-4888,"Spark EC2 doesn't mount local disks for i2.8xlarge
>>>>>> instances",,Open,1/27/15
>>>>>> SPARK-4879,Missing output partitions after job completes with
>>>>>> speculative execution,Josh Rosen,Open,3/5/15
>>>>>> SPARK-4568,Publish release candidates under $VERSION-RCX instead of
>>>>>> $VERSION,Patrick Wendell,Open,11/24/14
>>>>>> SPARK-4520,SparkSQL exception when reading certain columns from a
>>>>>> parquet file,sadhan sood,Open,1/21/15
>>>>>> SPARK-4514,SparkContext localProperties does not inherit property
>>>>>> updates across thread reuse,Josh Rosen,Open,3/31/15
>>>>>> SPARK-4454,Race condition in DAGScheduler,Josh Rosen,Reopened,2/18/15
>>>>>> SPARK-4452,Shuffle data structures can starve others on the same
>>>>>> thread for memory,Tianshuo Deng,Open,1/24/15
>>>>>> SPARK-4356,Test Scala 2.11 on Jenkins,Patrick Wendell,Open,11/12/14
>>>>>> SPARK-4258,NPE with new Parquet Filters,Cheng Lian,Reopened,4/3/15
>>>>>> SPARK-4194,Exceptions thrown during SparkContext or SparkEnv
>>>>>> construction might lead to resource leaks or corrupted global
>>>>>> state,,In Progress,4/2/15
>>>>>> SPARK-4159,"Maven build doesn't run JUnit test suites",Sean
>>>>>> Owen,Open,1/11/15
>>>>>> SPARK-4106,Shuffle write and spill to disk metrics are
>>>>>> incorrect,,Open,10/28/14
>>>>>> SPARK-3492,Clean up Yarn integration code,Andrew Or,Open,9/12/14
>>>>>> SPARK-3461,Support external groupByKey using
>>>>>> repartitionAndSortWithinPartitions,Sandy Ryza,Open,11/10/14
>>>>>> SPARK-2984,FileNotFoundException on _temporary directory,,Open,12/11/14
>>>>>> SPARK-2532,Fix issues with consolidated shuffle,,Open,3/26/15
>>>>>> SPARK-1312,Batch should read based on the batch interval provided in
>>>>>> the StreamingContext,Tathagata Das,Open,12/24/14
>>>>>>
>>>>>> On Sun, Apr 5, 2015 at 7:24 PM, Patrick Wendell <pwend...@gmail.com>
>>>>>> wrote:
>>>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>>>> version 1.2.2!
>>>>>>>
>>>>>>> The tag to be voted on is v1.2.2-rc1 (commit 7531b50):
>>>>>>>
>>>>>>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=7531b50e406ee2e3301b009ceea7c684272b2e27
>>>>>>>
>>>>>>> The list of fixes present in this release can be found at:
>>>>>>> http://bit.ly/1DCNddt
>>>>>>>
>>>>>>> The release files, including signatures, digests, etc. can be found at:
>>>>>>> http://people.apache.org/~pwendell/spark-1.2.2-rc1/
>>>>>>>
>>>>>>> Release artifacts are signed with the following key:
>>>>>>> https://people.apache.org/keys/committer/pwendell.asc
>>>>>>>
>>>>>>> The staging repository for this release can be found at:
>>>>>>> https://repository.apache.org/content/repositories/orgapachespark-1082/
>>>>>>>
>>>>>>> The documentation corresponding to this release can be found at:
>>>>>>> http://people.apache.org/~pwendell/spark-1.2.2-rc1-docs/
>>>>>>>
>>>>>>> Please vote on releasing this package as Apache Spark 1.2.2!
>>>>>>>
>>>>>>> The vote is open until Thursday, April 08, at 00:30 UTC and passes
>>>>>>> if a majority of at least 3 +1 PMC votes are cast.
>>>>>>>
>>>>>>> [ ] +1 Release this package as Apache Spark 1.2.2
>>>>>>> [ ] -1 Do not release this package because ...
>>>>>>>
>>>>>>> To learn more about Apache Spark, please see
>>>>>>> http://spark.apache.org/
>>>>>>>
>>>>>>> ---------------------------------------------------------------------
>>>>>>> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>>>>>>> For additional commands, e-mail: dev-h...@spark.apache.org
>>>
>>>>>>>
>>>>
>>>>
>>>
>>>
>>>
>>>
>
>
>
>
>

