Hi,

I just built the sources using the following command, and it worked fine:

➜  spark git:(master) ✗ ./build/mvn -Pyarn -Phadoop-2.6 \
    -Dhadoop.version=2.7.1 -Dscala-2.11 -Phive -Phive-thriftserver \
    -DskipTests clean install
...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14:15 min
[INFO] Finished at: 2015-11-03T14:40:40+01:00
[INFO] Final Memory: 438M/1972M
[INFO] ------------------------------------------------------------------------
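
In case it helps with the Guava conflict discussed further down the
thread, dependency:tree can show which Guava version Maven actually
resolves (a quick check; I haven't run it with these exact profiles):

./build/mvn -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver \
  dependency:tree -Dincludes=com.google.guava:guava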

➜  spark git:(master) ✗ java -version
java version "1.8.0_66"
Java(TM) SE Runtime Environment (build 1.8.0_66-b17)
Java HotSpot(TM) 64-Bit Server VM (build 25.66-b17, mixed mode)

I'm on Mac OS.

Regards,
Jacek

--
Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
Follow me at https://twitter.com/jaceklaskowski
Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski


On Tue, Nov 3, 2015 at 1:37 PM, Jean-Baptiste Onofré <j...@nanthrax.net> wrote:
> Thanks for the update. I used mvn to build, but without the hive profile.
>
> Let me try mvn with the same options as you, and sbt as well.
>
> I'll keep you posted.
>
> Regards
> JB
>
> On 11/03/2015 12:55 PM, Jeff Zhang wrote:
>>
>> I found it is due to SPARK-11073: the change to SecurityManager.scala
>> that added the com.google.common.hash.HashCodes import shown in the
>> error below.
>>
>> Here's the command I used to build:
>>
>> build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -Psparkr
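>>
>> To double-check which Guava jars sbt pulled in, something like
>> ls lib_managed/bundles | grep guava should list both versions
>> (assuming the default lib_managed layout).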
>>
>> On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré <j...@nanthrax.net> wrote:
>>
>>     Hi Jeff,
>>
>>     it works for me (with the tests skipped).
>>
>>     Let me try again, just to be sure.
>>
>>     Regards
>>     JB
>>
>>
>>     On 11/03/2015 11:50 AM, Jeff Zhang wrote:
>>
>>         Looks like it's due to a Guava version conflict: I see both
>>         guava 14.0.1 and 16.0.1 under lib_managed/bundles. Has anyone
>>         else run into this issue?
>>
>>         [error] /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
>>         object HashCodes is not a member of package com.google.common.hash
>>         [error] import com.google.common.hash.HashCodes
>>         [error]        ^
>>         [info] Resolving org.apache.commons#commons-math;2.2 ...
>>         [error] /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
>>         not found: value HashCodes
>>         [error]         val cookie = HashCodes.fromBytes(secret).toString()
>>         [error]                      ^
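>>
>>         Until the Guava mixup is sorted out, here is an untested,
>>         Guava-free sketch that should produce the same lowercase hex
>>         string as HashCodes.fromBytes(secret).toString(), assuming
>>         secret is an Array[Byte]:
>>
>>         // hex-encode each byte; HashCode.toString() also renders
>>         // the underlying bytes as two-digit lowercase hex
>>         val cookie = secret.map("%02x".format(_)).mkString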
>>
>>
>>
>>
>>         --
>>         Best Regards
>>
>>         Jeff Zhang
>>
>>
>>     --
>>     Jean-Baptiste Onofré
>>     jbono...@apache.org
>>     http://blog.nanthrax.net
>>     Talend - http://www.talend.com
>>
>>
>>
>>
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>
>
> --
> Jean-Baptiste Onofré
> jbono...@apache.org
> http://blog.nanthrax.net
> Talend - http://www.talend.com
>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
