Re: [apache-spark][Spark SQL][Debug] Maven Spark build fails while compiling spark-hive-thriftserver_2.12 for Hadoop 2.10.1

2021-09-17 Thread Sean Owen
I don't think that has ever shown up in the CI/CD builds, and I can't recall anyone reporting this. What did you change? It may be some local env issue. On Fri, Sep 17, 2021 at 7:09 AM Enrico Minardi wrote: > > Hello, > > > the Maven build of Apache Spark 3.1.2 for user-provided Hadoop 2.10.1

[apache-spark][Spark SQL][Debug] Maven Spark build fails while compiling spark-hive-thriftserver_2.12 for Hadoop 2.10.1

2021-09-17 Thread Enrico Minardi
Hello, the Maven build of Apache Spark 3.1.2 for user-provided Hadoop 2.10.1 with Hive and Hive-Thriftserver profiles fails while compiling spark-hive-thriftserver_2.12. I am most probably missing something. Could you please help? I have searched the Scala-Maven-Plugin website
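For reference, a build invocation of the shape the poster describes would look roughly like this (a sketch only; the exact flags are not shown in the snippet, and the hadoop-provided profile is an assumption based on "user-provided Hadoop"):

  ./build/mvn -Pyarn -Phadoop-provided -Dhadoop.version=2.10.1 -Phive -Phive-thriftserver -DskipTests clean package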

spark ./build/mvn test failed on aarch64

2019-06-05 Thread Tianhua huang
Hi all, Recently I ran './build/mvn test' for Spark on aarch64, and both master and branch-2.4 failed; the log pieces are below: .. [INFO] T E S T S [INFO] --- [INFO] Running org.apache.spark.util.kvstore.LevelDBTypeInfoSuite [INFO] Tests
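To iterate on just the failing suite rather than the whole build, Maven's -pl flag can restrict the reactor to one module (a sketch, assuming the kvstore classes live in common/kvstore as they do in these source trees; -am builds the module's dependencies first):

  ./build/mvn -pl common/kvstore -am test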

CVE-2018-11804: Apache Spark build/mvn runs zinc, and can expose information from build machines

2018-10-24 Thread Sean Owen
Severity: Low Vendor: The Apache Software Foundation Versions Affected: 1.3.x release branch and later, including master Description: Spark's Apache Maven-based build includes a convenience script, 'build/mvn', that downloads and runs a zinc server to speed up compilation. This server will
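One way to sidestep the exposure, pending an upgrade, is to build with a standalone Maven installation instead of the wrapper script, since plain mvn never starts the zinc server (a sketch, not the advisory's official remediation):

  mvn -DskipTests clean package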

Re: Spark build 1.6.2 error

2016-09-03 Thread Diwakar Dhanuskodi
Sorry, my bad. In both runs I included -Dscala-2.11. On Sat, Sep 3, 2016 at 12:39 PM, Nachiketa wrote: > I think the difference was adding -Dscala-2.11 to the command line. > > I have seen this show up when I miss that. > > Regards, > Nachiketa > > On Sat 3 Sep, 2016,

Re: Spark build 1.6.2 error

2016-09-03 Thread Nachiketa
I think the difference was adding -Dscala-2.11 to the command line. I have seen this show up when I miss that. Regards, Nachiketa On Sat 3 Sep, 2016, 12:14 PM Diwakar Dhanuskodi, < diwakar.dhanusk...@gmail.com> wrote: > Hi, > > Just re-ran again without killing zinc server process > >

Re: Spark build 1.6.2 error

2016-09-03 Thread Diwakar Dhanuskodi
Hi, Just re-ran again without killing the zinc server process: ./make-distribution.sh --name custom-spark --tgz -Phadoop-2.6 -Phive -Pyarn -Dmaven.version=3.0.4 -Dscala-2.11 -X -rf :spark-sql_2.11 The build succeeded. Not sure how it worked by just re-running the command again. On Sat, Sep 3, 2016 at

Re: Spark build 1.6.2 error

2016-09-03 Thread Diwakar Dhanuskodi
Hi, Java version 7. mvn command: ./make-distribution.sh --name custom-spark --tgz -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn -Dmaven.version=3.0.4 Yes, I executed the script to change the Scala version to 2.11, killed the "com.typesafe zinc.Nailgun" process, and re-ran mvn with the below command again
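Piecing the thread together, the working sequence for a Scala 2.11 build of 1.6.x is roughly the following (a sketch; the script name is taken from the 1.6 source tree and worth verifying against your checkout):

  ./dev/change-scala-version.sh 2.11
  ./make-distribution.sh --name custom-spark --tgz -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn -Dscala-2.11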

Re: Spark build 1.6.2 error

2016-08-31 Thread Divya Gehlot
Which Java version are you using? On 31 August 2016 at 04:30, Diwakar Dhanuskodi wrote: > Hi, > > While building Spark 1.6.2, I am getting the below error in spark-sql. I'd > much appreciate any help. > > [ERROR] missing or invalid dependency detected while loading class

Re: Spark build 1.6.2 error

2016-08-31 Thread Adam Roberts
(started with build/mvn) From: Nachiketa <nachiketa.shu...@gmail.com> To: Diwakar Dhanuskodi <diwakar.dhanusk...@gmail.com> Cc: user <user@spark.apache.org> Date: 31/08/2016 12:17 Subject: Re: Spark build 1.6.2 error Hi Diwakar, Could you please sh

Re: Spark build 1.6.2 error

2016-08-31 Thread Nachiketa
Hi Diwakar, Could you please share the entire Maven command that you are using to build? And also the JDK version you are using? Also, could you please confirm that you executed the script to change the Scala version to 2.11 before starting the build? Thanks. Regards, Nachiketa On Wed, Aug

Spark build 1.6.2 error

2016-08-30 Thread Diwakar Dhanuskodi
Hi, While building Spark 1.6.2, I am getting the below error in spark-sql. I'd much appreciate any help. [ERROR] missing or invalid dependency detected while loading class file 'WebUI.class'. Could not access term eclipse in package org, because it (or its dependencies) are missing. Check your build

Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Eric Richardson
Good news - and Java 8 as well. I saw Matei after his talk at Scala Days and he said he would look into a 2.11 default, but it seems that is already the plan. Scala 2.12 is getting closer as well. On Mon, May 16, 2016 at 2:55 PM, Ted Yu wrote: > For 2.0, I believe that is

Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Ted Yu
For 2.0, I believe that is the case. Jenkins jobs have been running against Scala 2.11: [INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ java8-tests_2.11 --- FYI On Mon, May 16, 2016 at 2:45 PM, Eric Richardson wrote: > On Thu, May 12,

Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Eric Richardson
On Thu, May 12, 2016 at 9:23 PM, Luciano Resende wrote: > Spark has moved to build using Scala 2.11 by default in master/trunk. Does this mean that the pre-built binaries for download will also move to 2.11 as well? > As for the 2.0.0-SNAPSHOT, it is actually the

Re: sbt for Spark build with Scala 2.11

2016-05-13 Thread Raghava Mutharaju
Thank you for the response. I used the following command to build from source: build/mvn -Dhadoop.version=2.6.4 -Phadoop-2.6 -DskipTests clean package Would this put the required jars in .ivy2 during the build process? If so, how can I make the Spark distribution runnable, so that I can use
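A package build alone does not yield a runnable distribution; Spark ships a script for that. A sketch of its use with the same flags as above (on master at this time the script lives under dev/; in 1.6.x it sits at the repo root):

  ./dev/make-distribution.sh --name hadoop2.6 --tgz -Phadoop-2.6 -Dhadoop.version=2.6.4 -DskipTests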

Re: sbt for Spark build with Scala 2.11

2016-05-12 Thread Luciano Resende
Spark has moved to build using Scala 2.11 by default in master/trunk. As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk, and you might be missing some modules/profiles in your build. What command did you use to build? On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <

sbt for Spark build with Scala 2.11

2016-05-12 Thread Raghava Mutharaju
Hello All, I built Spark from the source code available at https://github.com/apache/spark/. Although I didn't specify the "-Dscala-2.11" option (to build with Scala 2.11), from the build messages I see that it ended up using Scala 2.11. Now, for my application's sbt, what should be the spark

Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-09 Thread Luciano Resende
> blogs.oracle.com > Get Oracle JDBC drivers and UCP from Oracle Maven Repository (without > IDEs) By Nirmala Sundarappa-Oracle on Feb 15, 2016 From: Mich Talebzadeh <mich.talebza...@gmail.com>

Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-09 Thread Andrew Lee
Mich Talebzadeh <mich.talebza...@gmail.com> Sent: Tuesday, May 3, 2016 1:04 AM To: Luciano Resende Cc: Hien Luu; ☼ R Nair (रविशंकर नायर); user Subject: Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0 Which version of Spark are you using? Dr Mich Talebzadeh LinkedIn https:

Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-03 Thread Mich Talebzadeh
Which version of Spark are you using? Dr Mich Talebzadeh LinkedIn https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw http://talebzadehmich.wordpress.com On 3 May 2016 at

Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread Luciano Resende
You might have a settings.xml that is forcing your internal Maven repository to be the mirror of external repositories, and thus not finding the dependency. On Mon, May 2, 2016 at 6:11 PM, Hien Luu wrote: > No, I am not. I am considering downloading it manually and placing it
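A quick way to test that theory is to point Maven at a settings file that defines no mirrors (a sketch; the minimal settings file below is hypothetical):

  echo '<settings/>' > /tmp/empty-settings.xml
  ./build/mvn -s /tmp/empty-settings.xml -DskipTests clean package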

Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread Hien Luu
No, I am not. I am considering downloading it manually and placing it in my local repository. On Mon, May 2, 2016 at 5:54 PM, ☼ R Nair (रविशंकर नायर) < ravishankar.n...@gmail.com> wrote: > Oracle jdbc is not part of the Maven repository, are you keeping a downloaded > file in your local repo? > >
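The usual way to do that is Maven's install-file goal, which registers a downloaded jar under the coordinates the build expects (a sketch, assuming the driver has been fetched from Oracle as ojdbc6.jar):

  mvn install:install-file -Dfile=ojdbc6.jar -DgroupId=com.oracle -DartifactId=ojdbc6 -Dversion=11.2.0.1.0 -Dpackaging=jar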

Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread Ted Yu
From the output of dependency:tree on the master branch: [INFO] Building Spark Project Docker Integration Tests 2.0.0-SNAPSHOT [WARNING] The

Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread रविशंकर नायर
Oracle JDBC is not part of the Maven repository; are you keeping a downloaded file in your local repo? Best, RS On May 2, 2016 8:51 PM, "Hien Luu" wrote: > Hi all, > > I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0. > It kept getting "Operation timed

Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread Hien Luu
Hi all, I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0. It kept getting "Operation timed out" while building the Spark Project Docker Integration Tests module (see the error below). Has anyone run into this problem before? If so, how did you work around it? [INFO]

Re: [spark] build/sbt gen-idea error

2016-04-12 Thread Sean Owen
We just removed the gen-idea plugin. Just import the Maven project into IDEA or Eclipse. On Tue, Apr 12, 2016 at 4:52 PM, ImMr.K <875061...@qq.com> wrote: > But how to import the spark repo into IDEA or Eclipse? > > > > -- Original Message -- > From: Ted Yu >

Re: [spark] build/sbt gen-idea error

2016-04-12 Thread ImMr.K
gen-idea doesn't seem to be a valid command: [warn] Ignoring load failure: no project loaded. [error] Not a valid command: gen-idea [error] gen-idea On Tue, Apr 12, 2016 at 8:28 AM, ImMr.K <875061...@qq.com> wrote: Hi, I have cloned spark and ran: cd spark; build/sbt gen-idea; got the following

Re: Re:[spark] build/sbt gen-idea error

2016-04-12 Thread Marco Mistroni
gen-idea doesn't seem to be a valid command: [warn] Ignoring load failure: no project loaded. [error] Not a valid command: gen-idea [error] gen-idea On Tue, Apr 12, 2016 at 8:28 AM, ImMr.K <875061...@qq.com> wrote: > Hi, > I have cloned spark and ran: > cd spark > build/sbt gen-idea > > g

Re: [spark] build/sbt gen-idea error

2016-04-12 Thread Ted Yu
failure: no project loaded. > [error] Not a valid command: gen-idea > [error] gen-idea > > On Tue, Apr 12, 2016 at 8:28 AM, ImMr.K <875061...@qq.com> wrote: > >> Hi, >> I have cloned spark and ran: >> cd spark >> build/sbt gen-idea >> >> got the

Spark build error

2015-11-17 Thread 金国栋
Hi! I tried to build the Spark source code from GitHub, and I successfully built it from the command line using `sbt/sbt assembly`. But I encountered an error when compiling the project in IntelliJ IDEA (v14.1.5). The error log is below: Error: scala: while compiling:

Re: Spark build error

2015-11-17 Thread Ted Yu
Is the Scala version in IntelliJ the same as the one used by sbt? Cheers On Tue, Nov 17, 2015 at 6:45 PM, 金国栋 wrote: > Hi! > > I tried to build the Spark source code from GitHub, and I successfully built > it from the command line using `sbt/sbt assembly`. But I encountered an >

Re: Spark build error

2015-11-17 Thread Jeff Zhang
This also bothered me for a long time. I suspect the IntelliJ builder conflicts with the sbt/Maven builder. I resolved this issue by rebuilding Spark in IntelliJ. You may meet compilation issues when building it in IntelliJ. For that you need to put external/flume-sink/target/java on the source build

Re: Spark build/sbt assembly

2015-07-30 Thread Rahul Palamuttam

Re: Spark build/sbt assembly

2015-07-30 Thread Akhil Das

Re: Spark build/sbt assembly

2015-07-27 Thread Ted Yu
node it works but on the other it gives me the above error. Thanks, Rahul P

Re: Spark build/sbt assembly

2015-07-27 Thread Rahul Palamuttam
the above error. Thanks, Rahul P

Re: Spark build/sbt assembly

2015-07-27 Thread Rahul Palamuttam

Spark build/sbt assembly

2015-07-27 Thread Rahul Palamuttam
do notice on one node it works but on the other it gives me the above error. Thanks, Rahul P

Spark build with Hive

2015-05-20 Thread guoqing0...@yahoo.com.hk
Hi, can Spark 1.3.1 be built with Hive 1.2? It seems Spark 1.3.1 can only be built with Hive 0.13 or 0.12, according to the document: # Apache Hadoop 2.4.X with Hive 13 support mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package # Apache Hadoop

Re: RE: Spark build with Hive

2015-05-20 Thread guoqing0...@yahoo.com.hk
Thanks very much. Which versions will be supported in the upcoming 1.4? I hope it will support more versions. guoqing0...@yahoo.com.hk From: Cheng, Hao Date: 2015-05-21 11:20 To: Ted Yu; guoqing0...@yahoo.com.hk CC: user Subject: RE: Spark build with Hive Yes, ONLY support 0.12.0 and 0.13.1

RE: RE: Spark build with Hive

2015-05-20 Thread Wang, Daoyuan
In 1.4 I think we still only support 0.12.0 and 0.13.1. From: guoqing0...@yahoo.com.hk [mailto:guoqing0...@yahoo.com.hk] Sent: Thursday, May 21, 2015 12:03 PM To: Cheng, Hao; Ted Yu Cc: user Subject: Re: RE: Spark build with Hive Thanks very much. Which versions will be supported in the upcoming 1.4

Re: Spark build with Hive

2015-05-20 Thread Ted Yu
I am afraid even Hive 1.0 is not supported, let alone Hive 1.2. Cheers On Wed, May 20, 2015 at 8:08 PM, guoqing0...@yahoo.com.hk wrote: Hi, can Spark 1.3.1 be built with Hive 1.2? It seems Spark 1.3.1 can only be built with Hive 0.13 or 0.12, according to the

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-06 Thread Todd Nist
First, thanks to everyone for their assistance and recommendations. @Marcelo I applied the patch that you recommended and am now able to get into the shell. Thank you, it worked great once I realized that the pom was pointing to the 1.3.0-SNAPSHOT parent and needed to be bumped down to 1.2.1. @Zhan

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-06 Thread Zhan Zhang
Hi Todd, Looks like the thrift server can connect to the metastore, but something is wrong in the executors. You can try to get the log with "yarn logs -applicationId xxx" to check why it failed. If there is no log (master or executor is not started at all), you can go to the RM webpage, click the
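For reference, the shape of that log-retrieval command, using the application ID that appears later in this thread (substitute your own from the RM page):

  yarn logs -applicationId application_1425078697953_0020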

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-06 Thread Todd Nist
Hi Zhan, I applied the patch you recommended, https://github.com/apache/spark/pull/3409, and it now works. It was failing with this: Exception message: /hadoop/yarn/local/usercache/root/appcache/application_1425078697953_0020/container_1425078697953_0020_01_02/launch_container.sh: line 14:

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-06 Thread Zhan Zhang
Sorry, my misunderstanding. Looks like it already worked. If you still hit some hdp.version problem, you can try it :) Thanks. Zhan Zhang On Mar 6, 2015, at 11:40 AM, Zhan Zhang <zzh...@hortonworks.com> wrote: You are using 1.2.1, right? If so, please add java-opts in

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-06 Thread Zhan Zhang
You are using 1.2.1, right? If so, please add a java-opts file in the conf directory and give it a try. [root@c6401 conf]# more java-opts -Dhdp.version=2.2.2.0-2041 Thanks. Zhan Zhang On Mar 6, 2015, at 11:35 AM, Todd Nist <tsind...@gmail.com> wrote:
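A sketch of creating that file with the value shown above (the hdp.version string is cluster-specific):

  echo "-Dhdp.version=2.2.2.0-2041" > conf/java-opts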

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-06 Thread Todd Nist
Working great now, after applying that patch; thanks again. On Fri, Mar 6, 2015 at 2:42 PM, Zhan Zhang zzh...@hortonworks.com wrote: Sorry. Misunderstanding. Looks like it already worked. If you still met some hdp.version problem, you can try it :) Thanks. Zhan Zhang On Mar 6, 2015,

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Marcelo Vanzin
It seems from the excerpt below that your cluster is set up to use the Yarn ATS, and the code is failing in that path. I think you'll need to apply the following patch to your Spark sources if you want this to work: https://github.com/apache/spark/pull/3938 On Thu, Mar 5, 2015 at 10:04 AM, Todd

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Zhan Zhang
In addition, you may need the following patch if it is not in 1.2.1, to solve a system property issue if you use HDP 2.2: https://github.com/apache/spark/pull/3409 You can follow the following link to set hdp.version for java options.

Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Todd Nist
I am running Spark on a HortonWorks HDP cluster. I have deployed their prebuilt version, but it is only for Spark 1.2.0, not 1.2.1, and there are a few fixes and features in 1.2.1 that I would like to leverage. I just downloaded the spark-1.2.1 source and built it to support Hadoop 2.6 by doing the

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Sean Owen
Jackson 1.9.13? And codehaus.jackson.version? That's already set by the hadoop-2.4 profile. On Thu, Mar 5, 2015 at 6:13 PM, Ted Yu <yuzhih...@gmail.com> wrote: Please add the following to the build command: -Djackson.version=1.9.3 Cheers On Thu, Mar 5, 2015 at 10:04 AM, Todd Nist

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Victor Tso-Guillen
That particular class you did find is under parquet/..., which means it was shaded. Did you build your application against a hadoop2.6 dependency? Maven Central only has 2.2, but HDP has its own repos. On Thu, Mar 5, 2015 at 10:04 AM, Todd Nist <tsind...@gmail.com> wrote: I am running

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Ted Yu
Please add the following to the build command: -Djackson.version=1.9.3 Cheers On Thu, Mar 5, 2015 at 10:04 AM, Todd Nist <tsind...@gmail.com> wrote: I am running Spark on a HortonWorks HDP cluster. I have deployed their prebuilt version but it is only for Spark 1.2.0, not 1.2.1, and there are a few

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Todd Nist
@Victor, I'm pretty sure I built it correctly; I specified -Dhadoop.version=2.6.0. Am I missing something here? I followed the docs on this but I'm open to suggestions. make-distribution.sh --name hadoop2.6 --tgz -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -DskipTests

Re: Is it safe to use Scala 2.11 for Spark build?

2014-11-18 Thread Jianshi Huang
Ok, I'll wait until -Pscala-2.11 is more stable and used by more people. Thanks for the help! Jianshi On Tue, Nov 18, 2014 at 3:49 PM, Ye Xianjin <advance...@gmail.com> wrote: Hi Prashant Sharma, It's not even ok to build with the scala-2.11 profile on my machine. Just check out the

Is it safe to use Scala 2.11 for Spark build?

2014-11-17 Thread Jianshi Huang
Any notable issues with using Scala 2.11? Is it stable now? Or can I use Scala 2.11 in my Spark application with a Spark dist built with 2.10? I'm looking forward to migrating to 2.11 for some quasiquote features. Couldn't make them run in 2.10... Cheers, -- Jianshi Huang LinkedIn: jianshi

Re: Is it safe to use Scala 2.11 for Spark build?

2014-11-17 Thread Prashant Sharma
It is safe in the sense that we would help you with the fix if you run into issues. I have used it, but since I worked on the patch the opinion may be biased. I am using Scala 2.11 for day-to-day development. You should check out the build instructions here:
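The documented steps for a 2.11 build in this era were roughly the following (a sketch; the script below is the 1.2-era name and was later renamed dev/change-scala-version.sh, so verify against your checkout):

  ./dev/change-version-to-2.11.sh
  mvn -Pscala-2.11 -DskipTests clean package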

Re: Is it safe to use Scala 2.11 for Spark build?

2014-11-17 Thread Prashant Sharma
Looks like sbt/sbt -Pscala-2.11 is broken by a recent patch for improving the Maven build. Prashant Sharma On Tue, Nov 18, 2014 at 12:57 PM, Prashant Sharma <scrapco...@gmail.com> wrote: It is safe in the sense that we would help you with the fix if you run into issues. I have used it, but since I

Re: Is it safe to use Scala 2.11 for Spark build?

2014-11-17 Thread Ye Xianjin
Hi Prashant Sharma, It's not even ok to build with the scala-2.11 profile on my machine. Just check out the master (c6e0c2ab1c29c184a9302d23ad75e4ccd8060242) and run sbt/sbt -Pscala-2.11 clean assembly: .. (skipping the normal part) [info] Resolving org.scalamacros#quasiquotes_2.11;2.0.1 ... [warn] module

Spark Build

2014-10-31 Thread Terry Siu
I am synced up to the Spark master branch as of commit 23468e7e96. I have Maven 3.0.5, Scala 2.10.3, and SBT 0.13.1. I've built the master branch successfully before and am trying to rebuild again to take advantage of the new Hive 0.13.1 profile. I execute the following command: $ mvn
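The truncated command was presumably of this shape (a sketch, assuming the hive-0.13.1 profile that master carried at the time; the poster's exact profiles are cut off):

  mvn -Pyarn -Phadoop-2.4 -Phive -Phive-0.13.1 -DskipTests clean package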

Re: Spark Build

2014-10-31 Thread Shivaram Venkataraman
Yeah looks like https://github.com/apache/spark/pull/2744 broke the build. We will fix it soon On Fri, Oct 31, 2014 at 12:21 PM, Terry Siu terry@smartfocus.com wrote: I am synced up to the Spark master branch as of commit 23468e7e96. I have Maven 3.0.5, Scala 2.10.3, and SBT 0.13.1. I’ve

Re: Spark Build

2014-10-31 Thread Terry Siu
Thanks for the update, Shivaram. -Terry On 10/31/14, 12:37 PM, Shivaram Venkataraman shiva...@eecs.berkeley.edu wrote: Yeah looks like https://github.com/apache/spark/pull/2744 broke the build. We will fix it soon On Fri, Oct 31, 2014 at 12:21 PM, Terry Siu terry@smartfocus.com wrote: I

Spark build error

2014-08-06 Thread Priya Ch
Hi, I am trying to build jars using the command: mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package Execution of the above command throws the following error: [INFO] Spark Project Core ... FAILURE [0.295 s] [INFO] Spark Project Bagel