Re: Issue while building spark project

2022-07-19 Thread rajat kumar
Thanks a lot Sean On Mon, Jul 18, 2022, 21:58 Sean Owen wrote: > Increase the stack size for the JVM when Maven / SBT run. The build sets > this but you may still need something like "-Xss4m" in your MAVEN_OPTS > > On Mon, Jul 18, 2022 at 11:18 AM rajat kumar > wrote: > >> Hello , >> >> Can any

Re: Issue while building spark project

2022-07-18 Thread Sean Owen
Increase the stack size for the JVM when Maven / SBT run. The build sets this but you may still need something like "-Xss4m" in your MAVEN_OPTS On Mon, Jul 18, 2022 at 11:18 AM rajat kumar wrote: > Hello , > > Can anyone pls help me in below error. It is a maven project. It is coming > while bui
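
A minimal sketch of Sean's suggestion, assuming a bash shell and the in-tree build/mvn wrapper (the heap and code-cache values are the ones from the Spark build docs quoted further down this page):

    export MAVEN_OPTS="-Xss4m -Xmx2g -XX:ReservedCodeCacheSize=512m"   # -Xss4m raises the scalac thread stack size
    ./build/mvn -DskipTests clean package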

Issue while building spark project

2022-07-18 Thread rajat kumar
Hello, Can anyone please help me with the error below? It is a Maven project, and the error comes up while building it: [ERROR] error: java.lang.StackOverflowError [INFO] at scala.tools.nsc.typechecker.Typers$Typer.typedApply$1(Typers.scala:4885)

Building Spark 3.0.0 for Hive 1.2

2020-07-10 Thread Patrick McCarthy
I'm trying to build Spark 3.0.0 for my Yarn cluster, with Hadoop 2.7.3 and Hive 1.2.1. I downloaded the source and created a runnable dist with ./dev/make-distribution.sh --name custom-spark --pip --r --tgz -Psparkr -Phive-1.2 -Phadoop-2.7 -Pyarn We're running Spark 2.4.0 in production so I copie

Building Spark + hadoop docker for openshift

2020-03-30 Thread Antoine DUBOIS
Hello, I'm trying to build a spark+hadoop docker image compatible with Openshift. I've used the oshinko Spark build script here https://github.com/radanalyticsio/openshift-spark to build something with the Hadoop jars in the classpath to allow usage of S3 storage. However I'm now stuck on the spark entrypoi

Target java version not set when building spark with tags/v2.4.0-rc2

2018-10-07 Thread Shubham Chaurasia
Hi All, I built spark with tags/v2.4.0-rc2 using ./build/mvn -DskipTests -Phadoop-2.7 -Dhadoop.version=3.1.0 clean install Now from spark-shell, whenever I call any static method residing in an interface, it shows me an error like: :28: error: Static methods in interface require -target:jvm-1.8
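
One workaround sometimes suggested for this error is setting the compiler target inside the REPL session itself; a minimal sketch, assuming a Scala 2.11-era spark-shell whose :settings command accepts compiler flags (an illustration, not a confirmed fix for the 2.4.0-rc2 build):

    scala> :settings -target:jvm-1.8
    scala> // then retry the call to the interface's static method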

Building Spark with hive 1.1.0

2017-11-06 Thread HARSH TAKKAR
Hi, I am using the cloudera (cdh5.11.0) setup, which has Hive version 1.1.0, but when I build Spark with Hive and Thrift support it packages Hive version 1.6.0. Please let me know how I can build Spark with Hive 1.1.0. The command I am using to build: ./dev/make-distribution.sh --name my
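
For reference, a hedged sketch of a CDH-targeted distribution build (the -Dhadoop.version value is a hypothetical CDH version string; note that the Hive version Spark bundles is pinned in the pom of the Spark branch being built, so a command-line flag alone may not swap it to 1.1.0):

    ./dev/make-distribution.sh --name my-spark --tgz \
      -Phive -Phive-thriftserver -Pyarn \
      -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.11.0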

Re: building Spark 2.1 vs Java 1.8 on Ubuntu 16/06

2016-10-06 Thread Marco Mistroni
Thanks Fred for the pointers... so far I was only able to build 2.1 with Java 7 and no zinc. Will try the options you suggest. FYI, building with sbt ends up in OOM even with Java 7. I will try and update this thread. Kr On 6 Oct 2016 8:58 pm, "Fred Reiss" wrote: > There's no option to prevent build/mvn from

Re: building Spark 2.1 vs Java 1.8 on Ubuntu 16/06

2016-10-06 Thread Fred Reiss
There's no option to prevent build/mvn from starting the zinc server, but you should be able to prevent the maven build from using the zinc server by changing the option at line 1935 of the master pom.xml. Note that the zinc-based compile works on my Ubuntu 16.04 box. You might be able to get zin
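
Two hedged ways to keep zinc out of the build (the zinc install path below is where the 2.x-era build/mvn script unpacks it and may differ in your checkout):

    # shut down a zinc server that a previous build/mvn run left behind
    ./build/zinc-0.3.9/bin/zinc -shutdown

    # or invoke your system Maven directly; unlike build/mvn it does not start zinc
    mvn -DskipTests clean package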

Re: building Spark 2.1 vs Java 1.8 on Ubuntu 16/06

2016-10-06 Thread Marco Mistroni
Thanks Fred. The build/mvn will trigger compilation using zinc and I want to avoid that, as every time I have tried it, it runs into errors while compiling spark core. How can I disable zinc by default? Kr On 5 Oct 2016 10:53 pm, "Fred Reiss" wrote: > Actually the memory options *are* required for Jav

Re: building Spark 2.1 vs Java 1.8 on Ubuntu 16/06

2016-10-05 Thread Fred Reiss
Actually the memory options *are* required for Java 1.8. Without them the build will fail intermittently. We just updated the documentation with regard to this fact in Spark 2.0.1. Relevant PR is here: https://github.com/apache/spark/pull/15005 Your best bet as the project transitions from Java 7

Re: building Spark 2.1 vs Java 1.8 on Ubuntu 16/06

2016-10-05 Thread Marco Mistroni
Thanks Richard. It also says that for Java 1.8 the MAVEN_OPTS are not required... unless I misinterpreted the instructions... Kr On 5 Oct 2016 9:20 am, "Richard Siebeling" wrote: > sorry, now with the link included, see http://spark.apache.org/ > docs/latest/building-spark.html > > On Wed, Oct 5,

Re: building Spark 2.1 vs Java 1.8 on Ubuntu 16/06

2016-10-05 Thread Richard Siebeling
sorry, now with the link included, see http://spark.apache.org/docs/latest/building-spark.html On Wed, Oct 5, 2016 at 10:19 AM, Richard Siebeling wrote: > Hi, > > did you set the following option: export MAVEN_OPTS="-Xmx2g > -XX:ReservedCodeCacheSize=512m" > > kind regards, > Richard > > On Tue,

Re: building Spark 2.1 vs Java 1.8 on Ubuntu 16/06

2016-10-05 Thread Richard Siebeling
Hi, did you set the following option: export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m" kind regards, Richard On Tue, Oct 4, 2016 at 10:21 PM, Marco Mistroni wrote: > Hi all > my mvn build of Spark 2.1 using Java 1.8 is spinning out of memory with > an error saying it cannot allocate

building Spark 2.1 vs Java 1.8 on Ubuntu 16/06

2016-10-04 Thread Marco Mistroni
Hi all, my mvn build of Spark 2.1 using Java 1.8 is spinning out of memory, with an error saying it cannot allocate enough memory during maven compilation. Instructions (on the Spark 2.0 page) say that MAVEN_OPTS is not needed for Java 1.8 and, according to my understanding, the spark build process wil

Re: Error in building spark core on windows - any suggestions please

2016-08-03 Thread Sean Owen
Hm, all of the Jenkins builds are OK, but none of them run on Windows. It could be a Windows-specific thing. It means the launcher process exited abnormally. I don't see any log output from the process, so maybe it failed entirely to launch. Anyone else on Windows seeing this fail or work? On We

Re: Error in building spark core on windows - any suggestions please

2016-08-03 Thread Tony Lane
Compiling without running tests... and this is going fine .. On Wed, Aug 3, 2016 at 8:00 PM, Tony Lane wrote: > I am trying to build spark in windows, and getting the following test > failures and consequent build failures. > > [INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ > spar

Error in building spark core on windows - any suggestions please

2016-08-03 Thread Tony Lane
I am trying to build spark in windows, and getting the following test failures and consequent build failures. [INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ spark-core_2.11 --- --- T E S T S --

Re: build error - failing test- Error while building spark 2.0 trunk from github

2016-07-31 Thread Jacek Laskowski
Hi, Can you share the command you run the build with? What's the OS? Java? Regards, Jacek Laskowski https://medium.com/@jaceklaskowski/ Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark Follow me at https://twitter.com/jaceklaskowski On Sun, Jul 31, 2016 at 6:54 PM, Rohit

build error - failing test- Error while building spark 2.0 trunk from github

2016-07-31 Thread Rohit Chaddha
--- T E S T S --- Running org.apache.spark.api.java.OptionalSuite Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.052 sec - in org.apache.spark.api.java.OptionalSuite Running o

Building Spark 2 from source that does not include the Hive jars

2016-07-27 Thread Mich Talebzadeh
Hi, This has worked before, including 1.6.1 etc.: build Spark without Hive jars, the idea being to use Spark as the Hive execution engine. There are some notes on Hive on Spark: Getting Started. The usual process is to d

Re: Building Spark 2.X in Intellij

2016-06-23 Thread Stephen Boesch
I just checked out a completely fresh directory and created a new IJ project. Then followed your tip for adding the avro source. Here is an additional set of errors: Error:(31, 12) object razorvine is not a member of package net import net.razorvine.pickle.{IObjectPickler, Opcodes, Pickler}

Re: Building Spark 2.X in Intellij

2016-06-22 Thread Stephen Boesch
Thanks Jeff - I remember that now from long time ago. After making that change the next errors are: Error:scalac: missing or invalid dependency detected while loading class file 'RDDOperationScope.class'. Could not access term fasterxml in package com, because it (or its dependencies) are missing

Re: Building Spark 2.X in Intellij

2016-06-22 Thread Jeff Zhang
You need to add spark/external/flume-sink/target/scala-2.11/src_managed/main/compiled_avro under the build path; this is the only thing you need to do manually, if I remember correctly. On Thu, Jun 23, 2016 at 2:30 PM, Stephen Boesch wrote: > Hi Jeff, > I'd like to understand what may be different. I

Re: Building Spark 2.X in Intellij

2016-06-22 Thread Stephen Boesch
Hi Jeff, I'd like to understand what may be different. I have rebuilt and reimported many times. Just now I blew away the .idea/* and *.iml to start from scratch. I just opened the $SPARK_HOME directory from intellij File | Open . After it finished the initial import I tried to run one of the

Re: Building Spark 2.X in Intellij

2016-06-22 Thread Praveen R
I had some errors like SqlBaseParser class missing, and figured out I needed to get these classes from SqlBase.g4 using antlr4. It works fine now. On Thu, Jun 23, 2016 at 9:20 AM, Jeff Zhang wrote: > It works well with me. You can try reimport it into intellij. > > On Thu, Jun 23, 2016 at 10:25
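
A sketch of regenerating those parser classes from the command line instead of via the IDE, assuming the antlr4-maven-plugin is bound to the generate-sources phase in sql/catalyst (as in the 2.x poms):

    ./build/mvn -pl sql/catalyst -am -DskipTests generate-sources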

Re: Building Spark 2.X in Intellij

2016-06-22 Thread Jeff Zhang
It works well with me. You can try reimport it into intellij. On Thu, Jun 23, 2016 at 10:25 AM, Stephen Boesch wrote: > > Building inside intellij is an ever moving target. Anyone have the magical > procedures to get it going for 2.X? > > There are numerous library references that - although inc

Building Spark 2.X in Intellij

2016-06-22 Thread Stephen Boesch
Building inside intellij is an ever moving target. Anyone have the magical procedures to get it going for 2.X? There are numerous library references that - although included in the pom.xml build - are for some reason not found when processed within Intellij.

Re: Building spark submodule source code

2016-03-21 Thread Jakob Odersky
Another gotcha to watch out for are the SPARK_* environment variables. Have you exported SPARK_HOME? In that case, 'spark-shell' will use Spark from the variable, regardless of the place the script is called from. I.e. if SPARK_HOME points to a release version of Spark, your code changes will never
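
Jakob's check, as a quick sketch for a bash shell run from the dev checkout:

    echo $SPARK_HOME     # if this points at a release install, spark-shell will use that Spark
    unset SPARK_HOME
    ./bin/spark-shell    # now launches from the freshly built checkout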

Re: Building spark submodule source code

2016-03-20 Thread Akhil Das
Have a look at the intellij setup https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IntelliJ Once you have the setup ready, you don't have to recompile the whole stuff every time. Thanks Best Regards On Mon, Mar 21, 2016 at 8:14 AM, Tenghuan He wrote:

Re: Building spark submodule source code

2016-03-20 Thread Ted Yu
To speed up the build process, take a look at install_zinc() in build/mvn, around line 83. And the following around line 137: # Now that zinc is ensured to be installed, check its status and, if its # not running or just installed, start it FYI On Sun, Mar 20, 2016 at 7:44 PM, Tenghuan He wrot

Building spark submodule source code

2016-03-20 Thread Tenghuan He
Hi everyone, I am trying to add a new method to spark RDD. After changing the code of RDD.scala and running the following command mvn -pl :spark-core_2.10 -DskipTests clean install it reports BUILD SUCCESS; however, when starting bin\spark-shell, my method cannot be found. Do I have to
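
In the 1.x source layout, bin/spark-shell runs against the assembly jar, so rebuilding only spark-core is typically not enough; a hedged sketch for a Spark 1.6-era tree:

    build/mvn -pl :spark-core_2.10 -DskipTests clean install
    build/mvn -pl assembly -DskipTests package   # repackage the assembly that spark-shell actually loads
    ./bin/spark-shell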

Re: Error building spark app with Maven

2016-03-15 Thread Ted Yu
bq. remove them after the job finished. bq. That will keep audit people happy Looks like the above two may not be achieved at the same time :-) On Tue, Mar 15, 2016 at 5:04 PM, Mich Talebzadeh wrote: > in mvn the build mvn package will look for a file called pom.xml > > in sbt the build sbt pa

Re: Error building spark app with Maven

2016-03-15 Thread Mich Talebzadeh
that should read anything.sbt Dr Mich Talebzadeh LinkedIn https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw http://talebzadehmich.wordpress.com On 16 March 2016 at 00:04,

Re: Error building spark app with Maven

2016-03-15 Thread Mich Talebzadeh
in mvn the build mvn package will look for a file called pom.xml in sbt the build sbt package will look for a file called anything.smt It works Keep it simple I will write a ksh script that will create both generic and sbt files on the fly in the correct directory (at the top of the tree) and

Re: Error building spark app with Maven

2016-03-15 Thread Jakob Odersky
The artifactId in maven basically (in a simple case) corresponds to name in sbt. Note however that you will manually need to append the _scalaBinaryVersion to the artifactId in case you would like to build against multiple scala versions (otherwise maven will overwrite the generated jar with the l
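
Jakob's point in pom form, using the names from this thread (ImportCSV, Scala 2.10, version 1.0):

    <!-- sbt: name := "ImportCSV", scalaVersion := "2.10.4" -->
    <artifactId>importcsv_2.10</artifactId>  <!-- Scala binary version appended by hand in Maven -->
    <version>1.0</version>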

Re: Error building spark app with Maven

2016-03-15 Thread Ted Yu
Feel free to adjust artifact Id and version in maven. They're under your control. > On Mar 15, 2016, at 4:27 PM, Mich Talebzadeh > wrote: > > ok Ted > > In sbt I have > > name := "ImportCSV" > version := "1.0" > scalaVersion := "2.10.4" > > which ends up in importcsv_2.10-1.0.jar as part

Re: Error building spark app with Maven

2016-03-15 Thread Mich Talebzadeh
ok Ted In sbt I have name := "ImportCSV" version := "1.0" scalaVersion := "2.10.4" which ends up in importcsv_2.10-1.0.jar as part of target/scala-2.10/importcsv_2.10-1.0.jar In mvn I have 1.0 scala Does it matter? Dr Mich Talebzadeh LinkedIn https://www.linkedin.com/profile/vie

Re: Error building spark app with Maven

2016-03-15 Thread Ted Yu
1.0 ... scala On Tue, Mar 15, 2016 at 4:14 PM, Mich Talebzadeh wrote: > An observation > > Once compiled with MVN the job submit works as follows: > > + /usr/lib/spark-1.5.2-bin-hadoop2.6/bin/spark-submit --packages > com.databricks:spark-csv_2.11:1.3.0 --class ImportCSV --master spark:// > 50.1

Re: Error building spark app with Maven

2016-03-15 Thread Mich Talebzadeh
An observation Once compiled with MVN the job submit works as follows: + /usr/lib/spark-1.5.2-bin-hadoop2.6/bin/spark-submit --packages com.databricks:spark-csv_2.11:1.3.0 --class ImportCSV --master spark:// 50.140.197.217:7077 --executor-memory=12G --executor-cores=12 --num-executors=2 target/s

Re: Error building spark app with Maven

2016-03-15 Thread Mich Talebzadeh
Many thanks Ted and thanks for the heads up Jakob. Just these two changes to the dependencies: org.apache.spark spark-core_2.10 1.5.1 org.apache.spark spark-sql_2.10 1.5.1 [DEBUG] endProcessChildren: artifact=spark:scala:jar:1.0 [INFO] -

Re: Error building spark app with Maven

2016-03-15 Thread Jakob Odersky
Hi Mich, probably unrelated to the current error you're seeing, however the following dependencies will bite you later: spark-hive_2.10 spark-csv_2.11 the problem here is that you're using libraries built for different Scala binary versions (the numbers after the underscore). The simple fix here is

Re: Error building spark app with Maven

2016-03-15 Thread Ted Yu
Please suffix _2.10 to artifact name See: http://mvnrepository.com/artifact/org.apache.spark/spark-core_2.10 On Tue, Mar 15, 2016 at 3:08 PM, Mich Talebzadeh wrote: > Hi, > > I normally use sbt and using this sbt file works fine for me > > cat ImportCSV.sbt > name := "ImportCSV" > version := "1
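
What the corrected Maven dependencies look like, mirroring the sbt %% lines quoted in the original post below (versions as given in the thread):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.5.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId>
      <version>1.5.1</version>
    </dependency>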

Error building spark app with Maven

2016-03-15 Thread Mich Talebzadeh
Hi, I normally use sbt and using this sbt file works fine for me cat ImportCSV.sbt name := "ImportCSV" version := "1.0" scalaVersion := "2.10.4" libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1" libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1" libraryDependenc

Re: Building Spark packages with SBTor Maven

2016-03-15 Thread Mich Talebzadeh
Thanks, the maven structure is identical to sbt; just the sbt file I will have to replace with pom.xml. I will use your pom.xml to start with. Cheers Dr Mich Talebzadeh LinkedIn https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

Re: Building Spark packages with SBTor Maven

2016-03-15 Thread Chandeep Singh
Yes, sbt uses the same structure as maven for source files. > On Mar 15, 2016, at 1:53 PM, Mich Talebzadeh > wrote: > > Thanks the maven structure is identical to sbt. just sbt file I will have to > replace with pom.xml > > I will use your pom.xml to start with it. > > Cheers > > Dr Mich Ta

Re: Building Spark packages with SBTor Maven

2016-03-15 Thread Chandeep Singh
You can build using maven from the command line as well. This layout should give you an idea and here are some resources - http://www.scala-lang.org/old/node/345 project/ pom.xml - Defines the project src/ main/ java/ - Contains a
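
The layout Chandeep describes, reconstructed as the standard Maven source tree (sbt uses the same convention for source files):

    project/
      pom.xml          - defines the project
      src/
        main/
          java/        - Java sources
          scala/       - Scala sources
        test/
          java/
          scala/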

Re: Building Spark packages with SBTor Maven

2016-03-15 Thread Mich Talebzadeh
sounds like the layout is basically the same as the sbt layout, with the sbt file replaced by pom.xml? Dr Mich Talebzadeh LinkedIn https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

Re: Building Spark packages with SBTor Maven

2016-03-15 Thread Mich Talebzadeh
Thanks again. Is there any way one can set this up without Eclipse, much like what I did with sbt? I need to know the directory structure for a MVN project. Cheers Dr Mich Talebzadeh LinkedIn https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

Re: Building Spark packages with SBTor Maven

2016-03-15 Thread Chandeep Singh
Do you have the Eclipse Maven plugin setup? http://www.eclipse.org/m2e/ Once you have it setup, File -> New -> Other -> MavenProject -> Next / Finish. You’ll see a default POM.xml which you can modify / replace. Here is some documentation that should help: http

Re: Building Spark packages with SBTor Maven

2016-03-15 Thread Mich Talebzadeh
Great Chandeep. I also have the Eclipse Scala IDE (Scala IDE build of Eclipse SDK, Build id: 4.3.0-vfinal-2015-12-01T15:55:22Z-Typesafe). I am no expert on Eclipse, so if I create a project called ImportCSV, where do I need to put the pom file, or how do I reference it, please? My Eclipse runs on a Linux

Re: Building Spark packages with SBTor Maven

2016-03-15 Thread Chandeep Singh
Btw, just to add to the confusion ;) I use Maven as well since I moved from Java to Scala, but everyone I talk to has been recommending SBT for Scala. I use the Eclipse Scala IDE to build. http://scala-ide.org/ Here is my sample POM. You can add dependencies based on you

Re: Building Spark packages with SBTor Maven

2016-03-15 Thread Mich Talebzadeh
Ok. Sounds like opinion is divided :) I will try to build a scala app with Maven. When I build with SBT I follow this directory structure High level directory the package name like ImportCSV under ImportCSV I have a directory src and the sbt file ImportCSV.sbt in directory src I have main an

Re: Building Spark packages with SBTor Maven

2016-03-15 Thread Sean Owen
FWIW, I strongly prefer Maven over SBT even for Scala projects. The Spark build of reference is Maven. On Tue, Mar 15, 2016 at 10:45 AM, Chandeep Singh wrote: > For Scala, SBT is recommended. > > On Mar 15, 2016, at 10:42 AM, Mich Talebzadeh > wrote: > > Hi, > > I build my Spark/Scala packages u

Re: Building Spark packages with SBTor Maven

2016-03-15 Thread Chandeep Singh
Pulled this from Stack Overflow: We're using Maven to build Scala projects at work because it integrates well with our CI server. We could just run a shell script to kick off a build, of course, but we've got a bunch of other information coming out of Maven that we want to go into CI. That's abo

Re: Building Spark packages with SBTor Maven

2016-03-15 Thread Ted Yu
There're build jobs for both on Jenkins: https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7/ https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-hadoop-2.7/ You can choose either one. I use mvn. On Tue, Mar 15, 2016 at 3:42 AM, Mich Talebzadeh wrote: >

Re: Building Spark packages with SBTor Maven

2016-03-15 Thread Chandeep Singh
For Scala, SBT is recommended. > On Mar 15, 2016, at 10:42 AM, Mich Talebzadeh > wrote: > > Hi, > > I build my Spark/Scala packages using SBT that works fine. I have created > generic shell scripts to build and submit it. > > Yesterday I noticed that some use Maven and Pom for this purpose.

Building Spark packages with SBTor Maven

2016-03-15 Thread Mich Talebzadeh
Hi, I build my Spark/Scala packages using SBT and that works fine. I have created generic shell scripts to build and submit it. Yesterday I noticed that some use Maven and Pom for this purpose. Which approach is recommended? Thanks, Dr Mich Talebzadeh LinkedIn https://www.linkedin.com/profi

Re: Building Spark with a Custom Version of Hadoop: HDFS ClassNotFoundException

2016-02-11 Thread Ted Yu
the 1.6.0 release. > > > Charles. > > -- > Date: Thu, 11 Feb 2016 17:41:54 -0800 > Subject: Re: Building Spark with a Custom Version of Hadoop: HDFS > ClassNotFoundException > From: yuzhih...@gmail.com > To: charliewri...@live.ca; user@spark.apache

Re: Building Spark with a Custom Version of Hadoop: HDFS ClassNotFoundException

2016-02-11 Thread Ted Yu
Thu, 11 Feb 2016 17:29:00 -0800 > Subject: Re: Building Spark with a Custom Version of Hadoop: HDFS > ClassNotFoundException > From: yuzhih...@gmail.com > To: charliewri...@live.ca > CC: d...@spark.apache.org > > Hdfs class is in hadoop-hdfs-XX.jar > > Can you check the clas

Re: building spark 1.6.0 fails

2016-01-29 Thread Sean Owen
You're somehow building with Java 6. At least this is what the error means. On Fri, Jan 29, 2016, 05:25 Carlile, Ken wrote: > I am attempting to build Spark 1.6.0 from source on EL 6.3, using Oracle > jdk 1.8.0.45, Python 2.7.6, and Scala 2.10.3. When I try to issue > build/mvn -DskipTests clea
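
A quick sanity check of which JDK the build actually picks up, for a bash shell (the JAVA_HOME path is hypothetical, matching the JDK named in the report):

    java -version                            # should report 1.7 or 1.8 for Spark 1.6
    echo $JAVA_HOME
    export JAVA_HOME=/usr/java/jdk1.8.0_45   # hypothetical install path
    export PATH=$JAVA_HOME/bin:$PATH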

Re: building spark 1.6.0 fails

2016-01-28 Thread Ted Yu
I tried the following command: build/mvn clean -Phive -Phive-thriftserver -Pyarn -Phadoop-2.4 -Dhadoop.version=2.7.0 package -DskipTests I didn't encounter the error you mentioned. bq. Using zinc server for incremental compilation Was it possible that zinc was running before you started the bui

building spark 1.6.0 fails

2016-01-28 Thread Carlile, Ken
I am attempting to build Spark 1.6.0 from source on EL 6.3, using Oracle jdk 1.8.0.45, Python 2.7.6, and Scala 2.10.3. When I try to issue build/mvn -DskipTests clean package, I get the following: [INFO] Using zinc server for incremental compilation [info] Compiling 3 Java sources to /misc/lo

RE: building spark 1.6 throws error Rscript: command not found

2016-01-19 Thread Sun, Rui
Hi Mich, Building Spark with the SparkR profile enabled requires installation of R on your build machine. From: Ted Yu [mailto:yuzhih...@gmail.com] Sent: Tuesday, January 19, 2016 5:27 AM To: Mich Talebzadeh Cc: user @spark Subject: Re: building spark 1.6 throws error Rscript: command not found

Re: building spark 1.6 throws error Rscript: command not found

2016-01-18 Thread Ted Yu
Please see: http://www.jason-french.com/blog/2013/03/11/installing-r-in-linux/ On Mon, Jan 18, 2016 at 1:22 PM, Mich Talebzadeh wrote: > ./make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.6 > -Phive -Phive-thriftserver -Pyarn > > > > > > INFO] --- exec-maven-plugin:1.4.0:exec (

building spark 1.6 throws error Rscript: command not found

2016-01-18 Thread Mich Talebzadeh
./make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn INFO] --- exec-maven-plugin:1.4.0:exec (sparkr-pkg) @ spark-core_2.10 --- ../R/install-dev.sh: line 40: Rscript: command not found [INFO] ---

Re: problem building spark on centos

2016-01-06 Thread Jade Liu
Yes I’m using maven 3.3.9. From: Todd Nist <tsind...@gmail.com> Date: Wednesday, January 6, 2016 at 12:33 PM To: Jade Liu <jade@nor1.com> Cc: "user@spark.apache.org" <user@spark.apache.org> Subject: Re: pro

Re: problem building spark on centos

2016-01-06 Thread Marcelo Vanzin
If you're trying to compile against Scala 2.11, you're missing "-Dscala-2.11" in that command. On Wed, Jan 6, 2016 at 12:27 PM, Jade Liu wrote: > Hi, Todd: > > Thanks for your suggestion. Yes I did run the ./dev/change-scala-version.sh > 2.11 script when using scala version 2.11. > > I just tried
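
Putting Marcelo's point together with the command quoted in the original post, a sketch of a full Scala 2.11 build (the --name value is an example):

    ./dev/change-scala-version.sh 2.11
    ./make-distribution.sh --name custom-spark --tgz -Phadoop-2.6 -Pyarn \
      -Dhadoop.version=2.6.0 -Dscala-2.11 -Phive -Phive-thriftserver -DskipTests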

Re: problem building spark on centos

2016-01-06 Thread Todd Nist
y/MAVEN/PluginExecutionException > [ERROR] > [ERROR] After correcting the problems, you can resume the build with the > command > [ERROR] mvn -rf :spark-launcher_2.10 > > Do you think it’s java problem? I’m using oracle JDK 1.7. Should I update > it to 1.8 instead? I just

Re: problem building spark on centos

2016-01-06 Thread Jade Liu
l.com>, "user@spark.apache.org" <user@spark.apache.org> Subject: Re: problem building spark on centos Hi Jade, I think you are missing the "--name" option. The make-distribution command should look like this: ./make-distribution.sh --name h

Re: problem building spark on centos

2016-01-06 Thread Todd Nist
> sbt_inc.SbtIncrementalCompiler.zincCompile(SbtIncrementalCompiler.java:136) >> at sbt_inc.SbtIncrementalCompiler.compile(SbtIncrementalCompiler.java:86) >> at >> scala_maven.ScalaCompilerSupport.incrementalCompile(ScalaCompilerSupport.java:303) >> at scala_maven.ScalaCompilerSupport.compile(ScalaCompilerSupport.jav

Re: problem building spark on centos

2016-01-06 Thread Todd Nist
cuteMojo(DefaultBuildPluginManager.java:134) > > Not sure what’s causing it. Does anyone have any idea? > > Thanks! > > Jade > From: Ted Yu > Date: Wednesday, January 6, 2016 at 10:40 AM > To: Jade Liu , user > > Subject: Re: problem building spark on centos &g

Re: problem building spark on centos

2016-01-06 Thread Jade Liu
nks! Jade From: Ted Yu <yuzhih...@gmail.com> Date: Wednesday, January 6, 2016 at 10:40 AM To: Jade Liu <jade@nor1.com>, user <user@spark.apache.org> Subject: Re: problem building spark on centos w.r.t. the second error, have you read this? http://www.cap

Re: problem building spark on centos

2016-01-06 Thread Ted Yu
16 at 4:57 PM > To: Jade Liu > Cc: "user@spark.apache.org" > Subject: Re: problem building spark on centos > > Which version of maven are you using ? > > It should be 3.3.3+ > > On Tue, Jan 5, 2016 at 4:54 PM, Jade Liu wrote: > >> Hi, All: >&g

Re: problem building spark on centos

2016-01-05 Thread Ted Yu
Which version of maven are you using ? It should be 3.3.3+ On Tue, Jan 5, 2016 at 4:54 PM, Jade Liu wrote: > Hi, All: > > I’m trying to build spark 1.5.2 from source using maven with the following > command: > > ./make-distribution.sh --tgz -Phadoop-2.6 -Pyarn -Dhadoop.version=2.6.0 > -Dscala-2
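
The corresponding check:

    mvn -version    # the Spark 1.5.x build requires Maven 3.3.3 or newer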

problem building spark on centos

2016-01-05 Thread Jade Liu
Hi, All: I'm trying to build spark 1.5.2 from source using maven with the following command: ./make-distribution.sh --tgz -Phadoop-2.6 -Pyarn -Dhadoop.version=2.6.0 -Dscala-2.11 -Phive -Phive-thriftserver -DskipTests I got the following error: + VERSION='[ERROR] [Help 2] http://cwiki.apache.or

Getting error when trying to start master node after building spark 1.3

2015-12-07 Thread Mich Talebzadeh
m...@peridale.co.uk> Cc: user <user@spark.apache.org> Subject: Re: Getting error when trying to start master node after building spark 1.3 Did you read http://spark.apache.org/docs/latest/building-spark.html#building-with-hive-and-jdbc-support Thanks Best R

Re: Getting error when trying to start master node after building spark 1.3

2015-12-07 Thread Akhil Das
Did you read http://spark.apache.org/docs/latest/building-spark.html#building-with-hive-and-jdbc-support Thanks Best Regards On Fri, Dec 4, 2015 at 4:12 PM, Mich Talebzadeh wrote: > Hi, > > > > > > I am trying to make Hive work with Spark. > > > > I have been told that I need to use Spark 1.3

Getting error when trying to start master node after building spark 1.3

2015-12-04 Thread Mich Talebzadeh
Hi, I am trying to make Hive work with Spark. I have been told that I need to use Spark 1.3 and build it from source code WITHOUT HIVE libraries. I have built it as follows: ./make-distribution.sh --name "hadoop2-without-hive" --tgz "-Pyarn,hadoop-provided,hadoop-2.4,parquet-prov

Re: Building spark 1.3 from source code to work with Hive 1.2.1

2015-12-03 Thread zhangjp
From: "Mich Talebzadeh";; Date: Thu, Dec 3, 2015 06:28 PM To: "user"; "user"; Subject: Building spark 1.3 from source code to work with Hive 1.2.1 Hi, I have seen mails that state that the user has managed to build spark 1.3 to work with

Building spark 1.3 from source code to work with Hive 1.2.1

2015-12-03 Thread Mich Talebzadeh
Hi, I have seen mails that state that the user has managed to build spark 1.3 to work with Hive. I tried Spark 1.5.2 but no luck I downloaded spark source 1.3 source code spark-1.3.0.tar and built it as follows ./make-distribution.sh --name "hadoop2-without-hive" --tgz "-Pyarn,hadoop-pr

building spark from 1.3 release without Hive

2015-11-26 Thread Mich Talebzadeh
Hi, I am not having much luck making Hive run on Spark! I tried to build spark 1.5.2 without Hive jars. It worked but I could not run Hive SQL on Spark. I saw in this link: http://stackoverflow.com/questions/33233431/hive-on-spark-java-lang-noclassdeffounderror-org-apache-hive-spark-cli

RE: Building Spark without hive libraries

2015-11-25 Thread Mich Talebzadeh
ipient to ensure that this email is virus free, therefore neither Peridale Ltd, its subsidiaries nor their employees accept any responsibility. From: Mich Talebzadeh [mailto:m...@peridale.co.uk] Sent: 25 November 2015 23:08 To: 'Ted Yu' Cc: 'user' Subject: RE: Building

RE: Building Spark without hive libraries

2015-11-25 Thread Mich Talebzadeh
s expressly so stated. It is the responsibility of the recipient to ensure that this email is virus free, therefore neither Peridale Ltd, its subsidiaries nor their employees accept any responsibility. From: Ted Yu [mailto:yuzhih...@gmail.com] Sent: 25 November 2015 22:54 To: Mich Talebzadeh Cc: u

Re: Building Spark without hive libraries

2015-11-25 Thread Ted Yu
sponsibility. > > > > *From:* Ted Yu [mailto:yuzhih...@gmail.com] > *Sent:* 25 November 2015 22:35 > > *To:* Mich Talebzadeh > *Cc:* user > *Subject:* Re: Building Spark without hive libraries > > > > bq. ^[[0m[^[[31merror^[[0m] ^[[0mRequired file not found: >

RE: Building Spark without hive libraries

2015-11-25 Thread Mich Talebzadeh
user@spark.apache.org> > Subject: Re: Building Spark without hive libraries Take a look at install_zinc() in build/mvn Cheers On Wed, Nov 25, 2015 at 1:30 PM, Mich Talebzadeh mailto:m...@peridale.co.uk> > wrote: Hi, I am trying to build sparc from the source and not usi

Re: Building Spark without hive libraries

2015-11-25 Thread Ted Yu
ponsibility of the recipient to ensure that this email is virus > free, therefore neither Peridale Ltd, its subsidiaries nor their employees > accept any responsibility. > > > > *From:* Ted Yu [mailto:yuzhih...@gmail.com] > *Sent:* 25 November 2015 21:52 > *To:* Mich Taleb

Re: Building Spark without hive libraries

2015-11-25 Thread Ted Yu
> > [INFO] Spark Project Bagel > > [INFO] Spark Project GraphX > > [INFO] Spark Project Streaming > > [INFO] Spark Project Catalyst > > [INFO] Spark Project SQL > > [INFO] Spark Project ML Library > > [INFO] Spark Project Tools > > [INFO] Spark Project Hive > > [INFO] Spark Pr

Building Spark without hive libraries

2015-11-25 Thread Mich Talebzadeh
lume Sink [INFO] Spark Project External Flume [INFO] Spark Project External Flume Assembly [INFO] Spark Project External MQTT [INFO] Spark Project External MQTT Assembly [INFO] Spark Project External ZeroMQ [INFO] Spark Project External

RE: Error building Spark on Windows with sbt

2015-10-30 Thread Judy Nash
I have not had any success building using sbt/sbt on Windows. However, I have been able to build the binary by using the maven command directly. From: Richard Eggert [mailto:richard.egg...@gmail.com] Sent: Sunday, October 25, 2015 12:51 PM To: Ted Yu Cc: User Subject: Re: Error building Spark on Windows with

Re: Building spark-1.5.x and MQTT

2015-10-28 Thread Ted Yu
Using your command, I did get: [ERROR] Failed to execute goal org.apache.maven.plugins:maven-assembly-plugin:2.5.5:single (test-jar-with-dependencies) on project spark-streaming-mqtt_2.10: Failed to create assembly: Error creating assembly archive test-jar-with-dependencies: Problem creating jar:

Re: Building spark-1.5.x and MQTT

2015-10-28 Thread Steve Loughran
> On 28 Oct 2015, at 13:19, Bob Corsaro wrote: > > Has anyone successful built this? I'm trying to determine if there is a > defect in the source package or something strange about my environment. I get > a FileNotFound exception on MQTTUtils.class during the build of the MQTT > module. The o

Re: Building spark-1.5.x and MQTT

2015-10-28 Thread Bob Corsaro
Built from http://mirror.olnevhost.net/pub/apache/spark/spark-1.5.1/spark-1.5.1.tgz using the following command: build/mvn -DskipTests=true -Dhadoop.version=2.4.1 -P"hadoop-2.4,kinesis-asl,netlib-lgpl" package install build/mvn is from the packaged source. Tried on a couple of ubuntu boxen and a

Re: Building spark-1.5.x and MQTT

2015-10-28 Thread Ted Yu
MQTTUtils.class is generated from external/mqtt/src/main/scala/org/apache/spark/streaming/mqtt/MQTTUtils.scala What command did you use to build ? Which release / branch were you building ? Thanks On Wed, Oct 28, 2015 at 6:19 AM, Bob Corsaro wrote: > Has anyone successful built this? I'm tryin

Building spark-1.5.x and MQTT

2015-10-28 Thread Bob Corsaro
Has anyone successful built this? I'm trying to determine if there is a defect in the source package or something strange about my environment. I get a FileNotFound exception on MQTTUtils.class during the build of the MQTT module. The only work around I've found is to remove the MQTT modules from t

Re: Error building Spark on Windows with sbt

2015-10-25 Thread Richard Eggert
Yes, I know, but it would be nice to be able to test things myself before I push commits. On Sun, Oct 25, 2015 at 3:50 PM, Ted Yu wrote: > If you have a pull request, Jenkins can test your change for you. > > FYI > > On Oct 25, 2015, at 12:43 PM, Richard Eggert > wrote: > > Also, if I run the M

Re: Error building Spark on Windows with sbt

2015-10-25 Thread Ted Yu
If you have a pull request, Jenkins can test your change for you. FYI > On Oct 25, 2015, at 12:43 PM, Richard Eggert wrote: > > Also, if I run the Maven build on Windows or Linux without setting > -DskipTests=true, it hangs indefinitely when it gets to > org.apache.spark.JavaAPISuite. > >

Re: Error building Spark on Windows with sbt

2015-10-25 Thread Richard Eggert
Also, if I run the Maven build on Windows or Linux without setting -DskipTests=true, it hangs indefinitely when it gets to org.apache.spark.JavaAPISuite. It's hard to test patches when the build doesn't work. :-/ On Sun, Oct 25, 2015 at 3:41 PM, Richard Eggert wrote: > By "it works", I mean, "I

Re: Error building Spark on Windows with sbt

2015-10-25 Thread Richard Eggert
By "it works", I mean, "It gets past that particular error". It still fails several minutes later with a different error: java.lang.IllegalStateException: impossible to get artifacts when data has not been loaded. IvyNode = org.scala-lang#scala-library;2.10.3 On Sun, Oct 25, 2015 at 3:38 PM, Ric

Error building Spark on Windows with sbt

2015-10-25 Thread Richard Eggert
When I try to start up sbt for the Spark build, or if I try to import it in IntelliJ IDEA as an sbt project, it fails with a "No such file or directory" error when it attempts to "git clone" sbt-pom-reader into .sbt/0.13/staging/some-sha1-hash. If I manually create the expected directory before r
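
A hedged workaround in the spirit of the report: clone sbt-pom-reader into the staging path yourself before starting sbt (the hash directory is whatever the error message names; the upstream URL is an assumption):

    mkdir -p ~/.sbt/0.13/staging/<some-sha1-hash>
    git clone https://github.com/sbt/sbt-pom-reader.git \
      ~/.sbt/0.13/staging/<some-sha1-hash>/sbt-pom-reader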
