Re: Building standalone spark application via sbt

2016-07-20 Thread Sachin Mittal
I got the error at run time. It was for the mongo-spark-connector class files. My build.sbt is like this:

  name := "Test Advice Project"
  version := "1.0"
  scalaVersion := "2.10.6"
  libraryDependencies ++= Seq(
    "org.mongodb.spark" %% "mongo-spark-connector" % "1.0.0",
    "org.apache.spark" %%
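Since the NoClassDefFoundError is for the connector classes at run time, one quick check is to hand the connector to spark-submit directly so it lands on the executor classpath. A sketch only; the main class and jar name below are placeholders, not from the original message:

  spark-submit \
    --packages org.mongodb.spark:mongo-spark-connector_2.10:1.0.0 \
    --class com.example.Main \
    target/scala-2.10/test-advice-project_2.10-1.0.jar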

Re: Building standalone spark application via sbt

2016-07-20 Thread Marco Mistroni
That will work, but ideally you should not include any of the spark-related jars, as they are provided to you by the spark environment whenever you launch your app via spark-submit (this will prevent unexpected errors, e.g. when you kick off your app using a different version of spark where some of
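In build.sbt terms that usually means marking the Spark artifacts as "provided". A minimal sketch, assuming Spark 1.6.2; use whatever version your cluster actually runs:

  libraryDependencies ++= Seq(
    // compiled against, but excluded from the packaged jar:
    // spark-submit supplies these on the classpath at run time
    "org.apache.spark" %% "spark-core" % "1.6.2" % "provided",
    // application-level dependencies like the connector still ship with the app
    "org.mongodb.spark" %% "mongo-spark-connector" % "1.0.0"
  )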

Re: Building standalone spark application via sbt

2016-07-20 Thread Sachin Mittal
The NoClassDefFound error was for spark classes, like say SparkContext. When running a standalone spark application I was not passing external jars using the --jars option. However, I have fixed this by making a fat jar using the sbt assembly plugin. Now all the dependencies are included in that jar and I use
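For anyone following along, a minimal sbt-assembly setup looks something like this (the 0.14.3 plugin version and the main class are assumptions):

  // project/plugins.sbt
  addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

Running sbt assembly then writes the fat jar under target/scala-2.10/, and it can be submitted without --jars:

  spark-submit --class com.example.Main \
    target/scala-2.10/test-advice-project-assembly-1.0.jar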

Re: Building standalone spark application via sbt

2016-07-20 Thread Marco Mistroni
Hello Sachin, please paste the NoClassDefFound exception so we can see what's failing. Also, please advise how you are running your Spark app. For an extremely simple case, let's assume you have your MyFirstSparkApp packaged in your myFirstSparkApp.jar. Then all you need to do would be to kick off
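The kick-off being something along these lines (the master URL and package name are placeholders, not from the original message):

  spark-submit \
    --class com.example.MyFirstSparkApp \
    --master local[*] \
    myFirstSparkApp.jar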

Re: Building standalone spark application via sbt

2016-07-20 Thread Mich Talebzadeh
You need an uber jar file. Have you actually followed the dependencies and project sub-directory build? Check this: http://stackoverflow.com/questions/28459333/how-to-build-an-uber-jar-fat-jar-using-sbt-within-intellij-idea Of the three answers there, see the top one. I started reading the official SBT
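One detail worth knowing when building the uber jar: dependencies often ship overlapping META-INF entries, so sbt-assembly typically needs a merge strategy. A common pattern, shown as an illustration and not necessarily what the linked answer uses:

  assemblyMergeStrategy in assembly := {
    // duplicate manifests/signatures from different jars: drop them
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    // for everything else, keep the first copy found
    case x => MergeStrategy.first
  }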

Re: Building standalone spark application via sbt

2016-07-20 Thread Sachin Mittal
Hi, I am following the example under https://spark.apache.org/docs/latest/quick-start.html for a standalone Scala application. I added all my dependencies via build.sbt (one dependency is under the lib folder). When I run sbt package I see the jar created under target/scala-2.10/. So compile seems to
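For context, the quick-start layout that sbt package expects is roughly this (SimpleApp is the guide's own example name):

  ./build.sbt
  ./lib/                            <- unmanaged jars, picked up automatically
  ./src/main/scala/SimpleApp.scala

Note that sbt package bundles only your own classes into target/scala-2.10/<name>_2.10-<version>.jar; dependencies are not included, which is why they must reach the cluster via --jars, --packages, or a fat jar.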

Re: Building standalone spark application via sbt

2016-07-19 Thread Andrew Ehrlich
Yes, spark-core will depend on Hadoop and several other jars. Here’s the list of dependencies:
https://github.com/apache/spark/blob/master/core/pom.xml#L35
Whether you need spark-sql depends on whether you will use the DataFrame
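Concretely, for the quick-start case that means something like the following (the 1.6.2 version is an assumption; add spark-sql only if you use those APIs):

  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.6.2" % "provided",
    // only needed when using DataFrames / Spark SQL
    "org.apache.spark" %% "spark-sql"  % "1.6.2" % "provided"
  )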

Building standalone spark application via sbt

2016-07-19 Thread Sachin Mittal
Hi, Can someone please guide me on which jars I need to place in the lib folder of my project to build a standalone Scala application via sbt? Note I need to provide static dependencies and I cannot download the jars using libraryDependencies. So I need to provide all the jars upfront. So far I
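For anyone searching later: sbt treats lib/ as the unmanaged classpath by default (unmanagedBase is baseDirectory / "lib"), so every .jar dropped there is on the compile classpath without any build.sbt entry. On 2016-era sbt 0.13 you can list what it actually picked up with:

  sbt "show compile:unmanagedJars"

Bear in mind this does not resolve transitive dependencies for you; with static jars each transitive dependency, e.g. Hadoop for spark-core, has to be placed in lib/ by hand.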