Multiple Maven profiles may be the ideal way. You can also do this with:

1. The default build command "mvn compile", for local builds (use this to build with Eclipse's "Run As -> Maven build" option when you right-click on the pom.xml file).
2. Add Maven build options to the same build command as above, for the uber jar build: "mvn compile assembly:single" (use this to build with Eclipse's "Run As -> Maven build..." option when you right-click on the pom.xml file). Note the extra dots (...) after "Maven build" in this option.

Regards,
Prajod

From: bit1...@163.com [mailto:bit1...@163.com]
Sent: 19 June 2015 13:01
To: Prajod S Vettiyattil (WT01 - BAS); Akhil Das
Cc: user
Subject: Re: RE: Build spark application into uber jar

Thanks. I guess what you mean by "maven build target" is a Maven profile. I added two profiles, one LocalRun and the other ClusterRun, for the Spark-related artifact scope. That way I don't have to change the pom file, only select a profile.

    <profile>
        <id>LocalRun</id>
        <properties>
            <spark.scope>compile</spark.scope>
        </properties>
    </profile>
    <profile>
        <id>ClusterRun</id>
        <properties>
            <spark.scope>provided</spark.scope>
        </properties>
    </profile>

________________________________
bit1...@163.com

From: prajod.vettiyat...@wipro.com
Date: 2015-06-19 15:22
To: bit1...@163.com; ak...@sigmoidanalytics.com
CC: user@spark.apache.org
Subject: RE: Re: Build spark application into uber jar

Hi,

When running inside the Eclipse IDE, I use another Maven target to build: the default Maven target. For building the uber jar, I use the assembly jar target. So use two Maven build targets in the same pom file to solve this issue. In Maven you can have multiple build targets, and each target can have its own command-line options.

prajod

From: bit1...@163.com [mailto:bit1...@163.com]
Sent: 19 June 2015 12:36
To: Akhil Das; Prajod S Vettiyattil (WT01 - BAS)
Cc: user
Subject: Re: Re: Build spark application into uber jar

Thank you Akhil. Hmm..
but I am using Maven as the build tool.

________________________________
bit1...@163.com

From: Akhil Das
Date: 2015-06-19 15:31
To: Prajod S Vettiyattil (WT01 - BAS)
CC: user@spark.apache.org
Subject: Re: Build spark application into uber jar

This is how I used to build an assembly jar with sbt. Your build.sbt file would look like this:

    import AssemblyKeys._

    assemblySettings

    name := "FirstScala"

    version := "1.0"

    scalaVersion := "2.10.4"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"

    libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.3.1"

    libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.3.1"

Also create a file named plugins.sbt inside the project directory and add this line to it:

    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

And then you will be able to do:

    sbt assembly

Thanks
Best Regards

On Fri, Jun 19, 2015 at 12:09 PM, <prajod.vettiyat...@wipro.com> wrote:

> but when I run the application locally, it complains that spark related stuff is missing

I use the uber jar option. What do you mean by "locally"? In the Spark scala shell? In the

From: bit1...@163.com [mailto:bit1...@163.com]
Sent: 19 June 2015 08:11
To: user
Subject: Build spark application into uber jar

Hi sparks,

I have a Spark Streaming application that is a Maven project. I would like to build it into an uber jar and run it in the cluster. I have found two options to build the uber jar, and each of them has its shortcomings, so I would like to ask how you do it. Thanks.

1.
Use the maven-shade-plugin, and mark the Spark-related dependencies as provided in the pom.xml, like:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>

With this it can build the uber jar, but when I run the application locally, it complains that the Spark-related classes are missing. This is not surprising, because dependencies marked as provided are not included at runtime.

2. Instead of marking the Spark dependencies as provided, I configure the maven-shade-plugin to exclude them as follows, but many dependencies are still included:

    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <artifactSet>
                    <excludes>
                        <exclude>junit:junit</exclude>
                        <exclude>log4j:log4j:jar:</exclude>
                        <exclude>org.scala-lang:scala-library:jar:</exclude>
                        <exclude>org.apache.spark:spark-core_2.10</exclude>
                        <exclude>org.apache.spark:spark-sql_2.10</exclude>
                        <exclude>org.apache.spark:spark-streaming_2.10</exclude>
                    </excludes>
                </artifactSet>
            </configuration>
        </execution>
    </executions>

Has someone built an uber jar for a Spark application? I would like to see how you do it, thanks!

________________________________
bit1...@163.com

The information contained in this electronic message and any attachments to this message are intended for the exclusive use of the addressee(s) and may contain proprietary, confidential or privileged information. If you are not the intended recipient, you should not disseminate, distribute or copy this e-mail. Please notify the sender immediately and destroy all copies of this message and any attachments. WARNING: Computer viruses can be transmitted via email. The recipient should check this email and any attachments for the presence of viruses. The company accepts no liability for any damage caused by any virus transmitted by this email.
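[Editor's note] The `<executions>` snippet quoted in the thread is only a fragment; it would normally sit inside a full plugin declaration in the pom's `<build>` section. A minimal sketch of that wrapper follows, assuming the standard maven-shade-plugin coordinates (the plugin version shown is an assumption, not from the thread):

```xml
<!-- Sketch: where the quoted <executions> fragment fits in a pom.xml.
     The plugin version is an assumption; pick a version appropriate for your build. -->
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.3</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <artifactSet>
                            <excludes>
                                <!-- Excludes are matched by groupId:artifactId coordinates -->
                                <exclude>org.apache.spark:spark-core_2.10</exclude>
                            </excludes>
                        </artifactSet>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```

The shade plugin's artifactSet excludes also accept wildcards (for example `org.apache.spark:*`), which can be less error-prone than listing each artifact individually.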
www.wipro.com
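
[Editor's note] To tie the thread together: the LocalRun/ClusterRun profiles quoted above only set the spark.scope property; for the switch to take effect, each Spark dependency has to reference that property in its `<scope>`. A minimal sketch follows (the `${spark.version}` property is assumed to be defined elsewhere in the pom, as in the thread's earlier snippets):

```xml
<!-- Sketch: the dependency's scope resolves to "compile" under -PLocalRun
     and to "provided" under -PClusterRun, via the spark.scope property. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
    <scope>${spark.scope}</scope>
</dependency>
```

Select a profile on the command line, e.g. `mvn compile -PLocalRun` for local runs in Eclipse and `mvn clean package -PClusterRun` for the uber jar. You may also want to define a default value for spark.scope in the top-level `<properties>` (or mark one profile `<activeByDefault>true</activeByDefault>`) so that builds run without `-P` still resolve the scope.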