This is how I used to build an assembly jar with sbt:

Your build.sbt file would look like this:

import AssemblyKeys._

assemblySettings

name := "FirstScala"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.3.1"

libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.3.1"
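
If the assembly step fails with "deduplicate" errors on conflicting files (common once several dependencies ship the same META-INF entries), a merge strategy in build.sbt usually resolves it. This is not part of the recipe above, just a sketch using the older 0.x sbt-assembly syntax; adjust the patterns to the conflicts you actually see:

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    // drop conflicting META-INF entries, fall back to the default strategy for everything else
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case x => old(x)
  }
}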

Also create a file named plugins.sbt inside the project directory and add
this line inside it:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

And then you will be able to run sbt assembly.
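
If you do not want Spark itself bundled into the assembly jar (the cluster already provides it, which mirrors the Maven "provided" scope discussed in the quoted mail below), you can mark the Spark dependencies as provided; sbt-assembly then leaves them out of the jar. A minimal sketch, assuming the same versions as above:

// compile against Spark, but do not bundle it into the assembly jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.3.1" % "provided"

The jar you then submit with spark-submit picks up the Spark classes from the cluster at run time.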


Thanks
Best Regards

On Fri, Jun 19, 2015 at 12:09 PM, <prajod.vettiyat...@wipro.com> wrote:

>  > but when I run the application locally, it complains that Spark-related
> stuff is missing
>
>
>
> I use the uber jar option. What do you mean by “locally”? In the Spark
> Scala shell? In the
>
>
>
> *From:* bit1...@163.com [mailto:bit1...@163.com]
> *Sent:* 19 June 2015 08:11
> *To:* user
> *Subject:* Build spark application into uber jar
>
>
>
> Hi, sparks,
>
>
>
> I have a Spark Streaming application that is a Maven project, and I would like
> to build it into an uber jar and run it on the cluster.
>
> I have found two options to build the uber jar, but each of them has its
> shortcomings, so I would like to ask how you guys do it.
>
> Thanks.
>
>
>
> 1. Use the Maven Shade plugin, and I have marked the Spark-related dependencies as
> provided in the pom.xml, like:
>
>         <dependency>
>             <groupId>org.apache.spark</groupId>
>             <artifactId>spark-core_2.10</artifactId>
>             <version>${spark.version}</version>
>             <scope>provided</scope>
>         </dependency>
>
>
>
>   With this it looks like it can build the uber jar, but when I run the
> application locally, it complains that Spark-related stuff is missing, which
> is not surprising because the Spark-related things are marked as provided
> and therefore are not included at run time.
>
>
>
>   2. Instead of marking the Spark things as provided, I configure the
> Maven Shade plugin to exclude the Spark things as follows, but there are
> still many things left in the jar.
>
>
>
>   <executions>
>     <execution>
>       <phase>package</phase>
>       <goals>
>         <goal>shade</goal>
>       </goals>
>       <configuration>
>         <artifactSet>
>           <excludes>
>             <exclude>junit:junit</exclude>
>             <exclude>log4j:log4j:jar:</exclude>
>             <exclude>org.scala-lang:scala-library:jar:</exclude>
>             <exclude>org.apache.spark:spark-core_2.10</exclude>
>             <exclude>org.apache.spark:spark-sql_2.10</exclude>
>             <exclude>org.apache.spark:spark-streaming_2.10</exclude>
>           </excludes>
>         </artifactSet>
>       </configuration>
>     </execution>
>   </executions>
>
>
>
>
>
>   Has anyone built an uber jar for a Spark application? I would
> like to see how you do it, thanks!
>
>  ------------------------------
>
> bit1...@163.com
