Have you tried adding it as --packages at the beginning of your spark-submit?

--packages amplab:succinct:0.1.5
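
For example, applied to the spark-submit command from your message below, it would look something like this (assuming the amplab:succinct:0.1.5 coordinates are resolvable from the spark-packages repository on that machine):

spark-submit --packages amplab:succinct:0.1.5 --class "SimpleApp" --master local target/scala-2.10/simple-project_2.10-1.0.jar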

Also, I would usually mark the Spark dependencies as "provided" in build.sbt, since spark-submit already puts them on the classpath at runtime.
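
A rough sketch of what that build.sbt could look like (aligning everything on Spark 1.5.0 is my assumption, and the succinct line still needs to be resolvable, e.g. via the spark-packages resolver):

name := "Simple Project"
version := "1.0"
scalaVersion := "2.10.5"
// Spark itself is supplied by spark-submit at runtime, so mark it "provided"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.0" % "provided"
// Succinct is not shipped with Spark, so it stays a compile dependency
libraryDependencies += "amplab" % "succinct" % "0.1.5"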


On Fri, Mar 4, 2016 at 6:46 AM, Mich Talebzadeh <mich.talebza...@gmail.com>
wrote:

> Hi,
>
> I have some simple Scala code that I want to use in an sbt project.
>
> It is pretty simple but imports the following:
>
> // Import SuccinctRDD
> import edu.berkeley.cs.succinct._
>
>
> name := "Simple Project"
> version := "1.0"
> scalaVersion := "2.10.5"
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0"
> libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.0.0"
> libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.0"
> libraryDependencies += "amplab" % "succinct" % "0.1.5"
>
> The jar file is created OK, but at execution time it fails:
>
>
> *sbt package*
> [info] Set current project to Simple Project (in build
> file:/home/hduser/dba/bin/scala/)
> [success] Total time: 1 s, completed Mar 4, 2016 2:50:16 PM
>
> *spark-submit --class "SimpleApp" --master local
> target/scala-2.10/simple-project_2.10-1.0.jar*
> Some(/home/hduser/dba/bin/scala/target/scala-2.10/simple-project_2.10-1.0.jar)
> Lines with a: 60, Lines with b: 29
> Exception in thread "main" java.lang.NoClassDefFoundError:
> edu/berkeley/cs/succinct/package$
>
>
> I suspect I have not got the library dependencies correct?
>
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn
> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
>



-- 
Luciano Resende
http://people.apache.org/~lresende
http://twitter.com/lresende1975
http://lresende.blogspot.com/
