Hi,

Why did you mark spark-core as "provided" while the other dependencies are
not? How do you assemble the app? How do you submit it for execution?
What's the deployment environment?
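For reference, a common setup is a fat jar built with sbt-assembly and handed to spark-submit (a sketch, assuming sbt-assembly fits your build; the plugin version below is an assumption for an sbt 0.13-era project):

```scala
// project/plugins.sbt -- hypothetical; pick the sbt-assembly release
// that matches your sbt version.
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.0")
```

Running `sbt assembly` then produces a fat jar under `target/scala-2.10/`, which you pass to `spark-submit`. Marking spark-core as "provided" keeps it out of that jar, since spark-submit puts the Spark classes on the classpath at runtime.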


Jacek
On 15 Jun 2016 10:26 p.m., "S Sarkar" <ssarkarayushnet...@gmail.com> wrote:

Hello,

I built package for a spark application with the following sbt file:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.3"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.4.0" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.4.0",
  "org.apache.spark" %% "spark-sql"   % "1.4.0"
)

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

I am getting a TaskResultGetter error with a ClassNotFoundException for
scala.Some.
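(A ClassNotFoundException for a core class such as scala.Some while fetching task results often points at a Scala version mismatch between the application jar and the cluster. That is a guess from the symptom, not a confirmed diagnosis; Spark 1.4.0 prebuilt distributions are built against Scala 2.10.x. A minimal thing to check in build.sbt:

```scala
// Align with the Scala minor version of the Spark build on the cluster;
// 2.10.4 here is an assumption -- verify against the cluster's
// spark-shell startup banner.
scalaVersion := "2.10.4"
```

If the versions already match, the next suspect is the scala-library jar being absent from the executor classpath.)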

Can I please get some help on how to fix it?

Thanks,
S. Sarkar



--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/ERROR-TaskResultGetter-Exception-while-getting-task-result-java-io-IOException-java-lang-ClassNotFoue-tp27178.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
