Hi,

I have built Spark version 1.3 and tried to use it in my Spark Scala application. When I compiled and built the application with SBT, I got this error: bad symbolic reference. A signature in SparkContext.class refers to term conf in value org.apache.hadoop which is not available

It seems a Hadoop library is missing from the classpath, but shouldn't SBT pull it in automatically as a transitive dependency?

The same application builds fine against Spark version 1.2.

Here is my build.sbt

name := "wind25t-v013"
version := "0.1"
scalaVersion := "2.10.4"
unmanagedBase := baseDirectory.value / "lib"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.3.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.3.0"
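
One thing worth trying (a sketch, not a confirmed fix): since the error mentions term conf in org.apache.hadoop, the missing class is likely org.apache.hadoop.conf.Configuration. Declaring hadoop-client explicitly makes those classes available at compile time even if they are not reaching the classpath transitively. The version 2.4.0 below is an assumption; it should match the Hadoop version your Spark 1.3 build was compiled against.

```scala
// Hypothetical addition to build.sbt.
// The Hadoop version (2.4.0) is an assumption -- use the version
// your custom Spark 1.3 build was compiled against.
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0"
```

Also note that with unmanagedBase pointing at lib/, any hand-built Spark jars placed there bring no transitive dependencies with them, which could explain why Hadoop classes are absent.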

What should I do to fix it?

BR,
Patcharee



