Hi,

Please share your build.sbt. Here's mine for reference (using Spark 1.6.1 + Scala 2.10); please ignore the extra stuff I have added for assembly and logging.

// Set the project name
name := "SparkExamples"

// The := method used in name and version is one of two fundamental methods.
// The other method is <<=.
// All other initialization methods are implemented in terms of these.
version := "1.0"

scalaVersion := "2.10.5"

assemblyJarName in assembly := "sparkexamples.jar"


// Add a single dependency
libraryDependencies += "junit" % "junit" % "4.8" % "test"
libraryDependencies += "org.mockito" % "mockito-core" % "1.9.5"
libraryDependencies ++= Seq("org.slf4j" % "slf4j-api" % "1.7.5",
                            "org.slf4j" % "slf4j-simple" % "1.7.5",
                            "org.clapper" %% "grizzled-slf4j" % "1.0.2")
libraryDependencies += "org.powermock" % "powermock-mockito-release-full" %
"1.5.4" % "test"
libraryDependencies += "org.apache.spark" %% "spark-core"   % "1.6.1" %
"provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming"   % "1.6.1"
% "provided"
libraryDependencies += "org.apache.spark" %% "spark-mllib"   % "1.6.1"  %
"provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming-flume"   %
"1.3.0"  % "provided"

resolvers += "softprops-maven" at "
http://dl.bintray.com/content/softprops/maven";
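
Btw, judging from the error in your message below: SQLContext lives in the spark-sql module, which your build (and my example above, since I don't use Spark SQL) doesn't include. A minimal sketch of the missing line, assuming Spark 1.6.1 on Scala 2.10:

// pull in the Spark SQL module that provides org.apache.spark.sql.SQLContext
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.6.1" % "provided"

Keep the % "provided" qualifier if you submit the jar with spark-submit (the cluster already ships the Spark jars); drop it if you want sbt run to work locally, since provided dependencies are not on the runtime classpath.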

kr
marco


On Wed, Apr 27, 2016 at 9:27 AM, shengshanzhang <shengshanzh...@icloud.com> wrote:

> Hello:
> My code is as follows:
> ---------------------------------------------------------------------------
> import org.apache.spark.{SparkConf, SparkContext}
> import org.apache.spark.sql.SQLContext
>
> case class Record(key: Int, value: String)
>
> object RDDRelation {
>   def main(args: Array[String]) {
>     val sparkConf = new SparkConf().setAppName("RDDRelation")
>     val sc = new SparkContext(sparkConf)
>     //val sqlContext = new SQLContext(sc)
>   }
> }
> ---------------------------------------------------------------------------
> When I run "sbt package", I get the following error:
>
> $ sbt package
> [info] Set current project to Simple Project (in build
> file:/data/users/zhangshengshan/spark_work/)
> [info] Compiling 1 Scala source to
> /data/users/zhangshengshan/spark_work/target/scala-2.10/classes...
> [error]
> /data/users/zhangshengshan/spark_work/src/main/scala/SimpleApp.scala:2:
> object sql is not a member of package org.apache.spark
> [error] import org.apache.spark.sql.SQLContext
> [error]                         ^
> [error] one error found
> [error] (compile:compileIncremental) Compilation failed
> [error] Total time: 3 s, completed Apr 27, 2016 4:20:37 PM
>
>
>
> Who can tell me how I can fix this problem?