Thanks a lot. I added the spark-sql dependency in build.sbt, as the red line shows.

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.5"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
libraryDependencies += "org.apache.spark" %% "spark-sql"  % "1.6.1"
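
For the record, here is a rough sketch of how SQLContext could be used once spark-sql is on the classpath, building on the Record case class from the code below. The toDF / registerTempTable / sql calls are the usual Spark 1.6 SQLContext API, but the sample data and the table name "records" are only made up for illustration:

---------------------------------------------------------------------------
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

case class Record(key: Int, value: String)

object RDDRelation {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("RDDRelation")
    val sc = new SparkContext(sparkConf)

    val sqlContext = new SQLContext(sc)   // compiles once spark-sql is a dependency
    import sqlContext.implicits._         // brings rdd.toDF() into scope for case classes

    // Illustrative sample data; "records" is just a placeholder table name.
    val df = sc.parallelize(1 to 10).map(i => Record(i, s"val_$i")).toDF()
    df.registerTempTable("records")
    sqlContext.sql("SELECT key, value FROM records WHERE key < 5").collect().foreach(println)

    sc.stop()
  }
}
---------------------------------------------------------------------------
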
> On 27 Apr 2016, at 16:58, ramesh reddy <ramesh_sre...@yahoo.co.in> wrote:
> 
> Spark Sql jar has to be added as a dependency in build.sbt.
> 
> 
> On Wednesday, 27 April 2016 1:57 PM, shengshanzhang 
> <shengshanzh...@icloud.com> wrote:
> 
> 
> Hello:
>     My code is as follows:
> ---------------------------------------------------------------------------
> import org.apache.spark.{SparkConf, SparkContext}
> import org.apache.spark.sql.SQLContext
> 
> case class Record(key: Int, value: String)
> 
> object RDDRelation {
>   def main(args: Array[String]) {
>     val sparkConf = new SparkConf().setAppName("RDDRelation")
>     val sc = new SparkContext(sparkConf)
>     //val sqlContext = new SQLContext(sc)
>   }
> }
> ---------------------------------------------------------------------------
> When I run "sbt package", I get the following error:
> 
> $ sbt package
> [info] Set current project to Simple Project (in build 
> file:/data/users/zhangshengshan/spark_work/)
> [info] Compiling 1 Scala source to 
> /data/users/zhangshengshan/spark_work/target/scala-2.10/classes...
> [error] 
> /data/users/zhangshengshan/spark_work/src/main/scala/SimpleApp.scala:2: 
> object sql is not a member of package org.apache.spark
> [error] import org.apache.spark.sql.SQLContext
> [error]                        ^
> [error] one error found
> [error] (compile:compileIncremental) Compilation failed
> [error] Total time: 3 s, completed Apr 27, 2016 4:20:37 PM
> 
> 
> 
> Who can tell me how I can fix this problem?
