The Spark SQL jar has to be added as a dependency in build.sbt.
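For example, assuming Spark 1.6.x built for Scala 2.10 (matching the scala-2.10 target directory in your error output), a minimal build.sbt might look like the sketch below; the version "1.6.1" is only an example here, use whichever Spark version you actually run:

    name := "Simple Project"

    version := "1.0"

    scalaVersion := "2.10.6"

    libraryDependencies ++= Seq(
      // spark-core provides SparkConf and SparkContext
      "org.apache.spark" %% "spark-core" % "1.6.1",
      // spark-sql provides org.apache.spark.sql.SQLContext
      "org.apache.spark" %% "spark-sql"  % "1.6.1"
    )

After adding the spark-sql dependency, run "sbt package" again; the import of org.apache.spark.sql.SQLContext should then resolve and the commented-out SQLContext line can be uncommented.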

    On Wednesday, 27 April 2016 1:57 PM, shengshanzhang 
<shengshanzh...@icloud.com> wrote:
 

 Hello:
    my code is as follows:
---------------------------------------------------------------------------
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

case class Record(key: Int, value: String)
object RDDRelation {
  def main(args: Array[String]) {

    val sparkConf = new SparkConf().setAppName("RDDRelation")
    val sc = new SparkContext(sparkConf)
    //val sqlContext = new SQLContext(sc)
  }
}
---------------------------------------------------------------------------
When I run "sbt package", I get the following error:

$ sbt package
[info] Set current project to Simple Project (in build 
file:/data/users/zhangshengshan/spark_work/)
[info] Compiling 1 Scala source to 
/data/users/zhangshengshan/spark_work/target/scala-2.10/classes...
[error] /data/users/zhangshengshan/spark_work/src/main/scala/SimpleApp.scala:2: 
object sql is not a member of package org.apache.spark
[error] import org.apache.spark.sql.SQLContext
[error]                        ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 3 s, completed Apr 27, 2016 4:20:37 PM



Who can tell me how I can fix this problem?

