Re: correct Scala Imports for creating DFs from RDDs?

2015-07-14 Thread DW @ Gmail
You are mixing the 1.0.0 Spark SQL jar with Spark 1.4.0 jars in your build file.
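
The quickest fix is to pin every Spark artifact to the same version. A minimal sketch of what the dependency block could look like (spark-sql 1.4.0 pulls in spark-catalyst transitively, so it doesn't need to be listed separately):

  val sparkVersion = "1.4.0"

  libraryDependencies ++= Seq(
    // Both marked provided: spark-submit supplies them on the runtime classpath.
    "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
    "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
  )

Once spark-sql 1.4.0 is on the compile classpath, sqlContext.implicits._ and toDF resolve again. A rough sketch of the 1.4-style conversion (the TSTData fields below are made up for illustration; only the TSTData and Create names come from your code):

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.SQLContext

  // Hypothetical fields; substitute your real schema.
  case class TSTData(id: String, value: Double)

  object Create {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(new SparkConf().setAppName("Create"))
      val sqlContext = new SQLContext(sc)

      // toDF on an RDD of case classes comes from these implicits (Spark 1.3+);
      // createSchemaRDD is the old SchemaRDD API and no longer exists in 1.4.
      import sqlContext.implicits._

      val df = sc.parallelize(Seq(TSTData("a", 1.0), TSTData("b", 2.0))).toDF()
      df.show()
    }
  }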

Sent from my rotary phone. 


 On Jul 14, 2015, at 7:57 AM, ashwang168 ashw...@mit.edu wrote:
 
 Hello!
 
 I am currently using Spark 1.4.0, Scala 2.10.4, and sbt 0.13.8 to build a jar from a
 Scala file (attached above) and run it using spark-submit. I am also using Hive and
 Hadoop 2.6.0-cdh5.4.0, which store the files I'm trying to read in.
 
 Currently I am very confused about how the imports work and whether mine are correct.
 I am getting this error:
 
 
 [error] bad symbolic reference. A signature in SQLContext.class refers to term package
 [error] in package org.apache.spark.sql which is not available.
 [error] It may be completely missing from the current classpath, or the version on
 [error] the classpath might be incompatible with the version used when compiling SQLContext.class.
 [error] bad symbolic reference. A signature in SQLContext.class refers to type Logging
 [error] in value org.apache.spark.sql.package which is not available.
 [error] It may be completely missing from the current classpath, or the version on
 [error] the classpath might be incompatible with the version used when compiling SQLContext.class.
 [error] bad symbolic reference. A signature in SchemaRDD.class refers to term package
 [error] in package org.apache.spark.sql which is not available.
 [error] It may be completely missing from the current classpath, or the version on
 [error] the classpath might be incompatible with the version used when compiling SchemaRDD.class.
 [error] /root/awang/time/rddSpark/create/src/main/scala/create.scala:20: value implicits is not a member of org.apache.spark.sql.SQLContext
 ...
 
 [error] /root/awang/time/rddSpark/create/src/main/scala/create.scala:39: value toDF is not a member of org.apache.spark.rdd.RDD[TSTData]
 
 
 The imports in my code are:
 import org.apache.spark._
 import org.apache.spark.SparkContext 
 import org.apache.spark.SparkContext._ 
 import org.apache.spark.SparkConf
 import org.apache.spark.sql
 
 import org.apache.spark.sql._
 
 and in the object Create:
 
   val sqlContext = new org.apache.spark.sql.SQLContext(sc)
   import sqlContext._
   import sqlContext.implicits._
   import sqlContext.createSchemaRDD
 
 
 My libraryDependencies are:
  libraryDependencies ++= Seq(
    // spark will already be on classpath when using spark-submit.
    // marked as provided, so that it isn't included in assembly.
    "org.apache.spark" %% "spark-catalyst" % "1.4.0" % "provided",
 
    "org.apache.spark" %% "spark-sql" % "1.0.0")
 
 so why is package org.apache.spark.sql not available?
 
 Also, what are the correct imports to get this working?
 
 I'm using sbt assembly to try to compile these files, and would really
 appreciate any help.
 
 Thanks,
 Ashley Wang
 
 
 
 --
 View this message in context: 
 http://apache-spark-user-list.1001560.n3.nabble.com/correct-Scala-Imports-for-creating-DFs-from-RDDs-tp23829.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.
 

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



correct Scala Imports for creating DFs from RDDs?

2015-07-14 Thread ashwang168
Hello!

I am currently using Spark 1.4.0, Scala 2.10.4, and sbt 0.13.8 to build a jar from a
Scala file (attached above) and run it using spark-submit. I am also using Hive and
Hadoop 2.6.0-cdh5.4.0, which store the files I'm trying to read in.

Currently I am very confused about how the imports work and whether mine are correct.
I am getting this error:


[error] bad symbolic reference. A signature in SQLContext.class refers to term package
[error] in package org.apache.spark.sql which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling SQLContext.class.
[error] bad symbolic reference. A signature in SQLContext.class refers to type Logging
[error] in value org.apache.spark.sql.package which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling SQLContext.class.
[error] bad symbolic reference. A signature in SchemaRDD.class refers to term package
[error] in package org.apache.spark.sql which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling SchemaRDD.class.
[error] /root/awang/time/rddSpark/create/src/main/scala/create.scala:20: value implicits is not a member of org.apache.spark.sql.SQLContext
...

[error] /root/awang/time/rddSpark/create/src/main/scala/create.scala:39: value toDF is not a member of org.apache.spark.rdd.RDD[TSTData]


The imports in my code are:
import org.apache.spark._
import org.apache.spark.SparkContext 
import org.apache.spark.SparkContext._ 
import org.apache.spark.SparkConf
import org.apache.spark.sql
 
import org.apache.spark.sql._

and in the object Create:

  val sqlContext = new org.apache.spark.sql.SQLContext(sc)
  import sqlContext._
  import sqlContext.implicits._
  import sqlContext.createSchemaRDD


My libraryDependencies are:
libraryDependencies ++= Seq(
  // spark will already be on classpath when using spark-submit.
  // marked as provided, so that it isn't included in assembly.
  "org.apache.spark" %% "spark-catalyst" % "1.4.0" % "provided",

  "org.apache.spark" %% "spark-sql" % "1.0.0")

so why is package org.apache.spark.sql not available?

Also, what are the correct imports to get this working?

I'm using sbt assembly to try to compile these files, and would really
appreciate any help.

Thanks,
Ashley Wang



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/correct-Scala-Imports-for-creating-DFs-from-RDDs-tp23829.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org