[ https://issues.apache.org/jira/browse/SPARK-6042?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14340061#comment-14340061 ]

Sean Owen commented on SPARK-6042:
----------------------------------

There is no Spark 1.3.0 release at this point. You should match the version you 
build your app against to the version you run against, and the Spark dependency 
should be marked "provided". 
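As a minimal sketch of that dependency setup, assuming an sbt build (the artifact list and the version number are illustrative; pin them to whatever your cluster actually runs):

```scala
// build.sbt -- hypothetical project; pin the Spark version to the one the
// cluster runs (1.2.x here) and mark the Spark artifacts "provided" so that
// spark-submit supplies them at runtime instead of your assembled jar.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"            % "1.2.1" % "provided",
  "org.apache.spark" %% "spark-hive"            % "1.2.1" % "provided",
  "org.apache.spark" %% "spark-streaming"       % "1.2.1" % "provided",
  // the Flume connector is not part of the cluster runtime, so bundle it
  "org.apache.spark" %% "spark-streaming-flume" % "1.2.1"
)
```

Running `spark-submit --version` on the cluster shows which Spark version it actually ships, so you can pin your build to it.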

Here it looks like you are building against master, since you're using a method 
that appears to have been added in the (unreleased) Spark 1.3 
(https://github.com/apache/spark/commit/119f45d61d7b48d376cca05e1b4f0c7fcf65bfa8),
 but I bet you are running on a cluster running Spark 1.2.

Please clarify how you are running this and what your cluster version is. 
SPARK-6018 is not related.

> spark-submit giving Exception in thread "main" java.lang.NoSuchMethodError: 
> org.apache.spark.sql.hive.HiveContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;
> -------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-6042
>                 URL: https://issues.apache.org/jira/browse/SPARK-6042
>             Project: Spark
>          Issue Type: Question
>          Components: SQL
>            Reporter: Tarek Abouzeid
>              Labels: hive
>
> I am trying to create a table in Hive using Spark. I tried the code in 
> spark-shell and it worked and created the table, but when I use spark-submit 
> it gives this error:
> Exception in thread "main" java.lang.NoSuchMethodError: 
> org.apache.spark.sql.hive.HiveContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;
> at this line: sqlContext.sql("CREATE TABLE IF NOT EXISTS Test123 (key INT, 
> value STRING)")
> The code I submit is:
> import org.apache.spark.SparkContext
> import org.apache.spark.SparkContext._
> import org.apache.spark.SparkConf
> import org.apache.spark._
> import org.apache.spark.streaming._
> import org.apache.spark.streaming.StreamingContext._
> import org.apache.spark.storage.StorageLevel
> import org.apache.spark.streaming.flume._
> import org.apache.spark.util.IntParam
> import org.apache.spark.sql._
> import org.apache.spark.sql.hive.HiveContext
> object WordCount {
>   def main(args: Array[String]) {
>     if (args.length < 2) {
>       System.err.println(
>         "Usage: WordCount <host> <port>")
>       System.exit(1)
>     }
>     val Array(host, port) = args
>     val batchInterval = Milliseconds(2000)
>     // Create the context and set the batch size
>     val sparkConf = new SparkConf().setAppName("WordCount")
>     val sc = new SparkContext(sparkConf)
>     val ssc = new StreamingContext(sc, batchInterval)
>     // Create a flume stream
>     val stream = FlumeUtils.createStream(ssc, host, port.toInt)
>     // Print out the count of events received from this server in each batch
>     stream.count().map(cnt => "Received !!!:::::" + cnt + " flume events." 
> ).print()
>     
>     // it holds the string stream (converted event body array into string)
>     val body = stream.map(e => new String(e.event.getBody.array))
>     
>     
>     val counts = body.flatMap(line =>
>         line.toLowerCase.replaceAll("[^a-zA-Z0-9\\s]", "").split("\\s+"))
>       .map(word => (word, 1))
>       .reduceByKey(_ + _)
>
>     val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
>     sqlContext.sql("CREATE TABLE IF NOT EXISTS tarek (key INT, value STRING)")
>  
>     ssc.start()
>     ssc.awaitTermination()
>   }
> }
> I tried to submit this code on local[*] and on yarn-master, and both gave the 
> same error, at this specific line:
> "sqlContext.sql("CREATE TABLE IF NOT EXISTS tarek (key INT, value STRING)")"
> But I executed the exact same CREATE TABLE statement in spark-shell and it 
> succeeded. I found a somewhat similar issue here:
> https://issues.apache.org/jira/browse/SPARK-6018
> Can anyone help, please?



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
