Hi, 

I am using Spark SQL on 1.2.1-SNAPSHOT.

Here is a problem I encountered. Basically, I want to save a SchemaRDD as a
Hive table through HiveContext:

import org.apache.spark.sql._

val scm = StructType(Seq(
  StructField("name", StringType, nullable = false),
  StructField("cnt", IntegerType, nullable = false)
))

// ranked is an RDD[Row] whose rows each contain 2 fields
val schRdd = hiveContext.applySchema(ranked, scm)
schRdd.registerTempTable("schRdd")
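
For reference, here is a minimal sketch of how ranked might look (the sample
data is hypothetical; my real RDD comes from an earlier stage of the job):

import org.apache.spark.rdd.RDD

// Hypothetical sample; each Row matches the schema above (name: String, cnt: Int).
val ranked: RDD[Row] = sc.parallelize(Seq(("alice", 3), ("bob", 1)))
  .map { case (name, cnt) => Row(name, cnt) }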

hiveContext sql "select count(name) from schRdd limit 20" // => ok

hiveContext sql "create table t as select * from schRdd" // => table not
found


A query like "select" works well and gives the correct answer, but when I
try to save the temple table into Hive Context by createTableAsSelect, it
does not work.

Caused by: org.apache.hadoop.hive.ql.parse.SemanticException: Line 1:32
Table not found 'schRdd'
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1243)

I thought that was caused by registerTempTable, so I replaced it with
saveAsTable. That does not work either.
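
Concretely, the failing call was along these lines:

schRdd.saveAsTable("schRdd") // => AssertionError below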

*Exception in thread "main" java.lang.AssertionError: assertion failed: No
plan for CreateTableAsSelect Some(sephcn), schRdd, false, None
 LogicalRDD [name#6,cnt#7], MappedRDD[3] at map at Creation.scala:70

        at scala.Predef$.assert(Predef.scala:179)
        at
org.apache.spark.sql.catalyst.planning.QueryPlanner.apply(QueryPlanner.scala:59)*

I also checked the source code of QueryPlanner.apply:

  def apply(plan: LogicalPlan): Iterator[PhysicalPlan] = {
    // Obviously a lot to do here still...
    val iter = strategies.view.flatMap(_(plan)).toIterator
    assert(iter.hasNext, s"No plan for $plan")
    iter
  }

The comment suggests there is still work to be done here.

Any help is appreciated.

Thx.

Hao


