Actually, when you register a table that way, it is only visible within the 
SQLContext you registered it in. As of Spark 1.1, the method has been renamed 
to registerTempTable to better reflect that the table is temporary and local 
to that context. 

The Thrift server runs as a separate process, so it cannot see any of the 
temporary tables registered in your context. You would need to save the table 
into Hive; the Thrift server would then be able to see it.
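Roughly, the difference looks like this (a minimal sketch assuming Spark 1.1 built with Hive support; the file path, case class, and table names are illustrative, and this needs a running Spark shell to execute):

```scala
// Sketch: temp table vs. Hive-persisted table in Spark 1.1.
// Assumes a spark-shell with Hive support, so `sc` is already defined.
import org.apache.spark.sql.hive.HiveContext

case class Person(name: String, age: Int)

val hiveContext = new HiveContext(sc)
import hiveContext.createSchemaRDD  // implicit RDD -> SchemaRDD conversion

val people = sc.textFile("people.txt")   // hypothetical input file
  .map(_.split(","))
  .map(p => Person(p(0), p(1).trim.toInt))

// Session-local: only this process can query it, so the Thrift
// server will report "table not found" for it.
people.registerTempTable("people_tmp")

// Persisted through the Hive metastore: the Thrift server (and any
// JDBC client connected to it) can query this one.
people.saveAsTable("people")
```

The key point is that registerTempTable only touches in-memory state of the current context, while saveAsTable writes the data and registers the table in the shared Hive metastore.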

HTH!

> On Sep 10, 2014, at 13:08, alexandria1101 <alexandria.shea...@gmail.com> 
> wrote:
> 
> I used the hiveContext to register the tables and the tables are still not
> being found by the thrift server.  Do I have to pass the hiveContext to JDBC
> somehow?
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Table-not-found-using-jdbc-console-to-query-sparksql-hive-thriftserver-tp13840p13922.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 
