Thank you!! I can do this using saveAsTable with the schemaRDD, right?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Table-not-found-using-jdbc-console-to-query-sparksql-hive-thriftserver-tp13840p13979.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
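Yes — unlike registerTempTable, saveAsTable writes both the data and the table metadata into the Hive metastore, so a separately started Thrift server can find the table. A minimal sketch, assuming a Spark 1.x HiveContext; the file name and table name here are made up:

```scala
import org.apache.spark.sql.hive.HiveContext

// sc is an existing SparkContext; a HiveContext is required so that
// saveAsTable goes through the Hive metastore rather than a purely
// in-memory catalog.
val hiveContext = new HiveContext(sc)

// Any SchemaRDD will do; jsonFile is just one way to build one.
val schemaRDD = hiveContext.jsonFile("events.json")

// Persists the data and registers the table in the metastore, so a
// Thrift server pointed at the same metastore can query `events` over JDBC.
schemaRDD.saveAsTable("events")
```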
I used the hiveContext to register the tables and the tables are still not
being found by the thrift server. Do I have to pass the hiveContext to JDBC
somehow?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Table-not-found-using-jdbc-console-to-query-sparksql-hive-thriftserver-tp13840p13922.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
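You can't pass a HiveContext over JDBC, but two approaches are commonly suggested: persist the tables to the metastore with saveAsTable, or start the Thrift server inside the same JVM so it shares your context. A sketch of the latter, assuming a Spark 1.x build that ships HiveThriftServer2.startWithContext; the file and table names are hypothetical:

```scala
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

val hiveContext = new HiveContext(sc)  // sc: existing SparkContext

// Temp tables are registered only in this context's own catalog, which
// is why an independently launched Thrift server cannot see them.
hiveContext.jsonFile("events.json").registerTempTable("events")

// Starting the server with this context makes it share the catalog,
// so the JDBC console can now query `events`.
HiveThriftServer2.startWithContext(hiveContext)
```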
(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Table-not-found-using-jdbc-console-to-query-sparksql-hive-thriftserver-tp13840.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.