Querying Temp table using JDBC

2014-12-19 Thread shahab
Hi, according to the Spark documentation, data sharing between two different Spark contexts is not possible. So I just wonder if it is possible to first run a job that loads some data from a DB into SchemaRDDs, then cache it and next register it as a temp table (let's say Table_1); now I would
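The load/cache/register steps described above can be sketched roughly like this (assuming Spark 1.2-era APIs; the JSON file path and the `sc` SparkContext are placeholders, and a real JDBC source would need `JdbcRDD` plus an explicit schema instead of `jsonFile`):

```scala
import org.apache.spark.sql.hive.HiveContext

// sc: an already-created SparkContext
val hiveContext = new HiveContext(sc)

// Load some data into a SchemaRDD (placeholder source; any
// supported input that yields a SchemaRDD would do here)
val people = hiveContext.jsonFile("people.json")

// Register it as a temp table and cache it in memory
people.registerTempTable("Table_1")
hiveContext.cacheTable("Table_1")
```

Note that the temp table lives only inside this HiveContext, which is exactly why it is not visible to a separately started JDBC server running in another context.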

Re: Querying Temp table using JDBC

2014-12-19 Thread Michael Armbrust
This is experimental, but you can start the JDBC server from within your own program (https://github.com/apache/spark/blob/master/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/HiveThriftServer2.scala#L45) by passing it the HiveContext. On Fri, Dec 19, 2014 at 6:04
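A minimal sketch of what "start the JDBC server from within your own program" looks like, assuming the same Spark 1.x APIs as above (the app name and master URL are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

object EmbeddedThriftServer {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("embedded-thrift").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val hiveContext = new HiveContext(sc)

    // Register a cached temp table in this context
    val people = hiveContext.jsonFile("people.json") // placeholder source
    people.registerTempTable("Table_1")
    hiveContext.cacheTable("Table_1")

    // Start the Thrift/JDBC server *inside this application*, handing it
    // our HiveContext, so external JDBC clients (beeline, BI tools, ...)
    // query the same context and can see Table_1
    HiveThriftServer2.startWithContext(hiveContext)
  }
}
```

Because the server shares the application's HiveContext, this sidesteps the no-sharing-between-contexts limitation from the original question: there is only one context, queried both programmatically and over JDBC.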