Evert - Thanks for the instructions; they are generally useful in other
scenarios, but I don't think this is what Shahab needs, because
|saveAsTable| actually saves the contents of the SchemaRDD into Hive.
Shahab - As Michael has answered in another thread, you may try
Hi,
Sorry for repeating the same question, I just wanted to clarify the issue:
Is it possible to expose an RDD (or SchemaRDD) to external components
(outside Spark) so it can be queried over JDBC? (My goal is not to place
the RDD back in a database, but to use this cached RDD to serve JDBC queries.)
Yes you can, using a HiveContext, a metastore, and the thriftserver. The
metastore persists information about your SchemaRDD, and the HiveContext,
initialised with information on the metastore, can interact with the
metastore. The thriftserver then serves JDBC connections using the metastore.
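To make the wiring concrete, here is a minimal sketch (assuming Spark 1.x APIs; the table name and sample data are made up for illustration): register the RDD as a temporary table, cache it, and start an embedded thriftserver that shares the same HiveContext so JDBC clients can query the cached data.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

object ServeRddOverJdbc {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("rdd-over-jdbc"))
    val hiveContext = new HiveContext(sc)
    // Implicit conversion from an RDD of case classes / tuples to a SchemaRDD
    import hiveContext.createSchemaRDD

    // Hypothetical sample data standing in for your real RDD
    val rdd = sc.parallelize(Seq((1, "alice"), (2, "bob")))
    rdd.registerTempTable("my_cached_table")
    // Pin the table in memory so JDBC queries hit the cache, not recomputation
    hiveContext.cacheTable("my_cached_table")

    // Start a thriftserver inside this application; because it shares
    // the same HiveContext, the temp table is visible to JDBC clients
    HiveThriftServer2.startWithContext(hiveContext)
  }
}
```

A JDBC client (e.g. beeline) can then connect to the thriftserver's port and run `SELECT * FROM my_cached_table` against the in-memory data, for as long as the driver application stays alive.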
Using