Hi
I am using a shared SparkContext for all of my Spark jobs. Some of the jobs
use HiveContext, but there is no getOrCreate method on HiveContext that
would allow reuse of an existing HiveContext. Such a method exists only on
SQLContext (def getOrCreate(sparkContext: SparkContext): SQLContext).
Have you noticed the newSession method of HiveContext?

/**
 * Returns a new HiveContext as new session, which will have separated
 * SQLConf, UDF/UDAF, temporary tables and SessionState, but sharing the
 * same CacheManager, IsolatedClientLoader and Hive client (both of
 * execution and metadata).
 */
def newSession(): HiveContext
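Since HiveContext itself does not ship a getOrCreate, one workaround is to keep your own process-wide singleton. The sketch below is a hypothetical helper (the object name SharedHiveContext and its method are my own, not part of Spark), assuming the Spark 1.x API where HiveContext takes a SparkContext:

import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

// Hypothetical helper: a getOrCreate-style singleton for HiveContext,
// so jobs sharing the same SparkContext also see the same temporary
// tables (unlike newSession, which isolates them per session).
object SharedHiveContext {
  @transient private var instance: HiveContext = _

  def getOrCreate(sc: SparkContext): HiveContext = synchronized {
    // Lazily create the context on first use, then reuse it.
    if (instance == null) instance = new HiveContext(sc)
    instance
  }
}

// Usage in each job:
//   val hiveContext = SharedHiveContext.getOrCreate(sc)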
On 25 January 2016 at 21:09, Deenar Toraskar <
deenar.toras...@thinkreactive.co.uk> wrote:
> No, I hadn't. This is useful, but in some cases we do want to share the
> same temporary tables between jobs, so we really wanted a getOrCreate
> equivalent on HiveContext.
>
> Deenar
>
>
>
> On 25 January