From: je...@hotmail.com
To: yuzhih...@gmail.com
Subject: RE: How to get the singleton instance of SQLContext/HiveContext: val 
sqlContext = SQLContext.getOrCreate(rdd.sparkContext)‏
Date: Fri, 4 Mar 2016 14:09:20 -0800




The code below is from the sources; is this what you're asking about?
 
class HiveContext private[hive](
    sc: SparkContext,
    cacheManager: CacheManager,
    listener: SQLListener,
    @transient private val execHive: HiveClientImpl,
    @transient private val metaHive: HiveClient,
    isRootContext: Boolean)
  extends SQLContext(sc, cacheManager, listener, isRootContext) with Logging {

 
J
 
Date: Fri, 4 Mar 2016 13:53:38 -0800
Subject: Re: How to get the singleton instance of SQLContext/HiveContext: val 
sqlContext = SQLContext.getOrCreate(rdd.sparkContext)‏
From: yuzhih...@gmail.com
To: je...@hotmail.com
CC: user@spark.apache.org

bq. However the method does not seem inherited to HiveContext.
Can you clarify the above observation? HiveContext extends SQLContext.

On Fri, Mar 4, 2016 at 1:23 PM, jelez <je...@hotmail.com> wrote:
What is the best approach to using getOrCreate in a streaming job with a
HiveContext?

It seems for SQLContext the recommended approach is to use getOrCreate:

https://spark.apache.org/docs/latest/streaming-programming-guide.html#dataframe-and-sql-operations

    val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
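For context, the guide applies getOrCreate inside foreachRDD so each batch reuses one context. A minimal sketch along the guide's lines (the DStream name `linesDStream` and the `Record` case class are illustrative, assuming Spark 1.x APIs):

```scala
import org.apache.spark.sql.SQLContext

case class Record(word: String)

linesDStream.foreachRDD { rdd =>
  // Reuse (or lazily create) the single SQLContext tied to this SparkContext
  val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
  import sqlContext.implicits._

  val wordsDF = rdd.map(w => Record(w)).toDF()
  wordsDF.registerTempTable("words")
  sqlContext.sql("select count(*) as total from words").show()
}
```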

However, the method does not seem to be inherited by HiveContext.

I currently create my own singleton class and use it like this:

    val sqlContext = SQLHiveContextSingleton.getInstance(linesRdd.sparkContext)
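For reference, a minimal sketch of such a singleton, mirroring the SQLContextSingleton pattern from the streaming guide but holding a HiveContext instead (assuming Spark 1.x, where `new HiveContext(sc)` is the public constructor):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

object SQLHiveContextSingleton {
  @transient private var instance: HiveContext = _

  // Lazily create one HiveContext per driver JVM and reuse it across batches
  def getInstance(sc: SparkContext): HiveContext = synchronized {
    if (instance == null) {
      instance = new HiveContext(sc)
    }
    instance
  }
}
```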



However, I am not sure if this is reliable. What would be the best approach?

Any examples?







--

View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-get-the-singleton-instance-of-SQLContext-HiveContext-val-sqlContext-SQLContext-getOrCreate-rd-tp26399.html

Sent from the Apache Spark User List mailing list archive at Nabble.com.



---------------------------------------------------------------------

To unsubscribe, e-mail: user-unsubscr...@spark.apache.org

For additional commands, e-mail: user-h...@spark.apache.org



