Yes. But in order to access methods available only in HiveContext, a cast is required.
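A minimal sketch of the cast pattern being described, assuming Hive support is enabled, the Hive classes are on the classpath, and the underlying object really is a HiveContext; refreshTable is used here only as an example of a method that historically lived on HiveContext, and the table name is hypothetical:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.hive.HiveContext

val spark = SparkSession.builder()
  .enableHiveSupport()   // requires Hive classes on the classpath
  .getOrCreate()

// sqlContext is statically typed as SQLContext; reaching a
// Hive-only method requires an explicit downcast:
val hiveCtx = spark.sqlContext.asInstanceOf[HiveContext]
hiveCtx.refreshTable("some_table")  // hypothetical table name
```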
On Tuesday, July 19, 2016, Maciej Bryński wrote:
@Reynold Xin,
How will this work with Hive support?
Will SparkSession.sqlContext return a HiveContext?
2016-07-19 0:26 GMT+02:00 Reynold Xin:
Good idea.
https://github.com/apache/spark/pull/14252
From what I read, there are no more contexts:
"SparkContext, SQLContext, HiveContext merged into SparkSession"
I have not tested it, so I don't know whether that's entirely true.
Cheers,
Ben
On Mon, Jul 18, 2016 at 12:16 PM, Michael Armbrust wrote:
+ dev, reynold
Yeah, that's a good point. I wonder if SparkSession.sqlContext should be public/deprecated?
On Mon, Jul 18, 2016 at 8:37 AM, Koert Kuipers wrote:
In my codebase I would like to gradually transition to SparkSession, so while I start using SparkSession I also want a SQLContext to be available as before (but with a deprecation warning when I use it). This should be easy since SQLContext is now a wrapper for SparkSession.
so basically:
val
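A minimal sketch of the transition pattern described above (the app name and builder options are illustrative, assuming Spark 2.0 APIs):

```scala
import org.apache.spark.sql.{SparkSession, SQLContext}

val spark: SparkSession = SparkSession.builder()
  .appName("transition-example")  // illustrative name
  .getOrCreate()

// Old code keeps compiling against a SQLContext handle, which in
// Spark 2.0 is a thin wrapper around the underlying SparkSession.
val sqlContext: SQLContext = spark.sqlContext
```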