Re: Spark 2.1 and Hive Metastore

2017-04-09 Thread Benjamin Kim
Dan,

Yes, you’re correct. I sent it to the wrong users’ group.

Thanks,
Ben


> On Apr 9, 2017, at 1:21 PM, Dan Burkert wrote:
> 
> Hi Ben,
> 
> Was this meant for the Spark user list, or is there something specific to the 
> Spark/Kudu integration you are asking about?
> 
> - Dan
> 
> On Sun, Apr 9, 2017 at 11:13 AM, Benjamin Kim wrote:
> I’m curious if and when Spark SQL will ever remove its dependency on the 
> Hive Metastore. Now that Spark 2.1’s SparkSession has superseded the need for 
> HiveContext, are there plans for Spark to replace the Hive Metastore service 
> with a “SparkSchema” service backed by a PostgreSQL, MySQL, etc. DB? Hive is 
> growing long in the tooth, and it would be nice to retire it someday.
> 
> Cheers,
> Ben
> 



Re: Spark 2.1 and Hive Metastore

2017-04-09 Thread Dan Burkert
Hi Ben,

Was this meant for the Spark user list, or is there something specific to
the Spark/Kudu integration you are asking about?

- Dan

On Sun, Apr 9, 2017 at 11:13 AM, Benjamin Kim wrote:

> I’m curious if and when Spark SQL will ever remove its dependency on the
> Hive Metastore. Now that Spark 2.1’s SparkSession has superseded the need
> for HiveContext, are there plans for Spark to replace the Hive Metastore
> service with a “SparkSchema” service backed by a PostgreSQL, MySQL, etc. DB?
> Hive is growing long in the tooth, and it would be nice to retire it someday.
>
> Cheers,
> Ben


Spark 2.1 and Hive Metastore

2017-04-09 Thread Benjamin Kim
I’m curious if and when Spark SQL will ever remove its dependency on the Hive 
Metastore. Now that Spark 2.1’s SparkSession has superseded the need for 
HiveContext, are there plans for Spark to replace the Hive Metastore service 
with a “SparkSchema” service backed by a PostgreSQL, MySQL, etc. DB? Hive is 
growing long in the tooth, and it would be nice to retire it someday.

Cheers,
Ben
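
For reference, a minimal sketch of the distinction behind the question above, 
assuming a stock Spark 2.1 deployment: a SparkSession can already be built with 
the in-memory catalog, which avoids the Hive Metastore service entirely, at the 
cost of catalog metadata that does not survive the session, while 
enableHiveSupport() opts into the Hive-backed catalog, whose metastore DB can 
itself be Derby, PostgreSQL, MySQL, etc. The config names below are the 
standard Spark/Hive ones, but check them against your versions.

    import org.apache.spark.sql.SparkSession

    // In-memory catalog: no Hive Metastore service is contacted, but table
    // metadata created here lives only for the lifetime of this session.
    val sparkNoHive = SparkSession.builder()
      .appName("in-memory-catalog")
      .config("spark.sql.catalogImplementation", "in-memory")
      .getOrCreate()

    // Hive-backed catalog: metadata is persisted in the Hive Metastore, whose
    // own backing DB (Derby, PostgreSQL, MySQL, ...) is configured through
    // hive-site.xml, e.g. javax.jdo.option.ConnectionURL.
    // val sparkWithHive = SparkSession.builder().enableHiveSupport().getOrCreate()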

Re: Spark on Kudu Roadmap

2017-04-09 Thread Benjamin Kim
Hi Mike,

Thanks for the link. I guess further, deeper Spark integration is slowly 
coming. As for when, we will have to wait and see.

Cheers,
Ben
 

> On Mar 27, 2017, at 12:25 PM, Mike Percy wrote:
> 
> Hi Ben,
> I don't really know, so I'll let someone else more familiar with the Spark 
> integration chime in on that. However, I searched the Kudu JIRA and I don't 
> see a tracking ticket filed on this (the closest thing I could find was 
> https://issues.apache.org/jira/browse/KUDU-1676), so you may want to file a 
> JIRA to help track this feature.
> 
> Mike
> 
> 
> On Mon, Mar 27, 2017 at 11:55 AM, Benjamin Kim wrote:
> Hi Mike,
> 
> I believe what we are looking for is the following. It is an often-requested 
> use case.
> 
> Anyone know if the Spark package will ever allow for creating tables in Spark 
> SQL?
> 
> Such as:
>     CREATE EXTERNAL TABLE 
>     USING org.apache.kudu.spark.kudu
>     OPTIONS (Map("kudu.master" -> "", "kudu.table" -> "table-name"));
> 
> In this way, plain SQL can be used for DDL and DML statements, whether in 
> Spark SQL code or over JDBC against the Spark SQL Thrift server.
> 
> Thanks,
> Ben
> 
> 
> 
>> On Mar 27, 2017, at 11:01 AM, Mike Percy wrote:
>> 
>> Hi Ben,
>> Is there anything in particular you are looking for?
>> 
>> Thanks,
>> Mike
>> 
>> On Mon, Mar 27, 2017 at 9:48 AM, Benjamin Kim wrote:
>> Hi,
>> 
>> Are there any plans for deeper integration with Spark, especially Spark SQL? 
>> Is there a roadmap to look at, so I can know what to expect in the future?
>> 
>> Cheers,
>> Ben
>> 
> 
>
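
Regarding the CREATE EXTERNAL TABLE ... USING question quoted above: until that 
kind of DDL is supported, a common workaround is to load the Kudu table through 
the kudu-spark package and register it as a temporary view, so that plain SQL 
works for queries within the session. A rough sketch, assuming a Spark 2.x 
shell with kudu-spark on the classpath and an existing SparkSession named 
spark; the master address and table name below are placeholders:

    // Load the Kudu table as a DataFrame. "kudu-master:7051" and "my_table"
    // are placeholders for the real master address and table name.
    val df = spark.read
      .options(Map("kudu.master" -> "kudu-master:7051", "kudu.table" -> "my_table"))
      .format("org.apache.kudu.spark.kudu")
      .load()

    // Register a session-scoped view so SQL (or the Thrift server, within the
    // same session) can query it.
    df.createOrReplaceTempView("my_table")
    spark.sql("SELECT COUNT(*) FROM my_table").show()

The view definition is not persisted across sessions, which is exactly why 
metastore-backed DDL, as asked above, would be attractive.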