Re: Using Spark SQL to Create JDBC Tables

2016-09-13 Thread ayan guha
I did not install it myself, as it is part of Oracle's product. However, you
can bring in any SerDe yourself and add it to the library path. See this
blog for more information.
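To illustrate the general pattern of "bringing in a SerDe yourself", here is a rough Hive sketch. The jar path, handler class, and connection properties are made-up placeholders for illustration, not any vendor's actual artifact names:

```sql
-- Sketch only: jar path, handler class, and property keys are
-- hypothetical placeholders, not a real vendor's SerDe.
ADD JAR /opt/hive/aux-jars/some-jdbc-serde.jar;

CREATE EXTERNAL TABLE pg_orders (
  order_id INT,
  amount   DOUBLE
)
STORED BY 'com.example.hive.JdbcStorageHandler'
TBLPROPERTIES (
  'jdbc.url'   = 'jdbc:postgresql://pg-host:5432/sales',
  'jdbc.table' = 'public.orders'
);
```

The exact class name and table properties depend on the specific SerDe/storage handler you drop in; check its documentation.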

On Wed, Sep 14, 2016 at 2:15 PM, Benjamin Kim  wrote:

> Thank you for the idea. I will look for a PostgreSQL SerDe for Hive. But,
> if you don’t mind me asking, how did you install the Oracle SerDe?
>
> Cheers,
> Ben
>
>
>
> On Sep 13, 2016, at 7:12 PM, ayan guha  wrote:
>
> One option is to have Hive as the central point of exposing data, i.e.
> create Hive tables which "point to" any other DB. I know Oracle provides
> their own SerDe for Hive. Not sure about PG, though.
>
> Once tables are created in Hive, STS will automatically see them.
>
> On Wed, Sep 14, 2016 at 11:08 AM, Benjamin Kim  wrote:
>
>> Has anyone created tables using Spark SQL that connect directly to a JDBC
>> data source such as PostgreSQL? I would like to use the Spark SQL
>> Thriftserver to access and query remote PostgreSQL tables. That way, we
>> could centralize data access through Spark SQL, covering both Spark SQL
>> tables and PostgreSQL, which would be very convenient for users. They
>> would no longer know or care where the data is physically located.
>>
>> By the way, our users only know SQL.
>>
>> If anyone has a better suggestion, then please let me know too.
>>
>> Thanks,
>> Ben
>> -
>> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>>
>>
>
>
> --
> Best Regards,
> Ayan Guha
>
>
>


-- 
Best Regards,
Ayan Guha


Re: Using Spark SQL to Create JDBC Tables

2016-09-13 Thread Benjamin Kim
Thank you for the idea. I will look for a PostgreSQL SerDe for Hive. But, if
you don’t mind me asking, how did you install the Oracle SerDe?

Cheers,
Ben


> On Sep 13, 2016, at 7:12 PM, ayan guha  wrote:
> 
> One option is to have Hive as the central point of exposing data, i.e. create
> Hive tables which "point to" any other DB. I know Oracle provides their own
> SerDe for Hive. Not sure about PG, though.
>
> Once tables are created in Hive, STS will automatically see them.
> 
> On Wed, Sep 14, 2016 at 11:08 AM, Benjamin Kim  wrote:
> Has anyone created tables using Spark SQL that connect directly to a JDBC
> data source such as PostgreSQL? I would like to use the Spark SQL Thriftserver
> to access and query remote PostgreSQL tables. That way, we could centralize
> data access through Spark SQL, covering both Spark SQL tables and PostgreSQL,
> which would be very convenient for users. They would no longer know or care
> where the data is physically located.
> 
> By the way, our users only know SQL.
> 
> If anyone has a better suggestion, then please let me know too.
> 
> Thanks,
> Ben
> 
> 
> 
> 
> 
> -- 
> Best Regards,
> Ayan Guha



Re: Using Spark SQL to Create JDBC Tables

2016-09-13 Thread ayan guha
One option is to have Hive as the central point of exposing data, i.e. create
Hive tables which "point to" any other DB. I know Oracle provides their own
SerDe for Hive. Not sure about PG, though.

Once tables are created in Hive, STS will automatically see them.
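Alternatively, Spark SQL's own JDBC data source can define such a table directly, without a Hive SerDe. A minimal sketch, assuming the standard Spark JDBC data source and a PostgreSQL driver on the classpath (host, database, credentials, and table names below are placeholders):

```sql
-- Sketch only: connection details are placeholders. Requires the
-- PostgreSQL JDBC driver jar on the Thriftserver's classpath.
CREATE TABLE pg_orders
USING org.apache.spark.sql.jdbc
OPTIONS (
  url 'jdbc:postgresql://pg-host:5432/sales',
  dbtable 'public.orders',
  user 'spark_ro',
  password '...'
);
```

Once the table is registered in the shared metastore, SQL-only users connected through STS should be able to query it like any other table, e.g. `SELECT order_id, amount FROM pg_orders LIMIT 10;`.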

On Wed, Sep 14, 2016 at 11:08 AM, Benjamin Kim  wrote:

> Has anyone created tables using Spark SQL that connect directly to a JDBC
> data source such as PostgreSQL? I would like to use the Spark SQL Thriftserver
> to access and query remote PostgreSQL tables. That way, we could centralize
> data access through Spark SQL, covering both Spark SQL tables and PostgreSQL,
> which would be very convenient for users. They would no longer know or care
> where the data is physically located.
>
> By the way, our users only know SQL.
>
> If anyone has a better suggestion, then please let me know too.
>
> Thanks,
> Ben
>
>


-- 
Best Regards,
Ayan Guha