Hive and Spark SQL both use HDFS for storage and the Hive metastore for table
metadata; the only thing that changes is the processing engine. You can try
copying your hive-site.xml into %SPARK_HOME%/conf/hive-site.xml (make sure the
file contains the metastore connection details).

It's a bit of a hack and I haven't tried this exact setup myself, but I have
played around with the metastore and it should work.
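As a minimal sketch of what this would look like once hive-site.xml is in
place (assuming a Spark 1.x build compiled with Hive support; the app name and
table name below are hypothetical):

```scala
// Sketch: querying existing Hive tables from Spark SQL via HiveContext.
// Assumes hive-site.xml (with metastore connection details) is on the
// classpath, e.g. in $SPARK_HOME/conf.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveFromSpark {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("hive-from-spark"))

    // HiveContext picks up the metastore configured in hive-site.xml,
    // so tables created in Hive become visible to Spark SQL.
    val hiveCtx = new HiveContext(sc)

    // Should now list the Hive-created tables.
    hiveCtx.sql("SHOW TABLES").collect().foreach(println)

    // Transform data from an existing Hive table, e.g.:
    // hiveCtx.sql("SELECT * FROM my_hive_table LIMIT 10").show()
  }
}
```

If SHOW TABLES comes back empty, that usually means Spark is falling back to
a local embedded metastore instead of reading your hive-site.xml.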

On Fri, Mar 27, 2015 at 12:04 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:

> I have a few tables that were created in Hive. I want to transform the data
> stored in these Hive tables using Spark SQL. Is this even possible?
>
> So far I have seen that I can create new tables using the Spark SQL dialect.
> However, when I run show tables or desc hive_table it says the table was not
> found.
>
> I am now wondering: is this supported in Spark SQL or not?
>
> --
> Deepak
>
>


-- 

*Arush Kharbanda* || Technical Teamlead

ar...@sigmoidanalytics.com || www.sigmoidanalytics.com
