Yes, it works for me. Make sure the Spark machine can reach the Hive
metastore machine.
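
For reference, the metastore location Spark uses comes from the hive-site.xml
you copy into the Spark conf folder; a typical entry looks something like this
(the host and port below are only placeholders for your metastore machine):

  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://your-metastore-host:9083</value>
  </property>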

On Thu, Mar 26, 2015 at 6:55 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:

> Did you manage to connect to the Hive metastore from Spark SQL? I copied the
> Hive conf file into the Spark conf folder, but when I run "show tables" or
> "select * from dw_bid" (dw_bid is stored in Hive), it says the table is not
> found.
>
>
>
> On Thu, Mar 26, 2015 at 11:43 PM, Chang Lim <chang...@gmail.com> wrote:
>
>> Solved. In the IDE, the project settings were missing the dependent lib
>> jars (the jar files under spark-xx/lib). When these jars are not set, I get
>> a class-not-found error about the datanucleus classes (compared to an
>> out-of-memory error in the Spark Shell).
>>
>> In the context of the Spark Shell, these dependent jars need to be passed
>> in at the spark-shell command line.
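>>
>> For example, something along these lines (the exact jar names and versions
>> depend on what ships under your spark-xx/lib):
>>
>>   bin/spark-shell --jars lib/datanucleus-api-jdo-3.2.6.jar,lib/datanucleus-core-3.2.10.jar,lib/datanucleus-rdbms-3.2.9.jar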
>>
>>
>>
>
>
> --
> Deepak
>
>
