Thanks Ted, I know we can do that in spark-shell; can we set the same in the spark-sql shell?

If I don't set a HiveContext, my understanding is that Spark uses its own SQL
and date functions, right? Like, for example, interval?
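
This is the kind of query I mean (a rough sketch; I believe interval literals
work along these lines in the spark-sql shell):

spark-sql> SELECT current_timestamp() + INTERVAL 1 DAY;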

Thanks
Sri


Sent from my iPhone

> On 21 May 2016, at 08:19, Ted Yu <yuzhih...@gmail.com> wrote:
> 
> In spark-shell:
> 
> scala> import org.apache.spark.sql.hive.HiveContext
> import org.apache.spark.sql.hive.HiveContext
> 
> scala> val hc: HiveContext = new HiveContext(sc)
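> 
> Hive-flavoured queries can then be run through it, e.g. (a minimal sketch):
> 
> scala> hc.sql("SELECT current_date()").show()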
> 
> FYI
> 
>> On Sat, May 21, 2016 at 8:11 AM, Sri <kali.tumm...@gmail.com> wrote:
>> Hi,
>> 
>> You mean the hive-site.xml file, right? I did place hive-site.xml in the Spark
>> conf directory, but I'm not sure how certain Spark date functions like interval
>> are still working.
>> Hive 0.14 doesn't have an interval function, so how is Spark managing to do that?
>> Does Spark have its own date functions? I am using the spark-sql shell, for your
>> information.
>> 
>> Can I use hiveContext.sql in the spark-sql shell, as we do in a traditional
>> Spark Scala application?
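>> 
>> By a "traditional Spark Scala application" I mean something like this minimal
>> sketch (the object name is just a placeholder):
>> 
>> import org.apache.spark.{SparkConf, SparkContext}
>> import org.apache.spark.sql.hive.HiveContext
>> 
>> object HiveContextExample {
>>   def main(args: Array[String]): Unit = {
>>     val conf = new SparkConf().setAppName("HiveContextExample")
>>     val sc = new SparkContext(conf)
>>     val hiveContext = new HiveContext(sc)
>>     // Run a Hive-flavoured query through the HiveContext
>>     hiveContext.sql("SHOW TABLES").show()
>>     sc.stop()
>>   }
>> }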
>> 
>> Thanks
>> Sri
>> 
>> Sent from my iPhone
>> 
>>> On 21 May 2016, at 02:24, Mich Talebzadeh <mich.talebza...@gmail.com> wrote:
>>> 
>>> So you want to use Hive version 0.14 with Spark 1.6?
>>> 
>>> Go to the $SPARK_HOME/conf directory and create a softlink to the hive-site.xml file:
>>> 
>>> cd $SPARK_HOME
>>> hduser@rhes564: /usr/lib/spark-1.6.1-bin-hadoop2.6> cd conf
>>> hduser@rhes564: /usr/lib/spark-1.6.1-bin-hadoop2.6/conf> ls -ltr
>>> 
>>> lrwxrwxrwx  1 hduser hadoop   32 May  3 17:48 hive-site.xml -> 
>>> /usr/lib/hive/conf/hive-site.xml
>>> 
>>> You can see the softlink in mine. Just create one as below:
>>> 
>>> ln -s /usr/lib/hive/conf/hive-site.xml hive-site.xml
>>> 
>>> 
>>> That should work
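>>> 
>>> Alternatively, per the Spark 1.6 documentation (I have not tested this
>>> myself), you can point Spark SQL at a different Hive metastore version via
>>> configuration, for example in conf/spark-defaults.conf:
>>> 
>>> spark.sql.hive.metastore.version  0.14.0
>>> spark.sql.hive.metastore.jars     maven
>>> 
>>> Note that this controls the metastore client version only; Spark SQL still
>>> uses its own built-in functions and syntax (which is why interval works)
>>> regardless of the Hive version.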
>>> 
>>> HTH
>>> 
>>> Dr Mich Talebzadeh
>>>  
>>> LinkedIn  
>>> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>>  
>>> http://talebzadehmich.wordpress.com
>>>  
>>> 
>>>> On 21 May 2016 at 00:57, kali.tumm...@gmail.com <kali.tumm...@gmail.com> 
>>>> wrote:
>>>> Hi All ,
>>>> 
>>>> Is there a way to ask Spark and spark-sql to use Hive version 0.14 instead
>>>> of the inbuilt Hive 1.2.1?
>>>> 
>>>> I am testing spark-sql locally, having downloaded Spark 1.6 from the internet.
>>>> I want to execute my Hive queries in Spark SQL using Hive version 0.14. Can I
>>>> go back to the previous version just for a simple test?
>>>> 
>>>> Please share the steps involved.
>>>> 
>>>> 
>>>> Thanks
>>>> Sri
>>>> 
>>>> 
>>>> 
>>>> --
>>>> View this message in context: 
>>>> http://apache-spark-user-list.1001560.n3.nabble.com/set-spark-1-6-with-Hive-0-14-tp26989.html
>>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>> 
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>> 
>>> 
> 
