Thanks Ted, I know how to do it in spark-shell; can we set the same in the spark-sql shell?
If I don't set a HiveContext, from my understanding Spark is using its own SQL
and date functions, right? Like, for example, interval?
Thanks
Sri
Sent from my iPhone
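For reference, Spark's own Catalyst parser (since 1.5) accepts INTERVAL literals natively, so interval arithmetic can work even without a HiveContext; a minimal spark-shell sketch, assuming the default `sqlContext` the shell creates:

```scala
// spark-shell (Spark 1.6) provides sqlContext automatically.
// The INTERVAL literal below is handled by Spark's own SQL parser, not Hive.
sqlContext.sql("SELECT current_date() + INTERVAL 1 DAY AS tomorrow").show()
```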
> On 21 May 2016, at 08:19, Ted Yu
In spark-shell:
scala> import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.HiveContext
scala> var hc: HiveContext = new HiveContext(sc)
FYI
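For completeness, once created the HiveContext can execute HiveQL directly; a short sketch (the query is illustrative):

```scala
import org.apache.spark.sql.hive.HiveContext

val hc = new HiveContext(sc) // sc is the SparkContext provided by spark-shell
hc.sql("SHOW TABLES").show() // goes through the Hive metastore configured in hive-site.xml
```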
On Sat, May 21, 2016 at 8:11 AM, Sri wrote:
Hi,
You mean the hive-site.xml file, right? I did place the hive-site.xml in the Spark
conf directory, but I'm not sure how certain Spark date functions like interval are still
working.
Hive 0.14 doesn't have an interval function, so how is Spark managing to do that?
Does Spark have its own date functions? I am using
What is the motivation to use such an old version of Hive? This will lead to
lower performance and other risks.
> On 21 May 2016, at 01:57, "kali.tumm...@gmail.com"
> wrote:
So you want to use Hive version 0.14 with Spark 1.6?
Go to the directory $SPARK_HOME/conf and create a softlink to the hive-site.xml file:
cd $SPARK_HOME
hduser@rhes564: /usr/lib/spark-1.6.1-bin-hadoop2.6> cd conf
hduser@rhes564: /usr/lib/spark-1.6.1-bin-hadoop2.6/conf> ls -ltr
lrwxrwxrwx 1
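As an alternative to the softlink, Spark 1.6 can also be pointed at an older Hive metastore through configuration; a sketch for spark-defaults.conf (the jar paths are placeholders for your own installation):

```
# Illustrative values; adjust paths to your Hive 0.14 and Hadoop installs
spark.sql.hive.metastore.version  0.14.0
spark.sql.hive.metastore.jars     /opt/hive-0.14.0/lib/*:/opt/hadoop/lib/*
```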
Hi All,
Is there a way to ask Spark and spark-sql to use Hive version 0.14 instead
of the inbuilt Hive 1.2.1?
I am testing spark-sql locally by downloading Spark 1.6 from the internet. I
want to execute my Hive queries in Spark SQL using Hive version 0.14; can I
go back to a previous version just for a