You need to include the Hadoop native library in your spark-shell/spark-sql session,
assuming your Hadoop native library includes the native Snappy library.
spark-sql --driver-library-path point_to_your_hadoop_native_library
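A minimal sketch of the invocation, assuming the Hadoop native libraries live in the common default location /usr/lib/hadoop/lib/native (the path is an assumption; adjust it for your install):

```
# Verify Snappy is available in your native build first:
#   hadoop checknative -a
# Then pass the native library directory to the driver:
spark-sql --driver-library-path /usr/lib/hadoop/lib/native
```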
In spark-sql, you can use any command just as you would in the Hive CLI.
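For instance, typical Hive CLI commands run unchanged (the table name below is a hypothetical example):

```
SHOW TABLES;
-- my_table is hypothetical; substitute one of your own Hive tables
SELECT COUNT(*) FROM my_table;
```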
Yong
Date: Wed,
Hi, Robert
Spark SQL currently only supports Hive 0.12.0 (which requires re-compiling the package)
and 0.13.1 (the default); I am not sure whether it supports a Hive 0.14 metastore
service as the backend. Another way you can try is to configure
$SPARK_HOME/conf/hive-site.xml to access the remote metastore.
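A minimal hive-site.xml sketch for pointing Spark SQL at a remote metastore; the host name is a placeholder, and 9083 is the usual Thrift metastore default port:

```
<configuration>
  <property>
    <!-- metastore-host is a placeholder; 9083 is the default Thrift port -->
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>
```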