One option is to read the data via JDBC; however, this is probably the worst 
option, as you will likely need some hacky workarounds to enable parallel 
reading in Spark SQL.
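For reference, a minimal sketch of the JDBC route (assuming the Hive JDBC 
driver is on the classpath and run from a Spark 1.x spark-shell where 
sqlContext is predefined; the host, database, table, and numeric partition 
column are placeholders). The extra column/bound arguments are the "hacky" 
part needed to get a parallel read instead of a single partition:

    // Sketch only: read a table through HiveServer2's JDBC endpoint.
    import java.util.Properties

    val props = new Properties()
    props.setProperty("driver", "org.apache.hive.jdbc.HiveDriver")

    // Without the column/bounds arguments Spark reads through one partition;
    // supplying them splits the read into numPartitions parallel queries.
    val df = sqlContext.read.jdbc(
      "jdbc:hive2://hiveserver-host:10000/default", // placeholder URL
      "my_table",                                   // placeholder table
      "id",                                         // numeric partition column
      0L,                                           // lowerBound
      1000000L,                                     // upperBound
      8,                                            // numPartitions
      props)

    df.show()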
Another option is to copy the hive-site.xml of your Hive Server to 
$SPARK_HOME/conf. Spark SQL will then see everything the Hive Server does, 
and you can load the Hive tables as needed.
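With hive-site.xml in place, a minimal sketch of this option (Spark 1.x 
style, run from spark-shell where sc is predefined; the database and table 
names are placeholders):

    // Sketch only: with hive-site.xml copied to $SPARK_HOME/conf, a
    // HiveContext points at the same metastore the Hive Server uses.
    import org.apache.spark.sql.hive.HiveContext

    val hiveContext = new HiveContext(sc)

    // Query an existing Hive table directly.
    val df = hiveContext.sql("SELECT * FROM mydb.my_table")
    df.show()

    // Or load the table without writing SQL:
    val df2 = hiveContext.table("mydb.my_table")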


-----Original Message-----
From: Hafiz Mujadid [mailto:hafizmujadi...@gmail.com] 
Sent: Monday, October 12, 2015 1:43 AM
To: user@spark.apache.org
Subject: Hive with apache spark

Hi

How can we read data from an external Hive server? The Hive server is running 
and I want to read data remotely using Spark. Is there any example?


thanks



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Hive-with-apache-spark-tp25020.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.



---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
