Can you try a Hive JDBC Java client from Eclipse and query a Hive table 
successfully?
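Something like the sketch below would do as a smoke test. The hostname and port are placeholders for your HiveServer2 endpoint, and you need hive-jdbc (plus its transitive dependencies) on the classpath; everything else is plain java.sql.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcSmokeTest {

    // Builds a HiveServer2 JDBC URL. host/port/db here are placeholders --
    // substitute your cluster's values.
    static String hiveJdbcUrl(String host, int port, String db) {
        return "jdbc:hive2://" + host + ":" + port + "/" + db;
    }

    public static void main(String[] args) {
        String url = hiveJdbcUrl("your-hiveserver2-host", 10000, "default");
        System.out.println("Connecting to " + url);
        try {
            // Requires the hive-jdbc jar on the classpath.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(url, "", "");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("show tables")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        } catch (Exception e) {
            // A failure here (driver missing, connection refused, auth errors)
            // points at the environment/classpath rather than at Spark.
            System.out.println("JDBC check failed: " + e);
        }
    }
}
```

If this query succeeds but the Spark app still sees nothing, the problem is on the Spark configuration side rather than the cluster.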

That way we can narrow down where the issue is.
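Also worth checking: does the hive-site.xml that Spark picks up actually carry the remote metastore URI? Something like the fragment below (the host is a placeholder for your metastore host):

```xml
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://your-metastore-host:9083</value>
</property>
```

When that property is missing, HiveContext silently falls back to a local embedded Derby metastore, which would explain "show tables" returning an empty list.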


Sent from my iPhone

> On May 23, 2016, at 5:26 PM, Ajay Chander <itsche...@gmail.com> wrote:
> 
> I downloaded the Spark 1.5 utilities and exported SPARK_HOME pointing to it. 
> I copied all the cluster configuration files (hive-site.xml, hdfs-site.xml, 
> etc.) into ${SPARK_HOME}/conf/. My application looks like this:
> 
> 
> import org.apache.spark.SparkConf;
> import org.apache.spark.api.java.JavaSparkContext;
> import org.apache.spark.sql.DataFrame;
> import org.apache.spark.sql.hive.HiveContext;
> 
> public class SparkSqlTest {
> 
>     public static void main(String[] args) {
>         SparkConf sc = new SparkConf().setAppName("SQL_Test").setMaster("local");
>         JavaSparkContext jsc = new JavaSparkContext(sc);
>         // HiveContext looks for hive-site.xml on the classpath to locate the metastore
>         HiveContext hiveContext = new HiveContext(jsc.sc());
>         DataFrame sampleDataFrame = hiveContext.sql("show tables");
>         sampleDataFrame.show();
>     }
> }
> 
> 
> 
> I am expecting my application to return all the tables from the default 
> database, but somehow it returns an empty list. I am just wondering if I need 
> to add anything to my code to point it to the Hive metastore. Thanks for your 
> time. Any pointers are appreciated.
> 
> 
> 
> Regards,
> 
> Aj
> 
> 
> 
>> On Monday, May 23, 2016, Ajay Chander <itsche...@gmail.com> wrote:
>> Hi Everyone,
>> 
>> I am building a Java Spark application in the Eclipse IDE. From my application 
>> I want to use HiveContext to read tables from the remote Hive (Hadoop) cluster. 
>> On my machine I have exported $HADOOP_CONF_DIR = {$HOME}/hadoop/conf/. This 
>> path has all the remote cluster conf files (hive-site.xml, hdfs-site.xml, ...). 
>> Somehow I am not able to communicate with the remote cluster from my app. Is 
>> there any additional configuration work that I am supposed to do to get it to 
>> work? I specified master as 'local' in the code. Thank you.
>> 
>> Regards,
>> Aj
