Error in Hive on Spark

2016-03-10 Thread Stana
System.out.println("Error Message:" + res.getErrorMessage()); System.out.println("SQL State:" + res.getSQLState()); } } Exception from the spark engine: 16/03/10 18:32:58 INFO SparkClientImpl: Running client driver with argv: /Volumes/Sdhd/Documents ...
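
For context, a minimal sketch of the kind of embedded-Driver call the snippet above appears to come from; the query, table name, and surrounding setup are assumptions for illustration, not Stana's actual application:

    // Minimal sketch (assumed setup): run a query through Hive's embedded Driver
    // with the Spark execution engine and print the error fields shown above.
    import org.apache.hadoop.hive.conf.HiveConf;
    import org.apache.hadoop.hive.ql.Driver;
    import org.apache.hadoop.hive.ql.processors.CommandProcessorResponse;
    import org.apache.hadoop.hive.ql.session.SessionState;

    public class HiveOnSparkTest {
        public static void main(String[] args) throws Exception {
            HiveConf hiveConf = new HiveConf();
            // Switch from the default engine (mr) to Spark.
            hiveConf.set("hive.execution.engine", "spark");
            SessionState.start(hiveConf);

            Driver driver = new Driver(hiveConf);
            CommandProcessorResponse res = driver.run("SELECT count(*) FROM some_table");
            if (res.getResponseCode() != 0) {
                System.out.println("Error Message:" + res.getErrorMessage());
                System.out.println("SQL State:" + res.getSQLState());
            }
        }
    }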

Re: Error in Hive on Spark

2016-03-10 Thread Stana
Xuefu Zhang: > You can probably avoid the problem by setting the environment variable SPARK_HOME > or the JVM property spark.home so that it points to your Spark installation. > > --Xuefu > > On Thu, Mar 10, 2016 at 3:11 AM, Stana wrote: > > > I am trying out Hive on Spark with Hive 2.0.0 and sp...
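
A minimal sketch of Xuefu's suggestion as it might look on the client side; the /opt/spark path is a placeholder, and the property must be set before the Hive Spark client is created:

    // Sketch of Xuefu's suggestion: point the JVM at a local Spark installation
    // before any Hive/Spark classes are used. The path is a placeholder.
    public class SparkHomeSetup {
        public static void main(String[] args) {
            // Option 1: JVM property, set before HiveConf/Driver are created.
            System.setProperty("spark.home", "/opt/spark");

            // Option 2 (outside the JVM): export SPARK_HOME=/opt/spark in the
            // environment that launches this application.

            // ... continue with the HiveConf/Driver setup from the earlier snippet.
        }
    }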

Re: Error in Hive on Spark

2016-03-20 Thread Stana
Does anyone have suggestions for setting the hive-exec-2.0.0.jar path property in the application? Something like 'hiveConf.set("hive.remote.driver.jar","hdfs://storm0:9000/tmp/spark-assembly-1.4.1-hadoop2.6.0.jar")'. 2016-03-11 10:53 GMT+08:00 Stana: > Thanks for ...
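
A sketch of the programmatic configuration being asked about; apart from hive.remote.driver.jar and the hdfs://storm0:9000 path quoted in the thread, the property names and paths below are assumptions, not a verified recipe:

    // Sketch of the configuration discussed in the thread. hive.remote.driver.jar
    // and the hdfs://storm0:9000/... path come from Stana's message; the rest is
    // illustrative only.
    import org.apache.hadoop.hive.conf.HiveConf;

    public class HiveOnSparkConf {
        public static HiveConf build() {
            HiveConf hiveConf = new HiveConf();
            hiveConf.set("hive.execution.engine", "spark");
            // Jar the remote Spark driver should pick up, served from HDFS.
            hiveConf.set("hive.remote.driver.jar",
                    "hdfs://storm0:9000/tmp/spark-assembly-1.4.1-hadoop2.6.0.jar");
            // The open question in the thread: is there an equivalent property
            // for the hive-exec-2.0.0.jar path? (hypothetical placeholder below)
            // hiveConf.set("<property for hive-exec jar>",
            //         "hdfs://storm0:9000/tmp/hive-exec-2.0.0.jar");
            return hiveConf;
        }
    }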

Re: Error in Hive on Spark

2016-03-22 Thread Stana
Hi Xuefu, You are right. Maybe I should launch spark-submit through HS2 or the Hive CLI? Thanks a lot, Stana 2016-03-22 1:16 GMT+08:00 Xuefu Zhang: > Stana, > > I'm not sure if I fully understand the problem. spark-submit is launched on > the same host as your application, which ...
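
If the HS2 route is taken, the client could be reduced to a plain JDBC call so that spark-submit runs on the HiveServer2 host instead of inside the application; the host, port, credentials, and table name below are placeholders:

    // Sketch of the HS2 alternative discussed above: HiveServer2 (on a host with
    // Spark installed) launches spark-submit, and the client stays a JDBC call.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveOnSparkViaHs2 {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            // The trailing ?hive.execution.engine=spark sets a per-session Hive conf.
            String url = "jdbc:hive2://hs2-host:10000/default?hive.execution.engine=spark";
            try (Connection conn = DriverManager.getConnection(url, "hive", "");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT count(*) FROM some_table")) {
                while (rs.next()) {
                    System.out.println("count = " + rs.getLong(1));
                }
            }
        }
    }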