Send time: ..., 2015 4:11 AM
To: guxiaobo1...@qq.com;
Cc: user@spark.apache.org; Cheng Lian lian.cs@gmail.com;
Subject: Re: Can't access remote Hive table from spark
Yes. You need to create xiaobogu under ...
From: ...@hortonworks.com;
Send time: Thursday, Feb 12, 2015 2:00 AM
To: guxiaobo1...@qq.com;
Cc: user@spark.apache.org; Cheng Lian lian.cs@gmail.com;
Subject: Re: Can't access remote Hive table from spark
You need to have the right HDFS account, e.g., hdfs, to create the directory ...
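For reference, creating a user's HDFS home directory with the hdfs superuser account usually looks like the sketch below. The path /user/xiaobogu and the group name are illustrative assumptions, not quoted from the thread:

```shell
# Sketch: run as a user with sudo rights; the hdfs account owns /user.
# /user/xiaobogu and the group name are assumed examples.
sudo -u hdfs hdfs dfs -mkdir -p /user/xiaobogu
sudo -u hdfs hdfs dfs -chown xiaobogu:xiaobogu /user/xiaobogu
```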
-- Original --
From: Zhan Zhang zzh...@hortonworks.com;
Send time: Friday, Feb 6, 2015 2:55 PM
To: guxiaobo1...@qq.com;
Cc: user@spark.apache.org; Cheng Lian lian.cs@gmail.com;
Subject: Re: Can't access remote Hive table from spark
Not sure about spark standalone mode, but on spark-on-yarn it should work. You can
check the following link:
http://hortonworks.com/hadoop-tutorial/using-apache-spark-hdp/
Thanks.
Zhan Zhang
On Feb 5, 2015, at 5:02 PM, Cheng Lian lian.cs@gmail.com wrote:
Please note that Spark 1.2.0 /only/ supports Hive 0.13.1 /or/ 0.12.0;
none of the other versions are supported.
Best,
Cheng
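For context, the bundled Hive support Cheng refers to is compiled in via the -Phive build profile. A sketch of the standard Spark 1.2.0 source build (adjust profiles to your cluster; this is not quoted from the thread):

```shell
# Sketch: run from a Spark 1.2.0 source checkout.
# -Phive builds in the bundled Hive 0.13.1 support.
mvn -DskipTests -Phive -Phive-thriftserver clean package
```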
On 1/25/15 12:18 AM, guxiaobo1982 wrote:
Hi,
I built and started a single node standalone Spark 1.2.0 cluster along
with a single node Hive 0.14.0 instance installed by ...
To: Jörn Franke jornfra...@gmail.com;
Subject: Re: Can't access remote Hive table from spark
I am sorry, I forgot to say that I have created the table manually.
On Feb 1, 2015, at 4:14 PM, Jörn Franke jornfra...@gmail.com wrote:
You commented out the line which is supposed to create a table.
On Jan 25 ...
From: ...@gmail.com;
Send time: Monday, Jan 26, 2015 7:41 AM
To: guxiaobo1...@qq.com; user@spark.apache.org;
Subject: RE: Can't access remote Hive table from spark
This happened to me as well, putting hive-site.xml inside conf doesn't seem to
work. Instead I added /etc/hive/conf to SPARK_CLASSPATH and it worked. You can
try this approach.
Cc: 徐涛 77044...@qq.com;
Subject: Re: RE: Can't access remote Hive table from spark
Hi Skanda,
How do you set up your SPARK_CLASSPATH?
I added the following line to my SPARK_HOME/conf/spark-env.sh and still got the
same error.
export SPARK_CLASSPATH=${SPARK_CLASSPATH}:/etc/hive/conf
This happened to me as well, putting hive-site.xml inside conf doesn't seem to
work. Instead I added /etc/hive/conf to SPARK_CLASSPATH and it worked. You can
try this approach.
-Skanda
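One quick way to see whether the Hive configuration is actually being picked up is to list tables from a HiveContext in spark-shell. This is a sketch assuming a Spark 1.2.x build with Hive support, run from SPARK_HOME; the tables shown will depend on your metastore:

```shell
# Sketch: pipe a short script into spark-shell (Spark 1.2.x with -Phive assumed).
echo 'val hc = new org.apache.spark.sql.hive.HiveContext(sc)
hc.sql("SHOW TABLES").collect().foreach(println)' | ./bin/spark-shell
```

If the metastore tables appear, hive-site.xml is on the classpath; if only an empty local metastore shows up, the configuration is not being found.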
-----Original Message-----
From: guxiaobo1982 guxiaobo1...@qq.com
Sent: 25-01-2015 13:50
To: ...