Seems like you have "hive.server2.enable.doAs" enabled; you can either
disable it, or configure hs2 so that the user running the service
("hadoop" in your case) can impersonate others.
See:
https://hadoop.apache.org/docs/r2.7.0/hadoop-project-dist/hadoop-common/Superusers.html
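For reference, the Superusers doc above describes impersonation via proxyuser entries in Hadoop's core-site.xml. A minimal sketch for the "hadoop" service user would look roughly like this (the wildcard values are placeholders; in production you would normally restrict them to specific hosts and groups):

```xml
<!-- core-site.xml: allow the "hadoop" user to impersonate other users -->
<property>
  <name>hadoop.proxyuser.hadoop.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hadoop.groups</name>
  <value>*</value>
</property>
```

The NameNode (and ResourceManager) must be restarted or have their configuration refreshed for proxyuser changes to take effect.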
-Original Message-
From: Marcelo Vanzin [mailto:van...@cloudera.com]
Sent: Friday, September 25, 2015 1:12 PM
To: Garry Chen <g...@cornell.edu>
Cc: Jimmy Xiang <jxi...@cloudera.com>; user@spark.apache.org
Subject: Re: hive on spark query error
On Fri, Sep 25, 2015, Garry Chen wrote:
> Error: Master must start with yarn, spark, mesos, or local
What's your setting for spark.master?
On Fri, Sep 25, 2015 at 9:56 AM, Garry Chen
<g...@cornell.edu> wrote:
Hi All,
I am following
https://cwiki.apache.o
On Fri, Sep 25, 2015 at 10:05 AM, Garry Chen wrote:
> In spark-defaults.conf the spark.master is spark://hostname:7077. From
> hive-site.xml
> spark.master
> hostname
>
That's not a valid value for spark.master (as the error indicates).
You should set it to a full master URL, like the
spark://hostname:7077 value you already have in spark-defaults.conf.
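Putting that together, a hive-site.xml entry pointing Hive-on-Spark at a standalone master would look roughly like this sketch (hostname and port are placeholders matching the spark-defaults.conf value quoted above):

```xml
<!-- hive-site.xml: spark.master must be a full master URL,
     not a bare hostname -->
<property>
  <name>spark.master</name>
  <value>spark://hostname:7077</value>
</property>
```

Other accepted forms are the ones the error message lists: yarn, mesos://host:port, and local / local[*] / local[N].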