The error is: "Master must start with yarn, spark, mesos, or local". Please set
spark.master to either local[*] or yarn-cluster.
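
As a sketch of the fix (assuming YARN mode and the standard hive-site.xml location; adjust the value for your cluster), spark.master can be set permanently in Hive's configuration:

```xml
<!-- hive-site.xml: run Hive-on-Spark jobs on YARN
     (assumes a working YARN cluster; use local[*] for a single-node test) -->
<property>
  <name>spark.master</name>
  <value>yarn-cluster</value>
</property>
```

Alternatively, `set spark.master=local[*];` issued inside beeline applies the setting for the current session only, which is handy for a quick sanity check before editing hive-site.xml.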

On Fri, Sep 25, 2015 at 10:45 PM, Xuefu Zhang <xzh...@cloudera.com> wrote:

> What's the value of spark.master in your case? The error specifically says
> something is wrong with it.
>
> --Xuefu
>
> On Fri, Sep 25, 2015 at 9:18 AM, Garry Chen <g...@cornell.edu> wrote:
>
>> Hi All,
>>
>>                 I am following
>> https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started?
>> to set up Hive on Spark.  After setup/configuration everything starts up and I
>> am able to show tables, but when executing a SQL statement within beeline I get
>> an error.  Please help, and thank you very much.
>>
>>
>>
>> Cluster Environment (3 nodes) as following
>>
>> hadoop-2.7.1
>>
>> spark-1.4.1-bin-hadoop2.6
>>
>> zookeeper-3.4.6
>>
>> apache-hive-1.2.1-bin
>>
>>
>>
>> Error from hive log:
>>
>> 2015-09-25 11:51:03,123 INFO  [HiveServer2-Handler-Pool: Thread-50]:
>> client.SparkClientImpl (SparkClientImpl.java:startDriver(375)) - Attempting
>> impersonation of oracle
>>
>> 2015-09-25 11:51:03,133 INFO  [HiveServer2-Handler-Pool: Thread-50]:
>> client.SparkClientImpl (SparkClientImpl.java:startDriver(409)) - Running
>> client driver with argv:
>> /u01/app/spark-1.4.1-bin-hadoop2.6/bin/spark-submit --proxy-user oracle
>> --properties-file /tmp/spark-submit.840692098393819749.properties --class
>> org.apache.hive.spark.client.RemoteDriver
>> /u01/app/apache-hive-1.2.1-bin/lib/hive-exec-1.2.1.jar --remote-host
>> ip-10-92-82-229.ec2.internal --remote-port 40476 --conf
>> hive.spark.client.connect.timeout=1000 --conf
>> hive.spark.client.server.connect.timeout=90000 --conf
>> hive.spark.client.channel.log.level=null --conf
>> hive.spark.client.rpc.max.size=52428800 --conf
>> hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256
>>
>> 2015-09-25 11:51:03,867 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(569)) - Warning: Ignoring non-spark config
>> property: hive.spark.client.server.connect.timeout=90000
>>
>> 2015-09-25 11:51:03,868 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(569)) - Warning: Ignoring non-spark config
>> property: hive.spark.client.rpc.threads=8
>>
>> 2015-09-25 11:51:03,868 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(569)) - Warning: Ignoring non-spark config
>> property: hive.spark.client.connect.timeout=1000
>>
>> 2015-09-25 11:51:03,868 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(569)) - Warning: Ignoring non-spark config
>> property: hive.spark.client.secret.bits=256
>>
>> 2015-09-25 11:51:03,868 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(569)) - Warning: Ignoring non-spark config
>> property: hive.spark.client.rpc.max.size=52428800
>>
>> 2015-09-25 11:51:03,876 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(569)) - Error: Master must start with yarn,
>> spark, mesos, or local
>>
>> 2015-09-25 11:51:03,876 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(569)) - Run with --help for usage help or
>> --verbose for debug output
>>
>> 2015-09-25 11:51:03,885 INFO  [stderr-redir-1]: client.SparkClientImpl
>> (SparkClientImpl.java:run(569)) - 15/09/25 11:51:03 INFO util.Utils:
>> Shutdown hook called
>>
>> 2015-09-25 11:51:03,889 WARN  [Driver]: client.SparkClientImpl
>> (SparkClientImpl.java:run(427)) - Child process exited with code 1.
>>
>>
>>
>
>
