[ https://issues.apache.org/jira/browse/SPARK-16292?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Alpha updated SPARK-16292:
--------------------------
    Comment: was deleted

(was: Hi Arcflash,

I am also facing this problem. Could you tell me the solution?

Thank you!)

> Failed to create spark client
> -----------------------------
>
>                 Key: SPARK-16292
>                 URL: https://issues.apache.org/jira/browse/SPARK-16292
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>         Environment: hadoop-2.6.0
> hive-2.1.0
>            Reporter: Arcflash
>
> When I use Hive on Spark, I get this error:
> {noformat}
> Failed to execute spark task, with exception 
> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark 
> client.)'
> FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.spark.SparkTask
> {noformat}
> My settings:
> {noformat}
> set hive.execution.engine=spark;
> set spark.home=/opt/spark1.6.0;
> set spark.master=192.168.3.111;
> set spark.eventLog.enabled=true;
> set spark.eventLog.dir=/tmp;
> set spark.executor.memory=512m;
> set spark.serializer=org.apache.spark.serializer.KryoSerializer;
> {noformat}
> Exceptions seen:
> {noformat}
> 16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Loading spark 
> defaults: file:/opt/hive2.1/conf/spark-defaults.conf
> 16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Running client driver 
> with argv: /opt/spark1.6.0/bin/spark-submit --properties-file 
> /tmp/spark-submit.7397226318023137500.properties --class 
> org.apache.hive.spark.client.RemoteDriver 
> /opt/hive2.1/lib/hive-exec-2.1.0.jar --remote-host master-0 --remote-port 
> 34055 --conf hive.spark.client.connect.timeout=1000 --conf 
> hive.spark.client.server.connect.timeout=90000 --conf 
> hive.spark.client.channel.log.level=null --conf 
> hive.spark.client.rpc.max.size=52428800 --conf 
> hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 
> --conf hive.spark.client.rpc.server.address=null
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=90000
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.secret.bits=256
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: 16/06/30 
> 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl: Exception in 
> thread "main" java.lang.NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:        at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:        at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:        at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:        at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:        at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:        at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:        at 
> java.lang.reflect.Method.invoke(Method.java:497)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:        at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:        at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:        at 
> org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:        at 
> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:        at 
> org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 16/06/30 16:17:04 [main]: ERROR client.SparkClientImpl: Error while waiting 
> for client to connect.
> java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel 
> client 'ea6abea2-3346-4cfe-8c3e-ede49c8e6ae5'. Error: Child process exited 
> before connecting back with error log Warning: Ignoring non-spark config 
> property: hive.spark.client.server.connect.timeout=90000
> Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> Warning: Ignoring non-spark config property: 
> hive.spark.client.connect.timeout=1000
> Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
> Warning: Ignoring non-spark config property: 
> hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> Exception in thread "main" java.lang.NoSuchFieldError: 
> SPARK_RPC_SERVER_ADDRESS
>         at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
>         at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
>         at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>         at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
>         at 
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
>         at 
> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:99)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:95)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:67)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:89)
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197)
>         at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1858)
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1562)
>         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1313)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1084)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1072)
>         at 
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:232)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:183)
>         at 
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:399)
>         at 
> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:776)
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:714)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: java.lang.RuntimeException: Cancel client 
> 'ea6abea2-3346-4cfe-8c3e-ede49c8e6ae5'. Error: Child process exited before 
> connecting back with error log Warning: Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=90000
> Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> Warning: Ignoring non-spark config property: 
> hive.spark.client.connect.timeout=1000
> Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
> Warning: Ignoring non-spark config property: 
> hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> Exception in thread "main" java.lang.NoSuchFieldError: 
> SPARK_RPC_SERVER_ADDRESS
>         at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
>         at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
>         at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>         at 
> org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:179)
>         at 
> org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:465)
>         at java.lang.Thread.run(Thread.java:745)
> 16/06/30 16:17:04 [Driver]: WARN client.SparkClientImpl: Child process exited 
> with code 1
> Failed to execute spark task, with exception 
> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark 
> client.)'
> 16/06/30 16:17:04 [main]: ERROR spark.SparkTask: Failed to execute spark 
> task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed 
> to create spark client.)'
> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark 
> client.
>         at 
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:64)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:89)
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197)
>         at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1858)
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1562)
>         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1313)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1084)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1072)
>         at 
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:232)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:183)
>         at 
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:399)
>         at 
> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:776)
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:714)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: java.lang.RuntimeException: 
> java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel 
> client 'ea6abea2-3346-4cfe-8c3e-ede49c8e6ae5'. Error: Child process exited 
> before connecting back with error log Warning: Ignoring non-spark config 
> property: hive.spark.client.server.connect.timeout=90000
> Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> Warning: Ignoring non-spark config property: 
> hive.spark.client.connect.timeout=1000
> Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
> Warning: Ignoring non-spark config property: 
> hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> Exception in thread "main" java.lang.NoSuchFieldError: 
> SPARK_RPC_SERVER_ADDRESS
>         at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
>         at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
>         at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>         at com.google.common.base.Throwables.propagate(Throwables.java:160)
>         at 
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:120)
>         at 
> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:99)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:95)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:67)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62)
>         ... 22 more
> Caused by: java.util.concurrent.ExecutionException: 
> java.lang.RuntimeException: Cancel client 
> 'ea6abea2-3346-4cfe-8c3e-ede49c8e6ae5'. Error: Child process exited before 
> connecting back with error log Warning: Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=90000
> Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> Warning: Ignoring non-spark config property: 
> hive.spark.client.connect.timeout=1000
> Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
> Warning: Ignoring non-spark config property: 
> hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> Exception in thread "main" java.lang.NoSuchFieldError: 
> SPARK_RPC_SERVER_ADDRESS
>         at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
>         at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
>         at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>         at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
>         at 
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
>         ... 27 more
> Caused by: java.lang.RuntimeException: Cancel client 
> 'ea6abea2-3346-4cfe-8c3e-ede49c8e6ae5'. Error: Child process exited before 
> connecting back with error log Warning: Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=90000
> Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> Warning: Ignoring non-spark config property: 
> hive.spark.client.connect.timeout=1000
> Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
> Warning: Ignoring non-spark config property: 
> hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> Exception in thread "main" java.lang.NoSuchFieldError: 
> SPARK_RPC_SERVER_ADDRESS
>         at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
>         at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
>         at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>         at 
> org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:179)
>         at 
> org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:465)
>         at java.lang.Thread.run(Thread.java:745)
> 16/06/30 16:17:04 [main]: ERROR spark.SparkTask: Failed to execute spark 
> task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed 
> to create spark client.)'
> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark 
> client.
>         at 
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:64)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:89)
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197)
>         at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1858)
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1562)
>         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1313)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1084)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1072)
>         at 
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:232)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:183)
>         at 
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:399)
>         at 
> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:776)
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:714)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: java.lang.RuntimeException: 
> java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel 
> client 'ea6abea2-3346-4cfe-8c3e-ede49c8e6ae5'. Error: Child process exited 
> before connecting back with error log Warning: Ignoring non-spark config 
> property: hive.spark.client.server.connect.timeout=90000
> Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> Warning: Ignoring non-spark config property: 
> hive.spark.client.connect.timeout=1000
> Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
> Warning: Ignoring non-spark config property: 
> hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> Exception in thread "main" java.lang.NoSuchFieldError: 
> SPARK_RPC_SERVER_ADDRESS
>         at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
>         at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
>         at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>         at com.google.common.base.Throwables.propagate(Throwables.java:160)
>         at 
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:120)
>         at 
> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:99)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:95)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:67)
>         at 
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62)
>         ... 22 more
> Caused by: java.util.concurrent.ExecutionException: 
> java.lang.RuntimeException: Cancel client 
> 'ea6abea2-3346-4cfe-8c3e-ede49c8e6ae5'. Error: Child process exited before 
> connecting back with error log Warning: Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=90000
> Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> Warning: Ignoring non-spark config property: 
> hive.spark.client.connect.timeout=1000
> Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
> Warning: Ignoring non-spark config property: 
> hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> Exception in thread "main" java.lang.NoSuchFieldError: 
> SPARK_RPC_SERVER_ADDRESS
>         at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
>         at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
>         at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>         at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
>         at 
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
>         ... 27 more
> Caused by: java.lang.RuntimeException: Cancel client 
> 'ea6abea2-3346-4cfe-8c3e-ede49c8e6ae5'. Error: Child process exited before 
> connecting back with error log Warning: Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=90000
> Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> Warning: Ignoring non-spark config property: 
> hive.spark.client.connect.timeout=1000
> Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
> Warning: Ignoring non-spark config property: 
> hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> Exception in thread "main" java.lang.NoSuchFieldError: 
> SPARK_RPC_SERVER_ADDRESS
>         at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
>         at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
>         at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>         at 
> org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:179)
>         at 
> org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:465)
>         at java.lang.Thread.run(Thread.java:745)
> FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.spark.SparkTask
> {noformat}
> When I connect via beeline, it works fine.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
