[ https://issues.apache.org/jira/browse/SPARK-17700?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16769888#comment-16769888 ]

t oo commented on SPARK-17700:
------------------------------

Did you fix this?

> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark 
> client
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-17700
>                 URL: https://issues.apache.org/jira/browse/SPARK-17700
>             Project: Spark
>          Issue Type: Bug
>         Environment: hadoop2.6.0
> spark1.6.0
> hive2.1.0
>            Reporter: Alpha
>            Priority: Major
>
> I configured Hive on Spark, and I get this error:
> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark 
> client
> Exception log:
> Warning: Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=90000
> Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> Warning: Ignoring non-spark config property: 
> hive.spark.client.connect.timeout=1000
> Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
> Warning: Ignoring non-spark config property: 
> hive.spark.client.rpc.max.size=52428800
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in 
> [jar:file:/home/sparkadm/path/spark-1.6.0-bin-hadoop2.6/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in 
> [jar:file:/home/sparkadm/path/hive2.1.0/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in 
> [jar:file:/home/sparkadm/path/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
> explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 16/09/28 09:24:29 INFO client.RemoteDriver: Connecting to: master:54286
> Exception in thread "main" java.lang.NoSuchFieldError: 
> SPARK_RPC_SERVER_ADDRESS
>       at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
>       at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
>       at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>       at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>       at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>       at 
> org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:179)
>       at 
> org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:465)
>       ... 1 more
> 2016-09-28T09:24:30,034 ERROR [HiveServer2-Background-Pool: Thread-56]: 
> spark.SparkTask (:()) - Failed to execute spark task, with exception 
> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark 
> client.)'
> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark 
> client.
>       at 
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:64)
>       at 
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
>       at 
> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
>       at 
> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:89)
>       at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197)
>       at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
>       at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1858)
>       at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1562)
>       at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1313)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1084)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1077)
>       at 
> org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:235)
>       at 
> org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:90)
>       at 
> org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:299)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>       at 
> org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:312)
>       at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.RuntimeException: 
> java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel 
> client '2fcea7a2-b497-4939-b0ad-2f052fd6fd88'. Error: Child process exited 
> before connecting back with error log Warning: Ignoring non-spark config 
> property: hive.spark.client.server.connect.timeout=90000
> Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> Warning: Ignoring non-spark config property: 
> hive.spark.client.connect.timeout=1000
> Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
> Warning: Ignoring non-spark config property: 
> hive.spark.client.rpc.max.size=52428800
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in 
> [jar:file:/home/sparkadm/path/spark-1.6.0-bin-hadoop2.6/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in 
> [jar:file:/home/sparkadm/path/hive2.1.0/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in 
> [jar:file:/home/sparkadm/path/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
> explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 16/09/28 09:24:29 INFO client.RemoteDriver: Connecting to: master:54286
> Exception in thread "main" java.lang.NoSuchFieldError: 
> SPARK_RPC_SERVER_ADDRESS
>       at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
>       at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
>       at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>       at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>       at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>       at com.google.common.base.Throwables.propagate(Throwables.java:160)
>       at 
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:120)
>       at 
> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>       at 
> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:99)
>       at 
> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:95)
>       at 
> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:67)
>       at 
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62)
>       ... 22 more
> Caused by: java.util.concurrent.ExecutionException: 
> java.lang.RuntimeException: Cancel client 
> '2fcea7a2-b497-4939-b0ad-2f052fd6fd88'. Error: Child process exited before 
> connecting back with error log Warning: Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=90000
> Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> Warning: Ignoring non-spark config property: 
> hive.spark.client.connect.timeout=1000
> Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
> Warning: Ignoring non-spark config property: 
> hive.spark.client.rpc.max.size=52428800
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in 
> [jar:file:/home/sparkadm/path/spark-1.6.0-bin-hadoop2.6/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in 
> [jar:file:/home/sparkadm/path/hive2.1.0/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in 
> [jar:file:/home/sparkadm/path/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
> explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 16/09/28 09:24:29 INFO client.RemoteDriver: Connecting to: master:54286
> Exception in thread "main" java.lang.NoSuchFieldError: 
> SPARK_RPC_SERVER_ADDRESS
>       at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
>       at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
>       at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>       at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>       at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>       at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
>       at 
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:104)
>       ... 27 more
> Caused by: java.lang.RuntimeException: Cancel client 
> '2fcea7a2-b497-4939-b0ad-2f052fd6fd88'. Error: Child process exited before 
> connecting back with error log Warning: Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=90000
> Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> Warning: Ignoring non-spark config property: 
> hive.spark.client.connect.timeout=1000
> Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
> Warning: Ignoring non-spark config property: 
> hive.spark.client.rpc.max.size=52428800
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in 
> [jar:file:/home/sparkadm/path/spark-1.6.0-bin-hadoop2.6/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in 
> [jar:file:/home/sparkadm/path/hive2.1.0/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in 
> [jar:file:/home/sparkadm/path/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
> explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 16/09/28 09:24:29 INFO client.RemoteDriver: Connecting to: master:54286
> Exception in thread "main" java.lang.NoSuchFieldError: 
> SPARK_RPC_SERVER_ADDRESS
>       at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:45)
>       at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
>       at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>       at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>       at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>       at 
> org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:179)
>       at 
> org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:465)
>       ... 1 more
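
For anyone who lands here: the "NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS" above is the classic symptom of a classpath conflict rather than a Spark bug. Hive 2.1.0's org.apache.hive.spark.client.rpc.RpcConfiguration reads HiveConf.ConfVars.SPARK_RPC_SERVER_ADDRESS, but the prebuilt spark-assembly-1.6.0 jar bundles older Hive classes whose HiveConf predates that field, and the bundled copy wins on the classpath when the RemoteDriver starts. Below is a minimal diagnostic and workaround sketch, assuming the jar paths from the log above; the build flags are illustrative, and the Hive on Spark "Getting Started" wiki documents the exact profiles per version.

    # 1) Check whether the Spark assembly bundles its own HiveConf
    #    (if this prints a match, that older class shadows Hive 2.1.0's).
    unzip -l /home/sparkadm/path/spark-1.6.0-bin-hadoop2.6/lib/spark-assembly-1.6.0-hadoop2.6.0.jar \
      | grep 'org/apache/hadoop/hive/conf/HiveConf.class'

    # 2) If it does, rebuild Spark WITHOUT the Hive profile, as Hive on Spark
    #    requires, from the Spark 1.6.0 source tree (flags illustrative):
    ./make-distribution.sh --name hadoop2-without-hive --tgz \
      -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0

    # 3) Point Hive at that distribution, e.g. set spark.home in hive-site.xml
    #    (or export SPARK_HOME) to the new "without hive" build, then retry.

With a Hive-free Spark build on the classpath, the RemoteDriver should load Hive 2.1.0's own HiveConf and the field lookup should succeed.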



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
