Re: error: Failed to create spark client. for hive on spark

2015-03-02 Thread Xuefu Zhang
It seems that the remote Spark context failed to come up. I see you are
using a Spark standalone cluster; please make sure the cluster is up. You
may try spark.master=local first.
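
For example, a minimal sanity check from the Hive CLI (this reuses the
engine setting and query from your session; only spark.master changes):

  hive> set hive.execution.engine=spark;
  hive> set spark.master=local;
  hive> select count(1) from src;

If local mode works, the problem is likely in reaching the standalone
master; double-check that it is running and that the master URL is correct
(a standalone master listens on port 7077 by default).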

On Mon, Mar 2, 2015 at 5:15 PM, scwf wrote:

> Yes, I have placed the spark-assembly jar in the hive lib folder.

Re: error: Failed to create spark client. for hive on spark

2015-03-02 Thread scwf

Yes, I have placed the spark-assembly jar in the hive lib folder.

hive.log---
bmit.2317151720491931059.properties --class 
org.apache.hive.spark.client.RemoteDriver 
/opt/cluster/apache-hive-1.2.0-SNAPSHOT-bin/lib/hive-exec-1.2.0-SNAPSHOT.jar 
--remote-host M151 --remote-port 56996 --conf 
hive.spark.client.connect.timeout=1 --conf 
hive.spark.client.server.connect.timeout=9 --conf 
hive.spark.client.channel.log.level=null --conf 
hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 
--conf hive.spark.client.secret.bits=256
2015-03-02 20:33:39,893 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - Warning: Ignoring non-spark config property: 
hive.spark.client.connect.timeout=1
2015-03-02 20:33:39,894 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - Warning: Ignoring non-spark config property: 
hive.spark.client.rpc.threads=8
2015-03-02 20:33:39,894 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - Warning: Ignoring non-spark config property: 
hive.spark.client.rpc.max.size=52428800
2015-03-02 20:33:39,894 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - Warning: Ignoring non-spark config property: 
hive.spark.client.secret.bits=256
2015-03-02 20:33:39,894 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - Warning: Ignoring non-spark config property: 
hive.spark.client.server.connect.timeout=9
2015-03-02 20:33:40,002 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - 15/03/02 20:33:40 INFO client.RemoteDriver: 
Connecting to: M151:56996
2015-03-02 20:33:40,005 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - Exception in thread "main" 
java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
2015-03-02 20:33:40,005 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - at 
org.apache.hive.spark.client.rpc.RpcConfiguration.<init>(RpcConfiguration.java:46)
2015-03-02 20:33:40,005 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - at 
org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:139)
2015-03-02 20:33:40,005 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - at 
org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:544)
2015-03-02 20:33:40,006 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2015-03-02 20:33:40,006 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
2015-03-02 20:33:40,006 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2015-03-02 20:33:40,006 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - at 
java.lang.reflect.Method.invoke(Method.java:601)
2015-03-02 20:33:40,006 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - at 
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
2015-03-02 20:33:40,006 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - at 
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
2015-03-02 20:33:40,006 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - at 
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
2015-03-02 20:33:40,006 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - at 
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
2015-03-02 20:33:40,006 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(553)) - at 
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2015-03-02 20:33:40,410 WARN  [Driver]: client.SparkClientImpl 
(SparkClientImpl.java:run(411)) - Child process exited with code 1.
2015-03-02 20:35:08,950 WARN  [main]: client.SparkClientImpl 
(SparkClientImpl.java:<init>(98)) - Error while waiting for client to connect.
java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: 
Timed out waiting for client connection.
        at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
        at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:96)
        at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
        at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:88)
        at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkC

Re: error: Failed to create spark client. for hive on spark

2015-03-02 Thread Xuefu Zhang
Could you check your hive.log and spark.log for a more detailed error
message? As a quick check, do you have the spark-assembly jar in your Hive
lib folder?
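
For example (the paths below are taken from your mail; this assumes the
standard Spark 1.x layout, with the assembly jar under $SPARK_HOME/lib):

  # is the Spark assembly on Hive's classpath?
  ls /opt/cluster/apache-hive-1.2.0-SNAPSHOT-bin/lib/ | grep spark-assembly

  # if not, copy it over from the Spark distribution
  cp /opt/cluster/spark-1.3.0-bin-hadoop2-without-hive/lib/spark-assembly-*.jar \
     /opt/cluster/apache-hive-1.2.0-SNAPSHOT-bin/lib/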

Thanks,
Xuefu

On Mon, Mar 2, 2015 at 5:14 AM, scwf wrote:

> Hi all,
>   has anyone met this error: HiveException(Failed to create spark client.)?


error: Failed to create spark client. for hive on spark

2015-03-02 Thread scwf

Hi all,
  has anyone met this error: HiveException(Failed to create spark client.)?

M151:/opt/cluster/apache-hive-1.2.0-SNAPSHOT-bin # bin/hive

Logging initialized using configuration in 
jar:file:/opt/cluster/apache-hive-1.2.0-SNAPSHOT-bin/lib/hive-common-1.2.0-SNAPSHOT.jar!/hive-log4j.properties
[INFO] Unable to bind key for unsupported operation: backward-delete-word
[INFO] Unable to bind key for unsupported operation: backward-delete-word
[INFO] Unable to bind key for unsupported operation: down-history
[INFO] Unable to bind key for unsupported operation: up-history
[INFO] Unable to bind key for unsupported operation: up-history
[INFO] Unable to bind key for unsupported operation: down-history
[INFO] Unable to bind key for unsupported operation: up-history
[INFO] Unable to bind key for unsupported operation: down-history
[INFO] Unable to bind key for unsupported operation: up-history
[INFO] Unable to bind key for unsupported operation: down-history
[INFO] Unable to bind key for unsupported operation: up-history
[INFO] Unable to bind key for unsupported operation: down-history
hive> set spark.home=/opt/cluster/spark-1.3.0-bin-hadoop2-without-hive;
hive> set hive.execution.engine=spark;
hive> set spark.master=spark://9.91.8.151:7070;
hive> select count(1) from src;
Query ID = root_2015030220_4bed4c2a-b9a5-4d99-a485-67570e2712b7
Total jobs = 1
Launching Job 1 out of 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Failed to execute spark task, with exception 
'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark 
client.)'
FAILED: Execution Error, return code 1 from 
org.apache.hadoop.hive.ql.exec.spark.SparkTask

thanks