[jira] [Issue Comment Deleted] (SPARK-16292) Failed to create spark client

2016-09-29 Thread Arcflash (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-16292?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Arcflash updated SPARK-16292:
-
Comment: was deleted

(was: Spark 1.6.0 is compiled with Scala 2.11 by default; using Scala 2.10 may 
cause compilation errors, so I personally recommend compiling with Scala 2.11.

If you use the command

mvn -Pyarn -Phadoop-2.6 -Dscala-2.11 -DskipTests clean package

you may not need the following command to get the tar:

./make-distribution.sh --name "hadoop2-without-hive" --tgz 
"-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided"

P.S. In fact, this command has the same effect as the one above; only the 
resulting tar package ends up with a different name.)

> Failed to create spark client
> -
>
> Key: SPARK-16292
> URL: https://issues.apache.org/jira/browse/SPARK-16292
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
> Environment: hadoop-2.6.0
> hive-2.1.0
>Reporter: Arcflash
>
> When I use Hive on Spark, I get this error:
> {noformat}
> Failed to execute spark task, with exception 
> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark 
> client.)'
> FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.spark.SparkTask
> {noformat}
> My settings:
> {noformat}
> set hive.execution.engine=spark;
> set spark.home=/opt/spark1.6.0;
> set spark.master=192.168.3.111;
> set spark.eventLog.enabled=true;
> set spark.eventLog.dir=/tmp;
> set spark.executor.memory=512m; 
> set spark.serializer=org.apache.spark.serializer.KryoSerializer;
> {noformat}
> Exceptions seen:
> {noformat}
> 16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Loading spark 
> defaults: file:/opt/hive2.1/conf/spark-defaults.conf
> 16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Running client driver 
> with argv: /opt/spark1.6.0/bin/spark-submit --properties-file 
> /tmp/spark-submit.7397226318023137500.properties --class 
> org.apache.hive.spark.client.RemoteDriver 
> /opt/hive2.1/lib/hive-exec-2.1.0.jar --remote-host master-0 --remote-port 
> 34055 --conf hive.spark.client.connect.timeout=1000 --conf 
> hive.spark.client.server.connect.timeout=9 --conf 
> hive.spark.client.channel.log.level=null --conf 
> hive.spark.client.rpc.max.size=52428800 --conf 
> hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 
> --conf hive.spark.client.rpc.server.address=null
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=9
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.secret.bits=256
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: 16/06/30 
> 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl: Exception in 
> thread "main" java.lang.NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<init>(RpcConfiguration.java:45)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> java.lang.reflect.Method.invoke(Method.java:497)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:18

[jira] [Commented] (SPARK-16292) Failed to create spark client

2016-09-29 Thread Arcflash (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-16292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15532056#comment-15532056
 ] 

Arcflash commented on SPARK-16292:
--

Spark 1.6.0 is compiled with Scala 2.11 by default; using Scala 2.10 may cause 
compilation errors, so I personally recommend compiling with Scala 2.11.

If you use the command

mvn -Pyarn -Phadoop-2.6 -Dscala-2.11 -DskipTests clean package

you may not need the following command to get the tar:

./make-distribution.sh --name "hadoop2-without-hive" --tgz 
"-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided"

P.S. In fact, this command has the same effect as the one above; only the 
resulting tar package ends up with a different name.
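To check which Scala version an existing Spark build was actually compiled 
against, spark-submit --version prints it next to the Spark version. A minimal 
check, reusing the /opt/spark1.6.0 path from the reporter's settings:

{noformat}
# Prints the Spark version banner plus a line such as
# "Using Scala version 2.11.7, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_xx"
/opt/spark1.6.0/bin/spark-submit --version
{noformat}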

> Failed to create spark client
> -
>
> Key: SPARK-16292
> URL: https://issues.apache.org/jira/browse/SPARK-16292
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
> Environment: hadoop-2.6.0
> hive-2.1.0
>Reporter: Arcflash
>
> When I use Hive on Spark, I get this error:
> {noformat}
> Failed to execute spark task, with exception 
> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark 
> client.)'
> FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.spark.SparkTask
> {noformat}
> My settings:
> {noformat}
> set hive.execution.engine=spark;
> set spark.home=/opt/spark1.6.0;
> set spark.master=192.168.3.111;
> set spark.eventLog.enabled=true;
> set spark.eventLog.dir=/tmp;
> set spark.executor.memory=512m; 
> set spark.serializer=org.apache.spark.serializer.KryoSerializer;
> {noformat}
> Exceptions seen:
> {noformat}
> 16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Loading spark 
> defaults: file:/opt/hive2.1/conf/spark-defaults.conf
> 16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Running client driver 
> with argv: /opt/spark1.6.0/bin/spark-submit --properties-file 
> /tmp/spark-submit.7397226318023137500.properties --class 
> org.apache.hive.spark.client.RemoteDriver 
> /opt/hive2.1/lib/hive-exec-2.1.0.jar --remote-host master-0 --remote-port 
> 34055 --conf hive.spark.client.connect.timeout=1000 --conf 
> hive.spark.client.server.connect.timeout=9 --conf 
> hive.spark.client.channel.log.level=null --conf 
> hive.spark.client.rpc.max.size=52428800 --conf 
> hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 
> --conf hive.spark.client.rpc.server.address=null
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=9
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.secret.bits=256
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: 16/06/30 
> 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl: Exception in 
> thread "main" java.lang.NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<init>(RpcConfiguration.java:45)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> java.lang.reflect.Method.invoke(Method.java:497)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit$.doRunMa

[jira] [Commented] (SPARK-16292) Failed to create spark client

2016-09-29 Thread Arcflash (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-16292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15532055#comment-15532055
 ] 

Arcflash commented on SPARK-16292:
--

Spark 1.6.0 is compiled with Scala 2.11 by default; using Scala 2.10 may cause 
compilation errors, so I personally recommend compiling with Scala 2.11.

If you use the command

mvn -Pyarn -Phadoop-2.6 -Dscala-2.11 -DskipTests clean package

you may not need the following command to get the tar:

./make-distribution.sh --name "hadoop2-without-hive" --tgz 
"-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided"

P.S. In fact, this command has the same effect as the one above; only the 
resulting tar package ends up with a different name.

> Failed to create spark client
> -
>
> Key: SPARK-16292
> URL: https://issues.apache.org/jira/browse/SPARK-16292
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
> Environment: hadoop-2.6.0
> hive-2.1.0
>Reporter: Arcflash
>
> When I use Hive on Spark, I get this error:
> {noformat}
> Failed to execute spark task, with exception 
> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark 
> client.)'
> FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.spark.SparkTask
> {noformat}
> My settings:
> {noformat}
> set hive.execution.engine=spark;
> set spark.home=/opt/spark1.6.0;
> set spark.master=192.168.3.111;
> set spark.eventLog.enabled=true;
> set spark.eventLog.dir=/tmp;
> set spark.executor.memory=512m; 
> set spark.serializer=org.apache.spark.serializer.KryoSerializer;
> {noformat}
> Exceptions seen:
> {noformat}
> 16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Loading spark 
> defaults: file:/opt/hive2.1/conf/spark-defaults.conf
> 16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Running client driver 
> with argv: /opt/spark1.6.0/bin/spark-submit --properties-file 
> /tmp/spark-submit.7397226318023137500.properties --class 
> org.apache.hive.spark.client.RemoteDriver 
> /opt/hive2.1/lib/hive-exec-2.1.0.jar --remote-host master-0 --remote-port 
> 34055 --conf hive.spark.client.connect.timeout=1000 --conf 
> hive.spark.client.server.connect.timeout=9 --conf 
> hive.spark.client.channel.log.level=null --conf 
> hive.spark.client.rpc.max.size=52428800 --conf 
> hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 
> --conf hive.spark.client.rpc.server.address=null
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=9
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.secret.bits=256
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: 16/06/30 
> 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl: Exception in 
> thread "main" java.lang.NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<init>(RpcConfiguration.java:45)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> java.lang.reflect.Method.invoke(Method.java:497)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit$.doRunMa

[jira] [Commented] (SPARK-16292) Failed to create spark client

2016-09-27 Thread Arcflash (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-16292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15528561#comment-15528561
 ] 

Arcflash commented on SPARK-16292:
--

If you compile Spark with a command like

./make-distribution.sh --tgz --skip-java-test -Pyarn -Phadoop-2.2 
-Dhadoop.version=2.2.0

you may need to specify the Hadoop jars path (one common approach is sketched 
below), or compile the Hadoop jars into Spark, like:

mvn -Pyarn -Phadoop-2.6 -Dscala-2.10 -DskipTests clean package

Spark 1.6.0 is compiled with Scala 2.11 by default; if you use Scala 2.11, the 
-Dscala flag is unnecessary.
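One documented way to hand a hadoop-provided build its Hadoop jars path is the 
SPARK_DIST_CLASSPATH variable in conf/spark-env.sh (see Spark's 
"hadoop-provided" packaging docs). A sketch; the explicit Hadoop location in 
the commented line is only an assumed example:

{noformat}
# conf/spark-env.sh -- point a "hadoop-provided" Spark build at the cluster's
# Hadoop jars at runtime instead of bundling them into the assembly.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)

# If the hadoop launcher is not on PATH, spell out its location, e.g.:
# export SPARK_DIST_CLASSPATH=$(/opt/hadoop-2.6.0/bin/hadoop classpath)
{noformat}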

> Failed to create spark client
> -
>
> Key: SPARK-16292
> URL: https://issues.apache.org/jira/browse/SPARK-16292
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
> Environment: hadoop-2.6.0
> hive-2.1.0
>Reporter: Arcflash
>
> When I use Hive on Spark, I get this error:
> {noformat}
> Failed to execute spark task, with exception 
> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark 
> client.)'
> FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.spark.SparkTask
> {noformat}
> My settings:
> {noformat}
> set hive.execution.engine=spark;
> set spark.home=/opt/spark1.6.0;
> set spark.master=192.168.3.111;
> set spark.eventLog.enabled=true;
> set spark.eventLog.dir=/tmp;
> set spark.executor.memory=512m; 
> set spark.serializer=org.apache.spark.serializer.KryoSerializer;
> {noformat}
> Exceptions seen:
> {noformat}
> 16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Loading spark 
> defaults: file:/opt/hive2.1/conf/spark-defaults.conf
> 16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Running client driver 
> with argv: /opt/spark1.6.0/bin/spark-submit --properties-file 
> /tmp/spark-submit.7397226318023137500.properties --class 
> org.apache.hive.spark.client.RemoteDriver 
> /opt/hive2.1/lib/hive-exec-2.1.0.jar --remote-host master-0 --remote-port 
> 34055 --conf hive.spark.client.connect.timeout=1000 --conf 
> hive.spark.client.server.connect.timeout=9 --conf 
> hive.spark.client.channel.log.level=null --conf 
> hive.spark.client.rpc.max.size=52428800 --conf 
> hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 
> --conf hive.spark.client.rpc.server.address=null
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=9
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.secret.bits=256
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: 16/06/30 
> 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl: Exception in 
> thread "main" java.lang.NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<init>(RpcConfiguration.java:45)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> java.lang.reflect.Method.invoke(Method.java:497)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit$.subm

[jira] [Commented] (SPARK-16292) Failed to create spark client

2016-06-29 Thread Arcflash (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-16292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15356301#comment-15356301
 ] 

Arcflash commented on SPARK-16292:
--

Thanks, I checked my settings and it works fine now.

> Failed to create spark client
> -
>
> Key: SPARK-16292
> URL: https://issues.apache.org/jira/browse/SPARK-16292
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
> Environment: hadoop-2.6.0
> hive-2.1.0
>Reporter: Arcflash
>
> When I use Hive on Spark, I get this error:
> {noformat}
> Failed to execute spark task, with exception 
> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark 
> client.)'
> FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.spark.SparkTask
> {noformat}
> My settings:
> {noformat}
> set hive.execution.engine=spark;
> set spark.home=/opt/spark1.6.0;
> set spark.master=192.168.3.111;
> set spark.eventLog.enabled=true;
> set spark.eventLog.dir=/tmp;
> set spark.executor.memory=512m; 
> set spark.serializer=org.apache.spark.serializer.KryoSerializer;
> {noformat}
> Exceptions seen:
> {noformat}
> 16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Loading spark 
> defaults: file:/opt/hive2.1/conf/spark-defaults.conf
> 16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Running client driver 
> with argv: /opt/spark1.6.0/bin/spark-submit --properties-file 
> /tmp/spark-submit.7397226318023137500.properties --class 
> org.apache.hive.spark.client.RemoteDriver 
> /opt/hive2.1/lib/hive-exec-2.1.0.jar --remote-host master-0 --remote-port 
> 34055 --conf hive.spark.client.connect.timeout=1000 --conf 
> hive.spark.client.server.connect.timeout=9 --conf 
> hive.spark.client.channel.log.level=null --conf 
> hive.spark.client.rpc.max.size=52428800 --conf 
> hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 
> --conf hive.spark.client.rpc.server.address=null
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: 
> hive.spark.client.server.connect.timeout=9
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.secret.bits=256
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
> Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
> 16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: 16/06/30 
> 16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl: Exception in 
> thread "main" java.lang.NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.rpc.RpcConfiguration.<init>(RpcConfiguration.java:45)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> java.lang.reflect.Method.invoke(Method.java:497)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> 16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
> org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 16/06/30 16:17:04 [main]: 

[jira] [Created] (SPARK-16292) Failed to create spark client

2016-06-29 Thread Arcflash (JIRA)
Arcflash created SPARK-16292:


 Summary: Failed to create spark client
 Key: SPARK-16292
 URL: https://issues.apache.org/jira/browse/SPARK-16292
 Project: Spark
  Issue Type: Bug
  Components: SQL
 Environment: hadoop-2.6.0
hive-2.1.0
Reporter: Arcflash


When I use Hive on Spark, I get this error:
{noformat}
Failed to execute spark task, with exception 
'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark 
client.)'
FAILED: Execution Error, return code 1 from 
org.apache.hadoop.hive.ql.exec.spark.SparkTask
{noformat}

My settings:
{noformat}
set hive.execution.engine=spark;
set spark.home=/opt/spark1.6.0;
set spark.master=192.168.3.111;
set spark.eventLog.enabled=true;
set spark.eventLog.dir=/tmp;
set spark.executor.memory=512m; 
set spark.serializer=org.apache.spark.serializer.KryoSerializer;
{noformat}
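One detail worth double-checking in the settings above: spark.master is a bare 
IP, but a Spark standalone master URL takes the spark://host:port form. A 
hedged correction, assuming the standalone master's default port 7077:

{noformat}
set spark.master=spark://192.168.3.111:7077;
{noformat}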

Exceptions seen:
{noformat}
16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Loading spark defaults: 
file:/opt/hive2.1/conf/spark-defaults.conf
16/06/30 16:17:02 [main]: INFO client.SparkClientImpl: Running client driver 
with argv: /opt/spark1.6.0/bin/spark-submit --properties-file 
/tmp/spark-submit.7397226318023137500.properties --class 
org.apache.hive.spark.client.RemoteDriver /opt/hive2.1/lib/hive-exec-2.1.0.jar 
--remote-host master-0 --remote-port 34055 --conf 
hive.spark.client.connect.timeout=1000 --conf 
hive.spark.client.server.connect.timeout=9 --conf 
hive.spark.client.channel.log.level=null --conf 
hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 
--conf hive.spark.client.secret.bits=256 --conf 
hive.spark.client.rpc.server.address=null
16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
Ignoring non-spark config property: 
hive.spark.client.server.connect.timeout=9
16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
Ignoring non-spark config property: hive.spark.client.rpc.threads=8
16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
Ignoring non-spark config property: hive.spark.client.secret.bits=256
16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: 
Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
16/06/30 16:17:03 [stderr-redir-1]: INFO client.SparkClientImpl: 16/06/30 
16:17:03 INFO client.RemoteDriver: Connecting to: master-0:34055
16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl: Exception in 
thread "main" java.lang.NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS
16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
org.apache.hive.spark.client.rpc.RpcConfiguration.<init>(RpcConfiguration.java:45)
16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:134)
16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:516)
16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
java.lang.reflect.Method.invoke(Method.java:497)
16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
16/06/30 16:17:04 [stderr-redir-1]: INFO client.SparkClientImpl:at 
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/06/30 16:17:04 [main]: ERROR client.SparkClientImpl: Error while waiting for 
client to connect.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel 
client 'ea6abea2-3346-4cfe-8c3e-ede49c8e6ae5'. Error: Child process exited 
before connecting back with error log Warning: Ignoring non-spark config 
property: hive.spark.client.server.connect.timeout=9
Warning: Ignoring non-spark config prop
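For reference, a NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS at this point is 
the usual signature of a Hive/Spark class mismatch: a Spark assembly built 
with Hive included carries an older copy of HiveConf that shadows the 
hive-exec-2.1.0.jar passed to spark-submit, so the field Hive 2.1's 
RpcConfiguration expects is missing. That is presumably why the build commands 
in this thread name the distribution "hadoop2-without-hive". A quick, hedged 
check for whether the assembly bundles Hive classes (the jar path and name 
pattern are assumptions based on the paths in this thread):

{noformat}
# If this prints any entries, the assembly carries its own Hive classes,
# which can conflict with the hive-exec jar given on the command line.
unzip -l /opt/spark1.6.0/lib/spark-assembly-*.jar | grep 'org/apache/hadoop/hive/conf/HiveConf'
{noformat}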