Ophir,

Can you provide your hive.log here? Also, have you checked your Spark
application log?

When this happens, it usually means that Hive was not able to launch a
Spark application. With Spark on YARN, that application is the
application master. If Hive fails to launch it, or the application master
fails before it can connect back, you would see error messages like these.
For more information, check the Spark application log.
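
If you are on Spark on YARN, the application master's container log can be
pulled with the yarn CLI once you have the application id. A sketch (the
application id below is hypothetical, and the hive.log line format it greps
for is an assumption; adjust to what your log actually prints):

```shell
# Fetch the AM container logs for the failed application (hypothetical id):
#
#   yarn logs -applicationId application_1450000000000_0001
#
# The id of the application Hive submitted is usually printed in hive.log;
# a pattern like this extracts it (a sample line stands in for the real log):
printf 'INFO Client: Submitted application application_1450000000000_0001\n' \
  | grep -oE 'application_[0-9]+_[0-9]+'
```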

--Xuefu

On Tue, Dec 15, 2015 at 2:26 PM, Ophir Etzion <op...@foursquare.com> wrote:

> Hi,
>
> When trying Hive on Spark on CDH 5.4.3, I get the following error when
> running a simple query with Spark.
>
> I've tried setting everything described here (
> https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started)
> as well as what CDH recommends.
>
> Has anyone encountered this as well? (Searching for it didn't help much.)
>
> the error:
>
> ERROR : Failed to execute spark task, with exception
> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark
> client.)'
> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark
> client.
> at
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
> at
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
> at
> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:120)
> at
> org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:97)
> at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
> at
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
> at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1640)
> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1399)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1183)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1044)
> at
> org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:144)
> at
> org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:69)
> at
> org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:196)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
> at
> org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:208)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.RuntimeException:
> java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel
> client '2b2d7314-e0cc-4933-82a1-992a3299d109'. Error: Child process exited
> before connecting back
> at com.google.common.base.Throwables.propagate(Throwables.java:156)
> at
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:109)
> at
> org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
> at
> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:91)
> at
> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:65)
> at
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
> ... 22 more
> Caused by: java.util.concurrent.ExecutionException:
> java.lang.RuntimeException: Cancel client
> '2b2d7314-e0cc-4933-82a1-992a3299d109'. Error: Child process exited before
> connecting back
> at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
> at
> org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:99)
> ... 26 more
> Caused by: java.lang.RuntimeException: Cancel client
> '2b2d7314-e0cc-4933-82a1-992a3299d109'. Error: Child process exited before
> connecting back
> at
> org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:179)
> at
> org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:427)
> ... 1 more
>
> [same "Failed to create spark client" error and stack trace repeated]
> Error: Error while processing statement: FAILED: Execution Error, return
> code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
> (state=08S01,code=1)
>
>
