[ https://issues.apache.org/jira/browse/HIVE-16930?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16057557#comment-16057557 ]
Hive QA commented on HIVE-16930:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12873813/HIVE-16930.1.patch

{color:red}ERROR:{color} -1 due to no test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 14 failed/errored test(s), 10841 tests executed

*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestBeeLineDriver.testCliDriver[insert_overwrite_local_directory_1] (batchId=237)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[tez_smb_main] (batchId=149)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[vector_if_expr] (batchId=145)
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_2] (batchId=99)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query14] (batchId=232)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query16] (batchId=232)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query94] (batchId=232)
org.apache.hadoop.hive.ql.parse.TestReplicationScenarios.testExchangePartition (batchId=216)
org.apache.hadoop.hive.ql.parse.TestReplicationScenariosAcrossInstances.testBootstrapFunctionReplication (batchId=216)
org.apache.hadoop.hive.ql.parse.TestReplicationScenariosAcrossInstances.testCreateFunctionIncrementalReplication (batchId=216)
org.apache.hadoop.hive.ql.parse.TestReplicationScenariosAcrossInstances.testCreateFunctionWithFunctionBinaryJarsOnHDFS (batchId=216)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionRegistrationWithCustomSchema (batchId=177)
org.apache.hive.hcatalog.api.TestHCatClient.testPartitionSpecRegistrationWithCustomSchema (batchId=177)
org.apache.hive.hcatalog.api.TestHCatClient.testTableSchemaPropagation (batchId=177)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5713/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5713/console
Test logs:
http://104.198.109.242/logs/PreCommit-HIVE-Build-5713/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 14 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12873813 - PreCommit-HIVE-Build

> HoS should verify the value of Kerberos principal and keytab file before
> adding them to spark-submit command parameters
> -------------------------------------------------------------------------
>
>                 Key: HIVE-16930
>                 URL: https://issues.apache.org/jira/browse/HIVE-16930
>             Project: Hive
>          Issue Type: Bug
>          Components: Spark
>            Reporter: Yibing Shi
>            Assignee: Yibing Shi
>         Attachments: HIVE-16930.1.patch
>
> When Kerberos is enabled, Hive CLI fails to run Hive on Spark queries:
> {noformat}
> hive -e "set hive.execution.engine=spark; create table if not exists test(a int); select count(*) from test" --hiveconf hive.root.logger=INFO,console > /var/tmp/hive_log.txt /var/tmp/hive_log_2.txt
> 17/06/16 16:13:13 [main]: ERROR client.SparkClientImpl: Error while waiting for client to connect.
> java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'a5de85d1-6933-43e7-986f-5f8e5c001b5f'.
> Error: Child process exited before connecting back with error log Error: Cannot load main class from JAR file:/tmp/spark-submit.7196051517706529285.properties
> Run with --help for usage help or --verbose for debug output
> 	at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
> 	at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:107)
> 	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
> 	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:100)
> 	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:96)
> 	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:66)
> 	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62)
> 	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
> 	at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:111)
> 	at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:97)
> 	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
> 	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
> 	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1972)
> 	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1685)
> 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1421)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1205)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1195)
> 	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:220)
> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:172)
> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:383)
> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:318)
> 	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:720)
> 	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:693)
> 	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
> 	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: java.lang.RuntimeException: Cancel client 'a5de85d1-6933-43e7-986f-5f8e5c001b5f'. Error: Child process exited before connecting back with error log Error: Cannot load main class from JAR file:/tmp/spark-submit.7196051517706529285.properties
> Run with --help for usage help or --verbose for debug output
> 	at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:179)
> 	at org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:490)
> 	at java.lang.Thread.run(Thread.java:745)
> 17/06/16 16:13:13 [Driver]: WARN client.SparkClientImpl: Child process exited with code 1
> {noformat}
> In the log, the following message shows up:
> {noformat}
> 17/06/16 16:13:12 [main]: INFO client.SparkClientImpl: Running client driver with argv: /usr/lib/spark/bin/spark-submit --executor-cores 1 --executor-memory 268435456 --principal hive/nightly58-1.gce.cloudera....@gce.cloudera.com --keytab --properties-file /tmp/spark-submit.7196051517706529285.properties --class org.apache.hive.spark.client.RemoteDriver /usr/lib/hive/lib/hive-exec-1.1.0-cdh5.8.5.jar --remote-host nightly58-1.gce.cloudera.com --remote-port 36074 --conf hive.spark.client.connect.timeout=1000 --conf hive.spark.client.server.connect.timeout=90000 --conf
> hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 --conf hive.spark.client.rpc.server.address=null
> {noformat}
> There is no value after the "--keytab" parameter, which makes the spark-submit command line syntactically invalid.
> Hive should verify the setting before using it.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
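The check the issue title asks for can be sketched as below. This is only a minimal illustration, not the actual HIVE-16930.1.patch: `kerberosArgs` is a hypothetical helper standing in for the argv-building code in Hive's Spark client, and the idea is simply to emit `--principal`/`--keytab` only when both values are non-blank, so spark-submit never receives a flag with its argument missing (as happened with `--keytab` in the log above).

```java
import java.util.ArrayList;
import java.util.List;

public class SparkSubmitArgs {

    // Hypothetical helper: return the Kerberos-related spark-submit
    // arguments, or an empty list if either setting is missing/blank.
    // Appending "--keytab" with no value would make the resulting
    // command line syntactically invalid, as seen in the bug report.
    static List<String> kerberosArgs(String principal, String keytab) {
        List<String> args = new ArrayList<>();
        boolean principalSet = principal != null && !principal.trim().isEmpty();
        boolean keytabSet = keytab != null && !keytab.trim().isEmpty();
        if (principalSet && keytabSet) {
            args.add("--principal");
            args.add(principal);
            args.add("--keytab");
            args.add(keytab);
        }
        return args;
    }

    public static void main(String[] args) {
        // Keytab missing (the scenario in the log): no flags are emitted,
        // instead of a dangling "--keytab".
        System.out.println(kerberosArgs("hive/host@EXAMPLE.COM", ""));
        // Both values present: all four tokens are emitted.
        System.out.println(kerberosArgs("hive/host@EXAMPLE.COM", "/etc/hive.keytab"));
    }
}
```

A real fix would likely also log a warning when exactly one of the two settings is present, since silently dropping both can be as confusing as the original failure.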