[ https://issues.apache.org/jira/browse/FLINK-17576?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dian Fu updated FLINK-17576:
----------------------------
    Labels: test-stability  (was: )

> HiveTableSinkTest and TableEnvHiveConnectorTest are unstable
> ------------------------------------------------------------
>
>                 Key: FLINK-17576
>                 URL: https://issues.apache.org/jira/browse/FLINK-17576
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Hive, Tests
>    Affects Versions: 1.11.0
>            Reporter: Dian Fu
>            Priority: Major
>              Labels: test-stability
>
> HiveTableSinkTest and TableEnvHiveConnectorTest failed with the following 
> exceptions:
> {code:java}
> 2020-05-08T09:38:44.5916441Z [ERROR] 
> testWriteComplexType(org.apache.flink.connectors.hive.HiveTableSinkTest)  
> Time elapsed: 1.362 s  <<< ERROR!
> 2020-05-08T09:38:44.5932270Z java.util.concurrent.ExecutionException: 
> org.apache.flink.runtime.messages.FlinkJobNotFoundException: Could not find 
> Flink job (e27d50c5a780264a576aa8a21a6dd6c6)
> 2020-05-08T09:38:44.5938598Z  at 
> java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
> 2020-05-08T09:38:44.5939435Z  at 
> java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
> 2020-05-08T09:38:44.5939970Z  at 
> org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1663)
> 2020-05-08T09:38:44.5940551Z  at 
> org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:74)
> 2020-05-08T09:38:44.5941188Z  at 
> org.apache.flink.table.planner.delegation.ExecutorBase.execute(ExecutorBase.java:52)
> 2020-05-08T09:38:44.5941834Z  at 
> org.apache.flink.table.api.internal.TableEnvironmentImpl.execute(TableEnvironmentImpl.java:916)
> 2020-05-08T09:38:44.5945405Z  at 
> org.apache.flink.connectors.hive.HiveTableSinkTest.testWriteComplexType(HiveTableSinkTest.java:143)
> 2020-05-08T09:38:44.5946105Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2020-05-08T09:38:44.5946628Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 2020-05-08T09:38:44.5947106Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:38:44.5947770Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:38:44.5948393Z  at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> 2020-05-08T09:38:44.5949102Z  at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> 2020-05-08T09:38:44.5949853Z  at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> 2020-05-08T09:38:44.5950587Z  at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> 2020-05-08T09:38:44.5951763Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runTestMethod(FlinkStandaloneHiveRunner.java:169)
> 2020-05-08T09:38:44.5952660Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:154)
> 2020-05-08T09:38:44.5953829Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:92)
> 2020-05-08T09:38:44.5966233Z  at 
> org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> 2020-05-08T09:38:44.5967051Z  at 
> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
> 2020-05-08T09:38:44.5968062Z  at 
> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
> 2020-05-08T09:38:44.5968949Z  at 
> org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
> 2020-05-08T09:38:44.5969824Z  at 
> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
> 2020-05-08T09:38:44.5970751Z  at 
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> 2020-05-08T09:38:44.5971584Z  at 
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
> 2020-05-08T09:38:44.5972386Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:38:44.5973469Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:38:44.5974147Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:38:44.5974684Z  at 
> org.junit.rules.RunRules.evaluate(RunRules.java:20)
> 2020-05-08T09:38:44.5975227Z  at 
> org.junit.runners.ParentRunner.run(ParentRunner.java:363)
> 2020-05-08T09:38:44.5975827Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
> 2020-05-08T09:38:44.5976533Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
> 2020-05-08T09:38:44.5977627Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
> 2020-05-08T09:38:44.5978552Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
> 2020-05-08T09:38:44.5979527Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
> 2020-05-08T09:38:44.5980556Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
> 2020-05-08T09:38:44.5981696Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
> 2020-05-08T09:38:44.5982559Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
> 2020-05-08T09:38:44.5983422Z Caused by: 
> org.apache.flink.runtime.messages.FlinkJobNotFoundException: Could not find 
> Flink job (e27d50c5a780264a576aa8a21a6dd6c6)
> 2020-05-08T09:38:44.5984215Z  at 
> org.apache.flink.runtime.dispatcher.Dispatcher.requestJobResult(Dispatcher.java:563)
> 2020-05-08T09:38:44.5984798Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2020-05-08T09:38:44.5985384Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 2020-05-08T09:38:44.5986037Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:38:44.5986721Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:38:44.5987383Z  at 
> org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:284)
> 2020-05-08T09:38:44.5988199Z  at 
> org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:199)
> 2020-05-08T09:38:44.5989080Z  at 
> org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
> 2020-05-08T09:38:44.5989914Z  at 
> org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
> 2020-05-08T09:38:44.5990781Z  at 
> akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
> 2020-05-08T09:38:44.5991545Z  at 
> akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
> 2020-05-08T09:38:44.5992368Z  at 
> scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
> 2020-05-08T09:38:44.5993122Z  at 
> akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
> 2020-05-08T09:38:44.5993826Z  at 
> scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
> 2020-05-08T09:38:44.5994473Z  at 
> scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
> 2020-05-08T09:38:44.5995161Z  at 
> scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
> 2020-05-08T09:38:44.5995794Z  at 
> akka.actor.Actor$class.aroundReceive(Actor.scala:517)
> 2020-05-08T09:38:44.5996445Z  at 
> akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
> 2020-05-08T09:38:44.5997053Z  at 
> akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
> 2020-05-08T09:38:44.5997658Z  at 
> akka.actor.ActorCell.invoke(ActorCell.scala:561)
> 2020-05-08T09:38:44.5998227Z  at 
> akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
> 2020-05-08T09:38:44.5998805Z  at akka.dispatch.Mailbox.run(Mailbox.scala:225)
> 2020-05-08T09:38:44.5999332Z  at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
> 2020-05-08T09:38:44.5999920Z  at 
> akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> 2020-05-08T09:38:44.6000703Z  at 
> akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
> 2020-05-08T09:38:44.6001454Z  at 
> akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> 2020-05-08T09:38:44.6002040Z  at 
> akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> 2020-05-08T09:38:44.6002538Z 
> 2020-05-08T09:38:44.6003181Z [ERROR] 
> testInsertIntoNonPartitionTable(org.apache.flink.connectors.hive.HiveTableSinkTest)
>   Time elapsed: 0.05 s  <<< ERROR!
> 2020-05-08T09:38:44.6004907Z 
> org.apache.flink.table.catalog.exceptions.TableAlreadyExistException: Table 
> (or view) default.dest already exists in Catalog test-catalog.
> 2020-05-08T09:38:44.6005671Z  at 
> org.apache.flink.table.catalog.hive.HiveCatalog.createTable(HiveCatalog.java:381)
> 2020-05-08T09:38:44.6006731Z  at 
> org.apache.flink.connectors.hive.HiveTableSinkTest.createHiveDestTable(HiveTableSinkTest.java:221)
> 2020-05-08T09:38:44.6007458Z  at 
> org.apache.flink.connectors.hive.HiveTableSinkTest.createHiveDestTable(HiveTableSinkTest.java:233)
> 2020-05-08T09:38:44.6008204Z  at 
> org.apache.flink.connectors.hive.HiveTableSinkTest.testInsertIntoNonPartitionTable(HiveTableSinkTest.java:91)
> 2020-05-08T09:38:44.6008828Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2020-05-08T09:38:44.6009371Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 2020-05-08T09:38:44.6010558Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:38:44.6011208Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:38:44.6011809Z  at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> 2020-05-08T09:38:44.6012473Z  at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> 2020-05-08T09:38:44.6013197Z  at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> 2020-05-08T09:38:44.6013851Z  at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> 2020-05-08T09:38:44.6014545Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runTestMethod(FlinkStandaloneHiveRunner.java:169)
> 2020-05-08T09:38:44.6015457Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:154)
> 2020-05-08T09:38:44.6016216Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:92)
> 2020-05-08T09:38:44.6017023Z  at 
> org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> 2020-05-08T09:38:44.6017701Z  at 
> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
> 2020-05-08T09:38:44.6018172Z  at 
> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
> 2020-05-08T09:38:44.6018703Z  at 
> org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
> 2020-05-08T09:38:44.6019284Z  at 
> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
> 2020-05-08T09:38:44.6019938Z  at 
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> 2020-05-08T09:38:44.6020713Z  at 
> org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
> 2020-05-08T09:38:44.6021472Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:38:44.6022069Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:38:44.6022685Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:38:44.6023296Z  at 
> org.junit.rules.RunRules.evaluate(RunRules.java:20)
> 2020-05-08T09:38:44.6023904Z  at 
> org.junit.runners.ParentRunner.run(ParentRunner.java:363)
> 2020-05-08T09:38:44.6024527Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
> 2020-05-08T09:38:44.6025235Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
> 2020-05-08T09:38:44.6025945Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
> 2020-05-08T09:38:44.6026745Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
> 2020-05-08T09:38:44.6027625Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
> 2020-05-08T09:38:44.6028430Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
> 2020-05-08T09:38:44.6029179Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
> 2020-05-08T09:38:44.6029875Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
> 2020-05-08T09:38:44.6030663Z Caused by: AlreadyExistsException(message:Table 
> dest already exists)
> 2020-05-08T09:38:44.6031755Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:42052)
> 2020-05-08T09:38:44.6033379Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:42038)
> 2020-05-08T09:38:44.6034566Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result.read(ThriftHiveMetastore.java:41964)
> 2020-05-08T09:38:44.6035344Z  at 
> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86)
> 2020-05-08T09:38:44.6036352Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_table_with_environment_context(ThriftHiveMetastore.java:1199)
> 2020-05-08T09:38:44.6037302Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_table_with_environment_context(ThriftHiveMetastore.java:1185)
> 2020-05-08T09:38:44.6038089Z  at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2399)
> 2020-05-08T09:38:44.6038849Z  at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:752)
> 2020-05-08T09:38:44.6039507Z  at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:740)
> 2020-05-08T09:38:44.6040073Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2020-05-08T09:38:44.6040752Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 2020-05-08T09:38:44.6041435Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:38:44.6041899Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:38:44.6042490Z  at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:169)
> 2020-05-08T09:38:44.6042926Z  at com.sun.proxy.$Proxy34.createTable(Unknown 
> Source)
> 2020-05-08T09:38:44.6043476Z  at 
> org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createTable(HiveMetastoreClientWrapper.java:142)
> 2020-05-08T09:38:44.6044191Z  at 
> org.apache.flink.table.catalog.hive.HiveCatalog.createTable(HiveCatalog.java:378)
> 2020-05-08T09:38:44.6044593Z  ... 34 more
> 2020-05-08T09:38:44.6044759Z 
> 2020-05-08T09:38:45.4944094Z [INFO] Running 
> org.apache.flink.connectors.hive.TableEnvHiveConnectorTest
> 2020-05-08T09:38:57.2455895Z OK
> 2020-05-08T09:38:57.5074904Z OK
> 2020-05-08T09:39:54.4485216Z WARNING: Hive-on-MR is deprecated in Hive 2 and 
> may not be available in the future versions. Consider using a different 
> execution engine (i.e. spark, tez) or using Hive 1.X releases.
> 2020-05-08T09:39:54.4486112Z Query ID = 
> agent02_azpcontainer_20200508093954_5b7c9f45-4c3c-4baa-9192-cec3724fd063
> 2020-05-08T09:39:54.4486420Z Total jobs = 3
> 2020-05-08T09:39:54.4486631Z Launching Job 1 out of 3
> 2020-05-08T09:39:54.4493419Z Number of reduce tasks is set to 0 since there's 
> no reduce operator
> 2020-05-08T09:39:54.6913698Z Job running in-process (local Hadoop)
> 2020-05-08T09:39:54.6931724Z 2020-05-08 09:39:54,692 Stage-1 map = 0%,  
> reduce = 0%
> 2020-05-08T09:39:54.7230751Z 2020-05-08 09:39:54,722 Stage-1 map = 100%,  
> reduce = 0%
> 2020-05-08T09:39:54.7268752Z Ended Job = job_local1575498928_0003
> 2020-05-08T09:39:54.7282475Z Stage-3 is selected by condition resolver.
> 2020-05-08T09:39:54.7283011Z Stage-2 is filtered out by condition resolver.
> 2020-05-08T09:39:54.7283698Z Stage-4 is filtered out by condition resolver.
> 2020-05-08T09:39:54.7293228Z Moving data to directory 
> file:/tmp/junit1597676277387724188/warehouse/db1.db/src1/.hive-staging_hive_2020-05-08_09-39-54_359_5169339369873513430-1/-ext-10000
> 2020-05-08T09:39:54.7619636Z Loading data to table db1.src1
> 2020-05-08T09:39:54.8385179Z MapReduce Jobs Launched: 
> 2020-05-08T09:39:54.8386671Z Stage-Stage-1:  HDFS Read: 0 HDFS Write: 0 
> SUCCESS
> 2020-05-08T09:39:54.8387003Z Total MapReduce CPU Time Spent: 0 msec
> 2020-05-08T09:39:54.8387337Z OK
> 2020-05-08T09:39:56.4934855Z OK
> 2020-05-08T09:39:56.5937839Z OK
> 2020-05-08T09:39:58.5593808Z OK
> 2020-05-08T09:39:58.6535469Z OK
> 2020-05-08T09:39:58.7554994Z OK
> 2020-05-08T09:39:58.7831835Z OK
> 2020-05-08T09:39:58.8160236Z OK
> 2020-05-08T09:39:58.8509517Z OK
> 2020-05-08T09:39:58.8633482Z Loading data to table db1.src
> 2020-05-08T09:39:58.9445381Z OK
> 2020-05-08T09:40:02.0134976Z FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask. Database db1 already exists
> 2020-05-08T09:40:02.0813385Z FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask. Database db1 already exists
> 2020-05-08T09:40:02.1006755Z FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask. Database db1 already exists
> 2020-05-08T09:40:02.1427929Z OK
> 2020-05-08T09:40:02.1877136Z [ERROR] Tests run: 21, Failures: 0, Errors: 4, 
> Skipped: 2, Time elapsed: 76.685 s <<< FAILURE! - in 
> org.apache.flink.connectors.hive.TableEnvHiveConnectorTest
> 2020-05-08T09:40:02.1878103Z [ERROR] 
> testDefaultPartitionName(org.apache.flink.connectors.hive.TableEnvHiveConnectorTest)
>   Time elapsed: 3.191 s  <<< ERROR!
> 2020-05-08T09:40:02.1878675Z java.util.concurrent.ExecutionException: 
> org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
> 2020-05-08T09:40:02.1879172Z  at 
> java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
> 2020-05-08T09:40:02.1879747Z  at 
> java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
> 2020-05-08T09:40:02.1880286Z  at 
> org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1663)
> 2020-05-08T09:40:02.1881369Z  at 
> org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:74)
> 2020-05-08T09:40:02.1881929Z  at 
> org.apache.flink.table.planner.delegation.ExecutorBase.execute(ExecutorBase.java:52)
> 2020-05-08T09:40:02.1882689Z  at 
> org.apache.flink.table.api.internal.TableEnvironmentImpl.execute(TableEnvironmentImpl.java:916)
> 2020-05-08T09:40:02.1883216Z  at 
> org.apache.flink.table.api.TableUtils.collectToList(TableUtils.java:85)
> 2020-05-08T09:40:02.1884413Z  at 
> org.apache.flink.connectors.hive.TableEnvHiveConnectorTest.testDefaultPartitionName(TableEnvHiveConnectorTest.java:106)
> 2020-05-08T09:40:02.1885239Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2020-05-08T09:40:02.1886075Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 2020-05-08T09:40:02.1886761Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:40:02.1887300Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:40:02.1887851Z  at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> 2020-05-08T09:40:02.1888403Z  at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> 2020-05-08T09:40:02.1888874Z  at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> 2020-05-08T09:40:02.1889500Z  at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> 2020-05-08T09:40:02.1890020Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runTestMethod(FlinkStandaloneHiveRunner.java:169)
> 2020-05-08T09:40:02.1894803Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:154)
> 2020-05-08T09:40:02.1895796Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:92)
> 2020-05-08T09:40:02.1896503Z  at 
> org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> 2020-05-08T09:40:02.1897131Z  at 
> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
> 2020-05-08T09:40:02.1897749Z  at 
> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
> 2020-05-08T09:40:02.1898405Z  at 
> org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
> 2020-05-08T09:40:02.1899264Z  at 
> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
> 2020-05-08T09:40:02.1899935Z  at 
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> 2020-05-08T09:40:02.1900618Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:40:02.1901425Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:40:02.1902191Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:40:02.1902841Z  at 
> org.junit.rules.RunRules.evaluate(RunRules.java:20)
> 2020-05-08T09:40:02.1903518Z  at 
> org.junit.runners.ParentRunner.run(ParentRunner.java:363)
> 2020-05-08T09:40:02.1904171Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
> 2020-05-08T09:40:02.1905065Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
> 2020-05-08T09:40:02.1905855Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
> 2020-05-08T09:40:02.1906654Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
> 2020-05-08T09:40:02.1907706Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
> 2020-05-08T09:40:02.1908493Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
> 2020-05-08T09:40:02.1909217Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
> 2020-05-08T09:40:02.1909955Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
> 2020-05-08T09:40:02.1910663Z Caused by: 
> org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
> 2020-05-08T09:40:02.1911517Z  at 
> org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:147)
> 2020-05-08T09:40:02.1912462Z  at 
> org.apache.flink.client.program.PerJobMiniClusterFactory$PerJobMiniClusterJobClient.lambda$getJobExecutionResult$2(PerJobMiniClusterFactory.java:179)
> 2020-05-08T09:40:02.1913762Z  at 
> java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
> 2020-05-08T09:40:02.1914637Z  at 
> java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
> 2020-05-08T09:40:02.1915388Z  at 
> java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
> 2020-05-08T09:40:02.1916119Z  at 
> java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
> 2020-05-08T09:40:02.1917085Z  at 
> org.apache.flink.runtime.rpc.akka.AkkaInvocationHandler.lambda$invokeRpc$0(AkkaInvocationHandler.java:229)
> 2020-05-08T09:40:02.1917933Z  at 
> java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
> 2020-05-08T09:40:02.1918701Z  at 
> java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
> 2020-05-08T09:40:02.1919480Z  at 
> java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
> 2020-05-08T09:40:02.1920131Z  at 
> java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
> 2020-05-08T09:40:02.1920772Z  at 
> org.apache.flink.runtime.concurrent.FutureUtils$1.onComplete(FutureUtils.java:890)
> 2020-05-08T09:40:02.1921625Z  at 
> akka.dispatch.OnComplete.internal(Future.scala:264)
> 2020-05-08T09:40:02.1922178Z  at 
> akka.dispatch.OnComplete.internal(Future.scala:261)
> 2020-05-08T09:40:02.1922718Z  at 
> akka.dispatch.japi$CallbackBridge.apply(Future.scala:191)
> 2020-05-08T09:40:02.1923375Z  at 
> akka.dispatch.japi$CallbackBridge.apply(Future.scala:188)
> 2020-05-08T09:40:02.1924014Z  at 
> scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
> 2020-05-08T09:40:02.1924841Z  at 
> org.apache.flink.runtime.concurrent.Executors$DirectExecutionContext.execute(Executors.java:74)
> 2020-05-08T09:40:02.1925627Z  at 
> scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
> 2020-05-08T09:40:02.1926319Z  at 
> scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
> 2020-05-08T09:40:02.1926996Z  at 
> akka.pattern.PromiseActorRef.$bang(AskSupport.scala:572)
> 2020-05-08T09:40:02.1927720Z  at 
> akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:22)
> 2020-05-08T09:40:02.1928567Z  at 
> akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:21)
> 2020-05-08T09:40:02.1929329Z  at 
> scala.concurrent.Future$$anonfun$andThen$1.apply(Future.scala:436)
> 2020-05-08T09:40:02.1929984Z  at 
> scala.concurrent.Future$$anonfun$andThen$1.apply(Future.scala:435)
> 2020-05-08T09:40:02.1930641Z  at 
> scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
> 2020-05-08T09:40:02.1931456Z  at 
> akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
> 2020-05-08T09:40:02.1932304Z  at 
> akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:91)
> 2020-05-08T09:40:02.1933210Z  at 
> akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
> 2020-05-08T09:40:02.1934044Z  at 
> akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
> 2020-05-08T09:40:02.1934899Z  at 
> scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
> 2020-05-08T09:40:02.1935581Z  at 
> akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:90)
> 2020-05-08T09:40:02.1936203Z  at 
> akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
> 2020-05-08T09:40:02.1936956Z  at 
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:44)
> 2020-05-08T09:40:02.1937707Z  at 
> akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> 2020-05-08T09:40:02.1938395Z  at 
> akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
> 2020-05-08T09:40:02.1939096Z  at 
> akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> 2020-05-08T09:40:02.1939777Z  at 
> akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> 2020-05-08T09:40:02.1940540Z Caused by: 
> org.apache.flink.runtime.JobException: Recovery is suppressed by 
> NoRestartBackoffTimeStrategy
> 2020-05-08T09:40:02.1941702Z  at 
> org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:112)
> 2020-05-08T09:40:02.1942721Z  at 
> org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:78)
> 2020-05-08T09:40:02.1943727Z  at 
> org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:189)
> 2020-05-08T09:40:02.1946503Z  at 
> org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:183)
> 2020-05-08T09:40:02.1947357Z  at 
> org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:177)
> 2020-05-08T09:40:02.1948239Z  at 
> org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:497)
> 2020-05-08T09:40:02.1949045Z  at 
> org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:384)
> 2020-05-08T09:40:02.1949693Z  at 
> sun.reflect.GeneratedMethodAccessor46.invoke(Unknown Source)
> 2020-05-08T09:40:02.1950329Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:40:02.1950963Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:40:02.1951755Z  at 
> org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:284)
> 2020-05-08T09:40:02.1952473Z  at 
> org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:199)
> 2020-05-08T09:40:02.1953417Z  at 
> org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
> 2020-05-08T09:40:02.1954212Z  at 
> org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
> 2020-05-08T09:40:02.1955031Z  at 
> akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
> 2020-05-08T09:40:02.1955651Z  at 
> akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
> 2020-05-08T09:40:02.1956318Z  at 
> scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
> 2020-05-08T09:40:02.1956946Z  at 
> akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
> 2020-05-08T09:40:02.1957628Z  at 
> scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
> 2020-05-08T09:40:02.1958290Z  at 
> scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
> 2020-05-08T09:40:02.1958959Z  at 
> scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
> 2020-05-08T09:40:02.1959598Z  at 
> akka.actor.Actor$class.aroundReceive(Actor.scala:517)
> 2020-05-08T09:40:02.1960197Z  at 
> akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
> 2020-05-08T09:40:02.1960845Z  at 
> akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
> 2020-05-08T09:40:02.1961540Z  at 
> akka.actor.ActorCell.invoke(ActorCell.scala:561)
> 2020-05-08T09:40:02.1962127Z  at 
> akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
> 2020-05-08T09:40:02.1962677Z  at akka.dispatch.Mailbox.run(Mailbox.scala:225)
> 2020-05-08T09:40:02.1963303Z  at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
> 2020-05-08T09:40:02.1963710Z  ... 4 more
> 2020-05-08T09:40:02.1964182Z Caused by: java.lang.OutOfMemoryError: unable to 
> create new native thread
> 2020-05-08T09:40:02.1964816Z  at java.lang.Thread.start0(Native Method)
> 2020-05-08T09:40:02.1965294Z  at java.lang.Thread.start(Thread.java:717)
> 2020-05-08T09:40:02.1965949Z  at 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTask.processInput(SourceStreamTask.java:133)
> 2020-05-08T09:40:02.1966867Z  at 
> org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxStep(MailboxProcessor.java:206)
> 2020-05-08T09:40:02.1967767Z  at 
> org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:196)
> 2020-05-08T09:40:02.1968639Z  at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:503)
> 2020-05-08T09:40:02.1969406Z  at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:483)
> 2020-05-08T09:40:02.1970296Z  at 
> org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:713)
> 2020-05-08T09:40:02.1970944Z  at 
> org.apache.flink.runtime.taskmanager.Task.run(Task.java:539)
> 2020-05-08T09:40:02.1971568Z  at java.lang.Thread.run(Thread.java:748)
> 2020-05-08T09:40:02.1971844Z 
> 2020-05-08T09:40:02.1972463Z [ERROR] 
> testInsertOverwrite(org.apache.flink.connectors.hive.TableEnvHiveConnectorTest)
>   Time elapsed: 0.116 s  <<< ERROR!
> 2020-05-08T09:40:02.1973835Z java.lang.IllegalArgumentException: Failed to 
> executeQuery Hive query create database db1: Error while processing 
> statement: FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask. Database db1 already exists
> 2020-05-08T09:40:02.1975322Z  at 
> com.klarna.hiverunner.HiveServerContainer.executeStatement(HiveServerContainer.java:143)
> 2020-05-08T09:40:02.1976209Z  at 
> com.klarna.hiverunner.builder.HiveShellBase.executeStatementsWithCommandShellEmulation(HiveShellBase.java:121)
> 2020-05-08T09:40:02.1977145Z  at 
> com.klarna.hiverunner.builder.HiveShellBase.executeScriptWithCommandShellEmulation(HiveShellBase.java:110)
> 2020-05-08T09:40:02.1977986Z  at 
> com.klarna.hiverunner.builder.HiveShellBase.execute(HiveShellBase.java:129)
> 2020-05-08T09:40:02.1978847Z  at 
> org.apache.flink.connectors.hive.TableEnvHiveConnectorTest.testInsertOverwrite(TableEnvHiveConnectorTest.java:238)
> 2020-05-08T09:40:02.1979610Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2020-05-08T09:40:02.1980251Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 2020-05-08T09:40:02.1981013Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:40:02.1981814Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:40:02.1982562Z  at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> 2020-05-08T09:40:02.1983409Z  at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> 2020-05-08T09:40:02.1984113Z  at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> 2020-05-08T09:40:02.1984996Z  at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> 2020-05-08T09:40:02.1985742Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runTestMethod(FlinkStandaloneHiveRunner.java:169)
> 2020-05-08T09:40:02.1986654Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:154)
> 2020-05-08T09:40:02.1987532Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:92)
> 2020-05-08T09:40:02.1988310Z  at 
> org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> 2020-05-08T09:40:02.1988950Z  at 
> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
> 2020-05-08T09:40:02.1989594Z  at 
> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
> 2020-05-08T09:40:02.1990247Z  at 
> org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
> 2020-05-08T09:40:02.1990883Z  at 
> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
> 2020-05-08T09:40:02.1991751Z  at 
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> 2020-05-08T09:40:02.1992455Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:40:02.1993245Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:40:02.1993917Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:40:02.1994676Z  at 
> org.junit.rules.RunRules.evaluate(RunRules.java:20)
> 2020-05-08T09:40:02.1995261Z  at 
> org.junit.runners.ParentRunner.run(ParentRunner.java:363)
> 2020-05-08T09:40:02.1995958Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
> 2020-05-08T09:40:02.1996725Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
> 2020-05-08T09:40:02.1997542Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
> 2020-05-08T09:40:02.1998499Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
> 2020-05-08T09:40:02.1999330Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
> 2020-05-08T09:40:02.2000177Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
> 2020-05-08T09:40:02.2000934Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
> 2020-05-08T09:40:02.2001791Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
> 2020-05-08T09:40:02.2002957Z Caused by: 
> org.apache.hive.service.cli.HiveSQLException: Error while processing 
> statement: FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask. Database db1 already exists
> 2020-05-08T09:40:02.2004116Z  at 
> org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:380)
> 2020-05-08T09:40:02.2005005Z  at 
> org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:257)
> 2020-05-08T09:40:02.2005745Z  at 
> org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:293)
> 2020-05-08T09:40:02.2006515Z  at 
> org.apache.hive.service.cli.operation.Operation.run(Operation.java:320)
> 2020-05-08T09:40:02.2007317Z  at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:530)
> 2020-05-08T09:40:02.2008209Z  at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:500)
> 2020-05-08T09:40:02.2008976Z  at 
> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:265)
> 2020-05-08T09:40:02.2009739Z  at 
> com.klarna.hiverunner.HiveServerContainer.executeStatement(HiveServerContainer.java:116)
> 2020-05-08T09:40:02.2010277Z  ... 34 more
> 2020-05-08T09:40:02.2010807Z Caused by: 
> org.apache.hadoop.hive.ql.metadata.HiveException: Database db1 already exists
> 2020-05-08T09:40:02.2011697Z  at 
> org.apache.hadoop.hive.ql.exec.DDLTask.createDatabase(DDLTask.java:4247)
> 2020-05-08T09:40:02.2012387Z  at 
> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:319)
> 2020-05-08T09:40:02.2013040Z  at 
> org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)
> 2020-05-08T09:40:02.2013834Z  at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
> 2020-05-08T09:40:02.2014628Z  at 
> org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183)
> 2020-05-08T09:40:02.2015266Z  at 
> org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839)
> 2020-05-08T09:40:02.2015878Z  at 
> org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526)
> 2020-05-08T09:40:02.2016484Z  at 
> org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
> 2020-05-08T09:40:02.2017073Z  at 
> org.apache.hadoop.hive.ql.Driver.run(Driver.java:1232)
> 2020-05-08T09:40:02.2017723Z  at 
> org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:255)
> 2020-05-08T09:40:02.2018263Z  ... 40 more
> 2020-05-08T09:40:02.2018703Z Caused by: 
> AlreadyExistsException(message:Database db1 already exists)
> 2020-05-08T09:40:02.2019631Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result$create_database_resultStandardScheme.read(ThriftHiveMetastore.java:26487)
> 2020-05-08T09:40:02.2020824Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result$create_database_resultStandardScheme.read(ThriftHiveMetastore.java:26473)
> 2020-05-08T09:40:02.2022044Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result.read(ThriftHiveMetastore.java:26407)
> 2020-05-08T09:40:02.2022859Z  at 
> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86)
> 2020-05-08T09:40:02.2023761Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_database(ThriftHiveMetastore.java:749)
> 2020-05-08T09:40:02.2024792Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_database(ThriftHiveMetastore.java:736)
> 2020-05-08T09:40:02.2025682Z  at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createDatabase(HiveMetaStoreClient.java:727)
> 2020-05-08T09:40:02.2026541Z  at 
> sun.reflect.GeneratedMethodAccessor56.invoke(Unknown Source)
> 2020-05-08T09:40:02.2027193Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:40:02.2027866Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:40:02.2028536Z  at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
> 2020-05-08T09:40:02.2029209Z  at 
> com.sun.proxy.$Proxy33.createDatabase(Unknown Source)
> 2020-05-08T09:40:02.2029824Z  at 
> sun.reflect.GeneratedMethodAccessor56.invoke(Unknown Source)
> 2020-05-08T09:40:02.2030493Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:40:02.2031244Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:40:02.2031998Z  at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2330)
> 2020-05-08T09:40:02.2032716Z  at 
> com.sun.proxy.$Proxy33.createDatabase(Unknown Source)
> 2020-05-08T09:40:02.2033414Z  at 
> org.apache.hadoop.hive.ql.metadata.Hive.createDatabase(Hive.java:427)
> 2020-05-08T09:40:02.2034174Z  at 
> org.apache.hadoop.hive.ql.exec.DDLTask.createDatabase(DDLTask.java:4243)
> 2020-05-08T09:40:02.2034959Z  ... 49 more
> 2020-05-08T09:40:02.2035232Z 
> 2020-05-08T09:40:02.2036012Z [ERROR] 
> testDifferentFormats(org.apache.flink.connectors.hive.TableEnvHiveConnectorTest)
>   Time elapsed: 0.031 s  <<< ERROR!
> 2020-05-08T09:40:02.2037188Z java.lang.IllegalArgumentException: Failed to 
> executeQuery Hive query create database db1: Error while processing 
> statement: FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask. Database db1 already exists
> 2020-05-08T09:40:02.2038329Z  at 
> com.klarna.hiverunner.HiveServerContainer.executeStatement(HiveServerContainer.java:143)
> 2020-05-08T09:40:02.2039176Z  at 
> com.klarna.hiverunner.builder.HiveShellBase.executeStatementsWithCommandShellEmulation(HiveShellBase.java:121)
> 2020-05-08T09:40:02.2040066Z  at 
> com.klarna.hiverunner.builder.HiveShellBase.executeScriptWithCommandShellEmulation(HiveShellBase.java:110)
> 2020-05-08T09:40:02.2040964Z  at 
> com.klarna.hiverunner.builder.HiveShellBase.execute(HiveShellBase.java:129)
> 2020-05-08T09:40:02.2042010Z  at 
> org.apache.flink.connectors.hive.TableEnvHiveConnectorTest.readWriteFormat(TableEnvHiveConnectorTest.java:142)
> 2020-05-08T09:40:02.2042941Z  at 
> org.apache.flink.connectors.hive.TableEnvHiveConnectorTest.testDifferentFormats(TableEnvHiveConnectorTest.java:135)
> 2020-05-08T09:40:02.2043764Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2020-05-08T09:40:02.2044514Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 2020-05-08T09:40:02.2045198Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:40:02.2045812Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:40:02.2046423Z  at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> 2020-05-08T09:40:02.2047134Z  at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> 2020-05-08T09:40:02.2047860Z  at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> 2020-05-08T09:40:02.2048579Z  at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> 2020-05-08T09:40:02.2049372Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runTestMethod(FlinkStandaloneHiveRunner.java:169)
> 2020-05-08T09:40:02.2050242Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:154)
> 2020-05-08T09:40:02.2051230Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:92)
> 2020-05-08T09:40:02.2051916Z  at 
> org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> 2020-05-08T09:40:02.2052516Z  at 
> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
> 2020-05-08T09:40:02.2053144Z  at 
> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
> 2020-05-08T09:40:02.2054136Z  at 
> org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
> 2020-05-08T09:40:02.2055027Z  at 
> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
> 2020-05-08T09:40:02.2055775Z  at 
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> 2020-05-08T09:40:02.2056731Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:40:02.2057369Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:40:02.2058109Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:40:02.2058580Z  at 
> org.junit.rules.RunRules.evaluate(RunRules.java:20)
> 2020-05-08T09:40:02.2059127Z  at 
> org.junit.runners.ParentRunner.run(ParentRunner.java:363)
> 2020-05-08T09:40:02.2059523Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
> 2020-05-08T09:40:02.2060161Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
> 2020-05-08T09:40:02.2060798Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
> 2020-05-08T09:40:02.2061500Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
> 2020-05-08T09:40:02.2062191Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
> 2020-05-08T09:40:02.2062826Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
> 2020-05-08T09:40:02.2063446Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
> 2020-05-08T09:40:02.2063921Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
> 2020-05-08T09:40:02.2064783Z Caused by: 
> org.apache.hive.service.cli.HiveSQLException: Error while processing 
> statement: FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask. Database db1 already exists
> 2020-05-08T09:40:02.2065630Z  at 
> org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:380)
> 2020-05-08T09:40:02.2066238Z  at 
> org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:257)
> 2020-05-08T09:40:02.2066729Z  at 
> org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:293)
> 2020-05-08T09:40:02.2067415Z  at 
> org.apache.hive.service.cli.operation.Operation.run(Operation.java:320)
> 2020-05-08T09:40:02.2068080Z  at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:530)
> 2020-05-08T09:40:02.2068798Z  at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:500)
> 2020-05-08T09:40:02.2069312Z  at 
> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:265)
> 2020-05-08T09:40:02.2069907Z  at 
> com.klarna.hiverunner.HiveServerContainer.executeStatement(HiveServerContainer.java:116)
> 2020-05-08T09:40:02.2070342Z  ... 35 more
> 2020-05-08T09:40:02.2070713Z Caused by: 
> org.apache.hadoop.hive.ql.metadata.HiveException: Database db1 already exists
> 2020-05-08T09:40:02.2071379Z  at 
> org.apache.hadoop.hive.ql.exec.DDLTask.createDatabase(DDLTask.java:4247)
> 2020-05-08T09:40:02.2071954Z  at 
> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:319)
> 2020-05-08T09:40:02.2072449Z  at 
> org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)
> 2020-05-08T09:40:02.2073014Z  at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
> 2020-05-08T09:40:02.2073720Z  at 
> org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183)
> 2020-05-08T09:40:02.2074206Z  at 
> org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839)
> 2020-05-08T09:40:02.2074833Z  at 
> org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526)
> 2020-05-08T09:40:02.2075300Z  at 
> org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
> 2020-05-08T09:40:02.2075837Z  at 
> org.apache.hadoop.hive.ql.Driver.run(Driver.java:1232)
> 2020-05-08T09:40:02.2076320Z  at 
> org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:255)
> 2020-05-08T09:40:02.2076713Z  ... 41 more
> 2020-05-08T09:40:02.2077221Z Caused by: 
> AlreadyExistsException(message:Database db1 already exists)
> 2020-05-08T09:40:02.2078000Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result$create_database_resultStandardScheme.read(ThriftHiveMetastore.java:26487)
> 2020-05-08T09:40:02.2078840Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result$create_database_resultStandardScheme.read(ThriftHiveMetastore.java:26473)
> 2020-05-08T09:40:02.2079803Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result.read(ThriftHiveMetastore.java:26407)
> 2020-05-08T09:40:02.2080296Z  at 
> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86)
> 2020-05-08T09:40:02.2080897Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_database(ThriftHiveMetastore.java:749)
> 2020-05-08T09:40:02.2081643Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_database(ThriftHiveMetastore.java:736)
> 2020-05-08T09:40:02.2082363Z  at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createDatabase(HiveMetaStoreClient.java:727)
> 2020-05-08T09:40:02.2082848Z  at 
> sun.reflect.GeneratedMethodAccessor56.invoke(Unknown Source)
> 2020-05-08T09:40:02.2083299Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:40:02.2083946Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:40:02.2084542Z  at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
> 2020-05-08T09:40:02.2084952Z  at 
> com.sun.proxy.$Proxy33.createDatabase(Unknown Source)
> 2020-05-08T09:40:02.2085384Z  at 
> sun.reflect.GeneratedMethodAccessor56.invoke(Unknown Source)
> 2020-05-08T09:40:02.2086078Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:40:02.2086655Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:40:02.2087354Z  at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2330)
> 2020-05-08T09:40:02.2087790Z  at 
> com.sun.proxy.$Proxy33.createDatabase(Unknown Source)
> 2020-05-08T09:40:02.2088356Z  at 
> org.apache.hadoop.hive.ql.metadata.Hive.createDatabase(Hive.java:427)
> 2020-05-08T09:40:02.2088784Z  at 
> org.apache.hadoop.hive.ql.exec.DDLTask.createDatabase(DDLTask.java:4243)
> 2020-05-08T09:40:02.2089306Z  ... 50 more
> 2020-05-08T09:40:02.2089426Z 
> 2020-05-08T09:40:02.2089939Z [ERROR] 
> testUpdatePartitionSD(org.apache.flink.connectors.hive.TableEnvHiveConnectorTest)
>   Time elapsed: 0.015 s  <<< ERROR!
> 2020-05-08T09:40:02.2090950Z java.lang.IllegalArgumentException: Failed to 
> executeQuery Hive query create database db1: Error while processing 
> statement: FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask. Database db1 already exists
> 2020-05-08T09:40:02.2091981Z  at 
> com.klarna.hiverunner.HiveServerContainer.executeStatement(HiveServerContainer.java:143)
> 2020-05-08T09:40:02.2092712Z  at 
> com.klarna.hiverunner.builder.HiveShellBase.executeStatementsWithCommandShellEmulation(HiveShellBase.java:121)
> 2020-05-08T09:40:02.2093480Z  at 
> com.klarna.hiverunner.builder.HiveShellBase.executeScriptWithCommandShellEmulation(HiveShellBase.java:110)
> 2020-05-08T09:40:02.2094185Z  at 
> com.klarna.hiverunner.builder.HiveShellBase.execute(HiveShellBase.java:129)
> 2020-05-08T09:40:02.2094909Z  at 
> org.apache.flink.connectors.hive.TableEnvHiveConnectorTest.testUpdatePartitionSD(TableEnvHiveConnectorTest.java:588)
> 2020-05-08T09:40:02.2095525Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2020-05-08T09:40:02.2095998Z  at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 2020-05-08T09:40:02.2096642Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:40:02.2097088Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:40:02.2097619Z  at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> 2020-05-08T09:40:02.2098340Z  at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> 2020-05-08T09:40:02.2098795Z  at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> 2020-05-08T09:40:02.2099326Z  at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> 2020-05-08T09:40:02.2099978Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runTestMethod(FlinkStandaloneHiveRunner.java:169)
> 2020-05-08T09:40:02.2100828Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:154)
> 2020-05-08T09:40:02.2101591Z  at 
> org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:92)
> 2020-05-08T09:40:02.2102040Z  at 
> org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> 2020-05-08T09:40:02.2102668Z  at 
> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
> 2020-05-08T09:40:02.2103233Z  at 
> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
> 2020-05-08T09:40:02.2103755Z  at 
> org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
> 2020-05-08T09:40:02.2104461Z  at 
> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
> 2020-05-08T09:40:02.2105023Z  at 
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> 2020-05-08T09:40:02.2105600Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:40:02.2106187Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:40:02.2106792Z  at 
> org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> 2020-05-08T09:40:02.2107217Z  at 
> org.junit.rules.RunRules.evaluate(RunRules.java:20)
> 2020-05-08T09:40:02.2107586Z  at 
> org.junit.runners.ParentRunner.run(ParentRunner.java:363)
> 2020-05-08T09:40:02.2108205Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
> 2020-05-08T09:40:02.2108794Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
> 2020-05-08T09:40:02.2109434Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
> 2020-05-08T09:40:02.2109943Z  at 
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
> 2020-05-08T09:40:02.2110695Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
> 2020-05-08T09:40:02.2111342Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
> 2020-05-08T09:40:02.2111801Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
> 2020-05-08T09:40:02.2112443Z  at 
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
> 2020-05-08T09:40:02.2113300Z Caused by: 
> org.apache.hive.service.cli.HiveSQLException: Error while processing 
> statement: FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask. Database db1 already exists
> 2020-05-08T09:40:02.2114079Z  at 
> org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:380)
> 2020-05-08T09:40:02.2114738Z  at 
> org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:257)
> 2020-05-08T09:40:02.2115208Z  at 
> org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:293)
> 2020-05-08T09:40:02.2115635Z  at 
> org.apache.hive.service.cli.operation.Operation.run(Operation.java:320)
> 2020-05-08T09:40:02.2116251Z  at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:530)
> 2020-05-08T09:40:02.2117029Z  at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:500)
> 2020-05-08T09:40:02.2117612Z  at 
> org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:265)
> 2020-05-08T09:40:02.2118212Z  at 
> com.klarna.hiverunner.HiveServerContainer.executeStatement(HiveServerContainer.java:116)
> 2020-05-08T09:40:02.2118692Z  ... 34 more
> 2020-05-08T09:40:02.2119010Z Caused by: 
> org.apache.hadoop.hive.ql.metadata.HiveException: Database db1 already exists
> 2020-05-08T09:40:02.2119697Z  at 
> org.apache.hadoop.hive.ql.exec.DDLTask.createDatabase(DDLTask.java:4247)
> 2020-05-08T09:40:02.2120268Z  at 
> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:319)
> 2020-05-08T09:40:02.2120643Z  at 
> org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)
> 2020-05-08T09:40:02.2121352Z  at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
> 2020-05-08T09:40:02.2121812Z  at 
> org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183)
> 2020-05-08T09:40:02.2122461Z  at 
> org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839)
> 2020-05-08T09:40:02.2122824Z  at 
> org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526)
> 2020-05-08T09:40:02.2123422Z  at 
> org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
> 2020-05-08T09:40:02.2123756Z  at 
> org.apache.hadoop.hive.ql.Driver.run(Driver.java:1232)
> 2020-05-08T09:40:02.2124548Z  at 
> org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:255)
> 2020-05-08T09:40:02.2125000Z  ... 40 more
> 2020-05-08T09:40:02.2125286Z Caused by: 
> AlreadyExistsException(message:Database db1 already exists)
> 2020-05-08T09:40:02.2125956Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result$create_database_resultStandardScheme.read(ThriftHiveMetastore.java:26487)
> 2020-05-08T09:40:02.2126801Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result$create_database_resultStandardScheme.read(ThriftHiveMetastore.java:26473)
> 2020-05-08T09:40:02.2127553Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result.read(ThriftHiveMetastore.java:26407)
> 2020-05-08T09:40:02.2128061Z  at 
> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86)
> 2020-05-08T09:40:02.2128553Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_database(ThriftHiveMetastore.java:749)
> 2020-05-08T09:40:02.2129172Z  at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_database(ThriftHiveMetastore.java:736)
> 2020-05-08T09:40:02.2129821Z  at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createDatabase(HiveMetaStoreClient.java:727)
> 2020-05-08T09:40:02.2130291Z  at 
> sun.reflect.GeneratedMethodAccessor56.invoke(Unknown Source)
> 2020-05-08T09:40:02.2130753Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:40:02.2131444Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:40:02.2131903Z  at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
> 2020-05-08T09:40:02.2132399Z  at 
> com.sun.proxy.$Proxy33.createDatabase(Unknown Source)
> 2020-05-08T09:40:02.2132829Z  at 
> sun.reflect.GeneratedMethodAccessor56.invoke(Unknown Source)
> 2020-05-08T09:40:02.2133338Z  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-05-08T09:40:02.2133848Z  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2020-05-08T09:40:02.2134353Z  at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2330)
> 2020-05-08T09:40:02.2134876Z  at 
> com.sun.proxy.$Proxy33.createDatabase(Unknown Source)
> 2020-05-08T09:40:02.2135338Z  at 
> org.apache.hadoop.hive.ql.metadata.Hive.createDatabase(Hive.java:427)
> 2020-05-08T09:40:02.2135847Z  at 
> org.apache.hadoop.hive.ql.exec.DDLTask.createDatabase(DDLTask.java:4243)
> 2020-05-08T09:40:02.2136195Z  ... 49 more
> 2020-05-08T09:40:02.2136313Z 
> 2020-05-08T09:40:03.2246042Z [INFO] Running 
> org.apache.flink.connectors.hive.HiveDialectTest
> 2020-05-08T09:40:16.1437654Z [INFO] Tests run: 2, Failures: 0, Errors: 0, 
> Skipped: 0, Time elapsed: 12.914 s - in 
> org.apache.flink.connectors.hive.HiveDialectTest
> 2020-05-08T09:40:16.5569091Z [INFO] 
> 2020-05-08T09:40:16.5569441Z [INFO] Results:
> 2020-05-08T09:40:16.5569645Z [INFO] 
> 2020-05-08T09:40:16.5569962Z [ERROR] Errors: 
> 2020-05-08T09:40:16.5571473Z [ERROR]   
> HiveTableSinkTest.testInsertIntoNonPartitionTable:91->createHiveDestTable:233->createHiveDestTable:221
>  » TableAlreadyExist
> 2020-05-08T09:40:16.5572957Z [ERROR]   
> HiveTableSinkTest.testWriteComplexType:143 » Execution 
> org.apache.flink.runtim...
> 2020-05-08T09:40:16.5573995Z [ERROR]   
> TableEnvHiveConnectorTest.testDefaultPartitionName:106 » Execution 
> org.apache....
> 2020-05-08T09:40:16.5574692Z [ERROR]   
> TableEnvHiveConnectorTest.testDifferentFormats:135->readWriteFormat:142 » 
> IllegalArgument
> 2020-05-08T09:40:16.5575454Z [ERROR]   
> TableEnvHiveConnectorTest.testInsertOverwrite:238 » IllegalArgument Failed 
> to ...
> 2020-05-08T09:40:16.5576562Z [ERROR]   
> TableEnvHiveConnectorTest.testUpdatePartitionSD:588 » IllegalArgument Failed 
> t...
> {code}
> instance: 
> [https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_apis/build/builds/806/logs/127]
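The repeated "Database db1 already exists" errors above suggest a test-isolation problem: once one test aborts (here, after the `OutOfMemoryError: unable to create new native thread`), the shared `db1` database is never dropped, so every later test that runs `create database db1` fails. A minimal, hedged sketch of a self-healing setup is below; the class and method names are illustrative, and the statements would be fed to HiveRunner's `hiveShell.execute(...)` as seen in the stack traces, which is an assumption about how the tests would adopt it:

```java
// Hedged sketch: make per-test database setup idempotent so residue from an
// aborted run cannot fail subsequent tests. IdempotentDbSetup and
// recreateStatements are illustrative names, not part of the Flink test code.
class IdempotentDbSetup {
    // Returns DDL that first clears any leftover database (including its
    // tables, via CASCADE), then recreates it, so each test starts clean.
    static String[] recreateStatements(String dbName) {
        return new String[] {
            "drop database if exists " + dbName + " cascade",
            "create database " + dbName
        };
    }
}
```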



--
This message was sent by Atlassian Jira
(v8.3.4#803005)