RrazzmatazZ opened a new issue, #5164:
URL: https://github.com/apache/kyuubi/issues/5164

   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   
   
   ### Search before asking
   
   - [X] I have searched in the 
[issues](https://github.com/apache/kyuubi/issues?q=is%3Aissue) and found no 
similar issues.
   
   
   ### Describe the bug
   
   When I create a table on Hudi:
   ```
   CREATE TABLE hudi_cow_nonpcf_tbl (
     uuid INT,
     name STRING,
     price DOUBLE
   ) USING HUDI;
   ```
   
   I get this error:
   ```
   23/08/15 16:53:29 ERROR Schema: Failed initialising database.
   Unable to open a test connection to the given database. JDBC url = 
jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating 
connection pool (set lazyInit to true if you expect to start your database 
after your app). Original Exception: ------
   java.sql.SQLException: Failed to start database 'metastore_db' with class 
loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@2222ea4, 
see the next exception for details.
           at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
           at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
           at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
           at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown 
Source)
           at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
           at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
           at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
           at java.security.AccessController.doPrivileged(Native Method)
           at 
org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
           at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
           at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
           at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
           at java.sql.DriverManager.getConnection(DriverManager.java:664)
           at java.sql.DriverManager.getConnection(DriverManager.java:208)
           at 
com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
           at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
           at 
com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
           at 
org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
           at 
org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:297)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
Method)
           at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at 
org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
           at 
org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
           at 
org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
           at 
org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:422)
           at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:817)
           at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:334)
           at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:213)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at javax.jdo.JDOHelper$16.run(JDOHelper.java:1975)
           at java.security.AccessController.doPrivileged(Native Method)
           at javax.jdo.JDOHelper.invoke(JDOHelper.java:1970)
           at 
javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1177)
           at 
javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:814)
           at 
javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:702)
           at 
org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:521)
           at 
org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:550)
           at 
org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:405)
           at 
org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:342)
           at 
org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:303)
           at 
org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:79)
           at 
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:139)
           at 
org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
           at 
org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
           at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:628)
           at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:594)
           at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:588)
           at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:655)
           at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:431)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at 
org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
           at 
org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
           at 
org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:79)
           at 
org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:92)
           at 
org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6902)
           at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:162)
           at 
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
Method)
           at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at 
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1740)
           at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
           at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
           at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
           at 
org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3607)
           at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3659)
           at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3639)
           at 
org.apache.hadoop.hive.ql.session.SessionState.setAuthorizerV2Config(SessionState.java:917)
           at 
org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:881)
           at 
org.apache.hadoop.hive.ql.session.SessionState.getAuthenticator(SessionState.java:1483)
           at 
org.apache.hadoop.hive.ql.session.SessionState.getUserFromAuthenticator(SessionState.java:1154)
           at 
org.apache.hadoop.hive.ql.metadata.Table.getEmptyTable(Table.java:180)
           at org.apache.hadoop.hive.ql.metadata.Table.<init>(Table.java:122)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl$.toHiveTable(HiveClientImpl.scala:1057)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$createTable$1(HiveClientImpl.scala:555)
           at 
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:305)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:236)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:235)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:285)
           at 
org.apache.spark.sql.hive.client.HiveClientImpl.createTable(HiveClientImpl.scala:553)
           at 
org.apache.spark.sql.hudi.command.CreateHoodieTableCommand$.createHiveDataSourceTable(CreateHoodieTableCommand.scala:195)
           at 
org.apache.spark.sql.hudi.command.CreateHoodieTableCommand$.createTableInCatalog(CreateHoodieTableCommand.scala:166)
           at 
org.apache.spark.sql.hudi.command.CreateHoodieTableCommand.run(CreateHoodieTableCommand.scala:83)
           at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
           at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
           at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
           at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:97)
           at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
           at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
           at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
           at 
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
           at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:97)
           at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:93)
           at 
org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
           at 
org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
           at 
org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
           at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
           at 
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
           at 
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
           at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at 
org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
           at 
org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:93)
           at 
org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:80)
           at 
org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:78)
           at org.apache.spark.sql.Dataset.<init>(Dataset.scala:219)
           at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
           at 
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
           at 
org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
           at 
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
           at 
org.apache.kyuubi.engine.spark.operation.ExecuteStatement.$anonfun$executeStatement$1(ExecuteStatement.scala:83)
           at 
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
           at 
org.apache.kyuubi.engine.spark.operation.SparkOperation.$anonfun$withLocalProperties$1(SparkOperation.scala:155)
           at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
           at 
org.apache.kyuubi.engine.spark.operation.SparkOperation.withLocalProperties(SparkOperation.scala:139)
           at 
org.apache.kyuubi.engine.spark.operation.ExecuteStatement.executeStatement(ExecuteStatement.scala:78)
           at 
org.apache.kyuubi.engine.spark.operation.ExecuteStatement$$anon$1.run(ExecuteStatement.scala:100)
           at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class 
loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@2222ea4, 
see the next exception for details.
           at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
           at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
           ... 135 more
   Caused by: ERROR XSDB6: Another instance of Derby may have already booted 
the database /opt/apache-kyuubi-1.7.1-bin/work/hudi/metastore_db.
           at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
           at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
           at 
org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown
 Source)
           at 
org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
           at java.security.AccessController.doPrivileged(Native Method)
           at 
org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown
 Source)
           at 
org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
           at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown 
Source)
           at 
org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
           at 
org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
           at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
           at java.security.AccessController.doPrivileged(Native Method)
           at 
org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
           at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
           at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown 
Source)
           at 
org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
           at 
org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
           at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown 
Source)
           at java.security.AccessController.doPrivileged(Native Method)
           at 
org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown 
Source)
           at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown 
Source)
           at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown 
Source)
           at 
org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
           at 
org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
           at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
           at java.security.AccessController.doPrivileged(Native Method)
           at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown 
Source)
           at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
           at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
           at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown 
Source)
           at 
org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown 
Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown
 Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown
 Source)
           at 
org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown 
Source)
           at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
           at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
           at java.security.AccessController.doPrivileged(Native Method)
           at 
org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown 
Source)
           ... 132 more
   ```
   I didn't set **hive.metastore.uris** in my Spark conf. I wonder whether a Hive environment is required.
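
   For context on the `XSDB6` error above: the embedded Derby metastore allows only a single JVM to hold the database lock, so a second engine (or a leftover lock from a killed one) under the same work directory will fail to boot it. A common way to sidestep embedded Derby entirely is to point the Spark engine at a standalone Hive Metastore; a minimal sketch, assuming a hypothetical HMS at `metastore-host:9083`:
   ```properties
   # Hypothetical example: use a remote Hive Metastore instead of the
   # per-engine embedded Derby database (host and port are placeholders).
   spark.hadoop.hive.metastore.uris=thrift://metastore-host:9083
   ```
   With this set, each engine talks to the shared metastore service and no Derby lock is taken in the engine work directory.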
   
   
   
   ### Affects Version(s)
   
   1.7.1/1.7.0
   
   ### Kyuubi Server Log Output
   
   ```log
   2023-08-15 16:53:28.098 INFO org.apache.kyuubi.operation.ExecuteStatement: 
Processing hudi's query[6df1e6ac-e4d3-4383-82d2-1c6691c91a8d]: PENDING_STATE -> 
RUNNING_STATE, statement:
   CREATE TABLE hudi_cow_nonpcf_tbl (
     uuid INT,
     name STRING,
     price DOUBLE
   ) USING HUDI
   2023-08-15 16:53:33.106 INFO org.apache.kyuubi.operation.ExecuteStatement: 
Query[6df1e6ac-e4d3-4383-82d2-1c6691c91a8d] in RUNNING_STATE
   2023-08-15 16:53:38.113 INFO org.apache.kyuubi.operation.ExecuteStatement: 
Query[6df1e6ac-e4d3-4383-82d2-1c6691c91a8d] in RUNNING_STATE
   2023-08-15 16:53:43.119 INFO org.apache.kyuubi.operation.ExecuteStatement: 
Query[6df1e6ac-e4d3-4383-82d2-1c6691c91a8d] in RUNNING_STATE
   2023-08-15 16:53:43.122 INFO org.apache.kyuubi.operation.ExecuteStatement: 
Processing hudi's query[6df1e6ac-e4d3-4383-82d2-1c6691c91a8d]: RUNNING_STATE -> 
CANCELED_STATE, time taken: 15.024 seconds
   2023-08-15 16:53:48.124 INFO org.apache.kyuubi.operation.ExecuteStatement: 
Query[6df1e6ac-e4d3-4383-82d2-1c6691c91a8d] in RUNNING_STATE
   2023-08-15 16:53:48.135 INFO 
org.apache.kyuubi.client.KyuubiSyncThriftClient: 
TCancelOperationReq(operationHandle:TOperationHandle(operationId:THandleIdentifier(guid:60
 7D A3 0F DD C1 40 BD B9 CD A8 8E C9 76 99 8C, secret:C2 EE 5B 97 3E A0 41 FC 
AC 16 9B D7 08 ED 8F 38), operationType:EXECUTE_STATEMENT, hasResultSet:true)) 
succeed on engine side
   2023-08-15 16:53:48.143 INFO org.apache.kyuubi.operation.ExecuteStatement: 
Query[6df1e6ac-e4d3-4383-82d2-1c6691c91a8d] in CANCELED_STATE
   2023-08-15 16:53:48.153 WARN org.apache.kyuubi.operation.ExecuteStatement: 
Ignore exception in terminal state with 6df1e6ac-e4d3-4383-82d2-1c6691c91a8d
   ```
   
   
   ### Kyuubi Engine Log Output
   
   ```log
   Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class 
loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@2222ea4, 
see the next exception for details.
           at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
           at 
org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown
 Source)
           ... 133 more
   Caused by: ERROR XSDB6: Another instance of Derby may have already booted 
the database /opt/apache-kyuubi-1.7.1-bin/work/hudi/metastore_db.
           at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
           at 
org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
           at 
org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown
 Source)
           at 
org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
           at java.security.AccessController.doPrivileged(Native Method)
           at 
org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown
 Source)
           at 
org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
           at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown 
Source)
           at 
org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
           at 
org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
           at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
           at java.security.AccessController.doPrivileged(Native Method)
           at 
org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
           at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
           at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown 
Source)
           at 
org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
           at 
org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
           at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown 
Source)
           at java.security.AccessController.doPrivileged(Native Method)
           at 
org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown 
Source)
           at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown 
Source)
           at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown 
Source)
           at 
org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
           at 
org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
           at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
           at java.security.AccessController.doPrivileged(Native Method)
           at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown 
Source)
           at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
           at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
           at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown 
Source)
           at 
org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown 
Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown
 Source)
           at 
org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown
 Source)
           at 
org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown 
Source)
           at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
           at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
           at java.security.AccessController.doPrivileged(Native Method)
           at 
org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown 
Source)
           ... 130 more
   ```
   
   
   ### Kyuubi Server Configurations
   
   ```yaml
   kyuubi.engine.single.spark.session=true
   
   
   spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension
   spark.serializer=org.apache.spark.serializer.KryoSerializer
   
spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog
   
   spark.hoodie.schema.on.read.enable=true
   spark.hoodie.datasource.write.reconcile.schema=true
   spark.hoodie.datasource.write.schema.allow.auto.evolution.column.drop=true
   ```
   
   
   ### Kyuubi Engine Configurations
   
   _No response_
   
   ### Additional context
   
   Spark version: 3.2.4-bin-hadoop3.2
   Hudi version: 0.12.2
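
   A stale Derby lock left behind by a crashed or killed engine can also trigger `XSDB6`. A hedged check (the metastore path is taken from the stack trace above; adjust it for other deployments):
   ```shell
   # Check whether a Derby lock file still exists under the engine work dir.
   # If the owning engine is truly gone, removing db.lck/dbex.lck lets the
   # next engine boot the embedded metastore again.
   METASTORE_DB=/opt/apache-kyuubi-1.7.1-bin/work/hudi/metastore_db
   if [ -e "$METASTORE_DB/db.lck" ]; then
     echo "lock file present: another engine may still hold the metastore"
   else
     echo "no Derby lock file found"
   fi
   ```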
   
   
   ### Are you willing to submit PR?
   
   - [ ] Yes. I would be willing to submit a PR with guidance from the Kyuubi 
community to fix.
   - [ ] No. I cannot submit a PR at this time.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

