Hey folks,

I have a project where I test with Hive using an embedded HiveServer2
instance inside the JVM that runs the integration tests. This worked with Hive
1.2.2 in the past, and I was able to get it working with Hive 2.3.8, but I
have been having trouble getting it to work on Hive 3.0+.

The error I keep running into is that the metastore tables are not present
in the local embedded metastore. I have set
"hive.metastore.schema.verification" to "false" and
"datanucleus.schema.autoCreateAll" to "true", but the latter setting seems to
be ignored. Instead of starting up, HiveServer2 fails while trying to read
from the DBS table (a rough sketch of the test setup follows the stack trace):

Self-test query [select "DB_ID" from "DBS"] failed; direct SQL is disabled
javax.jdo.JDODataStoreException: Error executing SQL query "select "DB_ID" from "DBS"".
     at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:543) ~[datanucleus-api-jdo-4.2.4.jar:?]
     at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:391) ~[datanucleus-api-jdo-4.2.4.jar:?]
     at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:216) ~[datanucleus-api-jdo-4.2.4.jar:?]
     at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.runTestQuery(MetaStoreDirectSql.java:276) [hive-exec-3.1.2.jar:3.1.2]
     at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:184) [hive-exec-3.1.2.jar:3.1.2]
     at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:498) [hive-exec-3.1.2.jar:3.1.2]
     at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:420) [hive-exec-3.1.2.jar:3.1.2]
     at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:375) [hive-exec-3.1.2.jar:3.1.2]
     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77) [hadoop-common-3.1.2.jar:?]
     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137) [hadoop-common-3.1.2.jar:?]
     at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:59) [hive-exec-3.1.2.jar:3.1.2]
     at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67) [hive-exec-3.1.2.jar:3.1.2]
     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:718) [hive-exec-3.1.2.jar:3.1.2]
     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:696) [hive-exec-3.1.2.jar:3.1.2]
     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:690) [hive-exec-3.1.2.jar:3.1.2]
     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:767) [hive-exec-3.1.2.jar:3.1.2]
     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:538) [hive-exec-3.1.2.jar:3.1.2]
     <....>
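
For reference, the properties above are applied roughly like this in the test
setup (a simplified sketch: the class name and the Derby connection URL are
just illustrative, and the actual bootstrap code around it is omitted):

    import org.apache.hadoop.hive.conf.HiveConf;
    import org.apache.hive.service.server.HiveServer2;

    public class EmbeddedHiveServer2Test {
        // Simplified sketch of how the embedded HiveServer2 is brought up in the tests.
        static HiveServer2 startEmbedded() {
            HiveConf conf = new HiveConf();
            // Local Derby metastore for the tests (the path is just an example).
            conf.set("javax.jdo.option.ConnectionURL",
                "jdbc:derby:;databaseName=target/metastore_db;create=true");
            // The two settings mentioned above, which I expected to create the schema on the fly.
            conf.set("hive.metastore.schema.verification", "false");
            conf.set("datanucleus.schema.autoCreateAll", "true");

            HiveServer2 hiveServer2 = new HiveServer2();
            hiveServer2.init(conf);
            hiveServer2.start();  // the metastore init from the stack trace happens around here
            return hiveServer2;
        }
    }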

Looking into the documentation, it seems that most of it says to run the
schematool (e.g. "schematool -dbType <db> -initSchema") to set up the
metastore schema the first time, but this is an embedded use case and there is
no local Hive installation whose schematool I could run.

I've also tried using the Hive JDBC driver with "jdbc:hive2:///" as the URL
to run an embedded server, along the lines of the sketch below, and I get the
same errors.
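
Roughly this (the class name and the CREATE DATABASE statement are just
arbitrary placeholders for the real test code):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class EmbeddedJdbcAttempt {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            // "jdbc:hive2:///" (no host/port) runs HiveServer2 embedded in this JVM.
            try (Connection conn = DriverManager.getConnection("jdbc:hive2:///", "", "");
                 Statement stmt = conn.createStatement()) {
                stmt.execute("CREATE DATABASE IF NOT EXISTS test_db");
            }
        }
    }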

Is this use case not supported anymore in Hive 3? Am I missing something
here?
