[ 
https://issues.apache.org/jira/browse/HIVE-13742?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergio Peña updated HIVE-13742:
-------------------------------
    Attachment: hive.log

Here's the hive.log from the slave when I ran {{TestFilterHooks}} manually.

> Hive ptest has many failures due to metastore connection refused
> ----------------------------------------------------------------
>
>                 Key: HIVE-13742
>                 URL: https://issues.apache.org/jira/browse/HIVE-13742
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Sergio Peña
>         Attachments: hive.log
>
>
> The following exception is thrown on Hive ptest for many tests, and it is 
> caused by Derby database issues:
> {noformat}
> 2016-05-11T15:46:25,123 INFO  [Thread-2[]]: metastore.HiveMetaStore 
> (HiveMetaStore.java:newRawStore(563)) - 0: Opening raw store with 
> implementation class:org.apache.hadoop.hive.metastore.ObjectStore
> 2016-05-11T15:46:25,175 INFO  [Thread-2[]]: metastore.ObjectStore 
> (ObjectStore.java:initialize(324)) - ObjectStore, initialize called
> 2016-05-11T15:46:25,966 DEBUG [Thread-2[]]: bonecp.BoneCPDataSource 
> (BoneCPDataSource.java:getConnection(119)) - JDBC URL = 
> jdbc:derby:;databaseName=/home/hiveptest/54.177.132.113-hiveptest-1/apache-github-source-source/itests/hive-unit/target/tmpTestFilterHooksmetastore_db;create=true,
>  Username = APP, partitions = 1, max (per partition) = 10, min (per 
> partition) = 0, idle max age = 60 min, idle test period = 240 min, strategy = 
> DEFAULT
> 2016-05-11T15:46:26,003 ERROR [Thread-2[]]: Datastore.Schema 
> (Log4JLogger.java:error(125)) - Failed initialising database.
> org.datanucleus.exceptions.NucleusDataStoreException: Unable to open a test 
> connection to the given database. JDBC url = 
> jdbc:derby:;databaseName=/home/hiveptest/54.177.132.113-hiveptest-1/apache-github-source-source/itests/hive-unit/target/tmpTestFilterHooksmetastore_db;create=true,
>  username = APP. Terminating connection pool (set lazyInit to true if you 
> expect to start your database after your app). Original Exception: ------
> java.sql.SQLException: Failed to create database 
> '/home/hiveptest/54.177.132.113-hiveptest-1/apache-github-source-source/itests/hive-unit/target/tmpTestFilterHooksmetastore_db',
>  see the next exception for details.
>       at 
> org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown 
> Source)
>       at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
>       at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
>       at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown 
> Source)
>       at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
>       at org.apache.derby.impl.jdbc.EmbedConnection40.<init>(Unknown Source)
>       at org.apache.derby.jdbc.Driver40.getNewEmbedConnection(Unknown Source)
>       at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
>       at org.apache.derby.jdbc.Driver20.connect(Unknown Source)
>       at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
>       at java.sql.DriverManager.getConnection(DriverManager.java:664)
>       at java.sql.DriverManager.getConnection(DriverManager.java:208)
>       at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
>       at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
>       at 
> com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
>       at 
> org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
>       at 
> org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:296)
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
>       at 
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
>       at 
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
>       at 
> org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
>       at 
> org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:420)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:821)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:338)
>       at 
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:217)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:483)
>       at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
>       at 
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
>       at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
>       at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:423)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:452)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:326)
>       at 
> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:293)
>       at 
> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
>       at 
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
>       at 
> org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
>       at 
> org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:580)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:545)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:607)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)
>       at 
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
>       at 
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6261)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6256)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:6514)
>       at 
> org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:6472)
>       at 
> org.apache.hadoop.hive.metastore.MetaStoreUtils$1.run(MetaStoreUtils.java:1210)
>       at java.lang.Thread.run(Thread.java:745)
> Caused by: java.sql.SQLException: Failed to create database 
> '/home/hiveptest/54.177.132.113-hiveptest-1/apache-github-source-source/itests/hive-unit/target/tmpTestFilterHooksmetastore_db',
>  see the next exception for details.
>       at 
> org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
>       at 
> org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown
>  Source)
>       ... 58 more
> {noformat}
> I deleted the directory from the ptest slave, and the test finally passed 
> when I ran it manually. However, this issue is happening on other nodes as 
> well, so another test is possibly corrupting the Derby database.
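>
> A possible mitigation, sketched below as an assumption rather than a confirmed 
> fix: if each test pointed its embedded metastore at a fresh Derby directory 
> (via {{javax.jdo.option.ConnectionURL}}), a database corrupted or locked by 
> another test could not break it. The helper name and temp-directory layout 
> here are hypothetical.
> {code:java}
> import java.io.IOException;
> import java.nio.file.Files;
>
> import org.apache.hadoop.hive.conf.HiveConf;
>
> public class PerTestMetastoreConf {
>   // Hypothetical per-test setup: give the embedded Derby metastore its own
>   // directory so leftovers from other tests cannot corrupt or lock it.
>   static HiveConf freshMetastoreConf() throws IOException {
>     HiveConf conf = new HiveConf();
>     String dbDir = Files.createTempDirectory("metastore_db_").toString();
>     conf.setVar(HiveConf.ConfVars.METASTORECONNECTURLKEY,
>         "jdbc:derby:;databaseName=" + dbDir + ";create=true");
>     return conf;
>   }
> }
> {code}
> Whether the ptest slaves actually reuse these directories between runs is an 
> open question; the sketch only illustrates how the JDBC URL seen in the log 
> above is formed.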



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
