Hi all,

I have configured Hive 1.1.0 on Hadoop 2.4.1 successfully and started
the metastore with [ hive --service metastore -p 7099 ],

but the log shows the following error:

2015-03-10 11:32:42,649 ERROR [main]: DataNucleus.Datastore (Log4JLogger.java:error(115)) - An exception was thrown while adding/validating class(es) : Specified key was too long; max key length is 767 bytes
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
    at com.mysql.jdbc.Util.getInstance(Util.java:386)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1054)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4237)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4169)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2617)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2778)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2819)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2768)
    at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:949)
    at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:795)
    at com.jolbox.bonecp.StatementHandle.execute(StatementHandle.java:254)
    at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:760)
    at org.datanucleus.store.rdbms.table.TableImpl.createIndices(TableImpl.java:648)
    at org.datanucleus.store.rdbms.table.TableImpl.validateIndices(TableImpl.java:593)
    at org.datanucleus.store.rdbms.table.TableImpl.validateConstraints(TableImpl.java:390)
    at org.datanucleus.store.rdbms.table.ClassTable.validateConstraints(ClassTable.java:3463)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3464)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:3190)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2841)
    at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:1605)
    at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:954)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:679)
    at org.datanucleus.store.rdbms.query.RDBMSQueryUtils.getStatementForCandidates(RDBMSQueryUtils.java:408)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.compileQueryFull(JDOQLQuery.java:947)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:370)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1744)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
    at org.datanucleus.store.query.Query.execute(Query.java:1654)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:221)
    at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:172)
    at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:130)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:275)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:238)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:56)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:579)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:557)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:606)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:448)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5570)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5565)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:5798)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:5723)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
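
From what I can tell, the 767 bytes mentioned in the error is InnoDB's
index key prefix limit, and with a utf8 character set (3 bytes per
character) an index on a VARCHAR(256) column already needs 768 bytes.
Just to show where the limit comes from, here is a standalone sketch
(not taken from the Hive schema scripts, only an illustration):

    -- Fails the same way on MySQL 5.5/5.6 defaults, because the utf8
    -- index key needs 256 * 3 = 768 bytes, one more than InnoDB allows.
    CREATE TABLE keytest (c VARCHAR(256)) ENGINE=InnoDB DEFAULT CHARSET=utf8;
    CREATE INDEX keytest_idx ON keytest (c);
    -- ERROR 1071 (42000): Specified key was too long; max key length is 767 bytes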



I also changed the metastore DB name, but the database does not get created in MySQL.
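
Is the metastore database supposed to be created automatically (e.g.
via createDatabaseIfNotExist=true on the JDBC connection URL), or does
it have to be created by hand first, something like the following?
(All names below are placeholders, not my actual settings.)

    -- 'hivemeta', 'hiveuser' and 'hivepass' are placeholders;
    -- latin1 keeps indexed VARCHAR columns under the 767-byte limit.
    CREATE DATABASE hivemeta DEFAULT CHARACTER SET latin1;
    GRANT ALL PRIVILEGES ON hivemeta.* TO 'hiveuser'@'localhost' IDENTIFIED BY 'hivepass';
    FLUSH PRIVILEGES;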

Can anyone suggest a fix?






Thanks & Regards
Amithsha
