Thanks Jason. It worked.
There was a different version of the SLF4J libraries in $HADOOP_HOME/lib.
Once I synced the libraries in both locations, it started working.
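
For anyone hitting the same error, a rough way to spot and fix the mismatch
(just a sketch; the paths assume a default tar-based install of Hive and
Hadoop, and the version numbers shown are only illustrative):

  # See which SLF4J jars each product ships with -- the version is in the file name
  ls $HIVE_HOME/lib/slf4j-*.jar
  ls $HADOOP_HOME/lib/slf4j-*.jar

  # If they differ, make both directories use the same (newer) release,
  # keeping the old jars as a backup rather than deleting them
  mv $HADOOP_HOME/lib/slf4j-api-1.4.3.jar $HADOOP_HOME/lib/slf4j-log4j12-1.4.3.jar /tmp/
  cp $HIVE_HOME/lib/slf4j-api-1.7.5.jar $HIVE_HOME/lib/slf4j-log4j12-1.7.5.jar $HADOOP_HOME/lib/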


On Fri, Jul 11, 2014 at 11:25 PM, Jason Dere <jd...@hortonworks.com> wrote:

> Looking at that error online, I see
> http://slf4j.org/faq.html#compatibility
> Maybe try to find what version of the slf4j libraries you have installed (in
> Hadoop? Hive?), and try updating to a later version.
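>
> For example (assuming the standard lib layout on both sides), something like
>   ls $HADOOP_HOME/lib/slf4j-*.jar $HIVE_HOME/lib/slf4j-*.jar
> should show the versions in the jar file names; the slf4j-api jar and the
> slf4j binding (slf4j-log4j12) jar need to come from the same release.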
>
>
>
> On Jul 10, 2014, at 9:57 PM, Sarath Chandra <
> sarathchandra.jos...@algofusiontech.com> wrote:
>
> I'm using Hadoop 1.0.4. Suspecting some compatibility issues, I moved from
> Hive 0.13 to Hive 0.12.
> But the exceptions related to SLF4J still persist.
>
> I'm unable to move forward with Hive to finalize a critical product design.
> Can somebody please help me?
>
>
> On Wed, Jul 9, 2014 at 11:25 AM, Sarath Chandra <
> sarathchandra.jos...@algofusiontech.com> wrote:
>
>> Thanks Deepesh.
>>
>> To use Hive in embedded Derby mode, I have put the configuration below in
>> hive-site.xml. As suggested on the net, I ran "schematool -dbType derby
>> -initSchema" and it created the $HIVE_HOME/metastore_db folder.
>>
>> Then, as suggested by you, I ran "hive --service metastore". Strangely, I'm
>> getting exceptions related to SLF4J -- *java.lang.IllegalAccessError:
>> tried to access field org.slf4j.impl.StaticLoggerBinder.SINGLETON from
>> class org.slf4j.LoggerFactory*
>>
>> Is there anything more to configure before starting?
>>
>> *hive-site.xml*
>> <configuration>
>>   <property>
>>     <name>javax.jdo.option.ConnectionURL</name>
>>     <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
>>   </property>
>>   <property>
>>     <name>hive.metastore.warehouse.dir</name>
>>     <value>/user/hive/warehouse</value>
>>   </property>
>>   <property>
>>     <name>hive.exec.scratchdir</name>
>>     <value>/tmp/hduser</value>
>>   </property>
>> </configuration>
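>>
>> A note on the relative Derby URL above, as I understand it: with
>> databaseName=metastore_db and no absolute path, Derby creates metastore_db
>> under whatever directory the command is launched from, and embedded Derby
>> lets only one process open that database at a time. A rough sequence that
>> keeps everything in one place (just a sketch assuming a default tar
>> install) is to run everything from the same directory, and not start the
>> CLI while the metastore service still holds the database:
>>
>>   cd $HIVE_HOME
>>   bin/schematool -dbType derby -initSchema
>>   bin/hive --service metastore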
>>
>>
>> On Tue, Jul 8, 2014 at 11:20 PM, D K <deepe...@gmail.com> wrote:
>>
>>> Did you start the Hive Metastore? You can start that by running
>>> hive --service metastore
>>>
>>>
>>>
>>> On Tue, Jul 8, 2014 at 5:27 AM, Sarath Chandra <
>>> sarathchandra.jos...@algofusiontech.com> wrote:
>>>
>>>> Thanks Santhosh.
>>>> So, as I understand it, we need to start the Hive server before launching
>>>> the hive shell.
>>>> I tried starting the Hive server by running ./bin/hiveserver2. It just
>>>> prints "Starting HiveServer2" and keeps waiting. Nothing happens even
>>>> after several minutes.
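>>>>
>>>> (From what I've read, HiveServer2 stays in the foreground and prints only
>>>> that one line, so a quiet terminal may not mean it failed. A rough check
>>>> from another terminal, assuming the default Thrift port of 10000, would be
>>>> something like "netstat -an | grep 10000"; the log under
>>>> /tmp/<user>/hive.log, the default location from hive-log4j.properties,
>>>> usually shows the real error if it did fail.)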
>>>>
>>>>
>>>> On Tue, Jul 8, 2014 at 4:16 PM, Santhosh Thomas <
>>>> santhosh.tho...@yahoo.com> wrote:
>>>>
>>>>> How did you start Hive? Use hiveserver2.
>>>>>
>>>>>   ------------------------------
>>>>>  *From:* Sarath Chandra <sarathchandra.jos...@algofusiontech.com>
>>>>> *To:* user@hive.apache.org
>>>>> *Sent:* Tuesday, July 8, 2014 4:02 PM
>>>>> *Subject:* Issue while running Hive 0.13
>>>>>
>>>>> Hi,
>>>>>
>>>>> I'm a newbie to Hive, and I'm facing an issue while installing the stable
>>>>> version (0.13). I downloaded the tar file from the site (
>>>>> apache-hive-0.13.1-bin.tar.gz
>>>>> <http://apache.cs.utah.edu/hive/hive-0.13.1/apache-hive-0.13.1-bin.tar.gz>)
>>>>> and followed the instructions given on the Hive Wiki
>>>>> <https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-InstallationandConfiguration>.
>>>>> On running the "hive" command to get to the hive shell, I'm getting the
>>>>> exception below.
>>>>>
>>>>> Requesting help in this regard. What am I missing? Is there any
>>>>> further configuration to be done?
>>>>>
>>>>> Logging initialized using configuration in
>>>>> file:/usr/local/hive-0.13.1/conf/hive-log4j.properties
>>>>> Exception in thread "main" java.lang.RuntimeException:
>>>>> java.lang.RuntimeException: Unable to instantiate
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>>>>> at
>>>>> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
>>>>>  at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
>>>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>>>>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> at
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>  at
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>  at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>>>>> Caused by: java.lang.RuntimeException: Unable to instantiate
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>>>>>  at
>>>>> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
>>>>> at
>>>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
>>>>>  at
>>>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
>>>>> at
>>>>> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
>>>>>  at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
>>>>> at
>>>>> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
>>>>>  ... 7 more
>>>>> Caused by: java.lang.reflect.InvocationTargetException
>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>>>> Method)
>>>>>  at
>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>>>> at
>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>>>  at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>>>>> at
>>>>> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
>>>>>  ... 12 more
>>>>> Caused by: javax.jdo.JDOFatalInternalException: Error creating
>>>>> transactional connection factory
>>>>> NestedThrowables:
>>>>> java.lang.reflect.InvocationTargetException
>>>>>  at
>>>>> org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
>>>>> at
>>>>> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
>>>>>  at
>>>>> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
>>>>> at
>>>>> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
>>>>>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> at
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>  at
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>  at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
>>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>>>  at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
>>>>> at
>>>>> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
>>>>>  at
>>>>> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
>>>>> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
>>>>>  at
>>>>> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)
>>>>> at
>>>>> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)
>>>>>  at
>>>>> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)
>>>>> at
>>>>> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)
>>>>>  at
>>>>> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
>>>>> at
>>>>> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>>>>>  at
>>>>> org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
>>>>> at
>>>>> org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
>>>>>  at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:497)
>>>>> at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:475)
>>>>>  at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
>>>>> at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)
>>>>>  at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
>>>>> at
>>>>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
>>>>>  at
>>>>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
>>>>> at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
>>>>>  at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
>>>>> ... 17 more
>>>>> Caused by: java.lang.reflect.InvocationTargetException
>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>>>> Method)
>>>>> at
>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>>>>  at
>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>>>>>  at
>>>>> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
>>>>> at
>>>>> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)
>>>>>  at
>>>>> org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)
>>>>> at
>>>>> org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:240)
>>>>>  at
>>>>> org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:286)
>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>>>> Method)
>>>>>  at
>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>>>> at
>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>>>  at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>>>>> at
>>>>> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
>>>>>  at
>>>>> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
>>>>> at
>>>>> org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
>>>>>  at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
>>>>> at
>>>>> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
>>>>>  ... 46 more
>>>>> Caused by: java.lang.IllegalAccessError: tried to access field
>>>>> org.slf4j.impl.StaticLoggerBinder.SINGLETON from class
>>>>> org.slf4j.LoggerFactory
>>>>>  at org.slf4j.LoggerFactory.<clinit>(LoggerFactory.java:60)
>>>>> at com.jolbox.bonecp.BoneCPConfig.<clinit>(BoneCPConfig.java:62)
>>>>>  at
>>>>> org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:59)
>>>>> at
>>>>> org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)
>>>>>  at
>>>>> org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)
>>>>> at
>>>>> org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)
>>>>>  ... 64 more
>>>>>
>>>>> ~Sarath
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>
>
