Thanks!
It was indeed a similar problem: conflicting jars, but between Hive and
Spark. My eventual goal is to run Spark against Hive's tables, and since
Spark's libraries were on my path as well, there were conflicting jar files.
I removed the Spark libraries from my PATH, and Hive's services (the remote
metastore) started up fine.
For now I am good, but I am wondering what the correct way to fix this is:
as soon as I want to start Spark, I need to put its libraries back on the
PATH, and the conflicts seem inevitable.
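
What I am considering as a cleaner fix (just a sketch, assuming the remote
metastore is listening on its default Thrift port 9083, and "metastore-host"
is a placeholder for the actual machine) is to point Spark at the running
metastore through a hive-site.xml in Spark's conf/ directory, instead of
putting Hive's jars on the PATH:

<property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
    <description>Thrift URI for clients (e.g. Spark) to reach the remote metastore</description>
</property>

That way Spark would only talk to the metastore over Thrift and should not
need the Derby driver on its classpath at all. Does that sound right?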



On Mon, Jun 8, 2015 at 12:09 PM, Slava Markeyev <slava.marke...@upsight.com>
wrote:

> It sounds like you are running into a jar conflict between the Derby
> packaged with Hive and the Derby packaged with your Hadoop distro. Look
> for derby jars on your system to confirm.
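> For example, something along these lines (the paths will vary per install)
> should show where duplicate copies live:
>
> find $HIVE_HOME/lib $HADOOP_HOME -name "derby*.jar" 2>/dev/null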
>
> In the meantime, try adding this to your hive-env.sh or hadoop-env.sh file:
>
> export HADOOP_USER_CLASSPATH_FIRST=true
>
> On Mon, Jun 8, 2015 at 11:52 AM, James Pirz <james.p...@gmail.com> wrote:
>
>> I am trying to run Hive 1.2.0 on Hadoop 2.6.0 (on a cluster, running
>> CentOS). I am able to start Hive CLI and run queries. But once I try to
>> start Hive's metastore (I am trying to use the built-in Derby) using:
>>
>> hive --service metastore
>>
>> I keep getting Class Not Found Exceptions for
>> "org.apache.derby.jdbc.EmbeddedDriver" (See below).
>>
>> I have exported $HIVE_HOME and added $HIVE_HOME/bin and $HIVE_HOME/lib to
>> the $PATH, and I see that there is a "derby-10.11.1.1.jar" file under
>> $HIVE_HOME/lib.
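>>
>> For reference, the relevant lines look roughly like this (the install path
>> is just a placeholder for my actual one):
>>
>> export HIVE_HOME=/path/to/apache-hive-1.2.0-bin
>> export PATH=$PATH:$HIVE_HOME/bin:$HIVE_HOME/lib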
>>
>> In my hive-site.xml (under $HIVE_HOME/conf) I have:
>>
>> <property>
>>     <name>javax.jdo.option.ConnectionDriverName</name>
>>     <value>org.apache.derby.jdbc.EmbeddedDriver</value>
>>     <description>Driver class name for a JDBC metastore</description>
>> </property>
>>
>> <property>
>>     <name>javax.jdo.option.ConnectionURL</name>
>>     <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
>>     <description>JDBC connect string for a JDBC metastore</description>
>> </property>
>>
>> So I am not sure why it cannot find it.
>> Any suggestion or hint would be highly appreciated.
>>
>>
>> Here is the error:
>>
>> javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
>> ...
>> Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.derby.jdbc.EmbeddedDriver
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>> at java.lang.Class.newInstance(Class.java:379)
>> at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:47)
>> at org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)
>> at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)
>> at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)
>> at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)
>>
>>
>
>
> --
>
> Slava Markeyev | Engineering | Upsight
>
> Find me on LinkedIn <http://www.linkedin.com/in/slavamarkeyev>
>
