This doesn't work for me. In CLI mode you can export the environment
variable to avoid running ADD JAR every time.
I did this, but I still encounter the error when I access Hive from the
Java client. And I can't even specify the --auxpath parameter when
starting the Hive thrift service.
So at least in my situation, I have to run ADD JAR myself.
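What I do now looks roughly like the sketch below (the jar paths are from my
hive-env.sh; the helper class and method names are just for illustration): build
the ADD JAR commands up front, then run each one on the JDBC Statement before
the real query.

```java
import java.util.List;
import java.util.stream.Collectors;

public class AddJars {
    // Build the "ADD JAR <path>" commands Hive needs before the real query.
    static List<String> addJarStatements(List<String> jarPaths) {
        return jarPaths.stream()
                .map(p -> "ADD JAR " + p)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // In a live client you would run each command with
        // stmt.execute(cmd) on the Hive JDBC Statement, then executeQuery.
        List<String> jars = List.of(
                "/data/install/hive-0.9.0/lib/hive-hbase-handler-0.9.0.jar",
                "/data/install/hive-0.9.0/lib/hbase-0.92.0.jar",
                "/data/install/hive-0.9.0/lib/zookeeper-3.4.2.jar");
        addJarStatements(jars).forEach(System.out::println);
    }
}
```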

On Wed, Nov 7, 2012 at 12:31 AM, kulkarni.swar...@gmail.com
<kulkarni.swar...@gmail.com> wrote:
> FWIW, you can also drop all your needed jars (including the HBase and
> ZooKeeper ones) into a folder and then set this property in your hive-env.sh:
>
> export HIVE_AUX_JARS_PATH=<path to the folder>
>
> This way you don't need to add them manually every time.
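> A concrete example (the folder path here is hypothetical):
>
> ```shell
> # hive-env.sh -- point Hive at one folder holding the extra jars
> # (note: no spaces around the = in shell assignments)
> export HIVE_AUX_JARS_PATH=/opt/hive/auxlib
> ```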
>
>
> On Mon, Nov 5, 2012 at 9:18 PM, Cheng Su <scarcer...@gmail.com> wrote:
>>
>> Mark, thank you so much for your suggestion.
>>
>> I've already added the necessary jars to my Hive aux path, so I can
>> execute my SQL in Hive CLI mode without getting any error.
>> But when I use a Java client to access the tables through the thrift
>> service, I need to add these jars manually.
>> I executed "ADD JAR xxxx.jar" and the problem was solved!
>>
>> Thank you again!
>>
>> On Tue, Nov 6, 2012 at 9:03 AM, Mark Grover <grover.markgro...@gmail.com>
>> wrote:
>> > Cheng,
>> > You will have to add the appropriate HBase-related jars to your
>> > classpath.
>> >
>> > You can do that by running "add jar" command(s) or putting them in
>> > aux_lib. See this thread for reference:
>> >
>> > http://mail-archives.apache.org/mod_mbox/hive-user/201103.mbox/%3caanlktingqlgknqmizgoi+szfnexgcat8caqtovf8j...@mail.gmail.com%3E
>> >
>> > Mark
>> >
>> >
>> > On Mon, Nov 5, 2012 at 6:53 AM, Cheng Su <scarcer...@gmail.com> wrote:
>> >>
>> >> Hi, all. I have a hive+hbase integration cluster.
>> >>
>> >> When I try to execute a query through the Java client of Hive,
>> >> sometimes a ClassNotFoundException happens.
>> >>
>> >> My Java code:
>> >>
>> >> final Connection conn = DriverManager.getConnection(URL);
>> >> final Statement stmt = conn.createStatement();
>> >> final ResultSet rs = stmt.executeQuery("SELECT count(*) FROM
>> >> test_table WHERE (source = '0' AND ur_createtime BETWEEN
>> >> '20121031000000' AND '20121031235959')");
>> >>
>> >> I can execute the SQL (SELECT count(*) FROM test_table WHERE (source =
>> >> '0' AND ur_createtime BETWEEN '20121031000000' AND '20121031235959'))
>> >> in Hive CLI mode and get the query result, so there is no error in my
>> >> SQL.
>> >>
>> >> The client side exception:
>> >>
>> >> Caused by: java.sql.SQLException: Query returned non-zero code: 9,
>> >> cause: FAILED: Execution Error, return code 2 from
>> >> org.apache.hadoop.hive.ql.exec.MapRedTask
>> >>     at
>> >>
>> >> org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:189)
>> >> ... 23 more
>> >>
>> >> The server side exception(hadoop-jobtracker):
>> >>
>> >> 2012-11-05 18:55:39,443 INFO org.apache.hadoop.mapred.TaskInProgress:
>> >> Error from attempt_201210301133_0112_m_000000_3: java.io.IOException:
>> >> Cannot create an instance of InputSplit class =
>> >>
>> >>
>> >> org.apache.hadoop.hive.hbase.HBaseSplit:org.apache.hadoop.hive.hbase.HBaseSplit
>> >>     at
>> >>
>> >> org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:146)
>> >>     at
>> >>
>> >> org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:67)
>> >>     at
>> >>
>> >> org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:40)
>> >>     at
>> >> org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:396)
>> >>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:412)
>> >>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>> >>     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>> >>     at java.security.AccessController.doPrivileged(Native Method)
>> >>     at javax.security.auth.Subject.doAs(Unknown Source)
>> >>     at
>> >>
>> >> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
>> >>     at org.apache.hadoop.mapred.Child.main(Child.java:249)
>> >> Caused by: java.lang.ClassNotFoundException:
>> >> org.apache.hadoop.hive.hbase.HBaseSplit
>> >>     at java.net.URLClassLoader$1.run(Unknown Source)
>> >>     at java.security.AccessController.doPrivileged(Native Method)
>> >>     at java.net.URLClassLoader.findClass(Unknown Source)
>> >>     at java.lang.ClassLoader.loadClass(Unknown Source)
>> >>     at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
>> >>     at java.lang.ClassLoader.loadClass(Unknown Source)
>> >>     at java.lang.Class.forName0(Native Method)
>> >>     at java.lang.Class.forName(Unknown Source)
>> >>     at
>> >>
>> >> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:819)
>> >>     at
>> >>
>> >> org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:143)
>> >>     ... 10 more
>> >>
>> >>
>> >> My hive-env.sh
>> >>
>> >> export
>> >>
>> >> HIVE_AUX_JARS_PATH=/data/install/hive-0.9.0/lib/hive-hbase-handler-0.9.0.jar,/data/install/hive-0.9.0/lib/hbase-0.92.0.jar,/data/install/hive-0.9.0/lib/zookeeper-3.4.2.jar
>> >>
>> >>
>> >> My hive-site.xml
>> >>
>> >> <property>
>> >>     <name>hive.zookeeper.quorum</name>
>> >>     <value>hadoop01,hadoop02,hadoop03</value>
>> >>     <description>The list of zookeeper servers to talk to. This is
>> >> only needed for read/write locks.</description>
>> >> </property>
>> >>
>> >>
>> >> And I start the thrift service as below:
>> >>
>> >> hive --service hiveserver -p 10000 &
>> >>
>> >>
>> >> The server side error log says that HBaseSplit is not found. But why?
>> >> How can I fix this?
>> >>
>> >> --
>> >>
>> >> Regards,
>> >> Cheng Su
>> >
>> >
>>
>>
>>
>> --
>>
>> Regards,
>> Cheng Su
>
>
>
>
> --
> Swarnim



-- 

Regards,
Cheng Su
