Hi Ed,

I'm happy to answer questions on this list related to Hive and Cloudera
packaging issues. I'd also like to point out that it's impossible to
differentiate between a "cloudera packaging issue" and an Apache Hive issue
before you know the cause of the problem.

Thanks.

Carl

On Wed, Sep 15, 2010 at 10:27 AM, Edward Capriolo <edlinuxg...@gmail.com> wrote:

> On Wed, Sep 15, 2010 at 1:14 PM, Leo Alekseyev <dnqu...@gmail.com> wrote:
> > This is a "me too" post: we just ran into an identical problem setting
> > up a new cluster using CDH3b2.  This is all rather mystifying, because
> > all the correct libraries are there; in fact the hive command line
> > looks something like
> >
> > /usr/java/jdk1.6.0_12/bin/java -Xmx256m -server
> > -Dhadoop.log.dir=/usr/lib/hadoop-0.20/logs
> > -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/lib/hadoop-0.20
> > -Dhadoop.id.str= -Dhadoop.root.logger=INFO,console
> > -Dhadoop.policy.file=hadoop-policy.xml -classpath
> >
> > /usr/lib/hadoop-0.20/conf:/usr/java/jdk1.6.0_12/lib/tools.jar:/usr/lib/hadoop-0.20:/usr/lib/hadoop-0.20/hadoop-core-0.20.2+320.jar..................
> >  org.apache.hadoop.util.RunJar /usr/lib/hive/lib/hive-cli-0.5.0.jar
> > org.apache.hadoop.hive.cli.CliDriver
> >
> > (note the presence of the hadoop-core jar, which contains the class in
> > question)
> >
> > If anyone has pointers on troubleshooting this, I'd appreciate it!
> >
> > --Leo
> >
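[Editor's note: a quick way to sanity-check a java -classpath string like the one quoted above is to split it into entries and look for the jar that should provide the missing class. A hedged sketch; the classpath below is a shortened, made-up example, so substitute the real -classpath value from the hive command line:]

```shell
#!/bin/sh
# Split a java -classpath string on ":" and flag the entries expected to
# provide org.apache.hadoop.mapred.TextInputFormat. The CP value here is
# illustrative, not the full classpath from the report above.
CP="/usr/lib/hadoop-0.20/conf:/usr/lib/hadoop-0.20/hadoop-core-0.20.2+320.jar:/usr/lib/hive/lib/hive-cli-0.5.0.jar"
echo "$CP" | tr ':' '\n' | while read -r entry; do
  case "$entry" in
    *hadoop-core*) echo "core jar: $entry" ;;  # expected home of TextInputFormat
    *)             echo "entry:    $entry" ;;
  esac
done
# To confirm a jar really contains the class (requires unzip):
#   unzip -l "$entry" | grep mapred/TextInputFormat
```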
> > On Tue, Sep 14, 2010 at 1:56 PM, Tianqiang Li <peter...@gmail.com> wrote:
> >> Some more context: I run Hive on a client machine which is NOT one of the
> >> Hadoop cluster nodes. I assume Hive can run and submit jobs from a client
> >> machine, so I didn't change hadoop-env.sh on the cluster nodes. From this
> >> client machine, Hadoop Java jobs and Pig jobs have been successfully
> >> submitted to the cluster and processed.
> >>
> >> Regards,
> >> Peter Li
> >>
> >> On Tue, Sep 14, 2010 at 1:42 PM, Tianqiang Li <peter...@gmail.com> wrote:
> >>>
> >>> Hi, hive-users,
> >>> I am a new Hive user and installed Hive recently. When I type any
> >>> query-related command in the Hive CLI, it throws an exception, but
> >>> CREATE TABLE works:
> >>>
> >>> $ hive
> >>> Hive history file=/tmp/pli/hive_job_log_pli_201009141519_1503313446.txt
> >>> hive> create table test5(a int);
> >>> OK
> >>> Time taken: 2.551 seconds
> >>> hive> show tables;
> >>> OK
> >>> Failed with exception java.io.IOException:java.io.IOException: Cannot
> >>> create an instance of InputFormat class
> >>> org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
> >>> Time taken: 0.09 seconds
> >>> hive>
> >>>
> >>> I searched on the internet and found reports of Hive's
> >>> HADOOP_CLASSPATH getting overwritten by other tools in hadoop-env.sh.
> >>> I tried appending instead ( export
> >>> HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:other_paths ), but that didn't
> >>> fix the problem. I then tried copying the Hadoop library jar
> >>> ( hadoop-0.20.1+169.68-core.jar ), which includes
> >>> org.apache.hadoop.mapred.TextInputFormat, to $HIVE_HOME/lib/, but the
> >>> problem remains the same. It looks like Hive doesn't know where
> >>> Hadoop's libraries are and fails to instantiate a TextInputFormat
> >>> object. Has anyone run into this before? Any hints on how to work
> >>> around this are welcome. Thanks.
> >>>
> >>> Some more context: I use Hadoop 0.20.1+169.68 from Cloudera CDH2, and
> >>> Hive 0.4/0.5 from CDH2/CDH3 (both Hive versions have the same issue).
> >>> Here is the stack trace from the log file:
> >>> ------
> >>> 2010-09-14 15:19:58,826 ERROR DataNucleus.Plugin
> >>> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires
> >>> "org.eclipse.core.resources" but it cannot be resolved.
> >>> 2010-09-14 15:19:58,826 ERROR DataNucleus.Plugin
> >>> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires
> >>> "org.eclipse.core.resources" but it cannot be resolved.
> >>> 2010-09-14 15:19:58,828 ERROR DataNucleus.Plugin
> >>> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires
> >>> "org.eclipse.core.runtime" but it cannot be resolved.
> >>> 2010-09-14 15:19:58,828 ERROR DataNucleus.Plugin
> >>> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires
> >>> "org.eclipse.core.runtime" but it cannot be resolved.
> >>> 2010-09-14 15:19:58,829 ERROR DataNucleus.Plugin
> >>> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires
> >>> "org.eclipse.text" but it cannot be resolved.
> >>> 2010-09-14 15:19:58,829 ERROR DataNucleus.Plugin
> >>> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires
> >>> "org.eclipse.text" but it cannot be resolved.
> >>> 2010-09-14 15:20:04,444 ERROR CliDriver (SessionState.java:printError(279)) - Failed with exception java.io.IOException:java.io.IOException: Cannot create an instance of InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
> >>> java.io.IOException: java.io.IOException: Cannot create an instance of InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
> >>>   at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:273)
> >>>   at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:133)
> >>>   at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:440)
> >>>   at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:131)
> >>>   at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:181)
> >>>   at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:287)
> >>>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >>>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>>   at java.lang.reflect.Method.invoke(Method.java:616)
> >>>   at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> >>> Caused by: java.io.IOException: Cannot create an instance of InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
> >>>   at org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:113)
> >>>   at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:214)
> >>>   at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:249)
> >>>
> >>> ----
> >>> My hadoop-env.sh:
> >>> export HADOOP_CLASSPATH_OTHER=/usr/lib/hadoop-0.20/hadoop-0.20.1+169.68-core.jar:/usr/lib/hadoop-0.20/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop-0.20/lib/*:/usr/lib/hadoop-0.20/*
> >>> export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:${HADOOP_CLASSPATH_OTHER}
> >>>
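[Editor's note: the `:${VAR}` placement in the append line is the part that commonly goes wrong ( `$:{VAR}` expands to a literal `$:` followed by literal `{VAR}` ). A hedged sketch of a robust append for hadoop-env.sh, with illustrative paths rather than the exact CDH layout:]

```shell
#!/bin/sh
# Illustrative jar/glob paths; match these to the installed Hadoop version.
HADOOP_CLASSPATH_OTHER="/usr/lib/hadoop-0.20/lib/*:/usr/lib/hadoop-0.20/*"
# ${VAR:+word} emits the separator only when HADOOP_CLASSPATH is already
# set and non-empty, so an unset variable does not leave a leading ":".
export HADOOP_CLASSPATH="${HADOOP_CLASSPATH:+${HADOOP_CLASSPATH}:}${HADOOP_CLASSPATH_OTHER}"
echo "$HADOOP_CLASSPATH"
```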
> >>>
> >>>
> >>>
> >>> Regards,
> >>> Peter Li
> >>
> >>
> >
>
> This is likely a Cloudera packaging issue; they have support forums, so
> you may want to contact them. This list is best positioned to help if you
> can reproduce the problem with a Hive release or Hive trunk.
>
