Hi Adarsh,

Could you provide some more details (questions below)? I have had issues
before with communication between my HMaster and the rest of the cluster;
the initial problem was ACLs, and my zookeeper settings were a little messed
up too (I can't recollect exactly what was wrong with zookeeper). I would
suggest that you get things up and running with the default zookeeper
settings before playing around with them. You could also run a small custom
Java program that connects to HBase directly, to confirm whether the issue
is with your HBase settings and has nothing to do with Hive.
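A minimal sketch of such a check, written against the HBase 0.20.x client API
(the class name and the quorum hostname are just placeholders; it otherwise
picks up whatever hbase-site.xml is on the classpath):

  import org.apache.hadoop.hbase.HBaseConfiguration;
  import org.apache.hadoop.hbase.HTableDescriptor;
  import org.apache.hadoop.hbase.client.HBaseAdmin;

  public class HBaseConnectivityCheck {
      public static void main(String[] args) throws Exception {
          // Same client config that Hive's HBase handler would use.
          HBaseConfiguration conf = new HBaseConfiguration();
          conf.set("hbase.zookeeper.quorum", "zk-host1");  // placeholder host

          // The constructor throws MasterNotRunningException if the master
          // cannot be reached, which is the same error Hive is reporting.
          HBaseAdmin admin = new HBaseAdmin(conf);
          System.out.println("Master running: " + admin.isMasterRunning());
          for (HTableDescriptor t : admin.listTables()) {
              System.out.println("Table: " + t.getNameAsString());
          }
      }
  }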

1) Are you running the Namenode, HMaster and Hive on the same machine?
2) Are you able to access the HMaster through its web UI? I think the
default port is 60010 (or you can check the port set in your hbase-site.xml;
see the snippet below).
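For reference, this is the hbase-site.xml property I mean; 60010 is only the
usual default, so your file may override it:

  <property>
    <name>hbase.master.info.port</name>
    <value>60010</value>
  </property>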

Thanks,
Viral

On Thu, Jan 6, 2011 at 9:53 PM, Adarsh Sharma <adarsh.sha...@orkash.com> wrote:

>  John Sichi wrote:
>
> Here is what you need to do:
>
> 1) Use svn to check out the source for Hive 0.6
>
>
> I downloaded the Hive 0.6.0 source code with the command:
>
>  svn co http://svn.apache.org/repos/asf/hive/branches/branch-0.6/hive-0.6.0
>
>
>
> 2) In your checkout, replace the HBase 0.20.3 jars with the ones from 0.20.6
>
>
> I replaced hbase-0.20.3.jar and hbase-0.20.3.test.jar with the
> hbase-0.20.6.jar and hbase-0.20.6 test jars in the
> Hive-0.6.0/hbase-handler/lib folder.
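> In shell terms, the replacement was roughly the following sketch (the HBase
> install path is a placeholder, and the exact test-jar names may differ in
> your tarball):
>
>   cd Hive-0.6.0/hbase-handler/lib
>   rm hbase-0.20.3*.jar
>   cp /path/to/hbase-0.20.6/hbase-0.20.6.jar .
>   cp /path/to/hbase-0.20.6/hbase-0.20.6-test.jar .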
>
> 3) Build Hive 0.6 from source
>
>
> Then I built the Hive package with the command *ant -Dhadoop.version=0.20.0
> package*.
> Am I doing something wrong?
>
> I also want to know why the following error occurs in hive.log:
>
> 2011-01-05 15:19:36,783 ERROR DataNucleus.Plugin 
> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires 
> "org.eclipse.core.resources" but it cannot be resolved.
>
>
> With Best Regards
>
> Adarsh Sharma
>
>
> 4) Use your new Hive build
>
>
>
> I am using this new Hive build, but I am sorry to say the error remains the same.
>
>
> JVS
>
> On Jan 6, 2011, at 2:34 AM, Adarsh Sharma wrote:
>
>
>
> Dear all,
>
> I am sorry to be posting this message again, but I have not been able to
> locate the root cause even after a lot of googling.
>
> I have been trying Hive/HBase integration for the past 2 days. I am facing
> the issue below while creating an external table in Hive.
>
> I am using hadoop-0.20.2, hbase-0.20.6, hive-0.6.0 (MySQL as metastore) and
> java-1.6.0_20. Hbase-0.20.3 has also been checked.
>
> The problem arises when I issue the command below:
>
> hive> CREATE TABLE hive_hbasetable_k(key int, value string)
>     > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
>     > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
>     > TBLPROPERTIES ("hbase.table.name" = "hivehbasek");
>
>
> FAILED: Error in metadata: 
> MetaException(message:org.apache.hadoop.hbase.MasterNotRunningException
>         at 
> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.getMaster(HConnectionManager.java:374)
>         at 
> org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:72)
>         at 
> org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:64)
>         at 
> org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:159)
>         at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:275)
>         at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:394)
>         at 
> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:2126)
>         at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:166)
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:107)
>         at 
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:55)
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:633)
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:506)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:384)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:138)
>         at 
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:197)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:302)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask
>
>
> This suggests my HMaster is not running, but I checked at IP:60010 that it
> is running, and I am able to create and insert into tables in HBase properly.
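> For example, a session of this kind from the hbase shell works fine (the
> table and column names are just for illustration):
>
>   $ hbase shell
>   hbase> status
>   hbase> create 'testtable', 'cf1'
>   hbase> put 'testtable', 'row1', 'cf1:val', 'v1'
>   hbase> scan 'testtable'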
>
> Below are the contents of my hive.log:
>
>   2011-01-05 15:19:36,783 ERROR DataNucleus.Plugin 
> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires 
> "org.eclipse.core.resources" but it cannot be resolved.
>  2011-01-05 15:19:36,783 ERROR DataNucleus.Plugin 
> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires 
> "org.eclipse.core.resources" but it cannot be resolved.
>  2011-01-05 15:19:36,785 ERROR DataNucleus.Plugin 
> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires 
> "org.eclipse.core.runtime" but it cannot be resolved.
>  2011-01-05 15:19:36,785 ERROR DataNucleus.Plugin 
> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires 
> "org.eclipse.core.runtime" but it cannot be resolved.
>  2011-01-05 15:19:36,786 ERROR DataNucleus.Plugin 
> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires 
> "org.eclipse.text" but it cannot be resolved.
>  2011-01-05 15:19:36,786 ERROR DataNucleus.Plugin 
> (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires 
> "org.eclipse.text" but it cannot be resolved.
>  2011-01-05 15:20:12,185 WARN  zookeeper.ClientCnxn 
> (ClientCnxn.java:run(967)) - Exception closing session 0x0 to 
> sun.nio.ch.selectionkeyi...@561279c8
>  java.net.ConnectException: Connection refused
>        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>        at 
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:574)
>        at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:933)
>  2011-01-05 15:20:12,188 WARN  zookeeper.ClientCnxn 
> (ClientCnxn.java:cleanup(1001)) - Ignoring exception during shutdown input
>  java.nio.channels.ClosedChannelException
>        at 
> sun.nio.ch.SocketChannelImpl.shutdownInput(SocketChannelImpl.java:638)
>        at sun.nio.ch.SocketAdaptor.shutdownInput(SocketAdaptor.java:360)
>        at 
> org.apache.zookeeper.ClientCnxn$SendThread.cleanup(ClientCnxn.java:999)
>        at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:970)
>  2011-01-05 15:20:12,188 WARN  zookeeper.ClientCnxn 
> (ClientCnxn.java:cleanup(1006)) - Ignoring exception during shutdown output
>  java.nio.channels.ClosedChannelException
>        at 
> sun.nio.ch.SocketChannelImpl.shutdownOutput(SocketChannelImpl.java:649)
>        at sun.nio.ch.SocketAdaptor.shutdownOutput(SocketAdaptor.java:368)
>        at 
> org.apache.zookeeper.ClientCnxn$SendThread.cleanup(ClientCnxn.java:1004)
>        at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:970)
>  2011-01-05 15:20:12,621 WARN  zookeeper.ClientCnxn 
> (ClientCnxn.java:run(967)) - Exception closing session 0x0 to 
> sun.nio.ch.selectionkeyi...@799dbc3b
>
>   Please help me, as I am not able to solve this problem.
>
>  Also, I want to add one more thing: my Hadoop cluster has 9 nodes, and 8 of
> them act as Datanodes, Tasktrackers and Regionservers.
>
>  Among these nodes, the zookeeper quorum property is set to list 5 of the
> Datanodes. I don't know how many servers are needed for Zookeeper in fully
> distributed mode.
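>  For reference, this is the kind of setting I mean, assuming the standard
> hbase.zookeeper.quorum property in hbase-site.xml (the hostnames are
> placeholders for the 5 nodes):
>
>    <property>
>      <name>hbase.zookeeper.quorum</name>
>      <value>node1,node2,node3,node4,node5</value>
>    </property>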
>
>
>  Best Regards
>
>  Adarsh Sharma
>
