Is the namenode running on my-hbase-master.com:8020 (the host in
hdfs://my-hbase-master.com:8020/apps/hbase/data/lib)?

Which HBase release are you using?

I assume hbase-site.xml is on the classpath of your app. What's the value
of hbase.rootdir?
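
If it helps, one quick way to see which configuration the client is actually
picking up is a small sketch like the one below (not from the thread); it just
prints the resolved hbase.rootdir and ZooKeeper quorum, assuming hbase-site.xml
is supposed to be on the application classpath.

    // Sketch only: prints the configuration values the HBase client resolves at runtime.
    // If hbase.rootdir comes back as a local path rather than an hdfs:// URI,
    // hbase-site.xml is probably not on the application classpath.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;

    public class CheckHBaseConfig {
        public static void main(String[] args) {
            Configuration conf = HBaseConfiguration.create();
            System.out.println("hbase.rootdir = " + conf.get("hbase.rootdir"));
            System.out.println("hbase.zookeeper.quorum = " + conf.get("hbase.zookeeper.quorum"));
        }
    }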

Cheers

On Wed, Jan 28, 2015 at 4:10 PM, Yang <teddyyyy...@gmail.com> wrote:

> we have a standalone Java program which simply tries to insert one record
> into an existing HBase table.
>
> It hit the following error, but it was able to proceed anyway. Does this
> mean the operation that triggered the error is useless? If so, shouldn't
> the useless code be removed?
>
> thanks
> Yang
>
> 2015-01-28 16:06:05,769 [pool-2-thread-1] WARN  org.apache.hadoop.hbase.util.DynamicClassLoader - Failed to identify the fs of dir hdfs://my-hbase-master.com:8020/apps/hbase/data/lib, ignored
> java.io.IOException: No FileSystem for scheme: hdfs
> at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2385)
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2392)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
> at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
> at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
> at org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:104)
> at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:201)
> at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
> at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:69)
> at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
> at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:857)
> at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:662)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:414)
> at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:393)
> at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:274)
> at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:188)
> at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:150)
> at bdp.kafka.connector.BDPConsumer.run(BDPConsumer.java:58)
>
>     Configuration config = HBaseConfiguration.create();
>     HTable table = new HTable(config, hTableName);
>
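
For reference, a self-contained version of the single-record insert described
above might look like the sketch below. The table name, row key, column family,
qualifier, and value are placeholders rather than anything from the original
code, and it assumes the 0.96/0.98-era client API that appears in the stack
trace.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class SingleRecordInsert {
        public static void main(String[] args) throws Exception {
            // Picks up hbase-site.xml (ZooKeeper quorum, hbase.rootdir) from the classpath.
            Configuration config = HBaseConfiguration.create();
            // "my_table", the row key, family, qualifier and value below are placeholders.
            HTable table = new HTable(config, "my_table");
            try {
                Put put = new Put(Bytes.toBytes("row-1"));
                put.add(Bytes.toBytes("cf"), Bytes.toBytes("qual"), Bytes.toBytes("value"));
                table.put(put);
            } finally {
                table.close();
            }
        }
    }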
