Hi,

I am running Hadoop 2.5.1 and HBase 0.98.8-hadoop2 in stand-alone mode. When I
execute the following client-side code:

public static void main(final String[] args) {
        HTableInterface table = null;
        ResultScanner resultScanner = null;
        try {
                final HBaseManager tableManager = HBaseManager.getInstance();
                table = tableManager.getHTable(HBaseConstants.TABLE_EMP_DETAILS);

                final Scan scan = new Scan();
                resultScanner = table.getScanner(scan);

                for (final Result result : resultScanner) {
                        LOG.debug("The Employee id : "
                                        + HBaseHelper.getValueFromResult(result,
                                                        HBaseConstants.COLUME_FAMILY,
                                                        HBaseConstants.EMP_ID));
                        LOG.debug("The Employee Name for id : "
                                        + HBaseHelper.getValueFromResult(result,
                                                        HBaseConstants.COLUME_FAMILY,
                                                        HBaseConstants.EMP_NAME));
                }
        } catch (final Exception ex) {
                // Log instead of silently swallowing the exception
                LOG.error("Scan of employee table failed", ex);
        } finally {
                // Release scanner and pooled table resources
                if (resultScanner != null) {
                        resultScanner.close();
                }
                if (table != null) {
                        try {
                                table.close();
                        } catch (final IOException e) {
                                LOG.error("Failed to close table", e);
                        }
                }
        }
}

When I execute the code I get the following error. I have the firewall
disabled for that port, but I still get the error:
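In case it is relevant, the client needs a client-side hbase-site.xml on its classpath so it can reach ZooKeeper first and then the region server. Below is a minimal sketch of what I understand that file should contain for a local stand-alone setup; the hostname and port values are assumptions (ZooKeeper's default client port), not copied from my actual environment:

```xml
<configuration>
  <!-- The ZooKeeper quorum the client contacts to locate the region server;
       "localhost" is an assumption for a stand-alone, single-node setup -->
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>localhost</value>
  </property>
  <!-- Default ZooKeeper client port -->
  <property>
    <name>hbase.zookeeper.property.clientPort</name>
    <value>2181</value>
  </property>
</configuration>
```

If this file is missing or points at the wrong host, the client may resolve a stale or wrong region-server address and fail with exactly this kind of connection error.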

27 Nov 2014 18:33:01,484 196074 [main] DEBUG org.apache.hadoop.ipc.RpcClient - Connecting to localhost/127.0.0.1:60020
27 Nov 2014 18:33:02,484 197074 [main] DEBUG org.apache.hadoop.ipc.RpcClient - IPC Client (21276817) connection to localhost/127.0.0.1:60020 from 394728: closing ipc connection to localhost/127.0.0.1:60020: Connection refused: no further information
java.net.ConnectException: Connection refused: no further information
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
        at org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupConnection(RpcClient.java:578)
        at org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:868)
        at org.apache.hadoop.hbase.ipc.RpcClient.getConnection(RpcClient.java:1543)
        at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1442)
        at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1661)
        at org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1719)
        at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:30363)
        at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1546)
        at org.apache.hadoop.hbase.client.HTable$2.call(HTable.java:717)
        at org.apache.hadoop.hbase.client.HTable$2.call(HTable.java:715)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:117)
        at org.apache.hadoop.hbase.client.HTable.getRowOrBefore(HTable.java:721)
        at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:144)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:1140)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1202)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1092)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1049)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getRegionLocation(HConnectionManager.java:890)
        at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:72)
        at org.apache.hadoop.hbase.client.ScannerCallable.prepare(ScannerCallable.java:125)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:116)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:93)
        at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:283)
        at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:188)
        at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:183)
        at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:110)
        at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:745)
        at org.apache.hadoop.hbase.client.HTablePool$PooledHTable.getScanner(HTablePool.java:417)
        at com.opt.assortment.dao.impl.CallHBase.main(CallHBase.java:57)

Any suggestion or help is deeply appreciated.

Thanks & Regards
Dhamodharan Ramalingam
