Namit Maheshwari created HDDS-536:
-------------------------------------

             Summary: ozone sh throws exception and show on command line for invalid input
                 Key: HDDS-536
                 URL: https://issues.apache.org/jira/browse/HDDS-536
             Project: Hadoop Distributed Data Store
          Issue Type: Bug
            Reporter: Namit Maheshwari
{code:java}
[root@ctr-e138-1518143905142-481027-01-000002 bin]# ./ozone sh vol info o3://as
2018-09-22 00:06:03,123 [main] ERROR - Couldn't create protocol class org.apache.hadoop.ozone.client.rpc.RpcClient exception: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.ozone.client.OzoneClientFactory.getClientProtocol(OzoneClientFactory.java:291)
	at org.apache.hadoop.ozone.client.OzoneClientFactory.getRpcClient(OzoneClientFactory.java:169)
	at org.apache.hadoop.ozone.client.OzoneClientFactory.getRpcClient(OzoneClientFactory.java:153)
	at org.apache.hadoop.ozone.client.OzoneClientFactory.getRpcClient(OzoneClientFactory.java:109)
	at org.apache.hadoop.ozone.web.ozShell.Handler.verifyURI(Handler.java:100)
	at org.apache.hadoop.ozone.web.ozShell.volume.InfoVolumeHandler.call(InfoVolumeHandler.java:49)
	at org.apache.hadoop.ozone.web.ozShell.volume.InfoVolumeHandler.call(InfoVolumeHandler.java:36)
	at picocli.CommandLine.execute(CommandLine.java:919)
	at picocli.CommandLine.access$700(CommandLine.java:104)
	at picocli.CommandLine$RunLast.handle(CommandLine.java:1083)
	at picocli.CommandLine$RunLast.handle(CommandLine.java:1051)
	at picocli.CommandLine$AbstractParseResultHandler.handleParseResult(CommandLine.java:959)
	at picocli.CommandLine.parseWithHandlers(CommandLine.java:1242)
	at picocli.CommandLine.parseWithHandler(CommandLine.java:1181)
	at org.apache.hadoop.hdds.cli.GenericCli.execute(GenericCli.java:61)
	at org.apache.hadoop.hdds.cli.GenericCli.run(GenericCli.java:52)
	at org.apache.hadoop.ozone.web.ozShell.Shell.main(Shell.java:77)
Caused by: java.net.UnknownHostException: Invalid host name: local host is: (unknown); destination host is: "as":9889; java.net.UnknownHostException; For more details see: http://wiki.apache.org/hadoop/UnknownHost
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:768)
	at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:449)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1552)
	at org.apache.hadoop.ipc.Client.call(Client.java:1403)
	at org.apache.hadoop.ipc.Client.call(Client.java:1367)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
	at com.sun.proxy.$Proxy10.getServiceList(Unknown Source)
	at org.apache.hadoop.ozone.om.protocolPB.OzoneManagerProtocolClientSideTranslatorPB.getServiceList(OzoneManagerProtocolClientSideTranslatorPB.java:751)
	at org.apache.hadoop.ozone.client.rpc.RpcClient.getScmAddressForClient(RpcClient.java:154)
	at org.apache.hadoop.ozone.client.rpc.RpcClient.<init>(RpcClient.java:126)
	... 21 more
Caused by: java.net.UnknownHostException
	at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:450)
	... 30 more
Invalid host name: local host is: (unknown); destination host is: "as":9889; java.net.UnknownHostException; For more details see: http://wiki.apache.org/hadoop/UnknownHost
[root@ctr-e138-1518143905142-481027-01-000002 bin]#
{code}
Ideally, it should just print a concise error, the way HDFS does below:
{code:java}
[hrt_qa@ctr-e138-1518143905142-483670-01-000002 hadoopqe]$ hdfs dfs -ls s3a://namit54/
18/09/22 00:31:53 INFO impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
18/09/22 00:31:53 INFO impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
18/09/22 00:31:53 INFO impl.MetricsSystemImpl: s3a-file-system metrics system started
ls: Bucket namit54 does not exist
[hrt_qa@ctr-e138-1518143905142-483670-01-000002 hadoopqe]$
{code}
and not the entire stack trace.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
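One possible shape for the fix, sketched below: catch the exception at the CLI entry point, walk the cause chain to the root cause, and print a single-line message instead of the full trace. This is only an illustrative sketch, not the actual Ozone patch; the class and method names (`CliErrorDemo`, `rootCause`, `formatError`) are hypothetical.

{code:java}
// Hypothetical sketch: print only the root-cause message at the CLI
// entry point instead of dumping the whole stack trace.
public class CliErrorDemo {

    // Walk getCause() to the deepest (root) cause, guarding
    // against self-referential cause chains.
    static Throwable rootCause(Throwable t) {
        Throwable cause = t;
        while (cause.getCause() != null && cause.getCause() != cause) {
            cause = cause.getCause();
        }
        return cause;
    }

    // One-line summary: exception type plus its message, if any.
    static String formatError(Throwable t) {
        Throwable root = rootCause(t);
        String msg = root.getMessage();
        return root.getClass().getSimpleName() + (msg == null ? "" : ": " + msg);
    }

    public static void main(String[] args) {
        // Simulate the nesting seen in this report: an
        // InvocationTargetException wrapping an UnknownHostException.
        Exception nested = new java.lang.reflect.InvocationTargetException(
            new java.net.UnknownHostException("Invalid host name: \"as\":9889"));
        // prints: UnknownHostException: Invalid host name: "as":9889
        System.err.println(formatError(nested));
    }
}
{code}

Since the shell is built on picocli, wiring such a formatter into the command handling in GenericCli would let every subcommand report invalid input the same way.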