[ https://issues.apache.org/jira/browse/HADOOP-5414?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12679429#action_12679429 ]
Ravi Phulari commented on HADOOP-5414:
--------------------------------------
A zero-length file was created after the command executed.
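For context, -touchz amounts to creating the file and then closing the output stream right away, so a zero-length file can already exist on HDFS by the time the lease-renewal warning below is logged. A rough sketch of that behavior against the FileSystem API (a simplification, not the 0.20 FsShell source; the path is just the one from the transcript):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TouchzSketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());

        // Create the file and close the stream immediately: the result is a
        // zero-length file, matching what was observed after the command.
        Path p = new Path("test/new0LenFile2");
        fs.create(p).close();

        System.out.println("length = " + fs.getFileStatus(p).getLen()); // prints 0
    }
}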
[u...@xyzhostname ~]$ hadoop fs -touchz test/new0LenFile2
09/03/05 23:31:21 WARN hdfs.DFSClient: Problem renewing lease for DFSClient_-661919204
java.io.IOException: Call to xxxxx-xxx.xxx.com/xxx.xxx.xxx.xxx:xxxx failed on local exception: java.nio.channels.ClosedByInterruptException
        at org.apache.hadoop.ipc.Client.wrapException(Client.java:765)
        at org.apache.hadoop.ipc.Client.call(Client.java:733)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
        at $Proxy0.renewLease(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at $Proxy0.renewLease(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient$LeaseChecker.renew(DFSClient.java:1006)
        at org.apache.hadoop.hdfs.DFSClient$LeaseChecker.run(DFSClient.java:1018)
        at java.lang.Thread.run(Thread.java:619)
Caused by: java.nio.channels.ClosedByInterruptException
        at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:184)
        at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:263)
        at org.apache.hadoop.net.SocketInputStream$Reader.performIO(SocketInputStream.java:55)
        at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
        at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
        at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
        at java.io.FilterInputStream.read(FilterInputStream.java:116)
        at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:276)
        at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:237)
        at java.io.DataInputStream.readInt(DataInputStream.java:370)
        at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:501)
        at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
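The "Caused by" part is standard JDK behavior: a thread blocked in an interruptible NIO channel read gets ClosedByInterruptException when it is interrupted, which suggests the DFSClient lease-renewer thread was interrupted while waiting for the renewLease RPC response. A self-contained illustration of that mechanism (plain JDK, no Hadoop involved; the class and variable names here are made up for the demo):

import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.ClosedByInterruptException;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;

public class InterruptedReadDemo {
    public static void main(String[] args) throws Exception {
        // Local server that accepts a connection but never writes anything,
        // so the client's blocking read never completes on its own.
        ServerSocketChannel server =
                ServerSocketChannel.open().bind(new InetSocketAddress("127.0.0.1", 0));

        Thread reader = new Thread(() -> {
            try (SocketChannel ch = SocketChannel.open(server.getLocalAddress())) {
                ch.read(ByteBuffer.allocate(4)); // blocks, like Client$Connection.receiveResponse above
            } catch (ClosedByInterruptException e) {
                // Same exception class as the "Caused by" in the trace above.
                System.out.println("read aborted by interrupt: " + e);
            } catch (Exception e) {
                e.printStackTrace();
            }
        });

        reader.start();
        SocketChannel accepted = server.accept(); // complete the connection, send nothing
        Thread.sleep(500);                        // let the reader block inside read()
        reader.interrupt();                       // interrupting a blocked NIO read closes the channel
        reader.join();
        accepted.close();
        server.close();
    }
}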
> IO exception while executing hadoop fs -touchz fileName
> ----------------------------------------------------------
>
> Key: HADOOP-5414
> URL: https://issues.apache.org/jira/browse/HADOOP-5414
> Project: Hadoop Core
> Issue Type: Bug
> Components: fs
> Affects Versions: 0.20.0
> Reporter: Ravi Phulari
> Assignee: Hairong Kuang
>
> Stack trace while executing the hadoop fs -touchz command:
> [u...@xyzhostname ~]$ hadoop fs -touchz test/new0LenFile2
> 09/03/05 23:31:21 WARN hdfs.DFSClient: Problem renewing lease for DFSClient_-661919204
> java.io.IOException: Call to xxxxx-xxx.xxx.com/xxx.xxx.xxx.xxx:xxxx failed on local exception: java.nio.channels.ClosedByInterruptException
>