Does anyone know what might be causing this error? I am using Hadoop
0.20.2, and it happens when I run bin/hadoop dfs -copyFromLocal ...
10/07/09 15:51:45 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink 128.238.55.43:50010
Please see the description of xcievers at:
http://hbase.apache.org/docs/r0.20.5/api/overview-summary.html#requirements
You can confirm that you have an xcievers problem by grepping the
datanode logs for the error message quoted in the last bullet point there.
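If it does turn out to be the xcievers limit, the usual remedy is to raise it in hdfs-site.xml on each datanode and restart. A minimal sketch, assuming the (historically misspelled) dfs.datanode.max.xcievers property name used by Hadoop 0.20.x; the value 4096 is the figure the HBase requirements page of that era suggests, so verify both against the linked doc for your version:

```xml
<!-- hdfs-site.xml on each datanode (restart datanodes after changing) -->
<property>
  <name>dfs.datanode.max.xcievers</name>
  <!-- default is much lower (1024 or less in 0.20.x); 4096 is the
       commonly recommended value for HBase-style workloads -->
  <value>4096</value>
</property>
```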
On Fri, Jul 9, 2010 at 1:10 PM, Raymond
and scrubbed all of my machines, reinstalled linux (same version), and
moved from hadoop 0.20.1 to 0.20.2.
- Original Message
From: Ted Yu yuzhih...@gmail.com
To: common-user@hadoop.apache.org
Sent: Fri, July 9, 2010 4:26:30 PM
Subject: Re: Help with Hadoop runtime error