[ 
https://issues.apache.org/jira/browse/HDFS-872?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12802520#action_12802520
 ] 

Hairong Kuang commented on HDFS-872:
------------------------------------

Todd, could you please take a look at the findbugs error?
     [exec] -1 overall.
     [exec]
     [exec]     +1 @author.  The patch does not contain any @author tags.
     [exec]
     [exec]     +1 tests included.  The patch appears to include 3 new or 
modified tests.
     [exec]
     [exec]     +1 javadoc.  The javadoc tool did not generate any warning 
messages.
     [exec]
     [exec]     +1 javac.  The applied patch does not increase the total number 
of javac compiler warnings.
     [exec]
     [exec]     -1 findbugs.  The patch appears to introduce 1 new Findbugs 
warnings.
     [exec]
     [exec]     +1 Eclipse classpath. The patch retains Eclipse classpath 
integrity.
Ant test-core passed.

> DFSClient 0.20.1 is incompatible with HDFS 0.20.2
> -------------------------------------------------
>
>                 Key: HDFS-872
>                 URL: https://issues.apache.org/jira/browse/HDFS-872
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: data-node, hdfs client
>    Affects Versions: 0.20.1, 0.20.2
>            Reporter: Bassam Tabbara
>            Assignee: Todd Lipcon
>             Fix For: 0.20.2
>
>         Attachments: hdfs-793-branch20.txt, hdfs-793-branch20.txt, 
> hdfs-872.txt
>
>
> After upgrading to the latest HDFS 0.20.2 (r896310 from 
> /branches/branch-0.20), old DFS clients (0.20.1) no longer seem to work. 
> HBase uses the 0.20.1 hadoop core jars, and the HBase master will no longer 
> start up. Here is the exception from the HBase master log:
> {code}
> 2010-01-06 09:59:46,762 WARN org.apache.hadoop.hdfs.DFSClient: DFS Read: 
> java.io.IOException: Could not obtain block: blk_3380512596555557728_1002 
> file=/hbase/hbase.version
>         at 
> org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1788)
>         at 
> org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1616)
>         at 
> org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1743)
>         at 
> org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1673)
>         at java.io.DataInputStream.readUnsignedShort(DataInputStream.java:320)
>         at java.io.DataInputStream.readUTF(DataInputStream.java:572)
>         at org.apache.hadoop.hbase.util.FSUtils.getVersion(FSUtils.java:189)
>         at org.apache.hadoop.hbase.util.FSUtils.checkVersion(FSUtils.java:208)
>         at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:208)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
> Method)
>         at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1241)
>         at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1282)
> 2010-01-06 09:59:46,763 FATAL org.apache.hadoop.hbase.master.HMaster: Not 
> starting HMaster because:
> java.io.IOException: Could not obtain block: blk_3380512596555557728_1002 
> file=/hbase/hbase.version
>         at 
> org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1788)
>         at 
> org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1616)
>         at 
> org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1743)
>         at 
> org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1673)
>         at java.io.DataInputStream.readUnsignedShort(DataInputStream.java:320)
>         at java.io.DataInputStream.readUTF(DataInputStream.java:572)
>         at org.apache.hadoop.hbase.util.FSUtils.getVersion(FSUtils.java:189)
>         at org.apache.hadoop.hbase.util.FSUtils.checkVersion(FSUtils.java:208)
>         at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:208)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
> Method)
>         at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1241)
>         at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1282)
> {code}
> If I switch the hadoop jars in the hbase/lib directory to the 0.20.2 version, it 
> works well, which is what led me to open this bug here rather than in the HBASE 
> project.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.