Copy over the commons-configuration jar too.

J-D
On Wed, Dec 21, 2011 at 2:57 PM, Aleksandr Levchuk <aleksa...@fb.com> wrote:
> As you suggested, I replaced hadoop-core-0.20-append-r1056497.jar with
> hadoop-core-0.20.205.0.jar (in hbase's lib) on the machines of the cluster,
> but now start-hbase.sh on the master gives:
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/commons/configuration/Configuration
>         at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>
>         ...
>         at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:395)
>         at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1436)
>
> Alex
>
> On 12/21/2011 02:21 PM, Jean-Daniel Cryans wrote:
>>> I added dfs.support.append=true configuration everywhere but I get the
>>> same "got version 3 expected version 4" problem.
>>
>> Setting this won't solve your version problem.
>>
>>> How do I update the jars in the hbase lib directory?
>>
>> Remove the hadoop jar that's in there, replace it with the 0.20.205 one.
>>
>>>> This makes it look like hadoop-0.20.205.0 and hbase-0.90.4 are
>>>> incompatible.
>>
>> This error really means that you didn't replace the jars correctly;
>> the version is contained in the Hadoop jar and HBase isn't involved at
>> all. From http://hbase.apache.org/book/hadoop.html:
>>
>> "Because HBase depends on Hadoop, it bundles an instance of the Hadoop
>> jar under its lib directory. The bundled jar is ONLY for use in
>> standalone mode. In distributed mode, it is critical that the version
>> of Hadoop that is out on your cluster match what is under HBase.
>> Replace the hadoop jar found in the HBase lib directory with the
>> hadoop jar you are running on your cluster to avoid version mismatch
>> issues. Make sure you replace the jar in HBase everywhere on your
>> cluster. Hadoop version mismatch issues have various manifestations
>> but often all looks like its hung up."
>>
>> J-D
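
The jar-swap the thread walks through (remove HBase's bundled Hadoop jar, copy in the cluster's jar plus commons-configuration) can be sketched as below. This is a minimal sketch run in a throwaway sandbox so the steps are visible; the directory layout, the `HBASE_LIB`/`HADOOP_HOME` names, and the commons-configuration-1.6 file name are assumptions for illustration. On a real cluster you would point at the actual install paths and repeat this on every node.

```shell
set -e

# Sandbox stand-in for a real cluster node (assumption: real installs
# would live somewhere like /usr/local/hbase and /usr/local/hadoop).
SANDBOX=$(mktemp -d)
HBASE_LIB="$SANDBOX/hbase/lib"
HADOOP_HOME="$SANDBOX/hadoop"
mkdir -p "$HBASE_LIB" "$HADOOP_HOME/lib"

# Empty stand-ins for the real jars (names taken from the thread;
# the commons-configuration version is a guess for this sketch).
touch "$HBASE_LIB/hadoop-core-0.20-append-r1056497.jar"   # jar HBase bundles
touch "$HADOOP_HOME/hadoop-core-0.20.205.0.jar"           # jar the cluster runs
touch "$HADOOP_HOME/lib/commons-configuration-1.6.jar"    # its dependency

# 1. Remove the Hadoop jar HBase ships with (standalone mode only).
rm -f "$HBASE_LIB"/hadoop-core-*.jar

# 2. Replace it with the jar the cluster actually runs.
cp "$HADOOP_HOME"/hadoop-core-0.20.205.0.jar "$HBASE_LIB/"

# 3. Copy commons-configuration too, otherwise 0.20.205's metrics code
#    throws NoClassDefFoundError: org/apache/commons/configuration/Configuration.
cp "$HADOOP_HOME"/lib/commons-configuration-*.jar "$HBASE_LIB/"

ls "$HBASE_LIB"
```

After the swap, restart HBase; repeating this on only some nodes reproduces the "got version 3 expected version 4" mismatch described above.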