Re: HDFS and libhdfs
Feel free to update https://issues.apache.org/jira/browse/HDFS-1519 if you find it suitable.

2010/12/7 Petrucci Andreas:
> thanks for the replies, this solved my problems:
> http://mail-archives.apache.org/mod_mbox/hadoop-common-user/200909.mbox/%3c6f5c1d715b2da5498a628e6b9c124f040145221...@hasmsx504.ger.corp.intel.com%3e
>
> ...i think i should write a post in my blog about this night with hdfs,
> libhdfs and fuse...
RE: HDFS and libhdfs
thanks for the replies, this solved my problems:
http://mail-archives.apache.org/mod_mbox/hadoop-common-user/200909.mbox/%3c6f5c1d715b2da5498a628e6b9c124f040145221...@hasmsx504.ger.corp.intel.com%3e

...i think i should write a post in my blog about this night with hdfs, libhdfs and fuse...

> Date: Tue, 7 Dec 2010 22:44:39 -0700
> Subject: Re: HDFS and libhdfs
> From: sudhir.vallamko...@icrossing.com
> To: common-user@hadoop.apache.org
>
> I second Ed's answer. Try uninstalling whatever you installed and start fresh.
Re: HDFS and libhdfs
I second Ed's answer. Try uninstalling whatever you installed and start fresh. Whenever I see this error when trying to install a native bridge, this solution has always worked for me.

iCrossing Privileged and Confidential Information
This email message is for the sole use of the intended recipient(s) and may contain confidential and privileged information of iCrossing. Any unauthorized review, use, disclosure or distribution is prohibited. If you are not the intended recipient, please contact the sender by reply email and destroy all copies of the original message.
RE: HDFS and libhdfs
Try this and see if it works: open the build.xml file and add an env JAVA_HOME entry in the compile-core-native target. After adding it, the change should look like the example below.

On 12/7/10 5:07 PM, "common-user-digest-h...@hadoop.apache.org" wrote:

> From: Petrucci Andreas
> Date: Wed, 8 Dec 2010 02:06:26 +0200
> Subject: RE: HDFS and libhdfs
>
> yes, my JAVA_HOME is properly set. however in hadoop 0.20.2 that i'm using,
> when i run from HADOOP_HOME the command ant compile-contrib -Dlibhdfs=1
> -Dcompile.c++=1 then the tail of the output is the following:
>
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In function 'hdfsUtime':
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1488: error: 'JNIEnv' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1488: error: 'env' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1490: error: 'errno' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1494: error: 'jobject' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1494: error: expected ';' before 'jFS'
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1497: error: expected ';' before 'jPath'
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1498: error: 'jPath' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1503: error: 'jlong' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1503: error: expected ';' before 'jmtime'
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1504: error: expected ';' before 'jatime'
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1507: error: 'jthrowable' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1507: error: expected ';' before 'jExc'
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1508: error: 'jExc' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1508: error: 'jFS' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1510: error: 'jmtime' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1510: error: 'jatime' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In function 'hdfsGetHosts':
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1533: error: 'JNIEnv' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1533: error: 'env' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1535: error: 'errno' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1539: error: 'jobject' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1539: error: expected ';' before 'jFS'
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1542: error: expected ';' before 'jPath'
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1543: error: 'jPath' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1547: error: 'jvalue' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1547: error: expected ';' before 'jFSVal'
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1548: error: 'jthrowable' undeclared (first use in this function)
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1548: error: expected ';' before 'jFSExc'
>   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c+
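The build.xml snippet that "should look like below" did not survive the archive. As a purely hypothetical sketch of the change being described (forwarding JAVA_HOME from the caller's environment into the native make invocation inside compile-core-native; the property names are illustrative and assume a `<property environment="env"/>` declaration exists earlier in the file):

```xml
<!-- Hypothetical sketch only: inside the compile-core-native target,
     pass JAVA_HOME through to the native make step. The ${build.native}
     and ${make.cmd} property names are illustrative placeholders. -->
<exec dir="${build.native}" executable="${make.cmd}" failonerror="true">
  <env key="JAVA_HOME" value="${env.JAVA_HOME}"/>
</exec>
```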
RE: HDFS and libhdfs
hdfs/hdfs.c:1971: error: expected ';' before 'jPath'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1972: error: 'jPath' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1978: error: 'jobjectArray' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1978: error: expected ';' before 'jPathList'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1979: error: 'jvalue' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1979: error: expected ';' before 'jVal'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1980: error: 'jthrowable' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1980: error: expected ';' before 'jExc'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1981: error: 'jVal' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1981: error: 'jExc' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1981: error: 'jFS' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1989: error: 'jPathList' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1992: error: 'jsize' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1992: error: expected ';' before 'jPathListSize'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1993: error: 'jPathListSize' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2007: error: expected ';' before 'i'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2008: error: expected ';' before 'tmpStat'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2009: error: 'i' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2010: error: 'tmpStat' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2011: warning: implicit declaration of function 'getFileInfoFromStat'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In function 'hdfsGetPathInfo':
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2041: error: 'JNIEnv' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2041: error: 'env' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2043: error: 'errno' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2047: error: 'jobject' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2047: error: expected ';' before 'jFS'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2050: error: expected ';' before 'jPath'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2051: error: 'jPath' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2056: warning: implicit declaration of function 'getFileInfo'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2056: error: 'jFS' undeclared (first use in this function)
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In function 'hdfsFreeFileInfo':
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2080: error: 'hdfsFileInfo' has no member named 'mOwner'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2081: error: 'hdfsFileInfo' has no member named 'mOwner'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2083: error: 'hdfsFileInfo' has no member named 'mGroup'
   [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2084: error: 'hdfsFileInfo' has no member named 'mGroup'
   [exec] make: *** [hdfs.lo] Error 1

BUILD FAILED
/home/hy59045/sfakiana/hadoop
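Every identifier in the log above ('JNIEnv', 'jobject', 'jthrowable', 'jvalue', 'jsize') is declared in jni.h, so a cascade of "undeclared (first use in this function)" errors like this almost always means the compiler never found the JNI headers, not that hdfs.c itself is broken. The native ant targets derive the include path from JAVA_HOME, so a JRE-only install or a wrong JAVA_HOME reproduces exactly this failure. A small sanity-check sketch (the helper name is made up for illustration; the header layout shown is the usual Linux JDK one):

```shell
# check_jni_headers: illustrative helper, not part of the Hadoop build.
# Verifies that the JDK install at $1 ships the JNI headers that JNI code
# such as hdfs.c needs. Missing headers produce the "'JNIEnv' undeclared"
# cascade seen in the build log above.
check_jni_headers() {
  jdk="$1"
  rc=0
  for h in include/jni.h include/linux/jni_md.h; do
    if [ ! -f "$jdk/$h" ]; then
      echo "missing $jdk/$h"
      rc=1
    fi
  done
  return $rc
}

# usage: check_jni_headers "$JAVA_HOME" || echo "point JAVA_HOME at a full JDK"
```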
Re: HDFS and libhdfs
It seems that you're trying to run ant with java5. Make sure your JAVA_HOME is set properly.
--
  Take care,
Konstantin (Cos) Boudnik
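Concretely, "make sure your JAVA_HOME is set properly" usually means exporting the JDK you want ant to build with before invoking it, and putting that JDK's bin directory first on PATH so java and javac agree. The install path below is only a placeholder for whatever JDK actually lives on the machine:

```shell
# Placeholder path: substitute the actual Java 6 JDK install location.
export JAVA_HOME=/usr/lib/jvm/java-6-sun
# Ensure the java/javac that ant picks up come from the same JDK.
export PATH="$JAVA_HOME/bin:$PATH"

# Then, from the Hadoop source root (the -D paths are placeholders too):
#   ant package -Djava5.home=/path/to/jdk1.5 -Dforrest.home=/path/to/apache-forrest-0.8
```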
Re: HDFS and libhdfs
2010/12/7 Petrucci Andreas:
> However when i try ant package -Djava5.home=... -Dforrest.home=... the build
> fails and the output is the below :

I never saw this usage:

    -Djava5.home

Try:

    export JAVA_HOME=/usr/java

"Bad version number in .class file" means you are mixing and matching java versions somehow.
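When diagnosing "Bad version number in .class file", it can help to inspect the offending class file directly: after the 4-byte magic and the 2-byte minor version, bytes 7-8 of every .class file hold the class-format major version (49 for Java 5, 50 for Java 6), and an older JVM refuses anything newer than it understands. A sketch (the helper name is made up for illustration):

```shell
# class_major: illustrative helper that prints the class-file format major
# version of a compiled .class file. Layout: 4-byte magic (CA FE BA BE),
# 2-byte minor version, then the 2-byte big-endian major version.
class_major() {
  od -An -t u1 -j 6 -N 2 "$1" | awk '{ print $1 * 256 + $2 }'
}

# A JVM throws UnsupportedClassVersionError when this number is higher than
# it supports, e.g. a Java 5 VM (max 49) loading a Java 6 class (50).
```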
HDFS and libhdfs
hello there, i'm trying to compile libhdfs, but there are some problems. According to http://wiki.apache.org/hadoop/MountableHDFS i have already installed fuse. With ant compile-c++-libhdfs -Dlibhdfs=1 the build is successful.

However, when i try ant package -Djava5.home=... -Dforrest.home=... the build fails and the output is below:

   [exec]
   [exec] Exception in thread "main" java.lang.UnsupportedClassVersionError: Bad version number in .class file
   [exec]     at java.lang.ClassLoader.defineClass1(Native Method)
   [exec]     at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
   [exec]     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
   [exec]     at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
   [exec]     at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
   [exec]     at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
   [exec]     at java.security.AccessController.doPrivileged(Native Method)
   [exec]     at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
   [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
   [exec]     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
   [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
   [exec]     at org.apache.avalon.excalibur.logger.DefaultLogTargetFactoryManager.configure(DefaultLogTargetFactoryManager.java:113)
   [exec]     at org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.java:201)
   [exec]     at org.apache.avalon.excalibur.logger.LogKitLoggerManager.setupTargetFactoryManager(LogKitLoggerManager.java:436)
   [exec]     at org.apache.avalon.excalibur.logger.LogKitLoggerManager.configure(LogKitLoggerManager.java:400)
   [exec]     at org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.java:201)
   [exec]     at org.apache.cocoon.core.CoreUtil.initLogger(CoreUtil.java:607)
   [exec]     at org.apache.cocoon.core.CoreUtil.init(CoreUtil.java:169)
   [exec]     at org.apache.cocoon.core.CoreUtil.<init>(CoreUtil.java:115)
   [exec]     at org.apache.cocoon.bean.CocoonWrapper.initialize(CocoonWrapper.java:128)
   [exec]     at org.apache.cocoon.bean.CocoonBean.initialize(CocoonBean.java:97)
   [exec]     at org.apache.cocoon.Main.main(Main.java:310)
   [exec] Java Result: 1
   [exec]
   [exec] Copying broken links file to site root.
   [exec]
   [exec]
   [exec] BUILD FAILED
   [exec] /apache-forrest-0.8/main/targets/site.xml:175: Warning: Could not find file /hadoop-0.20.2/src/docs/build/tmp/brokenlinks.xml to copy.
   [exec]
   [exec] Total time: 4 seconds

BUILD FAILED
/hadoop-0.20.2/build.xml:867: exec returned: 1

any ideas what's wrong???