Hello,

I found a workaround for this problem:

 -- The libhdfs files were elsewhere in the build, in
$HADOOP_HOME/build/c++/Linux-amd64-64/lib/, and not in the
$HADOOP_HOME/build/libhdfs directory that the Makefile in fuse-dfs was
pointing to.

Regards,
Kumar Ravi

From:    Kumar Ravi/Austin/IBM@IBMUS
To:      common-user@hadoop.apache.org
Date:    02/27/2012 10:22 AM
Subject: Can't build hadoop-1.0.1 -- Break building fuse-dfs

Hello,

 I am running into the following problem building hadoop-1.0.1:


-------------------------
     [exec] make[1]: Entering directory `/home/kumar/hadoop-1.0.1/src/contrib/fuse-dfs'
     [exec] make[1]: Nothing to be done for `all-am'.
     [exec] make[1]: Leaving directory `/home/kumar/hadoop-1.0.1/src/contrib/fuse-dfs'
     [exec] Making all in src
     [exec] make[1]: Entering directory `/home/kumar/hadoop-1.0.1/src/contrib/fuse-dfs/src'
     [exec] gcc -Wall -O3 -L/home/kumar/hadoop-1.0.1/build/libhdfs -lhdfs -L/lib -lfuse -L/usr/java/jdk1.6.0_27//jre/lib/amd64/server -ljvm -o fuse_dfs fuse_dfs.o fuse_options.o fuse_trash.o fuse_stat_struct.o fuse_users.o fuse_init.o fuse_connect.o fuse_impls_access.o fuse_impls_chmod.o fuse_impls_chown.o fuse_impls_create.o fuse_impls_flush.o fuse_impls_getattr.o fuse_impls_mkdir.o fuse_impls_mknod.o fuse_impls_open.o fuse_impls_read.o fuse_impls_release.o fuse_impls_readdir.o fuse_impls_rename.o fuse_impls_rmdir.o fuse_impls_statfs.o fuse_impls_symlink.o fuse_impls_truncate.o fuse_impls_utimens.o fuse_impls_unlink.o fuse_impls_write.o
     [exec] /usr/bin/ld: cannot find -lhdfs
     [exec] collect2: ld returned 1 exit status

-----------------------------------------------

The source was downloaded via svn from
http://svn.apache.org/repos/asf/hadoop/common/tags/release-1.0.1/, and the
ant command with the targets used was:

ant -Dlibhdfs=true -Dcompile.native=true -Dfusedfs=true -Dcompile.c++=true
-Dforrest.home=/apache-forrest-0.8/ compile-core-native compile-c++
compile-c++-examples task-controller tar record-parser compile-hdfs-classes
package -Djava5.home=/opt/sun/jdk1.5.0_22/


I am using Sun Java JDK 1.6.0_31:

java version "1.6.0_31"
Java(TM) SE Runtime Environment (build 1.6.0_31-b04)
Java HotSpot(TM) 64-Bit Server VM (build 20.6-b01, mixed mode)


I would appreciate any pointers on how to get past this problem.




Kumar Ravi
