Petru,

 This is incredibly useful, thanks!

 Do you mind adding this to the hadoop-wiki?

thanks,
Arun

On Dec 9, 2011, at 8:04 AM, Petru Dimulescu wrote:

> Hello, this mail started out as a question; in the meantime I found the 
> solution, so hopefully it will help someone.
> 
> If you want to build fuse-dfs on the 0.23 snapshot branch, on Ubuntu Linux 
> 11.10 (important, as you'll see :), first you need to run mvn package -Pnative 
> in hadoop-common/hadoop-hdfs-project/hadoop-hdfs. You'll get a target/native/ 
> subdir; go there and run make install so that libhdfs.so ends up in a 
> system lib dir.
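> 
> A sketch of those two steps (the sudo is my assumption for writing into a 
> system dir; adjust paths to your checkout):
> 
> $ cd hadoop-common/hadoop-hdfs-project/hadoop-hdfs
> $ mvn package -Pnative
> $ cd target/native
> $ sudo make install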
> 
> Then I went to hadoop-hdfs/src/contrib and typed:
> 
> $ ant compile -Dfusedfs=1
> 
> It complained about not having 
> hadoop-common/hadoop-hdfs-project/hadoop-hdfs/ivy/libraries.properties -- 
> that (empty) file is in 
> hadoop-common/hadoop-hdfs-project/hadoop-hdfs/src/contrib/fuse-dfs/ivy/libraries.properties.
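> 
> Copying it over should be enough, something like this from the hadoop-hdfs 
> module root (the mkdir is just in case ivy/ does not exist yet):
> 
> $ mkdir -p ivy
> $ cp src/contrib/fuse-dfs/ivy/libraries.properties ivy/
> 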
> After I copied it to the expected place, I got:
> 
>      [exec] In file included from fuse_impls.h:26:0,
>      [exec]                  from fuse_dfs.c:21:
>      [exec] fuse_context_handle.h:22:18: fatal error: hdfs.h: No such file or directory
> 
> I just copied hdfs.h from the src/main/native dir to /usr/local/include, to 
> make it happy quickly.
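> 
> For example (a sketch, run from the hadoop-hdfs module root; the sudo is my 
> assumption):
> 
> $ sudo cp src/main/native/hdfs.h /usr/local/include/
> 
> The next error: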
> 
>      [exec] fuse_impls_write.c: In function ‘dfs_write’:
>      [exec] fuse_impls_write.c:38:16: warning: cast to pointer from integer of different size [-Wint-to-pointer-cast]
>      [exec] fuse_dfs.o: In function `is_protected':
>      [exec] /home/petru/work/ubeeko/hadoo.apache.org/0.23/hadoop-common/hadoop-hdfs-project/hadoop-hdfs/src/contrib/fuse-dfs/src/fuse_dfs.c:27: undefined reference to `fuse_get_context'
> 
> This is because of this: 
> https://bugs.launchpad.net/ubuntu/+source/fuse/+bug/878612
> (Ubuntu 11.10's toolchain links with --as-needed by default, so libraries 
> listed before the object files that need them get dropped.)
> 
> So just edit the failing link command, which looks something like this:
> 
> gcc -Wall -g -Wall -O3 \
>   -L/home/petru/work/ubeeko/hadoo.apache.org/0.23/hadoop-common/hadoop-hdfs-project/hadoop-hdfs/build/c++/Linux-i386-32/lib -lhdfs \
>   -L/lib -lfuse \
>   -L/usr/local/java/jdk/jre/lib/i386/server -ljvm \
>   -o fuse_dfs fuse_dfs.o fuse_options.o fuse_trash.o fuse_stat_struct.o \
>   fuse_users.o fuse_init.o fuse_connect.o fuse_impls_access.o \
>   fuse_impls_chmod.o fuse_impls_chown.o fuse_impls_create.o fuse_impls_flush.o \
>   fuse_impls_getattr.o fuse_impls_mkdir.o fuse_impls_mknod.o fuse_impls_open.o \
>   fuse_impls_read.o fuse_impls_release.o fuse_impls_readdir.o \
>   fuse_impls_rename.o fuse_impls_rmdir.o fuse_impls_statfs.o \
>   fuse_impls_symlink.o fuse_impls_truncate.o fuse_impls_utimens.o \
>   fuse_impls_unlink.o fuse_impls_write.o
> 
> by moving all the -L and -l flags to the end, then:
> 
> $ cd src/
> $ gcc -Wall -g -Wall -O3 -o fuse_dfs \
>     fuse_dfs.o fuse_options.o fuse_trash.o fuse_stat_struct.o \
>     fuse_users.o fuse_init.o fuse_connect.o fuse_impls_access.o \
>     fuse_impls_chmod.o fuse_impls_chown.o fuse_impls_create.o fuse_impls_flush.o \
>     fuse_impls_getattr.o fuse_impls_mkdir.o fuse_impls_mknod.o fuse_impls_open.o \
>     fuse_impls_read.o fuse_impls_release.o fuse_impls_readdir.o \
>     fuse_impls_rename.o fuse_impls_rmdir.o fuse_impls_statfs.o \
>     fuse_impls_symlink.o fuse_impls_truncate.o fuse_impls_utimens.o \
>     fuse_impls_unlink.o fuse_impls_write.o \
>     -L/home/petru/work/ubeeko/hadoo.apache.org/0.23/hadoop-common/hadoop-hdfs-project/hadoop-hdfs/build/c++/Linux-i386-32/lib -lhdfs \
>     -L/lib -lfuse \
>     -L/usr/local/java/jdk/jre/lib/i386/server -ljvm
> 
> There, hope that helps someone. Don't you just love autoconf?
> 
