Hi Ryan,

To fix this, you can simply chmod 755 the configure script referenced in the
error.
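
For example, using the path from the error output below:

  chmod 755 /home/jim/hadoop-0.20.0/src/c++/pipes/configure

If any of the other configure scripts under src/c++ hit the same error,
the same fix applies to them.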

There is a JIRA for this (which I believe has been committed) that adds
another <chmod> task to build.xml, but the fix may not be in 0.20.0.
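
If you want to patch build.xml locally in the meantime, the Ant task would
look roughly like this (a sketch only; the exact target and path in the
real build.xml may differ):

    <!-- sketch: make the pipes configure script executable before Ant runs it -->
    <chmod perm="ugo+x" file="${basedir}/src/c++/pipes/configure"/>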

Thanks
-Todd

On Tue, Jul 14, 2009 at 11:36 AM, Ryan Smith <ryan.justin.sm...@gmail.com> wrote:

> I run ant clean, then:
> ant compile-contrib -Dlibhdfs=1 -Dcompile.c++=1
>
> Below is the error output.
>
> Is this the right way to build libhdfs?
>
> 0.18.3 and 0.19.1 build libhdfs for me just fine.
>
> record-parser:
>
> compile-rcc-compiler:
>
> compile-core-classes:
>    [javac] Compiling 1 source file to /home/jim/hadoop-0.20.0/build/classes
>
> compile-mapred-classes:
>    [javac] Compiling 1 source file to /home/jim/hadoop-0.20.0/build/classes
>
> compile-hdfs-classes:
>    [javac] Compiling 4 source files to
> /home/jim/hadoop-0.20.0/build/classes
>
> compile-core-native:
>
> check-c++-makefiles:
>
> create-c++-pipes-makefile:
>
> BUILD FAILED
> /home/jim/hadoop-0.20.0/build.xml:1414: Execute failed:
> java.io.IOException:
> Cannot run program "/home/jim/hadoop-0.20.0/src/c++/pipes/configure" (in
> directory "/home/jim/hadoop-0.20.0/build/c++-build/Linux-i386-32/pipes"):
> java.io.IOException: error=13, Permission denied
>
