Hi Ryan,

I've never seen that issue. It sounds to me like your C++ toolchain is
broken - what OS are you running?
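
If you're on a Debian/Ubuntu box, that "fails sanity check" error usually
just means the C++ compiler isn't installed. Something like this often
fixes it (just a guess, since I don't know your setup):

  sudo apt-get install g++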

I *think* (but am not certain) that if you leave out -Dcompile.c++=1 and
just pass -Dlibhdfs=1, you'll skip pipes but still get libhdfs.
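
In other words, something like this (untested on my end):

  ant compile-contrib -Dlibhdfs=1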

-Todd

On Tue, Jul 14, 2009 at 12:16 PM, Ryan Smith <ryan.justin.sm...@gmail.com> wrote:

> Hi Todd.
>
> Thanks, that was it.  Now I get an error from the configure script.  Any
> ideas?
>
>  [exec] configure: error: C++ preprocessor "/lib/cpp" fails sanity check
>
> Please see full details below.
>
> http://pastebin.com/m79ff6ef7
>
> Also, is there a way to avoid building pipes? I just need libhdfs.  Thanks.
> -Ryan
>
>
>
> On Tue, Jul 14, 2009 at 2:50 PM, Todd Lipcon <t...@cloudera.com> wrote:
>
> > Hi Ryan,
> >
> > To fix this you can simply chmod 755 the configure script referenced
> > in the error.
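> >
> > In your case that should be:
> >
> >   chmod 755 /home/jim/hadoop-0.20.0/src/c++/pipes/configure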
> >
> > There is a JIRA for this (I think it got committed) that adds another
> > <chmod> task to build.xml, but the fix may not be in 0.20.0.
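> >
> > From memory, the added task looks roughly like this (the property name
> > here is my guess, not necessarily what the patch actually uses):
> >
> >   <chmod file="${c++.src.dir}/pipes/configure" perm="ugo+x"/>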
> >
> > Thanks
> > -Todd
> >
> > On Tue, Jul 14, 2009 at 11:36 AM, Ryan Smith <ryan.justin.sm...@gmail.com> wrote:
> >
> > > I run ant clean, then:
> > > ant compile-contrib -Dlibhdfs=1 -Dcompile.c++=1
> > >
> > > Below is the error output.
> > >
> > > Is this the right way to build libhdfs?
> > >
> > > 0.18.3 and 0.19.1 build libhdfs for me just fine.
> > >
> > >
> > > record-parser:
> > >
> > > compile-rcc-compiler:
> > >
> > > compile-core-classes:
> > >    [javac] Compiling 1 source file to /home/jim/hadoop-0.20.0/build/classes
> > >
> > > compile-mapred-classes:
> > >    [javac] Compiling 1 source file to /home/jim/hadoop-0.20.0/build/classes
> > >
> > > compile-hdfs-classes:
> > >    [javac] Compiling 4 source files to /home/jim/hadoop-0.20.0/build/classes
> > >
> > > compile-core-native:
> > >
> > > check-c++-makefiles:
> > >
> > > create-c++-pipes-makefile:
> > >
> > > BUILD FAILED
> > > /home/jim/hadoop-0.20.0/build.xml:1414: Execute failed: java.io.IOException:
> > > Cannot run program "/home/jim/hadoop-0.20.0/src/c++/pipes/configure"
> > > (in directory "/home/jim/hadoop-0.20.0/build/c++-build/Linux-i386-32/pipes"):
> > > java.io.IOException: error=13, Permission denied
> > >
> >
>
