On Tue, Jul 14, 2009 at 12:58 PM, Ryan Smith <ryan.justin.sm...@gmail.com> wrote:

> Todd,
>
> I'll try it on Fedora, thanks again.
>

You can also apply the patch from HADOOP-5611 - it should apply cleanly.
Just download it and run: patch -p0 < HADOOP-5611.patch
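
If you'd rather fix it by hand first: those "was not declared in this
scope" errors are the usual GCC 4.3+ header cleanup issue - strerror,
strchr, and strtol live in the C string/stdlib headers, which older g++
versions pulled in indirectly. I haven't re-checked the exact patch
contents, but the fix amounts to adding the missing includes at the top
of src/c++/utils/impl/StringUtils.cc, roughly:

    // GCC 4.3+ no longer pulls these in transitively:
    #include <cstring>   // strerror, strchr
    #include <cstdlib>   // strtol

(That's just a sketch - applying the real patch is the safer route.)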

-Todd


>
> -Ryan
>
> On Tue, Jul 14, 2009 at 3:55 PM, Todd Lipcon <t...@cloudera.com> wrote:
>
> > Hi Ryan,
> >
> > Sounds like HADOOP-5611:
> > https://issues.apache.org/jira/browse/HADOOP-5611
> >
> > -Todd
> >
> > On Tue, Jul 14, 2009 at 12:49 PM, Ryan Smith <ryan.justin.sm...@gmail.com> wrote:
> >
> > > Hello,
> > >
> > > My problem was I didn't have g++ installed.  :)  So I installed g++ and I re-ran:
> > >
> > >  ant compile-contrib -Dlibhdfs=1 -Dcompile.c++=1
> > >
> > > and now I get a strange error on utils.  Any ideas on this one?  It looks
> > > like it's 64-bit related, but my OS is 32-bit Ubuntu:
> > >
> > > $ uname -a
> > > Linux ubuntudev 2.6.28-11-generic #42-Ubuntu SMP Fri Apr 17 01:57:59 UTC 2009 i686 GNU/Linux
> > >
> > > Also, I tried building without -Dcompile.c++=1 and it didn't compile
> > > libhdfs, so it seems libhdfs needs it.  I'll be digging through build.xml
> > > more to figure this out.
> > > Thanks again.
> > > -Ryan
> > >
> > >
> > > -------------------------------------------------------------
> > >
> > > compile-hdfs-classes:
> > >    [javac] Compiling 4 source files to /home/rsmith/hadoop-0.20.0/build/classes
> > >
> > > compile-core-native:
> > >
> > > check-c++-makefiles:
> > >
> > > create-c++-pipes-makefile:
> > >
> > > create-c++-utils-makefile:
> > >
> > > compile-c++-utils:
> > >     [exec] depbase=`echo impl/StringUtils.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
> > >     [exec]     if g++ -DHAVE_CONFIG_H -I. -I/home/rsmith/hadoop-0.20.0/src/c++/utils -I./impl -I/home/rsmith/hadoop-0.20.0/src/c++/utils/api -Wall -g -O2 -MT impl/StringUtils.o -MD -MP -MF "$depbase.Tpo" -c -o impl/StringUtils.o /home/rsmith/hadoop-0.20.0/src/c++/utils/impl/StringUtils.cc; \
> > >     [exec]     then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
> > >     [exec] /home/rsmith/hadoop-0.20.0/src/c++/utils/impl/StringUtils.cc: In function ‘uint64_t HadoopUtils::getCurrentMillis()’:
> > >     [exec] /home/rsmith/hadoop-0.20.0/src/c++/utils/impl/StringUtils.cc:74: error: ‘strerror’ was not declared in this scope
> > >     [exec] /home/rsmith/hadoop-0.20.0/src/c++/utils/impl/StringUtils.cc: In function ‘std::string HadoopUtils::quoteString(const std::string&, const char*)’:
> > >     [exec] /home/rsmith/hadoop-0.20.0/src/c++/utils/impl/StringUtils.cc:103: error: ‘strchr’ was not declared in this scope
> > >     [exec] /home/rsmith/hadoop-0.20.0/src/c++/utils/impl/StringUtils.cc: In function ‘std::string HadoopUtils::unquoteString(const std::string&)’:
> > >     [exec] /home/rsmith/hadoop-0.20.0/src/c++/utils/impl/StringUtils.cc:144: error: ‘strtol’ was not declared in this scope
> > >     [exec] make: *** [impl/StringUtils.o] Error 1
> > >
> > > BUILD FAILED
> > > /home/rsmith/hadoop-0.20.0/build.xml:1405: exec returned: 2
> > >
> > > Total time: 5 seconds
> > >
> > > --------------------------------------------------------------
> > >
> > >
> > > On Tue, Jul 14, 2009 at 3:32 PM, Todd Lipcon <t...@cloudera.com> wrote:
> > >
> > > > Hi Ryan,
> > > >
> > > > I've never seen that issue. It sounds to me like your C installation is
> > > > screwy - what OS are you running?
> > > >
> > > > I *think* (but am not certain) that if you leave out -Dcompile.c++=1 and
> > > > just leave -Dlibhdfs, you'll avoid pipes but get libhdfs.
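> > > >
> > > > i.e., something like this (untested, just my guess at the invocation):
> > > >
> > > >     ant compile-contrib -Dlibhdfs=1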
> > > >
> > > > -Todd
> > > >
> > > > On Tue, Jul 14, 2009 at 12:16 PM, Ryan Smith <ryan.justin.sm...@gmail.com> wrote:
> > > >
> > > > > Hi Todd.
> > > > >
> > > > > Thanks, that was it.  Now I got an error during the configure script.
> > > > > Any ideas?
> > > > >
> > > > >  [exec] configure: error: C++ preprocessor "/lib/cpp" fails sanity check
> > > > >
> > > > > Please see full details below.
> > > > >
> > > > > http://pastebin.com/m79ff6ef7
> > > > >
> > > > > Also, is there a way to avoid building pipes? I just need libhdfs.
> > > > > Thanks.
> > > > > -Ryan
> > > > >
> > > > >
> > > > >
> > > > > On Tue, Jul 14, 2009 at 2:50 PM, Todd Lipcon <t...@cloudera.com> wrote:
> > > > >
> > > > > > Hi Ryan,
> > > > > >
> > > > > > To fix this you can simply chmod 755 that configure script referenced
> > > > > > in the error.
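> > > > > >
> > > > > > For example, using the path from your output (untested - and do the
> > > > > > same for the utils configure script if it complains about that one too):
> > > > > >
> > > > > >     chmod 755 /home/jim/hadoop-0.20.0/src/c++/pipes/configure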
> > > > > >
> > > > > > There is a JIRA for this that I think got committed that adds another
> > > > > > <chmod> task to build.xml, but it may not be in 0.20.0.
> > > > > >
> > > > > > Thanks
> > > > > > -Todd
> > > > > >
> > > > > > On Tue, Jul 14, 2009 at 11:36 AM, Ryan Smith <ryan.justin.sm...@gmail.com> wrote:
> > > > > >
> > > > > > > I run ant clean, then:
> > > > > > > ant compile-contrib -Dlibhdfs=1 -Dcompile.c++=1
> > > > > > >
> > > > > > > Below is the error output.
> > > > > > >
> > > > > > > Is this the right way to build libhdfs?
> > > > > > >
> > > > > > > 0.18.3 and 0.19.1 build libhdfs for me just fine.
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > > record-parser:
> > > > > > >
> > > > > > > compile-rcc-compiler:
> > > > > > >
> > > > > > > compile-core-classes:
> > > > > > >    [javac] Compiling 1 source file to /home/jim/hadoop-0.20.0/build/classes
> > > > > > >
> > > > > > > compile-mapred-classes:
> > > > > > >    [javac] Compiling 1 source file to /home/jim/hadoop-0.20.0/build/classes
> > > > > > >
> > > > > > > compile-hdfs-classes:
> > > > > > >    [javac] Compiling 4 source files to /home/jim/hadoop-0.20.0/build/classes
> > > > > > >
> > > > > > > compile-core-native:
> > > > > > >
> > > > > > > check-c++-makefiles:
> > > > > > >
> > > > > > > create-c++-pipes-makefile:
> > > > > > >
> > > > > > > BUILD FAILED
> > > > > > > /home/jim/hadoop-0.20.0/build.xml:1414: Execute failed: java.io.IOException:
> > > > > > > Cannot run program "/home/jim/hadoop-0.20.0/src/c++/pipes/configure" (in
> > > > > > > directory "/home/jim/hadoop-0.20.0/build/c++-build/Linux-i386-32/pipes"):
> > > > > > > java.io.IOException: error=13, Permission denied
> > > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
>
