[ 
https://issues.apache.org/jira/browse/HADOOP-2009?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13143228#comment-13143228
 ] 

Michael Noll commented on HADOOP-2009:
--------------------------------------

FYI: The root cause might be a problem with ld and how it is configured to 
behave by default.  We ran into essentially the same error when trying to 
build the hadoop-lzo libraries from [1].

I described the problem (for hadoop-lzo) at length here:
https://github.com/kevinweil/hadoop-lzo/issues/33

In a nutshell, the problem is that some systems run ld with a default setting 
of {{"\--as-needed"}} whereas others default to {{"\--no-as-needed"}}.  The 
build only works when ld is run with the latter, i.e. {{"\--no-as-needed"}}.  
We ran into this error when we tried to build hadoop-lzo on Ubuntu 11.10, 
which is the first Ubuntu version where the default behavior of ld was changed 
to {{\--as-needed}}.
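
If it helps anyone hitting this, a quick way to see the difference on a given 
machine is to build the same do-nothing program under both linker modes and 
compare the NEEDED entries (just a sketch; it assumes gcc, objdump and the 
liblzo2 development package are installed):

{code}
# Dummy program that does not use any lzo2 symbols.
echo 'int main(int argc, char **argv){return 0;}' > conftest.c

# With --as-needed the unused library is dropped, so no lzo2 NEEDED entry:
gcc -o conftest-asneeded conftest.c -Wl,--as-needed -llzo2
objdump -p conftest-asneeded | grep NEEDED

# With --no-as-needed the library is always recorded (typically liblzo2.so.2):
gcc -o conftest-noasneeded conftest.c -Wl,--no-as-needed -llzo2
objdump -p conftest-noasneeded | grep NEEDED
{code}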

Now the reason this has an effect is the way {{src/native/configure}} searches 
for the Hadoop LZO library.  It generates a dummy C program and compiles it by 
running gcc with {{-llzo2}}:

{code}
# from src/native/configure:
echo 'int main(int argc, char **argv){return 0;}' > conftest.c
# configure then compiles and inspects it, roughly: ${CC} -o conftest conftest.c -llzo2
{code}

However, the dummy C program does not actually use any lzo2 symbols, so 
whether the lzo2 library ends up linked (i.e. recorded as a NEEDED entry in 
the binary) now depends on whether ld is run with {{"\--as-needed"}} or with 
{{"\--no-as-needed"}}.

I provided a patch [2] for hadoop-lzo that fixes this problem by setting the 
{{LDFLAGS}} environment variable in Ant's {{build.xml}}.  The same fix 
resolves this build error for Hadoop 0.20.203.0, too:

{code}
    <exec dir="${build.native}" executable="sh" failonerror="true">
      <env key="OS_NAME" value="${os.name}"/>
      <env key="OS_ARCH" value="${os.arch}"/>
      <env key="JVM_DATA_MODEL" value="${sun.arch.data.model}"/>
      <env key="HADOOP_NATIVE_SRCDIR" value="${native.src.dir}"/>
      <env key="LDFLAGS" value="-Wl,--no-as-needed"/>  <!-- add this line -->
      <arg line="${native.src.dir}/configure"/>
    </exec>
{code}
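
For anyone who wants to check the effect without going through ant, the 
equivalent manual run would be something like the following (the path and the 
grep target are assumptions on my part, based on how the probe result is used):

{code}
# Run the native configure step by hand with the extra linker flag and verify
# that the probe now records the lzo2 library name instead of leaving it empty.
cd src/native
LDFLAGS="-Wl,--no-as-needed" ./configure
grep -r HADOOP_LZO_LIBRARY .   # should show the lzo2 soname, not an empty define
{code}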

If this seems like a reasonable approach (I'm not a C expert), I can provide a 
similar patch for Hadoop 0.20.20x as well.

[1] https://github.com/kevinweil/hadoop-lzo
[2] https://github.com/kevinweil/hadoop-lzo/pull/34
                
> configure script for compiling hadoop native doesn't set lzo lib name 
> correctly
> -------------------------------------------------------------------------------
>
>                 Key: HADOOP-2009
>                 URL: https://issues.apache.org/jira/browse/HADOOP-2009
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 0.13.1
>         Environment: Fedora, amd64
>            Reporter: Joydeep Sen Sarma
>            Assignee: Arun C Murthy
>
> Looks like this was already reported (but not resolved on the list): 
> http://tinyurl.com/2rwu6x
> I would like to compile libhadoop on amd64/Fedora and everything seems kosher 
> until I hit this compile error:
>      [exec] 
> /home/jssarma/fbprojects/hadoop-0.13.1/src/native/src/org/apache/hadoop/io/compress/lzo/LzoCompressor.c:116:
>  error: syntax error before ',' token
> the line in question is:
>     // Load liblzo2.so
>     liblzo2 = dlopen(HADOOP_LZO_LIBRARY, RTLD_LAZY | RTLD_GLOBAL);
> seems like this is being set by:
> configure:#define HADOOP_LZO_LIBRARY ${ac_cv_libname_lzo2}
> I tried executing the relevant part of configure by hand:
>   if test -z "`${CC} -o conftest conftest.c -llzo2 2>&1`"; then
>     if test ! -z "`which objdump`"; then
>       ac_cv_libname_lzo2="`objdump -p conftest | grep NEEDED | grep lzo2 | sed 's/\W*NEEDED\W*\(.*\)\W*$/\"\1\"/'`"
> This is not working on my system, since:
> > objdump -p conftest | grep NEEDED                         
>   NEEDED      libc.so.6
> So that would explain the compile error.  Editing the configure script 
> manually works for now.
