[ https://issues.apache.org/jira/browse/HADOOP-7979?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Arun C Murthy reassigned HADOOP-7979:
-------------------------------------

    Assignee: Michael Noll

> Native code: configure LDFLAGS and CXXFLAGS to fix the build on systems like
> Ubuntu 11.10
> -----------------------------------------------------------------------------------------
>
>                 Key: HADOOP-7979
>                 URL: https://issues.apache.org/jira/browse/HADOOP-7979
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 0.24.0
>         Environment: Ubuntu 11.10+
>            Reporter: Michael Noll
>            Assignee: Michael Noll
>             Fix For: 0.24.0
>
>         Attachments: HADOOP-7979.trunk.v1.txt
>
>
> I noticed that the build of Hadoop trunk (0.24) and of the 1.0/0.20.20x
> branches fails on Ubuntu 11.10 when trying to include the native code in the
> build. The reason is that the default behavior of {{ld}} was changed in
> Ubuntu 11.10.
>
> *Background*
>
> From [Ubuntu 11.10 Release
> Notes|https://wiki.ubuntu.com/OneiricOcelot/ReleaseNotes#GCC_4.6_Toolchain]:
> {code}
> The compiler passes by default two additional flags to the linker:
> [...snip...]
> -Wl,--as-needed   with this option the linker will only add a DT_NEEDED tag
> for a dynamic library mentioned on the command line if the library is
> actually used.
> {code}
> This change was apparently already planned for 11.04 but was eventually
> reverted in that final release. From [11.04 Toolchain
> Transition|https://wiki.ubuntu.com/NattyNarwhal/ToolchainTransition#Indirect_Linking_for_Shared_Libraries]:
> {quote}
> Also in Natty, ld runs with the {{\--as-needed}} option enabled by default.
> This means that, in the example above, if no symbols from {{libwheel}} were
> needed by racetrack, then {{libwheel}} would not be linked even if it was
> explicitly included in the command-line compiler flags. NOTE: The ld
> {{\--as-needed}} default was reverted for the final natty release, and will
> be re-enabled in the o-series.
> {quote}
> I already ran into the same issue with Hadoop-LZO
> (https://github.com/kevinweil/hadoop-lzo/issues/33); see that link and the
> patch there for more details. For Hadoop, the problematic configure script
> is {{native/configure}}.
>
> *How to reproduce*
>
> There are two ways to reproduce, depending on the OS you have at hand.
> 1. Use a stock Ubuntu 11.10 box and run a build that also compiles the
> native libs:
> {code}
> # in the top level directory of the 'hadoop-common' repo,
> # i.e. where the BUILDING.txt file resides
> $ mvn -Pnative compile
> {code}
> 2. If you do not have Ubuntu 11.10 at hand, simply add {{-Wl,\--as-needed}}
> explicitly to {{LDFLAGS}} (see the sketch after this list). This makes
> {{ld}} behave like it does by default on Ubuntu 11.10.
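> As a rough illustration of option 2 (this is only a sketch: it assumes an
> {{LDFLAGS}} value exported in the environment reaches {{native/configure}}
> unchanged, which may not hold for every branch or build setup):
> {code}
> # Emulate the Ubuntu 11.10 linker default on any system (illustrative only)
> $ export LDFLAGS="-Wl,--as-needed"
> $ mvn -Pnative compile
> {code}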
>
> *Error message (for trunk/0.24)*
>
> Running the above build command will produce the following output (I added
> {{-e -X}} switches to mvn).
> {code}
> [DEBUG] Executing: /bin/sh -l -c cd /home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native && make DESTDIR=/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/target install
> [INFO] /bin/bash ./libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I. -I/usr/lib/jvm/default-java/include -I/usr/lib/jvm/default-java/include/linux -I/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/src -I/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/javah -I/usr/local/include -g -Wall -fPIC -O2 -m64 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF .deps/ZlibCompressor.Tpo -c -o ZlibCompressor.lo `test -f 'src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c' || echo './'`src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c
> [INFO] libtool: compile: gcc -DHAVE_CONFIG_H -I. -I/usr/lib/jvm/default-java/include -I/usr/lib/jvm/default-java/include/linux -I/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/src -I/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/javah -I/usr/local/include -g -Wall -fPIC -O2 -m64 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF .deps/ZlibCompressor.Tpo -c src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c -fPIC -DPIC -o .libs/ZlibCompressor.o
> [INFO] src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c: In function 'Java_org_apache_hadoop_io_compress_zlib_ZlibCompressor_initIDs':
> [INFO] src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c:71:41: error: expected expression before ',' token
> [INFO] make: *** [ZlibCompressor.lo] Error 1
> {code}
>
> *How to fix*
>
> The fix involves adding proper settings for {{LDFLAGS}} to the build config.
> In trunk, this is {{hadoop-common-project/hadoop-common/pom.xml}}; in
> branches 1.0 and 0.20.20x, it is {{build.xml}}.
> Basically, the fix explicitly adds {{-Wl,\--no-as-needed}} to {{LDFLAGS}}.
> Special care must be taken not to add this option when running on Mac OS,
> because its version of {{ld}} does not support it (and does not need it,
> since it already behaves as desired by default).
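> The attached patch contains the actual change; the snippet below is only a
> rough sketch of the idea, not the real pom.xml/build.xml wiring. It assumes
> the build can run a small shell step that decides on the extra linker flag
> before invoking {{native/configure}}:
> {code}
> # Add --no-as-needed everywhere except Mac OS (Darwin): its linker neither
> # supports nor needs the flag (it already links as desired by default).
> if [ "$(uname -s)" != "Darwin" ]; then
>   LDFLAGS="${LDFLAGS} -Wl,--no-as-needed"
> fi
> export LDFLAGS
> ./configure    # i.e. the generated native/configure script
> {code}
> Gating on the OS name mirrors the "special care on Mac OS" note above; a
> more defensive variant could instead probe whether {{ld}} accepts the flag.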