[
https://issues.apache.org/jira/browse/HBASE-3873?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13043202#comment-13043202
]
stack commented on HBASE-3873:
------------------------------
@Alejandro
Yeah, I already have them installed -- that's what's odd:
{code}
stack@sv4borg231:~/hadoop-snappy-read-only$ sudo apt-get install autotools-dev
Reading package lists... Done
Building dependency tree
Reading state information... Done
autotools-dev is already the newest version.
0 upgraded, 0 newly installed, 0 to remove and 2 not upgraded.
stack@sv4borg231:~/hadoop-snappy-read-only$ mvn package
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Building Hadoop Snappy
[INFO] task-segment: [package]
[INFO] ------------------------------------------------------------------------
[INFO] [resources:resources]
[INFO] Using default encoding to copy filtered resources.
[INFO] [compiler:compile]
[INFO] Compiling 4 source files to
/home/stack/hadoop-snappy-read-only/target/classes
[INFO] [antrun:run {execution: compile}]
[INFO] Executing tasks
main:
checkpreconditions:
compilenative:
[mkdir] Created dir:
/home/stack/hadoop-snappy-read-only/target/native-src/config
[mkdir] Created dir:
/home/stack/hadoop-snappy-read-only/target/native-src/m4
[exec] Can't exec "libtoolize": No such file or directory at
/usr/bin/autoreconf line 188.
[exec] Use of uninitialized value $libtoolize in pattern match (m//) at
/usr/bin/autoreconf line 188.
[exec] Can't exec "aclocal": No such file or directory at
/usr/share/autoconf/Autom4te/FileUtils.pm line 326.
[exec] autoreconf: failed to run aclocal: No such file or directory
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] An Ant BuildException has occured: The following error occurred while
executing this line:
/home/stack/hadoop-snappy-read-only/maven/build-compilenative.xml:62: exec
returned: 1
[INFO] ------------------------------------------------------------------------
[INFO] For more information, run Maven with the -e switch
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 6 seconds
[INFO] Finished at: Thu Jun 02 20:11:01 PDT 2011
[INFO] Final Memory: 32M/677M
[INFO] ------------------------------------------------------------------------
{code}
If I install libtool I get further, and if I reinstall automake I get further still
(libtoolize comes from the libtool package and aclocal from automake, not from autotools-dev).
It also seems I have to set JAVA_HOME.
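For the record, the sequence that got me past this point was roughly the following (the JAVA_HOME path is only an example -- point it at whichever JDK is actually installed):
{code}
# libtoolize and aclocal come from the libtool and automake packages,
# not from autotools-dev
sudo apt-get install libtool automake

# example JDK location only -- adjust for your machine
export JAVA_HOME=/usr/lib/jvm/java-6-sun

mvn package
{code}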
If I redo the build I run into this issue:
{code}
...
compilenative:
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/format':
Permission denied
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/entries':
Permission denied
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/SnappyCompressor.cc.svn-base':
Permission denied
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/org_apache_hadoop_io_compress_snappy_SnappyCompressor.h.svn-base':
Permission denied
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/SnappyDecompressor.cc.svn-base':
Permission denied
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/hadoop_snappy.h.svn-base':
Permission denied
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/text-base/org_apache_hadoop_io_compress_snappy_SnappyDecompressor.h.svn-base':
Permission denied
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/src/.svn/all-wcprops':
Permission denied
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/format':
Permission denied
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/entries':
Permission denied
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/text-base/configure.ac.svn-base':
Permission denied
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/text-base/packageNativeHadoop.sh.svn-base':
Permission denied
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/text-base/Makefile.am.svn-base':
Permission denied
[exec] cp: cannot create regular file
`/home/stack/hadoop-snappy-read-only/target/native-src/native/.svn/all-wcprops':
Permission denied
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] An Ant BuildException has occured: The following error occurred while
executing this line:
/home/stack/hadoop-snappy-read-only/maven/build-compilenative.xml:54: exec
returned: 1
{code}
So I have to manually remove the target dir before re-running the build.
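In other words, something like:
{code}
# wipe the stale build output (including the .svn copies the cp step choked on)
# before kicking off the build again
cd ~/hadoop-snappy-read-only
rm -rf target
mvn package
{code}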
Now I'm hitting the same thing as this issue:
http://code.google.com/p/hadoop-snappy/issues/detail?id=2
{code}
[exec] /usr/bin/ld: cannot find -ljvm
[exec] collect2: ld returned 1 exit status
[exec] make: *** [libhadoopsnappy.la] Error 1
[exec] libtool: link: g++ -shared -nostdlib
/usr/lib/gcc/x86_64-linux-gnu/4.3.3/../../../../lib/crti.o
/usr/lib/gcc/x86_64-linux-gnu/4.3.3/crtbeginS.o src/.libs/SnappyCompressor.o
src/.libs/SnappyDecompressor.o -L/usr/local/lib /usr/local/lib/libsnappy.so
-ljvm -L/usr/lib/gcc/x86_64-linux-gnu/4.3.3
-L/usr/lib/gcc/x86_64-linux-gnu/4.3.3/../../../../lib -L/lib/../lib
-L/usr/lib/../lib -L/usr/lib/gcc/x86_64-linux-gnu/4.3.3/../../.. -lstdc++ -lm
-lc -lgcc_s /usr/lib/gcc/x86_64-linux-gnu/4.3.3/crtendS.o
/usr/lib/gcc/x86_64-linux-gnu/4.3.3/../../../../lib/crtn.o -Wl,-soname
-Wl,libhadoopsnappy.so.0 -o .libs/libhadoopsnappy.so.0.0.1
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] An Ant BuildException has occured: The following error occurred while
executing this line:
/home/stack/hadoop-snappy-read-only/maven/build-compilenative.xml:91: exec
returned: 2
...
{code}
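One workaround I've seen suggested for the -ljvm failure (I haven't verified it here) is to make the JDK's libjvm.so visible to the linker; since the link line above already passes -L/usr/local/lib, a symlink there should get picked up. The jre/lib/amd64/server path below is an assumption based on a 64-bit Sun JDK layout:
{code}
# libjvm.so ships inside the JDK and is not on the default linker search path;
# amd64/server is the typical location for a 64-bit Sun JDK (adjust as needed)
sudo ln -s $JAVA_HOME/jre/lib/amd64/server/libjvm.so /usr/local/lib/libjvm.so
sudo ldconfig
{code}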
Thought you might be interested in my experience, Alejandro.
Good stuff.
> Mavenize Hadoop Snappy JAR/SOs project dependencies
> ---------------------------------------------------
>
> Key: HBASE-3873
> URL: https://issues.apache.org/jira/browse/HBASE-3873
> Project: HBase
> Issue Type: Improvement
> Components: build
> Affects Versions: 0.90.2
> Environment: Linux
> Reporter: Alejandro Abdelnur
> Assignee: Alejandro Abdelnur
> Labels: build
> Attachments: HBASE-3873.patch
>
>
> (This JIRA builds on HBASE-3691)
> I'm working on simplifying how to use Hadoop Snappy from other Maven-based
> projects. The idea is that the hadoop-snappy JAR and the SOs (snappy and
> hadoop-snappy) would be picked up from a Maven repository (like any other
> dependencies). SO files will be picked up based on the architecture the
> build is running on (32 or 64 bits).
> For HBase this would remove the need to manually copy the snappy JAR and SOs
> (snappy and hadoop-snappy) into HADOOP_HOME/lib or HBASE_HOME/lib, and
> hadoop-snappy would be handled as a regular Maven dependency (with a trick
> for the SO files).
> The changes would affect only the pom.xml and would live in a 'snappy'
> profile, thus requiring the '-Dsnappy' option in Maven invocations to trigger
> the inclusion of the snappy JAR and SOs.
> Because hadoop-snappy (JAR and SOs) is not currently available in public Maven
> repos, until that happens HBase developers would have to check out and 'mvn
> install' hadoop-snappy, which is (IMO) simpler than what would have to be done
> once HBASE-3691 is committed.