I did everything mentioned in the link Ted posted, and the test actually
works, but using Snappy for MapReduce map output compression still fails
with "native snappy library not available".
On Wed, Jan 1, 2014 at 6:37 PM, bharath vissapragada
<bharathvissapragada1...@gmail.com> wrote:
Your natives should be in LD_LIBRARY_PATH or java.library.path for Hadoop
to pick them up. You can try adding export HADOOP_OPTS="$HADOOP_OPTS
-Djava.library.path=<path to your native libs>" to hadoop-env.sh on the
TaskTrackers (TTs) and clients/gateways, then restart the TTs and give it
another try. The reason it's working
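For anyone following along, the hadoop-env.sh change suggested above could look like the sketch below. The native-lib directory is a placeholder (it depends on where libsnappy.so / libhadoop.so actually live on your nodes), and a temp file stands in for conf/hadoop-env.sh so it can be tried safely anywhere:

```shell
# Sketch of the suggested hadoop-env.sh edit. NATIVE_DIR is a
# placeholder -- point it at wherever libsnappy.so / libhadoop.so
# live on your nodes. A temp file stands in for conf/hadoop-env.sh
# so this can be run safely anywhere.
NATIVE_DIR="/usr/lib/hadoop/lib/native/Linux-amd64-64"  # hypothetical path
HADOOP_ENV="$(mktemp)"                                  # stand-in for conf/hadoop-env.sh

# Append the export; \$HADOOP_OPTS stays literal so existing opts are kept.
cat >> "$HADOOP_ENV" <<EOF
export HADOOP_OPTS="\$HADOOP_OPTS -Djava.library.path=$NATIVE_DIR"
EOF

grep 'java.library.path' "$HADOOP_ENV"
```

In a real cluster you would edit conf/hadoop-env.sh in place on every TT and gateway, then restart the TTs so the new java.library.path takes effect.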
Date: Thu, 2 Jan 2014 13:37:46 +0200
Subject: Re: Setting up Snappy compression in Hadoop
From: am...@infolinks.com
To: user@hadoop.apache.org
This is based on recent Hadoop releases (like 2.2.0),
but something similar should apply for your release.
Regards,
.g
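Before restarting the TTs it may also be worth confirming that libsnappy is actually present in the directory you point java.library.path at. A minimal check, assuming a hypothetical native-lib path (it degrades gracefully, so it is safe to run on a box without Hadoop installed):

```shell
# Quick presence check for the Snappy native library. NATIVE_DIR is a
# hypothetical path; substitute your own.
NATIVE_DIR="${NATIVE_DIR:-/usr/lib/hadoop/lib/native/Linux-amd64-64}"
if ls "$NATIVE_DIR"/libsnappy.so* >/dev/null 2>&1; then
  STATUS="libsnappy found in $NATIVE_DIR"
else
  STATUS="libsnappy NOT found in $NATIVE_DIR"
fi
echo "$STATUS"
```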
From: bharath vissapragada [mailto:bharathvissapragada1...@gmail.com]
Sent: Thursday, January 02, 2014 5:56 AM
To: User
Subject: Re: Setting up Snappy compression in Hadoop
Hi all,
I'm running on Hadoop 1.0.4 and I'd like to use Snappy for map output
compression.
I'm adding the configurations:
configuration.setBoolean("mapred.compress.map.output", true);
configuration.set("mapred.map.output.compression.codec",
    "org.apache.hadoop.io.compress.SnappyCodec");
And I've added
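For reference, the same two properties can also be passed per job on the command line instead of being set in code, assuming the job's main class goes through ToolRunner/GenericOptionsParser. A sketch (the jar and class names are placeholders; only the option string is assembled here, so it runs without a cluster):

```shell
# The same two map-output settings, passed per job via -D instead of in
# code. The jar and main class below are placeholders.
SNAPPY_OPTS="-Dmapred.compress.map.output=true \
-Dmapred.map.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec"

# A real submission would look something like:
#   hadoop jar myjob.jar com.example.MyJob $SNAPPY_OPTS in/ out/
echo "$SNAPPY_OPTS"
```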
Please take a look at http://hbase.apache.org/book.html#snappy.compression
Cheers
On Wed, Jan 1, 2014 at 8:05 AM, Amit Sela <am...@infolinks.com> wrote:
Hi all,
I'm running on Hadoop 1.0.4 and I'd like to use Snappy for map output
compression.
I'm adding the configurations: