The env-var is auto-created for you by the "hadoop" script when you
invoke "hadoop jar". You do not necessarily have to set it manually,
nor do you have to compile the native libs if what you're using is
pre-built for your OS.

On Tue, Apr 30, 2013 at 12:52 AM, <rkevinbur...@charter.net> wrote:
> I don't have this environment  variable. Should I create it in .bashrc AND
> /etc/profile?
>
>
> On Mon, Apr 29, 2013 at 1:55 PM, Omkar Joshi wrote:
>
>  Hi,
>
> Did you check your Ubuntu installation for the "libhadoop" binary? In my
> installation (built from the Apache source) it is present at the relative
> path:
>
> "hadoop-common-project/hadoop-common/target/native/target/usr/local/lib"
>
>
> If it is present, add that directory to your LD_LIBRARY_PATH.
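> A minimal sketch of that export (the "/usr/local/lib" directory below is
> just an assumed example; substitute wherever libhadoop.so actually lives
> on your machine):

```shell
# Prepend the directory containing libhadoop.so to the dynamic loader path.
# "/usr/local/lib" is an assumed location -- adjust it for your install.
export LD_LIBRARY_PATH="/usr/local/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"
```

> Put the export in ~/.bashrc (or the hadoop-env.sh of your installation) if
> you want it to survive new shells.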
>
>
> If it is not present, you can try rebuilding your Hadoop installation:
>
>
> "mvn clean install -Pnative -Pdist -Dtar -DskipTests"
>
>
>
> Thanks, Omkar Joshi
> Hortonworks Inc.
>
>
> On Mon, Apr 29, 2013 at 11:19 AM, Kevin Burton <rkevinbur...@charter.net>
> wrote:
> If it doesn't work what are my options? Is there source that I can download
> and compile?
>
> On Apr 29, 2013, at 10:31 AM, Ted Xu <t...@gopivotal.com> wrote:
>
> Hi Kevin,
> Native libraries are those implemented in C/C++, which provide only
> source-level portability (instead of binary-level portability, as Java
> does). That is to say, the binaries provided by the CDH4 distribution will
> in most cases be broken in your environment.
>
> To check whether your native libraries are working, you can follow the
> instructions I sent previously, quoted below:
>
> <blockquote>
> During runtime, check the Hadoop log files for your MapReduce tasks.
>
>    • If everything is all right, you will see:
>      DEBUG util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
>      INFO util.NativeCodeLoader - Loaded the native-hadoop library
>    • If something goes wrong, you will see:
>      INFO util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>
> </blockquote>
>
>
> On Mon, Apr 29, 2013 at 10:21 AM, Kevin Burton <rkevinbur...@charter.net>
> wrote:
> I looked at the link you provided and found that Ubuntu is one of the
> “supported platforms”, but it doesn’t give any information on how to obtain
> or build the library. Any idea why it is not included as part of the
> Cloudera CDH4 distribution? I followed the installation instructions
> (mostly apt-get install . . . .) but I fail to see libhadoop.so. In order
> to avoid this warning do I need to download the Apache distribution? Which
> one?
>
> For the warnings about the configuration: I looked in my configuration for
> this specific example and I don’t see ‘session.id’ used anywhere. It must
> be used by default. If so, why is the deprecated default being used?
>
> As for the two warnings about counters. I know I have not implemented any
> code for counters so again this must be something internal. Is there
> something I am doing to trigger this?
>
> So I can avoid them, what are “hadoop generic options”?
>
> Thanks again.
>
> Kevin
>
> From: Ted Xu [mailto:t...@gopivotal.com]
> Sent: Friday, April 26, 2013 10:49 PM
> To: user@hadoop.apache.org
> Subject: Re: Warnings?
>
> Hi Kevin,
>
> Please see my comments inline,
>
> On Sat, Apr 27, 2013 at 11:24 AM, Kevin Burton <rkevinbur...@charter.net>
> wrote:
> Is the native library not available for Ubuntu? If so how do I load it?
> Native libraries usually require recompilation; for more information
> please refer to the Native Libraries documentation.
>
>
> Can I tell which key is off? Since I am just starting I would want to be as
> up to date as possible. It is out of date probably because I copied my
> examples from books and tutorials.
>
> I think the warning messages already tell you that: "xxx is deprecated,
> use xxx instead...". In fact, most of the configuration keys changed from
> Hadoop 1.x to 2.x. The compatibility changes may later be documented on
> http://wiki.apache.org/hadoop/Compatibility.
>
> The main class does derive from Tool. Should I ignore this warning as it
> seems to be in error?
> Of course you can ignore this warning as long as you don't use hadoop
> generic options.
>
>
> Thank you.
>
> On Apr 26, 2013, at 7:49 PM, Ted Xu <t...@gopivotal.com> wrote:
> Hi,
>
> The first warning says Hadoop cannot load a native library, usually a
> compression codec. In that case, Hadoop will use the Java implementation
> instead, which is slower.
>
> The second is caused by the Hadoop 1.x/2.x configuration key changes.
> You're using a 1.x style key under 2.x, but Hadoop still guarantees
> backward compatibility.
>
> The third is saying that the main class of a Hadoop application is
> recommended to implement org.apache.hadoop.util.Tool, or else generic
> command line options (e.g., -D options) will not be supported.
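> For context, a typical invocation when the driver does go through
> ToolRunner looks like this (the jar name, class name, and paths are made
> up for illustration):

```shell
# Generic options such as -D, -conf, -files and -libjars must appear
# before the application's own arguments; GenericOptionsParser only
# consumes them when the main class implements Tool and is launched
# via ToolRunner.
hadoop jar wordcount.jar WordCount \
    -D mapreduce.job.reduces=2 \
    input output
```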
>
> On Sat, Apr 27, 2013 at 5:51 AM, <rkevinbur...@charter.net> wrote:
> I am running a simple WordCount m/r job and I get output but I get five
> warnings that I am not sure if I should pay attention to:
>
> 13/04/26 16:24:50 WARN util.NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
>
> 13/04/26 16:24:50 WARN conf.Configuration: session.id is deprecated.
> Instead, use dfs.metrics.session-id
>
> 13/04/26 16:24:50 WARN mapred.JobClient: Use GenericOptionsParser for
> parsing the arguments. Applications should implement Tool for the same.
>
> 13/04/26 16:24:51 WARN mapreduce.Counters: Group
> org.apache.hadoop.mapred.Task$Counter is deprecated. Use
> org.apache.hadoop.mapreduce.TaskCounter instead
>
> 13/04/26 16:24:51 WARN mapreduce.Counters: Counter name MAP_INPUT_BYTES is
> deprecated. Use FileInputFormatCounters as group name and BYTES_READ as
> counter name instead
>
> Any ideas on what these mean? The only one that I can see in the code is the
> third one. I am using GenericOptionsParser as it is part of an example that
> I copied. But I don't know why this is considered bad.
>
> Thank you.
>
>
>
>
> --
> Regards,
> Ted Xu
>
>
>
> --
> Regards,
> Ted Xu
>
>
>
> --
> Regards, Ted Xu
>



-- 
Harsh J
