Try adding the log4j.properties file to the distributed cache, e.g.:
hadoop jar job.jar -config conf -files conf/log4j.properties my.package.Class arg1
-Joey
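For reference, a minimal log4j.properties override that lowers the root threshold to DEBUG might look like the following (a sketch assuming the standard log4j 1.2 properties syntax that Hadoop's stock conf/log4j.properties uses; the console appender setup mirrors the usual defaults):

```properties
# Lower the root logger from INFO to DEBUG; everything at DEBUG and above
# will now show up, from both Hadoop's classes and your own.
log4j.rootLogger=DEBUG,console

# A plain console appender, as in the stock Hadoop log4j.properties.
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
```

Shipping this file via -files puts it in each task's working directory, which is on the task classpath, so it should win over the cluster-side copy.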
On Feb 29, 2012, at 16:15, GUOJUN Zhu wrote:
>
> What I found out is that the default conf/log4j.properties set root with INFO
> and
Niels,
Sorry it has taken me so long to respond. Today has been a very crazy day.
I am just guessing what your algorithm is for auto-complete. I really don't
know, so I will just sketch a back-of-the-envelope one myself as a starting
point. My guess is that you have a few map/reduce jobs. Th
What I found out is that the default conf/log4j.properties sets the root
logger to INFO, and indeed anything at INFO or above (Hadoop's or my own
code's) shows up. However, I tried to put a new log4j.properties with a lower
threshold in the new conf directory and specify it with the "--config" option, and it did not
Hi.
I'm trying YARN + security but still cannot get a mapred example
running. Can anyone help me take a look?
My env:
- 3-slave cluster on EC2, CentOS 5.5
- nn, dn, rm, nm all started, with security enabled
- I saw java.lang.NoClassDefFoundError in the LinuxContainerExecutor error
log:
./a
Hi,
This question is for mapreduce-user not hbase-user.
+mapreduce-user
bcc hbase-user
On Wed, Feb 29, 2012 at 7:40 PM, T Vinod Gupta wrote:
> hi,
> what's the recommended way of passing arguments to M/R jobs? Based on web
> examples, the mapper and reducer classes are static classes, so if you
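One common pattern (a sketch, not tested here, assuming the org.apache.hadoop.mapreduce API) is to put parameters into the job Configuration in the driver and read them back in the mapper's setup(). The framework instantiates mapper/reducer classes reflectively with a no-arg constructor, which is why they are static classes and why constructor arguments won't work. The key name my.param and the class names below are made up for illustration:

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;

public class ParamDemo {

  public static class MyMapper extends Mapper<LongWritable, Text, Text, Text> {
    private String param;

    @Override
    protected void setup(Context context) {
      // Task side: read the value back out of the job Configuration.
      param = context.getConfiguration().get("my.param", "default");
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      context.write(new Text(param), value);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Driver side: set the parameter before the job is submitted; the
    // Configuration is serialized out to every task.
    conf.set("my.param", args[0]);

    Job job = Job.getInstance(conf, "param-demo");
    job.setJarByClass(ParamDemo.class);
    job.setMapperClass(MyMapper.class);
    // ... input/output paths, reducer, output types, then job.waitForCompletion(true)
  }
}
```

For larger payloads (lookup files, side data) the distributed cache via -files is the usual alternative, since stuffing big blobs into the Configuration bloats every task's startup.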
Robert,
On Tue, Feb 28, 2012 at 23:28, Robert Evans wrote:
> I am not sure I can help with that unless I know better what “a special
> distribution” means.
>
The thing is that this application is an "Auto Complete" feature whose
key is "the letters that have been typed so far".
Now fo