Hi Rahul
I am looking into research on HDFS. Can you help me out? Please write to me.
Regards
Prof. Mahesh Maurya
Assistant Professor, MPSTME
NMIMS University
Mob. 09773314010
From: rahul patodi [mailto:patodira...@gmail.com]
Sent: Thursday, January 27, 2011 10:47 PM
To: mapreduce-user@hadoop.a
I also tried forking a new thread that periodically calls setStatus() and
progress() in my reduce task, but it did not help.
BTW, I'm using Hadoop 0.21.0; is this a bug?
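The keep-alive pattern being described can be sketched in plain Java. Here a counter stands in for the Hadoop calls; in a real reduce task the scheduled job would invoke context.progress() / context.setStatus() instead (the class and method names below are illustrative, not Hadoop API). Note the thread did not help on 0.21.0 as reported above, so this only shows the usual pattern, not a confirmed fix:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class HeartbeatSketch {

    // Runs a daemon reporter thread for roughly workMillis, ticking every
    // periodMillis, and returns how many ticks fired. In a reduce task the
    // tick would call context.progress() instead of bumping a counter.
    static int runReporter(long workMillis, long periodMillis)
            throws InterruptedException {
        AtomicInteger beats = new AtomicInteger();
        ScheduledExecutorService reporter =
                Executors.newSingleThreadScheduledExecutor(r -> {
                    Thread t = new Thread(r, "progress-reporter");
                    t.setDaemon(true); // never keeps the JVM alive on its own
                    return t;
                });
        reporter.scheduleAtFixedRate(beats::incrementAndGet,
                0, periodMillis, TimeUnit.MILLISECONDS);

        Thread.sleep(workMillis); // stands in for the long-running reduce work
        reporter.shutdown();
        return beats.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("heartbeats sent: " + runReporter(300, 50));
    }
}
```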
Thanks
Anfernee
On Thu, Jan 27, 2011 at 10:15 PM, Ivan Leonardi wrote:
> I had the same problem! Try to build a little testing suite
The reason was that I set the mapred-site.xml to use the new api. Thanks,
On Thu, Jan 27, 2011 at 5:04 PM, Chase Bradford
wrote:
> That's very puzzling, because I don't see any reason for the new API
> to get activated. I'm pretty sure that's what's happening though,
> based on this section of the exception's call stack:
Hi Praveen,
The configuration files in the various *conf/* directories of the Hadoop
installation need to be on the *CLASSPATH* of your Java application for them
to be found and applied.
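What "on the CLASSPATH" means here can be checked with a few lines of plain Java: Hadoop's Configuration resolves core-site.xml and friends as classpath resources, so if getResource() returns null for them, Configuration silently falls back to its built-in defaults. This check only mirrors the lookup; it is not Hadoop API:

```java
public class ClasspathCheck {
    public static void main(String[] args) {
        // These are the standard resource names Hadoop's Configuration loads.
        String[] files = {"core-site.xml", "hdfs-site.xml", "mapred-site.xml"};
        for (String f : files) {
            java.net.URL url =
                    ClasspathCheck.class.getClassLoader().getResource(f);
            System.out.println(f + " -> "
                    + (url != null ? url : "NOT on classpath"));
        }
    }
}
```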
--
*Regards*,
Rahul Patodi
Software Engineer,
Impetus Infotech (India) Pvt Ltd,
www.impetus.com
Mob:0990707441
That's very puzzling, because I don't see any reason for the new API
to get activated. I'm pretty sure that's what's happening though,
based on this section of the exception's call stack:
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.hadoop.mapred.MapTask.
You can try increasing the task timeout for jobs like these to a
'good-enough' amount, although I am wondering why the reporting from
cleanups isn't keeping it alive. It could be that each iteration
of yours is exceeding the timeout limit of 600s, thereby not
sending out a status report at a
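As a sketch, the timeout can be raised in the job configuration (assuming the 0.21-era property name mapred.task.timeout; later releases rename it mapreduce.task.timeout):

```java
import org.apache.hadoop.conf.Configuration;

public class TimeoutConfig {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Default is 600000 ms (10 minutes).
        conf.setLong("mapred.task.timeout", 30 * 60 * 1000L); // 30 minutes
        // A value of 0 disables the timeout entirely -- usually unwise,
        // since a genuinely hung task would then never be killed.
        System.out.println(conf.get("mapred.task.timeout"));
    }
}
```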
[code]
package org.apache.hadoop.examples;
import java.io.IOException;
import java.math.BigDecimal;
import java.util.Iterator;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BooleanWritable;
import o
That should be fine, but mapreduce.Mapper.map has this signature:
map(K key, V value, Context)
Your PiEstimator map signature doesn't match, so it's not overriding
the proper function and is never getting called by the framework.
Could you paste your complete PiMapper class definition and the se
Yes, you're right, but I haven't understood what you said.
- Why is this happening? Is it related to the
mapred.mapper.new-api property?
- Why doesn't the function signature match? I mean, what's the reason
this is happening? I don't recall changing the code in a way that
could give th
Yes, that's the one that's being used ( o.a.h.mapreduce.Mapper ). Is this
not the right one to use?
On Thu, Jan 27, 2011 at 3:40 PM, Chase Bradford
wrote:
> Are you sure the function signature for your Mapper's map matches the super
> class, and that you specified your Map class in the job setup?
Are you sure the function signature for your Mapper's map matches the super
class, and that you specified your Map class in the job setup? It sounds a bit
like the base o.a.h.mapreduce.Mapper map implementation is being used instead.
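The pitfall can be reproduced in plain Java (the classes below are illustrative, not Hadoop's, though BaseMapper.run() loosely mirrors what o.a.h.mapreduce.Mapper.run() does): a map method whose parameter types don't match the superclass OVERLOADS instead of overriding, so the framework's dispatch still lands on the base identity implementation:

```java
// Minimal model of the overriding pitfall; names are illustrative.
class BaseMapper<K, V> {
    protected String map(K key, V value) {  // identity, like o.a.h.mapreduce.Mapper
        return key + "=" + value;
    }
    public String run(K key, V value) {     // the "framework" entry point
        return map(key, value);             // dynamic dispatch happens here
    }
}

class MyMapper extends BaseMapper<Long, String> {
    // WRONG parameter types: this overloads map instead of overriding it.
    // Adding @Override here would turn it into a compile error -- the
    // easiest way to catch the mistake.
    protected String map(String key, String value) {
        return "custom:" + key + value;
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        // Base identity runs, not the custom version: prints "1=x"
        System.out.println(new MyMapper().run(1L, "x"));
    }
}
```

Annotating the intended override with @Override makes the compiler reject a mismatched signature instead of silently running the identity map.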
On Jan 27, 2011, at 2:36 AM, Pedro Costa wrote:
> The map o
If there isn't a clean translation to and from a string, then you can wrap an
ObjectOutputStream around an FSDataOutputStream to HDFS and use the DistributedCache
to localize it. Your task could then read it using Java's ObjectInputStream and
FileInputStream.
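The round trip itself is plain java.io. This sketch uses an in-memory buffer where a real job would wrap the output stream returned by FileSystem.create() (for the write to HDFS) and open the DistributedCache-localized file with FileInputStream (for the read); those substitutions are noted in the comments:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.HashMap;

public class SerializationSketch {

    // Serialize then deserialize an object; swap the in-memory byte
    // streams for HDFS streams in a real job.
    static Object roundTrip(Serializable obj) throws Exception {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(sink)) {
            out.writeObject(obj);   // in a job: wrap fs.create(path) instead
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(sink.toByteArray()))) {
            return in.readObject(); // in a task: wrap a FileInputStream
        }                           // over the localized cache file
    }

    public static void main(String[] args) throws Exception {
        HashMap<String, Integer> side = new HashMap<>();
        side.put("answer", 42);
        System.out.println(roundTrip(side)); // prints {answer=42}
    }
}
```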
On Jan 26, 2011, at 11:00 PM, Joan wrot
I had the same problem! Try to build a little testing suite to actually
see how much time is required by your algorithm. I discovered that
mine was taking 18 minutes!
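A minimal version of such a timing check might look like this; the work() body is just a stand-in for one iteration of the reduce logic being measured:

```java
import java.util.concurrent.TimeUnit;

public class TimingSketch {

    // Stand-in for one iteration of the "massive work" in the reduce task.
    static long work(int n) {
        long acc = 0;
        for (int i = 0; i < n; i++) acc += (long) i * i;
        return acc;
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        work(5_000_000);
        long ms = TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start);
        // Compare this against the 600 s task timeout discussed in the thread.
        System.out.println("one iteration took " + ms + " ms");
    }
}
```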
Actually, I guess that your problem lies in your comment "massive work".
Ivan
2011/1/27 Anfernee Xu :
> This question has been
This question has been asked before, but I tried the suggested solutions, such
as calling Context.setStatus() or progress(), and neither of them helped.
Please advise. My reduce task is doing some CPU-intensive work; below is my
code snippet:
@Override
protected void reduce(Text input, Iterable
The map output classes are well defined:
keyClass: class org.apache.hadoop.io.BooleanWritable - valClass: class
org.apache.hadoop.io.LongWritable
but executing the pi example, the values that map function passes is:
keyClass: class org.apache.hadoop.io.LongWritable - valClass: class
org.apache.hadoo
Thanks Nicholas, but it didn't work.
Can I do remote debugging on the Hadoop examples? I'd really like to put a
breakpoint in the Pi class.
Thanks,
On Wed, Jan 26, 2011 at 6:46 PM, Tsz Wo (Nicholas), Sze
wrote:
> Okay, I got it now. You were talking about your programs but not the
> PiEstimator