Congratulations, Harsh!!!
On Fri, Sep 16, 2011 at 7:34 PM, Abhishek Mehta wrote:
> All the best from tresata on the election, Harsh!
>
> Cheers
>
> Abhishek Mehta ('Abhi')
> (e) abhis...@tresata.com
> (v) 980.355.9855
>
>
>
>
> On Sep 16, 2011, at 2:33 AM, Todd Lipcon wrote:
>
> > On behalf of the PMC
> Thanks,
> Deepika
>
> -----Original Message-----
> From: Chandraprakash Bhagtani [mailto:cpbhagt...@gmail.com]
> Sent: Saturday, August 28, 2010 10:40 PM
> To: general@hadoop.apache.org
> Subject: Re: Deploying my job jar on hadoop cluster
>
Deepika,
You just have to run the following command on any of the cluster nodes:
HADOOP_HOME/bin/hadoop jar
This command will automatically copy the jar to all the TaskTrackers.
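As a concrete sketch of that command (the jar name, main class, and input/output paths below are hypothetical placeholders, not taken from this thread):

```shell
# Submit the job from any cluster node; the framework ships the jar to the
# TaskTrackers for you, so no manual copying is needed.
# myjob.jar, com.example.MyJob, and the HDFS paths are illustrative only.
$HADOOP_HOME/bin/hadoop jar myjob.jar com.example.MyJob /input/path /output/path
```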
On Sun, Aug 29, 2010 at 6:07 AM, Deepika Khera wrote:
> Hi,
>
> I want to deploy my map reduce job jar on the Hadoo
Hi Devender,
Currently there are two ways to analyze the performance of a Hadoop cluster and its jobs:
1. Hadoop Vaidya: a performance diagnostic tool for Hadoop jobs that executes a
set of rules against the job counters and produces a report of areas for
performance improvement. But Hadoop Vaidya i
You can set the *mapred.max.split.size* property in mapred-site.xml to create
more splits and hence more map tasks.
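For illustration, the mapred-site.xml entry might look like this (the 64 MB cap is an assumed example value, not one given in the thread):

```xml
<!-- mapred-site.xml: cap each input split at ~64 MB so large input files
     are divided into more splits, and therefore more map tasks.
     The value is illustrative; tune it to your input sizes. -->
<property>
  <name>mapred.max.split.size</name>
  <value>67108864</value> <!-- bytes: 64 * 1024 * 1024 -->
</property>
```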
On Mon, Jan 18, 2010 at 12:51 PM, Something Something <
mailinglist...@gmail.com> wrote:
> Hello,
>
> I read the documentation about running multiple Mapper tasks, but I can't
> get multiple Mappe
> ----- Original Message -----
> From: Chandraprakash Bhagtani
> To: general@hadoop.apache.org
> Sent: Saturday, September 12, 2009 1
You need to check your cluster's Map/Reduce task capacity, i.e. how many
Map/Reduce tasks can run on the cluster at once. You can check it at
http://JobtrackerServerIP:50030. You should also check the total number of map
tasks in your job; it should be greater than the map task capacity of the
cluster.
Initial
Hi,
You should definitely change mapred.tasktracker.map.tasks.maximum and
mapred.tasktracker.reduce.tasks.maximum. If your tasks are more CPU-bound, then
you should run a number of tasks equal to the number of CPU cores; otherwise
you can run more tasks than cores. You can determine CPU and memory usage by
running the "top" command on the datanodes. Y
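A sketch of the corresponding mapred-site.xml entries on each TaskTracker node (the slot counts are illustrative assumptions, e.g. for an 8-core node running mostly CPU-bound tasks):

```xml
<!-- mapred-site.xml on each TaskTracker node. For CPU-bound jobs, keep the
     map slot count near the core count; raise it for I/O-bound jobs.
     The values 8 and 4 are example assumptions, not from the thread. -->
<property>
  <name>mapred.tasktracker.map.tasks.maximum</name>
  <value>8</value>
</property>
<property>
  <name>mapred.tasktracker.reduce.tasks.maximum</name>
  <value>4</value>
</property>
```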
Hi,
I am trying to configure the Hadoop Eclipse plugin 0.19.1 but am getting the
following exception:
An internal error occurred during: "Connecting to DFS Test".
java.lang.IllegalStateException
Please help
--
Thanks & Regards,
Chandra Prakash Bhagtani,