Hi,
I am new to Hadoop and have just started writing MapReduce jobs.
Can anyone provide me a link to download the latest plugin for Hadoop
development in Eclipse 3.6+?
I have googled a lot, but all the plugins I have found are for older versions
of Eclipse.
Thanks and Regards
Utkarsh Gupta
Hi Piyush,
I think you need to override the built-in partitioning function.
You can use a function like (first field of key) % 3.
This will send all the keys with the same first field to the same reduce process.
Please correct me if I am wrong.
Thanks
Utkarsh
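The (first field of key) % 3 idea above can be sketched in plain Java. In a real job this logic would sit inside a subclass of org.apache.hadoop.mapreduce.Partitioner; the class name and the comma-separated key layout here are assumptions for illustration, not code from the thread:

```java
// Sketch of the suggested custom partitioning: keys sharing the same first
// comma-separated field always land in the same reduce partition.
// (FieldPartitioner is a hypothetical name; in Hadoop this method body would
// go into Partitioner<KEY, VALUE>#getPartition.)
public class FieldPartitioner {

    // Returns the reduce partition for a key like "672364,423746273,4234".
    public static int getPartition(String key, int numPartitions) {
        long firstField = Long.parseLong(key.split(",")[0]);
        // floorMod keeps the result non-negative even for negative fields.
        return (int) Math.floorMod(firstField, (long) numPartitions);
    }
}
```

Since the partition depends only on the first field, any two keys that share it are guaranteed to reach the same reducer, which is exactly the grouping Piyush asked about.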
From: Piyush Kansal [mailto:piyush.kan...@gmail
ClassNotFoundException.
I tried using the -libjars option with $HADOOP_HOME/bin/hadoop jar myprg.jar
-libjars
and $HADOOP_HOME/bin/hadoop jar myprg.jar -libjars
but this is not working. Please help.
Thanks and Regards
Utkarsh Gupta
Can you give the stack trace of your problem? For which class is it giving the
ClassNotFoundException (i.e. the main class or the math lib class)?
Thanks
Devaraj
From: Utkarsh Gupta [utkarsh_gu...@infosys.com]
Sent: Wednesday, April 04, 2012 12:22 PM
To: mapreduce-user
2) include the third-party jars in the /lib folder while packaging your application
3) If you are adding the jar in HADOOP_HOME/lib , you need to add this at all
nodes.
Regards
Bejoy KS
On Wed, Apr 4, 2012 at 12:55 PM, Utkarsh Gupta <utkarsh_gu...@infosys.com> wrote:
Hi Devaraj,
I have already copied the required jar files.
Utkarsh,
A log like "12/04/04 15:21:00 WARN mapred.JobClient: Use GenericOptionsParser
for parsing the arguments. Applications should implement Tool for the same."
indicates you haven't implemented the Tool approach properly (or aren't calling
its run()).
On Wed, Apr 4, 2012
In run(), get the Configuration from the Tool's base class:
Configuration conf = getConf();
This is documented at
http://hadoop.apache.org/common/docs/current/api/org/apache/hadoop/util/Tool.html
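Harsh's suggestion amounts to a driver shaped roughly like the following. This is a sketch, not code from the thread: the class name MyJob and the input/output arguments are assumptions. The key point is that with ToolRunner in place, GenericOptionsParser strips the generic options (such as -libjars) before run() sees the remaining arguments:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Sketch of a Tool-based driver (class name and paths are illustrative).
public class MyJob extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // Use the Configuration prepared by ToolRunner/GenericOptionsParser,
        // not a fresh `new Configuration()` -- otherwise -libjars is ignored.
        Configuration conf = getConf();
        Job job = new Job(conf, "my job");  // Job.getInstance(conf, ...) in newer APIs
        job.setJarByClass(MyJob.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner parses -libjars, -D, -files, etc. before calling run().
        System.exit(ToolRunner.run(new Configuration(), new MyJob(), args));
    }
}
```

Invoked as, for example, $HADOOP_HOME/bin/hadoop jar myprg.jar MyJob -libjars /path/to/dep.jar in out (jar path and directories illustrative); note the generic options must come before the job's own arguments.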
On Wed, Apr 4, 2012 at 6:25 PM, Utkarsh Gupta wrote:
> Hi Harsh,
> I have implemented Tool like this
>
> public static void main(String[] args)
-
-
672364,423746273,4234
Block2:
234,2,342,34,2,34,234,2,34,234,2,342,342
-
-
-
and this block goes to another mapper process.
How does HDFS avoid this scenario?
Thanks and Regards
Utkarsh Gupta
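For reference, the usual answer is that HDFS itself splits blindly at byte boundaries, and it is the input format's record reader that repairs the damage. The rule can be modelled in plain Java; this is an assumption-level sketch of what TextInputFormat's LineRecordReader does, not Hadoop's actual code:

```java
import java.util.ArrayList;
import java.util.List;

// Model of how a record reader handles a record that straddles a split
// boundary: a non-first split discards its leading partial line, and every
// split reads past its end to finish the last line it started, so each
// record is consumed by exactly one reader.
public class SplitReader {

    // Returns the records owned by the split [start, end) of `data`.
    public static List<String> readSplit(String data, int start, int end) {
        int pos = start;
        if (start > 0) {
            // Back up one byte and discard everything up to the next newline;
            // the previous split owns that (possibly partial) line.
            int nl = data.indexOf('\n', start - 1);
            pos = (nl < 0) ? data.length() : nl + 1;
        }
        List<String> records = new ArrayList<>();
        while (pos < end && pos < data.length()) {
            int nl = data.indexOf('\n', pos);
            if (nl < 0) {                       // last line has no newline
                records.add(data.substring(pos));
                pos = data.length();
            } else {                            // may read beyond `end`
                records.add(data.substring(pos, nl));
                pos = nl + 1;
            }
        }
        return records;
    }
}
```

Under this rule the record cut in half by the block boundary in the example above is read exactly once, by the mapper that owns the first block, even though its bytes live in two blocks.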
I checked the values while emitting from the map and checked them again in the
reducer.
I am wondering whether there is any such limitation in Hadoop, or whether it is
a configuration problem.
Thanks and Regards
Utkarsh Gupta
Can you provide a test case which can reproduce this issue?
Thanks,
+Vinod
On Thu, Jan 10, 2013 at 12:41 AM, Utkarsh Gupta <utkarsh_gu...@infosys.com> wrote:
Hi,
I am using Apache Hadoop 1.0.4 on a 10 node cluster of commodity machines with
Ubuntu 12.04 Server edition. I am having an issue.
There are no such manual limits in MR. Can you
post a reproducible test case to support your suspicion?
On Jan 16, 2013 4:34 PM, "Utkarsh Gupta" <utkarsh_gu...@infosys.com> wrote:
Hi,
Thanks for the response. There were some issues with my code; I have checked
that in detail.
All the values