Re: writing mappers and reducers question

2015-02-22 Thread Drake민영근
I suggest Standalone mode for developing mappers and reducers. But if you need to test a partitioner or combiner, you need to set up Pseudo-Distributed mode.
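Standalone (local) mode suffices for mapper/reducer work because the whole job runs in a single process with a single mapper and reducer; the map → sort → reduce flow can even be mimicked with a plain local pipeline. A minimal sketch (the word-count mapper/reducer below are hypothetical examples in the Hadoop Streaming style, not code from this thread):

```python
# Hadoop Streaming-style word count, runnable with no cluster at all.
# The pipeline below mimics what standalone mode does in one process:
# map -> sort (stand-in for the shuffle) -> reduce.
from itertools import groupby

def mapper(lines):
    """Map phase: emit a (word, 1) pair for every word."""
    for line in lines:
        for word in line.split():
            yield word, 1

def reducer(pairs):
    """Reduce phase: sum counts per key; assumes pairs arrive sorted by key."""
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    lines = ["hadoop map reduce", "map reduce map"]
    shuffled = sorted(mapper(lines))   # sorting plays the role of the shuffle
    print(dict(reducer(shuffled)))     # {'hadoop': 1, 'map': 3, 'reduce': 2}
```

A partitioner or combiner, by contrast, only comes into play when there are multiple map outputs being routed to reducers, which is why Pseudo-Distributed mode is needed to exercise them.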

Drake 민영근 Ph.D
kt NexR

On Fri, Feb 20, 2015 at 3:18 PM, unmesha sreeveni unmeshab...@gmail.com
wrote:

 You can also write MapReduce jobs in Eclipse for testing purposes. Once it
 is done, you can create a jar and run it on your single-node or multi-node
 cluster. But please note that when running in such IDEs with the Hadoop
 dependencies, there will be no input splits, multiple mappers, etc.





Re: writing mappers and reducers question

2015-02-19 Thread unmesha sreeveni
You can also write MapReduce jobs in Eclipse for testing purposes. Once it
is done, you can create a jar and run it on your single-node or multi-node
cluster. But please note that when running in such IDEs with the Hadoop
dependencies, there will be no input splits, multiple mappers, etc.
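The caveat above means an IDE run only verifies per-record logic, which is still useful: mapper and reducer functions can be unit-tested in isolation, one record or one key at a time, in the spirit of MRUnit. A sketch with hypothetical function names (not code from this thread):

```python
# Unit-testing mapper/reducer logic in isolation, MRUnit-style:
# feed one record in, assert the emitted key/value pairs, no cluster needed.

def tokenize_mapper(_, value):
    """Map phase: key (e.g. a byte offset) is ignored, value is one line."""
    return [(word.lower(), 1) for word in value.split()]

def sum_reducer(key, values):
    """Reduce phase: all values for one key arrive together."""
    return [(key, sum(values))]

# IDE-style checks: one input record, one key at a time.
assert tokenize_mapper(0, "Hadoop hadoop") == [("hadoop", 1), ("hadoop", 1)]
assert sum_reducer("hadoop", [1, 1, 1]) == [("hadoop", 3)]
print("mapper/reducer unit checks passed")
```

What such tests cannot exercise is exactly what the reply warns about: split boundaries, multiple concurrent mappers, and the real shuffle between them.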


writing mappers and reducers question

2015-02-19 Thread Jonathan Aquilina
 

Hey guys, is it safe to guess that one would need a single-node setup to
be able to write mappers and reducers for Hadoop? 

-- 
Regards,
Jonathan Aquilina
Founder Eagle Eye T
 

Re: writing mappers and reducers question

2015-02-19 Thread Shahab Yunus
Nope. You can use the Standalone setup to test things as well. Details here:
http://hadoop.apache.org/docs/r2.2.0/hadoop-project-dist/hadoop-common/SingleNodeSetup.html#Standalone_Operation

Regards,
Shahab

On Fri, Feb 20, 2015 at 12:40 AM, Jonathan Aquilina jaquil...@eagleeyet.net
 wrote:

  Hey guys, is it safe to guess that one would need a single-node setup to
 be able to write mappers and reducers for Hadoop?



 --
 Regards,
 Jonathan Aquilina
 Founder Eagle Eye T