I did something like this using a global static boolean variable (a flag) while I was implementing breadth-first IDA*. In my case, I set the flag when a solution was found, and the flag was examined in the reducer.

I guess in your case, if I am not wrong, you know that if the mappers don't produce anything, the reducers won't have anything as input, so that could serve as your stopping condition.

And I had chained map-reduce jobs ( http://developer.yahoo.com/hadoop/tutorial/module4.html ) running until a solution was found.
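Roughly, the driver loop could look like the sketch below. This is just an illustration, not my actual code: IterativeDriver, BfsMapper, BfsReducer and the bfs/iter-N paths are placeholder names, and I'm using the built-in map output record counter as one possible way to detect that the mappers emitted nothing.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.TaskCounter;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class IterativeDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        int iteration = 0;
        long mapOutputRecords = 1;  // assume there is work to do for the first pass

        while (mapOutputRecords > 0) {
            Job job = new Job(conf, "bfs-iteration-" + iteration);
            job.setJarByClass(IterativeDriver.class);
            job.setMapperClass(BfsMapper.class);    // placeholder: your BFS mapper
            job.setReducerClass(BfsReducer.class);  // placeholder: your BFS reducer
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);

            // each iteration reads the frontier written by the previous one
            FileInputFormat.addInputPath(job, new Path("bfs/iter-" + iteration));
            FileOutputFormat.setOutputPath(job, new Path("bfs/iter-" + (iteration + 1)));

            job.waitForCompletion(true);

            // stop once the mappers emitted nothing, i.e. no new nodes to expand
            mapOutputRecords = job.getCounters()
                                  .findCounter(TaskCounter.MAP_OUTPUT_RECORDS)
                                  .getValue();
            iteration++;
        }
    }
}

Instead of reading the counter, you could also just check whether the previous iteration's output directory contains any records before launching the next job.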


Kind regards,

Arindam Khaled





On Dec 17, 2010, at 12:58 AM, Peng, Wei wrote:

Hi,



I am a newbie of hadoop.

Today I was struggling with a hadoop problem for several hours.



I initialize a parameter by setting the job configuration in main, e.g.:

Configuration con = new Configuration();
con.set("test", "1");
Job job = new Job(con);



Then, in the mapper class, I want to set "test" to "2". I did it with:

context.getConfiguration().set("test", "2");



Finally, in the main method, after the job has finished, I check "test" again with:

job.getConfiguration().get("test");



However, the value of "test" is still "1".



The reason I want to change the parameter inside the Mapper class is
that I want to determine when to stop the iteration in the main method.
For example, when doing breadth-first search, the search iteration
should stop when no new nodes are added for further expansion.



Your help will be deeply appreciated. Thank you.



Wei

