Hi,
Hadoop jobs are always chained into a linear, user-managed workflow (a reduce
step cannot start until all of the preceding mappers have finished, etc.). This
can be problematic for recursive tasks: in a BFS, for example, we get no output
at all until the longest branch has been fully explored.
To work around this, the idea came up of submitting a whole new Hadoop job
from within a reducer. Has anyone tried it?
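To make the pattern concrete, here is a minimal sketch of the kind of iterative driver loop I mean, with plain Python functions standing in for real Hadoop job submissions (the graph, the function names, and the frontier-expansion logic are all just illustrative, not actual Hadoop API):

```python
# Toy adjacency list; in the real setting this data would live in HDFS
# and each round would be a separate map/reduce job over it.
GRAPH = {
    "a": ["b", "c"],
    "b": ["d"],
    "c": ["d", "e"],
    "d": [],
    "e": [],
}

def run_bfs_round(graph, frontier, visited):
    """Stands in for one map/reduce pass: expand the current frontier
    by one hop, emitting only nodes we have not seen before."""
    next_frontier = set()
    for node in frontier:
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                next_frontier.add(neighbor)
    return next_frontier

def bfs_levels(graph, source):
    """Driver loop: 'submits' one job per BFS level until the frontier
    is empty -- the part that would have to run inside (or instead of)
    a reducer if jobs could launch jobs."""
    visited = {source}
    frontier = {source}
    levels = {source: 0}
    depth = 0
    while frontier:
        depth += 1
        frontier = run_bfs_round(graph, frontier, visited)
        for node in frontier:
            levels[node] = depth
        visited |= frontier
    return levels

print(bfs_levels(GRAPH, "a"))  # level of each node from source "a"
```

The point is that the number of rounds is data-dependent (the graph's depth), so it cannot be expressed as a fixed, pre-declared chain of jobs; something has to drive the loop, whether an external driver or a job launched from within a reducer.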
Thanks.