Hi,
If you want to access certain jobconf parameters in your streaming script,
streaming provides this by setting localized jobconf parameters as system
environment variables, with the "." in parameters replaced by "_" .
To set jobconf parameters for streaming jobs, you can use -D <name>=<value> on the command line.
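As a rough sketch of what that looks like from a Python streaming script (the exact environment variable names depend on your Hadoop version; the helper name `get_jobconf` and the example parameter are just for illustration):

```python
import os

def get_jobconf(name, default=None):
    """Read a localized jobconf value from the environment.

    Streaming exports jobconf parameters as env vars with "."
    replaced by "_", e.g. mapred.reduce.tasks -> mapred_reduce_tasks.
    """
    return os.environ.get(name.replace(".", "_"), default)

if __name__ == "__main__":
    # Hypothetical usage inside a mapper/reducer, assuming the job
    # was launched with e.g. -D mapred.reduce.tasks=8
    num_reducers = get_jobconf("mapred.reduce.tasks", "1")
    print(num_reducers)
```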
Thanks,
Amo
yup, only one task...
i should have mentioned that i'm using hadoop streaming. do i have
access to jobconf* if i write my tasks in python?
On Thu, Dec 3, 2009 at 4:32 PM, Jeff Zhang wrote:
> Mike,
>
> Do you mean you only have one reducer task for a Job ?
>
> You can increase the number of reducer task for one Job by setting
> JobConf.setNumReduceTasks(n)
Mike,
Do you mean you only have one reducer task for a Job ?
You can increase the number of reducer task for one Job by setting
JobConf.setNumReduceTasks(n)
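For context on why the reducer count matters: with n reduce tasks, Hadoop's default HashPartitioner assigns each key to a reducer by (key.hashCode() & Integer.MAX_VALUE) % n. A toy Python sketch of that behavior (illustration only, not Hadoop code; Python's hash() stands in for Java's hashCode()):

```python
def partition(key, num_reduce_tasks):
    # Mimics Hadoop's default HashPartitioner:
    # (hashCode & Integer.MAX_VALUE) % numReduceTasks
    return (hash(key) & 0x7FFFFFFF) % num_reduce_tasks

if __name__ == "__main__":
    # With 8 reduce tasks, keys spread across partitions 0..7;
    # with 1 reduce task, everything lands on partition 0.
    for k in ["apple", "banana", "cherry"]:
        print(k, partition(k, 8))
```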
Jeff Zhang
On Thu, Dec 3, 2009 at 2:58 PM, Mike Kendall wrote:
> i can't seem to get my cluster to run more than one reduce task...
i can't seem to get my cluster to run more than one reduce task... my
mapred-site.xml looks like this:
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>master:9001</value>
  </property>
  <property>
    <name>mapred.tasktracker.map.tasks.maximum</name>
    <value>5</value>
  </property>
  <property>
    <name>mapred.tasktracker.reduce.tasks.maximum</name>
    <value>5</value>
  </property>
  <property>
    <name>mapred.map.tasks</name>
    <value>40</value>
  </property>
  <property>
    <name>mapred.reduce.tasks</name>
    <value>8</value>
  </property>
</configuration>