I happened to have a copy of 18.1 lying about, and the JobConf is added to
the per-process runtime environment in 18.1.
The entire configuration from the JobConf object is added to the
environment, with the jobconf key names being transformed slightly: any
character in the key name that is not alphanumeric is replaced with an
underscore.
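If I recall correctly, the transformation is just "replace every character that is not a letter or digit with an underscore" (so, e.g., mapred.input.file becomes mapred_input_file). A minimal Python sketch of that mapping, as an approximation rather than Hadoop's exact code:

```python
import re

def jobconf_to_env_name(key):
    # Approximate Hadoop Streaming's key-name transformation:
    # every character that is not a letter or digit becomes '_'.
    return re.sub(r'[^A-Za-z0-9]', '_', key)

print(jobconf_to_env_name('mapred.input.file'))  # mapred_input_file
print(jobconf_to_env_name('mapred.task.id'))     # mapred_task_id
```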
Jason, do you know offhand when this feature was introduced? 0.18.x?
Thanks,
Bo
On Mon, Jun 22, 2009 at 10:58 PM, jason hadoop wrote:
> Check the process environment for your streaming tasks, generally the
> configuration variables are exported into the process environment.
>
> The Mapper input file is normally stored as some variant of
> mapred.input.file. [...]
Check the process environment for your streaming tasks, generally the
configuration variables are exported into the process environment.
The Mapper input file is normally stored as some variant of
mapred.input.file. The reducer's input is the mapper output for that reduce,
so the input file is not meaningful at the reduce stage.
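For example, a streaming mapper can pull the input file out of its environment and tag each record with the directory it came from. A minimal Python sketch, assuming the variable is exported as map_input_file (the exact name varies by Hadoop version, so inspect your task's environment to confirm it):

```python
import os

def tag_records(lines, environ=None):
    """Prefix each input record with the directory it was read from.

    The 'map_input_file' variable name is an assumption; different
    Hadoop versions export it under slightly different names, so
    check the actual task environment before relying on it.
    """
    env = os.environ if environ is None else environ
    input_dir = os.path.dirname(env.get('map_input_file', ''))
    for line in lines:
        yield '%s\t%s' % (input_dir, line.rstrip('\n'))

if __name__ == '__main__':
    # Run as a streaming mapper: read stdin, write tagged records.
    import sys
    for record in tag_records(sys.stdin):
        print(record)
```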
Hi All:
Is there any way, using Hadoop Streaming, to determine the directory from which
an input record is being read? This is straightforward in Hadoop using
InputFormats, but I am curious if the same concept can be applied to streaming.
The goal here is to read in data from 2 directories, say