Unfortunately, this does not work. Hadoop complains:
08/08/21 18:04:46 ERROR streaming.StreamJob: Unexpected arg1 while processing 
-input|-output|-mapper|-combiner|-reducer|-file|-dfs|-jt|-additionalconfspec|-inputformat|-outputformat|-partitioner|-numReduceTasks|-inputreader|-mapdebug|-reducedebug|||-cacheFile|-cacheArchive|-verbose|-info|-debug|-inputtagged|-help
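
My guess, from the error above: the quotes around "$MAPPER arg1 arg2" are getting stripped somewhere before the streaming jar parses its options, so the jar sees arg1 as a stray top-level argument instead of part of the -mapper value. A minimal sketch that keeps the whole value as one shell word (mapper.py, reducer.py, the input/output paths, and the jar path are placeholders; the jar path varies by Hadoop version):

  # Placeholder script names; substitute your own.
  MAPPER="mapper.py"
  REDUCER="reducer.py"

  # The double quotes make "mapper.py arg1 arg2" a single shell word,
  # so streaming receives it as one -mapper value and runs it as the
  # mapper command with arg1 and arg2 appended.
  hadoop jar "$HADOOP_HOME"/contrib/streaming/hadoop-*-streaming.jar \
      -input in -output out \
      -file "$MAPPER"  -mapper "$MAPPER arg1 arg2" \
      -file "$REDUCER" -reducer "$REDUCER"

If the invocation survives intact like this, arg1 and arg2 should arrive in the script as $1 and $2 (sys.argv[1] and sys.argv[2] in Python).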

--- On Thu, 8/21/08, Yuri Pradkin <[EMAIL PROTECTED]> wrote:
From: Yuri Pradkin <[EMAIL PROTECTED]>
Subject: Re: [Streaming] How to pass arguments to a map/reduce script
To: core-user@hadoop.apache.org
Cc: "Gopal Gandhi" <[EMAIL PROTECTED]>
Date: Thursday, August 21, 2008, 1:43 PM

On Thursday 21 August 2008 00:14:56 Gopal Gandhi wrote:
> I am using Hadoop streaming and I need to pass arguments to my map/reduce
> script. Because a map/reduce script is triggered by hadoop, like hadoop
> ....  -file MAPPER -mapper "$MAPPER" -file REDUCER -reducer "$REDUCER" ...
> How can I pass arguments to MAPPER?
>
> I tried -cmdenv name=val, but it does not work.
> Can anybody help me? Thanks a lot.

I think you can simply do:

....  -file MAPPER -mapper "$MAPPER arg1 arg2" -file REDUCER -reducer "$REDUCER" ...
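
For what it's worth, -cmdenv name=val should also work, but it sets an environment variable in the task rather than adding a command-line argument, so the script has to read the value from its environment. A minimal sketch, assuming a hypothetical variable THRESHOLD and a shell mapper (script names and paths are placeholders):

  hadoop jar "$HADOOP_HOME"/contrib/streaming/hadoop-*-streaming.jar \
      -input in -output out \
      -cmdenv THRESHOLD=5 \
      -file mapper.sh  -mapper mapper.sh \
      -file reducer.sh -reducer reducer.sh

  #!/bin/bash
  # mapper.sh -- picks up THRESHOLD from the environment set by
  # -cmdenv; it is NOT passed as $1.
  while read -r line; do
      printf '%s\t%s\n' "$THRESHOLD" "$line"
  done

That difference may explain the original report: arguments in the -mapper string show up in the script's argv, while -cmdenv values show up only in the environment, so a script that checks argv alone will make -cmdenv look broken.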
