That's interesting. If your mapper script is a Perl script, how do you
assign the value of "my.mapper.arg1" to a variable $x?
$x = $my.mapper.arg1
I just tried it that way, and my Perl script does not recognize $my.mapper.arg1.
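Or, going by your reply below, should it be read from the environment
instead, with the dots turned into underscores as in your example? Something
like:

my $x = $ENV{'my_mapper_arg1'};   # rather than $my.mapper.arg1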

--- On Thu, 8/21/08, Rong-en Fan <[EMAIL PROTECTED]> wrote:
From: Rong-en Fan <[EMAIL PROTECTED]>
Subject: Re: [Streaming] How to pass arguments to a map/reduce script
To: core-user@hadoop.apache.org
Cc: [EMAIL PROTECTED]
Date: Thursday, August 21, 2008, 11:09 AM

On Thu, Aug 21, 2008 at 3:14 PM, Gopal Gandhi
<[EMAIL PROTECTED]> wrote:
> I am using Hadoop streaming and I need to pass arguments to my map/reduce
> script. Because a map/reduce script is triggered by hadoop, like
> hadoop ....  -file MAPPER -mapper "$MAPPER" -file REDUCER -reducer "$REDUCER" ...
> How can I pass arguments to MAPPER?
>
> I tried -cmdenv name=val, but it does not work.
> Can anybody help me? Thanks a lot.

I use -jobconf, for example

hadoop ... -jobconf my.mapper.arg1="foobar"

and in the map script, I get this by reading the environment variable

my_mapper_arg1
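
(Streaming hands the job configuration to the task as environment variables,
replacing the dots in the property name with underscores, which is why
my.mapper.arg1 shows up as my_mapper_arg1.) Concretely, a minimal Perl mapper
along these lines could look like the sketch below; the script name mapper.pl
and what it prints are only illustrative:

#!/usr/bin/perl
# mapper.pl: illustrative streaming mapper
use strict;
use warnings;

# -jobconf my.mapper.arg1="foobar" shows up here as $ENV{my_mapper_arg1}
my $arg1 = $ENV{'my_mapper_arg1'};
defined $arg1 or die "my.mapper.arg1 was not set\n";

# streaming contract: read input lines on STDIN, emit key<TAB>value on STDOUT
while (my $line = <STDIN>) {
    chomp $line;
    print "$arg1\t$line\n";
}

You would then launch the job much as in the command above, e.g.
hadoop ... -file mapper.pl -mapper mapper.pl -jobconf my.mapper.arg1="foobar" ...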

Hope this helps,
Rong-En Fan