I have attached code for creating a Hadoop jar - all you need to do is run
HadoopDeployer in the same environment in which your Hadoop job runs as a
local process (you did test your job in this way?). It jars everything on the
classpath except the jars you add to an excluded list because they are not
needed.
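Roughly, the idea is something like the sketch below. The class name, method
names and exclude list are placeholders I made up, not the actual
HadoopDeployer API: walk the local classpath, repack compiled class
directories at the root of the jar, drop dependency jars under lib/ (as far
as I know the MapReduce framework adds jars under lib/ of the job jar to the
task classpath), and skip anything on the exclude list.

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import java.util.jar.JarEntry;
import java.util.jar.JarOutputStream;

// Placeholder sketch of the idea, not the attached HadoopDeployer itself.
public class ClasspathJarSketch {

    // Assumption: jars the cluster already provides, so they need not be shipped.
    private static final Set<String> EXCLUDED =
            new HashSet<String>(Arrays.asList("hadoop-0.20.2-dev-core.jar"));

    public static void main(String[] args) throws IOException {
        JarOutputStream out = new JarOutputStream(new FileOutputStream("job.jar"));
        try {
            for (String path : System.getProperty("java.class.path")
                                     .split(File.pathSeparator)) {
                File entry = new File(path);
                if (entry.isDirectory()) {
                    addDir(out, entry, "");                        // compiled classes
                } else if (entry.isFile() && !EXCLUDED.contains(entry.getName())) {
                    addFile(out, entry, "lib/" + entry.getName()); // dependency jar
                }
            }
        } finally {
            out.close();
        }
    }

    // Recursively add a class directory, preserving package paths.
    private static void addDir(JarOutputStream out, File dir, String prefix)
            throws IOException {
        File[] children = dir.listFiles();
        if (children == null) {
            return;
        }
        for (File f : children) {
            if (f.isDirectory()) {
                addDir(out, f, prefix + f.getName() + "/");
            } else {
                addFile(out, f, prefix + f.getName());
            }
        }
    }

    // Copy a single file into the jar under the given entry name.
    private static void addFile(JarOutputStream out, File f, String name)
            throws IOException {
        out.putNextEntry(new JarEntry(name));
        FileInputStream in = new FileInputStream(f);
        try {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) > 0) {
                out.write(buf, 0, n);
            }
        } finally {
            in.close();
        }
        out.closeEntry();
    }
}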
Hello,
Thanks so much for the reply.
See inline.
On Fri, Jun 25, 2010 at 12:40 AM, Hemanth Yamijala wrote:
> Hi,
>
>> I've been getting the following error when trying to run a very simple
>> MapReduce job.
>> The Map phase finishes without problems, but an error occurs as soon as it
>> enters the Reduce phase.
>>
> - What do you mean by aws?
It must be Amazon Web Services, I think.
On Fri, Jun 25, 2010 at 7:22 PM, Pedro Costa wrote:
> I don't understand when you say "you" :). I've created the
> hadoop-0.20.2-dev-core.jar and hadoop-0.20.2-dev-examples.jar and none of
> them has the job.jar. I've also searched all hadoop directories, and none of
> them have the jar.
Pedro,
job.jar is a name given by the Map/Reduce framework.
*job.jar* is used to encapsulate the program code you write yourself.
It's not an automatically produced file; you should create it manually yourself.
And of course you can give this jar file any other name.
If you don't understand, you can study the
hadoop-0.20.2-dev-examples.jar file.
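For example, the kind of code that goes into your own job jar is nothing more
than a driver plus a mapper and a reducer, much like the WordCount that ships
in hadoop-0.20.2-dev-examples.jar. The version below follows the standard
WordCount tutorial, so treat it as an illustration rather than the exact
example source:

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Emits (word, 1) for every token in the input line.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Sums the counts for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "word count");
    job.setJarByClass(WordCount.class);   // tells the framework which jar to ship
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Compile it against hadoop-0.20.2-dev-core.jar, pack the classes with something
like "jar cf wordcount.jar WordCount*.class", and submit it with
"bin/hadoop jar wordcount.jar WordCount <input> <output>". If I remember
correctly, it is only at submit time that the JobClient copies your jar into
the job's staging/system directory under the name job.jar, which is why you
won't find a job.jar anywhere inside the Hadoop distribution itself.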
I don't understand when you say "you" :). I've created the
hadoop-0.20.2-dev-core.jar and hadoop-0.20.2-dev-examples.jar and none of
them has the job.jar. I've also searched all hadoop directories, and none of
them have the jar.
- When is the jar created, and where is it exported?
- What do you mean by aws?