Good Day,
My Hadoop version is 0.19.1. I have successfully configured it to run on a
Windows machine.
Here is the configuration that I performed:
1. I put the Hadoop files under this folder C:\cygwin\usr\local\hadoop
2. Below is the hadoop-site.xml that I use.
<?xml version="1.0"?>
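(The rest of the file appears to have been cut off in the archive. As a point of reference only, a minimal single-node hadoop-site.xml for the 0.19 line typically looks like the sketch below; the host names, ports, and paths are placeholders, not the original poster's values.)

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Placeholder values; substitute your own hosts, ports, and paths. -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/local/hadoop/tmp</value>
  </property>
</configuration>
```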
Good Day,
I am using Hadoop 0.19.1 on a Windows cluster. Everything works fine,
but I have a programming question.
For a successful map task, the close() method is invoked at the end of the
task. But for a failed or killed map task, close() is not invoked.
I get the above finding by putting
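(For context, the observation above can be reproduced with a mapper along the lines of the sketch below, written against the 0.19-era mapred API; the class name and log message are illustrative, not from the original post. It requires the Hadoop 0.19 jars to compile.)

```java
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

// Sketch: log from close() to observe whether it runs for a given task.
public class CloseProbeMapper extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, LongWritable> {

  public void map(LongWritable key, Text value,
                  OutputCollector<Text, LongWritable> out, Reporter reporter)
      throws IOException {
    out.collect(value, key);
  }

  @Override
  public void close() throws IOException {
    // Reached when the task finishes normally; per the observation above,
    // failed/killed tasks are torn down without this call.
    System.err.println("close() invoked");
  }
}
```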
Good Day,
Maybe you can try putting the Hadoop jar files in the extension classloader
directory, <JAVA_HOME>/jre/lib/ext.
Thanks.
From: BM [mailto:bogdan.maryn...@gmail.com]
Sent: Thu 10/22/2009 3:59 PM
To: common-user@hadoop.apache.org
Subject: Re: Random weirdness
Good Day,
This is what I do to solve the problem below.
I used cygrunsrv to start the tasktracker as a Windows service, instead of
using the Java Wrapper Service. Now my cluster works correctly.
In your cygwin bash window, you can try this command:
cygrunsrv --install Hadoop TaskTracker -p
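(The command above is cut off after -p. For reference, a typical cygrunsrv install-and-start sequence looks like the sketch below; the bash command line is a placeholder for however you launch the tasktracker, not the poster's actual arguments.)

```shell
# Install a Windows service named "Hadoop TaskTracker" (name from the
# original post); --path/--args below are placeholder values.
cygrunsrv --install "Hadoop TaskTracker" \
  --path /usr/bin/bash \
  --args "--login -c '/usr/local/hadoop/bin/hadoop tasktracker'"

# Then start the service:
cygrunsrv --start "Hadoop TaskTracker"
```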
Good Day,
Use this:
export JAVA_HOME=C:\\cygwin\\home\\HadoopAdmin\\java
This should work fine.
From: Rajpal, Harjeet Kumar [mailto:harjeet.ku...@honeywell.com]
Sent: Fri 9/4/2009 4:06 PM
To: common-user@hadoop.apache.org
Subject: Hadoop
Hello all,
Good Day,
I have managed to run my Hadoop cluster (3 nodes, TT/JT and NN/DN) as
Windows services via the Java Wrapper Service. But my tasktracker seems unable
to run any tasks.
I thought the spawned JVM would inherit the classpath of the parent JVM, even
when started as a Windows service. Is
on cleaning the reference to your file. Try calling deleteCache in a
synchronized manner.
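(If you are on the 0.19-era API, the suggestion amounts to something like the sketch below; CACHE_LOCK is an illustrative lock object, and purgeCache is assumed to be the old static DistributedCache cleanup method from that release line. Requires the Hadoop 0.19 jars.)

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;

// Sketch: serialize cache cleanup so two threads cannot race on
// deleting the same cached file. CACHE_LOCK is illustrative.
public class CachePurger {
  private static final Object CACHE_LOCK = new Object();

  public static void purge(Configuration conf) throws IOException {
    synchronized (CACHE_LOCK) {
      DistributedCache.purgeCache(conf);
    }
  }
}
```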
Thanks,
Amogh
-----Original Message-----
From: #YONG YONG CHENG# [mailto:aarnc...@pmail.ntu.edu.sg]
Sent: Thursday, September 03, 2009 8:50 AM
To: common-user@hadoop.apache.org
Subject: DistributedCache purgeCache
Good Day,
I have a question on the DistributedCache as follows.
I have used DistributedCache to move my executable (.exe) onto the local
filesystems of the nodes in Hadoop and run the .exe (via addCacheArchive()
and getLocalCacheArchives()). But I discovered that after my job, the .exe