Hi.
I want to run a distributed cluster where I have, say, 20 machines/slaves
in 3 separate data centers that belong to the same cluster.
Ideally I would like the other machines in the data center to be able to
upload files (apache log files in this case) onto the local slaves and
then have m
JMock is a unit testing tool for creating mock objects. I use it to mock
things like OutputCollector and Reporter, so I can unit test mappers and
reducers without running a cluster. In other words, I'm just testing the
logic of the code within the map() and reduce() methods, and testing the
map and
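The pattern above can be sketched even without JMock itself: hand-roll a recording stub for the collector and feed the map() logic directly. The types below (Collector, the word-count map body) are simplified stand-ins for Hadoop's OutputCollector and a real mapper, not the actual API — they only illustrate testing map() logic with no cluster.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-ins for Hadoop's OutputCollector and a mapper's map()
// body; the real Hadoop/JMock types differ, this only shows the pattern.
public class MapperLogicTest {
    interface Collector { void collect(String key, int value); }

    // A toy word-count map() body: emit (word, 1) for each token.
    static void map(String line, Collector out) {
        for (String word : line.trim().split("\\s+")) {
            if (!word.isEmpty()) out.collect(word, 1);
        }
    }

    public static void main(String[] args) {
        // The "mock": just records what map() emits.
        List<String> emitted = new ArrayList<>();
        Collector recorder = (k, v) -> emitted.add(k + "=" + v);
        map("apache log apache", recorder);
        System.out.println(emitted); // [apache=1, log=1, apache=1]
    }
}
```

With JMock you would instead set expectations on a mocked OutputCollector, but the test stays the same shape: call map() directly, assert on what was collected.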
All,
I'm looking for a solution that would allow me to securely use
VPSs (hosted VMs) or hosted dedicated servers as nodes in a
distributed file system. My bandwidth/speed requirements aren't high,
space requirements are potentially huge and ever growing, superb
security is a must, but
I created three virtual machines, each of which works as a node.
Does JMock support debugging a multi-node cluster within Eclipse?
Could we set breakpoints and trace the running steps of the MapReduce
program?
Richard
On Mon, Jun 16, 2008 at 6:54 PM, Matt Kent <[EMAIL PROTECTED]> wrot
Hello all,
I have my own VersionedWritable class like so:
public class MyWritable extends VersionedWritable {
    public OtherWritable anotherWritable; // instance var, implements Writable
    ...
    public MyWritable() {
        anotherWritable = null;
    }
    @Override
    public void readFields(DataInput in) throws IOException {
        super.readFields(in); // VersionedWritable checks the version byte first
        anotherWritable = new OtherWritable();
        anotherWritable.readFields(in);
    }
Hi Christophe,
This exception happens when you access the FileSystem after calling
FileSystem.close(). From the error message below, a FileSystem input stream
was accessed after FileSystem.close(). I guess the FileSystem was closed
manually (and too early). In most cases, you don't have to close it manually.
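The same symptom can be reproduced with a plain java.io stream: reading through a handle after close() fails, just as a Hadoop FileSystem input stream does after FileSystem.close() (Hadoop's exact exception message differs, so this is only an analogy).

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

// Reproduces the general "stream accessed after close" failure mode with
// plain java.io; Hadoop's FileSystem raises an analogous error when one of
// its streams is used after FileSystem.close().
public class ReadAfterClose {
    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("demo", ".txt");
        Files.write(tmp, "hello".getBytes());
        InputStream in = new FileInputStream(tmp.toFile());
        in.close(); // closed too early...
        try {
            in.read(); // ...so this access fails
        } catch (IOException e) {
            System.out.println("IOException: " + e.getMessage());
        } finally {
            Files.delete(tmp);
        }
    }
}
```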
Looks like the client machine from which you call -put cannot connect to the
data-nodes.
It could be a firewall issue or wrong configuration parameters used by the
client.
Alexander Arimond wrote:
hi,
I'm new to Hadoop and I'm just testing it at the moment.
I set up a cluster with 2 nodes an
On Tuesday 17 June 2008, j.L wrote:
> Where can we download the PDF or PPT of this meetup?
Good question. I will put mine into the Mahout wiki and provide links to the
remaining slides as well. I have created an entry at Upcoming. After the get
together, I will fill in the links to the slides there
Do this:
hadoop job -Dmapred.job.tracker=namenode:9001 -help
(where "namenode" is your namenode)
and look at the various options; "-list" gives you details about running
jobs, jobs that have run, whether they failed, etc.
if for some reason you can't associate a jobid with whatever is running,
What's the easiest way to determine what that job id is outside of hadoop so
that the external app can run this?
- Original Message
From: Meng Mao <[EMAIL PROTECTED]>
To: core-user@hadoop.apache.org
Sent: Tuesday, June 17, 2008 1:51:58 PM
Subject: Re: getting hadoop job status/progress
What if I'm not interested in which job is running, but simply whether the
current job has stalled or failed?
Is there a way I can avoid specifying a job by the job ID?
I apologize if there's some commandline documentation I'm missing,
but the commands change a bit from point version to version.
To get this from some other application rather than from Hadoop itself, you just
need to run this within a shell (I do this kind of thing from Perl)
Miles
2008/6/17 Miles Osborne <[EMAIL PROTECTED]>:
> try this:
>
> hadoop job -Dmapred.job.tracker=hermitage:9001 -status
> job_200806160820_0430
>
> (and
try this:
hadoop job -Dmapred.job.tracker=hermitage:9001 -status
job_200806160820_0430
(and replace my job id with the one you want to track):
hadoop job -Dmapred.job.tracker=hermitage:9001 -status
job_200806160820_0430
Job: job_200806160820_0430
file: /data/tmp/hadoop/mapred/system/job_2008
Hi
Is there a way to grab a Hadoop job's status/progress from outside the job and
outside of Hadoop?
I.e., if I have another application running and this application needs to know
that a job has ended, or the status percentage while the job is running, how can
an external app like this get sta
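One low-tech route for an external app (assuming the hadoop CLI is on its PATH) is to shell out to `hadoop job -status <jobid>` and parse the text output. The parsing step can be sketched against a hand-written sample; the field names below mirror the 0.17-era output quoted earlier in the thread and may differ across versions, so treat the regex as an assumption.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Parses map/reduce completion fractions out of "hadoop job -status" text.
// The sample in main() is hand-written to mirror 0.17-era output; exact
// field names vary between Hadoop versions.
public class JobStatusParse {
    // Returns the completion fraction for "map" or "reduce", or -1.0 if absent.
    static double completion(String statusText, String phase) {
        Matcher m = Pattern.compile(phase + "\\(\\) completion: ([0-9.]+)")
                           .matcher(statusText);
        return m.find() ? Double.parseDouble(m.group(1)) : -1.0;
    }

    public static void main(String[] args) {
        String sample = "Job: job_200806160820_0430\n"
                      + "map() completion: 1.0\n"
                      + "reduce() completion: 0.73\n";
        System.out.println("map=" + completion(sample, "map"));       // map=1.0
        System.out.println("reduce=" + completion(sample, "reduce")); // reduce=0.73
    }
}
```

An external app would run the command (e.g. via ProcessBuilder), feed its stdout into completion(), and treat 1.0 for both phases as "finished"; polling this in a loop gives a crude progress monitor without touching Hadoop's Java API.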
Please note that the location of the user group meeting has been changed
to Building 2, Training Rooms 5&6, at Yahoo! Mission College (2811
Mission College Blvd, Santa Clara).
User Group Meeting
June 18th, 6-7:30 pm
Agenda:
- Hadoop at Facebook, Hive - Jeff Hammerbacher
- Using Zookeeper -
Just a standard box like yours (Ubuntu). I suspect something must be
wrong. Could you determine which tests take a long time? On Hudson, it
seems that tests take 1:30 on average.
http://hudson.zones.apache.org/hudson/view/Hadoop/
Lukas Vlcek wrote:
Hi,
I am aware of HowToContrib wiki page but
Richard Zhang wrote:
Hello Hadoopers:
Is there a way to debug the Hadoop code from the Eclipse IDE? I am using Eclipse
to read the source and build the project now.
How do we start Hadoop jobs from Eclipse? Say, if we can put in the server names,
could we trace the running process through
Eclipse, su
> steve, Tom
I'd recommend you check out the trunk and try building it and running
the tests on Solaris.
I've also managed to test and build Hadoop on Solaris. From 0.17
there's support for building the native libraries on Solaris, which
are useful for performance (see
https://issues.apache.org
Thanks, I get it.
On Tue, Jun 17, 2008 at 10:24 PM, Nathan Fiedler <[EMAIL PROTECTED]>
wrote:
> Indeed, it's in System Preferences -> Sharing. Check "Remote Login"
> and you're done.
>
> n
>
>
> On Tue, Jun 17, 2008 at 12:25 AM, Tim Wintle <[EMAIL PROTECTED]>
> wrote:
> > I've set hadoop up on a lo
Indeed, it's in System Preferences -> Sharing. Check "Remote Login"
and you're done.
n
On Tue, Jun 17, 2008 at 12:25 AM, Tim Wintle <[EMAIL PROTECTED]> wrote:
> I've set hadoop up on a load of Intel Macs before - I think that sshd is
> what Apple call "Remote Log-in" or something like that - it
Hi,
I'm new to Hadoop and I'm just testing it at the moment.
I set up a cluster with 2 nodes and it seems like they are running
normally; the log files of the namenode and the datanodes don't show errors.
The firewall should be set up right.
But when I try to upload a file to the DFS I get the following m
I've successfully run Hadoop on Solaris 5.10 (on Intel). The path
included /usr/ucb so whoami was picked up correctly.
Satoshi, you say you added /usr/ucb to your path too, so I'm puzzled
why you get a LoginException saying "whoami: not found" - did you
export your changes to the path?
I've also manag
Hi all,
I am experiencing (through my students) the following error on a 28-node
cluster running Hadoop 0.16.4.
Some jobs fail with many map tasks aborting with this error message:
2008-06-17 12:25:01,512 WARN org.apache.hadoop.mapred.TaskTracker:
Error running child
java.io.IOException: Filesys
Satoshi YAMADA wrote:
From the Hadoop doc, only Linux and Windows are supported platforms. Is
it possible to run
Hadoop on Solaris? Is Hadoop implemented in pure Java? What kinds of
problems are there in
order to port it to Solaris? Thanks in advance.
Hi,
no one seems to reply to the previous "had
Hi,
I am aware of the HowToContrib wiki page, but in my case [ant test] takes more
than one hour. I cannot tell you how much time it takes because I always
stopped it after 4-5 hours...
I was running these tests on a Dell notebook, dual core, 1GB of RAM, Windows XP.
I haven't tried it now after switching
Lukas Vlcek wrote:
Hi,
How long is the full Hadoop unit test suite expected to run?
How do you go about running Hadoop tests? I found that it can take hours for
the [ant test] target to run, which does not seem to be very efficient for
development.
Is there anything I can do to speed up tests (like runni
I've set hadoop up on a load of Intel Macs before - I think that sshd is
what Apple call "Remote Log-in" or something like that - it was a GUI
option to allow an account to log in remotely.
Hope that helps
On Tue, 2008-06-17 at 14:27 +0800, j.L wrote:
> I wanna try Hadoop, but I can't run sshd wh
Where can we download the PDF or PPT of this meetup?
On Tue, Jun 17, 2008 at 2:49 PM, Lukas Vlcek <[EMAIL PROTECTED]> wrote:
> Hi,
>
> any plans to capture this on video?
>
> Regards,
> Lukas
>
> On Tue, Jun 17, 2008 at 7:42 AM, Isabel Drost <
> [EMAIL PROTECTED]>
> wrote:
>
> >
> > Hello,
> >
> > I
Hi,
How long is the full Hadoop unit test suite expected to run?
How do you go about running Hadoop tests? I found that it can take hours for
the [ant test] target to run, which does not seem to be very efficient for
development.
Is there anything I can do to speed up tests (like running Hadoop in a real
cl