The ID is different in the namenode and the datanode. You can modify the ID. I met the same issue, and I completely removed all files under hadoop.
—
Sent from Mailbox for iPhone
On Wed, May 1, 2013 at 8:32 PM, Mohsen B.Sarmadi
mohsen.bsarm...@gmail.com wrote:
Dear sirs/madams,
I am trying to run Hadoop 1.0.4.
Hi Everyone
Today I am testing about 2 TB of data on my cluster; there are several failed map
tasks and reduce tasks on the same node.
Here is the log
Map failed:
org.apache.hadoop.util.DiskChecker$DiskErrorException: Could not find any
valid local directory for output/spill0.out
at
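In my experience this DiskErrorException usually means that none of the directories listed in mapred.local.dir had enough free space for the map's spill file. Checking free space on those volumes, and spreading mapred.local.dir across several disks, often helps. A sketch of the relevant mapred-site.xml entry (the paths below are examples, not your actual layout):

```xml
<!-- mapred-site.xml: spill files such as output/spill0.out are written
     under mapred.local.dir; listing several disks spreads the load -->
<property>
  <name>mapred.local.dir</name>
  <value>/disk1/mapred/local,/disk2/mapred/local</value>
</property>
```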
Hi
I have the same problem before
I think this is caused by a shortage of memory for the map task.
It is just a suggestion; you can post your log.
BRs
Geelong
—
Sent from Mailbox for iPhone
On Mon, Apr 22, 2013 at 4:34 PM, kaveh minooie ka...@plutoz.com wrote:
Hi,
regardless of what job,
at the beginning of the map
task (the counter is only 1).
On 04/22/2013 02:12 AM, 姚吉龙 wrote:
Hi
I have the same problem before.
I think this is caused by a shortage of memory for the map task.
It is just a suggestion; you can post your log.
BRs
Hi Everyone
Sorry to ask this again.
I have been confused about this issue for a long time; does anyone know the
detailed steps for the test?
BRs
Geelong
--
From Good To Great
Hi Everyone
I am testing my MR program with MRUnit; its version
is mrunit-0.9.0-incubating-hadoop2. My Hadoop version is 1.0.4.
The error trace is below:
java.lang.IncompatibleClassChangeError: Found class
org.apache.hadoop.mapreduce.TaskInputOutputContext, but interface was
expected
at
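That IncompatibleClassChangeError is the classic symptom of a Hadoop 1 / Hadoop 2 API mismatch: in Hadoop 2, TaskInputOutputContext is an interface, while in Hadoop 1.0.4 it is a class, so an MRUnit jar built with the hadoop2 classifier cannot run against Hadoop 1.0.4. Switching to the hadoop1 classifier of the same MRUnit release should resolve it; a sketch of the Maven dependency (assuming you build with Maven):

```xml
<dependency>
  <groupId>org.apache.mrunit</groupId>
  <artifactId>mrunit</artifactId>
  <version>0.9.0-incubating</version>
  <!-- hadoop1 classifier matches the Hadoop 1.x (1.0.4) class-based API -->
  <classifier>hadoop1</classifier>
  <scope>test</scope>
</dependency>
```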
The number of map tasks is determined by the block size and the size of your raw data.
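As a rough back-of-the-envelope check (assuming the default 64 MB block size and a splittable input format), the number of input splits, and hence map tasks, is approximately the input size divided by the block size:

```shell
# estimate map tasks for a 2 TB input with 64 MB blocks (illustrative numbers)
BLOCK_MB=64
INPUT_MB=$((2 * 1024 * 1024))   # 2 TB expressed in MB
echo $((INPUT_MB / BLOCK_MB))   # prints 32768
```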
—
Sent from Mailbox for iPhone
On Sat, Apr 20, 2013 at 12:30 AM, YouPeng Yang yypvsxf19870...@gmail.com
wrote:
Hi All
I take NLineInputFormat as the text input format, with the following code:
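For reference, NLineInputFormat splits the input by line count rather than by block size; in Hadoop 1.x the lines-per-split value is controlled by a configuration property (the value below is illustrative):

```xml
<!-- each input split receives this many lines; the default is 1 -->
<property>
  <name>mapred.line.input.format.linespermap</name>
  <value>100</value>
</property>
```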
Use this command: hadoop fs -getmerge <file in hdfs> <local>
—
Sent from Mailbox for iPhone
On Wed, Apr 17, 2013 at 10:40 PM, Fabio Pitzolu fabio.pitz...@gr-ci.com
wrote:
Hi all,
is there a way to use the *getmerge* fs command without generating the .crc
files in the output local directory?
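As far as I know, getmerge writes through the checksummed local filesystem, which is why the hidden .crc sidecar files appear next to the merged output; the simplest workaround is to delete them afterwards. A minimal sketch using only local commands (so it can be tried without a cluster; the directory and file names are just examples):

```shell
# simulate a local getmerge output directory (created fresh)
OUT=$(mktemp -d)
touch "$OUT/merged.txt" "$OUT/.merged.txt.crc"
# remove the checksum sidecar files that getmerge leaves behind
find "$OUT" -name '*.crc' -delete
ls -A "$OUT"   # prints merged.txt
```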
Hi everyone
How can I install MRUnit to do unit testing? Are there any requirements
for the tool?
My Hadoop version is 1.0.4.
Apart from MRUnit, is any other test tool available?
BRs
Geelong
--
From Good To Great
Hi
I am new to Hadoop; we now have 32 nodes for Hadoop study.
I need to speed up Hadoop processing by finding the best
configuration.
For example: io.sort.mb, io.sort.record.percent, etc. But I do not know how to
start, with so many parameters available for optimization.
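A common place to start is the map-side sort buffer, since spills to disk are often the first bottleneck. A sketch of two related mapred-site.xml entries (the values are illustrative starting points, not recommendations; they must be tuned against your own jobs and heap sizes):

```xml
<!-- map-side sort buffer in MB; the Hadoop 1.x default is 100 -->
<property>
  <name>io.sort.mb</name>
  <value>200</value>
</property>
<!-- fraction of the sort buffer reserved for record metadata; default 0.05 -->
<property>
  <name>io.sort.record.percent</name>
  <value>0.05</value>
</property>
```

Raising io.sort.mb reduces the number of spill files per map, at the cost of task heap; it only helps if the map JVM heap (mapred.child.java.opts) has room for the larger buffer.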
BRs
-- Forwarded message --
From: 姚吉龙 geelong...@gmail.com
Date: 2013/3/19
Subject: Re: Need your help with Hadoop
To: Harsh J ha...@cloudera.com
Thanks for your reply.
I am wondering which parameters define the capacity of the datanode, or how
to calculate the capacity. I have
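To my understanding, a datanode's reported capacity is the total space on the volumes listed in dfs.data.dir, minus the per-volume reservation in dfs.datanode.du.reserved (and minus non-DFS usage already on those disks). A sketch of the relevant hdfs-site.xml entries (paths and values are examples only):

```xml
<!-- the volumes whose combined size determines datanode capacity -->
<property>
  <name>dfs.data.dir</name>
  <value>/disk1/dfs/data,/disk2/dfs/data</value>
</property>
<!-- bytes reserved per volume for non-DFS use; here 10 GB -->
<property>
  <name>dfs.datanode.du.reserved</name>
  <value>10737418240</value>
</property>
```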