Da,
Do you have permissions to run iostat? That could help you find
persistent I/O activity from other processes.
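As a sketch of that check, assuming the sysstat package (which provides iostat and pidstat) is installed:

```shell
# Extended per-device stats: sample every 2 seconds, 5 reports.
# High %util or await values point at a saturated disk.
iostat -dx 2 5

# Per-process disk I/O, to see which processes are responsible.
pidstat -d 2 5
```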
Another possibility is that your system has a RAID controller without a
cache memory card; in my experience, that makes writes awfully slow.
esteban.
On Tue, Jan 25, 2011 at
Adarsh,
Do you have the hostnames for the masters and slaves in /etc/hosts?
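For reference, a minimal /etc/hosts layout for a small cluster could look like this (the hostnames and addresses are made up for illustration):

```
# /etc/hosts -- same entries on every node (hypothetical addresses)
127.0.0.1     localhost
192.168.1.10  hadoop-master
192.168.1.11  hadoop-slave1
192.168.1.12  hadoop-slave2
```

Every node should be able to resolve every other node's hostname, including its own.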
esteban.
On Fri, Jan 7, 2011 at 06:47, Adarsh Sharma adarsh.sha...@orkash.com wrote:
Dear all,
I am researching the error below and have not been able to find the
reason:
Data Size : 3.4 GB
Hadoop-0.20.0
Hi,
It seems that you need to add your hostname/IP pair to /etc/hosts on both
nodes. It also looks like you need to set up your configuration files
correctly.
These guides can be helpful:
http://hadoop.apache.org/common/docs/r0.20.2/quickstart.html
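For a basic two-node setup in Hadoop 0.20, the relevant configuration is typically along these lines (the hostname hadoop-master is a hypothetical example; adjust to your own master node):

```xml
<!-- core-site.xml: point HDFS clients at the NameNode -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://hadoop-master:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: point tasks at the JobTracker -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>hadoop-master:9001</value>
  </property>
</configuration>
```

The value of fs.default.name must use a hostname that every node can resolve, which is why the /etc/hosts entries matter.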
Hello Jon,
Could you please verify that your node can resolve the host name?
It would also help if you could attach your configuration files and the
output of:
HADOOP_ROOT_LOGGER=DEBUG,console hadoop fs -ls /
as Todd suggested.
Cheers,
esteban
On Jan 1, 2011, at 2:01 PM, Jon Lederman wrote:
On Jan 2, 2011, at 8:47 AM, Esteban Gutierrez Moguel wrote:
Hello Cavus,
Is your JobTracker running on localhost? It would be great if you could
provide more information about your current Hadoop setup.
cheers,
esteban.
estebangutierrez.com — twitter.com/esteban
2010/12/30 Cavus,M.,Fa. Post Direkt m.ca...@postdirekt.de
I process this
./hadoop jar
Matthew,
Cloudera has rolled out a certification program for developers and admins.
Take a look at their website.
Cheers,
Esteban.
On Dec 8, 2010 9:41 PM, Matthew John tmatthewjohn1...@gmail.com wrote:
Hi all,
Is there any valid Hadoop Certification available ? Something which adds