Check the owner of the files; the problem might be ownership. The files have
to be owned by the same user who starts the Hadoop processes, otherwise it
will not work. I ran into that problem recently.
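A minimal sketch of that ownership check, assuming a hypothetical data directory path (`/tmp/hadoop-data` is an illustration, not a path from this thread) and GNU `stat`:

```shell
# Compare the owner of the Hadoop data directory with the user who
# will start the daemons. Path below is a made-up example.
HADOOP_DATA_DIR=${HADOOP_DATA_DIR:-/tmp/hadoop-data}
mkdir -p "$HADOOP_DATA_DIR"

owner=$(stat -c '%U' "$HADOOP_DATA_DIR")   # GNU coreutils syntax
current=$(id -un)

if [ "$owner" = "$current" ]; then
  echo "ownership ok: $owner"
else
  echo "ownership mismatch: dir owned by $owner, daemons started by $current"
fi
```

If the two differ, `chown -R` on the directory (as root) is the usual remedy.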
Regards
Ahmed Nagy
danoomistmatiste wrote:
>
> The jobtracker is failing to startup wit
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = n01/192.168.0.1
STARTUP_MSG: args = []
STARTUP_MSG: version = 0.21.0
STARTUP_MSG: classpath =
/home/ahmednagy/HadoopStandalone/hadoop-0.21.0/bin/../conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/ahmednagy/HadoopStandalone$
STARTUP_MSG: build =
https
Dear All,
I need to know how much data transfer occurred among the nodes and how much
processing happened during job execution, based on the different
configurations that I am supplying. Any ideas how to do that?
Thanks in advance
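One place to start is the per-job counters that Hadoop collects. A hedged sketch, assuming the classic `hadoop job -counter` CLI of that era; the job id is made up, and the exact counter group names vary between Hadoop versions, so treat them as assumptions:

```shell
# Query I/O counters for a finished job. JOB_ID is hypothetical;
# counter group/name strings may differ in your Hadoop version.
JOB_ID=job_201101010000_0001

if command -v hadoop >/dev/null 2>&1; then
  # Bytes all tasks of the job read from HDFS (rough proxy for data moved)
  hdfs_read=$(hadoop job -counter "$JOB_ID" FileSystemCounters HDFS_BYTES_READ 2>/dev/null || echo unavailable)
  echo "HDFS bytes read: $hdfs_read"
else
  hdfs_read="(hadoop CLI not on PATH; command shown for reference)"
  echo "$hdfs_read"
fi
```

The jobtracker web UI exposes the same counters (map output bytes, shuffle bytes, etc.) per job, which lets you compare runs under different configurations.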
job tracker to
stop. It is clear to me that the processes die. I am not sure why, but I am
attaching an error that I found on one of the slaves, node 1. Even if I use
start-mapred.sh or start-dfs.sh it does not work. Please advise, any ideas?
Thanks in advance
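When the daemons die right after `start-dfs.sh`/`start-mapred.sh`, the usual first steps are to list the surviving JVMs and read the newest daemon log. A sketch, assuming the `HADOOP_HOME` path mentioned in this thread and `jps` from the JDK:

```shell
# List running Hadoop JVMs (NameNode, DataNode, JobTracker, ...)
# and tail the most recently written daemon log for the real error.
HADOOP_HOME=${HADOOP_HOME:-$HOME/HadoopStandalone/hadoop-0.21.0}

command -v jps >/dev/null 2>&1 && jps || echo "jps not on PATH"

if ls "$HADOOP_HOME"/logs/*.log >/dev/null 2>&1; then
  newest=$(ls -t "$HADOOP_HOME"/logs/*.log | head -n 1)
  echo "--- $newest ---"
  tail -n 20 "$newest"
else
  echo "no logs found under $HADOOP_HOME/logs"
fi
```

The startup scripts only report that they launched a process; the reason it died is almost always in these per-daemon log files.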
ahmednagy@cannonau:~/HadoopStandalone
some
errors and it dies. I have attached below the error messages that I get;
please advise, any ideas? Thanks in advance
ahmednagy@cannonau:~/HadoopStandalone/hadoop-0.21.0/bin$ ./hadoop namenode
-format
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command
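The deprecation warning above points at the `hdfs` launcher instead of `hadoop namenode`. A sketch of the suggested form, shown via `echo` rather than executed because `-format` wipes the namenode's metadata; the path is the one from this thread:

```shell
# Non-deprecated invocation per the warning: use bin/hdfs, not
# bin/hadoop, for HDFS commands. Echoed only -- formatting is destructive.
HADOOP_HOME=${HADOOP_HOME:-$HOME/HadoopStandalone/hadoop-0.21.0}
echo "would run: $HADOOP_HOME/bin/hdfs namenode -format"
```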
Could you elaborate more, please? How did you fix that?
Thanks
Ahmed Nagy
danoomistmatiste wrote:
>
> I managed to fix this issue. It had to do with permissions on the
> default directory.
>
> danoomistmatiste wrote:
>>
>> Hi, I have set up a Hadoop cluster as per the instructions for CDH3.
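For readers hitting the same issue: the quoted fix above mentions permissions on the default directory but not the exact commands. A plausible shape of such a fix, assuming the directories live under a `hadoop.tmp.dir`-style location; the path here is hypothetical, and a real install would likely need `sudo` and the actual configured paths:

```shell
# Make the HDFS directories owned and writable by the user who runs
# the daemons. /tmp/hadoop-demo is an illustrative stand-in.
HADOOP_TMP=${HADOOP_TMP:-/tmp/hadoop-demo}
mkdir -p "$HADOOP_TMP/dfs/name" "$HADOOP_TMP/dfs/data"

chown -R "$(id -un)" "$HADOOP_TMP"   # on a real install: sudo, real user
chmod -R u+rwX "$HADOOP_TMP"

ls -ld "$HADOOP_TMP/dfs/name"
```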