Hi,
I also see this on the webUI:
Number of Blocks Pending Deletion: 1
How can I delete the invalidated blocks immediately, without restarting the cluster?
Thanks
Tang
On 2014/10/29 13:11:28, Tang shawndow...@gmail.com wrote:
hi,
We are running mapreduce jobs on hadoop clusters. The job inputs come
hi, Guys,
I am trying to implement a simple program (experimental, not for production). I
invoke FileSystem.listFiles() to get a list of files under an HDFS folder, and
then use FileSystem.getFileBlockLocations() to get the replica locations of each
file's blocks.
Since it is a controlled
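For reference, a minimal sketch of that approach, assuming a default Configuration that resolves the cluster from core-site.xml on the classpath (the path /data is illustrative):

```java
import java.util.Arrays;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocatedFileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;

public class BlockLocationsDemo {
    public static void main(String[] args) throws Exception {
        // Picks up fs.defaultFS from the Hadoop config on the classpath
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Recursively list files under an HDFS folder (path is illustrative)
        RemoteIterator<LocatedFileStatus> it = fs.listFiles(new Path("/data"), true);
        while (it.hasNext()) {
            LocatedFileStatus status = it.next();
            // Ask the NameNode for block locations over the whole file
            BlockLocation[] blocks =
                fs.getFileBlockLocations(status, 0, status.getLen());
            for (BlockLocation block : blocks) {
                System.out.println(status.getPath()
                    + " offset=" + block.getOffset()
                    + " hosts=" + Arrays.toString(block.getHosts()));
            }
        }
    }
}
```

Note that LocatedFileStatus already carries the locations via getBlockLocations(), so the extra getFileBlockLocations() round trip can be skipped when iterating this way.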
You can accomplish this by using the DistributedShell application that
comes with YARN.
If you copy all your archives to HDFS, then inside your shell script you
could copy those archives to your YARN container and then execute whatever
you want, provided all the other system dependencies exist in
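A rough sketch of that workflow (the jar path, HDFS paths, and script name are illustrative and vary by installation):

```shell
# Stage the archives and the driver script in HDFS
hadoop fs -mkdir -p /user/me/archives
hadoop fs -put myapp.tar.gz /user/me/archives/

# Launch the script through the DistributedShell example client;
# the client jar ships under share/hadoop/yarn in a stock 2.x install
yarn jar "$HADOOP_HOME"/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-*.jar \
  org.apache.hadoop.yarn.applications.distributedshell.Client \
  -jar "$HADOOP_HOME"/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-*.jar \
  -shell_script run.sh \
  -num_containers 1
```

Inside run.sh you would then `hadoop fs -get` the archives from /user/me/archives into the container's working directory before executing them.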
All,
I am new to Hadoop so any help would be appreciated.
I have a question for the mailing list regarding Hadoop. I have installed
the most recent stable version (2.4.1) on a virtual machine running CentOS
7. I have tried to run this command
%Hadoop -fs ls but without success.
The question
Are RHEL7 based OSs supported?
On Wed, Oct 29, 2014 at 3:59 PM, David Novogrodsky
david.novogrod...@gmail.com wrote:
All,
I am new to Hadoop so any help would be appreciated.
I have a question for the mailing list regarding Hadoop. I have installed
the most recent stable version (2.4.1)
Hi David,
JAVA_HOME should point to the java installation directory. Typically, this
directory will contain a subdirectory called 'bin'. Hadoop tries to find
the java command in $JAVA_HOME/bin/java.
It is likely that /usr/bin/java is a symlink to some other file. If you do
an ls -l on it, you can see where the symlink points.
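A quick way to follow the symlink chain and derive JAVA_HOME from it (a sketch; /usr/bin/java is the assumed symlink location on your box):

```shell
# Resolve the full symlink chain to the real java binary
REAL_JAVA=$(readlink -f /usr/bin/java)

# JAVA_HOME is the directory that contains bin/java, i.e. two levels up
export JAVA_HOME=$(dirname "$(dirname "$REAL_JAVA")")
echo "JAVA_HOME=$JAVA_HOME"

# Sanity check: Hadoop will look for exactly this path
ls "$JAVA_HOME/bin/java"
```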
Try adding "/" at the end of hadoop fs -ls, so it becomes:
hadoop fs -ls /
From: David Novogrodsky [mailto:david.novogrod...@gmail.com]
Sent: Thursday, October 30, 2014 7:00 AM
To: user@hadoop.apache.org
Subject: Fwd: problems with Hadoop installation
All,
I am new to Hadoop so any help would