On 2/3/10 6:12 AM, janani venkat wrote:
Thank you.
And can you give me a tutorial or something similar on connecting them over a LAN?
Start with buying the cable.
The people on the list are happy to assist you with Hadoop-related
questions.
Feel free to use a search engine of your choice to get the re
Peter,
Out of curiosity - what versions of Hadoop DFS and M-R are being
used behind the scenes?
On 2/2/10 11:26 PM, Sirota, Peter wrote:
Hi Brian,
AWS has the Elastic MapReduce service, where you can run Hadoop starting
at 10 cents per hour. Check it out at
http://aws.amazon.com/elasti
On 2/2/10 9:02 AM, zaki rahaman wrote:
Most of your questions are easily answered by taking a look at the
documentation, FAQs, and some smart Googling/Yahooing/Binging.
1. The main Hadoop project consists of two major components: HDFS (Hadoop
Distributed File System) and MapReduce.
2. Not sure
Start with hadoop-common when building.
hadoop-hdfs and hadoop-mapred pull their dependencies from the Apache
snapshot repository, which contains the nightlies of the last successful
builds, so in theory all three could be built independently because the
respective snapshots are present in the apache snaps
On 01/17/2010 05:11 PM, Mark Kerzner wrote:
Hi,
I am writing a second step to run after my first Hadoop job step finishes.
It is to pick up the results of the previous step and do further
processing on them. Therefore, I have two questions, please.
1. Is the output file always called part-
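For context on chaining steps: each reducer writes its output into the job's output directory as a separate file with a part- prefix (e.g. part-00000), and a follow-on step usually just reads the whole directory or globs those files. A small local sketch of that filtering (the directory and file names are simulated here, not produced by Hadoop):

```python
import os
import tempfile

# Simulate a finished job's output directory (names illustrative):
# each reducer writes one part-NNNNN file; marker files such as
# _SUCCESS may also appear and should be skipped by a second step.
out_dir = tempfile.mkdtemp()
for name in ["part-00000", "part-00001", "_SUCCESS"]:
    open(os.path.join(out_dir, name), "w").close()

part_files = sorted(
    f for f in os.listdir(out_dir) if f.startswith("part-")
)
print(part_files)  # → ['part-00000', 'part-00001']
```

In practice the second job is simply pointed at the first job's output directory as its input path; the filtering above is essentially what that amounts to.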
I have found iotop to be useful: http://guichaz.free.fr/iotop/
Andy Sautins wrote:
I have a question that I got an interesting and helpful answer for on the
IRC channel today, but thought I'd open it up to a larger group as well.
My problem is hopefully a very common one. I
Ravi Phulari wrote:
Hello Kay Kay ,
Yes, lines beginning with # are ignored in the slaves file.
-Ravi
On 11/16/09 12:46 PM, "Kay Kay" wrote:
The file $HADOOP_HOME/conf/slaves has the list of machines that could
act as slaves. When we open the file it is not immediately obvious
whether it should be a comma-separated list of machines or one machine
per line (the docs
http://hadoop.apache.org/common/docs/r0.19.2/cluster_setup.
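Per the answer above, the format is one hostname per line, with lines beginning with # ignored. A minimal sketch of parsing that convention (hostnames are made up for illustration; this is not Hadoop's actual parser):

```python
def read_slaves(text):
    """Return hostnames from a slaves-style file: one per line,
    skipping blank lines and lines beginning with #."""
    hosts = []
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            hosts.append(line)
    return hosts

# Hypothetical file contents.
slaves = """\
# backup node, currently disabled
node1.example.com
node2.example.com
"""
print(read_slaves(slaves))  # → ['node1.example.com', 'node2.example.com']
```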