Thanks for clarifying, Ellis. I'm sorry I assumed certain things when
replying here.
I looked at it as well, and it does absolutely nothing: it is not
referred to by anything, nor can we do anything with it. We may as well
remove it (the tunable), or document it. Please do file a HADOOP JIRA
(once Apache J
Many thanks to Eli and Harsh for their responses! Comments in-line:
On 08/12/2012 09:48 AM, Harsh J wrote:
Hi Ellis,
Note that when in Hadoop-land, a "block size" term generally means the
chunking size of HDFS writers and readers, and that is not the same as
the FS term "block size" in any way.
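To make that distinction concrete, here is a rough, illustrative Python sketch (my own, not Hadoop code or its API) of what "block size" means in the HDFS sense: it is just the unit a writer chunks a file stream into, independent of the local filesystem's block size, and the last block may be shorter.

```python
# Illustrative sketch only (not Hadoop code): HDFS-style "block size" is
# the unit a writer chunks a file into. It is unrelated to the local FS
# notion of block size.
def chunk_into_blocks(data: bytes, block_size: int) -> list:
    """Split a byte stream into blocks of at most block_size bytes.

    The last block may be shorter; HDFS does not pad the final block.
    """
    return [data[i:i + block_size] for i in range(0, len(data), block_size)]

# A 300-byte "file" with a 128-byte block size yields three blocks.
blocks = chunk_into_blocks(b"x" * 300, 128)
print([len(b) for b in blocks])  # → [128, 128, 44]
```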
On Thu, Aug 9, 2012 at 6:40 PM, Ellis H. Wilson III wrote:
> Hi all!
>
> Can someone please briefly explain
Hi Rahul,
Better to start a new thread than hijack others. :) It helps to keep
the mailing list archives clean.
To learn Java, you need to get some Java books and start off.
If you just want to run the wordcount example, follow the steps at the URL below:
http://wiki.apache.org/hadoop/WordCount
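For orientation, the wiki example above is Java MapReduce; the logic it implements boils down to something like this Python sketch (illustrative only, not the code from the wiki): the "map" phase emits a count of 1 per word, and the "reduce" phase sums the counts per word.

```python
from collections import Counter

# Minimal sketch of what the Hadoop WordCount example computes.
def word_count(lines):
    counts = Counter()
    for line in lines:            # "map" phase: tokenize each input line
        for word in line.split():
            counts[word] += 1     # "reduce" phase: sum the 1s per word
    return dict(counts)

print(word_count(["hello hadoop", "hello world"]))
# → {'hello': 2, 'hadoop': 1, 'world': 1}
```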
Hi Tariq,
I am trying to run the wordcount MapReduce example, but I am not sure
how or where to start.
I am very new to Java.
Can you help me work with this? Any help will be appreciated.
Hi All,
Please help me get started with Hadoop on CDH; I have installed it on my local PC.
Any help will be appreciated.
Hi all!
Can someone please briefly explain the difference? I do not see
deprecation warnings for fs.local.block.size when I run with it set,
and I see two copies of RawLocalFileSystem.java (the other is
local/RawLocalFs.java).
The things I really need to get answers to are:
1. Is the defaul