Thanks, Joey!
I will try to find out about copyFromLocal. Looks like the Hadoop APIs write
serially, as you pointed out.
Thanks,
-JJ
On May 17, 2011, at 8:32 PM, Joey Echeverria wrote:
> The sequence file writer definitely does it serially as you can only
> ever write to the end of a file in Hadoop.
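For reference, appending records through the sequence file API looks roughly like this (a sketch; the output path and the key/value types are assumptions, not from the thread). Each append call adds one record at the current end of the file, which is why the writes are inherently serial:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SeqWriteSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path path = new Path("/tmp/example.seq"); // hypothetical path

        SequenceFile.Writer writer = SequenceFile.createWriter(
                fs, conf, path, IntWritable.class, Text.class);
        try {
            // Each append goes to the current end of the file, one record
            // after another -- there is no parallel write path here.
            for (int i = 0; i < 10; i++) {
                writer.append(new IntWritable(i), new Text("value-" + i));
            }
        } finally {
            writer.close();
        }
    }
}
```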
Hi,
My question is: when I run a command from the HDFS client, e.g. hadoop fs
-copyFromLocal, or create a sequence file writer in Java code and append
key/value pairs to it through the Hadoop APIs, does it internally
transfer/write data to HDFS serially or in parallel?
Thanks in advance,
-JJ
Hi there,
I think you got the run(String[] args) method right, but in the main method
you are not calling your run method, only ToolRunner.run. You need to invoke
your run method in order to point to localhost:54310; otherwise it will read
those properties from the default Hadoop conf.
Praveen
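For what it's worth, the conventional pattern is to hand your own Tool to ToolRunner.run, which then calls back into your run(String[]) with the parsed configuration. A sketch (the class name MyJob is an assumption, and setting fs.default.name inside run is just one way to point at localhost:54310):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MyJob extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        // Point at the intended namenode instead of whatever the
        // default Hadoop conf on the classpath says.
        conf.set("fs.default.name", "hdfs://localhost:54310");
        // ... set up and submit the job here ...
        return 0;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner.run parses generic options, then invokes MyJob.run.
        int exitCode = ToolRunner.run(new Configuration(), new MyJob(), args);
        System.exit(exitCode);
    }
}
```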
Hi Lucian,
Sounds like you didn't configure MapReduce to run on top of HDFS? If you
want to run on a distributed cluster, you need a distributed file system set
up as well.
-Todd
On Mon, May 16, 2011 at 8:31 AM, Lucian Iordache <
lucian.george.iorda...@gmail.com> wrote:
> Hello,
>
> I have a Ha
Hi all. How can I run an MR job through my own program, instead of using the
console to submit a job to a real Hadoop env?
I wrote code like this; the program works fine, but I don't think it ran in
my Hadoop env, since nothing was produced in the Hadoop logs folder.
public int run(String[] args) throws Except
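A run method for submitting to a real cluster typically has roughly this shape (a sketch; the namenode/jobtracker addresses, job name, and key/value types are assumptions). If the configuration is not pointed at the cluster, the job runs in the local runner, which would explain nothing appearing in the Hadoop logs folder:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class SubmitSketch extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        // Without these, the job runs in the local runner and never
        // reaches the cluster (assumed addresses for a local install).
        conf.set("fs.default.name", "hdfs://localhost:54310");
        conf.set("mapred.job.tracker", "localhost:54311");

        Job job = new Job(conf, "my-job"); // hypothetical job name
        job.setJarByClass(SubmitSketch.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Submits to the configured cluster and blocks until it finishes.
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new SubmitSketch(), args));
    }
}
```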