Manish,
I am not sure if the counter would provide a globally unique id. To the
best of my knowledge, Counters are mapper specific. So even if one could
see the application working on one mapper, at the end, when deployed on
production, duplicate ids would cause a problem. So, unless you are
While I wholeheartedly agree that it could be useful if the counter returned the incremented value, that is not the purpose of counters. Counters are used for counting the occurrence of events.
I have not read through the details of the issue, but if you want to generate a
unique number, then there
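Since the point above is that counters are task-local, here is a hedged, Hadoop-free sketch in plain Java (the class and method names are invented for illustration) of why a per-task counter cannot serve as a global id: each simulated map task starts its counter from zero, so the "ids" collide across tasks.

```java
import java.util.ArrayList;
import java.util.List;

public class CounterIdDemo {
    // Simulates a per-task counter: each "mapper" starts from zero,
    // mirroring how Hadoop task counters are local until the job
    // tracker aggregates them at job completion.
    static List<Long> runMapper(int records) {
        long counter = 0;                  // task-local, not global
        List<Long> ids = new ArrayList<>();
        for (int i = 0; i < records; i++) {
            counter++;                     // increment like a Counter
            ids.add(counter);              // naively use it as an "id"
        }
        return ids;
    }

    public static void main(String[] args) {
        List<Long> mapper1 = runMapper(3);
        List<Long> mapper2 = runMapper(3);
        // Both tasks produced 1, 2, 3 -- the "ids" collide across mappers.
        System.out.println(mapper1);
        System.out.println(mapper2);
        System.out.println(mapper1.equals(mapper2)); // true: duplicates
    }
}
```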
Huy,
As I understand it, each mapper or reducer actually runs in its own JVM. So
if your class is required by a mapper or reducer, then one instance of it will
be created for every mapper or reducer task. It would also mean that only one
such instance would be created, because you have made those functions
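To make the one-instance-per-task point concrete, here is a hedged sketch in plain Java (no Hadoop dependency; the setup() and map() methods mimic the lifecycle hooks Hadoop calls once per task and once per record, respectively):

```java
public class PerTaskInstanceDemo {
    // Simulates the common Hadoop pattern: build one helper object in
    // setup() (called once per task JVM) and reuse it for every record,
    // instead of constructing it inside map() for each record.
    static class ExpensiveHelper {
        static int constructed = 0;          // count constructions for the demo
        ExpensiveHelper() { constructed++; }
        String process(String record) { return record.toUpperCase(); }
    }

    private ExpensiveHelper helper;

    void setup() {                            // Hadoop calls this once per task
        helper = new ExpensiveHelper();
    }

    String map(String record) {               // called once per input record
        return helper.process(record);
    }

    public static void main(String[] args) {
        PerTaskInstanceDemo task = new PerTaskInstanceDemo();
        task.setup();
        for (String rec : new String[]{"a", "b", "c"}) {
            System.out.println(task.map(rec));
        }
        // One helper instance, no matter how many records were mapped.
        System.out.println(ExpensiveHelper.constructed);
    }
}
```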
Hi There,
First of all, sorry if I am asking a stupid question. Being new
to the Hadoop environment, I am finding it a bit difficult to figure out why
it is failing.
I have installed Hadoop 1.2, based on the instructions given in the following
link
configuration (XML tags). Please check all the
conf files.
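For reference, every setting in the *-site.xml files has to be a well-formed <property> element nested inside a single <configuration> root. A minimal core-site.xml sketch (the hdfs://localhost:9000 value is the usual single-node default for Hadoop 1.x; adjust to your setup):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- One <property> element per setting: name, value, optional description. -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
    <description>URI of the default (HDFS) file system.</description>
  </property>
</configuration>
```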
Thanks
On Tue, Jul 23, 2013 at 6:25 PM, Ashish Umrani ashish.umr...@gmail.com wrote:
Hi There,
First of all, sorry if I am asking a stupid question. Being
new to the Hadoop environment, I am finding it a bit difficult
-site.xml.
It is missing.
Thanks.
On Tue, Jul 23, 2013 at 9:58 PM, Ashish Umrani ashish.umr...@gmail.com wrote:
Hey, thanks for the response. I have changed 4 files during installation:
core-site.xml
mapred-site.xml
hdfs-site.xml and
hadoop-env.sh
I could not find any issues except that all
name, value and
description tags.
Regards
Bejoy KS
Sent from remote device, Please excuse typos
--
From: Ashish Umrani ashish.umr...@gmail.com
Date: Tue, 23 Jul 2013 09:28:00 -0700
To: user@hadoop.apache.org
Reply-To: user@hadoop.apache.org
Subject: Re
of /app/hadoop/tmp to 755 and see if it helps.
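The suggestion above, sketched as shell commands; /app/hadoop/tmp is the path from this thread (it should match hadoop.tmp.dir in core-site.xml), but the demo below uses a scratch directory so it can run anywhere:

```shell
# Create a stand-in directory (on a real cluster this would be /app/hadoop/tmp)
dir=$(mktemp -d)

# Give the owner rwx and everyone else r-x, as suggested in the thread
chmod 755 "$dir"

# Verify the mode (GNU coreutils stat)
stat -c '%a' "$dir"

rmdir "$dir"
```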
Warm Regards,
Tariq
cloudfront.blogspot.com
jeetuyadav200...@gmail.com
wrote:
Try:
hadoop fs -ls /
Thanks
On Tue, Jul 23, 2013 at 10:27 PM, Ashish Umrani ashish.umr...@gmail.com wrote:
Thanks Jitendra, Bejoy and Yexi,
I got past that. And now the ls command says it cannot access the
directory. I am sure this is a permissions
once again
regards
ashish
On Tue, Jul 23, 2013 at 10:31 AM, Shekhar Sharma shekhar2...@gmail.com wrote:
hadoop jar wc.jar <fully-qualified-driver-name> <input-data> <output-destination>
Regards,
Som Shekhar Sharma
+91-8197243810
On Tue, Jul 23, 2013 at 10:58 PM, Ashish Umrani
ashish.umr