Hi,
Is it possible to upgrade to a newer version of Hadoop without bringing the
cluster down? To my understanding it's not, but I'm just wondering.
Thanks
Sudharsan S
On 7/7/2011 8:43 PM, Kai Ju Liu wrote:
Over the past week or two, I've run into an issue where MapReduce jobs hang
or fail near completion. The percent completion of both map and reduce tasks
is often reported as 100%, but the actual number of completed tasks is less
than the total number. It appears that either tasks backtrack and need
I do not believe that there is such a tool. There are a lot of Hive developers
who also work on Hadoop, but it might be more appropriate to ask the Hive
mailing list this question:
u...@hive.apache.org
--Bobby Evans
On 7/6/11 4:19 AM, "ling cao" wrote:
Hi all,
I have some Hive sq
Just to be more specific: different slaves in my cluster have different
interfaces configured as the internal interface. For example, node1 uses
eth19 for internal connectivity, whereas node2 uses eth20. I modified the
hdfs-site.xml and mapred-site.xml on each node, making sure that
dfs.datanode
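For anyone following along, a per-node hdfs-site.xml along these lines may be what is meant — this is only a sketch, assuming the truncated property is dfs.datanode.dns.interface (a setting that exists in hadoop-0.20); each slave's copy would name its own internal interface:

```xml
<!-- hdfs-site.xml on node1, whose internal interface is eth19;
     node2's copy would use eth20 instead -->
<property>
  <name>dfs.datanode.dns.interface</name>
  <value>eth19</value>
</property>

<!-- mapred-site.xml has an analogous setting for the TaskTracker -->
<property>
  <name>mapred.tasktracker.dns.interface</name>
  <value>eth19</value>
</property>
```

With these set, the DataNode and TaskTracker resolve their own hostnames via the named interface rather than the machine's default one.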
Hi,
I am trying to set up a Hadoop cluster (using hadoop-0.20.2) on a bunch
of machines, each of which has two interfaces: a control interface and an
internal interface. I want only the internal interface to be used for running
Hadoop (all Hadoop control and data traffic is to be sent only using the intern