Could not get additional block while writing hundreds of files

2013-07-03 Thread Manuel de Ferran
Greetings all, we are trying to import data into an HDFS cluster, but we hit random exceptions. We are trying to figure out the root cause: misconfiguration, too much load, ... and how to solve it. The client writes hundreds of files with a replication factor of 3. It sometimes crashes at the beginning,
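[For context, a minimal sketch, not from the thread, of the kind of bulk-write loop that can trigger this, using the standard Hadoop Java FileSystem API; the paths, file count, payload, and block size are hypothetical:]

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class BulkWriter {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            byte[] payload = new byte[64 * 1024]; // dummy data

            // Write many files with replication factor 3. Under heavy load
            // the NameNode may fail to allocate a block for one of them,
            // surfacing as "Could not get additional block" on the client.
            for (int i = 0; i < 500; i++) {
                Path p = new Path("/import/file-" + i);
                // create(path, overwrite, bufferSize, replication, blockSize)
                try (FSDataOutputStream out =
                         fs.create(p, true, 4096, (short) 3, 128L * 1024 * 1024)) {
                    out.write(payload);
                }
            }
        }
    }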

Re: Could not get additional block while writing hundreds of files

2013-07-04 Thread Manuel de Ferran
> 2863795616 could only be replicated to 0 nodes, instead of 1

This indicates you don't have enough space on HDFS. Can you check the cluster capacity used?

On Thu, Jul 4, 2013 at 12:14 AM, Manuel de Ferran <manuel.defer...@gmail.com>
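[The suggested capacity check can be done from the shell with "hdfs dfsadmin -report". Below is a minimal programmatic sketch using the standard Hadoop FileSystem status API; the printed format is just illustrative:]

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.FsStatus;

    public class CapacityCheck {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            FsStatus status = fs.getStatus();

            long capacity = status.getCapacity();   // total raw capacity
            long used = status.getUsed();           // bytes in use
            long remaining = status.getRemaining(); // bytes free

            System.out.printf("capacity=%d used=%d remaining=%d (%.1f%% used)%n",
                    capacity, used, remaining, 100.0 * used / capacity);
        }
    }

[Note that "could only be replicated to 0 nodes" can also appear when individual DataNodes are full or not reporting, even if aggregate capacity looks fine, so a per-node view from dfsadmin -report is worth checking too.]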

DistCp fails on open files

2013-07-04 Thread Manuel de Ferran
Hi all, I'm trying to copy files from a source HDFS cluster to another. But numerous files are open for writing, and DistCp fails on those. I've found a reference to this in JIRA: https://issues.apache.org/jira/browse/MAPREDUCE-2160 Any workaround? Has anyone faced this too? Thanks
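[One possible workaround, a sketch not taken from the thread: build the source list yourself, skipping files still open for writing, then feed it to "distcp -f <listfile>". This relies on DistributedFileSystem.isFileClosed, which exists in Hadoop 2.x (HDFS-4525) but not in older releases; the directory "/source/data" is hypothetical:]

    import java.util.ArrayList;
    import java.util.List;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hdfs.DistributedFileSystem;

    public class ClosedFilesOnly {
        // Recursively collect only files that are fully closed, so DistCp
        // does not fail on files still open for writing.
        static List<Path> listClosedFiles(DistributedFileSystem dfs, Path dir)
                throws Exception {
            List<Path> closed = new ArrayList<>();
            for (FileStatus st : dfs.listStatus(dir)) {
                if (st.isDirectory()) {
                    closed.addAll(listClosedFiles(dfs, st.getPath()));
                } else if (dfs.isFileClosed(st.getPath())) {
                    closed.add(st.getPath());
                }
            }
            return closed;
        }

        public static void main(String[] args) throws Exception {
            // Assumes fs.defaultFS points at the source HDFS cluster.
            DistributedFileSystem dfs =
                    (DistributedFileSystem) FileSystem.get(new Configuration());
            for (Path p : listClosedFiles(dfs, new Path("/source/data"))) {
                System.out.println(p); // write this list to the -f listfile
            }
        }
    }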