Regarding the EOFException, my guess is that some nodes are acting
flaky. What version of Hadoop are you running?
Brock
On Mon, Nov 5, 2012 at 8:43 PM, Cameron Gandevia cgande...@gmail.com wrote:
Hi
I started noticing the following error on our Flume nodes and was wondering
if anyone had any
Thanks for the reply. It looks like the cause was our DataNodes throwing
the following exception:
java.io.IOException: xceiverCount 2050 exceeds the limit of concurrent
xcievers 2048
I increased this setting and now everything seems to run correctly.
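For anyone hitting the same error: the limit in that message is controlled by the dfs.datanode.max.xcievers property (the misspelling is part of the historical property name) in hdfs-site.xml on each DataNode. The original poster doesn't say what value they raised it to, so the value below is only an illustrative sketch:

```xml
<!-- hdfs-site.xml on each DataNode; a DataNode restart is needed to pick this up -->
<property>
  <name>dfs.datanode.max.xcievers</name>
  <!-- 4096 is an example value; the limit that produced the error above was 2048 -->
  <value>4096</value>
</property>
```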
On Tue, Nov 6, 2012 at 5:00 AM, Brock Noland
Hello Sandeep
Please edit the dfs.web.ugi property in hdfs-site.xml. Its default
value is webuser,webgroup.
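For illustration, the edit being suggested would look something like this in hdfs-site.xml; the value flume,flume is only a hypothetical example of the user and group you would substitute in:

```xml
<!-- hdfs-site.xml: user,group identity HDFS applies to web/unauthenticated access -->
<property>
  <name>dfs.web.ugi</name>
  <!-- example only: use the user,group that your Flume agent actually runs as -->
  <value>flume,flume</value>
</property>
```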
Regards,
Mohammad Tariq
On Thu, Aug 16, 2012 at 11:50 PM, Sandeep Reddy P
sandeepreddy.3...@gmail.com wrote:
Hi,
I'm using flume-ng to write data from a log file to HDFS but am unable
Hi,
Using Flume, I'm unable to write to HDFS. Hadoop is working fine.
On Thu, Aug 16, 2012 at 3:35 PM, Mohammad Tariq donta...@gmail.com wrote:
Are you able to write through the HDFS shell?
On Friday, August 17, 2012, Sandeep Reddy P sandeepreddy.3...@gmail.com
wrote:
Hi,
Thanks for the
Hello Sandeep,
Sorry for the late reply. Just make sure that Hadoop and Flume are both
running under the same user, the one you have specified as the value of
dfs.web.ugi, and that this user has the proper privileges.
Regards,
Mohammad Tariq
On Fri, Aug 17, 2012 at 1:27 AM, Sandeep Reddy P