Thanks for the pointers!
As soon as I have some results, I'll post them back and let you know if the
problem is solved.

Jesse

int GetRandomNumber()
{
    return 4; // Chosen by fair dice roll.
              // Guaranteed to be random.
} // xkcd.com



On Wed, Oct 21, 2009 at 4:46 AM, Andrzej Bialecki <a...@getopt.org> wrote:

> Jesse Hires wrote:
>
>> I tried asking this over on the nutch-user list, but I am seeing very
>> little traction, so I thought I'd ask the developers. I realize this is
>> most likely a configuration problem on my end, but I am very new to
>> Nutch, so I am having a hard time understanding where I need to look.
>>
>> Does anyone have any insight into the following error I am seeing in the
>> hadoop logs? Is this something I should be concerned with, or is it expected
>> that this shows up in the logs from time to time? If it is not expected,
>> where can I look for more information on what is going on?
>>
>
> It's not expected at all - this usually indicates a config error, FS
> corruption, conflicting DNS (e.g. the same name resolving to different
> addresses on different nodes), or a permissions problem (e.g. a daemon
> started remotely runs with a uid/permissions/env that doesn't allow it
> to create/delete files in the data dir). It may also be a corner case
> where processes run out of file descriptors - check ulimit -n and set
> it to a value higher than 4096.
>
> Please also run fsck / and see what it says.
>
>  I can also provide config files if needed.
>>
>
> We just need the modifications in hadoop-site.xml; that's where the
> problem is likely located.
>
>
> --
> Best regards,
> Andrzej Bialecki     <><
>  ___. ___ ___ ___ _ _   __________________________________
> [__ || __|__/|__||\/|  Information Retrieval, Semantic Web
> ___|||__||  \|  ||  |  Embedded Unix, System Integration
> http://www.sigram.com  Contact: info at sigram dot com
>
>
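
For anyone else hitting this, the two checks Andrzej suggests above can be sketched roughly as follows. This is only a sketch, assuming a Bourne-compatible shell on each node; the 8192 value is an illustrative choice, not a number from this thread:

```shell
# Check the per-process open-file-descriptor limit on each node;
# Andrzej suggests it should be higher than 4096 for Hadoop daemons.
limit=$(ulimit -n)
echo "open file descriptor limit: $limit"

# "unlimited" is fine; otherwise warn when the limit is at or below 4096.
if [ "$limit" != "unlimited" ] && [ "$limit" -le 4096 ]; then
    echo "limit is low; raise it for the daemon user, e.g.: ulimit -n 8192"
fi

# Then, from a node with the cluster's hadoop CLI installed, run the
# HDFS consistency check Andrzej mentions (not runnable outside the cluster):
#   hadoop fsck /
```

Note that raising the limit with `ulimit -n` only affects the current shell; for a persistent change the daemon user's limit is typically set in /etc/security/limits.conf and the daemons restarted.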
