100% sure I have done that, and again, the problem is not that my
configuration isn't taking effect. The problem is that my application
uses MultipleTextOutputFormat, which may create 500,000 files, and
Linux does not allow that many open files, for whatever reason. If I
set the limit too high, it simply ignores it. (A bucketing workaround
that caps the open-file count is sketched below, after the quoted
thread.)
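
For reference, on most Linux systems a nofile value in
/etc/security/limits.conf above the kernel's fs.nr_open cap (1048576
by default) is typically rejected outright rather than applied, which
may be why a very high limit appears to be ignored. A minimal sketch,
assuming a stock kernel (the user name and values are illustrative,
not from this thread):

    # /etc/security/limits.conf
    mike   soft   nofile   600000
    mike   hard   nofile   600000

    # if nofile must exceed the default caps, raise the kernel
    # ceilings first
    sysctl -w fs.nr_open=2000000    # per-process ceiling for RLIMIT_NOFILE
    sysctl -w fs.file-max=2000000   # system-wide open-file ceiling

    ulimit -n    # verify in a fresh login shell

Note that limits.conf is applied by PAM at login, so the new value
only shows up in sessions started after logging back in.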

On Wed, Jul 11, 2012 at 10:12 PM, Harsh J <ha...@cloudera.com> wrote:
> Are you sure you've raised the limits for your user, and have
> re-logged in to the machine?
>
> Logged in as the user you run Eclipse as, what do you get as the
> output if you run "ulimit -n"?
>
> On Thu, Jul 12, 2012 at 3:03 AM, Mike S <mikesam...@gmail.com> wrote:
>> To debug a specific file, I need to run Hadoop in Eclipse, and
>> Eclipse keeps throwing a "Too many open files" exception. I followed
>> the posts out there and increased the number of open files per
>> process in /etc/security/limits.conf to as high as my machine
>> accepts, and I am still getting the too-many-open-files exception
>> from Java I/O.
>>
>> I think the main reason is that I am using a MultipleTextOutputFormat
>> and my reducer can create many output files, based on my multi-output
>> logic. Is there a way to make Hadoop not open so many files at once?
>> If not, can I control when the reducer closes a file?
>
>
>
> --
> Harsh J
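
On the MultipleTextOutputFormat question above: one way to keep the
number of simultaneously open files bounded is to route keys into a
fixed number of bucket files instead of one file per distinct key, by
overriding generateFileNameForKeyValue. A minimal sketch against the
old mapred API (the class name and bucket count are assumptions, not
from this thread):

    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.lib.MultipleTextOutputFormat;

    // Caps the files a reducer holds open at NUM_BUCKETS, instead of
    // one file per distinct key.
    public class BucketedTextOutputFormat
            extends MultipleTextOutputFormat<Text, Text> {

        private static final int NUM_BUCKETS = 512; // assumed; keep below ulimit -n

        @Override
        protected String generateFileNameForKeyValue(Text key, Text value,
                                                     String name) {
            int bucket = (key.hashCode() & Integer.MAX_VALUE) % NUM_BUCKETS;
            // A path containing '/' places each bucket in its own
            // subdirectory under the job's output directory.
            return "bucket-" + bucket + "/" + name;
        }
    }

The trade-off is that records for different keys share a bucket file,
so producing true per-key files would need a downstream split; but
each reducer then stays well under the open-file limit no matter how
many distinct keys it sees.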
