To debug a specific file I need to run Hadoop inside Eclipse, and Eclipse
keeps throwing a "too many open files" exception. I followed the posts
out there about raising the per-process open-file limit in
/etc/security/limits.conf, set it as high as my machine will accept, and
I am still getting the "too many open files" exception from java.io.
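
For reference, the entries I added to /etc/security/limits.conf look
roughly like this (the value 65536 is just illustrative; I pushed it as
high as the machine would take):

    # <domain>   <type>   <item>    <value>
    *            soft     nofile    65536
    *            hard     nofile    65536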

I think the main reason is that I am using a MultipleTextOutputFormat,
and my reducer can create many output files depending on my multi-output
logic. Is there a way to make Hadoop not keep so many files open at
once? If not, can I control when the reducer closes a file?
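
To give an idea of the setup, my output format is basically the sketch
below (simplified; the class name and the key-based naming are only
illustrative, but it shows why one reducer can end up with a file open
per distinct key):

    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.lib.MultipleTextOutputFormat;

    // Simplified sketch of my multi-output logic: the output file name is
    // derived from the key, so a single reducer may write to many files.
    public class KeyBasedTextOutputFormat
            extends MultipleTextOutputFormat<Text, Text> {

        @Override
        protected String generateFileNameForKeyValue(Text key, Text value,
                                                     String name) {
            // One output file per distinct key -- this is where the number
            // of simultaneously open files blows up.
            return key.toString() + "/" + name;
        }
    }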
