I am hoping to write up an article on my project and all the cool things
I figured out about Nutch, Java, Eclipse, etc. I will go into a long and
boring dissertation at that point. For now I will keep it short and
sweet, as best I can.
I have Eclipse, Java 6, Nutch, and Hadoop running, and they all work
great, except for one thing: after all my code completes, Hadoop seems
to still have open threads. Sometimes I cannot delete the newly created
indexes, or the Hadoop log file, because I believe Hadoop still holds
them open. Basically I am using Crawl.java from the source directory.
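For context, I am driving the crawl roughly like this from my own code (the argument values here are just the ones from the standard tutorial; my actual paths and depth differ, and I'm not certain calling main() directly like this is the intended embedding API):

```java
// Sketch of how I invoke the crawler programmatically.
// org.apache.nutch.crawl.Crawl is the class behind "bin/nutch crawl";
// the seed directory name "urls" and the other values are placeholders.
import org.apache.nutch.crawl.Crawl;

public class MyCrawlDriver {
    public static void main(String[] args) throws Exception {
        Crawl.main(new String[] {
            "urls",           // directory containing seed URL lists
            "-dir", "crawl",  // output directory for segments/indexes
            "-depth", "3",    // link depth to crawl
            "-topN", "50"     // max pages fetched per level
        });
        // After this returns, Hadoop threads still appear to be alive,
        // which is what prompts my question below.
    }
}
```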

Question:
What is the proper, accepted, and safe way to shut down Nutch (and
Hadoop) after I am done with it?

FileSystem.closeAll() ??
Is that what I should be doing??
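In case it helps, here is roughly the cleanup I have been attempting after the crawl returns. I'm assuming the standard org.apache.hadoop.fs.FileSystem API here; I don't know whether this is the sanctioned shutdown sequence, which is really my question:

```java
// Sketch of my post-crawl cleanup attempt. FileSystem.get(conf)
// returns a (cached) handle to the configured filesystem, and
// FileSystem.closeAll() closes every cached FileSystem instance,
// which I hope releases the handles on the index dirs and log file.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class CrawlShutdown {
    public static void runAndCleanUp() throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        try {
            // ... run the crawl here ...
        } finally {
            // Close every FileSystem in Hadoop's internal cache.
            // Is this enough, or are there still non-daemon threads
            // left behind that hold the files open?
            FileSystem.closeAll();
        }
    }
}
```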

Thanks guys,
Ray

Oh, and Hadoop is on a single machine, right out of the box; I did
nothing special with it. Nothing.
