Hey Ray.. Great name you have there.. HA.. 

I don't actually care about deleting these files.. That is not the issue.. See, 
I have embedded Nutch in my application. That application calls Nutch over and 
over again to do crawling and index creation.. Each run leaves a thread that 
stays alive.. Eventually the thread count exceeds the native thread limit in 
Java and crashes my application.. So that is why I need to find and properly 
close down that service or whatever it is. I noticed that the Hadoop files are 
still locked, and I am taking that as a hint that it is Hadoop.. 
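
Roughly, the shape of what I am doing looks like this.. the class name and 
crawl arguments are just illustrative, not my real code, and the closeAll() 
call is the cleanup attempt mentioned further down this thread:

    import org.apache.hadoop.fs.FileSystem;
    import org.apache.nutch.crawl.Crawl;

    public class EmbeddedCrawl {
        // Called over and over by the host application; each pass leaves
        // some thread behind after Crawl.main() returns.
        public static void runOnce(String[] crawlArgs) throws Exception {
            Crawl.main(crawlArgs);  // e.g. {"urls", "-dir", "crawl", "-depth", "3"}
            FileSystem.closeAll();  // releases Hadoop's cached FileSystem handles,
                                    // but the stray thread still survives
        }
    }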

Bottom line is:

When you run Crawl from the java directory, some thread stays open.. That 
thread is killing me.. What is it that stays alive past the completion of the 
Crawl.java code? 
If you run org.apache.nutch.crawl.Crawl from within Java/Eclipse, something 
stays alive.. How to close that thing down is the issue.. 
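
One way to pin down which thread it is would be to dump the live threads 
right after Crawl returns.. just a rough sketch using the plain JDK API, 
nothing Nutch- or Hadoop-specific assumed:

    // List every live thread so the one that outlives the crawl stands out;
    // a non-daemon thread here is what keeps the JVM (and its native
    // threads) pinned.
    for (Thread t : Thread.getAllStackTraces().keySet()) {
        System.out.printf("%s daemon=%b state=%s%n",
                          t.getName(), t.isDaemon(), t.getState());
    }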

See what I am asking.. 

Ray, the other ray.. 

-----Original Message-----
From: Raymond Balmès [mailto:[email protected]] 
Sent: Thursday, April 23, 2009 8:23 AM
To: [email protected]
Subject: Re: Hadoop thread seems to remain alive

Same problem... even rebooting the PC does not always solve the issue;
the files remain locked.

I have gone the brutal route and used unlocker.exe, but I mean to find out
what's going wrong, so I will keep you posted on this one.

-Ray-

2009/4/23 Lukas, Ray <[email protected]>

> Question:
> What is the proper accepted and safe way to shut down nutch (hadoop)
> after I am done with it?
>
> FileSystem.closeAll() ??
> I did try this and no luck. Anyone else having this problem?
>
> Thanks guys.. If/when I find it, I will post it for everyone.
> Ray
>
