Here is (I hope) the complete info for my question:

I use Linux/Unix, with Python 2.3. I work on a single file at a time, but I have to check thousands of files in a loop. Because there are thousands of files to process, I don't want to call something like os.system('lsof filename') for each one; lsof is slow when it is invoked thousands of times. Is there another way, some Python call sequence, to check whether a file is open just before I process it? I am not worried that "the file may be written to after I access it"; the important point is the time at which I first access it. My routine is something like:

    for i in listoffiles:
        checkfileopen(i)
        processfile()

That's it. I hope this clears up my question. Thank you a lot.

---------- Forwarded message ----------
From: Chris McAloney <[EMAIL PROTECTED]>
Date: 16 Apr 2008 17:00
Subject: Re: is file open in system ? - other than lsof
To: python-list@python.org

On 16-Apr-08, at 9:20 AM, A.T.Hofkamp wrote:
> On 2008-04-16, bvidinli <[EMAIL PROTECTED]> wrote:
>> Is there a way to find out whether a file is open in the system,
>> other than lsof? lsof is too slow for me; I deal with thousands of
>> files, so I need a faster, Python-level way to do this. Thanks.
>
> This is not a Python question but an OS question.
> (Python is not going to deliver what the OS doesn't provide.)
>
> Please first find an alternative way at the OS level (i.e. ask this
> question in an appropriate OS newsgroup). Once you have found that,
> you can think about Python support for that alternative.

I agree with Albert that this is very operating-system specific.

Since you mentioned 'lsof', I'll assume that you are at least using a Unix variant, meaning that the fcntl module will be available to you, so you can check whether the file is already locked.

Beyond that, I think more information on your application would be necessary before we could give you a solid answer. Do you only need to know whether the file is open, or only the files that are open for writing? If you only care about the files that are open for writing, then checking for a write lock with fcntl will probably do the trick (a sketch of this idea follows after the thread).

Are you planning to check all of the thousands of files individually to determine whether they're open? If so, I think it's unlikely that doing this from Python will actually be faster than a single 'lsof' call.

If you're on Linux, you might also want to have a look at the /proc directory tree ("man proc"), as this is where lsof gets its information on Linux machines (see the second sketch below).

Chris
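
Below is a minimal sketch of the fcntl write-lock check Chris suggests, for Python 2 on Unix. The helper name is_write_locked is invented for this sketch; listoffiles and processfile are the names from the original routine above. Note that fcntl/lockf locking on Unix is advisory, so this only detects files on which the writing process has itself taken a lock, and the test opens each file with 'r+', which needs write permission. The nested try blocks keep it valid on Python 2.3.

    import fcntl

    def is_write_locked(path):
        # Hypothetical helper (name made up for this sketch).
        # Opening with 'r+' requires write permission on the file.
        f = open(path, 'r+')
        try:
            try:
                # Try to take an exclusive lock without blocking.
                fcntl.lockf(f, fcntl.LOCK_EX | fcntl.LOCK_NB)
            except IOError:
                return True                   # another process holds a conflicting lock
            fcntl.lockf(f, fcntl.LOCK_UN)     # we got the lock, so release it again
            return False
        finally:
            f.close()

    # Usage in a loop like the one in the original post.
    listoffiles = []                          # fill in with your own paths
    for i in listoffiles:
        if not is_write_locked(i):
            processfile(i)                    # your existing processing function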
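And here is a sketch of the /proc approach Chris mentions, Linux only. Instead of running lsof once per file, it walks /proc/<pid>/fd a single time and builds a set of every path currently open by any process, then checks each file against that snapshot. The helper name open_files_snapshot is invented; it typically needs root to see other users' file descriptors, and, as the original post asks, it only answers whether a file was open at the moment the snapshot was taken.

    import os
    from sets import Set          # the built-in set type only arrived in Python 2.4

    def open_files_snapshot():
        # Hypothetical helper: one pass over /proc instead of one lsof call per file.
        open_paths = Set()
        for pid in os.listdir('/proc'):
            if not pid.isdigit():
                continue                      # skip /proc entries that are not processes
            fd_dir = os.path.join('/proc', pid, 'fd')
            try:
                fds = os.listdir(fd_dir)
            except OSError:
                continue                      # process exited, or permission denied
            for fd in fds:
                try:
                    # Non-file descriptors show up as e.g. "socket:[1234]"; harmless extras.
                    open_paths.add(os.readlink(os.path.join(fd_dir, fd)))
                except OSError:
                    continue                  # descriptor went away between listdir and readlink
        return open_paths

    snapshot = open_files_snapshot()
    listoffiles = []                          # fill in with your own paths
    for i in listoffiles:
        if os.path.abspath(i) not in snapshot:
            processfile(i)                    # your existing processing function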
my subject is a single file at a time. but i have to check thousands of files in a loop. so, since there is thousands of files to process, i dont want to use like os.system('lsof filename') because lsof is slow regarding, when you called it thousands of times. is there another way, any python command sequence that i can check if a file is open at the time of "before i process file" i am not interested in " the file may be written after i access it.." the important point is " the time at i first access it." my routine is something like: for i in listoffiles: checkfileopen(i) processfile() thats it... i hope this will clear my question.. thank you a lot. ---------- Forwarded message ---------- From: Chris McAloney <[EMAIL PROTECTED]> Date: 16.Nis.2008 17:00 Subject: Re: is file open in system ? - other than lsof To: python-list@python.org On 16-Apr-08, at 9:20 AM, A.T.Hofkamp wrote: > On 2008-04-16, bvidinli <[EMAIL PROTECTED]> wrote: >> is there a way to find out if file open in system ? - >> please write if you know a way other than lsof. because lsof if >> slow for me. >> i need a faster way. >> i deal with thousands of files... so, i need a faster / python way >> for this. >> thanks. > > This is not a Python question but an OS question. > (Python is not going to deliver what the OS doesn't provide). > > Please first find an alternative way at OS level (ie ask this > question at an > appropiate OS news group). Once you have found that, you can think > about Python > support for that alternative. I agree with Albert that this is very operating-system specific. Since you mentioned 'lsof', I'll assume that you are at least using a Unix variant, meaning that the fcntl module will be available to you, so you can check if the file is already locked. Beyond that, I think more information on your application would be necessary before we could give you a solid answer. Do you only need to know if the file is open, or do you want only the files that are open for writing? If you only care about the files that are open for writing, then checking for a write-lock with fcntl will probably do the trick. Are you planning to check all of the "thousands of files" individually to determine if they're open? If so, I think it's unlikely that doing this from Python will actually be faster than a single 'lsof' call. If you're on Linux, you might also want to have a look at the /proc directory tree ("man proc"), as this is where lsof gets its information from on Linux machines. Chris -- http://mail.python.org/mailman/listinfo/python-list -- İ.Bahattin Vidinli Elk-Elektronik Müh. ------------------- iletisim bilgileri (Tercih sirasina gore): skype: bvidinli (sesli gorusme icin, www.skype.com) msn: [EMAIL PROTECTED] yahoo: bvidinli +90.532.7990607 +90.505.5667711 -- http://mail.python.org/mailman/listinfo/python-list