Hi Sylvain,

On 09/07/2011 10:48 AM, Sylvain Thénault wrote:
> Hi,
> 
> On 13 August 04:02, Gelonida N wrote:
>> After a certain number of files I get error messages about 'too many
>> open file handles'
>  
> several people have reported this pb, generally using windows.

Yes, I was running on Windows, as the code to be analyzed contains some
Windows-specific imports.
An analysis under Linux would be incomplete.


> * pylint should drop/close it once processing of a module is over
That would be great. :-)
>  
>> I will investigate further whenever I have time again to look at this issue.
>> Currently I fell back to run one pylint process for each file, which is
>> horribly slow (~40 minutes) but working, as I have to finish some other
>> tasks urgently and as the run time is not the biggest problem at the moment.
> 
> PyLint has not built its reputation by being quick ;)

For small files it's not really pylint's analysis of the code that takes
that much time, but mainly loading Python and pylint itself, which
accounts for roughly 70 to 95 % of the run time.

That's why I tried to speed pylint up by not having to load Python and
pylint again for every file; the speedup would be considerable.
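
Roughly what I have in mind is driving pylint from one long-running
interpreter instead of starting a fresh one per file. Just a rough sketch,
nothing more: passing a list of command-line style arguments to
pylint.lint.Run() is how pylint's own script invokes it, the glob pattern
is a placeholder, and whether this actually avoids the file-handle
build-up is exactly the open question.

    # Sketch: analyze many files from a single Python process instead of
    # spawning python + pylint once per file.  Run() may call sys.exit()
    # when it is done, hence the SystemExit guard.
    import glob
    from pylint import lint

    for path in glob.glob("myproject/*.py"):
        try:
            lint.Run([path])   # one analysis run per file, same process
        except SystemExit:
            pass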

>  
>> What I wanted to know in general is following:
>> Does pylint 'only' analyze all files or does it really import the code
>> to be analyzed?
> 
> pylint doesn't actually import code (beside C-compiled module, as this
> is the only way to get a clue about what's inside)
Thanks a lot, this helps my understanding.
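
A quick way to see that for pure Python code (file name and message are
made up, only for illustration):

    # side_effect_demo.py -- hypothetical file, just for illustration.
    # "pylint side_effect_demo.py" analyzes the file without printing the
    # line below, while actually importing or running the module does.
    print("this line only runs if the module is really executed")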

>  
>> The reason why I'm asking is, is whether I should look out for commands,
>> which are not protected with
>>
>> if __name__ == '__main__': statements
>> (This might be one reason for too  many open file handles)
> 
> As I said above, this is probably not the problem.
>  
Thanks again for your detailed answer.

So it seems I am stuck with having to start pylint for each file, or with
a system where I create a new process for every N files to be analyzed.

The question is whether I can find a reasonable N.
Not knowing the internals, I am afraid that the number of files that can
be linted before failure will depend on the contents of the Python code
and the number of sub-modules to be analyzed.
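
Something like the following is what I would try, with N, the file list
and the exact pylint command line as placeholders to tune:

    # Batching sketch: one fresh pylint process per chunk of N files, so
    # any open handles are released when the child process exits.
    import subprocess
    import sys

    N = 50                       # guess; the "reasonable N" question
    files = sys.argv[1:]         # e.g. all .py files collected beforehand

    for start in range(0, len(files), N):
        chunk = files[start:start + N]
        subprocess.call(["pylint"] + chunk)   # new process per chunk, not per file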


_______________________________________________
Python-Projects mailing list
[email protected]
http://lists.logilab.org/mailman/listinfo/python-projects
