On 15 April 2016 at 11:25,  <cshin...@gmail.com> wrote:
> The input was a 4MB file. Even after returning from the 'fileopen'
> function the 4MB of memory was not released. I checked the htop output
> while the loop was running: the resident memory stayed at 14MB. So
> unless the process is stopped the memory stays with it.

When exactly memory gets returned to the OS is unclear: CPython's
allocator keeps freed blocks around for reuse, and even memory that it
does release may be held on to by the C library's malloc. What matters
is that your process can reuse the same memory. The real question is
whether continuously allocating and deallocating leads to steadily
growing memory usage. If you change your code so that it calls fun
inside the loop, you will see that repeatedly calling fun does not
lead to growing memory usage.
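
For example, here's a quick way to check. This is just a sketch: 'fun'
below is a stand-in I've made up for your 'fileopen', allocating
roughly 4MB and dropping it on return; the resource module is
Unix-only, and ru_maxrss is in kilobytes on Linux:

import resource

def fun():
    # Stand-in for 'fileopen': allocate ~4MB and drop it on return.
    data = b"x" * (4 * 1024 * 1024)
    return len(data)

def peak_rss_kb():
    # Peak resident set size of this process so far.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

fun()
print("after one call:  ", peak_rss_kb())
for _ in range(1000):
    fun()
print("after 1000 calls:", peak_rss_kb())

If the second number is no bigger than the first then the same ~4MB is
being reused on every iteration rather than accumulating.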

> So if the celery worker is not killed after its task is finished, it is
> going to keep the memory for itself. I know I can use the
> **max_tasks_per_child** config value to kill the process and spawn a new
> one. **Is there any other way to return the memory to the OS from a
> Python process?**

I don't really understand what you're asking here. You're running
celery in a subprocess, right? Is the problem the memory used by
subprocesses that aren't killed, or the memory usage of the main
Python process?
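
For reference, the max_tasks_per_child setting you mention is normally
set in the celery config or on the worker command line. A sketch,
assuming celery 3.x naming (check your version's docs):

# celeryconfig.py: recycle each worker process after 100 tasks
CELERYD_MAX_TASKS_PER_CHILD = 100

# or on the command line:
#   celery -A proj worker --maxtasksperchild=100

But as you say, that only recycles worker processes; it doesn't return
memory from a live process.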

--
Oscar