On Tue, 19 Jun 2018 16:47:46 +0200
Martin Bammer <mrb...@gmail.com> wrote:

> Hello,
>
> Because Python is a very dynamic language, memory management is used
> heavily. A lot of time is spent creating objects (reserving memory and
> filling in the object structure) and destroying them.
Do you have numbers to back that up? One way to get them would be to collect
profiling data with Linux "perf" on a real Python workload you care about.

> And here comes the idea for POPT. With this idea the Python interpreter
> runs several threads in the background (one thread per object type), each
> managing a set of objects as an object cache. Every object in the cache is
> already preconfigured by its object provider thread, so only the parts of
> the object structure that are individual to each use need to be
> initialized. This saves a lot of processing time for the main thread, and
> memory management has much less to do, because temporarily unused objects
> can be reused immediately.

How does the main thread (or rather, the multiple application threads)
communicate with the background object threads? What is the communication
and synchronization overhead in this scheme?

Regards

Antoine.
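For a first rough number from pure Python (a micro-benchmark only, and no
substitute for profiling a real workload with perf), something along these
lines measures the raw cost of creating and discarding a small object; the
"Point" class and the figure it prints are purely illustrative:

    # Crude micro-benchmark of object creation/destruction cost.
    # It measures allocator plus constructor overhead for a toy class only;
    # a real workload profile (e.g. with perf) would look quite different.
    import timeit

    class Point:
        __slots__ = ("x", "y")
        def __init__(self, x, y):
            self.x = x
            self.y = y

    def noop(x, y):
        pass

    # Create (and immediately discard) one million small objects.
    create = timeit.timeit("Point(1, 2)", globals=globals(), number=1_000_000)
    # Baseline: the same call overhead without any allocation.
    baseline = timeit.timeit("noop(1, 2)", globals=globals(), number=1_000_000)

    print(f"approx. per-object cost: {(create - baseline) * 1000:.0f} ns")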
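To make the second question concrete, here is a minimal sketch of what a
queue-based object provider thread could look like (all names here are
hypothetical and not taken from the proposal). Every hand-off between an
application thread and the provider goes through the queue's internal lock,
which is precisely the synchronization cost being asked about:

    # Hypothetical sketch of an "object provider" background thread: it keeps
    # a pool of pre-initialized objects topped up, and application code takes
    # objects from the pool instead of constructing them from scratch.
    # Each get()/put() goes through the queue's internal lock.
    import queue
    import threading

    class Node:
        __slots__ = ("value", "next")

    class NodeProvider:
        def __init__(self, capacity=1024):
            self._pool = queue.Queue(maxsize=capacity)
            threading.Thread(target=self._refill, daemon=True).start()

        def _refill(self):
            # Background thread: pre-create objects; put() blocks once the
            # pool is full, so the thread idles until objects are consumed.
            while True:
                node = Node()
                node.value = None
                node.next = None
                self._pool.put(node)

        def acquire(self, value):
            # Application thread: take a preconfigured object and fill in
            # only the per-use field; fall back to a normal allocation if
            # the pool happens to be empty.
            try:
                node = self._pool.get_nowait()
            except queue.Empty:
                node = Node()
                node.next = None
            node.value = value
            return node

        def release(self, node):
            # Hand a temporarily unused object back for reuse.
            try:
                self._pool.put_nowait(node)
            except queue.Full:
                pass  # pool is full; let the object be collected normally

Whether a scheme like this can win anything hinges on that per-object locking
cost being smaller than what CPython's existing free lists and small-object
allocator already pay, which is why real numbers matter.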