As an engineer, I can quickly knock together behavioural models of
electronic circuits, complete units, and control systems in Python,
then, annoyingly, in a few recent cases I've had to re-write them in C
for speed.
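
For a flavour of the kind of model I mean, here's a minimal sketch: a
behavioural model of an RC low-pass filter stepped with explicit Euler.
The filter and the component values are just arbitrary examples, but a
pure-Python time-stepping loop like this is exactly what crawls on
CPython and flies on PyPy:

    # Hypothetical example: behavioural model of an RC low-pass
    # filter, integrated with explicit Euler.
    def simulate_rc(v_in, r=1e3, c=1e-6, dt=1e-6):
        """Return the capacitor voltage for each sample of v_in."""
        tau = r * c
        v_out = 0.0
        out = []
        for v in v_in:
            # dVout/dt = (Vin - Vout) / RC
            v_out += (v - v_out) * dt / tau
            out.append(v_out)
        return out

    # Step response: drive the filter with 1 V for 10 ms (10 tau).
    step = [1.0] * 10_000
    response = simulate_rc(step)
    print(response[-1])   # approaches 1.0 as the capacitor charges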

I've tried PyPy, the alternative Python implementation with a
just-in-time compiler, and it is impressively fast in comparison.  But
there's no point building these models if I can't display the results
in a useful way, and at the moment PyPy simply doesn't have the huge
range of time-saving libraries that CPython has.  It's still quicker to
re-write in the more cumbersome C than to work around PyPy's gaps,
because C, like CPython, also has plenty of useful libraries.

A few years back, I recall people saying that PyPy was going to be the
future of Python, but it seems to me that CPython still has the lion's
share of the momentum: it is developing faster and gaining ever more
libraries, while PyPy is struggling to find enough contributors even to
get its NumPy support completed.

Maybe there aren't enough people like me who have really felt the need
for speed.  Or maybe it's simply an accident of the historical
development path that has set in stone an interpreter rather than a
JIT.  Has anybody got a useful perspective on this?