On Sun, 6 Nov 2016 09:17 am, Mr. Wrobel wrote:

> However the most important is second part of my question.
>
> What do you think about using GPU processing or pypy?
I don't have any experience with GPU processing. I expect that it will be
useful for some things, but for number-crunching and numeric work, I am
concerned that GPUs rarely provide correctly rounded IEEE-754 maths. That
means that they are accurate enough for games, where a few visual glitches
don't matter, but they risk being inaccurate for serious work. I fear that
doing numeric work on GPUs would be a return to the 1970s, when every
computer was incompatible with every other, and it was almost impossible to
write cross-platform, correct, accurate numeric code.

As for PyPy, it is a proven optimizing JIT compiler for Python that can
speed up long-running code very well. It is not suitable for short scripts
or programs that finish quickly: it takes time for the JIT to warm up and
start showing optimizations.

-- 
Steve
“Cheer up,” they said, “things could be worse.” So I cheered up, and sure
enough, things got worse.

-- 
https://mail.python.org/mailman/listinfo/python-list
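A small illustration of the rounding point above (my own sketch, not from
the original post): even on a fully IEEE-754-compliant CPU, where each
individual operation is correctly rounded, per-operation errors compound
when you accumulate naively. Python's math.fsum tracks those errors and
returns the correctly rounded total, which shows how much a stray ulp or
two per operation can matter for serious numeric work:

```python
import math

# 0.1 is not exactly representable in binary floating point, so each
# addition below rounds, and the tiny errors compound.
values = [0.1] * 1000
naive = sum(values)        # accumulates one rounding error per addition
exact = math.fsum(values)  # tracks the errors and rounds only once

print(naive)  # drifts slightly away from 100.0
print(exact)  # 100.0, the correctly rounded result
```

If the hardware does not even correctly round the individual operations,
as was common on GPUs, the drift is worse and differs from device to
device.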
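To see the JIT warm-up effect for yourself, here is a minimal sketch (my
own illustrative code): a pure-Python hot loop, timed repeatedly. Run it
under both CPython and PyPy; under PyPy the first run pays the warm-up
cost, while later runs of the same loop execute the compiled trace much
faster. A script that exits after one short run never gets that benefit.

```python
import time

def count_multiples(n, d):
    # Pure-Python hot loop: the kind of code PyPy's tracing JIT
    # compiles to machine code once the loop becomes "hot".
    total = 0
    for i in range(n):
        if i % d == 0:
            total += 1
    return total

if __name__ == "__main__":
    # Repeat the same call: under PyPy the first run includes JIT
    # warm-up, later runs execute the already-compiled trace.
    for run in range(3):
        start = time.perf_counter()
        result = count_multiples(5_000_000, 7)
        elapsed = time.perf_counter() - start
        print(f"run {run}: {result} multiples in {elapsed:.3f}s")
```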