On Sep 9, 2013, at 12:23 PM, Michael Torrie <torr...@gmail.com> wrote:
> On 09/09/2013 05:02 AM, Anthony Papillion wrote:
>> But (and this is stepping into *really* paranoid territory here. But
>> maybe not beyond the realm of possibility) it would not be so hard to
>> compromise compilers at the chip level. If the NSA were to strike an
>> agreement with, say, Intel so that every time a compiler ran on the
>> system, secret code was discreetly inserted into the binary, it would
>> be nearly impossible to detect and a very elegant solution to a tough
>> problem.
>
> Indeed it is really paranoid territory, but now doesn't seem quite as
> far fetched as one originally thought a few years ago! We'll still
> trust (we have to; we have no other choice), but the level of trust in
> computers in general has certainly gone down a notch and will never
> quite be the same.

I think that is pretty far fetched. It requires recognizing that a
compiler is being compiled. I'd be REALLY surprised if there were a
unique sequence of hardware instructions that was common across every
possible compiler (current and future), that couldn't occur in
arbitrary non-compiler execution, and that could therefore be used to
trigger insertion of a backdoor.

-Bill
--
https://mail.python.org/mailman/listinfo/python-list
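
[Editor's note: the scenario being debated is essentially Ken Thompson's "trusting trust" attack, where a compromised compiler (or, in Anthony's version, the hardware itself) must first *recognize* that it is compiling a compiler or a login program before injecting anything. The recognition step Bill is skeptical of can be sketched as naive pattern matching. This is a toy Python sketch only, not a real attack; every name in it (`evil_compile`, the signature strings, the `joshua` backdoor) is hypothetical.]

```python
# Toy sketch of the Thompson "trusting trust" recognition step.
# A real attack lives in a compiler binary (or silicon); here the
# "compiler" is just a pass-through that pattern-matches its input.

LOGIN_SIGNATURE = "def check_password("      # hypothetical marker for the login program
COMPILER_SIGNATURE = "def compile(source"    # hypothetical marker for a compiler

BACKDOOR = "    if password == 'joshua': return True  # injected backdoor\n"

def evil_compile(source: str) -> str:
    """'Compile' source (here: pass it through), injecting a backdoor
    into login programs and re-propagating itself into compilers."""
    if LOGIN_SIGNATURE in source:
        # Case 1: compiling the login program -> plant the backdoor
        # as the first line of the function body.
        head, sep, tail = source.partition(LOGIN_SIGNATURE)
        body_start = tail.index("\n") + 1
        return head + sep + tail[:body_start] + BACKDOOR + tail[body_start:]
    if COMPILER_SIGNATURE in source:
        # Case 2: compiling a (clean) compiler -> re-insert this logic,
        # so the backdoor survives even though the compiler source is clean.
        return source + "\n# ...evil_compile logic re-inserted here...\n"
    # Case 3: anything else compiles normally.
    return source

clean_login = (
    "def check_password(user, password):\n"
    "    return lookup(user) == password\n"
)
print("joshua" in evil_compile(clean_login))  # the backdoor was injected
```

Bill's objection is precisely that the two `*_SIGNATURE` checks above are trivial in a toy but hopeless at the instruction level: there is no fixed byte pattern shared by every present and future compiler that never appears in non-compiler code.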