32 vs 64 bits
Hello world. I have a stupid question, just to make things clear in my head: is it really an issue to use a 64-bit int on a 32-bit machine?

On a 32-bit machine the compiler will have to split each 64-bit arithmetic operation into at least two 32-bit instructions. That said, unless you're doing nothing but arithmetic, I wouldn't worry all that much about its performance impact compared to things like algorithmic complexity, cache friendliness, and the amount of …
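To make that first point concrete, here's a minimal C sketch of the kind of lowering a compiler has to do for 64-bit addition on a 32-bit target (the `u64_pair` struct and `add64` helper are made up just for illustration): one plain 32-bit add for the low halves, then a second add that folds in the carry.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical illustration: a 64-bit value held as two 32-bit halves,
 * the way a 32-bit target has to represent it in registers. */
typedef struct {
    uint32_t lo;
    uint32_t hi;
} u64_pair;

/* 64-bit addition using only 32-bit operations: add the low halves,
 * detect the carry-out, then add the high halves plus the carry.
 * These are the "at least two 32-bit instructions" mentioned above. */
static u64_pair add64(u64_pair a, u64_pair b) {
    u64_pair r;
    r.lo = a.lo + b.lo;
    uint32_t carry = (r.lo < a.lo); /* unsigned wraparound means a carry occurred */
    r.hi = a.hi + b.hi + carry;
    return r;
}

int main(void) {
    /* 0x00000001_FFFFFFFF + 1 should carry into the high word. */
    u64_pair a = { 0xFFFFFFFFu, 0x00000001u };
    u64_pair b = { 1u, 0u };
    u64_pair r = add64(a, b);
    printf("hi=0x%08X lo=0x%08X\n", r.hi, r.lo); /* prints hi=0x00000002 lo=0x00000000 */
    return 0;
}
```

You never write this by hand, of course; the compiler emits an add / add-with-carry pair (e.g. ADD/ADC on x86) for you. The point is just that each 64-bit operation costs a couple of 32-bit ones rather than one.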