On Monday, 8 April 2013 at 06:35:27 UTC, Paulo Pinto wrote:
I do understand that. The thing is, I have been coding since 1986, and I remember people complaining that C and Turbo Pascal were too slow: let's code everything in Assembly. Then C became alright, but C++ and Ada were too slow; god forbid you call virtual methods, or do any operator calls in C++'s case.

Afterwards the same discussion came around with the JVM and .NET environments, which, while making GC widespread, also had the sad side effect of making younger generations think that safe languages require a VM, when that is not true.

Nowadays template-based code beats C, systems programming is moving to C++ in mainstream OSes, leaving C behind, while some security-conscious areas are adopting Ada and SPARK.

So when someone today claims speed benefits for C and C++, I smile, as I remember having this same kind of discussion back when C played the role of the too-slow language.

But the important question is: what has changed? Was it just a shift in programmer opinion, meaning C code was mislabeled as slow in the first place, or was real progress in compiler optimizations the game-changer? The same question applies to GCs and VMs.

It may be perfectly possible to design a GC that suits real-time needs and is fast enough (well, Manu has mentioned some of the requirements it would need to satisfy). But if embedded developers have to wait until a tool stack that advanced is produced for D before they can use it, that is pretty much the same as saying D is dead for embedded. Mythical "clever enough compilers" are good in theory, but the job needs to be done right now.
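
In the meantime the usual way to get the job done is to keep the collector out of the real-time path entirely, for example by allocating through the C heap. A minimal sketch in D, assuming a hypothetical Frame struct and processFrame step (not from any real codebase):

import core.stdc.stdlib : malloc, free;

struct Frame
{
    float[64] samples; // fixed-size buffer, no GC allocation
}

void processFrame()
{
    // Allocate from the C heap so the GC never runs on this path.
    auto frame = cast(Frame*) malloc(Frame.sizeof);
    if (frame is null) return;
    scope(exit) free(frame);

    foreach (i; 0 .. frame.samples.length)
        frame.samples[i] = 0.0f;
}

Nothing here touches the GC heap, so no collection pause can interrupt the loop; the trade-off is that you give up D's memory safety guarantees for that allocation.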
