On Wed, May 17, 2017 at 04:16:59PM -0700, Walter Bright via Digitalmars-d wrote:
> On 5/17/2017 1:46 PM, H. S. Teoh via Digitalmars-d wrote:
> > People aren't willing to accept that their cherished choice of
> > language may have been the wrong one, especially if they have
> > invested much of their lives in mastering said language.
> 
> It may not be the developers that initiate this change. It'll be the
> managers and the customers who force the issue - as those are the
> people who'll pay the bill for the problems.
That may or may not force a shift to a different language. In fact, the odds are heavily stacked against a language change. Most management is concerned (and in many cases, rightly so) about the cost of rewriting decades-old "proven" software, as opposed to merely plugging the holes in the existing software. As long as they have enough coders plugging away at the bugs, they're likely to be inclined to say "good enough".

The only way this will change is if a nasty enough exploit causes a large enough incident, and if it's something that shakes the very foundations of practically all code in that language -- say, a fundamental flaw in the C standard library, or something of that scale, with no easy fix short of rewriting major chunks of all code that uses the C library. Either that, or a continuous stream of high-visibility exploits over an extended period of time, all related to flaws in C or the C standard library, such that people eventually grow disgusted enough at yet another buffer overflow or yet another stack overflow that they seriously start considering alternatives.

Barring these extreme scenarios, the more likely outcome I see is an increasing adoption of safe coding conventions, which may help in the short term but are ultimately unable to address fundamental language design issues.

Perhaps what will eventually cause a change is education: if the next generation of programmers is educated to be aware of security issues and language design issues, they may be more inclined to choose a memory-safe language when given the choice. Eventually, when they become the decision-makers, that is when the shift will happen.

> > Though from what I can tell, the WannaCry fiasco is more than merely
> > a matter of memory safety;
> 
> It may very well be. But if memory safety is part of the problem, then
> it is part of the solution.

Memory safety is only part of the story.
I'm tempted to quote you saying that it's only plugging one hole in a cheesegrater, but in this case it's a pretty darn big hole. :-P

But there are other issues that memory safety doesn't even begin to address, like race conditions, resource leakage (leading to DoS attacks), improper use (or lack of use) of cryptographically-secure primitives, access control, data sanitization, leakage of sensitive data, inherently insecure designs (e.g., backdoors), etc. After having been involved in a major code audit project at my work, I'm increasingly of the opinion that the vast majority of coders have no idea how to write secure code (or are just too indifferent to bother), and that the vast majority of non-trivial codebases running our present-day systems are riddled with security holes just waiting for someone to devise an exploit. Only some of these flaws are related to memory safety.


T

-- 
Once bitten, twice cry...