"bearophile" <bearophileh...@lycos.com> wrote in message news:h8d7tu$179...@digitalmars.com... > > Semicolons are noise, they slow down programming a little. >
That's *very* programmer-dependent. It originally took me all of about a week to get used to semicolons after growing up on BASIC (and even then it was a very, very minor time sink), and now it takes all of about a split second to press that key. But any time I use a language that doesn't allow semicolon line endings, I keep sticking them in without even thinking about it. Then the compiler complains, and I have to go back and fix it, and that slows down programming more than just instinctively hitting a key.

>> Read Cedric's blog June 2008 for example
>> http://beust.com/weblog/archives/000490.html
>
> The comments to that blog post are more intelligent and useful than the
> main post. See for example the comment by Amit Patel.
>

Thanks for pointing that out. That's a *very* good comment. And interesting too, because he talks about using switch for parsers (although actually, so does the original article), and just the other day I was writing an implementation of Haxe's preprocessor. I ended up with code like this:

switch(directive)
{
    case "#if": ...
    case "#elseif": ...
    case "#else": ...
    case "#end": ...
    case "#error": ...
    default: ...
}

Works fine. When I originally read that article, although I understood his point and agree there are (pardon the puns) many cases for which switch is the wrong choice, I was thinking "What, am I *really* supposed to turn those strings into polymorphic objects? What a pedantic waste!"