Happily, I'm stupid and completely missed the condescending tone of an evident genius. Instead I'll just be grateful that it pleased one of the D masters to drop a statement down at me at all.

On Tuesday, 20 August 2013 at 21:52:29 UTC, Andrei Alexandrescu wrote:
On 8/20/13 2:22 PM, Ramon wrote:
= vs :=

void main() {
    int a, b;
    if (a = b) {}
}

./test.d(4): Error: assignment cannot be used as a condition, perhaps == was meant?

This is a solved problem. (However, see also http://d.puremagic.com/issues/show_bug.cgi?id=10862, which I just submitted.) I haven't heard a peep about it in D and C++ circles (most C++ compilers define a similar warning, although they aren't required to).

I did as advised and found:

Now consider:

void main() {
    int a, b;
    if ((a = b) = 0) {}
}

This compiles diagnostic-free. The shape if (expr1 = expr2) should be disallowed at a grammatical level, i.e. during parsing.

Oops.
So, after all, making it invitingly easy to mix up assignment and comparison can actually be troublesome? Wow.

I've seen hints from the authors of D themselves that '++' and '--' might not be the wisest course of action. So I stand here asking "Why the hell did they implement them?"

That would be news to me. I find "++" and "--" very useful for daily use. D also implements their overloading arguably in a more elegant way than C++ does.
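
For reference, Andrei's point about overloading: in D a single opUnary overload covers both forms, because the compiler rewrites the postfix expression in terms of the prefix one, whereas C++ requires a separate dummy-int overload for postfix. A minimal sketch, with a made-up Counter type:

struct Counter {
    int n;

    // One overload handles both ++c and c++: D rewrites the postfix
    // form in terms of the prefix one (C++ needs a second, dummy-int
    // overload for postfix).
    ref Counter opUnary(string op)() if (op == "++") {
        ++n;
        return this;
    }
}

void main() {
    auto c = Counter(3);
    ++c; // prefix
    c++; // postfix, rewritten by the compiler using the same overload
    assert(c.n == 5);
}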

I don't remember the precise spot, but somewhere (here on this site) someone from the D team said something to the effect that having pre *and* post inc/dec might not be so desirable.

Whatever, it was *my* error not to express myself precisely. Yes, ++ and -- *are* useful. What I meant (and didn't make clear) was that there are two versions, pre and post. One of them is fine; having both can create trouble.

For those who feel that life without '++' is impossible, it would be very simple to implement logic in an editor to automagically expand "x++" to "x := x + 1".

This argument is unlikely to do very well with this crowd.

So do other arguments in a C++ crowd. Happily enough you thought what you thought anyway, and that made you work on D.

Having seen corporations in serious trouble because their system broke (or happily continued to run, albeit producing erroneous data ...) over this "small detail", I have a hard time defending '++'. ("Save 5 seconds of typing a day and risk your company!")

Very interesting! What pernicious effects does "++" have?

Same thing as above. Pre-incrementing, e.g., a pointer that should be post-incremented can lead to pretty ugly situations.
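
To make that concrete, a minimal sketch (a made-up buffer walk, not code from any real system):

void main() {
    import std.stdio : writeln;

    int[4] buf = [10, 20, 30, 40];
    int* p = buf.ptr;

    int first = *p++;  // intended: read buf[0], then advance
    int second = *++p; // typo: advances first, silently skipping buf[1]

    writeln(first, " ", second); // prints "10 30" instead of "10 20"
}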

Another issue (from an Ada background): Why "byte" ... (the complete series up to) ... "cent"? Bytes happen to be important for CPUs - not for the world out there. I wouldn't like to count the gazillion cases where code went belly up because something didn't fit in 16 bits. Why not the other way around, why not the whole she-bang, i.e., 4 (or 8) bytes as the default for a single fixed-point type ("int"), plus a mechanism to specify what actually is needed? So for days in a month we'd have "int'b5 dpm;" (2^x notation) or "int'32 dpm;"?

As a rule of thumb, primitive types model primitive machine types. For everything else there are libraries. Or should be :o).
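
For illustration, a minimal sketch of what such a library type could look like in D - the Ranged template here is hypothetical, not an actual Phobos type:

struct Ranged(int min, int max) {
    private int value = min;

    this(int v) { opAssign(v); }

    // Every assignment is range-checked at run time.
    void opAssign(int v) {
        assert(v >= min && v <= max, "value out of range");
        value = v;
    }

    alias value this; // reads convert back to a plain int
}

void main() {
    Ranged!(1, 31) dpm = 17; // days in a month: fine
    // dpm = 42;             // would fail the range check at run time
}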

This happens to be a nice example of perspective. C's perspective (by necessity) was resource-oriented, along the lines of "offer an 8-bit int so as not to waste 16 bits where 8 bits suffice".
Yet we still do that in the 21st century rather than acting more *human-oriented* by leaving the decision about size to the human. Don't underestimate that! The mere act of reflecting on how much storage is needed is valuable and helps to avoid errors.

There's very much wrong in this. Byte-level access is necessary in a systems language for a variety of reasons, of which storing individual integers is probably the least interesting.

There is a reason or explanation for everything, no matter what.

But my point was another one. It was about another perspective. C's and, so it seems, D's perspective is "What's natural with a CPU" - mine is "What's natural and useful for humans trying to solve problems".

With all respect, Andrei, your argument doesn't mean too much to me because if the job at hand is pure low-level systems programming, I'll do it in C anyway. Furthermore, it is well understood nowadays that it might be smart to split even an OS design into a HAL (C/Asm) and higher-level stuff (in a higher-level language). Does it hurt performance to do everything in 32 bits rather than, say, 16 bits (on a 32- or 64-bit CPU)? Last time I looked at CPU specs, no.

Finally, yes, some systems-level programming needs byte pointers. Well, how difficult or expensive could it be to implement one? I think it's well within the feasible range.

But again, my argument wasn't about system programming, which, of course, is the perfect argument for D. OTOH: Is D really and only meant for systems programming? Hardly.

Deprecating "=" in favor of ":=" would solve a problem that doesn't exist, and would create a whole new one.

It *does* exist. Unless, of course, you are content to treat not just myself as a lowly moron, but the creators of Pascal, Ada, Eiffel, and others as well (and, btw, yourself too - have a look at the link you provided above ...)

And, just for completeness' sake: May I ask *what* new problems ':=' would create? Other than adding a single char in the parser, that is.


I wish D had done all the miraculous things it did - and then, on top, had allowed itself the luxury of being more human-centric rather than sticking to a paradigm that was necessary 50 years ago (and even then not good, just necessary).

I understand we all have our preferences, but reifying them to absolutes is specious. If you said "Yo dudes, I don't dig '=' for assignments and '++' and '--' for {inc,dec}rement. Also I don't give a rat's tail on dem fixed small integers. Other than dat, y'all boys did a rad job. Peace." - well that would have been easier to empathize with.

Oh master, now that the man, to whom I would have referred as "Huh? Andrei WHO??" 2 weeks ago, has told me, I understand that it can't be tolerated to have a personal style of thinking and putting things. Can I be forgiven, if I pray 10 Ave Marias and clean your shoes with my worthless lips?

Just btw: I don't care rat sh*t whether you empathize with me or like me.

BTW: I write this because D means a lot to me, not to bash it. For Java, to name an ugly example, I never wasted a single line of criticism; it's just not worth it. So, please, read what I say as being written in a warm tone and not negatively minded.

Awesome, thank you and keep destroying.

"destroying"??? Which part of "not to bash it" and of "D means a lot to me" and of "D is, no doubts, an excellent and modern incarnation of C/C++. As
far as I'm concerned D is *the* best C/C++ incarnation ever,
hands down." was too complicated to understand for your genius brain?

Just in case you are able to be professional on a human level for a moment: I want to buy your book and did look for it. It seems, though, that it's not easily available in Germany (amazon.de). Would you happen to have a hint for me where I can get it over here? Thanks.
