[stronger-than-usual doses of history, speculation, and opinion follow]

Hi Alex,
At 2026-02-21T23:52:08+0100, Alejandro Colomar wrote:
> I'd like to know why you did that.
>
> commit 4b7a0fe5ab5d5155bd499cf9506a91a1f4bc0125
> Author: G. Branden Robinson <[email protected]>
> Date:   2025-12-06 18:26:02 -0600
>
>     src/utils/xtotroff/xtotroff.c: Fix code style nit.
>
>     * src/utils/xtotroff/xtotroff.c:
>       (CanonicalizeFontName, FontNamesAmbiguous, MapFont, main):
>       Explicitly cast unused return values of printf(3)-family
>       functions to `void`.
>
> I've been updating code to remove those casts, because they don't do
> much good.  It's essentially just noise.

I disagree.  I feel that one should never call a function with no
evident awareness of its return type.  (If its return type is "void",
then no cast should be made!)

I acknowledge that type-slovenliness is a proud tradition in C, but I
reject it.  It only makes life harder for medium- to large-scale
projects.  Languages like Pascal and Ada distinguish "procedures"
(which affect the state of the system only via "side effects") from
"functions", which might be "pure", but even if not, generally
communicate information back to their callers via return values.

(Sometimes, function arguments are mutable.  Ada supported this from
early days with "in" and "out" annotations on formal arguments.
Later, Fortran copied the idea with "intents".  In C, a function
parameter is "in" (passed by value) unless it's a pointer type, in
which case it can also serve as "out" (the callee writes through the
pointer), unless you qualify the pointed-to type with "const"--which
you can still cast away anyway, because C is "close to the metal",
dawg.  No proud C hacker ever lets correctness get in the way of
performance.)

Of course, historically, C functions returned ints by default, even
without a declaration, because C is descended from the "typeless"
B.[0]  And if nobody could think of anything _good_ to stick into the
return value, well, then they'd come up with something crappy and
return that.[1]  This convention persisted even for functions that
returned pointers.  Don't think too hard about what the best datum to
return is; just return _something_--like one of the arguments the
caller already knows, because they passed it in.

https://www.symas.com/post/the-sad-state-of-c-strings
https://dgtalhaven.wordpress.com/2020/05/15/schlemiel-the-painters-algorithm/
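Concretely (a throwaway sketch of mine, not anything from groff or the
articles above): strcat(3) hands you back its first argument, which
you already had, so repeated appends rescan the whole string every
time; stpcpy(3), standardized in POSIX.1-2008, returns the one datum
actually worth having--a pointer to the end of the copy.

    #define _POSIX_C_SOURCE 200809L  /* for stpcpy(3) */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
      const char *words[] = { "Hello", ", ", "world", "!" };
      char buf[32];

      /* strcat(3) returns its first argument--which the caller
         already has--so every call rescans everything copied so far:
         Schlemiel the Painter, O(n^2) overall. */
      buf[0] = '\0';
      for (size_t i = 0; i < sizeof words / sizeof words[0]; i++)
        (void) strcat(buf, words[i]);

      /* stpcpy(3) returns a pointer to the terminating null byte, so
         each copy starts where the previous one ended: O(n) overall. */
      char *end = buf;
      for (size_t i = 0; i < sizeof words / sizeof words[0]; i++)
        end = stpcpy(end, words[i]);

      (void) puts(buf);
      return 0;
    }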
My explicit discards of the return value remind the reader (often
myself) that, yes, I'm aware that printf() has a return value, and
that I don't need it.

"But what if everybody did that?  You'd clutter the world with void
typecasts!"

Yes, if I persist in using crappy APIs.  Because the printf() family
of functions has such generally unwanted and useless return values, we
might have to admit to ourselves that K&R's flagship example of
"hello, world!" had several things wrong with it,[2] and that's
blasphemy to C advocates.

I am not a C advocate.  If I ever get more experience with Ada, I'm
sure I'll have complaints about it, too.  (Actually, I have one
already, which is that object declarations aren't "constant" by
default, whereas they should have to be explicitly declared mutable.
(Rust gets this right.)  Granted, the utility of that practice was
arguably harder to foresee in the largely uniprocessor 1970s, but on
the other hand, even Ada 83 had "tasking", which already confronted
several concurrency problems.)  Needless to say, all the cool guys
decided that was crap--except for Narain Gehani,[3] who does not seem
to be remembered (fondly? or at all?)--and decided concurrency wasn't
one of C's core competencies, and shouldn't become one.

But don't take my word for it.

"C provides no operations to deal directly with composite objects such
as character strings, sets, lists, or arrays.  There are no operations
that manipulate an entire array or string, although structures may be
copied as a unit.  The language does not define any storage allocation
facility other than static definition and the stack discipline
provided by the local variables of functions; there is no heap or
garbage collection.  Finally, C itself provides no input/output
facilities; there are no READ or WRITE statements, and no built-in
file access methods.  All of these higher-level mechanisms must be
provided by explicitly-called functions.  Most C implementations have
included a reasonably standard collection of such functions.

Similarly, C offers only straightforward, single-thread control flow:
tests, loops, grouping, and subprograms, but not multiprogramming,
parallel operation, synchronization, or coroutines.

Although the absence of some of these features may seem like a grave
deficiency ("You mean I have to call a function to compare two
character strings?"), keeping the language down to modest size has
real benefits.  Since C is relatively small, it can be described in a
small space, and learned quickly.  A programmer can reasonably expect
to know and understand and indeed regularly use the entire language."

  -- Kernighan & Ritchie, _The C Programming Language_, 2nd edition,
     1988

While C-the-language has points I would buttonhole Dennis Ritchie to
complain about, were he still alive, I think much more damage has been
done by the laziness with which programmers (and WG14, I reckon) have
accepted the "reasonably standard collection of [] functions" to
handle bread-and-butter issues of application programming.  Much
misery has resulted from poor design coupled with first-mover
advantage, much of it from the Labs but a whole lot from outside as
well.

The Berkeley CSRG's curses library taught us as early as 1980 how
important it was for a language as modular as C could be, when turned
to applications rather than kernel or embedded work, to have name
spaces.  Naturally, it still lacks them 46 years later.  The standard
C library was, and is, deserving of sterner scrutiny than it
gets--and, now that I mention it, gets() was far from the only
grievous wart it has carried.  We have paid in confusion, wasted time,
and unclear practices, and will continue to do so, unless we slaughter
sacred cows and reconsider popular idioms from first principles.

You did ask.  ;-)

Regards,
Branden

[0] An old colleague of mine characterized the 1970s C type system as
follows: "Everything is an int unless it can't be."  The latter
referring, largely, to floats and structs.  Pointers?  Of COURSE
they're ints.  Why would they ever not be?

[1] ...until eventually C mocked up the procedures of the hated Pascal
by contriving a "void" type and putting it to this use, among others.
Not having keywords for procedure or function declarations was another
brilliant stroke; just look at Go.  ;-)

[2] Like, "why do I need the overhead of scanning for format
conversions when the string literal I'm passing in doesn't contain
any?  I thought we were efficient and close to the metal!"  K&R should
have started with `(void) puts("Hello, world!");`.  Or so saith I.
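Spelled out as a complete program (my sketch; don't go looking for it
in the book), with the unused return value discarded in the style of
the commit above:

    #include <stdio.h>

    int main(void)
    {
      /* puts(3) supplies the newline itself and scans no format
         string; the return value we don't need is explicitly
         discarded. */
      (void) puts("Hello, world!");
      return 0;
    }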
[3] http://www.silicon-press.com/books/isbn.0-929306-00-7/index.html

Having actually read that book, I'd say that it was a blatantly
obvious attempt to bolt Ada's "task" type onto C.  It didn't look very
clean (to me[4]), and if others shared that view, it's no mystery to
me why it didn't take off.  I wonder if Gehani was less vocal than
Stroustrup about where he was borrowing ideas from.

(If you read Stroustrup's own account of C++'s development, he's
pretty frank about his promiscuity in terms of adopting notions from
other languages.  That seems to have caused C++ occasional
problems--"multi-paradigm" might just be a fancy way of saying "lack
of coherent design"--but I appreciate the breadth of exposure he
sought, and his openness to good ideas from elsewhere.  While
Stroustrup is not a universally celebrated figure in the C community,
he was a shrewd enough promoter to pay homage to C, which I'm sure
mollified many partisans.  Many "software engineers" don't trouble
themselves to evaluate your solutions to engineering problems; they
simply want you to genuflect before the same icons they do.)

I don't know if that homage was enough by itself to explain Concurrent
C's and C++'s wildly disparate fates, despite both being
contemporaneous Bell Labs projects.  If I had to guess, I'd point to
the "sex appeal" of the "object-oriented paradigm", which has its
uses, but which, like so many trends in Silicon Valley, bloated to
cosmic proportions and became more of an exercise in branding than
engineering.[5]  (Also see "e-commerce" and "AI" [more than once].)
Stroustrup seems to have been willing to devote the rest of his life
to manning the bridge of the C++ aircraft carrier--to his credit, he
(eventually) grasped the point that the standard library demanded as
much oversight as the language proper, a task K&R didn't seem to
want--and Gehani might not have wanted to silo his intellectual
existence so narrowly and permanently.  I don't know that I would,
either.

[4] Hard for me now, several years later, to say exactly why.  But
I'll bet part of it was that Gehani didn't file off much, if any, of
Ada's really strong typing, which (I think) you needed for correct
event dispatch to the running "tasks", and that made it contrast
starkly with C of any generation, but _especially_ pre-ANSI C.  Ada's
task type would have been a better fit with C++, which was already
overloading functions by type signature even in its early days.  My
guess, based on no evidence at all, is that politics or personalities
utterly foreclosed that avenue of collaboration.

[5] Reputedly, Jean Ichbiah, a nigh-upon domineering figure for the
first decade-plus of Ada language development, ragequit its
development committee in the early 1990s, when the language was
undergoing its first major revision, because, it seems, he insisted
that the language _had_ to have "class" as a keyword, or it wouldn't
be able to compete.  Yes, seriously: I've read the (leaked) email
thread (though I don't have a link), and while some stuff I don't
understand was discussed, and possibly some effort at dressing the
issue up was made, it really does seem to have boiled down to
something as simple as keyword selection.  Ichbiah felt Ada had to
grow a `class` keyword to be competitive in an environment that was
consumed by OOP hype.
He didn't win--possibly because his colleagues had tired of his
single-handed overrides of consensus during the initial development of
the language spec (Ada 83)--and the language got "tagged types"
instead, which were (and are) functionally equivalent to classes but
useless at riding the then-current wave of hype.

Ada's relative failure despite its many excellent qualities would be a
good cautionary tale if only people could reach agreement on exactly
_why_ it failed.  And they don't.

It's because it was a DoD project, and so writing in it was like
getting a high-n-tight haircut and bombing babies.  What
self-respecting Silly Valley hacker wants to shave and put on a
uniform?  (Killing babies is actually okay if you're paid enough.)

No, it's because the DoD funded NYU to write GNAT, which is under the
GNU GPL, which means Ada is a language for dirty smelly copyleftist
hippies who don't do anything cool, like programming missiles to kill
babies.

No, it's because it didn't have a "class" keyword and so wasn't OOP.

I guess, if I had to select a single biggest factor of these three,
I'd choose the last.  Don't count on people to be principled, even
with _bad_ principles, when it's easiest to just be stupid.
