> Best way to save developer time is to program in a HLL and not
> worry about bit fiddling. C is not a HLL.

C was created as a language to do bit fiddling in - a tool for writing
assembly language a bit more productively than doing it by hand.  The
original Unix C compiler was a tool for writing PDP11 assembly
language.  Later when the prospect of Unix on other architectures
began to emerge, the 7th edition Unix "portable C compiler" was a
family of tools for writing assembly language on different machines.

C was designed by and used by systems programmers who were intimately
familiar with their machine's instruction set, and the code generation
was so simple and straightforward that to such programmers it would be
obvious what instructions would correspond to a given bit of C. The
low-level directness of C made it easy to write efficient code, so
there was no need for the compiler itself to try to do clever
"optimisations".

Such a language was and still is just what is needed for writing
device drivers and the lowest layers of OS kernels where things like
memory layout and instruction ordering are important.  It's not such a
good match for writing the more abstract upper layers of an OS, and
even less so for applications programming.  I think it's unfortunate
that C ended up being so widely (mis)used for applications
programming, often by applications programmers who have never learned
assembly language.  That has created pressure for compilers to be
obfuscated with semantics-perverting "optimisations" and, in an
attempt to compensate, for the language to be defaced with "features"
like volatile and __attribute__ and whatnot.

I think Inferno got it about right: write the kernel in C, and
the applications in a high-level language (Limbo).
