So playing with Arduinos, and helping other people do it, I'm struck
by how many unnecessary problems we're still struggling with.  For
example:

- The heap grows up and the stack grows down, and there's no memory
    protection.  Even if you don't use recursion, there's no way to
    get the compiler to compute your maximum stack depth for you,
    so the way you discover you've malloced too much memory is that
    your stack corrupts some malloced block, or vice versa.

- You'd really like to do that allocation at compile time so you can
    get an error message.  (The inability to get error messages at
    run time is kind of a step back from microcomputer programming
    in the 80s.)  But allocating a frame buffer of a specified but
    constant size at runtime looks like this:

        TV.begin(NTSC, 86, 120);

    and as far as I know there's no way in C++ to make this work as
    a global static object:

        TV::output tv(NTSC, 86, 120);

    ...which should statically allocate 86*120/8 bytes for the
    framebuffer, giving you a compile error if there isn't that much
    RAM left to allocate.

- More than allocating memory, I'm allocating *time*.  I noticed
    that a sound waveform was sounding a little bit funny; I hooked
    the audio output up to my sound card and recorded a waveform
    with Audacity and discovered that I was getting buffer underruns
    every frame.  I'd like to be able to interrogate the compiler
    about whether a particular bytebeat is going to take too many
    cycles to execute instead of improvising an oscilloscope.

- And I'd like to construct routines at compile-time by traversing
    data structures.  Maybe each scan line of the video ought to
    have eight bits spat out from the audio buffer as pixels, three
    cycles each, followed by an oscilloscope display consisting of
    setting a pin low, 128-N cycles of delay, setting a pin high, N
    cycles of delay, setting a pin low, and returning from the ISR.
    But it's very important that there not be a delay (and
    especially a variable delay!) between the 8 bits and the
    oscilloscope display.  So treating inlining as an optional
    optimization that the compiler may choose to perform or not is
    not really acceptable.

- In general, C is not suitable for timing-accurate code for this
    reason.  But doing it in assembly is crappy.

- There's normally no multithreading; although there are libraries
    using setjmp(), they're not widely used.  This may be related to
    the difficulties of managing multiple threads of unknown stack
    depth in 2K of RAM.  But it makes certain very common kinds of
    programming far more complicated than they need to be.

- There's a storage class qualifier ("PROGMEM") for data that ends
    up in program memory (32K of Flash) instead of RAM.  But there
    doesn't seem to be a way to automatically choose the correct
    version of a function based on whether its arguments are in RAM
    or in PROGMEM; instead you have to manually replace `strcpy` with
    `strcpy_P` in the right places when you move a string from RAM
    to PROGMEM.  (I'm pretty sure the compiler tells you if you get
    this wrong, although I haven't tried it.)  But I don't think you
    can get the compiler to compile the functions of a class for a
    PROGMEM `this` pointer, let alone compile one version for a RAM
    `this` and one version for a PROGMEM `this`.  So you have to
    waste precious RAM on constant objects.  (Worse, in C++, the
    minimum sizeof is 1, to ensure that all objects have distinct
    addresses.)

- On an 8-bit processor, performance is quite dramatically affected
    by the range the compiler infers for intermediate results;
    division is a particularly bad problem, especially since GCC
    doesn't have mixed-size division subroutines.  A single
    inadvertent 32-bit-by-32-bit division per audio sample makes the
    Arduino start to buffer-underrun.

So the upshot is that I end up doing a lot of stuff either by hand
or at runtime that I'd like to do automatically at compile time, and
then I have to debug problems by trial and error instead of getting
useful error messages.

C++, despite adding a fair bit of compile-time power and optional
type and memory safety to C, has other usability problems of its
own.

Is there work out there I've overlooked?  BitC
<http://www.bitc-lang.org/docs/bitc/bitc-origins.html> seems like
the closest thing.  ATS looks like it might be able to do something
like this, but I can't make heads or tails of the programs, which I
think bodes ill for the digital artists who basically just want a
scripting language for LEDs.  Maybe Rust?

Kragen
-- 
To unsubscribe: http://lists.canonical.org/mailman/listinfo/kragen-tol
