the latest xkcd in my feed is about a programming language that
attempts to eliminate off-by-one errors via "every time an integer is
stored or read, its value is adjusted up or down by a random amount
between 40 and 50". The title text clarifies that this just produces
off-by-40-or-50 errors instead.

HOW WOULD WE MAKE THIS LANGUAGE? it's soooo zombie-like! every program
we wrote in it would fail until we learned to accommodate every value
we calculated being massively wrong!

well
- instrument an existing runtime to fuzz every integer load/store
- patch an existing language implementation
- write a new language with the fuzzed store/retrieve semantics
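whichever route we took, the core semantics are easy to sketch. here's a minimal Python toy (names like `FuzzyInt` are my own invention, not anything from the comic) that applies the comic's rule on both store and read:

```python
import random

def fuzz(n):
    """Adjust n up or down by a random amount between 40 and 50,
    per the comic's rule."""
    delta = random.randint(40, 50)
    return n + delta if random.random() < 0.5 else n - delta

class FuzzyInt:
    """An integer cell with the comic's semantics: the value is
    fuzzed once when stored and again on every read."""
    def __init__(self, value):
        self._value = fuzz(value)   # adjusted on store

    def get(self):
        return fuzz(self._value)    # adjusted again on read

x = FuzzyInt(100)
print(x.get())  # two stacked ±40..50 adjustments away from 100
```

note that because the store fuzz and the read fuzz can have opposite signs, a read can land anywhere from 100 off all the way back to dead-on correct, which would make debugging in this language even more maddening.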

tried to send this and got: "Can't connect to Mailman's REST server,
your message has not been sent." 1950 ET
got the same error a second time at 1950 ET. pasting into gmail.
