> On 25 Oct 2015, at 21:45, Joachim Breitner <m...@joachim-breitner.de> wrote:
>
> Hi,
>
> On Sunday, 25.10.2015 at 21:30 +0100, MigMit wrote:
>> Doesn't seem worth it to me. Current format is quite parseable, and
>> not really bad for human eyes either.
>
> I know that you meant this as a litote,

Please don't say "know" when you mean "assume". It's especially annoying when you assume wrong.

> but let me ignore that I know
> that for a moment to reply, that “not really bad” is definitely not
> good enough for me, and I want the compiler to print messages that are
> meant for my consumption to be in the _best_ possible format. Or at
> least try that.
>
> Obviously, there is no “best” for every human. But things get easier if
> we do not have to overly worry about computers as well.

I think that's the wrong approach. My theory is that "easy to read for a human with some experience" and "easy to parse for a computer" are two closely related notions. Sure, they aren't identical (a binary format might still be quite easy to parse, yet completely unreadable for a human), but they go hand in hand. Even with binary formats: if, for example, there is a clear notion of a "statement" in the format, and statements are separated by the byte 0xFF, that is easier both for a human (equipped with a hex editor) and for a computer than a format where the length of each statement is determined by its first byte. But it's much easier to argue about what's easier for a computer than about the same thing for humans.

> BTW, does Emacs really parse _this_ bit of information? Most GHC
> integrations that I have seen match on the first line to indicate the
> file and position of the overall error, and take the error verbatim.

Last time I checked, Emacs turned such positions into hyperlinks.

_______________________________________________
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://mail.haskell.org/cgi-bin/mailman/listinfo/glasgow-haskell-users
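(Editorial illustration of the delimiter-vs-length-prefix point above: a minimal Haskell sketch. The byte streams, function names, and the single-byte length prefix are all invented for illustration; neither is GHC's format.)

```haskell
import Data.Word (Word8)
import Data.List (unfoldr)

-- Delimiter-based: statements separated by the byte 0xFF. Boundaries are
-- locally visible, so a human with a hex editor can spot them the same way
-- the code does.
splitOnFF :: [Word8] -> [[Word8]]
splitOnFF = foldr step [[]]
  where
    step 0xFF acc       = [] : acc
    step b (cur : rest) = (b : cur) : rest
    step _ []           = [[]]  -- unreachable: the accumulator is never empty

-- Length-prefixed: each statement's first byte gives its length. Boundaries
-- are only recoverable by decoding every prefix from the start of the stream.
splitLenPrefixed :: [Word8] -> [[Word8]]
splitLenPrefixed = unfoldr step
  where
    step []       = Nothing
    step (n : bs) = Just (splitAt (fromIntegral n) bs)

main :: IO ()
main = do
  print (splitOnFF [1, 2, 0xFF, 3, 4])         -- [[1,2],[3,4]]
  print (splitLenPrefixed [2, 1, 2, 2, 3, 4])  -- [[1,2],[3,4]]
```

Both functions recover the same two statements, but only the delimited form lets a reader (human or tool) resynchronize from the middle of the stream.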