https://gcc.gnu.org/bugzilla/show_bug.cgi?id=88507

--- Comment #2 from Jonny Grant <jg at jguk dot org> ---
(In reply to Jonathan Wakely from comment #1)
> (In reply to Jonny Grant from comment #0)
> > Hello
> > 
> > This is a typo in the word "string"; just reporting it, as perhaps it
> > could show £ correctly, as it does in the line 10 error.
> 
> But then you couldn't have two separate caret locations pointing to the two
> invalid bytes, because it would only occupy a single column. You also assume
> the terminal is capable of showing UTF-8 characters.

OK. May I suggest it would be worth displaying "st£ing" and saying that
‘st£ing’ is not a valid identifier (identifiers may contain only Latin
letters, digits, and underscores, and must not begin with a digit), as per
the C/C++ specs?

Example expected output:

$ g++ -Wall -o string string.cpp
string.cpp: In function ‘int main()’:
string.cpp:8:5: error: ‘st£ing’ is not a valid identifier as it contains
non-Latin characters
     st£ing buf;
     ^~~~~~
string.cpp:8:5: note: suggested alternative: ‘string’
     st£ing buf;
     ^~~~~~
     string
string.cpp:10:5: error: ‘buf’ was not declared in this scope
     buf = "£"
     ^~~
