All,

This is in fltk-1.3 r7500.

Inspired by Manolo's work on rendering UTF-8 strings under GL on OSX, I was 
digging into gl_draw.cxx a bit.
(It's not pretty, and some of the mess is mine...)

However, that aside: I was looking at gl_draw(), and near line 228 we have:

  for (i = 0; i < n; i++) {
    unsigned int r;
    r = (str[i] & 0xFC00) >> 10;
    if (!gl_fontsize->glok[r]) get_list(r);
  }

What *does* that do?

The array "str" is a char array, so (str[i] & 0xFC00) and you always get 0, 
surely?

Unless the "signed char" has it's top bit set, in which case it would 
presumably be sign-extended and then and'ing with 0xFC00 will always return 
0xFC00... bit shift that down by 10 and you have 0x3F...
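
To double-check my reading, I knocked up a quick standalone test (not FLTK 
code, just the expression in isolation, assuming plain char is signed as it 
is on x86 with the usual compilers):

  #include <cstdio>

  int main() {
    // Feed every possible char value through the expression from
    // gl_draw(); plain char is assumed to be signed here.
    for (int v = 0; v < 256; v++) {
      char c = (char)v;
      unsigned int r = (c & 0xFC00) >> 10; // the expression in question
      if (r) printf("str[i] = 0x%02X -> r = 0x%02X\n", (unsigned)v, r);
    }
    // Prints r = 0x3F for 0x80..0xFF and nothing for 0x00..0x7F:
    // sign extension sets bits 10-15, the AND keeps them, and the
    // shift leaves 0x3F. r never takes any other value.
    return 0;
  }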

I assume I am missing something here, but I just can't figure out what that bit 
of code is doing.
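
The only reading that would make sense to me is if str were meant to hold 
16-bit code units (UTF-16, say) rather than chars; then (str[i] & 0xFC00) >> 10 
would extract the top 6 bits as an index in 0..63, which would at least match 
glok being indexed by r. Pure guesswork on my part, but something like:

  #include <cstdio>

  int main() {
    // Hypothetical: what the expression would compute if str[i] were
    // a 16-bit code unit instead of a char.
    unsigned short str16[] = { 0x0041, 0x4E2D, 0xD835 }; // 'A', a CJK
                                            // char, a high surrogate
    for (unsigned i = 0; i < sizeof(str16) / sizeof(str16[0]); i++) {
      unsigned int r = (str16[i] & 0xFC00) >> 10; // top 6 of 16 bits
      printf("0x%04X -> block %u\n", (unsigned)str16[i], r); // r in 0..63
    }
    return 0;
  }

Is that what this was intended to be?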

--
Ian



