Reading the source code, I see that tdefcolor contains the following
case to handle 24-bit color codes. [1]

        case 2: /* direct color in RGB space */
                [ ... ]
                        idx = TRUECOLOR(r, g, b);
                break;
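
If I read the code right, this is the path taken when an application
requests a direct foreground color with an SGR 38;2 sequence, e.g.

        printf("\033[38;2;255;100;0m"); /* CSI 38;2;r;g;b m: fg = RGB(255,100,0) */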

The TRUECOLOR macro returns an integer with r, g, and b in bits 0-23,
and bit 24 set. [2]

        #define TRUECOLOR(r,g,b) (1 << 24 | (r) << 16 | (g) << 8 | (b))
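
For example, expanding the macro by hand for an arbitrary color:

        TRUECOLOR(255, 100, 0)
            = 1 << 24 | 255 << 16 | 100 << 8 | 0
            = 0x01000000 | 0x00FF0000 | 0x00006400 | 0x00000000
            = 0x01FF6400    /* = 33514496, which needs 25 bits */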

tdefcolor is called by tsetattr, which assigns the result to term.c.attr.fg. [3]

        case 38:
                if ((idx = tdefcolor(attr, &i, l)) >= 0)
                        term.c.attr.fg = idx;
                break;

But fg is defined as a ushort. [4]

        typedef struct {
                Rune u;           /* character code */
                ushort mode;      /* attribute flags */
                ushort fg;        /* foreground  */
                ushort bg;        /* background  */
        } Glyph;

Since a ushort can only hold 16 bits, I don't see how this can possibly work.
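
For illustration, here is the assignment mimicked in a standalone
snippet (with ushort spelled out as unsigned short and an arbitrary
color); the conversion to ushort drops bit 24 and the red byte:

        #include <stdio.h>

        #define TRUECOLOR(r,g,b) (1 << 24 | (r) << 16 | (g) << 8 | (b))

        typedef unsigned short ushort;

        int main(void)
        {
                int idx = TRUECOLOR(255, 100, 0); /* 0x01FF6400 */
                ushort fg = idx;                  /* like term.c.attr.fg = idx */
                /* prints "0x01FF6400 -> 0x6400" */
                printf("0x%08X -> 0x%04X\n", (unsigned)idx, (unsigned)fg);
                return 0;
        }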

[1]: http://git.suckless.org/st/tree/st.c?id=caa97cc781ccf29f28c3d9e6683a66eb3f70e2bd#n1762
[2]: http://git.suckless.org/st/tree/st.c?id=caa97cc781ccf29f28c3d9e6683a66eb3f70e2bd#n1872
[3]: http://git.suckless.org/st/tree/st.c?id=caa97cc781ccf29f28c3d9e6683a66eb3f70e2bd#n81
[4]: http://git.suckless.org/st/tree/st.c?id=caa97cc781ccf29f28c3d9e6683a66eb3f70e2bd#n191

Andrew Ekstedt
