Re: [dev] Suckless font rendering library

2016-05-15 Thread suigin
On Sun, May 15, 2016 at 08:28:20PM +0300, Alexander Krotov wrote: Need to state somewhere that it is a vector format, as bitmap fonts are already ok. dwm supported bitmap fonts only before it switched to Xft. The Plan 9 font format is bitmap [1]; it is used internally for all font rendering, and

[dev] [st] [PATCH] Small bugfix for makeglyphfontspecs call in drawregion

2015-05-09 Thread suigin
Hi, Here's a patch that fixes a bug when calling `makedrawglyphfontspecs' in `drawregion'. Wasn't offsetting the pointer into the input glyphs array by `x1'. The bug isn't causing any problems currently, because `drawregion' is always called with `x1' and `y1' values of 0, but if this ever changes
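The off-by-`x1' issue described in the preview can be illustrated with a toy model. Everything here is a hypothetical stand-in (simplified types, a stub spec builder, invented names); st's actual `xmakeglyphfontspecs'/`drawregion' code differs.

```c
#include <assert.h>

/* Hypothetical stand-ins for st's Glyph and XftGlyphFontSpec types. */
typedef struct { unsigned u; } Glyph;
typedef struct { Glyph g; int col; } Spec;

/* Toy spec builder: copies `len` glyphs, tagging each with its column. */
static int
makespecs(Spec *specs, const Glyph *glyphs, int len, int x)
{
	for (int i = 0; i < len; i++) {
		specs[i].g = glyphs[i];
		specs[i].col = x + i;
	}
	return len;
}

/* The fix: offset the input glyph pointer by x1 so the specs built
 * for columns [x1, x2) start at glyph x1, not at glyph 0. */
static int
draw_row(Spec *specs, const Glyph *line, int x1, int x2)
{
	return makespecs(specs, &line[x1], x2 - x1, x1); /* was: line */
}
```

As the preview notes, the bug is latent while `x1' is always 0, since `&line[0]` and `line` are the same pointer.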

Re: [dev] [ST] [PATCH] Clean up xdraws and optimize glyph drawing with non-unit kerning values

2015-05-06 Thread suigin
Here's a third version of the patch with some minor changes. It removes the curly braces around the body of the `if' clause in `xmakeglyphfontspecs' that checks for ATTR_WDUMMY. It simplifies some of the conditional logic in `xdrawglyphfontspecs', by pulling out the check to see if `base.fg' is

[dev] [ST] [PATCH] Clean up xdraws and optimize glyph drawing with non-unit kerning values

2015-05-05 Thread suigin
Hello, I have another patch here for review that optimizes the performance of glyph drawing, primarily when using non-unit kerning values, and fixes a few other minor issues. It's dependent on the earlier patch from me that stores unicode codepoints in a Rune type, typedef'd to uint_least32_t.

Re: [dev] [ST] [PATCH] Clean up xdraws and optimize glyph drawing with non-unit kerning values

2015-05-05 Thread suigin
I've discovered a deficiency with full-width characters in the original patch, I wasn't handling the ATTR_WDUMMY mode attribute properly. I've attached the updated patch to fix this, and I've tested it out with CJK characters and SJIS art; seems to be working correctly now. Sorry about that.

[dev] [ST] [PATCH] Changed type for UTF-32 codepoints from long to uint_least32_t

2015-05-05 Thread suigin
Hi all, here's a patch that changes occurrences of long to uint_least32_t where it's being used to store UTF-32 codepoints, as was previously discussed. Other cases where long is used are preserved as is. --- st.c | 50 +- 1 file changed, 25

Re: [dev] [ST] [PATCH] Changed type for UTF-32 codepoints from long to uint_least32_t

2015-05-05 Thread suigin
On Tue, May 05, 2015 at 07:37:01PM +0100, Connor Lane Smith wrote: Hi, I get the feeling we should typedef uint_least32_t Rune, as in libutf, so we don't have to have this long-winded and somewhat cryptic type everywhere. cls No prob, here's the same patch, but using a `Rune' typedef.
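The typedef under discussion is short; a self-contained sketch follows. The `rune_valid' helper is illustrative only (not part of the patch), showing why "at least 32 bits" is the right width for UTF-32.

```c
#include <assert.h>
#include <stdint.h>

/* As cls suggests (and as in libutf): a Rune holds one UTF-32
 * codepoint in an unsigned type of at least 32 bits. */
typedef uint_least32_t Rune;

/* Illustrative helper, not from the patch: a codepoint is valid if it
 * is at most U+10FFFF and not a UTF-16 surrogate. */
static int
rune_valid(Rune u)
{
	return u <= 0x10FFFF && !(u >= 0xD800 && u <= 0xDFFF);
}
```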

Re: [dev] [st] [PATCH] Replace close and dup with dup2.

2015-05-05 Thread suigin
On Mon, May 04, 2015 at 12:47:38PM +0200, Roberto E. Vargas wrote: Second I don't have any warning with any of the compilers I use. Good for you. How is that in any way relevant? There is no prescribed compiler, is there? Also, the idea with this sort of distribution model is that
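For context on the patch subject: `close(to); dup(from);` only lands on `to` because dup returns the lowest free descriptor, so any intervening allocation breaks it, whereas `dup2(from, to)` performs the replacement atomically. A minimal sketch with a hypothetical helper name (not st's code):

```c
#include <assert.h>
#include <fcntl.h>
#include <unistd.h>

/* Make `to` refer to the same open file as `from`, closing whatever
 * `to` referred to before. Unlike close(to) followed by dup(from),
 * this cannot be foiled by another descriptor being opened between
 * the two calls. */
static int
move_fd(int from, int to)
{
	return dup2(from, to);
}
```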

Re: [dev] [st] [PATCH] Optimize memory footprint of line buffers

2015-04-27 Thread suigin
On Mon, Apr 27, 2015 at 10:09:20AM +0200, Roberto E. Vargas Caballero wrote: I have applied a version of this patch with short size. Okay, thanks! Please, be careful with your commits, because this patch and the other you sent have a lot of whitespace changes that are not needed. Okay, I

Re: [dev] [st] [PATCH] Optimize memory footprint of line buffers

2015-04-27 Thread suigin
On Mon, Apr 27, 2015 at 10:31:38AM +0200, koneu wrote: Add to that, non-int bit fields are a GNU extension. You're right. It works because uint_least32_t happens to be a typedef of unsigned int for x86_64. Changing it so that it reads: typedef struct { uint_least32_t u; unsigned
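The portability point here: standard C only guarantees `int`, `signed int`, `unsigned int`, and `_Bool` as bit-field base types, so `uint_least32_t` bit-fields compile only as an extension (or where the typedef happens to be `unsigned int`, as on x86_64). A portable variant of the struct quoted in this thread, with field widths taken from the preview:

```c
#include <assert.h>
#include <stdint.h>

typedef struct {
	uint_least32_t u;    /* codepoint (a plain field, not a bit-field) */
	unsigned mode : 12;  /* attribute flags */
	unsigned fg   : 10;  /* foreground color index */
	unsigned bg   : 10;  /* background color index */
} Glyph;
```

The three bit-fields total 32 bits and pack into one `unsigned int` storage unit, so on a typical LP64 platform the struct is 8 bytes.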

Re: [dev] [st utf8 3/4] Change internal character representation.

2015-04-27 Thread suigin
On Mon, Apr 27, 2015 at 09:58:37AM +0200, Roberto E. Vargas Caballero wrote: Uhmmm, so do you propose don't use long arrays ever? because in some implementations long may be 4, but in others may be 8. We also should forbid int arrays for the same reason. I would say it depends on the context.

Re: [dev] [st] [PATCH] Optimize memory footprint of line buffers

2015-04-27 Thread suigin
On Mon, Apr 27, 2015 at 10:29:25AM +0200, Roberto E. Vargas Caballero wrote: typedef struct { uint_least32_t u; uint_least32_t mode:12; uint_least32_t fg:10; uint_least32_t bg:10; } Glyph; The size of this struct is only one byte less than if the same of the struct

Re: [dev] [st] [PATCH] Optimize memory footprint of line buffers

2015-04-27 Thread suigin
On Mon, Apr 27, 2015 at 07:34:01AM -0700, suigin wrote: As you can see, it's actually 2 bytes less. This is because a struct is usually aligned to the maximum alignment of all fields. A 16-bit ushort has a 2-byte alignment on x86_64, so this forces the struct to also have an alignment of 2
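The alignment rule described above, in a checkable form. The struct is a hypothetical example (not st's Glyph), and the sizes assume a typical LP64 platform such as x86_64:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

typedef struct {
	uint_least32_t u;    /* 4 bytes, 4-byte alignment */
	unsigned short mode; /* 2 bytes, 2-byte alignment */
	unsigned char fg;    /* 1 byte */
} Padded;
/* A struct's alignment is that of its strictest member (4 here), and
 * its size is rounded up to a multiple of that alignment so arrays of
 * it stay aligned. 4 + 2 + 1 = 7 bytes of data thus become
 * sizeof == 8, with one byte of tail padding. */
```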

Re: [dev] [st] [PATCH] Optimize memory footprint of line buffers

2015-04-23 Thread suigin
On Wed, Apr 22, 2015 at 05:26:37AM -0700, suigin wrote: Another solution would be to allow people to typedef the color index type used by Glyph in config.h, and move the definitions for Glyph, TCursor and Term below where config.h is included in st.c? That way, it makes it easy

Re: [dev] [st utf8 3/4] Change internal character representation.

2015-04-22 Thread suigin
On Tue, Apr 21, 2015 at 09:28:38PM +, noname wrote: typedef struct { - char c[UTF_SIZ]; /* character code */ - ushort mode; /* attribute flags */ - uint32_t fg; /* foreground */ - uint32_t bg; /* background */ + long u; /* character code */ +

Re: [dev] [st] [PATCH] Optimize memory footprint of line buffers

2015-04-22 Thread suigin
On Tue, Apr 21, 2015 at 07:03:57PM +0200, Roberto E. Vargas Caballero wrote: Sorry, I didn't realize it was possible to use colors above 256, neglected to see it in config.def.h. My bad. Maybe, a solution can be modify colorname to: 63 static const char *colorname[256] = { and force

Re: [dev] [st] [PATCH] Optimize memory footprint of line buffers

2015-04-22 Thread suigin
Another solution would be to allow people to typedef the color index type used by Glyph in config.h, and move the definitions for Glyph, TCursor and Term below where config.h is included in st.c? That way, it makes it easy to customize it for a low memory profile. Here's another patch that

[dev] [st] [PATCH] Optimize memory footprint of line buffers

2015-04-21 Thread suigin
Hello, I noticed that the Glyph struct was using more memory than was needed. Considering that the fg and bg colors have values that are always in the range of [0, 255], there is no need to use a uint32_t for them. A single byte each would suffice. Furthermore, assuming default alignment and
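The arithmetic from the preview, as a before/after sketch. The field layout is approximated from the thread (the exact st definitions differ), and the sizes assume x86_64:

```c
#include <assert.h>
#include <stdint.h>

/* Before: 32-bit color fields, though the values stay in [0, 255]. */
typedef struct {
	uint32_t u;          /* codepoint */
	unsigned short mode; /* attribute flags */
	uint32_t fg, bg;     /* only ever 0..255, wastefully wide */
} GlyphOld;              /* 16 bytes: 2 bytes of padding after mode */

/* After: one byte per color index suffices for a 256-entry palette. */
typedef struct {
	uint32_t u;
	unsigned short mode;
	unsigned char fg, bg;
} GlyphNew;              /* 8 bytes: per-cell line-buffer cost halved */
```

For an 80x24 terminal with a scrollback-free screen, that is 80 * 24 * 8 = 15,360 bytes saved per screen buffer under these assumptions.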