Hello,

Recently, we moved from an RGBT 5551 internal color configuration to an RGBA 8888 configuration. There is a shared "core" application with a platform-specific hardware abstraction layer. The code is compiled for Windows, Mac, Linux, Android, iOS, and an embedded ARM hardware system (a mix of 32- and 64-bit builds). The application is tested using a test application that can perform operations and compare the output to expected values.

In the screenshot comparisons, we are noticing small differences in occasional pixels. Upon further testing, these turn out to be small differences in the alpha channel value on the ARM device compared with *all* other platforms, which match each other. I suspect our recent move to the increased color depth merely exposed a discrepancy that may have been present all along.
To the best of my knowledge, all compiler flags and settings are identical on all builds. The font is a TTF font (Droid Sans based), rendered using FT_LOAD_FORCE_AUTOHINT and FT_LOAD_TARGET_LIGHT with AF_CONFIG_OPTION_USE_WARPER enabled. We are using the built-in sbit cache mechanism as well.

I've attached an output from our test application. The middle output has red pixels at the difference spots. Note how the lowercase "s", for example, matches in the character table but differs when used in the sentence. The same happens to the "M" and "K" in the graphic buttons at the bottom. Note that these are a different font size than the text in the character table.

1. Does anyone have a suggestion as to what build difference or setting might account for these differences?

2. Could this be some sort of floating-point rounding behavior issue? (hardware float on the ARM rounding differently than hardware/software float on the other systems, for example)

Basically, I am hoping for a start on where to even begin trying to track this down. The goal is for rendering to be identical across all platforms.

Thank you.

-- Tim Wessman
_______________________________________________
Freetype mailing list
[email protected]
https://lists.nongnu.org/mailman/listinfo/freetype
