necouchman commented on code in PR #470:
URL: https://github.com/apache/guacamole-server/pull/470#discussion_r1437950219
##########
src/libguac/string.c:
##########
@@ -44,6 +45,16 @@
*/
#define REMAINING(n, length) (((n) < (length)) ? 0 : ((n) - (length)))
+size_t guac_itoa(char* restrict dest, int integer) {
+
+ /* Determine size of string. */
+ int str_size = snprintf(dest, 0, "%d", integer);
Review Comment:
Actually, according to the man page for the `*printf` functions:
```
d, i   The int argument is converted to signed decimal notation. The
       precision, if any, gives the minimum number of digits that must
       appear; if the converted value requires fewer digits, it is padded
       on the left with zeros. The default precision is 1. When 0 is
       printed with an explicit precision 0, the output is empty.

o, u, x, X
       The unsigned int argument is converted to unsigned octal (o),
       unsigned decimal (u), or unsigned hexadecimal (x and X) notation.
       The letters abcdef are used for x conversions; the letters ABCDEF
       are used for X conversions. The precision, if any, gives the
       minimum number of digits that must appear; if the converted value
       requires fewer digits, it is padded on the left with zeros. The
       default precision is 1. When 0 is printed with an explicit
       precision 0, the output is empty.
```
So it seems like maybe I should try `%u` instead?
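(Editorial sketch, not part of the PR: one way the `%u` variant under discussion might look. The name `guac_utoa`, the `unsigned int` parameter type, and the return convention are assumptions for illustration only, following the shape of the snippet quoted above; error handling is omitted.)

```c
#include <stdio.h>

/* Hypothetical sketch of the "%u" variant being discussed. The argument
 * type is changed to unsigned int to match the conversion specifier. */
size_t guac_utoa(char* restrict dest, unsigned int integer) {

    /* With a size of 0, snprintf() writes nothing and returns the number
     * of characters the conversion would need, excluding the terminating
     * NUL. Passing NULL for the buffer is permitted when the size is 0. */
    int str_size = snprintf(NULL, 0, "%u", integer);

    /* Now actually write the digits plus the terminating NUL. */
    snprintf(dest, (size_t) str_size + 1, "%u", integer);

    return (size_t) str_size;

}
```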
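(Also editorial: the precision rules in the quoted man page are easy to confuse with snprintf's *size* argument. The size-0 call in the diff only limits how much is written; it never triggers the "explicit precision 0" case, which requires a `%.0d`-style specifier. A quick check:)

```c
#include <assert.h>
#include <stdio.h>

int main(void) {

    char buf[16];

    /* A size of 0 (second argument) only limits how much is written;
     * the return value is still the full length of "0", i.e. 1. */
    assert(snprintf(NULL, 0, "%d", 0) == 1);

    /* An explicit *precision* of 0 is different: converting the value 0
     * with "%.0d" produces no digits at all, as the man page describes. */
    int len = snprintf(buf, sizeof buf, "%.0d", 0);
    assert(len == 0);
    assert(buf[0] == '\0');

    return 0;

}
```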