I've got a 64-bit Linux system with both 64-bit and 32-bit builds of
libmysqlclient, and a C program using the libmysqlclient API that behaves
very differently depending on which platform it is compiled for.  The
program is:

    #include <stdio.h>
    #include <string.h>

    #include <mysql.h>


    int main(void) {
        MYSQL *conn = mysql_init(NULL);
        if (mysql_real_connect(conn, NULL, "root", NULL, "test_mysqldb",
                               0, NULL, 0) == NULL) {
            fprintf(stderr, "connect failed: %s\n", mysql_error(conn));
            return 1;
        }
        mysql_real_query(conn, "SHOW COLLATION", strlen("SHOW COLLATION"));
        MYSQL_RES *result = mysql_store_result(conn);
        if (result == NULL) {
            fprintf(stderr, "query failed: %s\n", mysql_error(conn));
            return 1;
        }
        unsigned int n_fields = mysql_num_fields(result);
        MYSQL_FIELD *fields = mysql_fetch_fields(result);
        unsigned int i;
        for (i = 0; i < n_fields; i++) {
            printf("%s: %d\n", fields[i].name, fields[i].type);
        }
        mysql_free_result(result);
        mysql_close(conn);
        return 0;
    }


When compiled and run as a 64-bit binary I get the expected result:

    alex@devalex:/tmp$ ./test
    Collation: 253
    Charset: 253
    Id: 8
    Default: 253
    Compiled: 253
    Sortlen: 8

However, when compiled and run as a 32-bit binary I get something very unexpected:

    alex@devalex:/tmp$ ./test32
    Collation: 253
    CHARACTER_SET_NAME: 142345400
    COLLATIONS: 142345464
    : 142345496
    : 142345584
    def: 1280069443


I'm not sure what the issue is, and it may well be on my end, but any
debugging help you can provide would be great.  (This was originally
extracted from a bug in a Python MySQL driver I'm working on that uses the
ctypes FFI.)

Thanks,
Alex
