I'm probably being a bit stupid - but I'm trying to determine (in code) the
declared length of a string column in the schema for a given table.

So, for example:


        create table a (
                blah char(20)
        )


I want to get back '20', but I'm getting '60' when I use mysql_list_fields
(it always seems to be 3x longer than I'm expecting).

Am I missing something? (Or should I just divide by 3?!)
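
(For what it's worth, another route I've been wondering about - assuming
MySQL 5.0+ and the table/column from the create table above - is asking
information_schema directly, since CHARACTER_MAXIMUM_LENGTH should be in
characters rather than bytes:

        select character_maximum_length
        from   information_schema.columns
        where  table_schema = 'test1'
          and  table_name   = 'a'
          and  column_name  = 'blah';

but I'd still like to understand what mysql_list_fields is actually
reporting.)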






Here's an example:

#include <stdio.h>
#include <stdlib.h>
#include <mysql.h>

MYSQL conn;

int main(int argc,char *argv[]) {
        // run with:  username password   as arguments
        char *tabname="a";
        char *db="test1";
        char *u;
        char *p;
        MYSQL_RES *result;
        MYSQL_FIELD *field;
        if (argc!=3) {
                printf("usage : %s  username password\n", argv[0]);exit(2);
        }
        u=argv[1]; p=argv[2];
        mysql_init(&conn);
        if (!mysql_real_connect(&conn, NULL, u, p, db, 0, NULL, 0)) {
                fprintf(stderr,
                        "Failed to connect to database: Error: %s\n",
                        mysql_error(&conn));
                exit(2);
        }

        result = mysql_list_fields(&conn, tabname, NULL);
        if (result == NULL) {
                fprintf(stderr, "mysql_list_fields failed: %s\n",
                        mysql_error(&conn));
                exit(2);
        }

        field = mysql_fetch_field(result);
        /* field->length is an unsigned long, hence %lu */
        printf("Field=%s Type=%d Length=%lu\n", field->name,
                        (int) field->type, field->length);

        mysql_free_result(result);
        mysql_close(&conn);
        return 0;
}
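
If the divide-by-3 hunch is right, my guess is that field->length is coming
back in *bytes* for the connection character set (utf8 being 3 bytes per
character), in which case something like the following, dropped in right
after the printf above, might recover the declared length without
hard-coding the 3. Just a sketch I haven't verified - it assumes the same
conn/field variables as above and a client library new enough to have
mysql_get_character_set_info():

        MY_CHARSET_INFO cs;

        /* mbmaxlen is the maximum bytes-per-character of the connection
           charset; dividing the byte length by it should give characters */
        mysql_get_character_set_info(&conn, &cs);
        printf("mbmaxlen=%u  length in chars=%lu\n",
                        cs.mbmaxlen, field->length / cs.mbmaxlen);

Does that sound like the right way to interpret it, or is there a cleaner
call for getting the declared length?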






Thanks in advance...
