Hello,

I have encountered the same problem as this: https://stackoverflow.com/q/4925084

The answers don't explain why there is a run-time bitness difference between
the types retrieved from INT and INTEGER columns, and that's my question. From
reading https://sqlite.org/datatype3.html I understand there should be no
difference whatsoever between declaring a column as INT or as INTEGER (other
than whether a primary key may become a rowid alias).
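For what it's worth, this is easy to confirm at the SQLite level: typeof()
reports 'integer' for a value stored in either column. A minimal sketch
(in-memory database; the table and column names are made up for illustration):

    using System;
    using System.Data.SQLite;

    class AffinityCheck
    {
        static void Main()
        {
            using (var conn = new SQLiteConnection("Data Source=:memory:"))
            {
                conn.Open();
                using (var cmd = conn.CreateCommand())
                {
                    // Two columns that differ only in the declared type name.
                    cmd.CommandText = "CREATE TABLE t (a INT, b INTEGER);"
                                    + "INSERT INTO t VALUES (1, 1);";
                    cmd.ExecuteNonQuery();

                    // Prints "integer / integer": both declared names map to
                    // the same INTEGER affinity per datatype3.html.
                    cmd.CommandText =
                        "SELECT typeof(a) || ' / ' || typeof(b) FROM t;";
                    Console.WriteLine(cmd.ExecuteScalar());
                }
            }
        }
    }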

I don't mean the bitness with which the integers are stored in the database
file, but that of the values returned by the System.Data.SQLite API (in
particular via a DataTable loaded by a SQLiteDataReader).

I have verified that declaring a column (which isn't any kind of key) as INT
causes System.Data.SQLite to return the value as Int32/int (possibly
depending on the value; I'm not sure), while declaring it as INTEGER causes
the same value to be returned as Int64/long.
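Here is a minimal sketch of the kind of repro I mean (again, the table and
column names are made up; the comments show what I observe on my machine):

    using System;
    using System.Data;
    using System.Data.SQLite;

    class BitnessRepro
    {
        static void Main()
        {
            using (var conn = new SQLiteConnection("Data Source=:memory:"))
            {
                conn.Open();
                using (var cmd = conn.CreateCommand())
                {
                    cmd.CommandText = "CREATE TABLE t (a INT, b INTEGER);"
                                    + "INSERT INTO t VALUES (42, 42);";
                    cmd.ExecuteNonQuery();

                    cmd.CommandText = "SELECT a, b FROM t;";
                    using (var reader = cmd.ExecuteReader())
                    {
                        var dt = new DataTable();
                        dt.Load(reader);

                        // The INT column comes back as System.Int32, the
                        // INTEGER column as System.Int64.
                        Console.WriteLine(dt.Columns["a"].DataType);
                        Console.WriteLine(dt.Columns["b"].DataType);
                    }
                }
            }
        }
    }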

Can anyone explain this, or point to where this is actually documented, if I've 
missed it?

This is the first time I've used this mailing list; I searched for the answer
on sqlite.org and around the Web first. I hope I haven't missed an RTFM
somewhere; if I have, please let me know.

Thanks
Xavier Porras