On 29 Apr 2014, at 10:15am, Kleiner Werner <sqliteh...@web.de> wrote:

> If I understand the SQLite docs correctly, there is no difference between INT
> and INTEGER, except when a column is used as an autoincrementing primary key.
> I thought an INT column in SQLite is always a 64-bit integer, so why does
> SQLiteDataReader recognize it as a 32-bit integer?

The problem is not inside SQLite itself.  As far as SQLite is concerned, INT and 
INTEGER are treated identically.  You can verify this for yourself by running

SELECT myColumn, typeof(myColumn) FROM myTable LIMIT 1

Both versions of your database should return exactly the string 'integer' for 
the type.  If they don't, please post again, because I'd love to see it.
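For anyone who wants to reproduce this quickly without a C# project, here is a
minimal sketch using Python's built-in sqlite3 module (any interface that passes
SQL through unchanged would show the same thing): columns declared INT and
INTEGER both report typeof() as 'integer'.

```python
import sqlite3

# In-memory database: one column declared INT, one declared INTEGER.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INT, b INTEGER)")
conn.execute("INSERT INTO t VALUES (1, 2)")

# SQLite reports the storage class of the stored values, not the
# declared type name, so both come back as 'integer'.
row = conn.execute("SELECT typeof(a), typeof(b) FROM t LIMIT 1").fetchone()
print(row)  # ('integer', 'integer')
```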

The problem must lie in your interface to SQLite, which is probably 
SQLiteDataReader itself.  I suspect that the problem is somewhere in or around

                private SQLiteType GetSQLiteType(int i)

which is on line 965 of

<https://github.com/mono/mono/blob/master/mcs/class/Mono.Data.Sqlite/Mono.Data.Sqlite_2.0/SQLiteDataReader.cs>

but I don't know C# well enough to get any closer.  From what I can see, 
though, it assumes that SQLite enforces fixed column types, which is, of 
course, not true.

> What is the difference if I declare a column as bigint or int?

In SQLite, none.  As you correctly understood from section 2.2 of

<http://www.sqlite.org/datatype3.html>

both type names are interpreted as meaning "INTEGER" (they have INTEGER 
affinity).  However, it looks like SQLiteDataReader treats them differently.
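To illustrate the point about affinity, here is another small sketch (again in
Python's sqlite3 module, purely for illustration): a value that doesn't fit in
32 bits round-trips intact through a column declared INT, because SQLite stores
every integer as a 64-bit value regardless of the declared type name.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Both declarations map to INTEGER affinity; neither imposes a size limit.
conn.execute("CREATE TABLE t (small INT, big BIGINT)")
conn.execute("INSERT INTO t VALUES (?, ?)", (2**40, 2**40))

# Both columns return the full 64-bit value unchanged.
a, b = conn.execute("SELECT small, big FROM t").fetchone()
print(a == b == 2**40)  # True
```

Any truncation to 32 bits therefore has to happen in the client layer, not in SQLite.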

Simon.
_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
