Regarding:
>> I am sure there is a better way to deal with 12K rows by 2500
>> columns, but I can't figure it out....
I wonder if you might want to use *sed* or *awk* or *perl* to preprocess the data before import. A "master" table could contain the unique person id, plus the fields that you intend to index and that you are likely to filter upon most often. Other tables could exist for the remaining data, and could be joined on the person id as needed.

This might:
-- let you avoid a customized version of sqlite
-- allow your most-used queries to run faster
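Something along these lines, as a rough sketch only (the table and column names -- person, person_extra, region, measure_0001, and so on -- are just placeholders for whatever your data actually contains):

    -- "master" table: person id plus the handful of columns
    -- you index and filter on most often
    CREATE TABLE person (
        person_id  INTEGER PRIMARY KEY,
        birth_year INTEGER,
        region     TEXT
    );
    CREATE INDEX idx_person_region ON person(region);

    -- the remaining columns go into one or more side tables,
    -- keyed on the same person id
    CREATE TABLE person_extra (
        person_id    INTEGER REFERENCES person(person_id),
        measure_0001 REAL,
        measure_0002 REAL
        -- ... and so on for the rest of the wide data,
        -- possibly split across several such tables
    );

    -- join back on the person id only when a query needs it
    SELECT p.person_id, e.measure_0001
      FROM person p
      JOIN person_extra e ON e.person_id = p.person_id
     WHERE p.region = 'north';

The filtering then touches only the narrow, indexed master table, and the wide side tables are read only for the rows that survive the filter.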