Could you provide a little more clarification on what you're doing? Are you trying to use the CLI *while* the Python script is doing inserts?
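(As an aside: a quick way to verify from the Python side whether a DELETE/UPDATE actually touched any rows is the connection's `total_changes` counter. This is a minimal sketch only; the table name `t` and the in-memory database are illustrative, not your schema:)

```python
import sqlite3

# Illustrative throwaway database; substitute your real file and table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t(id INTEGER PRIMARY KEY, v TEXT)")
con.execute("INSERT INTO t VALUES (1, 'a')")

before = con.total_changes
con.execute("DELETE FROM t WHERE id = 1")
# total_changes counts rows actually modified by INSERT/UPDATE/DELETE
# on this connection, so the difference tells you whether the DELETE
# really removed anything.
print(con.total_changes - before)
```

If that difference is 0 for a row you expect to exist, the WHERE clause is matching nothing, which would point at the data rather than at the CLI.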
When you try to do an update or delete with the CLI, does it hang and not complete, or does it happily continue and let you keep going? If you do ".changes on" before running the query, does the reported change count increase? If there's no sensitive data in the schema, could you share the schema and/or copy the screen text from the CLI with an example?

-----Original Message-----
From: sqlite-users [mailto:sqlite-users-boun...@mailinglists.sqlite.org] On Behalf Of Fiona
Sent: Thursday, October 12, 2017 5:41 AM
To: sqlite-users@mailinglists.sqlite.org
Subject: [sqlite] Sqlite3.6 Command-line delete/update not working with large db file (>280GB)

Here are the specifics of my problem: Ubuntu, SQLite 3.6.20. I have only two tables, each with a primary key and an index. I use Python code to insert/update the data in these two tables, and one table has a column with large blob data. Now I have a db file about 289 GB in size. When I update/delete with the command line, the data is not changed/deleted at all, and no error is ever returned, while inserts still work. I looked through the SQLite limits; they say there is practically no limit on the size of a db file, given that you have enough disk space. So please help me: where can I look to solve this? Thanks a lot!

--
Sent from: http://sqlite.1065341.n5.nabble.com/
_______________________________________________
sqlite-users mailing list
sqlite-users@mailinglists.sqlite.org
http://mailinglists.sqlite.org/cgi-bin/mailman/listinfo/sqlite-users