I forgot that sqlite page_size is lost during a .dump.
The command should actually be:
(echo "PRAGMA page_size=8192;" ; sqlite3 your.db .dump) | sqlite3 new.db
On my machine, with a large database, this runs 4 times faster than the
command below, which leaves the default page_size of 1024. Replace
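For context, here is the same rebuild as a minimal self-contained sketch. It assumes the sqlite3 CLI is on PATH; "your.db" is just a throwaway example database created on the spot, stand in your real file.

```shell
# A minimal sketch of the dump/reload rebuild. Assumes sqlite3 on PATH;
# "your.db" is a throwaway example database created here for illustration.
sqlite3 your.db "CREATE TABLE t(a); INSERT INTO t VALUES(1);"  # sample data
sqlite3 your.db "PRAGMA page_size;"   # page size of the original file

# The PRAGMA must precede the first CREATE TABLE from the dump: page_size
# is fixed as soon as the first page of the new file is written.
(echo "PRAGMA page_size=8192;"; sqlite3 your.db .dump) | sqlite3 new.db

sqlite3 new.db "PRAGMA page_size;"    # prints 8192
```

The order matters because a page_size PRAGMA issued after the database file has been initialized is a no-op; piping it in ahead of the dump is what makes it stick.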
On Sat, 18 Nov 2006 06:55:34 -0800, P Kishor wrote:
>didn't try any of your tricks, but can confirm that VACUUM is very
>slow on a similar db I have...
>
>table1 -- 210k rows x 6 cols, 4 indexes, 1 pk
>table2 -- 36k rows x 6 cols, 4 indexes, 1 pk
>table3 -- 16k rows x 6 cols, 4 indexes, 1 pk
$ ./sqlite3.exe v.db vacuum
ATTACH 'C:\TMP\etilqs_SOVEJE7Rni84Zzy' AS vacuum_db;
PRAGMA vacuum_db.synchronous=OFF
BEGIN EXCLUSIVE;
CREATE TABLE vacuum_db.t1(a, b, primary key(b, a))
CREATE TABLE vacuum_db.t2(c, d)
CREATE INDEX vacuum_db.t2i on t2(d, c)
INSERT INTO vacuum_db.'t1' SELECT * FROM 't1';
INSERT INTO vacuum_db.sqlite_master SELECT type, name, tbl_name, rootpage,
    sql FROM sqlite_master WHERE type='view' OR type='trigger' OR
    (type='table' AND rootpage=0)
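The trace above amounts to a hand-rolled copy into an attached database. A sketch of the same idea follows; the filenames (old.db, rebuilt.db) and the t1 schema are illustrative, taken from the trace, not from anyone's real database, and these are not the exact statements VACUUM issues.

```shell
# Hand-rolled version of what the trace shows: attach a fresh db and copy
# the schema and rows across. All names here are illustrative.
sqlite3 old.db "CREATE TABLE t1(a, b, PRIMARY KEY(b, a)); INSERT INTO t1 VALUES(1, 2);"
sqlite3 old.db <<'EOF'
ATTACH 'rebuilt.db' AS vacuum_db;
PRAGMA vacuum_db.synchronous=OFF;
BEGIN EXCLUSIVE;
CREATE TABLE vacuum_db.t1(a, b, PRIMARY KEY(b, a));
INSERT INTO vacuum_db.t1 SELECT * FROM t1;
COMMIT;
EOF
sqlite3 rebuilt.db "SELECT * FROM t1;"   # prints 1|2
```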
- Original Message
From: Nemanja Corlija <[EMAIL PROTECTED]>
To: sqlite-users@sqlite.org
On 11/18/06, P Kishor <[EMAIL PROTECTED]> wrote:
didn't try any of your tricks, but can confirm that VACUUM is very
slow on a similar db I have...
Since you obviously have some CPU cycles and RAM to spare, in my
experience at least you'll benefit greatly by doing it yourself instead
didn't try any of your tricks, but can confirm that VACUUM is very
slow on a similar db I have...
table1 -- 210k rows x 6 cols, 4 indexes, 1 pk
table2 -- 36k rows x 6 cols, 4 indexes, 1 pk
table3 -- 16k rows x 6 cols, 4 indexes, 1 pk
table4 -- 5M rows x 4 cols, 2 indexes, 1 pk
total size on file ...
Nemanja Corlija wrote:
I have a db with one table that has a text primary key and 16 text
columns in total.
After importing data from a CSV file, the db had 5M rows and the file
size was 833MB. After some big DELETEs the db had around 3M rows, and
500MB after "VACUUMing".
Running VACUUM for more than an hour filled the new db with ~300MB w...
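The behavior being described, the file staying large after DELETE until VACUUM rewrites it, can be reproduced in a small sketch. The table, row count, and blob size below are arbitrary illustrations; only the sqlite3 CLI is assumed.

```shell
# Illustrative only: DELETE leaves the file size unchanged (freed pages go
# on the freelist); VACUUM rewrites the file and shrinks it.
sqlite3 demo.db "CREATE TABLE t(x);"
sqlite3 demo.db "WITH RECURSIVE c(i) AS (SELECT 1 UNION ALL SELECT i+1 FROM c WHERE i<20000)
                 INSERT INTO t SELECT randomblob(100) FROM c;"
wc -c < demo.db                           # a few MB
sqlite3 demo.db "DELETE FROM t;"
wc -c < demo.db                           # roughly unchanged
sqlite3 demo.db "PRAGMA freelist_count;"  # many pages on the freelist
sqlite3 demo.db "VACUUM;"
wc -c < demo.db                           # back to a few KB
```

This is the same reason the dump-and-reload trick earlier in the thread works: both VACUUM and a reload rebuild the file without the dead pages.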