Hi,

Not sure whether it's relevant or not, but after adding an ANALYSE before
the second vacuum the issue has not presented itself when testing. I have
managed 95 cycles so far.

BEGIN;
CREATE TABLE x (
    id BIGSERIAL PRIMARY KEY,
    payload1 VARCHAR,
    payload2 VARCHAR,
    payload3 VARCHAR,
    payload4 BIGINT,
    payload5 BIGINT
);
/* Repeat until 2,000,000 rows are inserted */
INSERT INTO x (id, payload1, payload2, payload3, payload4, payload5) VALUES
(random values of varying length/size to force random toast usage);
COMMIT;
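
For reference, the insert step could be sketched in plain SQL along these
lines (the md5/repeat trick and the length multiplier are my assumptions,
not the actual test harness; repeating a 32-character md5 string up to ~200
times pushes some rows past the ~2 kB TOAST threshold while leaving others
inline):

INSERT INTO x (payload1, payload2, payload3, payload4, payload5)
SELECT repeat(md5(random()::text), (random() * 200)::int),
       repeat(md5(random()::text), (random() * 200)::int),
       repeat(md5(random()::text), (random() * 200)::int),
       (random() * 1e9)::bigint,
       (random() * 1e9)::bigint
FROM generate_series(1, 2000000);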

VACUUM (ANALYZE, FULL);

BEGIN;
/* Repeat until all 2,000,000 rows are updated */
UPDATE x SET payload1 = ..., payload2 = ..., payload3 = ..., payload4 = ...,
payload5 = ...;
/* again random values of varying length/size to force random toast usage */
COMMIT;
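
A single-statement equivalent of that update loop might look like the
following (again a sketch using the same assumed md5/repeat value
generation, so every row and its TOAST chunks get rewritten):

UPDATE x
SET payload1 = repeat(md5(random()::text), (random() * 200)::int),
    payload2 = repeat(md5(random()::text), (random() * 200)::int),
    payload3 = repeat(md5(random()::text), (random() * 200)::int),
    payload4 = (random() * 1e9)::bigint,
    payload5 = (random() * 1e9)::bigint;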

ANALYZE x; -- <== New analyse here.

VACUUM (ANALYZE, FULL);

(...)

Vacuum end. 30 sec sleep

Update selective

Inserting the rows

update 0

update 20000

update 40000

update 60000

update 80000

update 100000

update 120000

update 140000

update 160000

update 180000

Update all

Attempting vacuum

Vacuum completed

dropping the table

=========================================

New attempt - number 96

Inserting the rows

Executing  0

Executing  40000

Executing  80000

Executing  120000

Executing  160000

Executing  200000

Executing  240000

Executing  280000

(...)

Many thanks,
Harry
