Hey,
I've developed a database auditing/versioning tool that uses the JSON data type 
(http://github.com/fxku/audit) and am doing some tests now. 
Unfortunately I'm facing some problems when dealing with tables that store 
images as BYTEA. Some of the images are around 15 MB in size.

My tool logs changes to JSON and can populate them back into views with 
json_populate_recordset and json_agg. When performing this procedure on tables 
containing BYTEA columns I receive a 54000 error ("Cannot enlarge string 
buffer"). Is this caused by json_agg aggregating everything into one string, 
or can a single row already exceed the buffer?
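
Roughly, the round trip I perform is equivalent to the following (simplified; 
the table and column names here are made up for illustration, not the real 
ones from the tool):

  -- hypothetical table standing in for an audited table with images
  CREATE TABLE audited_table (id int, img bytea);

  -- logging: aggregate the rows into one JSON array
  SELECT json_agg(row_to_json(t)) FROM audited_table t;
  -- ^ this is where I get SQLSTATE 54000 ("Cannot enlarge string buffer")

  -- restoring: turn a logged JSON array back into rows
  SELECT * FROM json_populate_recordset(
      null::audited_table,
      '[{"id": 1, "img": "\\x0001"}]'::json
  );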

Executing to_json on the whole column that contains the binary data (more than 
500 MB in total) led to out-of-memory errors. The same goes for hstore. 
Executing these functions only on the biggest image succeeded but froze my 
pgAdmin. When I encoded the BYTEA to TEXT before transforming it to JSON or 
hstore it worked. But json_populate_recordset still runs into memory problems 
(although EXPLAIN worked). 
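
In other words, this works for me while the direct conversion does not (again 
with the hypothetical table/column from above):

  -- direct conversion: runs out of memory / freezes the client on big images
  SELECT to_json(img) FROM audited_table;

  -- encoding the BYTEA to TEXT first: works
  SELECT to_json(encode(img, 'base64')) FROM audited_table;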

Do you think JSONB will solve my problems in the future?

Here is also a size comparison between the BYTEA (and its versions encoded to 
TEXT) and the JSON / hstore output, which I found quite interesting:

operation  | bytea | bytea->'escape' | bytea->'hex' | bytea->'base64'
-----------|-------|-----------------|--------------|----------------
(raw size) | 15 MB | 40 MB           | 31 MB        | 21 MB
to_json()  | 57 MB | 57 MB           | 31 MB        | 21 MB
hstore()   | 46 MB | 40 MB           | 31 MB        | 21 MB
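
For reproducing the comparison, something like this should give equivalent 
numbers (octet_length counts the bytes of each representation; names 
hypothetical as above):

  SELECT pg_size_pretty(octet_length(img)::bigint)                    AS raw,
         pg_size_pretty(octet_length(encode(img, 'escape'))::bigint)  AS escape,
         pg_size_pretty(octet_length(encode(img, 'hex'))::bigint)     AS hex,
         pg_size_pretty(octet_length(encode(img, 'base64'))::bigint)  AS base64,
         pg_size_pretty(octet_length(to_json(img)::text)::bigint)     AS json_raw
  FROM audited_table WHERE id = 1;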

Thanks in advance for any hints.
Cheers,
Felix

