Re: JSON out of memory error on PostgreSQL 9.6.x
On Mon, Dec 4, 2017 at 9:39 AM, Tom Lane wrote:
> Ah.  The problem here is that "json_rmq->>'totalSize'" leaks some memory
> on each execution, and it's executed again for each row produced by the
> json_array_elements() SRF, and the memory can't be reclaimed until we've
> finished the full output cycle for the SRF.  So the leakage (which is
> more or less of the size of the JSON value, I think) accumulates across
> 150K executions in this example.
>
> This is fixed as of v10.  It seems impractical to do anything about it
> in previous release branches, although you could reformulate your query
> to avoid it by not having any other expression evaluations occurring in
> the same tlist as the SRF.  Something like this should work:

Yeah, I agree with that.  A similar leak has actually been fixed by the
commit below, and the infrastructure of v10 made that fix dead simple:

commit: 0c25e9652461c08b5caef259a6af27a38707e07a
author: Tom Lane
date: Fri, 6 Oct 2017 14:28:42 -0400
Fix intra-query memory leakage in nodeProjectSet.c.
https://www.postgresql.org/message-id/20171005230321.28561.15...@wrigleys.postgresql.org

From what I have studied of this code, nodeProjectSet.c makes
tuple-level memory handling way easier.
--
Michael
Re: JSON out of memory error on PostgreSQL 9.6.x
Thank you, I will look into the work-around!

From: Tom Lane
Sent: Monday, 4 December 2017 11:39 AM
To: Yuri Budilov
Cc: rob stone; pgsql-general@lists.postgresql.org
Subject: Re: JSON out of memory error on PostgreSQL 9.6.x

Yuri Budilov writes:
> The out-of-memory error happens if I also retrieve another JSON column
> like so:
>
> CREATE TABLE csnbi_stg.junk4
> AS
> SELECT
>    json_rmq->>'totalSize' as totalSize, -- this plus array below causes out of memory error
>    json_array_elements(json_rmq -> 'orders'::text) AS orders
> FROM csnbi_stg.stg_rmq_prod_test_json_raw_jmcn stg_rmq_json_raw

Ah.  The problem here is that "json_rmq->>'totalSize'" leaks some memory
on each execution, and it's executed again for each row produced by the
json_array_elements() SRF, and the memory can't be reclaimed until we've
finished the full output cycle for the SRF.  So the leakage (which is
more or less of the size of the JSON value, I think) accumulates across
150K executions in this example.

This is fixed as of v10.  It seems impractical to do anything about it
in previous release branches, although you could reformulate your query
to avoid it by not having any other expression evaluations occurring in
the same tlist as the SRF.  Something like this should work:

SELECT totalSize,
       json_array_elements(json_rmq -> 'orders'::text) AS orders
FROM (SELECT json_rmq->>'totalSize' as totalSize,
             json_rmq
      FROM csnbi_stg.stg_rmq_prod_test_json_raw_jmcn stg_rmq_json_raw
      WHERE ...
      OFFSET 0) ss;

regards, tom lane
Re: JSON out of memory error on PostgreSQL 9.6.x
Yuri Budilov writes:
> The out-of-memory error happens if I also retrieve another JSON column
> like so:
>
> CREATE TABLE csnbi_stg.junk4
> AS
> SELECT
>    json_rmq->>'totalSize' as totalSize, -- this plus array below causes out of memory error
>    json_array_elements(json_rmq -> 'orders'::text) AS orders
> FROM csnbi_stg.stg_rmq_prod_test_json_raw_jmcn stg_rmq_json_raw

Ah.  The problem here is that "json_rmq->>'totalSize'" leaks some memory
on each execution, and it's executed again for each row produced by the
json_array_elements() SRF, and the memory can't be reclaimed until we've
finished the full output cycle for the SRF.  So the leakage (which is
more or less of the size of the JSON value, I think) accumulates across
150K executions in this example.

This is fixed as of v10.  It seems impractical to do anything about it
in previous release branches, although you could reformulate your query
to avoid it by not having any other expression evaluations occurring in
the same tlist as the SRF.  Something like this should work:

SELECT totalSize,
       json_array_elements(json_rmq -> 'orders'::text) AS orders
FROM (SELECT json_rmq->>'totalSize' as totalSize,
             json_rmq
      FROM csnbi_stg.stg_rmq_prod_test_json_raw_jmcn stg_rmq_json_raw
      WHERE ...
      OFFSET 0) ss;

regards, tom lane
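On 9.3 and later, a similar effect can also be had by moving the SRF out of the target list entirely with a LATERAL join, so that no other expression shares a tlist with it. This is only a sketch against the table and column names seen in the thread (the WHERE clause is elided as above), and it has not been tested against the reporter's data:

```sql
-- Sketch: LATERAL reformulation (PostgreSQL 9.3+).  The SRF runs as a
-- function scan in FROM, so json_rmq->>'totalSize' is no longer
-- evaluated in the same target list as the set-returning function.
SELECT stg.json_rmq->>'totalSize' AS totalSize,
       o.ord                      AS orders
FROM   csnbi_stg.stg_rmq_prod_test_json_raw_jmcn stg,
       LATERAL json_array_elements(stg.json_rmq -> 'orders') AS o(ord)
WHERE  ...;
```

Whether this also sidesteps the 9.6 leak in practice would need verifying; the OFFSET 0 subquery form above is the one suggested on-list.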
Re: JSON out of memory error on PostgreSQL 9.6.x
On 12/3/2017 3:18 PM, Yuri Budilov wrote:
> CREATE TABLE X AS
> SELECT json_array_elements(json_rmq -> 'orders'::text) AS order
> FROM table_name
> WHERE blah;
>
> I get out of memory error.

are you sure that's a postgres error?

are you doing this in psql, or what sort of application environment?

how many rows does 'blah' match?

what is...

    SELECT pg_column_size(json_array_elements(json_rmq -> 'orders'::text))
    FROM table_name
    WHERE blah;

?

--
john r pierce, recycling bits in santa cruz
Re: JSON out of memory error on PostgreSQL 9.6.x
The plot thickens! I have more information.

The out-of-memory error happens if I also retrieve another JSON column like so:

-- fails
CREATE TABLE csnbi_stg.junk4
AS
SELECT
   json_rmq->>'totalSize' as totalSize, -- this plus array below causes out of memory error
   json_array_elements(json_rmq -> 'orders'::text) AS orders
FROM csnbi_stg.stg_rmq_prod_test_json_raw_jmcn stg_rmq_json_raw
WHERE rmq_exchange_name = 'Staging.Salesforce.Order'
AND rmq_message_id = 'd1200de2-30b0-4599-bb17-64405f45ca19';

If I *only* retrieve the JSON array by itself, then it works:

CREATE TABLE csnbi_stg.junk5
AS
SELECT
   -- json_rmq->>'totalSize' as totalSize, -- take this OUT and below works
   json_array_elements(json_rmq -> 'orders'::text) AS orders
FROM csnbi_stg.stg_rmq_prod_test_json_raw_jmcn stg_rmq_json_raw
WHERE rmq_exchange_name = 'Staging.Salesforce.Order'
AND rmq_message_id = 'd1200de2-30b0-4599-bb17-64405f45ca19';

THANK YOU

From: Yuri Budilov
Sent: Monday, 4 December 2017 11:14 AM
To: rob stone; pgsql-general@lists.postgresql.org
Subject: Re: JSON out of memory error on PostgreSQL 9.6.x

hello good people

It is *not* JSONB, just plain JSON. The version is 9.6.3, running inside
AWS RDS PostgreSQL (DBaaS). The machine size is just 1 GB RAM and 1 CPU,
a so-called "micro" size AWS RDS instance, which we use for DEV. We have
also reproduced it on a 2 CPU 8 GB RAM instance, FWIW. It takes well
under 1 min elapsed time to fail.

best regards and many thanks for trying to help me

From: rob stone
Sent: Monday, 4 December 2017 11:01 AM
To: Yuri Budilov; John R Pierce; pgsql-general@lists.postgresql.org
Subject: Re: JSON out of memory error on PostgreSQL 9.6.x

On Sun, 2017-12-03 at 23:18 +, Yuri Budilov wrote:
> Posted on Stack Overflow, sadly no replies, so trying here
>
> CREATE TABLE X AS
> SELECT json_array_elements(json_rmq -> 'orders'::text) AS order
> FROM table_name
> WHERE blah;
>
> I get out of memory error.
>
> Is there anything I can do to unpack the above?
>
> The JSON column is about ~5 MB and it has about ~150,000 array
> row elements in 'orders' above.
>
> I tried work_mem values up to ~250MB and it did not help, the query
> takes about same time to fail.
>
> I guess this parameter does not help JSON processing.
>
> Is there another parameter I can try? Something else?
>
> I don't have control of the size of the JSON payload, it arrives, we
> store it in a JSON column and then we need to crack it open.
>
> Many thanks!

Hello,

It would help if you advised:
(a) version of PostgreSQL being used.
(b) is column json_rmq defined as json or jsonb?
(c) OS.

Cheers,
Rob
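Points (a) and (b) above can also be answered server-side with two quick queries. The schema, table, and column names here are the ones that appear elsewhere in the thread, so adjust as needed:

```sql
-- (a) exact server version string
SELECT version();

-- (b) declared type of the json_rmq column ('json' vs 'jsonb')
SELECT data_type
FROM   information_schema.columns
WHERE  table_schema = 'csnbi_stg'
  AND  table_name   = 'stg_rmq_prod_test_json_raw_jmcn'
  AND  column_name  = 'json_rmq';
```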
Re: JSON out of memory error on PostgreSQL 9.6.x
On 12/3/2017 3:18 PM, Yuri Budilov wrote:
> Posted on Stack Overflow, sadly no replies, so trying here
> ...

why did you email me personally?

--
john r pierce, recycling bits in santa cruz
Re: JSON out of memory error on PostgreSQL 9.6.x
hello good people

It is *not* JSONB, just plain JSON. The version is 9.6.3, running inside
AWS RDS PostgreSQL (DBaaS). The machine size is just 1 GB RAM and 1 CPU,
a so-called "micro" size AWS RDS instance, which we use for DEV. We have
also reproduced it on a 2 CPU 8 GB RAM instance, FWIW. It takes well
under 1 min elapsed time to fail.

best regards and many thanks for trying to help me

From: rob stone
Sent: Monday, 4 December 2017 11:01 AM
To: Yuri Budilov; John R Pierce; pgsql-general@lists.postgresql.org
Subject: Re: JSON out of memory error on PostgreSQL 9.6.x

On Sun, 2017-12-03 at 23:18 +, Yuri Budilov wrote:
> Posted on Stack Overflow, sadly no replies, so trying here
>
> CREATE TABLE X AS
> SELECT json_array_elements(json_rmq -> 'orders'::text) AS order
> FROM table_name
> WHERE blah;
>
> I get out of memory error.
>
> Is there anything I can do to unpack the above?
>
> The JSON column is about ~5 MB and it has about ~150,000 array
> row elements in 'orders' above.
>
> I tried work_mem values up to ~250MB and it did not help, the query
> takes about same time to fail.
>
> I guess this parameter does not help JSON processing.
>
> Is there another parameter I can try? Something else?
>
> I don't have control of the size of the JSON payload, it arrives, we
> store it in a JSON column and then we need to crack it open.
>
> Many thanks!

Hello,

It would help if you advised:
(a) version of PostgreSQL being used.
(b) is column json_rmq defined as json or jsonb?
(c) OS.

Cheers,
Rob
Re: JSON out of memory error on PostgreSQL 9.6.x
On Sun, 2017-12-03 at 23:18 +, Yuri Budilov wrote:
> Posted on Stack Overflow, sadly no replies, so trying here
>
> CREATE TABLE X AS
> SELECT json_array_elements(json_rmq -> 'orders'::text) AS order
> FROM table_name
> WHERE blah;
>
> I get out of memory error.
>
> Is there anything I can do to unpack the above?
>
> The JSON column is about ~5 MB and it has about ~150,000 array
> row elements in 'orders' above.
>
> I tried work_mem values up to ~250MB and it did not help, the query
> takes about same time to fail.
>
> I guess this parameter does not help JSON processing.
>
> Is there another parameter I can try? Something else?
>
> I don't have control of the size of the JSON payload, it arrives, we
> store it in a JSON column and then we need to crack it open.
>
> Many thanks!

Hello,

It would help if you advised:
(a) version of PostgreSQL being used.
(b) is column json_rmq defined as json or jsonb?
(c) OS.

Cheers,
Rob
Re: JSON out of memory error on PostgreSQL 9.6.x
Yuri Budilov writes:
> Posted on Stack Overflow, sadly no replies, so trying here
>
> CREATE TABLE X AS
> SELECT json_array_elements(json_rmq -> 'orders'::text) AS order
> FROM table_name
> WHERE blah;
>
> I get out of memory error.
>
> The JSON column is about ~5 MB and it has about ~150,000 array row
> elements in 'orders' above.

I tried to reproduce that, and couldn't, given the available info.
I made a JSON value of more or less that size with

perl -e 'print "{\"orders\": [0"; for($i=1;$i<=150000;$i++){print ",$i"}; print "]}\n";' >jsonval

and then did

regression=# create table table_name(json_rmq json);
CREATE TABLE
regression=# \copy table_name from jsonval
COPY 1
regression=# insert into table_name select * from table_name;
INSERT 0 1
regression=# insert into table_name select * from table_name;
INSERT 0 2
regression=# insert into table_name select * from table_name;
INSERT 0 4
regression=# insert into table_name select * from table_name;
INSERT 0 8
regression=# insert into table_name select * from table_name;
INSERT 0 16
regression=# insert into table_name select * from table_name;
INSERT 0 32
regression=# insert into table_name select * from table_name;
INSERT 0 64
regression=# insert into table_name select * from table_name;
INSERT 0 128
regression=# CREATE TABLE X AS
SELECT json_array_elements(json_rmq -> 'orders'::text) AS order
FROM table_name;
SELECT 38400256

Watching the process with "top", its memory consumption stayed
rock-steady.  If there's a leak in there, this example doesn't show it.
There could be a leak related to some detail you failed to mention,
but ...

regards, tom lane
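For anyone reproducing without perl, roughly the same test value can be built entirely in SQL. This is a sketch using json_build_object and json_agg (both require 9.4 or later), not something run at exactly this scale here:

```sql
-- Build one JSON value with 150,001 array elements under 'orders',
-- analogous to the perl one-liner above, directly into table_name.
CREATE TABLE table_name AS
SELECT json_build_object('orders', json_agg(i)) AS json_rmq
FROM   generate_series(0, 150000) AS i;
```

The repeated self-inserts from the session above can then be used as-is to multiply the row count before running the failing query.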
JSON out of memory error on PostgreSQL 9.6.x
Posted on Stack Overflow, sadly no replies, so trying here.

CREATE TABLE X AS
SELECT json_array_elements(json_rmq -> 'orders'::text) AS order
FROM table_name
WHERE blah;

I get an out-of-memory error.

Is there anything I can do to unpack the above?

The JSON column is about ~5 MB and it has about ~150,000 array row
elements in 'orders' above.

I tried work_mem values up to ~250 MB and it did not help; the query
takes about the same time to fail. I guess this parameter does not help
JSON processing.

Is there another parameter I can try? Something else?

I don't have control of the size of the JSON payload; it arrives, we
store it in a JSON column, and then we need to crack it open.

Many thanks!