Re: [GENERAL] My Experiment of PG crash when dealing with huge amount of data

2013-09-06 Thread 高健
Hello: Sorry for disturbing again. Some friends told me about cgroups, so I tried that first. I found that cgroups can limit a task such as wget, but it does not work for my postgres process. [root@cent6 Desktop]# cat /etc/cgconfig.conf # # Copyright IBM Corporation. 2007 # # Authors:
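For readers trying the same approach, a minimal sketch of a memory cgroup for this purpose follows. The group name "pglimit" and the 2 GB cap are hypothetical; note also that the limit only constrains processes started *inside* the group, so the postmaster itself must be launched under it (e.g. via cgexec) — limiting only the client session, as works for wget, leaves the server's backend processes unconstrained, which may be why the poster saw no effect.

```shell
# Illustrative /etc/cgconfig.conf fragment (cgroups v1 / libcgroup syntax).
# Group name "pglimit" and the 2 GB value are example choices.
group pglimit {
    memory {
        memory.limit_in_bytes = 2147483648;
    }
}
```

The server would then be started inside the group, for example with `cgexec -g memory:pglimit pg_ctl -D <datadir> start` (path placeholder left unfilled), so that every forked backend inherits the limit.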

Re: [GENERAL] My Experiment of PG crash when dealing with huge amount of data

2013-09-02 Thread Jeff Janes
On Sun, Sep 1, 2013 at 6:25 PM, 高健 luckyjack...@gmail.com wrote: To spare memory, you would want to use something like: insert into test01 select generate_series, repeat(chr(int4(random()*26)+65),1024) from generate_series(1,2457600); Thanks a lot! What I am worrying about is that: If data

Re: [GENERAL] My Experiment of PG crash when dealing with huge amount of data

2013-09-02 Thread 高健
Thanks, I'll consider it carefully. Best Regards 2013/9/3 Jeff Janes jeff.ja...@gmail.com On Sun, Sep 1, 2013 at 6:25 PM, 高健 luckyjack...@gmail.com wrote: To spare memory, you would want to use something like: insert into test01 select generate_series,

Re: [GENERAL] My Experiment of PG crash when dealing with huge amount of data

2013-09-01 Thread 高健
To spare memory, you would want to use something like: insert into test01 select generate_series, repeat(chr(int4(random()*26)+65),1024) from generate_series(1,2457600); Thanks a lot! What I am worrying about is that: if data grows rapidly, maybe our customer will use too much memory. Is

Re: [GENERAL] My Experiment of PG crash when dealing with huge amount of data

2013-09-01 Thread Tom Lane
高健 luckyjack...@gmail.com writes: If data grows rapidly, maybe our customer will use too much memory. Is the ulimit command a good idea for PG? There's no received wisdom saying that it is. There's a fairly widespread consensus that disabling OOM kill can be a good idea, but I
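The consensus Tom alludes to is usually implemented at the kernel level rather than with ulimit. A sketch of the commonly recommended Linux settings (the overcommit ratio is an example value, and the postmaster PID is a placeholder):

```shell
# Disable memory overcommit so allocations fail with ENOMEM instead of
# triggering the OOM killer later:
sysctl -w vm.overcommit_memory=2
sysctl -w vm.overcommit_ratio=80    # example value; tune per system

# On kernels supporting oom_score_adj, additionally exempt the
# postmaster from OOM selection (placeholder PID left unfilled):
# echo -1000 > /proc/<postmaster-pid>/oom_score_adj
```

With overcommit disabled, a backend that exhausts memory gets a clean "out of memory" error and rolls back, instead of the whole process being killed with signal 9 as seen in this thread.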

Re: [GENERAL] My Experiment of PG crash when dealing with huge amount of data

2013-08-31 Thread Jeff Janes
On Fri, Aug 30, 2013 at 2:10 AM, 高健 luckyjack...@gmail.com wrote: postgres=# insert into test01 values(generate_series(1,2457600),repeat( chr(int4(random()*26)+65),1024)); The construct values (srf1, srf2) will generate its entire result set in memory up front; it will not stream its results
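The distinction Jeff describes can be made concrete by putting the two forms from the thread side by side (table test01 is the poster's example table):

```sql
-- Form used in the experiment: the set-returning functions inside
-- VALUES materialize all 2,457,600 rows in memory before the insert
-- begins, which is what drove the backend into the OOM killer.
INSERT INTO test01
VALUES (generate_series(1,2457600),
        repeat(chr(int4(random()*26)+65),1024));

-- Form Jeff recommends: generate_series in FROM streams one row at a
-- time through the insert, so memory use stays flat.
INSERT INTO test01
SELECT generate_series,
       repeat(chr(int4(random()*26)+65),1024)
FROM   generate_series(1,2457600);
```

Both statements produce the same table contents; only the first builds the whole result set in backend memory before inserting.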

Re: [GENERAL] My Experiment of PG crash when dealing with huge amount of data

2013-08-30 Thread hxreno1
This is most likely the operating system's OOM killer terminating the pg process; check syslog. On Fri 30 Aug 2013 05:10:42 PM CST, 高健 wrote: Hello: I have done the following experiment to test PG's activity when dealing with data which is bigger in size than the total memory of the whole OS. The result is: PG

Re: [GENERAL] My Experiment of PG crash when dealing with huge amount of data

2013-08-30 Thread Michael Paquier
On Fri, Aug 30, 2013 at 6:10 PM, 高健 luckyjack...@gmail.com wrote: In the log, I can see the following: LOG: background writer process (PID 3221) was terminated by signal 9: Killed Assuming that no users on your server manually killed this process, or that no maintenance task you implemented did