Hello:
Sorry to disturb you. To make my question clear, I wrote it as a separate question.
Under a cgroup, I find that wget works well within the memory limit. But for PostgreSQL, when I process a huge amount of data, it still reports an out-of-memory error. In fact, I hope PostgreSQL can work under such a limit.
Now let me answer it myself.
When I changed memory.limit_in_bytes to 300M, it worked.
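For reference, the cgroup-v1 steps look roughly like this; the group name "pglimit" and the data-directory path are just examples for my setup, so adjust them to yours (run as root):

```shell
# Create a memory cgroup for PostgreSQL (v1 hierarchy; "pglimit" is an example name).
mkdir -p /sys/fs/cgroup/memory/pglimit

# Cap the group's memory at 300 MB, as in the test above.
echo 300M > /sys/fs/cgroup/memory/pglimit/memory.limit_in_bytes

# Move the running postmaster into the group; backends it forks afterwards inherit it.
# The first line of postmaster.pid is the postmaster's PID (path depends on your install).
echo "$(head -1 /var/lib/pgsql/data/postmaster.pid)" > /sys/fs/cgroup/memory/pglimit/tasks
```

Note that backends already running before the move are not affected, so it is simplest to put the postmaster into the group before clients connect.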
Memory before and after the SQL statement execution is:
[postgres@cent6 Desktop]$ free -m
             total       used       free     shared    buffers     cached
Mem:          2006        537