Re: [SQL] overload

2011-07-11 Thread Viktor Bojović
I have found a cursor example in PL/Perl, and now it works fine. The code is below. CREATE OR REPLACE FUNCTION uniprot_frekvencije.pattern_counter2("patLength" integer) RETURNS character varying AS $BODY$ my $sth = spi_query("select sequence from entry"); my $patLen = $_[0]; my $patt = '';
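The preview cuts off early; a minimal sketch of what the complete cursor-based function might look like follows, assuming an entry table with a text column sequence and that the function tallies every substring of length patLength (the tally logic and the return value are assumptions, since the original body is truncated):

CREATE OR REPLACE FUNCTION uniprot_frekvencije.pattern_counter2("patLength" integer)
  RETURNS character varying AS
$BODY$
    my $patLen = $_[0];
    my %patterns;

    # spi_query opens an implicit cursor, so rows are streamed one at a
    # time via spi_fetchrow instead of being materialized all at once.
    my $sth = spi_query("select sequence from entry");
    while (defined (my $row = spi_fetchrow($sth))) {
        my $seq = $row->{sequence};
        for my $i (0 .. length($seq) - $patLen) {
            $patterns{ substr($seq, $i, $patLen) }++;
        }
    }
    return scalar(keys %patterns) . " distinct patterns";
$BODY$ LANGUAGE plperl;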

Re: [SQL] overload

2011-07-11 Thread Viktor Bojović
Thanks Wayne and Pavel, I will try to make this work with cursors. Theoretically there can be more than 4 billion records in the result, but only 58 million were returned, which took 3 days. Possibly I will have to create a temporary table and store results from %patterns into it after each 500k records, and group t
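A sketch of that staging idea, flushing the in-memory hash to a temporary table every 500k rows and letting SQL do the final grouping (the staging table and function names here are invented for illustration):

CREATE OR REPLACE FUNCTION pattern_counter_staged("patLength" integer)
  RETURNS void AS
$BODY$
    my $patLen = $_[0];
    my (%patterns, $seen);

    spi_exec_query("CREATE TEMP TABLE pattern_stage (pattern text, cnt bigint)");
    my $ins = spi_prepare('INSERT INTO pattern_stage VALUES ($1, $2)',
                          'text', 'bigint');

    # Write out the partial counts and empty the hash so Perl memory
    # stays bounded no matter how many rows have been scanned.
    my $flush = sub {
        spi_exec_prepared($ins, $_, $patterns{$_}) for keys %patterns;
        %patterns = ();
    };

    my $sth = spi_query("select sequence from entry");
    while (defined (my $row = spi_fetchrow($sth))) {
        my $seq = $row->{sequence};
        $patterns{ substr($seq, $_, $patLen) }++ for 0 .. length($seq) - $patLen;
        $flush->() if ++$seen % 500_000 == 0;
    }
    $flush->();
$BODY$ LANGUAGE plperl;

-- Afterwards, combine the partial counts:
-- SELECT pattern, sum(cnt) FROM pattern_stage GROUP BY pattern;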

Re: [SQL] overload

2011-07-08 Thread lists-pgsql
Hi Viktor, I'm not sure what your requirements are in terms of performance and stability of your result set. See Pavel's response. A cursor issues a single query and renders a single result set. The result set is static; the cursor just gives you finer control/performance when retrieving rows
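For reference, the bare SQL shape of that: the cursor's result set is fixed when it is opened, and each FETCH pulls only the next slice (the batch size of 1000 is arbitrary):

BEGIN;
DECLARE entry_cur NO SCROLL CURSOR FOR SELECT sequence FROM entry;
FETCH FORWARD 1000 FROM entry_cur;   -- first 1000 rows
FETCH FORWARD 1000 FROM entry_cur;   -- next 1000, same snapshot
CLOSE entry_cur;
COMMIT;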

Re: [SQL] overload

2011-07-08 Thread Pavel Stehule
Hello, using a "window" implemented via LIMIT OFFSET is not good - it is a workaround on systems where cursors are not available, but it is a bad solution on PostgreSQL. Use a cursor instead - it is significantly more effective, with fewer memory requests. Regards Pavel Stehule 2011/7/8 Viktor Bojov
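To make the contrast concrete, a sketch (the id ordering column is an assumption): every LIMIT/OFFSET step re-executes the query and discards all rows before the offset, so total work grows quadratically with table size, while a cursor performs a single scan:

-- LIMIT/OFFSET paging: each step rescans and throws away OFFSET rows
SELECT sequence FROM entry ORDER BY id LIMIT 500000 OFFSET 0;
SELECT sequence FROM entry ORDER BY id LIMIT 500000 OFFSET 500000;
-- ... later pages get progressively slower

-- Cursor: one scan, rows handed out on demand
BEGIN;
DECLARE c NO SCROLL CURSOR FOR SELECT sequence FROM entry;
FETCH FORWARD 500000 FROM c;   -- repeat until it returns no rows
CLOSE c;
COMMIT;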

Re: [SQL] overload

2011-07-08 Thread Viktor Bojović
Thanks Wayne, in the end I did it that way and it works. The code is below. CREATE FUNCTION pattern_counter1("patLength" integer) RETURNS character varying LANGUAGE plperl AS $_X$ my $rvCnt = spi_exec_query("select count(1) as cnt from entry"); #my $rowCountAll = $rvCnt->{processed}; my $row = $rvC
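The preview suggests a count-then-window loop; a sketch of that shape follows, hedged since most of the body is cut off. Note that this is the LIMIT/OFFSET pattern Pavel advises against above, and without an ORDER BY the windows are not even guaranteed to be disjoint:

CREATE FUNCTION pattern_counter1("patLength" integer)
  RETURNS character varying LANGUAGE plperl AS
$_X$
    my $patLen = $_[0];
    my %patterns;

    my $rvCnt = spi_exec_query("select count(1) as cnt from entry");
    my $rowCountAll = $rvCnt->{rows}[0]->{cnt};

    my $window = 500000;
    for (my $off = 0; $off < $rowCountAll; $off += $window) {
        # Each iteration re-runs the query and skips $off rows.
        my $rv = spi_exec_query(
            "select sequence from entry limit $window offset $off");
        foreach my $row (@{ $rv->{rows} }) {
            my $seq = $row->{sequence};
            $patterns{ substr($seq, $_, $patLen) }++
                for 0 .. length($seq) - $patLen;
        }
    }
    return scalar(keys %patterns) . " distinct patterns";
$_X$;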

Re: [SQL] overload

2011-07-08 Thread lists-pgsql
I have the same situation with large tables. Take a look at using a cursor to fetch several thousand rows at a time. I presume what's happening is that Perl is attempting to create a massive list/array in memory. If you use a cursor, the list should only contain X number of rows, where X in the
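One way to get exactly that batch-of-X behaviour from PL/Perl is to drive an SQL-level cursor through SPI; this sketch assumes DECLARE/FETCH are accepted there (the function already runs inside a transaction, so no explicit BEGIN is needed):

CREATE FUNCTION scan_in_batches("batchSize" integer)
  RETURNS bigint LANGUAGE plperl AS
$BODY$
    my $batch = $_[0];
    my $total = 0;

    spi_exec_query("DECLARE entry_cur NO SCROLL CURSOR FOR SELECT sequence FROM entry");
    while (1) {
        my $rv = spi_exec_query("FETCH FORWARD $batch FROM entry_cur");
        last unless $rv->{processed};      # no rows left
        # Only $batch rows are ever held in Perl memory at once.
        foreach my $row (@{ $rv->{rows} }) {
            # process $row->{sequence} here
        }
        $total += $rv->{processed};
    }
    spi_exec_query("CLOSE entry_cur");
    return $total;
$BODY$;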

[SQL] overload

2011-07-05 Thread Viktor Bojović
Hi, while reading a 20GB table through a PL/Perl function, its memory use grows constantly. I wanted to ask you which is the best way to read the table inside that function without such memory consumption. Thanks in advance. The code is here: CREATE FUNCTION pattern_counter("patLength" integer) RETURNS varchar
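For context, the truncated function likely followed this shape; the key problem is the unrestricted spi_exec_query, which builds the whole result set as one Perl array of hashes before the loop even starts (everything past the preview is an assumption):

CREATE FUNCTION pattern_counter("patLength" integer)
  RETURNS varchar LANGUAGE plperl AS
$BODY$
    my $patLen = $_[0];
    my %patterns;

    # Problem: spi_exec_query materializes EVERY row of the 20GB table
    # in backend memory before returning, hence the constant RAM growth.
    my $rv = spi_exec_query("select sequence from entry");
    foreach my $row (@{ $rv->{rows} }) {
        my $seq = $row->{sequence};
        $patterns{ substr($seq, $_, $patLen) }++
            for 0 .. length($seq) - $patLen;
    }
    return scalar(keys %patterns) . " distinct patterns";
$BODY$;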