Hi,

Using a sub-query over a large table always eats up CPU time and is slow.
The statement in question uses the transactions table twice in the
sub-query, so to process each record it has to go through the same table
twice.
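The statement itself did not survive in this digest, but a query of the
shape being described might look like the sketch below; the debit and
credit column names are an assumption, since the table definition quoted
later in the thread is truncated:

    SELECT t.id,
           coalesce((SELECT sum(x.debit)  FROM transactions x
                      WHERE x.id <= t.id), 0)
         - coalesce((SELECT sum(x.credit) FROM transactions x
                      WHERE x.id <= t.id), 0) AS balance
      FROM transactions t;
    -- Each output row runs two correlated scans of transactions,
    -- so the total work grows quadratically with the row count.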
[EMAIL PROTECTED] ("Muhyiddin A.M Hayat") writes:
>> There is an easy way to do this; write a plpgsql set-returning
>> function which adds the balance as the last column of the table. That
>> query will always have a cost in both time and memory proportional to
>> the size of the table, and the memory cost may bite you as table size
>> grows...
Can you give an example?
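A minimal sketch of such a set-returning function, assuming numeric debit
and credit columns and id as the ordering key (the table definition quoted
below is truncated, so these names are assumptions):

    CREATE TYPE trx_with_balance AS (
        id      integer,
        debit   numeric,
        credit  numeric,
        balance numeric
    );

    CREATE OR REPLACE FUNCTION transactions_with_balance()
    RETURNS SETOF trx_with_balance AS $$
    DECLARE
        rec     RECORD;
        out_row trx_with_balance;
        running numeric := 0;
    BEGIN
        -- One ordered pass over the table; the running total is kept
        -- in a variable instead of being recomputed for every row.
        FOR rec IN SELECT id, debit, credit
                     FROM transactions ORDER BY id LOOP
            running := running
                       + coalesce(rec.debit, 0)
                       - coalesce(rec.credit, 0);
            out_row.id      := rec.id;
            out_row.debit   := rec.debit;
            out_row.credit  := rec.credit;
            out_row.balance := running;
            RETURN NEXT out_row;
        END LOOP;
        RETURN;
    END;
    $$ LANGUAGE plpgsql;

Call it with: SELECT * FROM transactions_with_balance();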
Oops! [EMAIL PROTECTED] ("Muhyiddin A.M Hayat") was seen spray-painting on a
wall:
> everything is OK, but when there are more than 100 records that query
> eats all my CPU and takes a long time; I have waited for 3 minutes
> but the query doesn't finish. (pgsql-8.0-1 running on a dual Xeon 2.8
> and 2 GB of RAM)
I think you forgot a FOREIGN KEY:
transactions.trx_type_id -> trx_type.id
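If that constraint is indeed missing, it could be added like this (the
constraint and index names here are illustrative):

    ALTER TABLE transactions
        ADD CONSTRAINT transactions_trx_type_id_fkey
        FOREIGN KEY (trx_type_id) REFERENCES trx_type (id);

    -- PostgreSQL does not index the referencing column automatically,
    -- so an index on it may also help join performance:
    CREATE INDEX transactions_trx_type_id_idx
        ON transactions (trx_type_id);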
MAMH> Dear All,
MAMH> I have a problem calculating the
MAMH> balance from debit and credit.
MAMH> My transaction table:
...
MAMH> CREATE TABLE "public"."transactions" (
MAMH>   "id" SERIAL,
MAMH>   "trx_timestamptz" TIMESTAMPTZ,
MAMH>   ...