Re: [SQL] column default dependant on another columns value

2008-07-01 Thread Tom Lane
"Fernando Hevia" <[EMAIL PROTECTED]> writes: > Anyway, the rule didn't work. Got "an infinite recursion error" when > inserting on the table. > Can't figure out where the recursion is You didn't show us the rule, but I imagine that you think the WHERE clause is applied while expanding the rule. I

Re: [SQL] column default dependant on another columns value

2008-07-01 Thread Fernando Hevia
> -Original Message-
> From: Richard Broersma [mailto:[EMAIL PROTECTED]
>
> It is possible to do this with a trigger or a rule. A trigger would be more robust.
>
> > Is this correct? Is there another (better/simpler) way to achieve this?
>
> Well it might work, but it is a bad prac

Re: [SQL] column default dependant on another columns value

2008-07-01 Thread Richard Broersma
On Tue, Jul 1, 2008 at 1:12 PM, Fernando Hevia <[EMAIL PROTECTED]> wrote:
> Given a table with columns seconds and minutes, how can I have minutes be
> computed automatically at the insert statement?

It is possible to do this with a trigger or a rule. A trigger would be more robust.

> Is this c

[SQL] column default dependant on another columns value

2008-07-01 Thread Fernando Hevia
Hi list,

Given a table with columns seconds and minutes, how can I have minutes be computed automatically at the insert statement? I tried:

  ALTER TABLE table1 ALTER COLUMN minutes SET DEFAULT (seconds/60);

Postgres' answer was:

  ERROR: cannot use column references in default expression

So
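A default expression may only use constants and functions, not the other columns of the row being inserted, which is what the error is saying. A minimal sketch of the trigger-based approach suggested elsewhere in the thread (table and column names taken from the question; the function and trigger names are made up):

  -- A BEFORE INSERT trigger can read and modify the other columns of NEW,
  -- which a DEFAULT expression cannot.
  CREATE OR REPLACE FUNCTION table1_set_minutes() RETURNS trigger AS $$
  BEGIN
      IF NEW.minutes IS NULL THEN
          NEW.minutes := NEW.seconds / 60;
      END IF;
      RETURN NEW;
  END;
  $$ LANGUAGE plpgsql;

  CREATE TRIGGER set_minutes
      BEFORE INSERT ON table1
      FOR EACH ROW
      EXECUTE PROCEDURE table1_set_minutes();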

Re: [SQL] Need a sample Postgre SQL script

2008-07-01 Thread Richard Huxton
Dhanushka Samarakoon wrote:
> Thanks for the reply. But one problem I have is I need to loop through all the rows in the table and in each iteration I need to fetch the value of mydate in to a variable and split it to month and year and add two rows with *value, 91, month* (2 , 91, Augest) and *val

Re: [SQL] Need a sample Postgre SQL script

2008-07-01 Thread Dhanushka Samarakoon
Thanks for the reply. But one problem I have is I need to loop through all the rows in the table and in each iteration I need to fetch the value of mydate in to a variable and split it to month and year and add two rows with *value, 91, month* (2 , 91, Augest) and *value, 86, year* (2 , 86 , 2009)
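The row-by-row approach described above could look roughly like the sketch below. Assumptions not stated in the thread: the source rows are the ones with field = 16 (as in the sample record), mydate is always formatted as 'Month YYYY', the new rows go into the same metadata table, and the function name is made up.

  -- Walk the source rows and insert a month row (field 91) and a year row
  -- (field 86) derived from the text date.
  CREATE OR REPLACE FUNCTION split_metadata_dates() RETURNS void AS $$
  DECLARE
      r record;
  BEGIN
      FOR r IN SELECT value, mydate FROM metadata WHERE field = 16 LOOP
          -- month part, e.g. (2, 91, 'Augest')
          INSERT INTO metadata (value, field, mydate)
          VALUES (r.value, 91, split_part(r.mydate, ' ', 1));
          -- year part, e.g. (2, 86, '2009')
          INSERT INTO metadata (value, field, mydate)
          VALUES (r.value, 86, split_part(r.mydate, ' ', 2));
      END LOOP;
  END;
  $$ LANGUAGE plpgsql;

  SELECT split_metadata_dates();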

Re: [SQL] Need a sample Postgre SQL script

2008-07-01 Thread Richard Huxton
Dhanushka Samarakoon wrote:
> Hi All, I'm kind of new to Postgre and I need some advice.

No problem. It's PostgreSQL or Postgres by the way.

> I have the following table. metadata (value:integer , field:integer , mydate:text)
> given below is a sample record from that. ( 2 , 16 , Augest 2009)

I

[SQL] Need a sample Postgre SQL script

2008-07-01 Thread Dhanushka Samarakoon
Hi All, I'm kind of new to Postgre and I need some advice. I have the following table.

  metadata (value:integer , field:integer , mydate:text)

Given below is a sample record from that.

  ( 2 , 16 , Augest 2009)

I need a script that will read the above table and for each such row it will insert two
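If an explicit loop is not actually required, the same transformation can be done in two set-based statements. As with the loop sketch above, the field codes 91 and 86 come from the follow-up message, and the split assumes mydate is always 'Month YYYY' separated by a single space.

  -- Month rows (field 91) and year rows (field 86); split_part() cuts the
  -- text date on the space.
  INSERT INTO metadata (value, field, mydate)
  SELECT value, 91, split_part(mydate, ' ', 1) FROM metadata WHERE field = 16;

  INSERT INTO metadata (value, field, mydate)
  SELECT value, 86, split_part(mydate, ' ', 2) FROM metadata WHERE field = 16;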

Re: [SQL] Quick select, slow update - help with performance problems

2008-07-01 Thread Gary Stainburn
On Tuesday 01 July 2008 12:17, Richard Huxton wrote:
> Gary Stainburn wrote:
> > update used_diary set
> >   ud_valet_completed=now(), ud_valet_completed_by=25
> > where ud_valet_completed is null and
> >   ud_valet_required < CURRENT_DATE-'7 days'::interval
> >
> > is still running

Re: [SQL] Quick select, slow update - help with performance problems

2008-07-01 Thread Richard Huxton
Gary Stainburn wrote:
> update used_diary set
>   ud_valet_completed=now(), ud_valet_completed_by=25
> where ud_valet_completed is null and
>   ud_valet_required < CURRENT_DATE-'7 days'::interval
>
> is still running after approx 1 1/2 minutes. I've noticed that other updates also seem to take a long t
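Two things worth trying with an update like this, sketched below under assumptions not stated in the thread (that no index currently covers the WHERE clause and that ud_valet_required is a date or timestamp column): a partial index matching the pending rows, and EXPLAIN ANALYZE to see what the update actually does. Note that EXPLAIN ANALYZE really executes the statement, so wrap it in a transaction you can roll back.

  -- Partial index covering exactly the rows the UPDATE has to find:
  -- not-yet-completed valets, keyed on the required date.
  CREATE INDEX used_diary_valet_pending_idx
      ON used_diary (ud_valet_required)
      WHERE ud_valet_completed IS NULL;

  BEGIN;
  EXPLAIN ANALYZE
  UPDATE used_diary SET
      ud_valet_completed = now(), ud_valet_completed_by = 25
  WHERE ud_valet_completed IS NULL
    AND ud_valet_required < CURRENT_DATE - '7 days'::interval;
  ROLLBACK;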

[SQL] Quick select, slow update - help with performance problems

2008-07-01 Thread Gary Stainburn
Hi folks. My system is slowing down, more notably over the last few weeks. The main reason is that it's on an old slow machine, and I'm in the process of sorting this. However, I think that there are some issues within my database which I need to investigate. There seems to be some performance
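When updates are slow across the board rather than on one query, the usual first checks are table bloat from missing vacuums and stale planner statistics. A small sketch, assuming a reasonably recent PostgreSQL where pg_stat_user_tables exposes row counts (used_diary is just the table from this thread):

  -- Reclaim dead rows and refresh statistics for the table in question.
  VACUUM ANALYZE used_diary;

  -- Dead-row counts per table, worst first.
  SELECT relname, n_live_tup, n_dead_tup
  FROM pg_stat_user_tables
  ORDER BY n_dead_tup DESC;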