Hi Thomas,
v_message is of the composite data type r_log_message and its definition is as
shown below.
postgres=# \d r_log_message;
Composite type "public.r_log_message"
 Column | Type | Collation | Nullable | Default
--------+------+-----------+----------+---------
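(The column rows of the \d output are cut off above. For context, a composite
type like this is normally created with CREATE TYPE; the field names in the
sketch below are only guesses based on the rest of the thread, not the actual
definition.)

CREATE TYPE r_log_message AS (
    column_name varchar(100),    -- hypothetical field names
    oldvalue    varchar(4000),
    newvalue    varchar(4000)
);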
Hi Michael,
Please see the insert_info function below. r_log_message is a composite data
type and its definition is also given below.
CREATE OR REPLACE FUNCTION insert_info(
info_array r_log_message[]
) RETURNS varchar AS $$
DECLARE
info_element r_log_message;
BEGIN
FO
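(The message is cut off at the FOREACH. Based on the declarations above, the
loop would usually look something like the untested sketch below; the target
table audit_info and the composite field names are assumptions, not taken from
the original post.)

CREATE OR REPLACE FUNCTION insert_info(
    info_array r_log_message[]
) RETURNS varchar AS $$
DECLARE
    info_element r_log_message;
BEGIN
    -- Walk the composite-type array and insert one audit row per element.
    FOREACH info_element IN ARRAY info_array LOOP
        INSERT INTO audit_info (column_name, old_value, new_value)  -- hypothetical table
        VALUES (info_element.column_name,
                info_element.oldvalue,
                info_element.newvalue);
    END LOOP;
    RETURN 'OK';
END;
$$ LANGUAGE plpgsql;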
It seems that function has some syntax errors, and it also doesn't do what
you want: I presume the "from employee" bit means you get many rows inserted
into that temp table, one for every existing row, rather than just the one
row you are operating on at the moment the trigger fires.
It is worth n
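(For example, an untested sketch with made-up table and column names: a
row-level trigger can build the audit data from the OLD and NEW records
instead of selecting from the table, so only the row being updated is logged.)

CREATE OR REPLACE FUNCTION employee_audit_trg() RETURNS trigger AS $$
BEGIN
    -- OLD and NEW refer only to the row this trigger invocation is for,
    -- so nothing is read from the employee table itself.
    IF NEW.salary IS DISTINCT FROM OLD.salary THEN
        PERFORM insert_info(
            ARRAY[ROW('salary', OLD.salary::text, NEW.salary::text)::r_log_message]
        );
    END IF;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER employee_audit
    AFTER UPDATE ON employee
    FOR EACH ROW EXECUTE FUNCTION employee_audit_trg();  -- PG 11+ syntax; older versions use EXECUTE PROCEDURE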
Ok. Let me try this. Thanks!!
On Wed, Nov 24, 2021 at 12:01 PM Thomas Kellerer wrote:
> aditya desai wrote on 24.11.2021 at 07:25:
> > Thanks Tom. However I could not find any solution to achieve the given
> requirement. I have to take all values in the temp table and assign it to
> an array v
aditya desai wrote on 24.11.2021 at 07:25:
> Thanks Tom. However I could not find any solution to achieve the given
> requirement. I have to take all values in the temp table and assign it to an
> array variable to pass it to the audit procedure as shown below. Can you
> please advise ?
>
> C
Thanks Tom. However, I could not find any solution that achieves the given
requirement. I have to take all the values in the temp table and assign them
to an array variable to pass to the audit procedure, as shown below. Can you
please advise?
CREATE OR REPLACE FUNCTION call_insert_info(
) RETURNS void
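(The function body is truncated here. The usual way to turn the temp table's
rows into a single array variable is array_agg over a ROW(...) cast to the
composite type; untested sketch below, where the oldValue/newValue column
names are guesses based on the truncated CREATE TEMP TABLE fragment quoted
further down.)

CREATE OR REPLACE FUNCTION call_insert_info() RETURNS void AS $$
DECLARE
    v_message r_log_message[];
BEGIN
    -- Collapse all rows of the temp table into one array of composite values.
    SELECT array_agg(ROW(colName, oldValue, newValue)::r_log_message)
      INTO v_message
      FROM changedinfo;

    PERFORM insert_info(v_message);
END;
$$ LANGUAGE plpgsql;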
aditya desai writes:
> In a trigger function I am creating a temp table. When an update on a
> table is executed for, say, 10k rows, I get the below error.
> ERROR: out of shared memory
> HINT: You might need to increase max_locks_per_transaction
> CONTEXT: SQL statement "create temp table changed
Hi,
In a trigger function I am creating a temp table. When an update on a
table is executed for, say, 10k rows, I get the below error.
ERROR: out of shared memory
HINT: You might need to increase max_locks_per_transaction
CONTEXT: SQL statement "create temp table changedinfo(colName
varchar(100), o
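(Each CREATE TEMP TABLE takes locks that are held until the end of the
transaction, so creating a temp table once per updated row in a 10k-row
update can overflow the shared lock table, whose size is derived from
max_locks_per_transaction. Two common mitigations, sketched below; the column
definitions after colName are guesses since the statement above is truncated.)

-- Option 1: enlarge the shared lock table; requires a server restart.
ALTER SYSTEM SET max_locks_per_transaction = 256;   -- default is 64

-- Option 2: create the temp table only if it is not already there,
-- instead of creating a new one for every row the trigger fires on.
CREATE TEMP TABLE IF NOT EXISTS changedinfo(
    colName  varchar(100),
    oldValue varchar(4000),
    newValue varchar(4000)
);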
On Fri, Nov 12, 2021 at 09:12:38PM +0100, Jiří Fejfar wrote:
> * I know that PG is focused on OLTP rather than analytics, but we are quite
> happy with it overall and do not wish to use another engine for analytical
> queries... is there some "PG analytical best practice" available somewhere?
It's a good qu