one several times. And the table consists of millions of records.

----- Original Message -----
From: Bayliss, Barry
To: Axapta-Knowledge-Village@yahoogroups.com
Sent: Tuesday, January 03, 2006 4:35 AM
Subject: RE: [Axapta-Knowledge-Village] Re: Performance

Hello Maverick,

W
, 26 December 2005 9:10 PM
To: Axapta-Knowledge-Village@yahoogroups.com
Subject: Re: [Axapta-Knowledge-Village] Re: Performance
Hi All,
I am also facing the same problem. I have some code which is
processing thousands of updates on the table. There is only one process doing
all the updates, so nothing to
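For context, the pattern being described above is typically a single X++ transaction wrapped around a row-by-row update loop. The sketch below is a hypothetical illustration of that pattern; the table name `MyLargeTable` and field `Processed` are invented, not from the thread. With this shape, the SQL transaction log entry for every row accumulates until the single commit at the end, which is exactly what can make the process slow down as it runs.

```x++
// Hypothetical sketch of the pattern under discussion: one large
// transaction around thousands of row-by-row updates.
// MyLargeTable and Processed are invented names for illustration.
static void updateAllInOneTransaction(Args _args)
{
    MyLargeTable largeTable;

    ttsbegin;
    while select forupdate largeTable
        where !largeTable.Processed
    {
        largeTable.Processed = NoYes::Yes;
        largeTable.update();
    }
    ttscommit;  // every row's log record is held until this single commit
}
```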
, December 22, 2005 6:30 AM
Subject: RE: [Axapta-Knowledge-Village] Re: Performance
Another thing I want to add, in case it was not taken into consideration: you should put your OS, data file and log file on separate volumes. It is recommended to put the OS on a mirrored volume (2 hard drives in a RAID 1 configuration), the log file on another mirrored volume and the data file
Yes, I was referring to the SQL transaction log.
Can you post your code?
Barry.
-----Original Message-----
From: Axapta-Knowledge-Village@yahoogroups.com
[mailto:[EMAIL PROTECTED] On Behalf Of Stefan Capiau
Sent: Wednesday, 21 December 2005 6:19 PM
To: Axapta-Knowledge-Village@yahoogroups.com
S
Have you checked whether the process is suffering from lock
contention (e.g. by running the sp_who2 database stored procedure from the query
profiler)?
Regards
Malcolm Burtt
From: Axapta-Knowledge-Village@yahoogroups.com
[mailto:[EMAIL PROTECTED] On Behalf Of Stefan Capiau
Sent: 21 De
You are welcome, buddy. Hope you reach a solution soon.

Stefan Capiau <[EMAIL PROTECTED]> wrote:

Thanks for the reply! We are running this process at night, and no other processes are active at that moment. The strange thing is that when we divide the process into batches of a couple of thousand re
Hello Stefan,
I have previously come across a problem with a piece of code running
slower like this. Turns out the problem was related to the size of the
transaction log generated by the process being run.
Have you placed the transaction around the entire batch, or around each
block of the batch?
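The per-block approach Barry is asking about can be sketched as follows. This is a hedged illustration, not code from the thread: the table name `MyLargeTable`, the field `Processed`, and the block size of 1000 are all invented. Note that committing inside an open `forupdate` loop can invalidate the selected-for-update state on some kernel versions, so a robust version would re-select after each commit; the point here is only the transaction scoping.

```x++
// Hypothetical sketch: commit every 1000 records so no single
// transaction grows the SQL transaction log excessively.
// MyLargeTable, Processed, and the block size are invented for illustration.
static void updateInBlocks(Args _args)
{
    MyLargeTable largeTable;
    int          counter;

    ttsbegin;
    while select forupdate largeTable
        where !largeTable.Processed
    {
        largeTable.Processed = NoYes::Yes;
        largeTable.update();
        counter++;

        if (counter mod 1000 == 0)
        {
            ttscommit;  // release locks, allow the log to truncate
            ttsbegin;   // start the next block's transaction
        }
    }
    ttscommit;
}
```

Smaller transactions also hold row and page locks for a shorter time, which reduces the contention other replies in this thread mention.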
Hi Stefan,

There are countless ways in which performance may be improved, but it depends on your specific situation. If you are selecting records for update in a transaction, you will have object (table, page, record) locks for the tables within the transaction. It looks like you are run
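Where the per-record logic permits it, the row-by-row locking described above can be avoided entirely with a set-based update. This is a hedged sketch, not code from the thread; `MyLargeTable` and `Processed` are invented names. `update_recordset` sends a single UPDATE statement to SQL Server instead of thousands of individual row updates, so locks are held briefly and far less client/server round-tripping occurs.

```x++
// Hypothetical sketch: a set-based update as one SQL statement.
// MyLargeTable and Processed are invented names for illustration.
static void updateSetBased(Args _args)
{
    MyLargeTable largeTable;

    ttsbegin;
    update_recordset largeTable
        setting Processed = NoYes::Yes
        where !largeTable.Processed;
    ttscommit;
}
```

One caveat: if the table has overridden `update()` logic, the kernel falls back to record-by-record updates, so this only helps when the set-based path is actually taken.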