Actually, when I run the extract I include validation after selecting the data to kick out bogus records. Unfortunately we have a bunch of records that are duplicates or where the CSR bunked the data, and I don't want those in my data set. The way I extract this now is through a stored proc: I run a cursor to select the data, assign it to variables, and if certain pieces of data are bunk I write the unique id to an error table along with an error message and the bunk value, so I can run a report for the CSR to clean up the data through the application; otherwise I write the data to the application table. It takes about 110 minutes to run, but the data is accurate. The old process that some joker wrote took about 10 hours.
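
For anyone curious, here's a rough T-SQL sketch of that cursor-and-validation pattern. All names (staging_extract, app_table, extract_errors, record_id, csr_code, amount) and the specific validation rules are hypothetical placeholders, not our actual schema:

    DECLARE @record_id INT, @csr_code VARCHAR(20), @amount DECIMAL(12,2);

    DECLARE extract_cur CURSOR FOR
        SELECT record_id, csr_code, amount FROM staging_extract;

    OPEN extract_cur;
    FETCH NEXT FROM extract_cur INTO @record_id, @csr_code, @amount;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        IF EXISTS (SELECT 1 FROM app_table WHERE record_id = @record_id)
            -- Duplicate: log the unique id so the CSR cleanup report picks it up.
            INSERT INTO extract_errors (record_id, error_msg, bad_value)
            VALUES (@record_id, 'Duplicate record', CAST(@record_id AS VARCHAR(32)));
        ELSE IF @csr_code IS NULL OR @amount < 0
            -- Bunk data: log the id, an error message, and the offending value.
            INSERT INTO extract_errors (record_id, error_msg, bad_value)
            VALUES (@record_id, 'Invalid CSR code or amount',
                    COALESCE(@csr_code, CAST(@amount AS VARCHAR(32))));
        ELSE
            -- Clean record: write it to the application table.
            INSERT INTO app_table (record_id, csr_code, amount)
            VALUES (@record_id, @csr_code, @amount);

        FETCH NEXT FROM extract_cur INTO @record_id, @csr_code, @amount;
    END

    CLOSE extract_cur;
    DEALLOCATE extract_cur;

The cursor is what makes the per-row error logging straightforward; the tradeoff is the roughly 110-minute runtime mentioned above.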

-----Original Message-----
From: Joe Rinehart [mailto:[EMAIL PROTECTED]]
Sent: Thursday, July 01, 2004 9:19 AM
To: CF-Talk
Subject: Re: OT: What would you do?

Would there be a way to build a view that is the "crunched" data?  

-joe