We are currently investigating the hash option - it would speed things
up considerably after the first data load.  How long the initial load
takes doesn't really matter in the big picture, but this approach might
help the updates.
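
Something along the lines of what Axton suggests below is what we're
looking at.  Here is a rough sketch of the keying idea (nothing
Remedy-specific; the secret and the example SSN are just placeholders),
assuming an HMAC-SHA-256 so the same SSN always produces the same opaque
key without the SSN itself ever being stored:

import hmac
import hashlib

# Secret kept outside Remedy (e.g. in the integration engine's config);
# without it, keys can't be rebuilt by hashing guessed SSNs.
SECRET_KEY = b"replace-with-a-long-random-secret"

def person_key(ssn):
    """Return a stable, opaque identifier derived from the SSN."""
    normalized = ssn.replace("-", "").strip()
    return hmac.new(SECRET_KEY, normalized.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# The same SSN coming from payroll and the other two feeds yields the
# same key, so it can relate staging rows to existing CTM:People records.
print(person_key("123-45-6789"))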

The update schedule is interesting because this data comes out of a
payroll system (and two others).  The data changes quite often - over a
two-week update cycle it is updated on 12 of the 14 days.

Unfortunately, hardware doesn't really seem to be the issue.  We are
using high-end servers here and the processing time is the same under
most circumstances - there is simply SO much validation going on that it
takes forever.  The server itself is not taxed while these imports are
running, which is kind of odd.
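
To put rough numbers on the "takes forever" part (simple arithmetic,
using the 1.5 seconds/record figure from my original post below):

records_per_week = 1000000
seconds_per_record = 1.5               # observed push time to CTM:People
seconds_in_week = 7 * 24 * 60 * 60     # 604,800

total = records_per_week * seconds_per_record   # 1,500,000 seconds
print(total / seconds_in_week)                  # ~2.5 weeks of work per weekly load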

There are some options, like moving to 7.1 and multi-threading the
import with separate escalations, but I have no way to test that at this
time.
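
If the multi-threaded route does pan out, the basic shape would be
something like the sketch below (purely illustrative - process_chunk is
a stand-in for whatever actually pushes a range of staging records, and
in practice the partitioning would be done by the escalations rather
than in Python):

from concurrent.futures import ThreadPoolExecutor

def process_chunk(start, end):
    # Placeholder: push staging records in [start, end) to CTM:People.
    # In the real system this is the filter/escalation work.
    pass

CHUNK_SIZE = 50000
TOTAL_RECORDS = 1000000

with ThreadPoolExecutor(max_workers=8) as pool:
    for start in range(0, TOTAL_RECORDS, CHUNK_SIZE):
        pool.submit(process_chunk, start, start + CHUNK_SIZE)

# With 8 independent workers and no shared bottleneck, wall-clock time
# drops roughly 8x - and the server not being taxed today suggests there
# is headroom for that kind of parallelism.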

I'm also contacting the people I know at BMC for advice - I'll post
whatever route we eventually go with.

Thanks for all the advice so far - keep it coming!

-----Original Message-----
From: Action Request System discussion list (ARSList)
[mailto:[EMAIL PROTECTED]] On Behalf Of Axton
Sent: Monday, October 15, 2007 12:57 PM
To: arslist@ARSLIST.ORG
Subject: Re: CTM:People large data imports

Why not use a one-way hash of the SSN as the key between Remedy and the
external file?  Your policy makers should feel comfortable with that
approach, and it gives you what you need: a way to relate the data from
the external source to the Remedy repository without storing sensitive
information.

http://en.wikipedia.org/wiki/Cryptographic_hash_function

Axton Grams

On 10/15/07, William Rentfrow <[EMAIL PROTECTED]> wrote:
>
> On my current project we will have to import about 1,000,000
> CTM:People records every week (give or take).  These will be "new"
> records every time because the only unique identifier we have from the
> external systems is an identifier like SSN that we can't put into our
> system (that's a policy decision).  We're checking on ways to do
> updates but it's still messy due to the item below.
>
> The actual import is going to be automatic and should follow this process:
>
> 1.) Escalation deletes all relevant CTM:People records
> 2.) File is automatically put on the Remedy server with people information
> 3.) Escalation fires a Run Process that imports the data into a staging form
> 4.) Escalation fires and sets a flag field to "Process Now" on all records in the staging form
> 5.) Filter on "modify" on the staging form pushes to CTM:People
>
> Here's the problem - we're using decent hardware running on Solaris 9,
> etc. - yet the push to CTM:People takes 1.5 seconds per record due to
> all of the validation that takes place.
>
> I have to do 1,000,000 records a week - and there are only 604,800
> seconds in a week... Also, it appears using direct SQL loads won't
> work due to all of the back-end set fields (I can get around all of
> those except the GUID, which could be set by another escalation...
> and then we are back at this same problem).
>
> If you've done large data imports to CTM:People I'd really like to 
> know how you went about loading the data efficiently.
>
> William Rentfrow, Principal Consultant
> [EMAIL PROTECTED]
> C 701-306-6157  O 952-432-0227

_______________________________________________________________________________
UNSUBSCRIBE or access ARSlist Archives at www.arslist.org
ARSlist: "Where the Answers Are"

