We did this at one client I was working with, whose name will be withheld
for various reasons.  We had all the data in another database but, like
you, we ran into the problem of not being able to load it all in a
timely manner.

So we used ARSperl to create and run multiple Perl scripts that would
each take a segment of the data and push it into Remedy.  We had Remedy
configured with multiple instances connecting to the same database, and
we had some major hardware in place.  I think the most important part
for you right now will be the multiple instances of ARS, which will in
effect make escalations multi-threaded.  The only catch is that I don't
know whether you can set up multiple instances of ARS on demo licenses
or whether you have to pay for them, because that can get expensive.
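
To give you an idea of the shape of it, here is a rough sketch of that
kind of segmented, parallel load.  This is from memory rather than our
actual scripts - the server names, login, staging form name, field IDs,
and the pipe-delimited record layout are all made up for illustration,
so substitute your own:

    #!/usr/bin/perl
    # Sketch only: fork one worker per AR Server instance and push one
    # segment of the input file into a staging form through ARSperl.
    use strict;
    use warnings;
    use ARS;

    my @servers  = ('remedy1', 'remedy2', 'remedy3'); # one instance per worker
    my $workers  = scalar @servers;
    my @segments = split_file('people.dat', $workers);

    for my $i (0 .. $workers - 1) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        next if $pid;                                 # parent keeps forking

        # Child: load one segment into one server instance.
        my $ctrl = ars_Login($servers[$i], 'Demo', 'password')
            or die "login failed: $ars_errstr";
        open my $in, '<', $segments[$i] or die $!;
        while (my $line = <$in>) {
            chomp $line;
            my ($first, $last) = split /\|/, $line;    # made-up record layout
            ars_CreateEntry($ctrl, 'CTM:People:Staging', # made-up form name
                            700000001, $first,           # made-up field IDs
                            700000002, $last)
                or warn "create failed: $ars_errstr";
        }
        close $in;
        ars_Logoff($ctrl);
        exit 0;
    }
    wait() for 1 .. $workers;                         # reap the children

    # Round-robin the input records into $n segment files.
    sub split_file {
        my ($path, $n) = @_;
        my @names = map { "$path.seg$_" } 0 .. $n - 1;
        my @fhs;
        for my $name (@names) {
            open my $fh, '>', $name or die $!;
            push @fhs, $fh;
        }
        open my $in, '<', $path or die $!;
        my $i = 0;
        while (my $line = <$in>) {
            print { $fhs[$i++ % $n] } $line;
        }
        close $_ for $in, @fhs;
        return @names;
    }

Each child gets its own ARS login, so the pushes run in parallel across
the server instances instead of queuing behind a single connection.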

Shawn Pierson

        -----Original Message-----
        From: Action Request System discussion list (ARSList)
[mailto:[EMAIL PROTECTED]] On Behalf Of William Rentfrow
        Sent: Monday, October 15, 2007 12:35 PM
        To: arslist@ARSLIST.ORG
        Subject: CTM:People large data imports


        On my current project we will have to import about 1,000,000
CTM:People records every week (give or take).  These will be "new"
records every time, because the only unique identifier we get from the
external systems is something like an SSN, which we can't store in our
system (that's a policy decision).  We're checking on ways to do updates
instead, but it's still messy due to the throughput problem below.

        The actual import is going to be automatic and should follow
this process:

        1.) Escalation deletes all relevant CTM:People records
        2.) File with people information is automatically put on the Remedy server
        3.) Escalation fires a Run Process that imports the data into a staging form
        4.) Escalation fires and sets a flag field to "Process Now" on all records in the staging form (sketched in ARSperl after this list)
        5.) Filter on "modify" on the staging form pushes to CTM:People
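
        Roughly, step 4 amounts to the following (sketched here in
ARSperl rather than an escalation; the staging form name, 'Status'
field, and flag field ID are placeholders, not real CTM:People
definitions):

        use strict;
        use warnings;
        use ARS;

        my $ctrl = ars_Login('remedy-server', 'Demo', 'password')
            or die "login failed: $ars_errstr";

        # Find every staging record that has not been flagged yet.
        my $qual = ars_LoadQualifier($ctrl, 'CTM:People:Staging',
                                     qq{'Status' != "Process Now"})
            or die "bad qualifier: $ars_errstr";

        # maxRetrieve of 0 means "no limit" in the AR System API.
        my %entries = ars_GetListEntry($ctrl, 'CTM:People:Staging', $qual, 0);

        # Setting the flag fires the "modify" filter in step 5.
        for my $id (keys %entries) {
            ars_SetEntry($ctrl, 'CTM:People:Staging', $id, 0,
                         700000003, 'Process Now')   # placeholder field ID
                or warn "set failed on $id: $ars_errstr";
        }
        ars_Logoff($ctrl);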

        Here's the problem - we're using decent hardware running on
Solaris 9, etc. - yet the push to CTM:People takes 1.5 seconds per
record due to all of the validation that takes place.

        I have to do 1,000,000 records a week, and there are only
604,800 seconds in a week.  Also, it appears that direct SQL loads won't
work because of all the back-end set fields.  I can get around all of
those except the GUID, which could be set by another escalation... and
then we are back at the same problem.
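
        To spell out the math: 1,000,000 records x 1.5 seconds each is
1,500,000 seconds of serial processing against 604,800 seconds in a
week.  A single stream tops out around 403,000 records per week, so it
would take at least three parallel streams just to keep pace.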

        If you've done large data imports to CTM:People, I'd really like
to know how you went about loading the data efficiently.

        William Rentfrow, Principal Consultant
        [EMAIL PROTECTED]
        C 701-306-6157
        O 952-432-0227




