This is a 'rest-of-the-story' addendum to Christopher Strauss' answer below:



- Data for the items mentioned were extracted by pertinent reports (from the user
  client, via *.csv), then loaded into BMC FDMT spreadsheets

- Some normalization/'vetting' was also performed during this process, with
  spreadsheet columns providing good reminders as to 'required' vs. 'optional'
  fields

- Company, Location, and People Locations were loaded first, staged, then
  promoted, basically in one pass--then contents were re-checked for proper laydown

- Support Group, Business Time, and Business Holidays were loaded next, with a
  similar re-check after importation/promotion

- Operational/Product Catalogs were loaded in a third pass--and re-checked

- Support Staff were then loaded, declared as support staff--but without allocation
  to support groups or roles (that would have required pre-declaration of a specific
  template, counter-productive in our environment--ca. 270 support staff filling over
  100 combinations of roles/groups, so 'template creation' would double the effort)

- Support staff were associated to Support Groups, roles, and application
  permissions by hand--and assignment/ownership rules were built by hand as well.
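The 'vetting' mentioned in the second bullet can be sketched in code. This is a minimal, hypothetical example--the column names below are invented for illustration and the real required/optional split varies by FDMT spreadsheet tab and version--but it shows the kind of required-field check the spreadsheet columns prompt you to do before staging:

```python
import csv
import io

# Hypothetical required/optional split for a People load; the actual FDMT
# workbook columns differ by tab and version -- check your own spreadsheets.
REQUIRED = {"Last Name", "First Name", "Company", "Client Type"}
OPTIONAL = {"Phone Number", "Site", "Department"}

def vet_rows(csv_text):
    """Return (good_rows, problems): rows passing a minimal required-field
    check, and (line_number, missing_fields) pairs for rows that fail."""
    good, problems = [], []
    reader = csv.DictReader(io.StringIO(csv_text))
    for lineno, row in enumerate(reader, start=2):  # header is line 1
        missing = [f for f in REQUIRED if not (row.get(f) or "").strip()]
        if missing:
            problems.append((lineno, missing))
        else:
            good.append(row)
    return good, problems
```

Rows flagged in `problems` get fixed in the spreadsheet before conversion to *.csv, rather than discovered after a failed promotion.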



Chris Strauss' observation is most appropriate--the best advantage over by-hand
loading is replicability in case of database rollback, at least for our data
profile.  Batch-loading probably does save some time over repetitive single
entries, although quantification may be difficult.  The spreadsheets do provide
useful 'scripts' for data normalization/fusion--a process which needs to happen
for proper ITSM feeding, whether many organizations will admit it or not.  I
envision that the penalties for NOT performing careful normalization/fusion
increase greatly if the CMDB is utilized--'unwashed' data makes reconciliation
much more difficult.
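For a concrete (and deliberately tiny) picture of the 'washing' meant here: mapping variant spellings of the same value onto one canonical form before load, so CMDB reconciliation sees one company rather than three. The alias table below is invented for illustration; a real one comes from inspecting your own extracts:

```python
# Hypothetical alias table mapping observed variants to the canonical value;
# a production table would be built from inspecting the actual report output.
ALIASES = {
    "unt": "University of North Texas",
    "univ. of north texas": "University of North Texas",
    "university of north texas": "University of North Texas",
}

def normalize_company(raw):
    """Collapse whitespace, lowercase for lookup, and map known variants
    onto the canonical spelling; unknown values pass through trimmed."""
    key = " ".join(raw.split()).lower()
    return ALIASES.get(key, raw.strip())
```

Unrecognized values pass through unchanged, which is itself useful--anything surviving untranslated is a candidate for a new alias entry or a data-quality conversation.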



Rick, we did not succeed in executing the Effective Datalink tool through all
its paces--it would not run from a client workstation, at least in our
environment--but I have successfully used the BMC tool for post-initial-run
addenda.  The file handling is roughly equivalent for both environments, given
that one must follow approximately the same path with both tools (again, in our
environment):

--prepare spreadsheets on the client desktop, with conversion to *.csv files for
arimportcmd

--transfer the *.csv files to the server, so that arimportcmd.exe can find them

--execute the arimportcmd-based batch file, importing data to the staging forms

--examine the data in the staging forms

--validate/promote when satisfied with content.
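The batch-file step above might look roughly like the loop below. This is a dry-run sketch only: the server name, account, and especially the arimportcmd switches (-x server, -u user, -i mapping, -d data) are placeholders not verified against any particular AR System release--consult arimportcmd's own help on your server--so the script just prints the commands it would run, assuming one saved import mapping (.arm) per staging form:

```shell
#!/bin/sh
# Dry-run sketch of the per-file import loop described above.
# SERVER, AR_USER, and all arimportcmd switches are placeholders.
SERVER=ars-dev
AR_USER=loader

# Stand-in staging directory with sample files so the sketch is runnable;
# in practice this is wherever the *.csv files were transferred on the server.
STAGE_DIR=$(mktemp -d)
touch "$STAGE_DIR/company.csv" "$STAGE_DIR/support_group.csv"

CMDS=""
for f in "$STAGE_DIR"/*.csv; do
    base=$(basename "$f" .csv)
    # One saved import mapping (.arm) per staging form is assumed.
    cmd="arimportcmd -x $SERVER -u $AR_USER -i $base.arm -d $f"
    CMDS="$CMDS$cmd
"
    echo "$cmd"
done
```

Echoing the commands first (and only removing the echo once the output looks right) preserves the checkpoint-heavy character of the process described above.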



Quick and easy?  Not by any stretch.  But the process is replicable, with
numerous checkpoints.

Don W. McClure, P.E.
Data Administrator & System Engineer
University of North Texas Computing & IT Center
dwmac_at_unt.edu

From: Action Request System discussion list(ARSList) [mailto:[EMAIL PROTECTED] 
On Behalf Of Rick Cook
Sent: Monday, March 10, 2008 1:33 PM
To: arslist@ARSLIST.ORG
Subject: Re: Question: ITSM 7 (patch 6) - Migration

** Chris, you know Don better than I do, but I thought that Don preferred to 
use the Effective Datalink tool over the BMC one, especially for maintenance 
data updates.  Have either of you found it useful for migration purposes vs. 
other means?

Rick
On Mon, Mar 10, 2008 at 8:18 AM, T. Dee <[EMAIL PROTECTED]<mailto:[EMAIL 
PROTECTED]>> wrote:
Thanks!

On 3/10/08, strauss <[EMAIL PROTECTED]<mailto:[EMAIL PROTECTED]>> wrote:
> Don McClure can answer this in _much_ more detail, but he was able to
> extract some of the foundation data from development, insert it in the
> spreadsheets for loading via the Foundation Data Management Tool, and
> get it validated and promoted.  I know we used it for loading Company,
> People Organization, Location, Support Group, Business Time, and
> Operational and Product Catalogs. There are so many problems with that
> particular tool, however, that you may find that it is more trouble than
> it is worth.  We found it useful in that if we did not like how the data
> laid down we just restored the db to the previous snapshot, adjusted the
> data, and tried again.
>
> Christopher Strauss, Ph.D.
> Call Tracking Administration Manager
> University of North Texas Computing & IT Center
> http://itsm.unt.edu/
>
> > -----Original Message-----
> > From: Action Request System discussion list(ARSList)
> > [mailto:arslist@ARSLIST.ORG<mailto:arslist@ARSLIST.ORG>] On Behalf Of T. Dee
> > Sent: Monday, March 10, 2008 10:04 AM
> > To: arslist@ARSLIST.ORG<mailto:arslist@ARSLIST.ORG>
> > Subject: Question: ITSM 7 (patch 6) - Migration
> >
> > I'm curious as to how others are migrating their foundation
> > data / setup data for ITSM 7 from their Development Server to
> > their Production Server.
> >
> > Thanks!
> >
> > ______________________________________________________________
> > _________________
> > UNSUBSCRIBE or access ARSlist Archives at 
> > www.arslist.org<http://www.arslist.org>
> > Platinum Sponsor: www.rmsportal.com<http://www.rmsportal.com> ARSlist: 
> > "Where the Answers Are"
> >
> >
>
>

