"John S. Gage" wrote:
> My problem is that I'm twice as old as everyone else.
Well, from one to another ...
When you set up a new database (open source, of course), one that will
ream out those chunks of technicolour stuff that managers, policy-makers
and other interfering know-nothing types want on their desks -
are there benchmarks for the rate of transfer of data from the legacy
system into the new one?
I'm only asking because the whiteboard markers are out again, and people
are drawing squares and circles and connecting them up with arrows. Last
time, the big one in the middle was labelled 'Cloverleaf', but we are
sticking to vague generics now. The 'management workstation' that moved
data from virtual M tables through ODBC links to MS-Access on a PC is
probably still labouring away in the back room, having produced not one
useful extract.
Australia's peak govt health info agency has just let out a tender for
consultancy services to assist in delivering information technology
enterprise solutions. (No need to worry, this was in the newspaper, so
no breach of the Crimes Act 1914 in discussing it here, ... I hope.)
That's my question - having assembled your low-end servers running
Linux, how long does it take to populate a modern database with the data
from the old COBOL mainframe, making it available for massaging on the
NT4 desktops?
[I see this sort of question as a critical part of the ugly but
necessary 'business case' approach, John.]
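For the business-case arithmetic, here is a back-of-envelope sketch of the migration-time question. All the figures are hypothetical placeholders - record count, record size, and sustained load rate would have to come from the actual mainframe extract and whatever bulk-load tool the new database offers - but the shape of the calculation is the same regardless.

```python
def migration_hours(record_count, bytes_per_record, load_rate_mb_per_sec):
    """Rough estimate of bulk-load time for a legacy data migration.

    All inputs are assumptions to be replaced with measured figures:
    - record_count: rows in the legacy (e.g. COBOL flat-file) extract
    - bytes_per_record: average record size after conversion
    - load_rate_mb_per_sec: sustained bulk-load throughput of the
      target database on the chosen hardware
    """
    total_mb = record_count * bytes_per_record / 1_000_000
    seconds = total_mb / load_rate_mb_per_sec
    return seconds / 3600

# Hypothetical example: 100 million records of 500 bytes each (50 GB),
# loaded at a sustained 5 MB/s on a low-end server.
hours = migration_hours(100_000_000, 500, 5)
print(f"{hours:.1f} hours")  # roughly 2.8 hours of pure load time
```

Note this only covers the raw load; in practice the extract, transform, and validation passes on the mainframe side usually dominate, so any benchmark quoted for the load alone should be treated as a lower bound.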
--
Trevor Kerr
Southern Health Care Network Pathology
Melbourne Australia