Title: RE: best case scenarios for export/import

2 x (IBM p690, 16 CPU, 16 GB RAM, Oracle 9.2.0.1). The export file, taken from 8.1.6.1, is 34 GB. The import runs as 12 parallel streams (6 on each machine) with no constraints, custom parfiles (each stream imports only certain schemas), a buffer of either 8M or 16M, 12 GB of undo on each instance, and all tablespaces LMT.
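For illustration, one of the per-stream parfiles might have looked roughly like this. It's only a sketch: the schema names, file name, paths, and credentials are hypothetical, and ignore=y is my assumption (needed when the objects are pre-created); only the buffer size, indexes=n, and constraints=n come from the description above.

# stream01.par -- hypothetical parfile for one of the 12 streams
userid=system/manager
file=/exp/full_8161.dmp
log=/logs/stream01.log
fromuser=schema_a,schema_b
touser=schema_a,schema_b
buffer=16777216
ignore=y
indexes=n
constraints=n

Each stream is then just "imp parfile=stream01.par", with all twelve running concurrently (six per machine).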

When testing the migration plan, I would do the following (everything below was scripted):
1. Drop the database
2. Create the database; total datafiles are about 140 GB.
3. Pre-create all objects with no data and no indexes (a sketch of this step follows the list).
4. Run all imports
5. Validate everything is okay.
6. Trash everything and start all over again.
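
Step 3, for instance, can be done with imp itself. A minimal sketch, assuming the definitions are pulled straight from the export dump (credentials and paths are placeholders):

# Pre-create object definitions only: no rows, no indexes, no constraints
imp system/manager file=/exp/full_8161.dmp full=y \
    rows=n indexes=n constraints=n log=/logs/precreate.log

rows=n brings in the object definitions without any data, which is what lets the later data-only streams run with ignore=y against the pre-created tables.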

During the last week (Mon-Fri) I tested the whole process at least twice a day, proofing the scripts, always looking for UNKNOWN PROBLEMS and always scanning for KNOWN PROBLEMS.

Total time: 1 hour and 30 minutes for the import, using pre-created objects (no indexes).

This is all from memory; we migrated on Oct 12 and had a good-sized window. AIX created a problem with a shared disk while we were in the middle of the import, so we trashed everything we had done and restarted. The databases were up and running, fully verified, completely analyzed, with the necessary indexes (interMedia etc.) rebuilt, and running Active/Active on RAC one minute before user testing was supposed to start.

Raj
______________________________________________________
Rajendra Jamadagni              MIS, ESPN Inc.
Rajendra dot Jamadagni at ESPN dot com
Any opinion expressed here is personal and doesn't reflect that of ESPN Inc.
QOTD: Any clod can have facts, but having an opinion is an art!


-----Original Message-----
From: Magaliff, Bill [mailto:[EMAIL PROTECTED]]
Sent: Friday, December 20, 2002 9:04 AM
To: Multiple recipients of list ORACLE-L
Subject: best case scenarios for export/import


Good day, all:

I'm looking for real-life best-case scenarios for running import/export . . . I've been playing with this for quite some time and would like to know how "fast" I can really expect this to go, particularly for the import.

I'd be interested to hear others' experiences - how fast have you been able
to import data? What parameters have you used? Etc. It's both for
informational purposes and as a sanity check.

For example: I'm now trying to import a dump file of approx 6.5 GB; the contents break
down into 12 GB of data and 4 GB of indexes.
I'm using the following params on the first import to get just the data (I then
rerun with the indexfile param to pull out the index DDL):
recordlength=65535
buffer=15000000 (15M)
commit=y
indexes=n
constraints=n
grants=n

This imports in approx 36 hours using a single 3 GB rollback segment.

What kind of experiences have you had? 

Thanks
bill
