Hello,
I have several large Oracle tables (2+ million records each) from which I
need to import data into Derby. Here is what I have tried so far:
1. I have dumped the data to comma separated flat files.
2. Used the import table utility like this:
CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE
(null,'SSJ_CNT','SSJ_CNT.csv',null, null,null,0);
3. After about 4 hours of running, the import appears to have frozen.
Both a db.lck and a dbex.lck file are present, which I will delete soon.
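One variation I have not actually tried yet (so treat this as an untested
guess based on my reading of the SYSCS_UTIL documentation) is running the
import in replace mode, i.e. with the last argument set to 1 instead of 0.
My understanding is that a replace-mode import into the table lets Derby
take a faster bulk-load path rather than inserting row by row into
existing data:

```sql
-- Untested variation: replace-mode import (last argument 1 instead of 0).
-- Note this deletes any rows already in SSJ_CNT before loading the file.
CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE
  (null, 'SSJ_CNT', 'SSJ_CNT.csv', null, null, null, 1);
```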
Do you think 2 million records is simply too much for this utility to
handle? Is there a better way to transfer data from another database
than the IMPORT_TABLE utility?
Thank you for any ideas or experiences you have had with importing data
from another DB.
Derek Sedillo
SWAFS DBA / Software Engineer
Northrop Grumman Missions Systems
Tel: (719) 570-8256
Import data from large Oracle table
Sedillo, Derek \(Mission Systems\) Wed, 14 Mar 2007 13:19:27 -0800
