Hi,
On Sunday, 27 April 2008, Greg Smith wrote:
than SQL*PLUS. Then on the PostgreSQL side, you could run multiple COPY
sessions importing at once to read this data all back in, because COPY
will bottleneck at the CPU level before the disks will if you've got
reasonable storage hardware.
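Greg's multiple-COPY idea can be sketched as a shell script. Everything here is an assumption for illustration (the table name bigtable, the database name target, the chunk size); the psql commands are written into a script for review rather than executed, so the sketch runs without a live server:

```shell
#!/bin/sh
# Stand-in for the Oracle CSV dump (hypothetical data).
seq 1 9 | awk '{print $1",row"$1}' > bigtable.csv

# Split the dump so each chunk can feed its own COPY session.
split -l 3 bigtable.csv chunk_

# Emit one backgrounded psql COPY per chunk into a runnable script;
# review it, then run "sh load_parallel.sh" against a real database.
for f in chunk_*; do
  printf "psql -d target -c \"\\\\copy bigtable FROM '%s' CSV\" &\n" "$f"
done > load_parallel.sh
echo "wait" >> load_parallel.sh
```

In practice you would pick a chunk count close to the number of CPU cores, since each COPY session is CPU-bound as Greg describes.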
Adonias Malosso wrote:
Hi All,
I'd like to know what's the best practice to LOAD a 70 million row, 101
column table from ORACLE to PGSQL.
The current approach is to dump the data in CSV and then COPY it to
PostgreSQL.
Uhm. 101 columns you say? Sounds interesting. There are dataloaders
Jonah,
Thank you for the answer. Good to know about this enterprise DB feature.
I'll proceed using pgloader.
Regards.
Adonias Malosso
On Sat, Apr 26, 2008 at 10:14 PM, Jonah H. Harris [EMAIL PROTECTED]
wrote:
On Sat, Apr 26, 2008 at 9:25 AM, Adonias Malosso [EMAIL PROTECTED]
wrote:
I'd
On Mon, Apr 28, 2008 at 5:37 PM, Adonias Malosso [EMAIL PROTECTED] wrote:
Thank you for the answer. Good to know about this enterprise DB feature.
No problem.
I'll proceed using pgloader.
That's fine. Though, I'd really suggest pg_bulkload; it's quite a bit faster.
--
Jonah H. Harris, Sr.
On Sat, 26 Apr 2008, Adonias Malosso wrote:
The current approach is to dump the data in CSV and then COPY it to
PostgreSQL.
You would have to comment on what you don't like about what you're doing
now, and which parts you'd prioritize improving, to get a properly
targeted answer.
Hi All,
I'd like to know what's the best practice to LOAD a 70 million row, 101
column table from ORACLE to PGSQL.
The current approach is to dump the data in CSV and then COPY it to
PostgreSQL.
Does anyone have a better idea?
Regards
Adonias Malosso
Adonias Malosso wrote:
Hi All,
I'd like to know what's the best practice to LOAD a 70 million row, 101
column table from ORACLE to PGSQL.
The current approach is to dump the data in CSV and then COPY it to
PostgreSQL.
Does anyone have a better idea?
Write a Java trigger in Oracle that notes
But how do we link the Oracle trigger to a Postgres trigger?
I mean:
the Oracle trigger will take note of what has been changed,
but then how do we pass those changes to the Postgres trigger?
Can you suggest any logic or algorithm?
Regards,
Srikanth k Potluri
+63 9177444783(philippines)
On Sat
Potluri Srikanth wrote:
But how do we link the Oracle trigger to a Postgres trigger?
I mean:
the Oracle trigger will take note of what has been changed,
but then how do we pass those changes to the Postgres trigger?
I am assuming you can use the Java trigger from Oracle to load the
PostgreSQL JDBC
Yep, just do something like this within sqlplus (from
http://www.dbforums.com/showthread.php?t=350614):
-- suppress terminal output, headers, and pagination so only data is spooled
set termout off
set hea off
set pagesize 0
-- write everything that follows to the file
spool c:\whatever.csv
-- concatenate the columns into one comma-separated line per row
select a.a||','||a.b||','||a.c
from a
where a.a=whatever;
spool off
COPY is the fastest approach to get it into PG.
- Luke
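The COPY step Luke mentions can be sketched with psql's client-side \copy. The database name "target" and a pre-existing table a(a, b, c) matching the SELECT in the spool script are assumptions; the command is printed rather than run, so no server is needed to try the sketch:

```shell
#!/bin/sh
# Print the command that would bulk-load the spooled file; pipe the
# output to sh to actually execute it. "target" and table a(a, b, c)
# are hypothetical names standing in for the real schema.
load_cmd() {
  printf "psql -d target -c \"\\\\copy a (a, b, c) FROM 'whatever.csv' CSV\"\n"
}
load_cmd
```

\copy streams the file through the client connection, so it works even when the CSV sits on a machine without server-side file access.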
Joshua D. Drake wrote:
Potluri Srikanth wrote:
But how do we link the Oracle trigger to a Postgres trigger?
I mean:
the Oracle trigger will take note of what has been changed,
but then how do we pass those changes to the Postgres trigger?
I am assuming you can use the Java trigger from Oracle to load
On Sat, Apr 26, 2008 at 9:25 AM, Adonias Malosso [EMAIL PROTECTED] wrote:
I'd like to know what's the best practice to LOAD a 70 million row, 101
column table from ORACLE to PGSQL.
The fastest and easiest method would be to dump the data from Oracle
into CSV/delimited format using something