Hi William:

On Thu, Jul 24, 2014 at 9:04 PM, William Nolf <bn...@xceleratesolutions.com>
wrote:

> We have a postgres database that was used for an application we no longer
> use.  However, we would like to copy/dump the tables to files, text or CSV,
> so we can post them to SharePoint.
>

....

How BIG is your public schema? As this is a one-shot, I would recommend
first doing a custom-format backup and then working from it (you can make
an SQL-format backup from a custom one with pg_restore).
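
Something along these lines (database and file names are made up, adjust to
yours):

    pg_dump -Fc -f mydb.dump mydb                       # custom-format backup
    pg_restore --data-only -f mydb_data.sql mydb.dump   # plain SQL, data only, from the custom dump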

THEN I would suggest writing a mini-script (I'm partial to Perl for that,
but that is down to 20+ years of hacking with it) through which you pipe a
data-only SQL backup. You just wait for a 'copy
table(comma,separated,field,names) from stdin' line, open the file
table.whatever, write a header line if needed, then read, transform and
write lines until the '\.' end-of-data marker and close the file; repeat
until the input is exhausted. Something like the following Perl pseudo-code:

BEWARE, UNTESTED CODE FOR EXAMPLE PURPOSES:

use strict;
use warnings;

OUTER: # Loop around all the backup.
while (<>) {
    # pg_dump data sections start with: COPY public.table_name (col1, col2) FROM stdin;
    if (my ($table, $fields) = /^COPY\s+(\S+)\s*\((.*?)\)\s+FROM\s+stdin;$/i) {
        my $fname = get_filename($table);       # Dequote, strip schema, add extension.
        open(my $fh, '>', $fname) or die "open problem: $!";
        print $fh generate_header($fields);     # If needed.
        INNER: # Loop around a single table's data.
        while (<>) {
            last INNER if /^\\\.$/;             # '\.' is the end-of-data marker.
            print $fh transform($_);            # Chomp, split, quote, etc.
        }
        close($fh) or die "close problem: $!";  # Disk full, ...
    }
}
# Yeah, I know print should be tested too.

# Minimal stand-ins for the helpers, just to make this runnable:
sub get_filename    { my ($t) = @_; $t =~ s/^.*\.//; $t =~ s/"//g; return "$t.csv"; }
sub generate_header { my ($f) = @_; $f =~ s/"//g; $f =~ s/\s*,\s*/,/g; return "$f\n"; }
sub transform       { my ($l) = @_; chomp $l; return join(',', split /\t/, $l, -1) . "\n"; } # No CSV quoting, NULLs stay as \N.

Something like this should give you a file for each table, as fast as your
database can send a backup or your machine can run pg_restore.
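
To give an idea of the pipe I mean (script and database names are invented):

    pg_restore --data-only mydb.dump | perl split_tables.pl

or, going straight from the live database,

    pg_dump --data-only mydb | perl split_tables.pl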

Regards.
   Francisco Olarte.
