Thanks so much. I was using Bullzip, but I felt it is only good for smaller
amounts of data.  I have 2 tables, each with 2.5 million records.  For me it
is taking forever; the job that I set up has been running for 12 hours.

I would appreciate it if you could share the VBA code you mentioned. I
would love to use it to make the data transfer faster.

Regards



On Thu, Jun 2, 2011 at 9:32 AM, Thomas Harold <thomas-li...@nybeta.com> wrote:

> On 5/25/2011 3:42 PM, akp geek wrote:
>
>> Dear all -
>>
>>             I would like to know if anyone has migrated a database from
>> MS Access to Postgres. We use Postgres 9.0.2 on Solaris. Are there any
>> open source tools that you have used to do this task? Can you please
>> share your experiences?
>>
>>
> I rolled my own.
>
> If the number of rows in the MDB table is not that many (under 100k), then
> I'll create a new table on pgsql, link to it with the ODBC driver, and
> append from the source table to the pgsql table.  You can get away with
> larger appends if both systems are on the same network.
>
> If it was a table with a few million rows, then I wrote a little VBA
> snippet that created a pg_dump-compatible SQL text file from the source data.
>  To figure out the format, I just pg_dump'd an existing table from
> PostgreSQL, then patterned my SQL file after it.  While it was extremely
> fast at doing the conversion (both generating the SQL file and the time it
> took for psql to process the SQL file), I only recommend that method for
> cases where you have millions and millions of rows, or a lot of identical
> tables.
>
> (The VBA module was about 100-150 lines of code in total.)
>
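For anyone following along, the dump-file approach Thomas describes can be
sketched roughly as below. This is an illustrative sketch in Python rather
than VBA, emitting the `COPY ... FROM stdin` text format that pg_dump uses;
the table name, column names, and sample rows are hypothetical.

```python
# Sketch: emit a pg_dump-compatible SQL file (COPY ... FROM stdin)
# from source rows, then load it into PostgreSQL with psql.

def pg_escape(value):
    """Escape one value for PostgreSQL's COPY text format."""
    if value is None:
        return r"\N"  # NULL marker in COPY text format
    s = str(value)
    return (s.replace("\\", "\\\\")
             .replace("\t", "\\t")
             .replace("\n", "\\n")
             .replace("\r", "\\r"))

def write_copy_sql(rows, table, columns, path):
    """Write rows as a COPY block that psql can load directly."""
    with open(path, "w", encoding="utf-8") as f:
        f.write("COPY %s (%s) FROM stdin;\n" % (table, ", ".join(columns)))
        for row in rows:
            f.write("\t".join(pg_escape(v) for v in row) + "\n")
        f.write("\\.\n")  # end-of-data marker for COPY

# Hypothetical example: two rows for a "customers" table
write_copy_sql(
    [(1, "Alice", None), (2, "Bob\tJr.", "NY")],
    "customers", ["id", "name", "state"], "customers.sql")
# Then load it with: psql -d mydb -f customers.sql
```

Writing tab-separated COPY blocks rather than one INSERT per row is what
makes this fast on the load side, since the server parses a single bulk
statement instead of millions of individual inserts.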
