That is a reasonable requirement. Perhaps we could add some
functionality to SQL::Translator::Producer::Dumper to produce dump/load
pipelines like the following:
(echo .mode csv; echo .import foo.csv foo) | sqlite3 foo.sqlite
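In practice, the sort of pipeline the Dumper could emit might look roughly
like this (just a sketch; mydb, foo, foo.tsv and foo.sqlite are
placeholders, the foo table is assumed to already exist in foo.sqlite, and
the data is assumed to contain no embedded tabs, newlines or NULLs):

  # dump one table from MySQL as tab-delimited text
  mysql --batch --skip-column-names -e 'SELECT * FROM foo' mydb > foo.tsv

  # bulk-load it into SQLite via the shell's .import command
  (echo '.mode tabs'; echo '.import foo.tsv foo') | sqlite3 foo.sqlite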
Until such time as we have support for something like this, you can
probably get away with using sqlt-dumper as before but *excluding* your
big table:
sqlt-dumper --skip yourbigtable
and then importing the large table into sqlite manually (using .import).
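Concretely, that two-step workaround might look something like the
following (a sketch only; the sqlite3 file name, the CSV file name and any
extra sqlt-dumper arguments are placeholders, and the big table's schema
is assumed to already exist in the SQLite database):

  # 1. dump everything except the big table, as before
  sqlt-dumper --skip yourbigtable ...your usual sqlt-dumper arguments...

  # 2. bulk-load the big table on its own with sqlite3's .import
  sqlite3 foo.sqlite
  sqlite> .mode csv
  sqlite> .import yourbigtable.csv yourbigtable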
BTW, I am very motivated to help you out as I use some of your R stuff
very regularly in my work!
pjjH
-----Original Message-----
From: hadley wickham [mailto:[EMAIL PROTECTED]
Sent: Sunday, November 25, 2007 12:54 PM
To: Harrington, Paul
Cc: [email protected]
Subject: Re: [sqlfairy-developers] Large SQL files
That's useful to know. The problem I face is that I have a fairly
large mysql database that I'd like to convert to sqlite. I had hoped
that I'd simply be able to dump it out of mysql, trivially reformat it
and then load it into sqlite. Unfortunately it doesn't seem to be so
easy.
Hadley
On 11/25/07, Harrington, Paul <[EMAIL PROTECTED]> wrote:
> I recommend against using SQL::Translator for copying large data-sets as
> the 'serialized as SQL statements' representation consumes a lot of
> memory (as you have experienced). I prefer to serialize the source
> data-set in delimited format and then use the platform-specific bulk
> insert tool to load the data to the destination.
>
> pjjH
>
>
> -----Original Message-----
> From: hadley wickham [mailto:[EMAIL PROTECTED]
> Sent: Sunday, November 25, 2007 9:37 AM
> To: Harrington, Paul
> Cc: [email protected]
> Subject: Re: [sqlfairy-developers] Large SQL files
>
> It's mostly data. Does that change anything?
>
> I can put most of the data up (95% is from one table). Is there an
> easy way to split up the one monolithic file into one for each table?
>
> Hadley
>
> On Nov 25, 2007 8:08 AM, Harrington, Paul <[EMAIL PROTECTED]>
> wrote:
> > Is that 100MB made up solely of schema? If you are able to put the files
> > up for FTP/HTTP somewhere, I can try and recreate the problem while
> > profiling memory consumption.
> >
> > pjjH
> >
> >
> >
> > -----Original Message-----
> > From: [EMAIL PROTECTED]
> > [mailto:[EMAIL PROTECTED] On Behalf Of
> > Hadley Wickham
> > Sent: Saturday, November 24, 2007 6:09 PM
> > To: [email protected]
> > Subject: [sqlfairy-developers] Large SQL files
> >
> > Hi,
> >
> > I was wondering if anyone has any tips on dealing with large (~100
> > meg) sql files. I get the following out of memory error:
> >
> > Desktop: sqlt -t SQLite -f MySQL mathpeople-imstat.sql > dump-sqlite.sql
> > perl(25882) malloc: *** mmap(size=221171712) failed (error code=12)
> > *** error: can't allocate region
> > *** set a breakpoint in malloc_error_break to debug
> > Out of memory!
> >
> > Thanks!
> >
> > Hadley
> >
> >
>
>
>
> --
> http://had.co.nz/
>
--
http://had.co.nz/