SQL::Translator doesn't deal with data, and in some cases doesn't even skip it properly, so dump the MySQL database with table definitions only (which, IIRC, is an option for mysqldump), port that, and then run the INSERT statements against the new database.
Jess

On Sun, 25 Nov 2007, hadley wickham wrote:

> That's useful to know. The problem I face is that I have a fairly
> large mysql database that I'd like to convert to sqlite. I had hoped
> that I'd simply be able to dump it out of mysql, trivially reformat it
> and then load it into sqlite. Unfortunately it doesn't seem to be so
> easy.
>
> Hadley
>
> On 11/25/07, Harrington, Paul <[EMAIL PROTECTED]> wrote:
>> I recommend against using SQL::Translator for copying large data-sets,
>> as the 'serialized as SQL statements' representation consumes a lot of
>> memory (as you have experienced). I prefer to serialize the source
>> data-set in delimited format and then use the platform-specific bulk
>> insert tool to load the data to the destination.
>>
>> pjjH
>>
>> -----Original Message-----
>> From: hadley wickham [mailto:[EMAIL PROTECTED]
>> Sent: Sunday, November 25, 2007 9:37 AM
>> To: Harrington, Paul
>> Cc: [email protected]
>> Subject: Re: [sqlfairy-developers] Large SQL files
>>
>> It's mostly data. Does that change anything?
>>
>> I can put most of the data up (95% is from one table). Is there an
>> easy way to split up the one monolithic file into one for each table?
>>
>> Hadley
>>
>> On Nov 25, 2007 8:08 AM, Harrington, Paul <[EMAIL PROTECTED]> wrote:
>>> Is that 100MB made up solely of schema? If you are able to put the
>>> files up for FTP/HTTP somewhere, I can try and recreate the problem
>>> while profiling memory consumption.
>>>
>>> pjjH
>>>
>>> -----Original Message-----
>>> From: [EMAIL PROTECTED]
>>> [mailto:[EMAIL PROTECTED] On Behalf Of Hadley Wickham
>>> Sent: Saturday, November 24, 2007 6:09 PM
>>> To: [email protected]
>>> Subject: [sqlfairy-developers] Large SQL files
>>>
>>> Hi,
>>>
>>> I was wondering if anyone has any tips on dealing with large (~100
>>> meg) SQL files.
>>> I get the following out of memory error:
>>>
>>> Desktop: sqlt -t SQLite -f MySQL mathpeople-imstat.sql > dump-sqlite.sql
>>> perl(25882) malloc: *** mmap(size=221171712) failed (error code=12)
>>> *** error: can't allocate region
>>> *** set a breakpoint in malloc_error_break to debug
>>> Out of memory!
>>>
>>> Thanks!
>>>
>>> Hadley
>>>
>>> -------------------------------------------------------------------------
>>> This SF.net email is sponsored by: Microsoft
>>> Defy all challenges. Microsoft(R) Visual Studio 2005.
>>> http://clk.atdmt.com/MRT/go/vse0120000070mrt/direct/01/
>>> --
>>> sqlfairy-developers mailing list
>>> [email protected]
>>> https://lists.sourceforge.net/lists/listinfo/sqlfairy-developers
>>
>> --
>> http://had.co.nz/
>
> --
> http://had.co.nz/
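[Editor's note: Jess's schema-only route, combined with Paul's delimited bulk-load suggestion, might look something like the sketch below. The database and file names are placeholders, and the per-table export/import would need to be repeated for each table; `--no-data` is mysqldump's table-definitions-only switch, and `.import` is the sqlite3 shell's bulk loader.]

```shell
# 1. Dump table definitions only -- no row data, so the file stays small.
mysqldump --no-data mydb > schema-mysql.sql

# 2. Convert just the schema with SQL::Translator and create the SQLite db.
sqlt -f MySQL -t SQLite schema-mysql.sql > schema-sqlite.sql
sqlite3 mydb.sqlite < schema-sqlite.sql

# 3. Move the data separately in delimited (tab-separated) form and
#    bulk-load it, one table at a time, bypassing SQL::Translator entirely.
mysql --batch --skip-column-names -e 'SELECT * FROM mytable' mydb > mytable.tsv
sqlite3 mydb.sqlite ".mode tabs" ".import mytable.tsv mytable"
```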

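[Editor's note: on Hadley's question about splitting the monolithic file into one file per table -- a minimal awk sketch, assuming the dump retains the standard "-- Table structure for table `name`" comments that mysqldump writes before each table (they are absent if the dump was made with --skip-comments). It streams line by line, so the 100 MB file is never held in memory.]

```shell
awk '/^-- Table structure for table `/ {
       if (out) close(out)    # release the previous table'\''s file handle
       name = $NF             # last field is the backtick-quoted table name
       gsub(/`/, "", name)    # strip the backticks
       out = name ".sql"
     }
     out { print > out }      # everything up to the next marker goes with this table
    ' mathpeople-imstat.sql
```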