sqlt-dumper takes some SQL DDL as input (schema.sql, say) and generates
a Perl script from it. When that Perl script is run, it connects to the
specified database and does a SELECT without a WHERE clause against each
table defined in schema.sql. The script's output is a series of SQL
INSERT statements, which you then pipe to whatever SQL processor you are
using.
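To make the idea concrete, here is a minimal sketch of what the generated
script does, written in Python against an in-memory SQLite database so it is
self-contained (the real generated script is Perl/DBI against your own
database, and the `dump_inserts` helper is just an illustrative name):

```python
import sqlite3

def dump_inserts(conn, tables):
    """Yield one INSERT statement per row of each table."""
    for table in tables:
        # SELECT with no WHERE clause: every row of the table
        for row in conn.execute(f"SELECT * FROM {table}"):
            values = ", ".join(
                "NULL" if v is None
                else str(v) if isinstance(v, (int, float))
                else "'" + str(v).replace("'", "''") + "'"
                for v in row
            )
            yield f"INSERT INTO {table} VALUES ({values});"

# Demo with a toy table shaped like admin_rights
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE admin_rights (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO admin_rights (name) VALUES ('users')")
statements = list(dump_inserts(conn, ["admin_rights"]))
print(statements[0])  # INSERT INTO admin_rights VALUES (1, 'users');
```

Piping the emitted statements into sqlite3 (or mysql, etc.) then reloads
the data.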

The admin_rights.sql file you provided contains both DDL and insert
statements. When I stripped out the non-DDL (and the quotes):

create table admin_rights (
  id int(11) NOT NULL auto_increment,
  name varchar(255) default NULL,
  controller varchar(255) default NULL,
  action varchar(255) default NULL,
  PRIMARY KEY  (id)
);

I was able to run it as follows, and it generated a Perl script (which I
have attached):

 perl -I./lib bin/sqlt-dumper -d MySQL ~/ftp/admin_rights.sql > ~/ftp/generated_dumper.pl

I am unfamiliar with the compilation issues on the Mac, but I will try it
out on a MacBook at home this evening.

As for your R packages, I use lattice a lot and am learning to use
ggplot.  I tried to use the GGobi stuff but did not really understand it,
so I am waiting until the book is published before I start into it again.
The R code produces graphs of database performance data and Solaris
extended process accounting data.

pjjH

-----Original Message-----
From: hadley wickham [mailto:[EMAIL PROTECTED] 
Sent: Monday, November 26, 2007 1:57 PM
To: Harrington, Paul
Cc: [email protected]
Subject: Re: [sqlfairy-developers] Large SQL files

On 11/25/07, Harrington, Paul <[EMAIL PROTECTED]> wrote:
> That is a reasonable requirement.  Perhaps we could add some
> functionality to SQL::Translator::Producer::Dumper to produce dump/load
> pipelines like the following:
>
>                 (echo .mode csv; echo .import foo.csv foo)| sqlite3
> foo.sqlite
>
> Until such time as we have support for something like this you can
> probably get away with using sqlt-dumper as before but *excluding* your
> big table
>
>         sqlt-dumper --skip yourbigtable
>
> and then import the large table into sqlite manually (using .import)

I wrote a little script to parse the big SQL file out into one file per
table, which is making things a bit more manageable.
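That script isn't shown, but a rough sketch of one way to do the split in
Python (not the actual script; it assumes statements are `;`-terminated
with no literal `;` inside quoted strings, and name their table via CREATE
TABLE or INSERT INTO) looks like:

```python
import re
from collections import defaultdict

# Matches "CREATE TABLE foo" or "INSERT INTO `foo`", case-insensitively
TABLE_RE = re.compile(r"(?:CREATE\s+TABLE|INSERT\s+INTO)\s+`?(\w+)`?", re.I)

def split_by_table(sql_text):
    """Return {table_name: [statement, ...]} for the statements in sql_text."""
    per_table = defaultdict(list)
    # Naive split: assumes ';' never appears inside a quoted string
    for statement in sql_text.split(";"):
        statement = statement.strip()
        m = TABLE_RE.search(statement)
        if m:
            per_table[m.group(1)].append(statement + ";")
    return dict(per_table)

sql = """
CREATE TABLE admin_rights (id int, name varchar(255));
INSERT INTO admin_rights VALUES (1, 'users');
CREATE TABLE posts (id int);
"""
chunks = split_by_table(sql)
print(sorted(chunks))  # ['admin_rights', 'posts']
```

Writing each list out to `<table>.sql` is then just a loop over the
returned dict.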

I tried sqlt-dumper on a single file and got:

import: sqlt-dumper -d MySQL admin_rights.sql
Can't load producer 'Dumper' : Error loading Dumper as
SQL::Translator::Producer::Dumper : Can't locate Template.pm in @INC
(@INC contains: /opt/local/lib/perl5/5.8.8/darwin-2level
/opt/local/lib/perl5/5.8.8
/opt/local/lib/perl5/site_perl/5.8.8/darwin-2level
/opt/local/lib/perl5/site_perl/5.8.8 /opt/local/lib/perl5/site_perl
/opt/local/lib/perl5/vendor_perl/5.8.8/darwin-2level
/opt/local/lib/perl5/vendor_perl/5.8.8
/opt/local/lib/perl5/vendor_perl .) at
/opt/local/lib/perl5/site_perl/5.8.8/SQL/Translator/Producer/Dumper.pm
line 59.
BEGIN failed--compilation aborted at
/opt/local/lib/perl5/site_perl/5.8.8/SQL/Translator/Producer/Dumper.pm
line 59.
Compilation failed in require at
/opt/local/lib/perl5/site_perl/5.8.8/SQL/Translator.pm line 765.

which I think means I'm missing a prereq?  I'm on OS X using Perl from
MacPorts, if that helps at all, with SQL::Translator installed using
the cpan command line.

But even using sqlt gives me a parsing error:

import: sqlt -f MySQL -t SQLite admin_rights.sql

       ERROR (line 1): Invalid statement: Was expecting comment, or use, or
                       set, or drop, or create, or alter, or insert, or
                       delimiter, or empty statement
Error: translate: Error with parser 'SQL::Translator::Parser::MySQL':
no results at /opt/local/bin/sqlt line 327.

(sql attached - looks ok to me).

> BTW, I am very motivated to help you out as I use some of your R stuff
> very regularly in my work!

Oh, neat.  What are you using?

Thanks,

Hadley


-- 
http://had.co.nz/

Attachment: generated_dumper.pl
Description: generated_dumper.pl

-- 
sqlfairy-developers mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/sqlfairy-developers
