On the other hand, Grant, if it really were that "easy" you would probably have
figured it out on your own.  As we have all discovered at various times, simple does NOT
automatically mean easy or quick. =-O


Dennis




Grant Kelly wrote:
I did play with the tail command, but as mentioned, you still have to
know the total number of lines in the file. `wc -l` took about 5
minutes to calculate this number, and then you've still got to run it
through tail.
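For the archives: GNU `tail` can also start *from* a given line with `tail -n +K`, which skips the first K-1 lines without any `wc -l` pass at all. A minimal sketch (the sample file and line numbers here are illustrative, not Grant's actual dump):

```shell
# Build a small stand-in file (illustrative; not the real 2+GB dump)
printf '%s\n' line1 line2 line3 line4 line5 > /tmp/sample.txt

# 'tail -n +K' starts printing at line K counting from the top,
# so no separate line count is needed to skip a fixed-size header.
tail -n +4 /tmp/sample.txt    # prints line4 and line5
```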

Also, I wanted to remove the lines rather than ignore them so that, if
needed, I could reuse the file or pass it on to others without them
having to go through the same problem I did.
Furthermore, the CREATE TABLE commands were preceded by something
like DROP TABLE IF EXISTS.  I had already created the tables and
changed them slightly, so I didn't want them re-created.  sed was the
quickest and simplest tool for the job in this case.
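For completeness, the in-place delete Grant describes can be done with GNU sed's `-i` flag; a small sketch (the file name and line range are made up for illustration):

```shell
# Stand-in for the SQL dump (illustrative file and contents)
printf '%s\n' a b c d e > /tmp/demo.sql

# GNU sed: delete lines 1-3 in place, shrinking the file itself
# so it can be reused or passed on without the unwanted header.
sed -i '1,3d' /tmp/demo.sql

cat /tmp/demo.sql    # prints d and e
```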

So much for a "quick question".... over 30 messages in this thread!

Grant


On 6/30/06, James Washer <[EMAIL PROTECTED]> wrote:
Jeezus...

YES... the tail command gives the last "xx" lines of the file, but without counting the lines of the 2+GB file, one has no idea how many lines that is. Counting the lines of an arbitrarily large file, when there is no need to do so, is far from efficient.

Further, try "cat < head -3 /etc/passwd" and report back on the shell error you receive. That's just not legal shell syntax.
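What was presumably intended is either a plain pipe or (in bash/zsh) process substitution; a sketch using a throwaway file rather than the real /etc/passwd:

```shell
# Stand-in for /etc/passwd (illustrative contents)
printf '%s\n' u1 u2 u3 u4 > /tmp/pw.txt

# Portable: pipe head's output into the consuming command
head -n 3 /tmp/pw.txt | cat

# bash/zsh only: process substitution hands head's output to cat
# as if it were a file argument:
#   cat <(head -n 3 /tmp/pw.txt)
```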



On Fri, 30 Jun 2006 07:02:00 -0700 (PDT)
Bill Cunningham <[EMAIL PROTECTED]> wrote:

>
>  Actually the tail command gives the last whatever:
>
> #!/usr/bin/perl -w
>
> $filename = "reallybighonknfile.sql";
>
> #number of lines in the entire file
> chomp($count = `wc -l < $filename`);
>
> @output = `tail -n @{[$count - 300]} $filename`;
>
> @output has the part of the file you'll need to feed whatever you want.
>
> Oh and the "<" was to redirect the SQL commands into the mysql process (with missing parameters for database and whatnot), which is what the original post asked for:
>
>
> Bill Cunningham
>

_______________________________________________
RLUG mailing list
[email protected]
http://lists.rlug.org/mailman/listinfo/rlug


