Jeff,

Thanks, I believe you gave me exactly what I was looking for. I have some
interesting files to process: a mainframe-generated file containing four
types of records with fixed-length fields, but it needs a lot of cleaning.
So I simply wanted to use DBI to manage the data in temporary tables and
then write out a few different records. Anyway, I believe what I was
looking for is the "while (my $row=$sth->fetch)" loop you have. I think
the fetch is what I was missing.

Thanks for the reply,

Jeff Mirabile


Jeff Zucker wrote:

> Jeff Mirabile wrote:
> >
> > Hi folks,
> >
> > I can't seem to find the method to simply parse a file line by line from
> > beginning to end using DBI. I can do it with a "while" statement, but
> > was wondering if there is a specific DBI way of doing this?
>
> Do you mean a text file that contains fields on each line, e.g. a Comma
> Separated Values file, a fixed-length records file, or a tab- or
> pipe-delimited file?  If so, then DBD::AnyData will handle it.
>
> Or perhaps you mean a plain text file with no fields?  I'm not sure why
> one would want to use SQL to access that, but you can always use
> DBD::AnyData and define a single column called "line", then process the
> lines with SQL like so:
>
>   #!/usr/local/bin/perl -w
>   use DBI;
>   my $file = 'mytest.foo';
>   my $dbh=DBI->connect('dbi:AnyData:');
>   $dbh->func( 'test', 'CSV', $file, {col_names=>'line'}, 'ad_catalog' );
>   my $sth = $dbh->prepare("SELECT * FROM test WHERE line LIKE '%bar%'");
>   $sth->execute;
>   while (my $row=$sth->fetch) {
>       print "@$row\n";
>       # or other processing
>   }
>
> --
> Jeff
