DBD::CSV and large files...

2004-09-20 Thread amonotod
Hello, I am wondering if there is any kind of switch that can be passed to DBD::CSV that will enable it to parse only one EOL at a time? We are using DBD::CSV to parse files into databases, and it is working beautifully. Unfortunately, some of these files are in excess of 100MB, and
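The usage pattern described above can be sketched as follows; this is a minimal, hypothetical example (the directory, file name `data.csv`, and table name are assumptions, not taken from the thread), showing the kind of whole-file SELECT that loads every row through the driver:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical sketch of the pattern described in the post: using DBD::CSV
# to read a CSV file as a table. The directory "." and table "data"
# (backed by data.csv via f_ext) are assumptions for illustration.
my $dbh = DBI->connect("dbi:CSV:", undef, undef, {
    f_dir      => ".",      # directory holding the CSV files
    f_ext      => ".csv",   # map table "data" to file "data.csv"
    RaiseError => 1,
}) or die $DBI::errstr;

my $sth = $dbh->prepare("SELECT * FROM data");
$sth->execute;
while (my $row = $sth->fetchrow_arrayref) {
    # ... load @$row into the target database ...
}
$sth->finish;
$dbh->disconnect;
```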

Re: DBD::CSV and large files...

2004-09-20 Thread Jeff Zucker
amonotod wrote:
> I am wondering if there is any kind of switch that can be passed to DBD::CSV that will enable it to parse only one EOL at a time?

No, but Text::CSV_XS, which underlies DBD::CSV, can easily parse a line at a time. I think there's discussion of adding something along the lines
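The line-at-a-time approach suggested here can be sketched with `Text::CSV_XS::getline`, which reads and parses one record per call so only a single row is ever held in memory; the filename `big.csv` is an assumption for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Text::CSV_XS;

# Parse one record at a time with Text::CSV_XS, as suggested above.
# Only the current row is in memory, so a 100MB+ file is no problem.
# The filename "big.csv" is a placeholder.
my $csv = Text::CSV_XS->new({ binary => 1 });
open my $fh, '<', 'big.csv' or die "big.csv: $!";
while (my $row = $csv->getline($fh)) {
    # @$row holds the fields of one record
    # ... insert @$row into the target database here ...
}
close $fh;
```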

Re: DBD::CSV and large files...

2004-09-20 Thread amonotod
From: Jeff Zucker [EMAIL PROTECTED] Date: 2004/09/20 Mon PM 05:40:06 GMT

> No, but Text::CSV_XS, which underlies DBD::CSV, can easily parse a line at a time.

Thanks, I'll take a look at it...

> I think there's discussion of adding something along the lines of execute for fetch into DBI

Re: DBD::CSV and large files...

2004-09-20 Thread Ian Harisay
Text::CSV_XS will handle what you want to do just fine. You could do:

  while (my $rec = $sth->fetchrow_arrayref()) {
      $csv->combine(@{$rec});            # combine() returns a success flag
      print OUTFILE $csv->string(), $/;  # the assembled line comes from string()
  }

If you are pulling large amounts of data across your network, look at doing some optimization by setting RowCacheSize in the DBI to
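A fuller sketch of this copy-out loop, under stated assumptions: the DSN, credentials, and query are placeholders, and `RowCacheSize` is only a hint that individual drivers may ignore. `Text::CSV_XS->print` combines the fields and writes the line in one call, which avoids the combine/string two-step:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use Text::CSV_XS;

# Hypothetical copy-out loop: $dsn, $user, $pass and the query are
# placeholders, not taken from the thread.
my ($dsn, $user, $pass) = ('dbi:CSV:', '', '');
my $dbh = DBI->connect($dsn, $user, $pass, { RaiseError => 1 });
$dbh->{RowCacheSize} = 1000;   # a hint only; drivers may ignore it

my $csv = Text::CSV_XS->new({ binary => 1, eol => $/ });
open my $out, '>', 'dump.csv' or die "dump.csv: $!";

my $sth = $dbh->prepare('SELECT * FROM some_table');
$sth->execute;
while (my $rec = $sth->fetchrow_arrayref) {
    # print() combines the fields and writes the record in one call
    $csv->print($out, $rec);
}
close $out;
$dbh->disconnect;
```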