[Boston.pm] Bug/Limitation in DBD::mysql?

2004-01-15 Thread Chris Braiotta
Hi, all. I've discovered some odd behavior in a script I've written 
that uses DBD::mysql, and I think it's related to the large number of 
results my query is returning. First, the sample code:

---

#!/usr/bin/perl -Tw

use strict;
use DBI;    # DBI loads the DBD::mysql driver itself, based on the DSN
$ENV{'PATH'} = '';

my $conn = DBI->connect('DBI:mysql:db_name:db_host', 'db_user', 'db_pw');
my ($prep, $exec);

# $min_timestamp and $max_timestamp are set earlier in the real script
my $get_uptime_sql = "SELECT host,state,date FROM uptime WHERE
    date>='$min_timestamp' AND date<='$max_timestamp' ORDER BY host,date
    ASC";

$prep = $conn->prepare($get_uptime_sql)
    or die "Can't prepare: " . $conn->errstr . "\n";
$exec = $prep->execute
    or die "Can't execute: " . $conn->errstr . "\n";

LOOP: while (my @sql = $prep->fetchrow_array) {
    my ($host, $state, $date) = @sql;
    # blah blah blah
}

---

When I run the script, the MySQL lookup does return some results, but 
it also throws an error: "DBD::mysql::st fetchrow_array failed: fetch() 
without execute() at ./load_executive_snapshot.pl line [line number]". 
Since there is quite obviously an execute statement there, this would 
seem to indicate a problem with my SQL statement. However, I'm fairly 
confident that this isn't the case, because of the following:

1) I've had the script print out the SQL it's generating for 
$get_uptime_sql, and I've run that query at a MySQL prompt. It 
functions properly, returning 305,403 rows in just over a minute.

2) Simply adding a LIMIT 500 to the end of the SQL query, or using a 
more restrictive set of dates than the script is meant to use, 
eliminates the error.

It seems to me, then, that DBD::mysql (or, at least, my particular 
DBD::mysql environment) isn't handling such a large result set gracefully. I 
can think of workarounds for this, but I shouldn't have to, since I can 
get the same query to run just fine by, for example, shelling out to the 
mysql client with backticks. Does anyone have any comments, or similar 
experiences?
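
One workaround along these lines might do it (sketch only; it assumes the 
problem is that DBD::mysql buffers the entire result set on the client by 
default, it reuses $conn and $get_uptime_sql from the sample above, and it 
needs a DBD::mysql that supports the mysql_use_result attribute): switch the 
driver to row-at-a-time streaming.

---

# Sketch only: stream rows instead of buffering all 305,403 of them.
# mysql_use_result tells DBD::mysql to call mysql_use_result() rather than
# mysql_store_result(), so rows are pulled from the server as they are fetched.
my $sth = $conn->prepare($get_uptime_sql, { mysql_use_result => 1 })
    or die "Can't prepare: " . $conn->errstr . "\n";
$sth->execute
    or die "Can't execute: " . $sth->errstr . "\n";
while (my ($host, $state, $date) = $sth->fetchrow_array) {
    # process one row at a time without holding the whole result in memory
}
$sth->finish;

---

The trade-off, as I understand it, is that the connection stays busy (and the 
table may stay locked) until every row has been fetched.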



Thanks,
Chris Braiotta
Web Services Manager
Network Operations Center
Harvard University


Re: [Boston.pm] Bug/Limitation in DBD::mysql?

2004-01-15 Thread Ronald J Kimball
On Thu, Jan 15, 2004 at 09:59:38AM -0500, Chris Braiotta wrote:

> $prep = $conn->prepare($get_uptime_sql)
>     or die "Can't prepare: " . $conn->errstr . "\n";
> $exec = $prep->execute
>     or die "Can't execute: " . $conn->errstr . "\n";
> 
> LOOP: while (my @sql = $prep->fetchrow_array) {
>     my ($host, $state, $date) = @sql;
>     # blah blah blah
> }
> 
> ---
> 
> When I run the script, the MySQL lookup does return some results, but 
> it also throws an error: "DBD::mysql::st fetchrow_array failed: fetch() 
> without execute() at ./load_executive_snapshot.pl line [line number]". 
> Since there is quite obviously an execute statement there, this would 
> seem to indicate a problem with my SQL statement. However, I'm fairly 
> confident that this isn't the case, because of the following:
> 
> 1) I've had the script print out the SQL it's generating for 
> $get_uptime_sql, and I've run that query at a MySQL prompt. It 
> functions properly, returning 305,403 rows in just over a minute.
> 
> 2) Simply adding a LIMIT 500 to the end of the SQL query, or using a 
> more restrictive set of dates than the script is meant to use, 
> eliminates the error.
> 
> It seems to me, then, that DBD::mysql (or, at least, my particular 
> DBD::mysql environment) isn't handling such a large result set gracefully. I 
> can think of workarounds for this, but I shouldn't have to, since I can 
> get the same query to run just fine by, for example, shelling out to the 
> mysql client with backticks. Does anyone have any comments, or similar 
> experiences?

Alternatively, something in the blah blah blah section of the code may be
affecting your statement handle.  Does your code run through all the
results if the loop doesn't do anything besides the fetch?
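
For instance, a loop like this (just a sketch, reusing the $prep handle from 
your sample) would tell you whether the fetch by itself makes it through all 
305,403 rows:

my $rows = 0;
while (my @row = $prep->fetchrow_array) {
    $rows++;    # fetch only; nothing else touches the statement handle
}
if ($prep->err) {
    warn "fetch stopped after $rows rows: " . $prep->errstr . "\n";
}
else {
    print "fetched all $rows rows without error\n";
}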

Ronald


Re: [Boston.pm] Bug/Limitation in DBD::mysql?

2004-01-15 Thread Gyepi SAM
On Thu, Jan 15, 2004 at 09:59:38AM -0500, Chris Braiotta wrote:
> Hi, all. I've discovered some odd behavior in a script I've written 
> that uses DBD::mysql, and I think it's related to the large number of 
> results my query is returning.
> 
> 2) Simply adding a LIMIT 500 to the end of the SQL query, or using a 
> more restrictive set of dates than the script is meant to use, 
> eliminates the error.

Perhaps the shell or process running the script is running into a resource
limit. What does ulimit tell you?  I am assuming, obviously, that you are
running a very recent DBD::mysql.
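
Something like this (just a sketch; ulimit is a shell builtin, so it has to be 
run through sh, and since your script empties $ENV{PATH} under taint mode the 
absolute path matters) would show the limits the script actually runs under:

# Print the resource limits in effect for this perl process.
my $limits = qx{/bin/sh -c 'ulimit -a' 2>&1};
print "Current resource limits:\n$limits";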

-Gyepi