Sara,

If your msg column has known values (like an auto_increment column), you could select those rows directly. I think the reason the SQL server selects all the rows first is the "order by". So if you could select "msg between a and b", that could improve the performance a bit. However, if this is a single table with 50,000 rows, most modern databases will select these rows very rapidly, given appropriate indices on the table. It is probably best to do this on the SQL side, as you are already doing; pulling everything from SQL into perl just to take a subset of the rows is not very efficient. In short, either change your SQL query to get the rows you want based on the WHERE clause alone, or just leave it the way it is.
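
For example, something like the following might work (just a rough sketch, not tested -- I am assuming msg is an indexed numeric column and reusing the $dbh, $q and $table from your script; the "last_msg" parameter is made up and would have to be passed along from the previous page, e.g. as the smallest msg value that page displayed):

my $results_per_page = 50;
my $last_msg = $q->param('last_msg');   # smallest msg shown on the previous page, if any

my ($sql, @bind);
if (defined $last_msg && $last_msg =~ /^\d+$/) {
    # later pages: seek straight to the rows below the last one displayed
    $sql  = "SELECT * FROM $table WHERE reply=0 AND msg < ? "
          . "ORDER BY msg DESC LIMIT $results_per_page";
    @bind = ($last_msg);
}
else {
    # first page: just take the top of the list
    $sql  = "SELECT * FROM $table WHERE reply=0 "
          . "ORDER BY msg DESC LIMIT $results_per_page";
    @bind = ();
}
my $sth = $dbh->prepare($sql);
$sth->execute(@bind);

With an index covering (reply, msg), the server should be able to seek to the right place in the index and read only the rows it actually returns, rather than sorting the whole result set on every request.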

On the perl side, you can probably improve things somewhat by using fetchall_arrayref. As noted in the DBI documentation, this is the fastest way to get a large chunk of data from the SQL server. Note also that fetchall_arrayref can be told to return only part of the result, which could probably be used as a substitute for "limit" above. However, I doubt that it is faster--someone with more knowledge might want to comment.
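
For instance (again just a sketch, using the same statement handle as your $query; the optional second argument to fetchall_arrayref, which caps the number of rows fetched, is described in the DBI docs):

# undef means "no column slice"; the second argument limits how many rows are fetched
my $rows = $query->fetchall_arrayref(undef, $results_per_page);
for my $row (@$rows) {
    print "$row->[1]<br>$row->[2]<br>$row->[3] blah blah blah.....";
}
$query->finish();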

Sean

----- Original Message -----
From: "Sara" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Friday, November 19, 2004 9:19 PM
Subject: Reducing load on server.



I am writing a CGI script to retrieve records and split them into pages.

my $limit = $q->param('limit') || '0';
my $results_per_page = 50;

my $query = $dbh->prepare("SELECT * FROM $table WHERE reply=0
                           ORDER BY msg DESC LIMIT $limit, $results_per_page");
$query->execute();
while (my @row = $query->fetchrow_array()) {
    print "$row[1]<br>$row[2]<br>$row[3] blah blah blah.....";
}
$query->finish();
&print_number_of_pages;
----------------------------------------------

The table holds about 50,000 rows of records, and the script is supposed to be
accessed 100,000 times a day by web users. After reading the manual for LIMIT,
I came to know that LIMIT is applied after the whole matrix of records has been
fetched, so it's not going to reduce any load on the server; I am using it only
for splitting records onto pages.

----------------------------------------------
My Questions:

1 - Is it safe to load such a huge amount of data 100,000 times a day? Or is
there a way that I can load only those records which are going to be printed
on each page?

2 - How much data in terms of size (MB) can safely be held by my @row =
$query->fetchrow_array()?

3 - What else can I do to reduce the load on MySQL, Perl and the server?
Should I use CGI::Fast?

4 - Any other possible query?

Thanks,
Sara.



--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
<http://learn.perl.org/> <http://learn.perl.org/first-response>




