Re: Get a Random Row on a HUGE db

2005-04-28 Thread Jigal van Hemert
From: "Scott Gifford"
> SELECT COUNT(*) FROM firebase_content;
>
> to get the count. That's very fast; it comes from the table summary
> information, IIRC. I use a similar solution for a similar problem,
> and have had great luck with it.

This is true for MyISAM tables, but e.g. InnoDB does *
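The count-then-offset technique the thread is circling around can be sketched as below. This is a minimal sketch, not anyone's posted code: SQLite (in memory) stands in for MySQL, and the table and column names (`articles`, `id`, `title`) are hypothetical. Note the caveat from this message still applies: `COUNT(*)` is near-instant on MyISAM but can cost a scan on InnoDB.

```python
# Sketch: pick a random row via COUNT(*) plus LIMIT/OFFSET, avoiding
# ORDER BY RAND(). SQLite stands in for MySQL; all names are hypothetical.
import random
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO articles (title) VALUES (?)",
                 [("article %d" % i,) for i in range(1000)])

# Step 1: cheap row count (instant on MyISAM; may scan on InnoDB).
(total,) = conn.execute("SELECT COUNT(*) FROM articles").fetchone()

# Step 2: fetch exactly one row at a random offset.
offset = random.randrange(total)
row = conn.execute("SELECT id, title FROM articles LIMIT 1 OFFSET ?",
                   (offset,)).fetchone()
```

Unlike guessing ids, this always lands on a row even when the autoincrement sequence has gaps; the cost is that a large OFFSET still makes the server step over that many rows.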

Re: Get a Random Row on a HUGE db

2005-04-27 Thread Scott Gifford
<[EMAIL PROTECTED]> writes:
[...]
> So what I am trying is this.
>
> $last_row = "SELECT from firebase_content LAST_INSERT_ID()";
> $last_row_query = $dbi->query($last_row);
> $last_row_result = $row->id;

LAST_INSERT_ID() only works if you just inserted an element; it's maintained per-connection.

Re: Get a Random Row on a HUGE db

2005-04-27 Thread Dawid Kuroczko
On 4/26/05, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
> I am wanting to display a random page from my site, but I have over 12,000
> articles right now and we add over 150 per day. What I wound up doing was a
> virtual DoS attack on my own server because the 40 MB db was being loaded to
> m

Re: Get a Random Row on a HUGE db

2005-04-27 Thread Christian Meisinger
> $last_row = "SELECT from firebase_content LAST_INSERT_ID()";
> $last_row_query = $dbi->query($last_row);
> $last_row_result = $row->id;

I think LAST_INSERT_ID will not work for what you wanna do. If you open a connection to MySQL and call LAST_INSERT_ID without an INSERT, it will return 0. http://

Re: Get a Random Row on a HUGE db

2005-04-26 Thread Gary Richardson
Why don't you generate a random integer in your code and select for an article? If there is no article there, do it again. Even if you have to call it 50 times, it may be faster than doing a full scan on the table. It may not work so well if there are lots of gaps in your autoincrement. In Perl (d
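The retry idea above can be sketched as follows. The original message says "In Perl"; this is a hedged re-sketch in Python with in-memory SQLite standing in for MySQL, and the `articles` table, its columns, and the `max_tries=50` cap are illustrative assumptions, not the poster's code.

```python
# Sketch: guess a random id and retry until it hits an existing row.
# Each probe is an indexed primary-key lookup, not a table scan.
# SQLite stands in for MySQL; names are hypothetical.
import random
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, title TEXT)")
# Simulate deleted rows: only even ids exist, so roughly half the guesses miss.
conn.executemany("INSERT INTO articles (id, title) VALUES (?, ?)",
                 [(i, "article %d" % i) for i in range(0, 2000, 2)])

def random_article(conn, max_tries=50):
    lo, hi = conn.execute("SELECT MIN(id), MAX(id) FROM articles").fetchone()
    for _ in range(max_tries):
        guess = random.randint(lo, hi)
        row = conn.execute("SELECT id, title FROM articles WHERE id = ?",
                           (guess,)).fetchone()
        if row is not None:
            return row
    return None  # too many gaps; caller should fall back to another strategy

row = random_article(conn)
```

As the message warns, the retry count blows up when the id range is mostly gaps, so the `None` fallback matters.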

RE: Get a Random Row on a HUGE db

2005-04-26 Thread gunmuse
The difference between using a 40 MB table and a 4 MB table with the same traffic was a server load of 70 versus 0.9. So it was the amount of data that I was selecting that was choking this feature.

- [EMAIL PROTECTED] wrote:
> Thanks for that I implemented to my Random code. Same

RE: Get a Random Row on a HUGE db

2005-04-26 Thread gunmuse
What I had to do was do this for my navigation db and not my content db. My server can easily handle lots of calls to a 4 MB table, then tell it to fetch the content once that has been achieved. The reason I'm bringing this up is that this seems to be a "patched" way of doing this. If I have 40,000 item
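The split described above, doing the random pick against a small navigation table and then fetching the heavy content row by key, can be sketched like this. A minimal sketch only: SQLite stands in for MySQL, and the `nav`/`content` tables and row sizes are made up to mimic the "4 MB vs 40 MB" situation.

```python
# Sketch: random selection touches only a lightweight id table; the wide
# content row is then pulled with one indexed lookup. SQLite stands in
# for MySQL; all names and sizes are hypothetical.
import random
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE content (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("CREATE TABLE nav (id INTEGER PRIMARY KEY)")
for i in range(500):
    conn.execute("INSERT INTO content (id, body) VALUES (?, ?)",
                 (i, "x" * 1000))               # the heavy "40 MB" table
    conn.execute("INSERT INTO nav (id) VALUES (?)", (i,))  # the small one

# Random pick reads only the small table...
(count,) = conn.execute("SELECT COUNT(*) FROM nav").fetchone()
(picked_id,) = conn.execute("SELECT id FROM nav LIMIT 1 OFFSET ?",
                            (random.randrange(count),)).fetchone()
# ...then one primary-key lookup fetches the wide row.
body_row = conn.execute("SELECT body FROM content WHERE id = ?",
                        (picked_id,)).fetchone()
```

The design point is that the expensive randomization work scales with the small table, and the big table is only ever touched through its primary key.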

Re: Get a Random Row on a HUGE db

2005-04-26 Thread Peter Brawley
Gunmuse,

SELECT from firebase_content LAST_INSERT_ID()

In that cmd, 'from ...' ain't right. I didn't understand either what's wrong with ORDER BY RAND() LIMIT 1. Also check the Perl manual for how to retrieve a single value.

PB

- [EMAIL PROTECTED] wrote:
> Thanks for that I implemented to my Rand
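For reference, the one-liner Peter mentions looks like this. A hedged sketch: SQLite spells the function `RANDOM()` where MySQL uses `RAND()`, and the `articles` table is hypothetical. It is correct and simple, but the rest of the thread explains the catch: the server must generate a random value for every row and sort them all, which is what hurts on a huge table.

```python
# Sketch: the simple but O(n log n) approach -- ORDER BY RANDOM() LIMIT 1.
# (MySQL: ORDER BY RAND().) SQLite stands in; names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO articles (title) VALUES (?)",
                 [("article %d" % i,) for i in range(100)])

# One query, no application-side logic; fine for small tables.
row = conn.execute(
    "SELECT id, title FROM articles ORDER BY RANDOM() LIMIT 1").fetchone()
```

For 12,000+ rows of 40 MB, the count-then-offset or random-id-retry variants elsewhere in the thread trade this simplicity for far less per-query work.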

RE: Get a Random Row on a HUGE db

2005-04-26 Thread gunmuse
Thanks for that; I implemented it in my Random code. Same problem: that SELECT * portion is just a nightmare. Remember, I'm selecting 38 MB of data when I do that. What I want to do is jump to a valid random row. Now, if I didn't delete content often, that would be easy: grab the last autoincremented row_i

Re: Get a Random Row on a HUGE db

2005-04-26 Thread Rhino
Sent: Tuesday, April 26, 2005 11:33 AM
Subject: Get a Random Row on a HUGE db

I am wanting to display a random page from my site, but I have over 12,000 articles right now and we add over 150 per day. What I wound up doing was a virtual DoS attack on my own server because the 40

RE: Get a Random Row on a HUGE db

2005-04-26 Thread Jay Blanchard
[snip] I am wanting to display a random page from my site, but I have over 12,000 articles right now and we add over 150 per day. What I wound up doing was a virtual DoS attack on my own server because the 40 MB db was being loaded too many times. I have tons of memory and a Dell Dual Xeon 2.8 gig

Get a Random Row on a HUGE db

2005-04-26 Thread gunmuse
I am wanting to display a random page from my site, but I have over 12,000 articles right now and we add over 150 per day. What I wound up doing was a virtual DoS attack on my own server because the 40 MB db was being loaded too many times. I have tons of memory and a Dell Dual Xeon 2.8 g