In addition to Chris's suggestions, you should also alter the homeid
column (set its default to NULL and update the whole database, which
shouldn't be a problem) so you don't have to do a double check on the
same column. I would also suggest that the TSCN_MEDIAid column should
be an int, not a varchar.
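A minimal sketch of those two changes, assuming the table is named `homes` and that the old "no value" sentinel was 0 (neither the table name nor the sentinel appears in the thread, so adjust to the real schema):

```sql
-- Let "no home" be represented uniformly as NULL instead of a sentinel,
-- so queries don't need a second check on the same column.
ALTER TABLE homes MODIFY COLUMN homeid INT NULL DEFAULT NULL;

-- Backfill existing rows that used the old sentinel (assumed to be 0).
UPDATE homes SET homeid = NULL WHERE homeid = 0;

-- Store the id as an integer so comparisons and joins don't have to
-- cast a varchar on every row.
ALTER TABLE homes MODIFY COLUMN TSCN_MEDIAid INT;
```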
Rob Adams wrote:
select h.addr, h.city, h.county, h.state, h.zip, 'yes' as show_prop,
h.askingprice, '' as year_built, h.rooms, h.baths,
'' as apt, '' as lot, h.sqft, h.listdate, '' as date_sold,
h.comments, h.mlsnum,
r.agency, concat(r.fname, ' ', r.lname) as rname,
r.phone as rphone, '' as remail,
Rob Adams wrote:
I have a query that I run using mysql that returns about 60,000 plus
rows. It's been so large that I've just been testing it with a limit
0, 10000 (ten thousand) on the query. That used to take about 10
minutes to run, including processing time in PHP which spits
60k records shouldn't be a problem. Show us the query you're making and
the table structure.
Seeing the query would help.
Are you using sub-queries? I believe that those can make the time go
up exponentially.
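To illustrate the sub-query point: MySQL (particularly the 4.x/5.0 versions current at the time) often executed an IN sub-query as a correlated lookup, re-evaluating the inner SELECT once per outer row, whereas the equivalent JOIN is planned once and can use an index. The table and column names below are hypothetical, not from Rob's schema:

```sql
-- Sub-query form: the inner SELECT may be re-run for every row of homes.
SELECT h.addr, h.askingprice
FROM homes h
WHERE h.realtorid IN (SELECT r.id FROM realtors r WHERE r.agency = 'Acme');

-- Equivalent JOIN: evaluated once, and can use an index on realtors.agency
-- and on the join column.
SELECT h.addr, h.askingprice
FROM homes h
JOIN realtors r ON r.id = h.realtorid
WHERE r.agency = 'Acme';
```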
--
Kevin Murphy
Webmaster: Information and Marketing Services
Western Nevada College
www.wnc.edu
775-445-3326
P.S. Please note that my e-mail and website address have changed f
I have a query that I run using mysql that returns about 60,000 plus rows.
It's been so large that I've just been testing it with a limit 0, 10000 (ten
thousand) on the query. That used to take about 10 minutes to run,
including processing time in PHP which spits out xml from the query. I
dec
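One way to see why even ten thousand rows takes minutes is to ask the server for its execution plan. The FROM clause below is a guess at the shape of the query, since the original message was cut off before it; only the SELECT list resembles what Rob posted:

```sql
-- Hypothetical reconstruction for illustration only.
EXPLAIN
SELECT h.addr, h.mlsnum, r.agency, CONCAT(r.fname, ' ', r.lname) AS rname
FROM homes h
JOIN realtors r ON r.id = h.realtorid;
-- In the EXPLAIN output, a row with type=ALL and NULL in possible_keys
-- means a full table scan on that table for the join; adding an index on
-- the join column (here h.realtorid) is the usual first fix.
```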