Inefficient query is melting my server!!

2011-10-19 Thread Brian Dunning
Can someone tell me if what I'm trying to do can be done more efficiently? I just got off the phone with Rackspace after my server hung, and they found many copies of this one query stacked up, locked, with other queries waiting behind them. Here's the query: $query = insert ignore into

[X-POST] Free sample data (US Canada) for testing

2011-02-24 Thread Brian Dunning
Hey all - I've just uploaded some free Canada sample data to complement the US data that was already available. Testing apps with a representative amount of sample data is crucial to evaluate performance. Download the data here: http://www.briandunning.com/sample-data/ Enjoy, - Brian -- MySQL

Best way to purge old records from a huge table?

2010-06-04 Thread Brian Dunning
Hey all - I have a table with 12,000,000 records spread over about 6 years. I'm trying to delete all but the last 2 years, but no matter how small a batch I try to delete at a time, it keeps hanging up the server and I eventually have to restart MySQL. The table looks like this:
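The post is cut off, but the usual pattern for this kind of purge is to delete in small, repeated batches so no single statement holds its locks for long. A sketch, with the table and date column names assumed since they aren't shown:

```sql
-- Assumes an indexed date column `created`; real names unknown.
-- Run repeatedly until it reports 0 rows affected.
DELETE FROM big_table
WHERE created < DATE_SUB(CURDATE(), INTERVAL 2 YEAR)
LIMIT 10000;
```

MySQL's DELETE accepts a LIMIT (but no offset), so each pass does bounded work and other queries get a chance to run between batches.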

7-day average

2010-03-18 Thread Brian Dunning
My brain just squirted out my ear trying to figure this out, so maybe one of you can help me scoop it back up. I have a table of accounts, and a table of hits-per-day per account. I'm trying to sort my accounts by the most hits-per-day, averaged over the preceding 7 days (8 days ago through
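A sketch of one way to do this, with assumed tables `accounts` and `daily_hits(account_id, hit_date, hits)` since the real schema is truncated:

```sql
SELECT a.account_id,
       COALESCE(SUM(d.hits), 0) / 7 AS avg_daily_hits
FROM accounts a
LEFT JOIN daily_hits d
       ON d.account_id = a.account_id
      AND d.hit_date >= CURDATE() - INTERVAL 8 DAY
      AND d.hit_date <  CURDATE() - INTERVAL 1 DAY
GROUP BY a.account_id
ORDER BY avg_daily_hits DESC;
```

Dividing by a fixed 7 rather than the row count makes zero-hit days drag the average down, which is the correction Chris W raises in the follow-up.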

Re: 7-day average

2010-03-18 Thread Brian Dunning
You're exactly right, that's an important point that I neglected when putting together my example. Good catch. On Mar 18, 2010, at 11:27 AM, Chris W wrote: I think I would change the math. Since there are several days in there where there are no hits, that should in my opinion count

Possible to find this duplicate?

2010-02-13 Thread Brian Dunning
Hey all - I have a table listing references for chapters in a book. I'm trying to find all the cases where a single chapter lists more than one reference from the same author. In this example table, I want it to find IDs 1 and 2, because they're both from the same author, and both in chapter 1
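A sketch of the usual duplicate-finding query (table and column names assumed, since the example table is truncated):

```sql
SELECT chapter, author, COUNT(*) AS num_refs
FROM book_refs
GROUP BY chapter, author
HAVING COUNT(*) > 1;
```

To list the matching IDs themselves rather than just the (chapter, author) pairs, join this result back against the table, or add GROUP_CONCAT(id) to the select list.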

Re: [PHP] Good source for sample data?

2010-01-29 Thread Brian Dunning
, exchange, so will work for mapping, phone or address validation, whatever your needs are. Hope someone finds it useful. http://www.briandunning.com/sample-data/ On Jan 28, 2010, at 3:52 PM, Brian Dunning wrote: I need a few million sample contact records - name, company, address, email, web

Good source for sample data?

2010-01-28 Thread Brian Dunning
Hey all - I need a few million sample contact records - name, company, address, email, web, phone, fax. ZIP codes and area codes and street addresses should be correct and properly formatted, but preferably not real people or companies or email addresses. But they'd work if you did address

Re: Trouble joining 3 tables

2009-11-03 Thread Brian Dunning
Darn, it's not working after all. SELECT people.*, COUNT ( DISTINCT cars.car_id ) AS car_count, COUNT ( DISTINCT pets.pet_id ) AS pet_count, SUM ( IF ( pets.date_bought > NOW() - INTERVAL 7 DAY, 1, 0 ) ) AS new_pet_count WHERE...etc car_count and pet_count are calculating correctly, but

Customers with no recent orders?

2009-11-03 Thread Brian Dunning
I thought I could find the answer to this by googling, but had no luck. How do I show a list of customers who: (a) Have placed no orders within 14 days, (b) Have been a customer for at least 14 days. (They do not have to have placed any orders, ever, to be a customer.) I'm trying to show

Re: Choose certain columns in mysqldump?

2009-11-03 Thread Brian Dunning
select into outfile was the ticket. Thanks! -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/mysql?unsub=arch...@jab.org

Re: Customers with no recent orders?

2009-11-03 Thread Brian Dunning
More info. If the tables look like this: customers -- cust_id signup_date orders -- order_id cust_id order_date Then the SQL needs to accomplish something like this: select * from customers left join orders on customers.cust_id = orders.cust_id where
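Using the schema above, one way to finish the query (hedged, since the original trails off) is to move the date test into the join condition and keep only customers with no matching order:

```sql
SELECT customers.cust_id
FROM customers
LEFT JOIN orders
       ON orders.cust_id = customers.cust_id
      AND orders.order_date >= CURDATE() - INTERVAL 14 DAY
WHERE customers.signup_date <= CURDATE() - INTERVAL 14 DAY
  AND orders.order_id IS NULL;
```

Putting the order_date filter in the WHERE clause instead would silently turn the LEFT JOIN back into an inner join and drop the customers with no orders at all.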

Re: Customers with no recent orders?

2009-11-03 Thread Brian Dunning
Thanks. I added orders.order_date to the list of fields being selected, but it still returns "Invalid use of group function". On Nov 3, 2009, at 1:09 PM, Martin Gainty wrote: any column used in any group functions such as SUM needs to be requested in select column e.g. select foo from

Re: Trouble joining 3 tables

2009-11-02 Thread Brian Dunning
Thanks, this solved it! On Nov 2, 2009, at 12:37 AM, Michael Dykman wrote: I suspect 'distinct' might help you out here. SELECT people.*, count(distinct cars.car_id) as car_count, count(distinct pets.pet_id) as pet_count -- MySQL General Mailing List For list archives:

Re: Trouble joining 3 tables

2009-11-02 Thread Brian Dunning
Johnny - Your solution might actually help me solve my next step, which is to also return a count of pets bought only within the last 7 days. Something like this: SELECT people.*, SUM(IF(cars.id IS NULL,0,1)) AS car_count, SUM(IF(pets.id IS NULL,0,1)) AS pet_count, SUM ( IF (

Trouble joining 3 tables

2009-11-01 Thread Brian Dunning
Hi all - I have a table of PEOPLE, and a table of CARS owned by various people, and a table of PETS owned by various people. Each person may have 0 or more pets, and each person may have 0 or more cars. I'm trying to return a list of all the people, showing how many pets each person has,

Choose certain columns in mysqldump?

2009-10-29 Thread Brian Dunning
Sorry if this is a n00b question, I did read the docs for mysqldump before posting, but couldn't figure it out. I'm successfully doing a mysqldump, but I want only a few of the columns, and want them in a different order. Is this possible without substantially slowing it down? The table

Simple join that confuses me.

2009-09-29 Thread Brian Dunning
I have a table of projects with several columns for the IDs of some people in various roles, and then a table of the people. How can I get all the people for a given project, but keep their roles straight? Projects -- Project_ID Boss_ID Admin_ID Assistant_ID People ID Name
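With the schema shown, the standard approach is to join People once per role, giving each join its own alias (a sketch using the post's names; the example Project_ID is made up):

```sql
SELECT p.Project_ID,
       boss.Name  AS boss,
       admin.Name AS admin,
       asst.Name  AS assistant
FROM Projects p
LEFT JOIN People boss  ON boss.ID  = p.Boss_ID
LEFT JOIN People admin ON admin.ID = p.Admin_ID
LEFT JOIN People asst  ON asst.ID  = p.Assistant_ID
WHERE p.Project_ID = 1;   -- example value
```

LEFT JOINs keep the project row even when a role slot is empty.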

Average Rating, like Netflix

2008-12-22 Thread Brian Dunning
Pretend I'm Netflix and I want to return a list of found movies, including the average of related ratings for each movie. Something like this: select movies.*, average(ratings.rating) from movies, ratings where movies.movie_id=ratings.movie_id I'm sure that's wrong in about 10 different

Re: Average Rating, like Netflix

2008-12-22 Thread Brian Dunning
) FROM movies INNER JOIN ratings ON movies.movie_id=ratings.movie_id GROUP BY movies.movie_id Change the INNER JOIN to a LEFT JOIN if you want all movies, even those with no ratings. Brent Baisley On Mon, Dec 22, 2008 at 11:13 AM, Brian Dunning br...@briandunning.com wrote: Pretend I'm Netflix

Re: Selecting around a circular reference?

2008-09-08 Thread Brian Dunning
= '$meal_date'; -Micah On 09/05/2008 03:41 PM, Brian Dunning wrote: I'm trying to calculate glycemic index of all food items eaten in a day with a single SELECT. The problem is the calculation for glycemic index of each item requires a total of all items' carbs. It's like a circular reference

Totaling from several tables away

2008-09-05 Thread Brian Dunning
How do I query "How many calories did Brian eat on 2009-09-04?"

Table: People
+-----------+-------+
| Person_ID | Name  |
+-----------+-------+
| 5         | Brian |
+-----------+-------+

Table: Meals
+---------+-----------+-----------+------+
| Meal_ID | Person_ID | Meal_Name | Date |

Re: Totaling from several tables away

2008-09-05 Thread Brian Dunning
Thanks, that was exactly what I needed. :-) -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]

Selecting around a circular reference?

2008-09-05 Thread Brian Dunning
I'm trying to calculate glycemic index of all food items eaten in a day with a single SELECT. The problem is the calculation for glycemic index of each item requires a total of all items' carbs. It's like a circular reference. Here's what I'm trying: SELECT

Retrieve insert_id on an insert ignore?

2008-08-19 Thread Brian Dunning
I have a table like this: unique_serial - Auto Increment field_1 field_2 field_3 The Primary key is a combination of field_1, field_2, and field_3. I want to do: INSERT IGNORE INTO table_name (field_1,field_2,field_3) VALUES ('xx','xx','xx') Sometimes this will be an existing record,
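The usual trick here is to replace INSERT IGNORE with ON DUPLICATE KEY UPDATE, using the LAST_INSERT_ID(expr) form so the existing row's id becomes available either way (a sketch with the post's own names):

```sql
INSERT INTO table_name (field_1, field_2, field_3)
VALUES ('xx', 'xx', 'xx')
ON DUPLICATE KEY UPDATE
       unique_serial = LAST_INSERT_ID(unique_serial);

-- Returns the new row's id on an insert, or the existing row's id on a duplicate.
SELECT LAST_INSERT_ID();
```

With plain INSERT IGNORE, a skipped duplicate leaves LAST_INSERT_ID() pointing at some earlier insert, which is exactly why the question comes up.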

Super slow query

2008-07-09 Thread Brian Dunning
Can someone spot a way to improve the performance of this query? I see that every time it runs, it's Copying to tmp table and then Creating sort index and taking way too long. select count(distinct(stats.ip)) as popcount, stats.id, episodes.title from stats, episodes where

Help me format this statement

2008-03-11 Thread Brian Dunning
I am an idiot. table_a and table_b have exactly the same structure. How do I say this in SQL: INSERT (all records from table_a) into table_b where table_a.customer = '12' Just trying to eventually duplicate the whole table, one customer's set of records at a time. Thanks. -- MySQL

Re: Help me format this statement

2008-03-11 Thread Brian Dunning
Thanks to everyone who replied. So simple I couldn't see it. :-) -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]

Move data from one db to another?

2007-12-12 Thread Brian Dunning
I have similar (not identical) tables in two different db's. I want to merge all the data from one into the other, and will be dropping the original. There will be some duplication of keys so some rows will need to be ignored. Problem is these are on two different servers on different

Re: Move data from one db to another?

2007-12-12 Thread Brian Dunning
OK, I have all my data on the new machine in a SQL file with 664,000 insert statements. But when I try to run it like this from the mysql command line: source filename.sql; It starts to work, but after a few seconds the server freezes up with too many connections. How do I avoid this?

Editing fields in bulk

2007-09-04 Thread Brian Dunning
I have a column where I need to replace all instances of the text US- Complete (contained within a long sentence) with US Complete. There are probably 50 or 100 of them. I'm really scared to do it since I can't risk screwing up that column - what's the correct syntax? -- MySQL General
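A hedged sketch (table and column names assumed, since the post doesn't give them); previewing with a SELECT first addresses the "scared to screw up the column" concern:

```sql
-- Preview the affected rows before touching anything.
SELECT col FROM t WHERE col LIKE '%US- Complete%';

UPDATE t
SET col = REPLACE(col, 'US- Complete', 'US Complete')
WHERE col LIKE '%US- Complete%';
```

REPLACE() only alters exact substring matches, so rows without the target text are left byte-for-byte unchanged.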

Before I shoot myself in the foot...

2007-06-13 Thread Brian Dunning
...if I add a column to a table with 40,000,000 records, will it cause that table to hang for any significant amount of time, and prevent other transactions? It's a MyISAM table and I was going to add a varchar(20) column, NULL. It's a very busy table, constant searches and inserts. --

Re: Before I shoot myself in the foot...

2007-06-13 Thread Brian Dunning
Many thanks to all who took the time to reply. :) :) -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]

[X-POST] Fastest way to dump this huge table

2007-05-02 Thread Brian Dunning
I have a huge MySQL table, 2.1 million records, 200MB. Once a week I need to dump it in CSV format and zip the file. This is not on my server, and it's in production, so I don't want to risk testing different methods and possibly hanging up their server for a period of time, so I wanted to

Re: [X-POST] Fastest way to dump this huge table

2007-05-02 Thread Brian Dunning
in the database software. Not having any indices on your temp table will help too. HTH, Dan On 5/2/07, Brian Dunning [EMAIL PROTECTED] wrote: I have a huge MySQL table, 2.1 million records, 200MB. Once a week I need to dump it in CSV format and zip the file. This is not on my server, and it's

Re: [PHP] [X-POST] Fastest way to dump this huge table

2007-05-02 Thread Brian Dunning
Thanks to everyone who answered, think I've got enough info now to handle it. :) -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]

[X-POST] PHP script to make sure MySQL is up?

2007-03-01 Thread Brian Dunning
I host at Rackspace, and one thing that their monitoring service does NOT catch is a problem when too many connections hit MySQL, and for some reason it remains hung up until the service is manually restarted. In the meantime, this is happening to us once or twice a month. It gets hammered

Formatting a subquery?

2007-01-16 Thread Brian Dunning
I can't get MySQL 5 to like this query. Can you tell what I'm trying to do, and is there a problem with my formatting? select account_id,picture_id from pictures where account_id in (select account_id from accounts order by rand() limit 20) order by rand(); -- MySQL General Mailing List For

Re: Formatting a subquery?

2007-01-16 Thread Brian Dunning
: if I make it a single query and try to do a 'group by' it always gives the same picture for each person. On Jan 16, 2007, at 11:55 AM, Rhino wrote: - Original Message - From: Brian Dunning [EMAIL PROTECTED] To: mysql mysql@lists.mysql.com Sent: Tuesday, January 16, 2007 2:23 PM

Re: [OT} How to pronounce GIF (was: Re: How to pronounce MyISAM and InnoDB)

2007-01-12 Thread Brian Dunning
On Jan 7, 2007, at 4:23 PM, TK wrote: In short, the original inventors of the GIF format (CompuServe, 1987) have always defined the pronunciation to be like JIF. So, that has always been the correct pronunciation. Sure, so I'll start pronouncing graphics as jraphics. -- MySQL General

Help optimizing this query?

2007-01-07 Thread Brian Dunning
This is the query that's killing me in the slow query log, usually taking around 20 seconds: select count(ip) as counted,stamp from ip_addr where stamp >= NOW() - interval 14 day and source='sometext' group by stamp order by stamp desc; Here is the table: CREATE TABLE `ip_addr` ( `ip`
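A common first step for this shape of query (equality on source, range plus grouping on stamp) is a composite index, sketched here under the assumption that no such index exists yet:

```sql
ALTER TABLE ip_addr ADD INDEX idx_source_stamp (source, stamp);
```

With source leading, the equality match narrows the index first and the stamp range can then be scanned in order, which can avoid the filesort behind "Creating sort index".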

Re: How to pronounce MyISAM and InnoDB

2007-01-07 Thread Brian Dunning
My friend says my sam and in-NOD-b. I want to kill him every time. He also says jif instead of gif. On Jan 7, 2007, at 7:16 AM, Michael Stearne wrote: I just say My, I, Sam and inno, d, b Michael -Original Message- From: js [EMAIL PROTECTED] Date: Mon, 8 Jan 2007 00:09:15

Re: Workaround for distinct?

2006-12-15 Thread Brian Dunning
The table structure is in my original post. On Dec 14, 2006, at 11:49 AM, Rolando Edwards wrote: Just change your table name and you are all set. If you have the table structure, send it and I'll make the query if you want. -- MySQL General Mailing List For list archives:

Workaround for distinct?

2006-12-14 Thread Brian Dunning
+----+--------+
| id | color  |
+----+--------+
| 1  | red    |
| 2  | blue   |
| 3  | red    |
| 4  | yellow |
| 5  | yellow |
| 6  | blue   |
| .. | ...    |
+----+--------+

I'm trying to select 5 random records, but no more than 1 of any given color. According to the notes in the documentation and to my

Re: Workaround for distinct?

2006-12-14 Thread Brian Dunning
], Brian Dunning [EMAIL PROTECTED], mysql@lists.mysql.com Sent: Thursday, December 14, 2006 12:17:30 PM GMT-0500 US/Eastern Subject: Re: Workaround for distinct? On Thu, 14 Dec 2006, Chris Boget wrote: =I tried this out this morning on MySQL 5. = It works. Please try this in MySQL 4 and see

mysql_pconnect() made things worse

2006-11-30 Thread Brian Dunning
I have many sites accessing one MySQL server, all using the same IP, username, password via a PHP $connection = mysql_connect($ip,$user, $pass);. The server is extremely busy, hammered 24x7, but almost all traffic is coming from one very short script. About once a day we're getting No

Deleting, skip the first n records

2006-10-04 Thread Brian Dunning
I'm trying to delete all but the newest n records. DELETE FROM tablename ORDER BY creation DESC LIMIT=n This does the opposite of what I want. Is there some way to tell it to start the delete after n and delete all the remaining records? -- MySQL General Mailing List For list archives:
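DELETE supports LIMIT but not an offset, so the common workaround keeps the newest n rows by key instead, wrapping the subquery in a derived table to sidestep MySQL's restriction on selecting from the table being deleted from (key column name assumed):

```sql
DELETE FROM tablename
WHERE id NOT IN (
    SELECT id FROM (
        SELECT id
        FROM tablename
        ORDER BY creation DESC
        LIMIT 100            -- the n newest rows to keep
    ) AS keepers
);
```

The extra derived-table layer also works around older MySQL versions rejecting LIMIT directly inside an IN subquery.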

Re: Deleting, skip the first n records

2006-10-04 Thread Brian Dunning
The offset is what I was thinking of - that would be the simplest - but as far as I can tell, delete doesn't support the offset. It's not documented, and it gives me an error when I try it. I was hoping to avoid two queries but it sounds like that's what I might have to do. On Oct 4,

Re: re[2]: Deleting, skip the first n records

2006-10-04 Thread Brian Dunning
__ -- Original Message -- FROM: Brian Dunning [EMAIL PROTECTED] TO:mysql@lists.mysql.com DATE: Wed, 4 Oct 2006 08:49:48 -0700 SUBJECT: Re: Deleting, skip the first n records The offset is what I was thinking of - that would be the simplest

Trouble duplicating a mongo live table

2006-09-30 Thread Brian Dunning
I have a 17,000,000 record table that I'm trying to duplicate in order to make some changes and improvements, then I'll rename it and drop the original table. So I need this duplicate to be a live table in the same database as the original. I tried the copy table function in the Operations

Re: Low priority copy?

2006-09-30 Thread Brian Dunning
Thanks Chris, this sounds great but when I read about mysqlhotcopy I didn't see a way to make it create a live table that's open within the same database, it seems to want only to create a separate backup file in some directory. On Sep 27, 2006, at 6:10 PM, Wagner, Chris (GEAE, CBTS)

Re: Low priority copy?

2006-09-30 Thread Brian Dunning
copying */ - RENAME TABLE newtable TO newtable_bkup, newtable2 TO newtable Dan On 9/27/06, Brian Dunning [EMAIL PROTECTED] wrote: I have a very busy 14,000,000 record table. I made a new version of the table, that's more efficient, and now I just need to copy over the data. Just about anything

Keys should not both be set for column...

2006-09-30 Thread Brian Dunning
phpMyAdmin is giving me the following warning: PRIMARY and INDEX keys should not both be set for column `referer` Here is what the table looks like: CREATE TABLE `myspacemap_visitors_2` ( `creation` timestamp NOT NULL default CURRENT_TIMESTAMP on update CURRENT_TIMESTAMP, `lat` double NOT

Low priority copy?

2006-09-27 Thread Brian Dunning
I have a very busy 14,000,000 record table. I made a new version of the table, that's more efficient, and now I just need to copy over the data. Just about anything I try swamps the machine and locks up MySQL with too many connections because it's so damn busy. Can anyone suggest the most

Inserting/updating only when combo of fields is unique

2006-09-27 Thread Brian Dunning
I have a need to insert a record only when the combination of 3 fields is unique. I do this by having an index with all 3 fields, and doing an INSERT IGNORE. This works fine. Here's the element I can't figure out how to add: When there is a pre- existing record, I want to update two of its
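As the follow-up in this thread notes, ON DUPLICATE KEY UPDATE is the piece that was missing; a sketch with invented column names, riding on the existing 3-field unique index:

```sql
INSERT INTO t (f1, f2, f3, hit_count, last_seen)
VALUES ('a', 'b', 'c', 1, NOW())
ON DUPLICATE KEY UPDATE
       hit_count = hit_count + 1,
       last_seen = NOW();
```

New combinations insert normally; when the unique key collides, only the two listed columns of the pre-existing row are updated.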

Re: Low priority copy?

2006-09-27 Thread Brian Dunning
, where you're copying into a duplicate of your new table structure. - CREATE TABLE newtable2 LIKE newtable - INSERT INTO newtable2 SELECT * from oldtable /* or however you're copying */ - RENAME TABLE newtable TO newtable_bkup, newtable2 TO newtable Dan On 9/27/06, Brian Dunning [EMAIL

Re: Inserting/updating only when combo of fields is unique

2006-09-27 Thread Brian Dunning
Thanks Dan, I believe that's exactly what I was looking for. Thanks for not saying RTFM even though it clearly applies. :) On Sep 27, 2006, at 12:57 PM, Dan Julson wrote: Brian, Look at the ON DUPLICATE KEY UPDATE syntax within the INSERT SYNTAX of the Docs. That should give you what

Red Hat slow query log

2006-09-27 Thread Brian Dunning
Before I do this, I just wanted to check with you all to see if this is the correct command: /etc/rc.d/init.d/mysqld restart --log-slow-queries If so, where exactly will I find the slow query log? Will the slow query log be turned off by default next time I restart it? -- MySQL General
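Options appended to a Red Hat init script generally don't reach mysqld; the usual route in this era is the server section of my.cnf, which also survives restarts (paths and threshold below are assumptions, not the poster's values):

```ini
[mysqld]
log-slow-queries = /var/lib/mysql/slow-query.log
long_query_time  = 2
```

After editing, restart with /etc/rc.d/init.d/mysqld restart; the log then persists until the option is removed again.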

[OFF] PHP/MySQL contractor needed

2006-09-22 Thread Brian Dunning
We need a guy for some hourly PHP/MySQL work. Large project to start with, plenty of incremental stuff down the road. Looking for someone expert, with a flexible schedule who can make hours available when we need them. Your regular hourly rate. Also - only interested in someone local to

Re: Distinct select over 2 fields?

2006-09-13 Thread Brian Dunning
)) from table where ... Steve Musumeche CIO, Internet Retail Connection [EMAIL PROTECTED] Brian Dunning wrote: Lat lon are two different fields. Either can be duplicated, but not both. On Sep 12, 2006, at 12:33 PM, Steve Musumeche wrote: Select DISTINCT(lat_long_field) from table where

Re: Distinct select over 2 fields?

2006-09-13 Thread Brian Dunning
Never mind, I figured it out: select distinct(concat(lat,lon)), lat, lon where On Sep 13, 2006, at 6:57 AM, Brian Dunning wrote: But if I do this, how do I still get lat and lon as two different fields? This finds the right record set, but it returns both fields concatenated

Re: South American timber products.

2006-09-13 Thread Brian Dunning
Hi Agrapin - This sounds great. Could you please post some of your timber products here to the list? Many of us are really looking for a break from this boring MySQL stuff. Thanks, and our kind regards to you too. - Brian On Sep 11, 2006, at 7:57 PM, Agrapin S.A. - Timber Industry and

Distinct select over 2 fields?

2006-09-12 Thread Brian Dunning
I'm searching a database of geopoints, and when two records have the same latitude and longitude, I only want to return one of them - basically just find all the unique locations. How do you set up a select like this? Thanks... -- MySQL General Mailing List For list archives:
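DISTINCT applies to the whole select list, so no CONCAT() workaround is needed; either form below returns unique (lat, lon) pairs as separate columns (table name assumed):

```sql
SELECT DISTINCT lat, lon FROM geopoints;

-- Equivalent, and handy if aggregates per location are wanted too:
SELECT lat, lon FROM geopoints GROUP BY lat, lon;
```

The concat(lat,lon) approach arrived at later in the thread can also produce false matches without a separator, e.g. (12.3, 45.6) and (12.34, 5.6) both concatenate to '12.345.6'.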

Re: Distinct select over 2 fields?

2006-09-12 Thread Brian Dunning
Many different records will be returned though, I just don't want any dupes where both lat/lon is the same. :) On Sep 12, 2006, at 12:20 PM, Hiep Nguyen wrote: select * from table where . limit 1 that would do it if you don't care which one it returns JC On Tue, 12 Sep 2006, Brian

Re: Distinct select over 2 fields?

2006-09-12 Thread Brian Dunning
Lat lon are two different fields. Either can be duplicated, but not both. On Sep 12, 2006, at 12:33 PM, Steve Musumeche wrote: Select DISTINCT(lat_long_field) from table where... Steve Musumeche CIO, Internet Retail Connection [EMAIL PROTECTED] Brian Dunning wrote: Many different

Combo of 2 fields needs to be unique

2006-09-04 Thread Brian Dunning
I'm tracking site visitors with a database - one field is the IP address, the other is the page they've visited. I don't want any duplicates of this combination. Is there a MySQL statement similar to INSERT IGNORE but will ignore only if the combination is duplicated? -- MySQL General
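The standard recipe: add a composite UNIQUE key over the pair, after which plain INSERT IGNORE does exactly this (table and column names assumed):

```sql
ALTER TABLE visits ADD UNIQUE KEY uniq_ip_page (ip, page);

INSERT IGNORE INTO visits (ip, page)
VALUES ('1.2.3.4', '/index.html');
```

INSERT IGNORE skips the row only when some unique key collides, so with the composite key the same IP may still appear with different pages and vice versa.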

I don't understand why SCSI is preferred.

2006-07-11 Thread Brian Dunning
My understanding is that SCSI has a faster transfer rate, for transferring large files. A busy database needs really fast access, for making numerous fast calls all over the disk. Two different, unrelated things. I am more than willing to be called Wrong, slapped, and cast from a bridge.

Update or insert with a single SQL statement?

2006-03-31 Thread Brian Dunning
I have a really simple two-column database: domain_name (primary key) timestamp I'm trying to keep track of the referrer of every visit to a web site, and I'm looking for a single SQL statement (since my ISP limits the total number of calls I can make in a day) that will either insert a
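With domain_name as the primary key, a single-statement upsert looks like this on MySQL 4.1+ (table name assumed from context):

```sql
INSERT INTO referrers (domain_name, `timestamp`)
VALUES ('example.com', NOW())
ON DUPLICATE KEY UPDATE `timestamp` = NOW();
```

One call per visit either inserts the domain or refreshes its timestamp, which fits the ISP's cap on total queries per day.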

Moving from PowWeb to Rackspace

2006-02-06 Thread Brian Dunning
I have a bunch of databases - some are really big, 2GB - on a number of different accounts at PowWeb. I am buying a Rackspace server and want to move everything over -- hopefully all in one night. Can anyone suggest the best way to do this? Would it be to use the Export command in

Fastest way to log IP's

2006-02-02 Thread Brian Dunning
I'm logging IP addresses and also searching for existing ones. Does anyone happen to know which is more resource intensive: storing the IP's as integers using INET_NTOA() and INET_ATON() to convert them at each query; or simply storing the IP as a varchar? In each case the IP field would
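For reference, the conversion goes string-to-integer with INET_ATON() and back with INET_NTOA(); an INT UNSIGNED is 4 bytes versus up to 15 characters for a dotted-quad VARCHAR, so both the comparisons and the index stay smaller (table name assumed):

```sql
-- Column declared as: ip INT UNSIGNED NOT NULL
INSERT INTO ip_log (ip) VALUES (INET_ATON('192.168.0.1'));

SELECT INET_NTOA(ip)
FROM ip_log
WHERE ip = INET_ATON('192.168.0.1');
```

Converting the search value once per query, as above, keeps the lookup a plain integer index comparison.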

Re: Getting # of days until expiration

2006-01-06 Thread Brian Dunning
On Jan 5, 2006, at 9:38 PM, Michael Stassen wrote: DATEDIFF was added in 4.1.1. What version of mysql do you have? Thanks Michael - that was indeed the problem. Some ancient-ass version that's been on my development server for who knows how many years, in accordance with the If it ain't

Getting # of days until expiration

2006-01-05 Thread Brian Dunning
I have a DATE field that includes a date sometime within the past 30 days. I'm trying to show the number of days until the expiration date, which is 30 days in the future from the date in the field. I've tried a bunch of permutations of something like this: select (30 - SUBDATE(CURDATE() -
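The follow-ups in this thread land on DATEDIFF() (MySQL 4.1.1+); a sketch with the column and table names assumed:

```sql
SELECT DATEDIFF(date_field + INTERVAL 30 DAY, CURDATE()) AS days_left
FROM accounts;
```

DATEDIFF(expr1, expr2) returns expr1 minus expr2 in whole days, so adding the 30-day term first yields the count remaining until expiration.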

Re: Getting # of days until expiration

2006-01-05 Thread Brian Dunning
Thanks Peter, that appears to be exactly what I'm looking for, but it still gives an error and I've been through it with a fine-toothed comb, tried different versions, parens, etc. Here is the exact SQL statement I'm using, with your suggestion: select accounts.username,

Killing my curly quotes

2005-12-05 Thread Brian Dunning
OK, I'm bad - I have curly quotes in my db that I failed to eliminate prior to the insert. Now that they're in there, is there a way to replace them with straight quotes? Everything I try fails to find them. -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To

Only finding one record per ZIP code

2005-11-22 Thread Brian Dunning
I'm using the following to find the nearest 10 records by proximity to the ZIP code $zip. The problem is that it only returns one record per ZIP code, even when there are multiple records with the same ZIP: (Note this uses a temp table, but I already double-checked that all the desired

Re: Only finding one record per ZIP code

2005-11-22 Thread Brian Dunning
You're exactly right, and that solved it! Thank you. On Nov 22, 2005, at 7:49 AM, [EMAIL PROTECTED] wrote: Brian Dunning [EMAIL PROTECTED] wrote on 11/22/2005 10:43:13 AM: I'm using the following to find the nearest 10 records by proximity to the ZIP code $zip. The problem

Re: Delete all but the newest 100 records?

2005-11-07 Thread Brian Dunning
On Nov 6, 2005, at 6:05 PM, Jeremy Cole wrote: DELETE FROM tbl ORDER BY ts DESC LIMIT 100, 99; That's exactly what I needed: I was not aware of the OFFSET option. So I tried this, which appears correct to me: DELETE FROM table WHERE field='somevalue' ORDER BY timestamp DESC LIMIT

Re: Delete all but the newest 100 records?

2005-11-07 Thread Brian Dunning
On Nov 7, 2005, at 11:04 AM, Scott Noyes wrote: delete from x where ID not in (select ID from x order by timestamp desc limit 100); It's a good suggestion, I'm just shying away from it because it seems more resource intensive than using an offset, and my ISP is super anal about

Re: Delete all but the newest 100 records?

2005-11-07 Thread Brian Dunning
Interesting thought. I just tried it with and even 999 - same error! I'm sure that 999 records is not too large for MySQL... :) :) On Nov 7, 2005, at 4:52 PM, Joseph Cochran wrote: The quoted line is 99 but you're using , which I presume is too big for the system to

Re: Delete all but the newest 100 records?

2005-11-06 Thread Brian Dunning
The problem is the most recent 100 records won't be sequential. There are records with many different identifiers, in random order mixed with other records that I don't want deleted, and each time I do this I'm going to be limiting each subset of identified records to only the most recent

Re: Delete all but the newest 100 records?

2005-11-06 Thread Brian Dunning
On Nov 6, 2005, at 6:05 PM, Jeremy Cole wrote: Jeremy Cole MySQL Geek, Yahoo! Inc. Desk: 408 349 5104 Yahoo employs MySQL geeks? Always wondered what db that big index runs on... :) -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:

Help optimize this simple find

2005-11-04 Thread Brian Dunning
This simple find is taking 4 to 7 seconds. Way too long!! (This is a geotargeting query using the database from IP2location.) select lat,lon from geocodes where ipFROM <= 1173020467 and ipTO >= 1173020467 The database looks like this (how IP2location recommends): CREATE TABLE `geocodes` (

Not finding customers without invoices

2005-11-02 Thread Brian Dunning
I'm trying to find a list of customers including a count of all their invoices, but it's not including customers who have no invoices - and it should. What's broken? SELECT customers.company, count(invoices.id) as invcount FROM customers, invoices WHERE customers.id= invoices.customer_id
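The fix the replies point to is an explicit LEFT JOIN; since COUNT() ignores NULLs, COUNT(invoices.id) comes out as 0 for customers with no invoices (written with the post's own table and column names):

```sql
SELECT customers.company, COUNT(invoices.id) AS invcount
FROM customers
LEFT JOIN invoices ON invoices.customer_id = customers.id
GROUP BY customers.company;
```

The comma-join form in the original is an inner join, which is why invoice-less customers vanished.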

Re: Not finding customers without invoices

2005-11-02 Thread Brian Dunning
Thanks very much to all of you! Obviously I need to learn more about joins. Appreciate the kick in the pants. :) -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]

How does this work?

2005-11-02 Thread Brian Dunning
If I say: SELECT * FROM tablename ORDER BY rand() LIMIT 50 Will that give me the first 50 records from the table and randomize their order, or will it give me 50 records randomly from throughout the entire table? -- MySQL General Mailing List For list archives:

Re: How does this work?

2005-11-02 Thread Brian Dunning
On Nov 2, 2005, at 6:03 PM, Matt Babineau wrote: It will do your whole table. Then give you 50 records. But there won't be any duplicates, right? It won't include the same records more than once. I know that sounds stupid, just trying to debug an odd problem. -- MySQL General Mailing

Re: Temp table doesn't seem to work

2005-10-24 Thread Brian Dunning
I got it to work - turns out it was a stupid typo on my part (hate it when that happens - and hate wasting the list's time even more!). How long does this temporary table persist for - just the execution of the one page, or will it live on the server (using resources) until I explicitly

Re: Map of MySQL Users

2005-10-24 Thread Brian Dunning
That's awesome! I love it. Even though it didn't include me... :( How are you doing the geotargeting? -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]

Temp table doesn't seem to work

2005-10-23 Thread Brian Dunning
If I say this, I get all my data: <?php $sql = "select * from myTable"; $result = mysql_query($sql); ?> But if I say this, I get no results at all: <?php $sql = "create temporary table xxx select * from myTable"; $result1 = mysql_query($sql); $sql = "select * from xxx"; $result2 = mysql_query($sql);

Spatial Extensions to make a Dealer Locator?

2005-10-19 Thread Brian Dunning
I'm in the midst of the age-old problem of finding the right SQL to make a Dealer Locator by ZIP code. Like Steffan's posts from a couple weeks ago, I have the ZIP code data in one table with latitudes and longitudes, and the table of Dealers with ZIP codes in a separate table. While

Append one table to another?

2005-10-11 Thread Brian Dunning
How do I append one table's contents to another? Both have identical structure. Problem is I don't have shell access, only phpAdmin or a PHP file I write upload myself. -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:

Re: Append one table to another?

2005-10-11 Thread Brian Dunning
INSERT into tbl1 SELECT * FROM tbl2 Thanks to both of you, but this is not working. Since one of the fields is a primary key that's duplicated in both tables (both tables have records numbered 1, 2, 3...), it won't allow the duplicate entries. Fortunately I do not need those primary key

Re: Append one table to another?

2005-10-11 Thread Brian Dunning
On Oct 11, 2005, at 8:24 AM, Jose Miguel PĂ©rez wrote: INSERT INTO table1 (field1, field2) SELECT field1, field FROM table2 Jose's solution worked perfectly. Thanks everyone, sorry for being so dense today. :) -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To

[Off] How much time should this take?

2005-09-16 Thread Brian Dunning
I got a 12-hour invoice from a consultant who was tasked to do the following: - Install a Red Hat machine from absolute scratch for PHP/MySQL/Apache - Copy over some MySQL databases - Have mod_rewrite working via htaccess, and have wildcard DNS I realize there are a billion different

Emergency! How do I debug this?

2005-09-03 Thread Brian Dunning
My web site has always worked, I didn't change anything, so whatever is causing this error happened all by itself: #1017 - Can't find file: './kessler/products.frm' (errno: 13) In phpMyAdmin, it shows the tables are in use and there's no way to execute a repair or anything. I don't know

Re: Emergency! How do I debug this?

2005-09-03 Thread Brian Dunning
of this machine? do you have administrator access? does kessler/products.frm still exist? [likely: /var/lib/mysql/ kessler/products.frm] On Sat, 3 Sep 2005, Brian Dunning wrote: My web site has always worked, I didn't change anything, so whatever is causing this error happened all by itself: #1017

Re: Emergency! How do I debug this?

2005-09-03 Thread Brian Dunning
On Sep 3, 2005, at 1:43 PM, Jason Pyeron wrote: google pulls up many instances of this error from several 'different' servers, you might lookinto the software. http://www.google.com/search?num=50hl=enlr=lang_ensafe=offq=% 27kessler%2Fproducts.frm%27btnG=Searchlr= The reason for that is a

Re: Emergency! How do I debug this?

2005-09-03 Thread Brian Dunning
On Sep 3, 2005, at 1:43 PM, Jason Pyeron wrote: it could be that the permissions are wrong, too. The permissions for all the files in there are -rwxrwxr-x ...I don't know if that's what they should be or not. -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To

Re: Emergency! How do I debug this?

2005-09-03 Thread Brian Dunning
FIXED... I did a chmod +rwx on the directory, and now all is well. -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
