Analysis of a week's worth of general log
I have 7 days' worth of general log data totalling 4.4GB. I want to analyze this data to get:

a) queries per second, minute, hour and day
b) a count of the number of selects versus write statements (delete, insert, replace and update)
c) a variation of the above with select, replace, delete and insert versus update

How can I do this? I've looked at mysqlsla, which is complex and works well but does not quite get what I want. [1] I looked at MyProfi 0.18, which looks like it will get some of the answers but runs out of memory working on the smallest log file (mysql.log), even with memory_limit in php.ini set to 1024MB. [2]

-rw-r--r-- 1 imran imran 268M 2010-04-19 13:03 mysql.log
-rw-r--r-- 1 imran imran 721M 2010-04-19 12:56 mysql.log.1
-rw-r--r-- 1 imran imran 737M 2010-04-19 13:05 mysql.log.2
-rw-r--r-- 1 imran imran 554M 2010-04-19 13:06 mysql.log.3
-rw-r--r-- 1 imran imran 499M 2010-04-19 13:02 mysql.log.4
-rw-r--r-- 1 imran imran 568M 2010-04-19 12:59 mysql.log.5
-rw-r--r-- 1 imran imran 488M 2010-04-19 13:01 mysql.log.6

Any pointers please? If all else fails, I will probably write a Perl script to munge it.

[1] http://hackmysql.com/mysqlsla
[2] http://myprofi.sourceforge.net

--
GPG Key fingerprint = B323 477E F6AB 4181 9C65 F637 BC5F 7FCC 9CC9 CC7F

--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe: http://lists.mysql.com/mysql?unsub=arch...@jab.org
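[Editor's note: a minimal sketch of the munging script the poster mentions, in Python rather than Perl. It assumes the 5.0-era general-log text format, where each new statement line looks like "100419 13:03:01<tab>  42 Query<tab>SELECT ..." and continuation lines carry no timestamp; the regex, bucket granularity, and file path are illustrative assumptions, not the poster's actual code.]

```python
import re
from collections import Counter

# One timestamped general-log "Query" line, e.g.:
#   "100419 13:03:01\t   42 Query\tSELECT * FROM t"
# (assumed 5.0-era format; continuation lines lack the timestamp and are skipped)
LINE = re.compile(r'^(\d{6}\s+\d{1,2}:\d{2}:\d{2})\s+\d+\s+Query\s+(\S+)')

WRITES = {'DELETE', 'INSERT', 'REPLACE', 'UPDATE'}

def tally(path):
    """Return (queries per hour-bucket, counts per statement verb)."""
    per_hour = Counter()
    verbs = Counter()
    with open(path, errors='replace') as fh:
        for line in fh:
            m = LINE.match(line)
            if not m:
                continue                      # continuation or admin line
            stamp, verb = m.group(1), m.group(2).upper()
            per_hour[stamp.split(':')[0]] += 1  # "100419 13" -> hourly bucket
            if verb == 'SELECT' or verb in WRITES:
                verbs[verb] += 1
    return per_hour, verbs
```

Streaming one line at a time keeps memory flat, which is exactly where MyProfi fell over on the 268M file; per-second or per-minute buckets just mean keeping more of the timestamp as the key.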
Re: Analysis of a week's worth of general log
Maybe one of the maatkit tools will do it, but I tend to graph that kind of data live in Munin from the internal counters.

On Tue, Apr 20, 2010 at 1:02 PM, Imran Chaudhry ichaud...@gmail.com wrote:
[quoted original snipped]

--
Bier met grenadyn
Is als mosterd by den wyn
Sy die't drinkt, is eene kwezel
Hy die't drinkt, is ras een ezel
Re: Analysis of a week's worth of general log
Has anyone tried using the log_output option in MySQL 5.1 to have the general log put into a table and not a flat file? I used it for a while before having to downgrade back to 5.0, but thought it was a great idea. I'm curious to see if anyone feels it helps analysis.

On Tue, Apr 20, 2010 at 6:02 AM, Imran Chaudhry ichaud...@gmail.com wrote:
[quoted original snipped]

--
Jim Lyons
Web developer / Database administrator
http://www.weblyons.com
Re: Analysis of a week's worth of general log
Hi Imran, you can have a look at the mysqldumpslow utility to analyze the data.

Thanks
Anand

On Tue, Apr 20, 2010 at 5:48 PM, Jim Lyons jlyons4...@gmail.com wrote:
[quoted messages snipped]
Re: Analysis of a week's worth of general log
Jim Lyons skrev:
 Has anyone tried using the log_output option in mysql 5.1 to have the general log put into a table and not a flat file? [rest of quoted message snipped]

I tried that once, and ran into some problems. Depending on your exact version, you might experience the same.

http://www.bitbybit.dk/carsten/blog/?p=115

(also has a number of good comments on analysis tools)

And yes, having the data available in a table is a Good Thing for analysis.

/ Carsten
Re: Analysis of a week's worth of general log
Carsten Pedersen skrev:
[earlier reply quoted, snipped]

Minor correction: The post I point to is about the slow log, but I presume it is also relevant for the general log. And the good comments I mentioned come in the followup posting at http://www.bitbybit.dk/carsten/blog/?p=116

/ Carsten
Re: Analysis of a week's worth of general log
Well, first thing I'd do is symlink the log table files onto a separate set of spindles. No use bogging the main data spindles down with log writes.

On Tue, Apr 20, 2010 at 5:33 PM, Carsten Pedersen cars...@bitbybit.dk wrote:
[quoted message snipped]
Re: Analysis of a week's worth of general log
Carsten Pedersen wrote:
[correction quoted above, snipped]

Thanks Carsten, I read the comments and Sheeri mentions mysqlsla, which I have already tried. Back to square one. I might look at Munin again and see if someone has written a plug-in that graphs query type, but that seems too much hassle for my situation. I have the raw data and I want the appropriate tool to analyze it.

Part of the reason is that the data is from a MyISAM-based web app and I am writing a report recommending it be moved to a transactional storage engine. AIUI a rule of thumb is that if between 15% and 20% of statements are non-SELECT/INSERT, then one can obtain equal or better performance with something like InnoDB. That being said, the benefits of InnoDB (good recovery features, transactions, advanced indexes, foreign key constraints) make it a good default choice and I will recommend it anyway. Plus we have order processing going on, and it seems right to have atomicity in that process. It would be a bit better, though, to confidently state that the query mix skews it towards InnoDB... if I can only prove it :-)
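[Editor's note: checking the rule of thumb above is simple arithmetic once statement counts exist. A small sketch; the counts below are made-up placeholders, not numbers from the poster's log, and "writes" here follows the question's own definition (delete, insert, replace, update).]

```python
def write_fraction(counts):
    """Fraction of statements that are writes (DELETE/INSERT/REPLACE/UPDATE)."""
    writes = sum(counts.get(v, 0) for v in ('DELETE', 'INSERT', 'REPLACE', 'UPDATE'))
    total = writes + counts.get('SELECT', 0)
    return writes / total if total else 0.0

# Hypothetical counts, purely illustrative:
counts = {'SELECT': 8000, 'UPDATE': 1200, 'INSERT': 600, 'DELETE': 150, 'REPLACE': 50}
ratio = write_fraction(counts)   # 2000 writes out of 10000 statements
```

Feed it the verb counter from a log-parsing pass and the "is the write share past 15-20%?" question answers itself.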
ANN: Database Workbench Pro 3, 25% discount for two more weeks!
Ladies, gentlemen,

Database Workbench 3.4 Pro is available with a 25% discount now! You will get the next major version for free when it is released later this month. Order now and use the coupon code DBW3X when ordering.

Database Workbench is a multi-DBMS development tool; for more information, see http://www.upscene.com/go/?go=dbw

Database Workbench supports:
- Borland InterBase ( 4.x - 8.x )
- Firebird ( 1.x, 2.x )
- MS SQL Server/MSDE ( v6.5, 7, 2000, 2005, 2008, MSDE 1 & 2, SQL Express )
- MySQL 4.x, 5.x
- Oracle Database ( 8i, 9i, 10g, 11g )
- Sybase SQL Anywhere ( 9, 10 and 11 )
- NexusDB ( 2.08.x, 3.0.x )

With regards,

Martijn Tonies
Database Workbench - the database developer tool for professionals
Upscene Productions
http://www.upscene.com
Re: Weeks
For the week of any date @d:

... WHERE order_date BETWEEN AddDate(@d, -DayOfWeek(@d)+1)
                         AND AddDate(@d, 7-DayOfWeek(@d)) ...

PB

ML wrote:
[quoted original snipped]
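[Editor's note: the DAYOFWEEK arithmetic above (Sunday=1 ... Saturday=7) is easy to sanity-check outside MySQL. A sketch in Python mirroring the same offsets; `week_bounds` is an illustrative name, not a MySQL function.]

```python
from datetime import date, timedelta

def week_bounds(d):
    """Sunday..Saturday week containing d, mirroring
    AddDate(d, -DayOfWeek(d)+1) .. AddDate(d, 7-DayOfWeek(d))."""
    dow = d.isoweekday() % 7 + 1          # map to MySQL DAYOFWEEK: Sun=1 .. Sat=7
    start = d + timedelta(days=-dow + 1)  # back up to Sunday
    end = d + timedelta(days=7 - dow)     # forward to Saturday
    return start, end
```

For the thread's example week, any date from 2009-12-27 through 2010-01-02 maps to exactly those bounds, including the year-spanning case that trips up WEEK()-based comparisons.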
Re: Weeks
If all you want is the current week then the query is simple:

SELECT * FROM orders WHERE WEEK(orders.order_date) = WEEK(NOW())

The default mode starts the week on Sunday, so the second argument is not needed. (Note: the valid mode values for WEEK() are 0 - 7, and mode 7 starts the week on Monday, so WEEK(NOW(), 7) is not equivalent to the default WEEK(NOW(), 0).)

As for performance, I tested the query against a table I have with 199,826 rows - it returned the data in 0.016 seconds (selecting distinct week(date)), and selecting * returned 3,816 rows in 0.827 seconds.

One concern will be when the data spans years - in that case you will need to also check the year:

SELECT * FROM orders WHERE WEEK(orders.order_date) = WEEK(NOW()) AND YEAR(orders.order_date) = YEAR(NOW())

- Original Message -
From: ML mailingli...@mailnewsrss.com
To: mysql@lists.mysql.com
Sent: Monday, December 28, 2009 5:14 PM
Subject: Weeks

[quoted original snipped]
Weeks
Hi All, trying to write some SQL that will give me records for the CURRENT WEEK. Example: starting on a Sunday and going through Saturday. This week it would be Dec 27 - Jan 2. I am doing this so I can write a query that will show orders that are placed during the current week. Here is what I have, but this is showing from today for the next seven days:

SELECT * FROM orders
WHERE WEEK(NOW(), 7) = WEEK(orders.order_date, 7)
AND DATEDIFF(NOW(), orders.order_date) < 7;

Would anyone have any advice?

-Jason
RE: Weeks
See: http://gtowey.blogspot.com/2009/04/how-to-select-this-wednesday-or-other.html

Just calculate the two dates, and use

WHERE order_date BETWEEN (calculated start date) AND (calculated end date)

This avoids using functions on the actual column where possible, since wrapping the column in a function prevents MySQL from using an index to find your query results.

Regards,
Gavin Towey

-Original Message-
From: ML [mailto:mailingli...@mailnewsrss.com]
Sent: Monday, December 28, 2009 4:15 PM
To: mysql@lists.mysql.com
Subject: Weeks

[quoted original snipped]
Re: Query to Select records in the last 4 weeks
Hi, I have tried to use this query:

SELECT count(smsc_id) AS total, insertdate
FROM momtbak
WHERE insertdate BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 4 WEEK) AND CURRENT_DATE()
GROUP BY week(date_format(insertdate,'%Y-%m-%d'),3)

to group records in the last 4 weeks by week. But the result returns this list:

144 2008-11-06 07:00:24
1883 2008-11-10 07:00:06
1645 2008-11-17 11:59:46
2476 2008-11-24 21:54:11
1015 2008-12-01 20:45:43

The result I expected is for the date shown above to be the week number. What do I miss here? TIA

Willy
[SOLVED] Re: Query to Select records in the last 4 weeks
Hi, I finally found the solution:

SELECT count( smsc_id ) AS total, week( insertdate ) AS tanggal
FROM momtbak
WHERE insertdate BETWEEN DATE_SUB( CURRENT_DATE( ), INTERVAL 4 WEEK ) AND CURRENT_DATE( )
GROUP BY week( insertdate )

Willy
Query to Select records in the last 4 weeks
Hi, I get stuck trying to build a query to select records between curdate() and the last 4 weeks, grouped by week. I tested with:

SELECT * FROM momtbak WHERE insertdate BETWEEN curdate( ) AND curdate( ) - INTERVAL 4 week

It doesn't work. Please help, TIA.

Willy

Every why hath a wherefore. -- William Shakespeare, A Comedy of Errors
Re: Query to Select records in the last 4 weeks
On Thu, 2008-12-04 at 08:27 +0700, sangprabv wrote:
[quoted question snipped]

http://dev.mysql.com/doc/refman/5.0/en/comparison-operators.html#operator_between
http://dev.mysql.com/doc/refman/4.1/en/date-and-time-functions.html#function_date-add

Untested, but something like this maybe? (BETWEEN requires the lower bound first, which is why your version returned nothing.)

SELECT * FROM momtbak WHERE insertdate BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 4 WEEK) AND CURRENT_DATE();
Re: Query to Select records in the last 4 weeks
Hi, thanks for the prompt reply. I have tested yours and it seems to be working. What about grouping the result by week? TIA.

Willy

Your life would be very empty if you had nothing to regret.

-Original Message-
From: Daevid Vincent [EMAIL PROTECTED]
To: mysql@lists.mysql.com
Cc: sangprabv [EMAIL PROTECTED]
Subject: Re: Query to Select records in the last 4 weeks
Date: Wed, 03 Dec 2008 17:52:32 -0800

SELECT * FROM momtbak WHERE insertdate BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 4 WEEK) AND CURRENT_DATE();
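[Editor's note: the grouping step the poster asks about can be sketched outside SQL as well. Note an assumption: Python's isocalendar() week starts on Monday, whereas MySQL's default WEEK() mode starts on Sunday, so this illustrates the bucketing idea rather than reproducing WEEK()'s exact numbers.]

```python
from collections import Counter
from datetime import datetime, timedelta

def count_by_week(timestamps, weeks=4, now=None):
    """Count rows per (ISO year, ISO week) within the last `weeks` weeks.
    ISO weeks start Monday; MySQL's default WEEK() mode starts Sunday."""
    now = now or datetime.now()
    cutoff = now - timedelta(weeks=weeks)
    buckets = Counter()
    for ts in timestamps:
        if cutoff <= ts <= now:
            year, week, _ = ts.isocalendar()
            buckets[(year, week)] += 1
    return buckets
```

Keying on (year, week) rather than week alone also sidesteps the year-spanning problem raised earlier in the "Weeks" thread.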
Only 3 weeks left for PostgreSQL conference
Hello,

I know this is a *little* off topic but it is about Open Source databases :)

There are only three weeks left to register for the PostgreSQL Community Conference: East! The conference is scheduled on March 29th and 30th (a Saturday and Sunday) at the University of Maryland. Come join us as Bruce Momjian and Joshua Drake have a round table open to any PostgreSQL question from the community. If a round table isn't your cup of tea, visit our talks page where you can pick to learn from over 20 experts in the field.

http://www.postgresqlconference.org/talks/

To register just point that old fashion web browser over to:

http://www.postgresqlconference.org/

All registrations and sponsorships are donations to PostgreSQL via Software in the Public Interest, Inc., a 501(c)3 non-profit corporation.

Thanks again to our Community Conference Sponsors:

Organizational Sponsor: Command Prompt, Inc. http://www.commandprompt.com/
Silver Sponsor: EnterpriseDB http://www.enterprisedb.com
Talk Sponsors: Afilias http://www.afilias.org/ - Continuent http://www.continuent.com/ - Sun http://www.sun.com/ - Truviso http://www.truviso.com/ - Xtuple http://www.xtuple.com/
Meal Sponsors: OTG http://www.otg-nc.com/
General Sponsor: Emma http://www.myemma.com/

Sincerely,

Joshua D. Drake

--
The PostgreSQL Company since 1997: http://www.commandprompt.com/
PostgreSQL Community Conference: http://www.postgresqlconference.org/
Donate to the PostgreSQL Project: http://www.postgresql.org/about/donate
PostgreSQL SPI Liaison | SPI Director | PostgreSQL political pundit
Re: select datetime older than X weeks
From: Frank Bax
[quoted question snipped]

SELECT (...) WHERE `datetime_field` < NOW() - INTERVAL (7*X) DAY

This way you compare the datetime field with a constant (the expression results in a constant), so MySQL can use an index if there is one for the datetime field.

Regards, Jigal.
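[Editor's note: the same "constant on the right-hand side" idea works when the cutoff is computed in the script instead of in SQL, which also sidesteps the missing DATE()/DATEDIFF support on 4.0.20. A hedged sketch; the table and column names are placeholders, and real code should use the driver's parameter binding rather than string formatting.]

```python
from datetime import datetime, timedelta

def older_than_cutoff(weeks, now=None):
    """Cutoff timestamp for 'older than X weeks': rows with
    datetime_field < cutoff qualify. Same as NOW() - INTERVAL (7*X) DAY."""
    now = now or datetime.now()
    return now - timedelta(days=7 * weeks)

def build_query(weeks, now=None):
    # Placeholder table/column names, illustration only.
    cutoff = older_than_cutoff(weeks, now)
    return ("SELECT * FROM t WHERE datetime_field < '%s'"
            % cutoff.strftime('%Y-%m-%d %H:%M:%S'))
```

Because the comparison side is a literal, the query stays sargable and an index on the datetime column remains usable, exactly as in Jigal's SQL version.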
select datetime older than X weeks
I'm using OpenBSD 3.6 (latest version), which comes with binary packages for MySQL 4.0.20 - more recent binary packages are not yet available for this platform, and installation from source is not an option.

I have a table with a datetime field and I would like to select all data older than X weeks, where X is a variable in my php script. DateDiff(datetime, now()) looks like the right function for this purpose, but this function requires date arguments, and date() isn't in 4.0.20 to convert my field from datetime to date. I have tried many different combinations, only to find a function isn't on my system or my data is not the proper type for functions that I do have. Can anyone suggest a workable SQL query for this situation?
Re: Restore help! been going 2 weeks
Those are pretty much standard settings. I had ours set almost exactly like that, and performance was worse.
RE: Restore help! been going 2 weeks
Ours is InnoDB. We are not getting any performance problems with these settings; it has been working fine for the last year. To my knowledge, the performance issue you are seeing is due to the MyISAM table type.

Thanks
Anil

-Original Message-
From: matt_lists [mailto:[EMAIL PROTECTED]
Sent: Wednesday, December 22, 2004 7:12 PM
To: Anil Doppalapudi
Cc: mysql@lists.mysql.com
Subject: Re: Restore help! been going 2 weeks

[quoted message snipped]
Re: Restore help! been going 2 weeks
Anil Doppalapudi wrote:
[quoted message snipped]

Based on this email list, MyISAM is preferred for heavy query/index use.

We do not insert records into these tables during the day, only once per night. No deletes from these tables. All are optimized every Saturday night.

They are just freakin huge - too big for MySQL, I think.
RE: Restore help! been going 2 weeks
here is my my.cnf file contents:

# The MySQL server
[mysqld]
port = 3306
socket = /tmp/mysql.sock
skip-locking
key_buffer = 384M
max_allowed_packet = 1M
table_cache = 512
sort_buffer_size = 2M
read_buffer_size = 2M
myisam_sort_buffer_size = 64M
thread_cache = 40
query_cache_size = 32M
max_connections = 500
# Try number of CPU's*2 for thread_concurrency
thread_concurrency = 8
wait_timeout = 288000

#skip-networking

# Replication Master Server (default)
# binary logging is required for replication. Added by Anil for replication:
server-id = 1
log-bin = /usr/local/mysql/data/extended/neptune-bin.1
set-variable = max_binlog_size=500M

# [stock sample-config comment blocks on skip-networking and on configuring
# a replication slave snipped - every line in them is commented out]

# Point the following paths to different dedicated disks
#tmpdir = /tmp/
tmpdir = /home/mysql/tmp/
#log-update = /path-to-dedicated-directory/hostname

# Uncomment the following if you are using BDB tables
#bdb_cache_size = 384M
#bdb_max_lock = 10

# Uncomment the following if you are using InnoDB tables
innodb_data_home_dir = /usr/local/mysql/data/
innodb_data_file_path = ibdata1:4000M;ibdata2:4000M;ibdata3:4000M;ibdata4:4000M;ibdata5:4000M;ibdata6:4000M;ibdata7:4000M;ibdata8:4000M;ibdata9:4000M;ibdata10:4000M;ibdata11:4000M;ibdata12:4000M;extended/ibdata13:4000M;extended/ibdata14:4000M;extended/ibdata15:4000M;extended/ibdata16:4000M;extended/ibdata17:4000M;extended/ibdata18:4000M;extended/ibdata19:4000M;extended/ibdata20:4000M;extended/ibdata21:4000M;extended/ibdata22:4000M
innodb_log_group_home_dir = /usr/local/mysql/data/
innodb_log_arch_dir = /usr/local/mysql/data/
# You can set .._buffer_pool_size up to 50 - 80 %
# of RAM but beware of setting memory usage too high
innodb_buffer_pool_size = 448M
innodb_additional_mem_pool_size = 20M
# Set .._log_file_size to 25 % of buffer pool size
innodb_log_file_size = 100M
innodb_log_buffer_size = 8M
innodb_flush_log_at_trx_commit = 0
innodb_lock_wait_timeout = 50

-Original Message-
From: matt_lists [mailto:[EMAIL PROTECTED]
Sent: Monday, December 20, 2004 5:56 PM
To: matt_lists
Cc: Anil Doppalapudi; [EMAIL PROTECTED]
Subject: Re: Restore help! been going 2 weeks

[quoted message snipped]
Re: Restore help! been going 2 weeks
matt_lists wrote: Can you post your my.ini or my.cnf, since your restore actually worked? Mine is not swapping; in fact, mysql is only using 610,824 K of RAM, and there is still over 1 gig of RAM free. Our next server will have 16 gig of RAM and quad Xeons; I'm going to completely disable the swap files. Any chance of getting a copy of your my.ini/my.cnf? -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe: http://lists.mysql.com/[EMAIL PROTECTED]
Re: Restore help! been going 2 weeks
matt_lists wrote: matt_lists wrote: Can you post your my.ini or my.cnf, since your restore actually worked? Mine is not swapping; in fact, mysql is only using 610,824 K of RAM, and there is still over 1 gig of RAM free. Our next server will have 16 gig of RAM and quad Xeons; I'm going to completely disable the swap files. Any chance of getting a copy of your my.ini/my.cnf?

PS: I don't agree with you; I do not think InnoDB is faster.

| Id | User   | Host           | db     | Command | Time   | State             | Info                                  |
|  1 | xotech | localhost:3073 | NULL   | Sleep   |      7 |                   | NULL                                  |
| 12 | xotech | localhost:3115 | finlog | Query   | 493381 | copy to tmp table | alter table bragg_stat engine=innodb  |
| 15 | xotech | localhost:2868 | NULL   | Query   |      0 | NULL              |                                       |

No other threads on this box. 137 hours so far; the innodb file is now up to 19 gig.
Re: Restore help! been going 2 weeks
Are you sure InnoDB is better for tables larger than 4 gig?

| 12 | xotech | localhost:3115 | finlog | Query | 238224 | copy to tmp table | alter table bragg_stat engine=innodb pack_keys=0 |
| 14 | xotech | localhost:3356 | NULL   | Query |      0 | NULL              |                                                  |

It's been going 66 hours, and all I'm doing is changing one table from MyISAM with packed keys to InnoDB without packed keys. This is on a stand-alone server with no other processes and no users. Something tells me InnoDB is not the answer for me.

Also, how do you do a merge table on InnoDB? I've got merge tables to split the data up by fiscal year; the table I'm restoring is just for this year, and we are still in the first quarter.
RE: Restore help! been going 2 weeks
Please check your my.cnf parameters against the recommended values given in the my-huge.cnf file in support-files. To my knowledge, for 2 GB RAM the values you have set are very high. I think the system is going into heavy swapping; that might be the reason it is going very slowly.

Anil

-----Original Message-----
From: matt_lists [mailto:[EMAIL PROTECTED]]
Sent: Friday, December 17, 2004 12:11 AM
To: Anil Doppalapudi
Cc: [EMAIL PROTECTED]
Subject: Re: Restore help! been going 2 weeks

Anil Doppalapudi wrote: It is not normal. I have restored a 90 GB database in 2 days on a Dell server with 2 GB RAM. Are you able to connect to mysql? If so, at the mysql prompt type the command below and check what processes are running: mysql> show processlist; If any insert statements are running, then the restoration process is still going; then check your my.cnf parameters.

This is a Dell with 2 gig of RAM:

set-variable = key_buffer_size=1500M
set-variable = join_buffer_size=512M
set-variable = max_allowed_packet=384M
set-variable = delay_key_write=ALL
set-variable = table_cache=512
set-variable = sort_buffer_size=256M
set-variable = record_buffer=384M
set-variable = thread_cache=8
set-variable = myisam_sort_buffer_size=384M
myisam-recover = BACKUP,FORCE
set-variable = read_buffer_size=384M
set-variable = interactive_timeout=7200
set-variable = wait_timeout=7200
set-variable = thread_concurrency=4

mysql> show processlist;
+----+--------+----------------+--------+---------+------+--------+------------------------------------------------------------------------------------------------------+
| Id | User   | Host           | db     | Command | Time | State  | Info                                                                                                 |
+----+--------+----------------+--------+---------+------+--------+------------------------------------------------------------------------------------------------------+
|  2 | xotech | localhost:1810 | finlog | Query   |   75 | update | INSERT INTO `321st_stat` VALUES ('AE1','AF4','013000887','EA','1','W918AR42800179','R','04280','','B |
|  6 | xotech | localhost:1634 | NULL   | Query   |    0 | NULL   | show processlist                                                                                     |
+----+--------+----------------+--------+---------+------+--------+------------------------------------------------------------------------------------------------------+
2 rows in set (0.00 sec)

mysql>
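One way to see why Anil suspects swapping is to add the global key buffer to the per-connection buffers in the config above. A rough upper-bound sketch: MySQL allocates per-connection buffers lazily, record_buffer is the old alias for read_buffer_size (so it is counted once), and myisam_sort_buffer_size only applies during index builds and repairs:

```python
# Worst-case memory budget (MiB) for the my.cnf shown above.
# Upper bound only: per-connection buffers are allocated lazily.
global_buffers_mb = {"key_buffer_size": 1500}
per_connection_mb = {
    "join_buffer_size": 512,
    "sort_buffer_size": 256,
    "read_buffer_size": 384,       # record_buffer is its old alias; counted once
    "myisam_sort_buffer_size": 384, # used during index build/repair
}

def worst_case_mb(connections):
    return sum(global_buffers_mb.values()) + connections * sum(per_connection_mb.values())

print(worst_case_mb(1))  # 3036 -- already past 2 GB with one busy connection
```

Even a single connection exercising all of these buffers can demand about 3 GB on a 2 GB machine, which is consistent with the swapping theory.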
Re: Restore help! been going 2 weeks
Can you post your my.ini or my.cnf, since your restore actually worked? Mine is not swapping; in fact, mysql is only using 610,824 K of RAM, and there is still over 1 gig of RAM free. Our next server will have 16 gig of RAM and quad Xeons; I'm going to completely disable the swap files.
Re: Restore help! been going 2 weeks
The restore is still running. Is this normal? How do you all deal with customers who do not have their data for almost 3 weeks, with no end in sight? I've had Oracle crashes before, and the restores were very simple; this is not. I am very disappointed with MySQL's performance on files over 5 gig.

Almost all my MYD files are more than 4 gig; I was not aware of this limitation. I tested with InnoDB and found it horribly slow for what we do.

Anil Doppalapudi wrote: Check your .MYD file size. If the table type is MyISAM and it is more than 4 GB, then convert it to InnoDB. --Anil
RE: Restore help! been going 2 weeks
It is not normal. I have restored a 90 GB database in 2 days on a Dell server with 2 GB RAM. Are you able to connect to mysql? If so, at the mysql prompt type the command below and check what processes are running: mysql> show processlist; If any insert statements are running, then the restoration process is still going; then check your my.cnf parameters.

Anil

-----Original Message-----
From: matt_lists [mailto:[EMAIL PROTECTED]]
Sent: Thursday, December 16, 2004 6:15 PM
To: [EMAIL PROTECTED]
Subject: Re: Restore help! been going 2 weeks

The restore is still running. Is this normal? How do you all deal with customers who do not have their data for almost 3 weeks, with no end in sight? I've had Oracle crashes before, and the restores were very simple; this is not. I am very disappointed with MySQL's performance on files over 5 gig. Almost all my MYD files are more than 4 gig; I was not aware of this limitation. I tested with InnoDB and found it horribly slow for what we do.

Anil Doppalapudi wrote: Check your .MYD file size. If the table type is MyISAM and it is more than 4 GB, then convert it to InnoDB. --Anil
Re: Restore help! been going 2 weeks
Almost all my MYD files are more than 4 gig; I was not aware of this limitation. I tested with InnoDB and found it horribly slow for what we do.

Anil Doppalapudi wrote: Check your .MYD file size. If the table type is MyISAM and it is more than 4 GB, then convert it to InnoDB. --Anil
Restore help! been going 2 weeks
Need help with mysql restore speed.

A table crashed and I had to restore from backup. I started the restore 2 weeks ago; the last change date on the files is the 8th, so mysql has not written data into the files since then, but it's still running! (Or is it?) The data restore is pretty quick; the index restore is VERY slow, if it even functions. How does everyone deal with this issue, or is this not an issue for anyone else?

Here's what I use to back up:

mysqldump --opt finlog 321st_stat > /intranet/backup/321st_stat.sql

Using this to restore:

mysql -f < 321st_stat.sql

Server is a dual 2.8 Xeon with 1 gig of RAM, and the only job running is the restore. Windows 2000, MySQL 4.1.7, no options set in the ini file. The table has approximately 80 million records.

CREATE TABLE `321st_stat` (
  `dic` char(3) NOT NULL default '',
  `fr_ric` char(3) NOT NULL default '',
  `niin` char(11) NOT NULL default '',
  `ui` char(2) NOT NULL default '',
  `qty` char(5) NOT NULL default '',
  `don` char(14) NOT NULL default '',
  `suf` char(1) NOT NULL default '',
  `dte_txn` char(5) NOT NULL default '',
  `ship_to` char(3) NOT NULL default '',
  `sta` char(2) NOT NULL default '',
  `lst_sos` char(3) NOT NULL default '',
  `esd` char(4) NOT NULL default '',
  `stor` char(3) NOT NULL default '',
  `d_t` char(4) NOT NULL default '',
  `ctasc` char(10) NOT NULL default '',
  PRIMARY KEY (`dic`,`niin`,`fr_ric`,`don`,`suf`,`dte_txn`,`sta`),
  KEY `don` (`don`),
  KEY `niin` (`niin`),
  KEY `stor` (`stor`),
  KEY `dic` (`dic`),
  KEY `ctasc` (`ctasc`)
) ENGINE=MyISAM PACK_KEYS=1
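The CREATE TABLE above makes it easy to see why this table runs into MyISAM's default 4 GB cap later in the thread: the fixed-width columns alone put 80 million rows well past it. A back-of-the-envelope sketch that ignores per-row overhead and index storage, so the real .MYD/.MYI files will be larger:

```python
# Rough data-size estimate for the 80M-row `321st_stat` table above.
# Fixed-width char columns only; per-row overhead and indexes ignored.
col_widths = {"dic": 3, "fr_ric": 3, "niin": 11, "ui": 2, "qty": 5,
              "don": 14, "suf": 1, "dte_txn": 5, "ship_to": 3, "sta": 2,
              "lst_sos": 3, "esd": 4, "stor": 3, "d_t": 4, "ctasc": 10}
rows = 80_000_000
row_bytes = sum(col_widths.values())
data_gib = rows * row_bytes / 1024**3
print(row_bytes, round(data_gib, 1))  # 73 bytes/row, ~5.4 GiB of row data
```

Around 5.4 GiB of raw row data, before indexes, matches the "files over 5 gig" figure mentioned elsewhere in the thread.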
Re: Restore help! been going 2 weeks
Nobody else has problems with restores on 8+ gig tables?
RE: Restore help! been going 2 weeks
Check your .MYD file size. If the table type is MyISAM and it is more than 4 GB, then convert it to InnoDB. --Anil

-----Original Message-----
From: matt_lists [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, December 14, 2004 8:36 PM
Cc: [EMAIL PROTECTED]
Subject: Re: Restore help! been going 2 weeks

Nobody else has problems with restores on 8+ gig tables?
Re: MySQL Lost 3 Weeks of Data
Michael, check all the connectivity settings (host and port) of all the software that you use (backup scripts, for example), especially if you connect via TCP/IP instead of via sockets. Check the error log, too. Did you move something (the database?) on January 20th? From our own experience it sounds to me like it is most likely a failure in the human interface, but I may be totally wrong. Good luck! Regards, Frank.

Schmuck, Michael schrieb: Hello. I've got a big problem. My MySQL server yesterday lost all data since January 20th. The facts: we have a daily tape backup. All our tapes (Monday through Friday tapes; the oldest one is Wednesday, February 4th) hold the database as of the evening of January 19th. Yesterday at about 14 o'clock we restarted the daemon on our BSD server, which had been up since September '03. I believe the daemon didn't write the data into the files. At the restart, the database loaded the old files and lost everything newer than the 20th. Everyone in my company says that's impossible (losing data during a restart of the daemon). Everyone else says something is wrong with the database update software, but the user worked all of February with this data. Wednesday morning he said he could work with it, but after our restart the database was back at January 19th. Does anyone have ideas why this happened? I can't explain it! grds, Michael Schmuck p.s. sorry for bad english

-- Dr. Frank Ullrich, DBA Netzwerkadministration, Heise Zeitschriften Verlag GmbH & Co KG, Helstorfer Str. 7, D-30625 Hannover. E-Mail: [EMAIL PROTECTED] Phone: +49 511 5352 587; FAX: +49 511 5352 538
Re: MySQL Lost 3 Weeks of Data
Schmuck, Michael wrote: I've got a big problem. My MySQL server yesterday lost all data since January 20th. Yesterday at about 14 o'clock we restarted the daemon on our BSD server, which had been up since September '03. I believe the daemon didn't write the data into the files. At the restart, the database loaded the old files and lost everything newer than the 20th.

What do you mean, "it loaded the old files and lost everything newer than the 20th"? Did he restore from a backup tape? If so, then it's probably a problem with your backup system not properly backing up the MySQL tables. While the mysqld process is running, the files are in use by mysqld, and many tape-backup programs cannot properly copy files that are in use.
MySQL Lost 3 Weeks of Data
Hello. I've got a big problem. My MySQL server yesterday lost all data since January 20th. The facts: we have a daily tape backup. All our tapes (Monday through Friday tapes; the oldest one is Wednesday, February 4th) hold the database as of the evening of January 19th. Yesterday at about 14 o'clock we restarted the daemon on our BSD server, which had been up since September '03. I believe the daemon didn't write the data into the files. At the restart, the database loaded the old files and lost everything newer than the 20th. Everyone in my company says that's impossible (losing data during a restart of the daemon). Everyone else says something is wrong with the database update software, but the user worked all of February with this data. Wednesday morning he said he could work with it, but after our restart the database was back at January 19th. Does anyone have ideas why this happened? I can't explain it! grds, Michael Schmuck p.s. sorry for bad english
Re: MySQL Lost 3 Weeks of Data
First and foremost, your English is not even remotely bad! You should hear half of my native-English-speaking friends! Can you give us some more information, such as the server configuration, OS, filesystem, MySQL version, table types in use, table size, size of the data gone missing, backup methods, and anything else worthy of note (forms of logical volume management, impatient users that reboot servers because they can't look at www.[insert terrible site here].com due to proxy/firewall restrictions)? Regards, Chris

Schmuck, Michael wrote: Hello. I've got a big problem. My MySQL server yesterday lost all data since January 20th. The facts: we have a daily tape backup. All our tapes (Monday through Friday tapes; the oldest one is Wednesday, February 4th) hold the database as of the evening of January 19th. Yesterday at about 14 o'clock we restarted the daemon on our BSD server, which had been up since September '03. I believe the daemon didn't write the data into the files. At the restart, the database loaded the old files and lost everything newer than the 20th. Everyone in my company says that's impossible (losing data during a restart of the daemon). Everyone else says something is wrong with the database update software, but the user worked all of February with this data. Wednesday morning he said he could work with it, but after our restart the database was back at January 19th. Does anyone have ideas why this happened? I can't explain it! grds, Michael Schmuck p.s. sorry for bad english
Repairing a large table takes weeks!
Hi,

About two weeks ago I received "The table Worklist is full" error. Since then I have been struggling to update the table indexes beyond 32 bits, as suggested by the MySQL documentation. I have tried the following methods (I have two identical MySQL databases on two identical computers):

- ALTER TABLE Worklist MAX_ROWS=10. This created a new database file (#sql-50c_2829.*) in about 1.5 days. After this, the file sizes remained constant and nothing seemed to be happening. I let it run for 7 days, after which I killed the MySQL process.
- Created a new table and did INSERT INTO Worklist_Temp SELECT * FROM Worklist. This has now been running for about 7 days on one of my servers, and it shows no signs of finishing (MySQL 4.0.14b).
- As above, but with ALTER TABLE Worklist_Temp DISABLE KEYS. The INSERT phase went through quickly (a few hours). After that I ran ALTER TABLE Worklist_Temp ENABLE KEYS, which is now rebuilding the database. It has now been running for 3 days (MySQL 4.0.12).
- SHOW PROCESSLIST shows that during the slow periods, the MySQL server is doing "Repair with keycache". As suggested by the documentation, I tried setting myisam_sort_buffer_size to 180 MB (I have 256 MB of memory in the computer). This didn't change anything, so I think the indexes were too large to be repaired by sorting.
- I have defragmented the disks on both computers.
- CHECK TABLE Worklist on the original table returns no errors.

During this process I have been using the following MySQL versions: 3.23.49, 4.0.12 and 4.0.14b. The computers are running Windows XP and using NTFS volumes. Both computers are workstation-grade with a 1.0 GHz processor, 256 MB RAM and a 40 GB HD. The hard drives have about 20 GB of free space. If anyone has any ideas how to get this update done, I'd appreciate them a lot! The servers were in production use, and production has been halted for the past 2 weeks because of these problems...
How long has MySQL taken for others when upgrading 4 GB databases after the "The table Worklist is full" error? -- Mikko Noromaa ([EMAIL PROTECTED]) - tel. +358 40 7348034 Noromaa Solutions - see http://www.nm-sol.com/
Re: Repairing a large table takes weeks!
About two weeks ago I received "The table Worklist is full" error. Since

What type of table? MyISAM? How big is 'full'? "If you don't specify any of the above options, the maximum size for a table will be 4G (or 2G if your operating system only supports 2G tables)." I don't know how big files can be on NTFS/WinXP. -- Sebastian Mendel www.sebastianmendel.de www.tekkno4u.de www.nofetish.com
RE: Repairing a large table takes weeks!
Hi,

About two weeks ago I received "The table Worklist is full" error.

What type of table? MyISAM? How big is 'full'?

Sorry, I forgot to mention. The table is MyISAM with 32-bit file pointers, which makes its maximum size 4 GB. This is the limit I hit. Windows XP with NTFS has no file-size limits at 4 GB (they are somewhere in the TB range, I believe). -- Mikko Noromaa ([EMAIL PROTECTED]) - tel. +358 40 7348034 Noromaa Solutions - see http://www.nm-sol.com/

-----Original Message-----
From: Sebastian Tobias Mendel genannt Mendelsohn [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, September 16, 2003 1:52 PM
To: [EMAIL PROTECTED]
Subject: Re: Repairing a large table takes weeks!

About two weeks ago I received "The table Worklist is full" error. Since

What type of table? MyISAM? How big is 'full'? "If you don't specify any of the above options, the maximum size for a table will be 4G (or 2G if your operating system only supports 2G tables)." I don't know how big files can be on NTFS/WinXP. -- Sebastian Mendel www.sebastianmendel.de www.tekkno4u.de www.nofetish.com
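The 4 GB cap Mikko mentions falls straight out of the pointer width: a MyISAM data file can only be as large as its row pointer can address, and each extra pointer byte multiplies the cap by 256. A sketch of the arithmetic (myisam_data_pointer_size, the server-wide knob, arrived in later MySQL versions than the ones in this thread, so it is mentioned here only as an aside):

```python
# MyISAM table size cap as a function of data pointer width.
# The default in these versions is 4 bytes (32 bits); MAX_ROWS /
# AVG_ROW_LENGTH on the table make MySQL pick a wider pointer.
def myisam_max_bytes(pointer_bytes):
    return 2 ** (8 * pointer_bytes)

GIB = 1024 ** 3
print(myisam_max_bytes(4) // GIB, "GiB")  # 4 GiB with 32-bit pointers
print(myisam_max_bytes(5) // GIB, "GiB")  # 1024 GiB with 40-bit pointers
```

So going from 32-bit to 40-bit pointers, as the ALTER TABLE attempts in this thread aim to do, lifts the ceiling from 4 GiB to 1 TiB.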
Incorrect DATE_FORMAT output for weeks with year e.g. '%v-%y'
Mysql gives what I'd call incorrect output when outputting weeks and years at the end of the year. PHP handles this correctly: if I do a date("W-y", $date) for '2001-12-31' I get '01-02', but in mysql you get the wrong year: '01-01'. Here's an example (the second one is not what I'd expect):

SELECT DATE_FORMAT('2001-12-25', '%v-%y');
+------------------------------------+
| DATE_FORMAT('2001-12-25', '%v-%y') |
+------------------------------------+
| 52-01                              |
+------------------------------------+

SELECT DATE_FORMAT('2001-12-31', '%v-%y');
+------------------------------------+
| DATE_FORMAT('2001-12-31', '%v-%y') |
+------------------------------------+
| 01-01                              |
+------------------------------------+

SELECT DATE_FORMAT('2002-01-05', '%v-%y');
+------------------------------------+
| DATE_FORMAT('2002-01-05', '%v-%y') |
+------------------------------------+
| 01-02                              |
+------------------------------------+

I'm using mysql version 3.23.51-log, and I haven't seen anything about DATE_FORMAT in the changelog since. Do people agree that this is a bug? Is there any workaround for this? Thanks, Joe

- Before posting, please check: http://www.mysql.com/manual.php (the manual) http://lists.mysql.com/ (the list archive) To request this thread, e-mail [EMAIL PROTECTED] To unsubscribe, e-mail [EMAIL PROTECTED] Trouble unsubscribing? Try: http://lists.mysql.com/php/unsubscribe.php
RE: Incorrect DATE_FORMAT output for weeks with year e.g. '%v-%y'
PHP handles this correctly: if I do a date("W-y", $date) for '2001-12-31' I get '01-02', but in mysql you get the wrong year: '01-01'.

I don't know where you got this data from, but the second number would be the year; that means mysql is showing the right year and php is showing the wrong year (xx-01 is right).

Here's an example (the second one is not what I'd expect):

SELECT DATE_FORMAT('2001-12-25', '%v-%y');
+------------------------------------+
| DATE_FORMAT('2001-12-25', '%v-%y') |
+------------------------------------+
| 52-01                              |
+------------------------------------+

SELECT DATE_FORMAT('2001-12-31', '%v-%y');
+------------------------------------+
| DATE_FORMAT('2001-12-31', '%v-%y') |
+------------------------------------+
| 01-01                              |
+------------------------------------+

SELECT DATE_FORMAT('2002-01-05', '%v-%y');
+------------------------------------+
| DATE_FORMAT('2002-01-05', '%v-%y') |
+------------------------------------+
| 01-02                              |
+------------------------------------+

I'm using mysql version 3.23.51-log, and I haven't seen anything about DATE_FORMAT in the changelog since. Do people agree that this is a bug? Is there any workaround for this?

From the documentation: "%v Week (01..53), where Monday is the first day of the week. Used with '%x'". Should you be using %x with this? "%x Year for the week, where Monday is the first day of the week, numeric, 4 digits, used with '%v'". I don't really know how you would use them with each other to get the right date, but you must be able to do it some way. "%u Week (00..53), where Monday is the first day of the week":

SELECT DATE_FORMAT('2001-12-31', '%u-%y');
+------------------------------------+
| DATE_FORMAT('2001-12-31', '%u-%y') |
+------------------------------------+
| 53-01                              |
+------------------------------------+

Note that the week can return 53; perhaps this is being wrapped around to week 1 when using %v.
Re: Incorrect DATE_FORMAT output for weeks with year e.g. '%v-%y'
PHP handles this correctly: if I do a date("W-y", $date) for '2001-12-31' I get '01-02', but in mysql you get the wrong year: '01-01'.

I don't know where you got this data from, but the second number would be the year; that means mysql is showing the right year and php is showing the wrong year (xx-01 is right).

Sorry, I was incorrect on PHP: <?php echo date("W-y", strtotime('2001-12-31 01:00:00')); ?> prints 53-01.

I don't agree that mysql is 'right' here, though. I realize that if you simply strip out the year from the date it would be '01', but if you wrap the week number, shouldn't you increment the year as well? SELECT DATE_FORMAT('2001-12-31', '%v-%y') returning '01-01' is inaccurate, to say the least. %x and %X have the same problem as %v (since they use it).

I don't really know how you would use them with each other to get the right date, but you must be able to do it some way. "%u Week (00..53), where Monday is the first day of the week":

SELECT DATE_FORMAT('2001-12-31', '%u-%y');
+------------------------------------+
| DATE_FORMAT('2001-12-31', '%u-%y') |
+------------------------------------+
| 53-01                              |
+------------------------------------+

Note that the week can return 53; perhaps this is being wrapped to week 1 when using %v.

This output makes sense and matches PHP's output, so I'll use this instead. Thanks, Joe
Re: Incorrect DATE_FORMAT output for weeks with year e.g. '%v-%y'
I don't know about incorrect, but confusing, sure. It is easy to predict what is going to be returned based on the documentation.

On Thu, 2002-11-21 at 14:19, Joe Siegrist wrote: I don't agree that mysql is 'right' here, though. I realize that if you simply strip out the year from the date it would be '01', but if you wrap the week number, shouldn't you increment the year as well? SELECT DATE_FORMAT('2001-12-31', '%v-%y') returning '01-01' is inaccurate, to say the least.

mysql> select date_format('2001-12-31', '%Y%v');
+-----------------------------------+
| date_format('2001-12-31', '%Y%v') |
+-----------------------------------+
| 200101                            |
+-----------------------------------+

In the date_format query given above, the %v (01) is correct, as 2001-12-31 is in the first week of 2002. But the %v doesn't report that it's giving the week for 2002; it only reports the week. Compare to:

mysql> select yearweek('2001-12-31');
+------------------------+
| yearweek('2001-12-31') |
+------------------------+
| 200201                 |
+------------------------+

date_format's %v (and the week() function) prints the week of the year as extracted from the date, which may fall in the next year. date_format's %Y prints the year extracted from the date, which, for all dates in 2001, is 2001. yearweek() (and %X%v, see below) takes both the year and the week into account when calculating the value to return, while date_format's %Y doesn't: each substitution is done independently of the other substitutions, i.e.

set @d = '2001-12-31';
select date_format(@d, '%Y%v');

is equivalent to

select concat(date_format(@d, '%Y'), date_format(@d, '%v'));

but neither of them is necessarily equivalent to

select yearweek(@d);

which, however, is equivalent to

select date_format(@d, '%X%v');

"%x and %X have the same problem as %v (since they use it)." I don't know if it's actually a problem; note in the docs under yearweek(): "Note that the week number is different from what the WEEK() function would return (0) as WEEK() returns the week in the context of the given year." That is, the year in the date given to week()/%u, which is why week can return 53.

The difference between %u and %v is the difference between using yearweek() and week() with the optional second argument to specify whether the week starts on Sunday or Monday. Andy.
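Andy's point, that the ISO-style week (%v) belongs with the week-based year (%x), not the calendar year (%Y), can be checked outside MySQL. Python's isocalendar() pairs the week with its own year the way the %x/%v pairing does; this is a cross-check, not part of the original thread:

```python
from datetime import date

# 2001-12-31 falls in ISO week 1 of the *following* year, so pairing the
# week with the calendar year (as '%v-%y' does) mixes two year notions.
iso_year, iso_week, _ = date(2001, 12, 31).isocalendar()
print(iso_year, iso_week)       # 2002 1  -- like MySQL's %x/%v pairing
print(date(2001, 12, 31).year)  # 2001    -- the calendar year, like %Y
```

The same date yields week 1 of 2002 under ISO rules and year 2001 under calendar rules, which is exactly the mismatch Joe hit with '%v-%y'.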
Re: Calculating weeks
I use the Now() function, and I just set my format to 'ww' without parentheses, and it gives me the current week number.

--- Georg Richter [EMAIL PROTECTED] wrote: On Monday, 15. July 2002 20:28, Paul W. Reilly wrote: Hello Paul, maybe http://www.mysql.com/doc/D/a/Date_and_time_functions.html could answer your question (function WEEK()). Regards, Georg

Trying to count weeks! I am doing a personal accounting system in php/mysql. I have a report section that groups and calculates expenses into running totals, so that I can see the total amount spent in each category. I would like to add a breakdown to this that will show me the average weekly amount. The calculation is easy enough, but I cannot figure out how to get the number of weeks being calculated. Any ideas would be greatly appreciated. Right now I am putting in the number of weeks manually.
Calculating weeks
Trying to count weeks! I am doing a personal accounting system in php/mysql. I have a report section that groups and calculates expenses into running totals, so that I can see the total amount spent in each category. I would like to add a breakdown to this that will show me the average weekly amount. The calculation is easy enough, but I cannot figure out how to get the number of weeks being calculated. Any ideas would be greatly appreciated. Right now I am putting in the number of weeks manually.

Paul W. Reilly
Product Management/Web Developer
Demosphere International, Inc.
703-536-1600 ext. 31
Re: Calculating weeks
On Monday, 15. July 2002 20:28, Paul W. Reilly wrote:

Hello Paul, maybe http://www.mysql.com/doc/D/a/Date_and_time_functions.html could answer your question (function WEEK()). Regards, Georg

Trying to count weeks! I am doing a personal accounting system in php/mysql. I have a report section that groups and calculates expenses into running totals, so that I can see the total amount spent in each category. I would like to add a breakdown to this that will show me the average weekly amount. The calculation is easy enough, but I cannot figure out how to get the number of weeks being calculated. Any ideas would be greatly appreciated. Right now I am putting in the number of weeks manually.
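For the average-per-week report Paul describes, the week count can come from the span of the reporting period instead of being entered by hand. A minimal sketch; the dates and the simple division by 7 are illustrative assumptions, not from the thread:

```python
from datetime import date

# Number of weeks spanned by a reporting period: day count / 7.
def weeks_between(start, end):
    return (end - start).days / 7

# e.g. averaging expenses from New Year to mid-July 2002:
n_weeks = weeks_between(date(2002, 1, 1), date(2002, 7, 15))
print(round(n_weeks, 1))  # 27.9 weeks
```

Dividing each category's running total by this count gives the average weekly amount; the same day-span arithmetic could be done in SQL with a date-difference function.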
RE: AS SEEN ON NATIONAL TV: MAKE OVER $500,000 EVERY 20 WEEKS!!
I deleted it too! Guess what, I don't miss it either!

-Original Message- From: Reports [mailto:[EMAIL PROTECTED]] Sent: Wednesday, January 09, 2002 11:12 AM To: [EMAIL PROTECTED] Subject: AS SEEN ON NATIONAL TV: MAKE OVER $500,000 EVERY 20 WEEKS!!