> Minor correction: The post I point to is about the slow log, but I presume
> it's also relevant for the general log. And the good comments I mentioned come in
> the followup posting at http://www.bitbybit.dk/carsten/blog/?p=116
>
> / Carsten
Thanks Carsten, I read the comments and Sheeri mentions mys
Well, the first thing I'd do is symlink the log table files onto a separate set
of spindles. No use bogging the main data spindles down with log writes.
On Tue, Apr 20, 2010 at 5:33 PM, Carsten Pedersen wrote:
Carsten Pedersen skrev:
Jim Lyons skrev:
Has anyone tried using the log_output option in mysql 5.1 to have the
general log put into a table and not a flat file? I used it for a while
before having to downgrade back to 5.0 but thought it was a great idea. I'm
curious to see if anyone feels it helps analysis.
I tried
Hi Imran,
you can have a look at mysqldumpslow utility to analyze the data..
Thanks
Anand
On Tue, Apr 20, 2010 at 5:48 PM, Jim Lyons wrote:
Has anyone tried using the log_output option in mysql 5.1 to have the
general log put into a table and not a flat file? I used it for a while
before having to downgrade back to 5.0 but thought it was a great idea. I'm
curious to see if anyone feels it helps analysis.
On Tue, Apr 20, 2010 at 6:02
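For anyone who wants to try it: a minimal sketch of turning on table-based logging and querying it back. log_output, general_log and the mysql.general_log table are standard in 5.1; the SELECT is just one illustration of the kind of analysis a table makes easy.

```sql
-- Send the general log to a table instead of a flat file (MySQL 5.1+)
SET GLOBAL log_output = 'TABLE';
SET GLOBAL general_log = 'ON';

-- The log is now queryable like any other table
SELECT event_time, user_host, argument
FROM mysql.general_log
ORDER BY event_time DESC
LIMIT 10;
```

Note the log table uses the CSV engine by default, which is part of why moving its files onto separate spindles, as suggested above, is cheap to do.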
Maybe one of the maatkit tools will do it, but I tend to graph that kind of
data live in Munin from the internal counters.
On Tue, Apr 20, 2010 at 1:02 PM, Imran Chaudhry wrote:
I have 7 days worth of general log data totalling 4.4GB.
I want to analyze this data to get:
a) queries per second, minute, hour and day
b) a count of the number of selects versus write statements (delete,
insert, replace and update)
c) a variation of the above with "select, replace, delete and i
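If the general log is in a table (via log_output=TABLE, as discussed earlier in the thread), (a) and (b) reduce to plain aggregates. A sketch against mysql.general_log; the LIKE buckets for what counts as a write are my assumption:

```sql
-- (a) queries per minute; swap the DATE_FORMAT pattern
-- for per-second, per-hour or per-day buckets
SELECT DATE_FORMAT(event_time, '%Y-%m-%d %H:%i') AS minute,
       COUNT(*) AS queries
FROM mysql.general_log
GROUP BY minute;

-- (b) selects versus write statements
SELECT CASE
         WHEN argument LIKE 'SELECT%' THEN 'select'
         WHEN argument LIKE 'INSERT%' OR argument LIKE 'UPDATE%'
           OR argument LIKE 'DELETE%' OR argument LIKE 'REPLACE%' THEN 'write'
         ELSE 'other'
       END AS kind,
       COUNT(*) AS cnt
FROM mysql.general_log
GROUP BY kind;
```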
Ladies, gentlemen,
Database Workbench 3.4 Pro is available with a 25% discount now!
You will get the next major version for free when it is released later
this month. Order now and use the coupon code " DBW3X " when
ordering.
Database Workbench is a multi-dbms development tool, for more
infor
SELECT * FROM orders WHERE WEEK(orders.order_date) = WEEK(NOW()) AND
YEAR(orders.order_date) = YEAR(NOW())
- Original Message -
From: "ML"
To:
Sent: Monday, December 28, 2009 5:14 PM
Subject: Weeks
ML,
trying to write some SQL that will give me records for the CURRENT WEEK.
Example, starting on a Sunday and going through Saturday.
This week it would be Dec 27 - Jan 2.
For the week of any date @d:
... WHERE order_date BETWEEN AddDate(@d, -DayOfWeek(@d)+1) AND
AddDate(@d, 7-DayOfWeek(@d))
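Plugged into the orders table from this thread (assuming the column is order_date), the current Sunday-through-Saturday week would be:

```sql
-- DAYOFWEEK() returns 1 for Sunday .. 7 for Saturday,
-- so these bounds run from this week's Sunday to its Saturday
SELECT *
FROM orders
WHERE order_date BETWEEN ADDDATE(CURDATE(), -DAYOFWEEK(CURDATE()) + 1)
                     AND ADDDATE(CURDATE(), 7 - DAYOFWEEK(CURDATE()));
```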
indexes to find your query results.
Regards,
Gavin Towey
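To spell out the index point: a predicate like WEEK(order_date) = WEEK(NOW()) has to evaluate WEEK() for every row, while a plain range on the bare column can use an index on order_date. A sketch of the sargable form (the literal dates are just the Dec 27 - Jan 2 week from this thread):

```sql
-- Range on the bare column, so an index on order_date is usable
SELECT *
FROM orders
WHERE order_date >= '2009-12-27'
  AND order_date <  '2010-01-03';
```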
-Original Message-
From: ML [mailto:mailingli...@mailnewsrss.com]
Sent: Monday, December 28, 2009 4:15 PM
To: mysql@lists.mysql.com
Subject: Weeks
Hi All,
trying to write some SQL that will give me records for the CURRENT WEEK.
Example, starting on a Sunday and going through Saturday.
This week it would be Dec 27 - Jan 2.
I am doing this so I can write a query that will show orders that are placed
during the current week.
Here is what I
Hi,
I finally found the solution
"SELECT count( smsc_id ) AS total, week( insertdate ) AS tanggal
FROM momtbak
WHERE insertdate
BETWEEN DATE_SUB( CURRENT_DATE( ) , INTERVAL 4 WEEK )
AND CURRENT_DATE( )
GROUP BY week( insertdate )"
Willy
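One caveat with grouping on WEEK() alone: if the 4-week window ever crosses a year boundary, week numbers from different years collide in the same group. Grouping on YEARWEEK() instead keeps them apart (same momtbak/insertdate names as above):

```sql
SELECT COUNT(smsc_id) AS total, YEARWEEK(insertdate) AS tanggal
FROM momtbak
WHERE insertdate BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 4 WEEK)
                     AND CURRENT_DATE()
GROUP BY YEARWEEK(insertdate);
```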
--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
Hi, I have tried to use this query: "SELECT count(smsc_id) as total,
insertdate FROM momtbak WHERE insertdate BETWEEN
DATE_SUB(CURRENT_DATE(), INTERVAL 4 WEEK) AND CURRENT_DATE() group by
week(date_format(insertdate,'%Y-%m-%d'),3)" to group records in the
last 4 weeks by
Cc: sangprabv <[EMAIL PROTECTED]>
Subject: Re: Query to Select records in the last 4 weeks
Date: Wed, 03 Dec 2008 17:52:32 -0800
SELECT * FROM momtbak
WHERE insertdate
BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 4 WEEK) AND CURRENT_DATE();
On Thu, 2008-12-04 at 08:27 +0700, sangprabv wrote:
Hi,
I am stuck building a query to select records between curdate() and the
last 4 weeks, grouped by week. I tested with:
"SELECT *
FROM momtbak
WHERE insertdate
BETWEEN curdate( )
AND curdate( ) - INTERVAL 4 week"
It doesn't work. Please help, TIA.
Willy
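The problem is the bound order: BETWEEN expects the lower value first, and curdate() - INTERVAL 4 week is the lower one, so the predicate as written can never match. With the bounds swapped it becomes the working form given elsewhere in the thread:

```sql
SELECT *
FROM momtbak
WHERE insertdate BETWEEN CURDATE() - INTERVAL 4 WEEK AND CURDATE();
```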
Hello,
I know this is a *little* off topic but it is about Open Source
databases :)
There are only three weeks left to register for the PostgreSQL
Community Conference: East!
The conference is scheduled on March 29th and 30th (a Saturday and
Sunday) at the University of Maryland. Come join us
From: "Frank Bax"
> I have a table with datetime field and I would like to select all data
> older than "X" weeks, where "X" is a variable in my php script.
SELECT (.) WHERE `datetime_field` < NOW() - INTERVAL (7*X) DAY
This way you compare the datetim
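Once the PHP variable is interpolated, the same cutoff can also be written with a WEEK interval directly; e.g. for X = 6 (the table name is a placeholder, the column is the one from Frank's message):

```sql
-- Rows older than 6 weeks; an index on datetime_field stays usable
SELECT *
FROM mytable
WHERE datetime_field < NOW() - INTERVAL 6 WEEK;
```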
than "X" weeks, where "X" is a variable in my php script.
DateDiff(datetime,now()) looks like its the right function for this
purpose, but this function requires date arguments and date() isn't in
4.0.20 to convert my field from datetime to date. I have tried many
differen
Anil Doppalapudi wrote:
ours is InnoDB. We are not getting any performance problems with these
settings; it has been working fine for the last year. To my knowledge, you
are getting the performance issue due to the MyISAM table type.
Thanks
Anil
based on this email list, MyISAM is preferred for heavy query/index use,
22, 2004 7:12 PM
To: Anil Doppalapudi
Cc: mysql@lists.mysql.com
Subject: Re: Restore help! been going 2 weeks
Those are pretty much standard settings
I had ours set almost exactly like that, and performance was worse
Sent: Monday, December 20, 2004 5:56 PM
To: matt_lists
Cc: Anil Doppalapudi; [EMAIL PROTECTED]
Subject: Re: Restore help! been going 2 weeks
matt_lists wrote:
Can you post your my.ini or my.cnf
since your restore actually worked
Mine is not swapping, in fact, mysql is only using 610,824 K of ram,
there is still over 1 gig of ram free
Our next server will have 16 gig of ram and quad xeons, I'm going to
completely disable the swap file
From: matt_lists [mailto:[EMAIL PROTECTED]
Sent: Friday, December 17, 2004 12:11 AM
To: Anil Doppalapudi
Cc: [EMAIL PROTECTED]
Subject: Re: Restore help! been going 2 weeks
Anil Doppalapudi wrote:
>it is not normal. i have restored 90 GB database in 2 days on dell server
>with 2 GB RAM.
>Are
Are you sure innodb is better for tables larger than 4 gig?
| 12 | xotech | localhost:3115 | finlog | Query | 238224 | copy to tmp table | alter table bragg_stat engine=innodb pack_keys=0 |
| 14 | xotech | localhost:3356 | NULL   | Query |      0 | NULL              |                                                 |
is going on
then check your my.cnf parameters.
Anil
-Original Message-
From: matt_lists [mailto:[EMAIL PROTECTED]
Sent: Thursday, December 16, 2004 6:15 PM
To: [EMAIL PROTECTED]
Subject: Re: Restore help! been going 2 weeks
The restore is still running
is this normal? How do you all deal with customers who have been without
their data for almost 3 weeks, with no end in sight?
I've had oracle crashes before, the restores were very simple, this is not
I am very disappointed with mysql's performance with files
Almost all my MYD files are more than 4 gig
I was not aware of this limitation
I tested with InnoDB and found it horribly slow for what we do
check your .myd file size. If the table type is MyISAM and it is more than 4
GB, then convert it to InnoDB.
--Anil
-Original Message-
From: matt_lists [mailto:[EMAIL PROTECTED]
Sent: Tuesday, December 14, 2004 8:36 PM
Cc: [EMAIL PROTECTED]
Subject: Re: Restore help! been going 2 weeks
Nobody else has problems with restores on 8+ gig tables?
Need help with mysql restore speed
Table crashed, had to restore from backup, I started the restore 2 weeks
ago; the last change date on the files is the 8th, so mysql has not
written data into the files since then, but it's still running! (or is it?)
the data restore is pretty quick, the
Schmuck, Michael wrote:
I've got a big problem. My MySQL server yesterday lost all data since 20th
January.
Yesterday at about 14:00 we restarted the daemon on our BSD server, running
since September '03. I believe the daemon
didn't write the data into the files. At the restart of the database it
loade
Michael,
check all your connectivity settings (host and port) of all of the
software that you use (backup scripts for example) especially if you
connect via tcp/ip instead of via sockets.
Check the error log, too.
Did you move something on January, 20th (database)?
From our own experiences it s
First and foremost, your English is not even remotely "bad"! You should
hear half of my native-English speaking friends!
Can you give us some more information, such as the server configuration,
OS, filesystem, MySQL version, table types in use, table size, size of
the data gone missing, backup
Hello
I've got a big problem. My MySQL server yesterday lost all data since 20th
January.
In fact, we have a daily tape backup. All our tapes (Monday - Friday tapes,
the oldest one is from 04th Feb)
are holding the database as of the evening of 19th January.
Yesterday at about 14:00 we restarted the daemon
Hi,
> > About two weeks ago I received "The table Worklist is full"
> > error.
>
> what type of table? MyISAM?
> how big is 'full'?
Sorry I forgot to mention. The table is MyISAM with 32-bit file pointers.
This makes its maximum size 4 GB. This is the
About two weeks ago I received "The table Worklist is full" error. Since
what type of table? MyISAM?
how big is 'full'?
"If you don't specify any of the above options, the maximum size for a
table will be 4G (or 2G if your operating systems only supports 2G tables
Hi,
About two weeks ago I received "The table Worklist is full" error. Since
then I have been struggling to update the table indexes beyond 32 bits as
suggested by the MySQL documentation.
I have tried the following methods (I have two identical MySQL databases on
two identical
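For the record, the documented recipe for pushing a MyISAM table past the 4 GB default is to rebuild it with bigger row pointers via the MAX_ROWS / AVG_ROW_LENGTH table options; the numbers below are only illustrative:

```sql
-- Check the current ceiling (the Max_data_length column)
SHOW TABLE STATUS LIKE 'Worklist';

-- Rebuild with larger pointers; pick values that match the real data
ALTER TABLE Worklist MAX_ROWS = 1000000000 AVG_ROW_LENGTH = 200;
```

The ALTER rewrites the whole table, so on a table that is already "full" it needs time and free disk space.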
I don't know about "incorrect", but confusing, sure. It is easy to
predict what is going to be returned based on the documentation.
On Thu, 2002-11-21 at 14:19, Joe Siegrist wrote:
> I don't agree that mysql is 'right' here though, I realize that if you
> simply strip out the year for the date i
> PHP handles this correctly -- if I do a date("W-y", $date)
> for '2001-12-31'
> I get '01-02', but in mysql you get the wrong year: '01-01'
I don't know where you got this data from, but the second number would be the year,
that means mysql is showing the right year and php is showing the wr
MySQL gives what I'd call incorrect output when outputting weeks and years
at the end of the year.
PHP handles this correctly -- if I do a date("W-y", $date) for '2001-12-31'
I get '01-02', but in mysql you get the wrong year: '01-01'
Here's an example (the second one is not what I'd expect):
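PHP's date("W") is the ISO-8601 week, where the last days of December can belong to week 1 of the following ISO year. MySQL can be asked for the same convention with WEEK(date, 3), or with %v together with its matching ISO year %x in DATE_FORMAT; a sketch:

```sql
SELECT WEEK('2001-12-31', 3);               -- 1 (ISO week)
SELECT DATE_FORMAT('2001-12-31', '%x-%v');  -- '2002-01', matching PHP
```

The default WEEK()/%Y pairing uses the calendar year instead, which is where the '01-01' above comes from.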
On Monday, 15. July 2002 20:28, Paul W. Reilly wrote:
Hello Paul,
maybe http://www.mysql.com/doc/D/a/Date_and_time_functions.html could answer
your question (Function week() )
Regards Georg
> Trying to count weeks! I am doing a personal accounting system in
> php/mysql. I have a
Trying to count weeks! I am doing a personal accounting system in
php/mysql. I have a report section that groups and calculates expenses into
running totals, so that I can see total amount spent in each category. I
would like to add a break down to this that will show me the average weekly
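The weekly average can be done with COUNT(DISTINCT ...) over the week number; a sketch with invented table/column names (expenses, category, amount, spent_on), since the real schema wasn't posted:

```sql
-- Average amount spent per week in each category
SELECT category,
       SUM(amount) / COUNT(DISTINCT YEARWEEK(spent_on)) AS avg_per_week
FROM expenses
GROUP BY category;
```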
I deleted it too! Guess what, I don't miss it either!
-Original Message-
From: Reports [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, January 09, 2002 11:12 AM
To: [EMAIL PROTECTED]
Subject: AS SEEN ON NATIONAL TV: MAKE OVER $500,000 EVERY 20 WEEKS!!
Dear Friend:
The first t
sed on it. I am
so glad I finally joined just to see what one could expect in return for
the
minimal effort and money required. To my astonishment, I received total
$610,470.00 in 21 weeks, with money still coming in''.
Pam Hedland, Fort Lee, New Jersey.