set time_zone='Europe/Kiev';
-Original Message-
From: Andre Polykanine [mailto:an...@oire.org]
Sent: Thursday, February 24, 2011 10:23 AM
To: mysql@lists.mysql.com
Subject: Changing the timezone
Hi everyone,
since I'm using shared hosting, I can't change the default timezone
After a TCP connection has been closed, the state changes to TIME_WAIT. If I
recall correctly, this is a way to prevent the same port from being
immediately reused, and allow any duplicate packets from the previous
connection that might still be floating around the network time to expire.
In genera
Here's one option to "pivot" your results:
select
record_id
,max(soi) as soi
,max(heading) as heading
,max(description) as description
,max(relloc) as relloc
from
(select
record_id
,if(field_name = 'SOI', field_value, '') as soi
,if(field_name = 'Heading', field_va
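The inner query above is cut off mid-line; a complete sketch of the pivot, assuming a hypothetical name/value table `record_fields (record_id, field_name, field_value)` with the four field names implied by the outer select:

```sql
-- Pivot name/value rows into one row per record_id.
-- Table and field names are assumptions based on the fragment above.
select
    record_id
   ,max(soi)         as soi
   ,max(heading)     as heading
   ,max(description) as description
   ,max(relloc)      as relloc
from
   (select
        record_id
       ,if(field_name = 'SOI',         field_value, '') as soi
       ,if(field_name = 'Heading',     field_value, '') as heading
       ,if(field_name = 'Description', field_value, '') as description
       ,if(field_name = 'RelLoc',      field_value, '') as relloc
    from record_fields) as t
group by record_id;
```

MAX() collapses the mostly-empty columns because the empty string sorts below any real value.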
Maybe some sort of logarithmic expression?
select no_of_jobs, 10 * log(10, no_of_jobs) as job_weight
from data;
Of course, you'd have to tweak your coefficients to match the weighting
system you want to use.
-Travis
-Original Message-
From: Richard Reina [mailto:gatorre...@gmail.com]
S
What columns do you have indexed on your event_log table? Can you post the
output from SHOW CREATE TABLE? How long does the query run for?
-Original Message-
From: Andy Wallace [mailto:awall...@ihouseweb.com]
Sent: Friday, February 04, 2011 10:29 AM
To: mysql list
Subject: Table/select p
Something like this might help you find all of the times where your user_id
switched to a different team_id:
select team_id, user_id, min(last_changed)
from
(select home_team_id as team_id, home_user_id as user_id, last_changed
from data
union all
select away_team_id as team_id, away_user_id as us
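The union above breaks off mid-column; a complete sketch, assuming the `data` table has `home_team_id`, `home_user_id`, `away_team_id`, `away_user_id`, and `last_changed` columns as the fragment suggests:

```sql
-- Earliest date each (team, user) pairing appears, across both the
-- home and away roles.  UNION ALL keeps duplicate rows so MIN() sees
-- every occurrence.
select team_id, user_id, min(last_changed) as first_seen
from
   (select home_team_id as team_id, home_user_id as user_id, last_changed
    from data
    union all
    select away_team_id as team_id, away_user_id as user_id, last_changed
    from data) as both_roles
group by team_id, user_id;
```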
You might check here: http://www.census.gov/geo/www/tiger/
-Travis
-Original Message-
From: viraj [mailto:kali...@gmail.com]
Sent: Wednesday, February 02, 2011 3:31 AM
To: mysql@lists.mysql.com
Subject: map polygon data for popular us cities
dear list,
where can i find a list of map pol
Maybe you could do something like the following with user variables (or
prepared statements):
set @sex = 1, @country = 120, @education = 0;
select if(sex_id = @sex, 1, 0) + if(country_id = @country, 1, 0) +
if(education_id = @education, 1, 0) as num_matches, sex_id, country_id,
education_id
from
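The query above stops at the FROM clause; one way the full statement might read, assuming a hypothetical `profiles` table and ranking candidates by how many attributes match the session variables:

```sql
set @sex = 1, @country = 120, @education = 0;

-- Score each row by the number of attributes equal to the @variables,
-- then list the closest matches first.  Table name is an assumption.
select
    if(sex_id = @sex, 1, 0)
  + if(country_id = @country, 1, 0)
  + if(education_id = @education, 1, 0) as num_matches
   ,sex_id, country_id, education_id
from profiles
order by num_matches desc
limit 10;
```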
join, indexing that column may not be a benefit. Running
EXPLAIN will tell you whether or not the index you anticipate is actually
being used for your query.
-Travis
From: Tompkins Neil [mailto:neil.tompk...@googlemail.com]
Sent: Friday, October 15, 2010 3:43 AM
To: [MySQL]; Travis Ard
Subje
I couldn't help but notice you have individual indexes on nearly all the
fields of your table. If you won't be using these fields exclusively as a
join or filter condition in a query, you are unlikely to benefit from the
extra indexes and, in fact, they could slow down your inserts and add to
your
Sorry, try changing the column mappings.ip to use the table aliases (m.ip
and m2.ip).
-Travis
From: Paul Halliday [mailto:paul.halli...@gmail.com]
Sent: Tuesday, October 12, 2010 11:37 AM
To: Travis Ard
Cc: mysql@lists.mysql.com
Subject: Re: Can this query be done w/o adding another
You could join your mappings table twice, once on src_ip and again on
dst_ip:
SELECT COUNT(signature) AS count,
MAX(timestamp) AS maxTime,
INET_NTOA(src_ip),
m.cc as src_cc,
INET_NTOA(dst_ip),
m2.cc as dst_cc,
signature,
signature_id,
ip_prot
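The SELECT above is truncated; a sketch of the full double join, assuming an `event` table and a `mappings (ip, cc)` lookup table as described in the thread:

```sql
-- Join the mappings table twice under different aliases: once for the
-- source address, once for the destination.  Column names are
-- assumptions based on the fragment above.
SELECT COUNT(signature)  AS count,
       MAX(timestamp)    AS maxTime,
       INET_NTOA(src_ip) AS src,
       m.cc              AS src_cc,
       INET_NTOA(dst_ip) AS dst,
       m2.cc             AS dst_cc,
       signature,
       signature_id,
       ip_proto
FROM event e
JOIN mappings m  ON m.ip  = e.src_ip
JOIN mappings m2 ON m2.ip = e.dst_ip
GROUP BY src_ip, dst_ip, signature;
```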
You may get better performance from your query, and be able to make better
use of indexes if you use integer comparisons for your IP address
expressions instead of converting to strings with pattern matching. You
might consider something like the following:
SELECT DISTINCT(e.src_ip)
FROM event
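A sketch of what the integer comparison might look like, assuming the addresses are stored as INT UNSIGNED (the type INET_ATON produces); the 10.0.0.0/8 network is just an illustrative range:

```sql
-- A range scan on an integer column can use an index; a LIKE pattern
-- against INET_NTOA(src_ip) forces a conversion on every row and cannot.
SELECT DISTINCT e.src_ip
FROM event e
WHERE e.src_ip BETWEEN INET_ATON('10.0.0.0') AND INET_ATON('10.255.255.255');
```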
Maybe you could use something like the following to truncate your times to
10 minute increments before doing your GROUP BY and COUNT():
select concat(date_format(timestamp_col, '%Y-%m-%d %H:'),
truncate(minute(timestamp_col) / 10, 0), '0') from your_table;
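Used inside the GROUP BY, the expression might look like this (column and table names are placeholders carried over from the snippet above):

```sql
-- Bucket rows into 10-minute intervals and count events per bucket.
select concat(date_format(timestamp_col, '%Y-%m-%d %H:'),
              truncate(minute(timestamp_col) / 10, 0), '0') as ten_min_bucket,
       count(*) as events
from your_table
group by ten_min_bucket;
```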
-Travis
---
I don't think I'd seriously consider the max() option, especially if you're
expecting the table to grow large. Using a table to store the next record
number (your last_used_record_nums table) is a technique that is fairly
commonly used. You can increment and retrieve the value atomically if you
u
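The sentence above is cut off at "if you u..."; the classic trick it is most likely describing uses LAST_INSERT_ID(expr), which makes the increment-and-read atomic without an explicit transaction. A sketch, assuming a hypothetical one-row-per-sequence counter table:

```sql
-- Hypothetical counter table; one row per named sequence.
CREATE TABLE last_used_record_nums (
    seq_name VARCHAR(32) PRIMARY KEY,
    next_val INT UNSIGNED NOT NULL
);

-- Atomically bump the counter and stash the new value in the session.
UPDATE last_used_record_nums
SET next_val = LAST_INSERT_ID(next_val + 1)
WHERE seq_name = 'records';

-- Retrieve the value this session just reserved; other sessions cannot
-- observe or overwrite it.
SELECT LAST_INSERT_ID();
```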
I don't believe it's possible to do what you're suggesting, at least not
according to the second example on this page:
http://dev.mysql.com/doc/refman/5.1/en/subquery-errors.html.
-Travis
-Original Message-
From: Ramsey, Robert L [mailto:robert-ram...@uiowa.edu]
Sent: Monday, September 2
Try using the IS NULL operator instead of !
-Travis
-Original Message-
From: Andy Wallace [mailto:awall...@ihouseweb.com]
Sent: Thursday, September 16, 2010 10:47 AM
To: mysql@lists.mysql.com
Subject: Update query problem
So I'm having a problem with an update query. I have three tables
You could try
SELECT ... INTO OUTFILE ...
Also, if you just want to look at the data a page at a time, try setting
your pager variable to your favorite pager program.
mysql> pager /usr/bin/less
and unset it with \n
mysql> \n
-Travis
--
From:
When the EXPLAIN output says "Using filesort", it doesn't necessarily mean
it is sorting on disk. It could still be sorting in memory and, thus, be
reasonably fast. You might check the value of Sort_merge_passes before and
after your query to see for sure; if it increased, the sort spilled to disk.
-Travis
-Original Message---
MySQL doesn't have the windowing functions that some other databases
provide, but you can probably achieve the same effect with a couple
user-defined variables:
select
teams_id as my_teams_id
,sum(rating) as total_team_rating
from
(select
players.teams_id
,players.players_id
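The inner query above is truncated; the usual pattern for emulating a windowed rank with user variables looks something like the following. Table and column names follow the fragment; the top-3-per-team cutoff is an assumption for illustration:

```sql
-- Number players within each team by descending rating, then keep the
-- top 3 per team and sum their ratings.  The ORDER BY inside the derived
-- table is what makes the running @rank meaningful.
select
    teams_id as my_teams_id
   ,sum(rating) as total_team_rating
from
   (select
        players.teams_id
       ,players.players_id
       ,players.rating
       ,@rank := if(@team = players.teams_id, @rank + 1, 1) as team_rank
       ,@team := players.teams_id as prev_team
    from players
    cross join (select @rank := 0, @team := null) as init
    order by players.teams_id, players.rating desc) as ranked
where team_rank <= 3
group by teams_id;
```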
Does this work?
select *
from t1
join t2 on (t1.datum = t2.sdat or dayname(t1.datum) = t2.tag);
-Travis
-Original Message-
From: mysql [mailto:my...@ayni.com]
Sent: Tuesday, September 07, 2010 1:43 AM
To: mysql@lists.mysql.com
Subject: Conditional join of two tables
Hi listers
mysql> s
Have you considered adding a secondary index on the units column for your
delete queries?
DELETE FROM clientinfo WHERE units='155618918';
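A sketch of adding such an index (table and column names taken from the query above, the index name is an assumption):

```sql
-- Without an index on units, each DELETE scans the whole table to find
-- the matching rows.
ALTER TABLE clientinfo ADD INDEX idx_units (units);

DELETE FROM clientinfo WHERE units = '155618918';
```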
-Original Message-
From: Alexandre Vieira [mailto:nul...@gmail.com]
Sent: Thursday, September 02, 2010 8:46 AM
To: John Daisley; joh...@pixelated.net
I think you could do away with your right outer join of table B (which will
include all rows from B whether or not they match to C), since you are
explicitly filtering for C.State like 'Yes'. The intermediate result
doesn't necessarily need to be stored in a temporary table. You can include
multi
it is too awkward to
query this table twice and union the results, then you might want to add a
summary table.
-Travis
-Original Message-
From: Tompkins Neil [mailto:neil.tompk...@googlemail.com]
Sent: Wednesday, August 25, 2010 2:06 PM
To: Travis Ard
Cc: [MySQL]
Subject: Re: Calcula
I think your match table has all the information necessary to display the
results you want. Since each record contains data for two teams (home and
away), you'd probably need to select each separately and union the results
together before summarizing. Your query might look something like the
foll
You could try doing a multi-master configuration by setting up mysql proxy
to receive changes from 1, 2, and 3 and replicate to 4.
-Travis
-Original Message-
From: Nunzio Daveri [mailto:nunziodav...@yahoo.com]
Sent: Monday, August 23, 2010 3:28 PM
To: mysql@lists.mysql.com
Subject: Creat
Something like this might work:
insert into domains
select a.accountid, reverse(a.domainid), a.mailname
from domains a
left outer join domains b on b.domainid = reverse(a.domainid) and
b.accountid = a.accountid and b.mailname = a.mailname
where b.domainid is null;
-Travis
Well, it wouldn't exactly limit the size of your tables, but you may want to
look into creating a partitioned table to store your data. You could define
your partition ranges to store a single day's worth of data or whatever
granularity works best for you. Then, when you need to remove older data
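A sketch of daily range partitions, assuming a hypothetical log table keyed by a DATE column; dropping a partition is a metadata operation and far cheaper than DELETEing the same rows:

```sql
-- Hypothetical log table partitioned by day.  The partitioning column
-- must be part of the primary key.
CREATE TABLE log_data (
    id INT UNSIGNED NOT NULL,
    log_date DATE NOT NULL,
    payload VARCHAR(255),
    PRIMARY KEY (id, log_date)
)
PARTITION BY RANGE (TO_DAYS(log_date)) (
    PARTITION p20100823 VALUES LESS THAN (TO_DAYS('2010-08-24')),
    PARTITION p20100824 VALUES LESS THAN (TO_DAYS('2010-08-25')),
    PARTITION pmax      VALUES LESS THAN MAXVALUE
);

-- Remove a day's worth of data without touching the other partitions.
ALTER TABLE log_data DROP PARTITION p20100823;
```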
What are you using as your primary key on this table? Is it an auto_increment
field or something non-sequential? Do you have your secondary indexes in
place while you load the table or are you explicitly disabling them and
re-enabling them afterward?
-Travis
-Original Message-
From: Xn No
Maybe something like this:
select
user_id
,max(if(meta_key = 'zip_code', meta_value, null)) as zip_code
,max(if(meta_key = 'first_name', meta_value, null)) as first_name
,max(if(meta_key = 'last_name', meta_value, null)) as last_name
from wp_usermeta
group by user_id;
-Travis
--
Can you create a second, indexed column in your feed_new temp table that
includes the title without the year appended? That might allow you to get
by with a single pass through the larger prod table and avoid reading rows
from the feed_new table.
-Travis
-Original Message-
From: Jerry S
ct: Re: script of mysql
> From: pratikshadjayswa...@gmail.com
> To: travis_...@hotmail.com
> CC: mysql@lists.mysql.com
>
> Hi Travis,
>
> Thanks a lot for your help,
>
> Can I have the same script for Windows OS?
>
>
> On Sat, Jul 31, 2010 at 3:29 AM, Travis Ard
You could also pre-define your intervals in a subquery using UNION and join
that to your original table like so:
select ifnull(sum(calls), 0) as calls, n as queue_seconds
from
(select 0 as n union
select 5 union
select 10 union
select 15) as step
left join calls on calls.queue_seconds > (ste
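The join condition above is cut off at "(ste"; a complete sketch, assuming each row in `calls` represents one call with a `queue_seconds` value (COUNT stands in for the SUM in the fragment) and 5-second buckets:

```sql
-- Pre-defined thresholds joined against the calls table; IFNULL/LEFT JOIN
-- keep empty buckets visible as zero.  The exact bucket bounds are an
-- assumption reconstructed from the truncated fragment.
select ifnull(count(calls.queue_seconds), 0) as calls,
       step.n as queue_seconds
from
   (select 0 as n union
    select 5 union
    select 10 union
    select 15) as step
left join calls
       on calls.queue_seconds >= step.n
      and calls.queue_seconds <  step.n + 5
group by step.n;
```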
Something like this mysql-summary script might be useful:
http://code.google.com/p/aspersa/wiki/mysql_summary
-Travis
> Date: Fri, 30 Jul 2010 12:19:07 +0100
> Subject: script of mysql
> From: pratikshadjayswa...@gmail.com
> To: mysql@lists.mysql.com
>
>
You could try it inside a "here" document:
mysql <<EOF
...
EOF
> Date: Thu, 29 Jul 2010 14:29:55 -0500
> From: todd_...@ssiresults.com
> To: mysql@lists.mysql.com
> Subject: Variable Expansion Using MySQL Client and .sql Files
>
> Hey all,
>
> I've found many packages that sit on top of MySQL for various clien
In some databases, such as MySQL, derived tables (subqueries in the FROM
clause) have to be explicitly named. For example:
select * from (select * from (select * from your_table) sub1) sub2;
Without the aliases, you will see an error like: "ERROR 1248 (42000): Every
derived table must have its own alias".
If I understand your problem correctly,