Philip Hallstrom wrote:
Hi all, a tough query problem for me...
I have a table with 2 columns that matter: url and id
If url and id are the same in 2 rows, then that's no good (bad data).
I need to find all the rows that are duplicates. I can't think of how
to approach the SQL for this... any pointers?
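One common way to find such duplicates is to group on the two columns and keep only the groups with more than one row — a sketch, assuming the table is called `mytable` (the real table name isn't given in the post):

```sql
-- Find every (url, id) pair that appears more than once.
SELECT url, id, COUNT(*) AS dup_count
FROM mytable
GROUP BY url, id
HAVING COUNT(*) > 1;
```

To see the full duplicate rows rather than just the pairs, this result can be joined back against the table on (url, id).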
Hi Agrapin -
This sounds great. Could you please post some of your timber products
here to the list? Many of us are really looking for a break from this
boring MySQL stuff. Thanks, and our kind regards to you too.
- Brian
On Sep 11, 2006, at 7:57 PM, Agrapin S.A. - Timber Industry and
T
I found a way we can get the whole tree:
SELECT cat_id, group_concat( id )
FROM categoria
GROUP BY cat_id
Try it and tell me if it's good or not.
"abhishek jain" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
> Hi,
> I have a table structure like :
> ID , NAME, PARENT
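A minimal sketch of how the GROUP_CONCAT answer above behaves, with invented sample data (the column names here mix the two posts: `cat_id` plays the role of PARENT):

```sql
CREATE TABLE categoria (id INT, name VARCHAR(50), cat_id INT);
INSERT INTO categoria VALUES
  (1, 'root',  NULL),
  (2, 'books', 1),
  (3, 'music', 1),
  (4, 'scifi', 2);

-- One row per parent, with its children collapsed into a list.
SELECT cat_id, GROUP_CONCAT(id)
FROM categoria
GROUP BY cat_id;
-- e.g. cat_id 1 -> '2,3' (the children of category 1)
```

This gives one level of the tree per row; walking the full hierarchy still takes repeated queries (or application-side recursion) with a plain adjacency list like this.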
No, I don't generally go along with underhand political activity. :-)
(but I expect that's an old joke - I haven't been MySQLing all that
long, you see...)
--
Cheers... Chris
Highway 57 Web Development -- http://highway57.co.uk/
Any inaccuracies in this index may be explained by the fact
that
I just filed bug #22317 about this. The following script fails to
return a row under 4.1.21 (on x86_64, anyway), but works correctly on
4.1.20 (and .18):
drop table if exists test1;
create table test1
(
datetimeval datetime,
dateval1 date,
dat
You should always have a column that serves as a unique id for the record (e.g. an AUTO_INCREMENT field). It makes differentiating and
deleting duplicates much easier when needed. This query only works if you have a unique id field, and it will only delete one duplicate
record at a time. So if you have 4 records th
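A sketch of the general technique being described, assuming a table `mytable` with an AUTO_INCREMENT column `pk` alongside `url` and `id` (all names invented): a self-join that deletes the higher-keyed copy of each duplicate pair.

```sql
-- For each pair of rows sharing the same (url, id), delete the
-- one with the larger primary key, keeping the earliest copy.
DELETE t2
FROM mytable AS t1
JOIN mytable AS t2
  ON t1.url = t2.url
 AND t1.id  = t2.id
 AND t1.pk  < t2.pk;
```

Unlike the one-duplicate-at-a-time query mentioned above, this multi-table DELETE removes all the extra copies in a single statement.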
yep, you're right
thanks for clearing that up :)
How do most MySQL folks sync live and development databases?
As an alternative, I could use a PHP script on a late-night crontab.
g
On Sep 13, 2006, at 1:11 AM, Chris wrote:
Graham Anderson wrote:
Is anyone using subversion to sync live and development databases?
Philip Hallstrom wrote:
Hi all, a tough query problem for me...
I have a table with 2 columns that matter: url and id
If url and id are the same in 2 rows, then that's no good (bad data).
I need to find all the rows that are duplicates. I can't think of how
to approach the SQL for this... any pointers?
I'm not sure I understand where the other field data (that you say you have
to enter manually) is coming from, but what I often do when confronted with
data that needs to be massaged before entry is put it in an Excel
spreadsheet and use a formula to build the MySQL statements.
Regards,
Jerry Sc
For our Ruby on Rails app, during development, we're using SVN to store and
sync a group of SQL load files,
containing all the delete, insert and update commands necessary to fully
populate the database.
After doing an SVN update we then run a shell (or batch) script to pipe the
SQL files into MySQL.
Hello,
Hi all, I have some strange records in the InnoDB status output. What do they
all mean?
*** (2) TRANSACTION:
TRANSACTION 0 139334621, ACTIVE 1 sec, process no 594, OS thread id 2725583792
fetching rows, thread declared inside InnoDB 425
mysql tables in use 1, locked 1
1815 lock struct(s),
Never mind, I figured it out:
select distinct(concat(lat,lon)), lat, lon where
On Sep 13, 2006, at 6:57 AM, Brian Dunning wrote:
But if I do this, how do I still get lat and lon as two different
fields? This finds the right record set, but it returns both fields
concatenated into a single field.
But if I do this, how do I still get lat and lon as two different
fields? This finds the right record set, but it returns both fields
concatenated into a single field.
On Sep 12, 2006, at 12:46 PM, Steve Musumeche wrote:
You could try using CONCAT:
select distinct(CONCAT(lat, long)) from table where ...
Hello,
I've a MySQL table in which a web application, written in PHP, stores a
lot of records...
One of the table's fields is a date, and the web application uses mktime()
to store it as a unix timestamp.
Now I'd like to use Microsoft Access to access this table, but I'd
like to convert the unix timestamps into readable dates.
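MySQL's FROM_UNIXTIME() does this conversion on the server side — a sketch, assuming the timestamp column is called `created` (the actual field name isn't given):

```sql
-- Convert a Unix timestamp column to a DATETIME value.
SELECT FROM_UNIXTIME(created) AS created_at
FROM mytable;

-- Or format it directly as a string:
SELECT FROM_UNIXTIME(created, '%Y-%m-%d %H:%i:%s')
FROM mytable;
```

Wrapping the first query in a view would let Access read the converted value without changing how the PHP application stores the data.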
The MySQL GUI tools have been available as a suite of all tools for
quite a while (http://dev.mysql.com/downloads/gui-tools/5.0.html).
Therefore, it seemed logical to also provide a single manual covering
all tools (MySQL Administrator, MySQL Query Browser, MySQL Migration
Toolkit, and MySQL Workbench).
Leo wrote:
> Hi all, I want to use mysqldump to back up an InnoDB table and add the option
> --single-transaction. Does it lock the whole table? Thanks.
With --single-transaction mysqldump doesn't take table locks on InnoDB
tables. Instead it opens one consistent-snapshot transaction, so the dump
sees the data exactly as it was when it started; nothing that gets entered
afterwards shows up in the backup.
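For reference, --single-transaction makes mysqldump wrap the whole dump in one consistent-snapshot transaction, roughly equivalent to:

```sql
-- Approximately what mysqldump issues with --single-transaction:
SET SESSION TRANSACTION ISOLATION LEVEL REPEATABLE READ;
START TRANSACTION WITH CONSISTENT SNAPSHOT;
-- ... the SELECTs that produce the dump run here,
--     all reading from the same snapshot ...
COMMIT;
```

This only gives a consistent backup for transactional tables (InnoDB); MyISAM tables dumped this way can still change mid-dump.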
--
MySQL General Mailing
Hi all, I want to use mysqldump to back up an InnoDB table and add the option
--single-transaction. Does it lock the whole table? Thanks.
--
Leo
2006-09-13
Graham Anderson wrote:
Is anyone using subversion to sync live and development databases?
If so, how?
Is this desired or a best practice?
Everything except my databases are under version control.
In theory, I would like my databases to sync with the same subversion
'svn update' command.
That
Why don't you just use a GROUP BY on lat,long?
> You could try using CONCAT:
>
> select distinct(CONCAT(lat, long)) from table where ...
>
> Steve Musumeche
> CIO, Internet Retail Connection
> [EMAIL PROTECTED]
>
>
>
> Brian Dunning wrote:
> > Lat & lon are two different fields. Either can be dupl
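The GROUP BY suggestion keeps lat and lon as separate columns and sidesteps the concatenation problem — a sketch, assuming the table is called `places` (name invented):

```sql
-- One row per distinct (lat, lon) pair, both columns intact.
SELECT lat, lon
FROM places
GROUP BY lat, lon;
```

Besides returning two fields, this avoids a subtle bug in the CONCAT approach: lat=1, lon=23 and lat=12, lon=3 both concatenate to '123' and would wrongly collapse into one row, whereas GROUP BY compares the columns separately.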