I'm trying to figure out the best way to do a tricky bit of sorting.
I'm pretty sure it's entirely possible with an IFNULL or something,
but I always feel like I hit a barrier when I get to a certain level
of complexity in my MySQL.
In any case, I have some magazines, each of which has a DATE, a
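For reference, an IFNULL-based sort usually looks something like the sketch below; the table and column names are placeholders, since the magazine schema isn't quoted in full:

-- Hypothetical: order magazines by an override date when one is set,
-- otherwise fall back to the regular publication DATE column.
SELECT title, publish_date
FROM magazines
ORDER BY IFNULL(special_date, publish_date) DESC;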
We keep all of the schema (one file per table) in SVN (subversion)
with a directory to represent each database. As the schema evolves,
we have had no trouble tracking changes through resource history and
are able to extract diffs on every committed change. It works like a
charm and would probably
Hello!
I'm just getting into using CVS to track changes to my source code for PHP
projects that I'm working on, and it's exactly what my organization needed.
However, there does not appear to be a way to track changes to MySQL
databases in the same way. Basically, as the structure of tables are
Dear MySQL,
I'm looking into availability and wonder if any member might be able to
help me.
I have two databases, one Primary and one full Replication.
Normally the primary is used for data input, reports are drawn from the
replication.
If I lose the Primary, do any members have any soft
Dan Nelson wrote:
> To be completely safe, you would want to use innodb tables, then select
> only the primary key of the 50,000 records you're interested in, using
> the "FOR UPDATE" keyword (to keep others from modifying those records
> while you're doing the move). Then "INSERT INTO newtable S
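A rough sketch of that approach, assuming an integer primary key `id` on old_table (the key name and the boundary value below are illustrative, not from the original post):

START TRANSACTION;
-- Pick and lock the rows to move so no other session can change or delete them.
SELECT id FROM old_table ORDER BY id LIMIT 50000 FOR UPDATE;
-- Suppose the highest id returned was 123456; move exactly those rows.
INSERT INTO new_table SELECT * FROM old_table WHERE id <= 123456;
DELETE FROM old_table WHERE id <= 123456;
COMMIT;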
In the last episode (Mar 30), Ian P. Christian said:
> Brent Baisley wrote:
> >No, you can't assure the same data will be addressed without at
> >least including an order by. Even then you would need to make sure
> >that the first X records in the order would not change. For
> >instance, if you ord
Brent Baisley wrote:
No, you can't assure the same data will be addressed without at least
including an order by. Even then you would need to make sure that the
first X records in the order would not change. For instance, if you
order by entered_date DESC, then the data set would change because
No, you can't assure the same data will be addressed without at least including an order by. Even then you would need to make sure
that the first X records in the order would not change. For instance, if you order by entered_date DESC, then the data set would
change because any new records would
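So if the two statements have to address the same rows, the ordering needs to be pinned to something that cannot shift between them; a sketch, assuming an auto-increment primary key `id` (which the original post does not mention):

-- Oldest rows first: new inserts get higher ids, so they cannot push
-- their way into the first 5 between the two statements.
INSERT INTO new_table SELECT * FROM old_table ORDER BY id ASC LIMIT 5;
DELETE FROM old_table ORDER BY id ASC LIMIT 5;
-- This still assumes nothing else deletes or updates those rows in between;
-- otherwise lock them first with SELECT ... FOR UPDATE as suggested earlier.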
It's taking a long time because your filter is external to the table, so you can't use an index. You want to focus your search on
the category table, where you can quickly narrow down the number of records to search.
SELECT ctg, count(itemid) FROM categories JOIN items ON ctgID=itemCtgID WHERE
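A filled-in version of that query shape might look like the following; the category value and the GROUP BY are guesses at the intent, not part of the original message:

SELECT ctg, COUNT(itemid)
FROM categories
JOIN items ON ctgID = itemCtgID
WHERE ctg = 'some_category'   -- placeholder value
GROUP BY ctg;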
Or you could also do this.
SELECT A.LTA FROM A WHERE A.ID IN (SELECT B.ID FROM B);
I hope this is what you're looking for; please let us know.
regards
anandkl
On 3/30/07, Ananda Kumar <[EMAIL PROTECTED]> wrote:
Hi Rakaha,
You have a cursor, select id from table_name, and then have a loop w
Hi Christian,
Before deleting the data from old_table, just take a backup:
CREATE TABLE new_table_bck SELECT * FROM old_table LIMIT 5;
But I feel that, instead of using LIMIT, you should try to get the data based on some date or
other condition, so that you are sure the same data gets inserted and also
deleted from
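For example, a date-bounded version of the move; the entered_date column is borrowed from elsewhere in the thread and the cutoff value is purely illustrative:

-- Back up the affected rows first, then move them by an explicit condition
-- rather than LIMIT, so the INSERT and the DELETE touch the same rows.
CREATE TABLE new_table_bck SELECT * FROM old_table WHERE entered_date < '2007-03-01';
INSERT INTO new_table SELECT * FROM old_table WHERE entered_date < '2007-03-01';
DELETE FROM old_table WHERE entered_date < '2007-03-01';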
I'm trying to move data between 2 tables.
> INSERT INTO new_table SELECT * FROM old_table LIMIT 5;
> DELETE FROM old_table LIMIT 5;
This is the only process that deletes data from old_table; can I be
*sure* that the limit in these 2 queries will address the same data set?
(if I don't
Hi Rakaha,
You have a cursor: select id from table_name, and then have a loop wherein
you select values of LTA for each id fetched from the above cursor; close the
loop once all the IDs have been processed.
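A rough sketch of such a procedure (walk_lat is just a made-up name; ID, LAT, and table_name follow the original question):

DELIMITER //
CREATE PROCEDURE walk_lat()
BEGIN
  DECLARE done INT DEFAULT 0;
  DECLARE cur_id INT;
  DECLARE cur CURSOR FOR SELECT ID FROM table_name ORDER BY ID;
  DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;
  OPEN cur;
  read_loop: LOOP
    FETCH cur INTO cur_id;
    IF done THEN
      LEAVE read_loop;
    END IF;
    -- Pull the LAT values for this id and the next two consecutive ids.
    SELECT ID, LAT FROM table_name WHERE ID BETWEEN cur_id AND cur_id + 2;
  END LOOP;
  CLOSE cur;
END //
DELIMITER ;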
regards
anandk
On 3/30/07, raksha <[EMAIL PROTECTED]> wrote:
How to retrieve data from th
How to retrieve data from three consecutive tuples until the end of the
database? Such that, suppose the field name is ID and I want to retrieve data
from another field LAT such that at a time the LAT values for IDs 1,2,3 are
taken, then for 3,4,5, then for 4,5,6, and so on till the table ends.
In news:[EMAIL PROTECTED],
Shadow <[EMAIL PROTECTED]> wrote:
> I need to get number of items in a specific category, so I use
> SELECT COUNT(*) FROM items WHERE ctg=''
>
> But each query takes ~ 10seconds.
> It's really slow.
You may add an index on `items`.`ctg` if there is none.
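For example (idx_ctg is just a suggested index name):

-- Lets the WHERE ctg = '...' filter be resolved from an index
-- instead of scanning the whole items table.
ALTER TABLE items ADD INDEX idx_ctg (ctg);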
Maciek