Running Queries When INSERTing Data?

2010-11-08 Thread Hal Vaughan
I'm redesigning some software that's been in use since 2002.  I'll be working 
with databases that will start small and grow along the way.  

In the old format, data would come to us in mega-big text files that had to be 
parsed and manipulated with Perl to remove crap and finally produce one 
tab-delimited file.  Once that file was created, another simple program 
would go through and use the data in each line for an INSERT statement that 
would put the data in a table.  This table also has an Idx field that is an 
auto-incrementing primary key.

Each night at 3:30 am, a program would run and go through the same process 
for each client.  I never timed it, but it could take something like 30-60 
seconds per client.  Timing wasn't a major issue, though, since there was a LONG 
time from then until new data would be inserted into the DB.  The SELECT 
statements to pull the data for each client involve a number of AND and OR 
conditions.  The first one would create a temporary table with its results, 
then another long SELECT statement would create a second temporary table 
by filtering the data down further.  This would continue through a few temporary 
tables until the data was fully filtered.  Then it would be packaged up and 
encrypted, and sent out to the client, who has a program on his computer to read 
that data and print it out if desired.

This has worked, but for a number of reasons, a once-a-day data pull and send 
won't work as well with the new design.  The program on the clients' computers 
will be able to access a special server just for them directly.  (I know the 
concept of a server for each human client sounds inefficient, but it actually 
improves operations in a number of ways.)  So each server will only have to 
provide data for one client.

The big difference is that I'd like to make it so they can access the data 
live, or almost live.  I don't mean all the data, but the subset that meets 
their needs.  In other words, the equivalent of what was sent to them daily in 
the old system.  Their individual servers will still get the big tab-delimited 
file that will still be INSERTed into their DB line by line.  But I'd like to 
be able to select from the new data as it comes in, once it's been given a new 
number in the Idx field.

Is there any way to run a row of data through SELECT queries as it is being 
INSERTed into a table -- or just after?

The reason for doing this, instead of INSERTing all the data and then running a 
program, is that as the database grows, pulling out the data will take longer 
and longer.  If there were a way to screen data as it comes in, that would 
make it easier to provide instantly available data.

I also know my knowledge of MySQL is quite limited, so if this can be done in 
better ways, I'd be interested in hearing about them.

Thank you!



Hal



RE: Running Queries When INSERTing Data?

2010-11-08 Thread Gavin Towey
I'm not sure I understand exactly what you mean, but I think you just need to 
keep a timestamp associated with each row as it is inserted, put an index on 
it, then you can select new data just by using the appropriate time range.
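
A minimal sketch of that approach (the table and column names here are made up 
for illustration, not taken from Hal's schema):

  CREATE TABLE client_data (
      Idx         INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
      payload     VARCHAR(255),
      inserted_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
      KEY idx_inserted_at (inserted_at)
  );

  -- Pull only the rows that have arrived since the last poll; the index on
  -- inserted_at lets MySQL do a range scan instead of a full table scan.
  SELECT Idx, payload
  FROM client_data
  WHERE inserted_at > '2010-11-08 03:30:00';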

Also, if you're parsing files into tab delimited format, you don't need to 
write a separate parser to insert rows line by line.  MySQL has LOAD DATA 
INFILE which takes delimited text files and inserts data in bulk, which is much 
faster than inserting line by line.
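
Something along these lines, for example (the file path and column list are 
placeholders; tab is already the default field separator, shown here only for 
clarity):

  LOAD DATA INFILE '/path/to/client_feed.txt'
  INTO TABLE client_data
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\n'
  (payload);

  -- Idx is left out of the column list so AUTO_INCREMENT fills it in.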

Regards,
Gavin Towey




Re: Running Queries When INSERTing Data?

2010-11-08 Thread Hal Vaughan

On Nov 8, 2010, at 1:23 PM, Gavin Towey wrote:

> I'm not sure I understand exactly what you mean, but I think you just need to 
> keep a timestamp associated with each row as it is inserted, put an index on 
> it, then you can select new data just by using the appropriate time range.

But won't that take just as long as any other queries?  Or will it be sped 
up because all the matching records would be adjacent to each other -- like all 
at the end?

> Also, if you're parsing files into tab delimited format, you don't need to 
> write a separate parser to insert rows line by line.  MySQL has LOAD DATA 
> INFILE which takes delimited text files and inserts data in bulk, which is 
> much faster than inserting line by line.

THANKS!  Is this new?  I never read about it before, but then again, from about 
2005-2006 until now the system was running smoothly and that's a long gap with 
no programming work!


Hal

 

RE: Running Queries When INSERTing Data?

2010-11-08 Thread BMBasal
> But won't that take just as long as any other queries?  Or will it be
> sped up because all the matching records would be adjacent to each other
> -- like all at the end?

You can order the result data set by timestamp in descending order, so the
latest will come up first, i.e., LIFO.
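
For instance, against the hypothetical client_data table sketched earlier in 
the thread:

  -- Newest rows first; the index on inserted_at serves both the range
  -- filter and the descending sort.
  SELECT Idx, payload
  FROM client_data
  WHERE inserted_at > NOW() - INTERVAL 1 HOUR
  ORDER BY inserted_at DESC;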


RE: Running Queries When INSERTing Data?

2010-11-08 Thread Gavin Towey
If you are selecting records within a certain time range that is a subset of 
the entire set of data, then an index on the timestamp column will be fine.

More generally: create appropriate indexes to optimize queries.

Although typically, you should design the database to be correct first, then 
identify performance bottlenecks in the production system and fix them 
individually.  Trying to predict performance is difficult.  If you're 
concerned, it's best to create a test that reproduces the exact situation 
you're unsure about, so you can observe real performance and experiment 
with optimizations.
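
For example, assuming the hypothetical client_data table from earlier in the 
thread, EXPLAIN will show whether a query actually uses the timestamp index:

  EXPLAIN SELECT Idx, payload
  FROM client_data
  WHERE inserted_at > NOW() - INTERVAL 1 DAY;

  -- A "type" of range and a "key" of idx_inserted_at in the output mean
  -- MySQL is doing an index range scan rather than a full table scan.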

