Re: [sqlite] limiting table size?

2006-01-02 Thread Dennis Cote

Julien LEFORT wrote:


This looks like a great solution. Just another point,
what if I want to fill this ring buffer automatically by using triggers on 
other tables, and be able to reset the counter of the ring buffer when I 
arrive at the max integer value of the primary key.
 


--Create trigger on Data table when an event occurs on one of the fields
CREATE TRIGGER Data_Update AFTER UPDATE OF FirstName ON Data
BEGIN
 INSERT INTO fifo VALUES (NULL, new.FirstName);
---
-- There, what if I get the error SQLITE_FULL while the INSERT INTO Data 
request???
 
-- I'm really not sure of the syntax, but could this work?

---
 ON CONFLICT UPDATE fifo SET ID = ID - (SELECT MIN(ID)-1 FROM fifo);
 INSERT INTO fifo VALUES (NULL, new.FirstName);

END;

 


Julien,

What you have proposed won't work because there is no way to catch an 
error in SQL. The SQLITE_FULL error is returned to the calling 
application only.


On the other hand, this probably isn't an issue for any real application. 
Even if you insert 1000 rows into the fifo every second, the 64 bit row 
ids will let you do this for 292,271,023 years before you fill the rowid 
space. Even if future CPUs and multiple parallel writers let you 
increase your write rate by a factor of 1,000,000 you are still good for 
at least 292 years.


HTH
Dennis Cote


Re: [sqlite] limiting table size?

2005-12-30 Thread Julien LEFORT
This looks like a great solution. Just another point,
what if I want to fill this ring buffer automatically by using triggers on 
other tables, and be able to reset the counter of the ring buffer when I 
arrive at the max integer value of the primary key.
For example :

-- the table to be used as a fifo
create table fifo (
    id integer primary key autoincrement,
    data varchar
);

-- table to store the fifo size limit
create table fifo_size (
    max_entries integer
);

-- set the maximum size of the fifo
-- this can be changed later
insert into fifo_size values(5);

-- remove any records beyond the fifo size limit
-- whenever the limit is changed
create trigger fifo_resize after update on fifo_size
begin
    delete from fifo
    where id = (select max(id) from fifo) - (select max_entries from fifo_size);
end;

-- remove oldest record from the fifo
-- after each insert into the fifo
create trigger fifo_limit after insert on fifo
begin
    delete from fifo
    where id = (select max(id) from fifo) - (select max_entries from fifo_size);
end;

--Create table for any data storage
CREATE TABLE Data (ID INTEGER PRIMARY KEY AUTOINCREMENT, FirstName VARCHAR,
LastName VARCHAR);

--Create trigger on Data table when an event occurs on one of the fields
CREATE TRIGGER Data_Update AFTER UPDATE OF FirstName ON Data
BEGIN
  INSERT INTO fifo VALUES (NULL, new.FirstName);
---
-- There, what if I get the error SQLITE_FULL while the INSERT INTO Data 
request???
  
-- I'm really not sure of the syntax, but could this work?
---
  ON CONFLICT UPDATE fifo SET ID = ID - (SELECT MIN(ID)-1 FROM fifo);
  INSERT INTO fifo VALUES (NULL, new.FirstName);

END;

Regards

Julien


On Thursday, 22 December 2005 at 18:02, Dennis Cote wrote:
 Adler, Eliedaat wrote:
 Sean Wrote ===
 I'm considering using SQLite for an embedded project.  Some of the
 
 data I'd like to store is timestamped sensor readings.  I'd like to know
 if there is a way to configure a table so that it acts like a fixed
 length FIFO queue, e.g. stores 10,000 records then once full drops off
 the oldest record each time a new one is inserted.
 
 Can anyone tell me if we can do that with SQLite?
 =
 
 You could use a simple trigger (example for queue of size 1 below)
 and a self-incrementing key column:
 
 create table myqueue (limit_id integer not null primary key, myvalue
 integer) ;
 
 create trigger queue_limit after insert on myqueue
 begin
 update myqueue set limit_id = limit_id - 1 where (select
 max(limit_id) from myqueue) > 1;
 delete from myqueue where limit_id = 0;
 end;
 
 INSERT INTO MYQUEUE:
 insert into myqueue (myvalue) values (1) ;
 insert into myqueue (myvalue) values(2) ;
 ..
 insert into myqueue (myvalue) values (10050) ;
 RESULTS:
 limit_id  myvalue
 1         10050
 
 Note: It's not very efficient.
 
 Regards,
 Eli

 The following SQL should do what you want. It is also quite efficient
 because it always uses the implicit index associated with the integer
 primary key field, and usually only touches the first or last record.

 -- the table to be used as a fifo
 create table fifo (
 id integer primary key autoincrement,
 data varchar
 );

 -- remove oldest record from the fifo
 -- after each insert into the fifo
 create trigger fifo_limit after insert on fifo
 begin
 delete from fifo
 where id = (select max(id) from fifo) - 1; -- max size is 1
 end;

 The following slightly more complicated version allows the size of the
 fifo to be changed on the fly.

 -- the table to be used as a fifo
 create table fifo (
 id integer primary key autoincrement,
 data varchar
 );

 -- table to store the fifo size limit
 create table fifo_size (
 max_entries integer
 );

 -- set the maximum size of the fifo
 -- this can be changed later
 insert into fifo_size values(5);

 -- remove any records beyond the fifo size limit
 -- whenever the limit is changed
 create trigger fifo_resize after update on fifo_size
 begin
 delete from fifo
 where id = (select max(id) from fifo) - (select max_entries from fifo_size);
 end;

 -- remove oldest record from the fifo
 -- after each insert into the fifo
 create trigger fifo_limit after insert on fifo
 begin
 delete from fifo
 where id = (select max(id) from fifo) - (select max_entries from fifo_size);
 end;

 The following is a sample run of the second version using fifo sizes of
 5 and 8 for demonstration.

 insert into fifo values(NULL, 'one');
 insert into fifo values(NULL, 'two');
 insert into fifo values(NULL, 'three');
 select * from fifo;
 1|one
 2|two
 3|three
 insert into fifo values(NULL, 'four');
 insert into fifo values(NULL, 'five');
 insert into fifo values(NULL, 'six');
 select * from fifo;
 2|two
 3|three
 4|four
 5|five
 6|six
 insert into fifo values(NULL, 'seven');
 insert into fifo values(NULL, 'eight');
 

RE: [sqlite] limiting table size?

2005-12-22 Thread Adler, Eliedaat

Sean Wrote ===
I'm considering using SQLite for an embedded project.  Some of the
data I'd like to store is timestamped sensor readings.  I'd like to know
if there is a way to configure a table so that it acts like a fixed
length FIFO queue, e.g. stores 10,000 records then once full drops off
the oldest record each time a new one is inserted.

Can anyone tell me if we can do that with SQLite?
=

You could use a simple trigger (example for queue of size 1 below)
and a self-incrementing key column:

create table myqueue (limit_id integer not null primary key, myvalue
integer) ;

create trigger queue_limit after insert on myqueue
begin
update myqueue set limit_id = limit_id - 1 where (select
max(limit_id) from myqueue) > 1;
delete from myqueue where limit_id = 0;
end;

INSERT INTO MYQUEUE:
insert into myqueue (myvalue) values (1) ;
insert into myqueue (myvalue) values(2) ;
..
insert into myqueue (myvalue) values (10050) ;
RESULTS:
limit_id  myvalue
1         10050
 
Note: It's not very efficient.
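The shifting trigger can be exercised end-to-end with a short script; a sketch using Python's sqlite3 module, running the same 10,050 inserts and confirming that only the newest value survives in slot 1:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE myqueue (limit_id INTEGER NOT NULL PRIMARY KEY, myvalue INTEGER);

CREATE TRIGGER queue_limit AFTER INSERT ON myqueue
BEGIN
    -- shift every row down one slot once a second row appears...
    UPDATE myqueue SET limit_id = limit_id - 1
    WHERE (SELECT max(limit_id) FROM myqueue) > 1;
    -- ...and drop the row that fell off the front
    DELETE FROM myqueue WHERE limit_id = 0;
END;
""")

for v in range(1, 10051):
    con.execute("INSERT INTO myqueue (myvalue) VALUES (?)", (v,))

print(con.execute("SELECT * FROM myqueue").fetchall())  # [(1, 10050)]
```

Note that the decrementing UPDATE rewrites every remaining row on every insert, which is why this variant does not scale to larger queue sizes.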
 
Regards,
Eli



Re: [sqlite] limiting table size?

2005-12-22 Thread Dennis Cote

Adler, Eliedaat wrote:


Sean Wrote ===
I'm considering using SQLite for an embedded project.  Some of the


data I'd like to store is timestamped sensor readings.  I'd like to know
if there is a way to configure a table so that it acts like a fixed
length FIFO queue, e.g. stores 10,000 records then once full drops off
the oldest record each time a new one is inserted.

Can anyone tell me if we can do that with SQLite?
=

You could use a simple trigger (example for queue of size 1 below)
and a self-incrementing key column:

   create table myqueue (limit_id integer not null primary key, myvalue
integer) ;

   create trigger queue_limit after insert on myqueue
   begin
   update myqueue set limit_id = limit_id - 1 where (select
max(limit_id) from myqueue) > 1;
   delete from myqueue where limit_id = 0;
end;

INSERT INTO MYQUEUE:
   insert into myqueue (myvalue) values (1) ;
   insert into myqueue (myvalue) values(2) ;
   ..
   insert into myqueue (myvalue) values (10050) ;
RESULTS:
limit_id  myvalue
1         10050

Note: It's not very efficient.

Regards,
Eli

 

The following SQL should do what you want. It is also quite efficient 
because it always uses the implicit index associated with the integer 
primary key field, and usually only touches the first or last record.


-- the table to be used as a fifo
create table fifo (
   id integer primary key autoincrement,
   data varchar
);

-- remove oldest record from the fifo
-- after each insert into the fifo
create trigger fifo_limit after insert on fifo
begin
   delete from fifo
   where id = (select max(id) from fifo) - 1; -- max size is 1
end;

The following slightly more complicated version allows the size of the 
fifo to be changed on the fly.


-- the table to be used as a fifo
create table fifo (
   id integer primary key autoincrement,
   data varchar
);

-- table to store the fifo size limit
create table fifo_size (
   max_entries integer
);

-- set the maximum size of the fifo
-- this can be changed later
insert into fifo_size values(5);

-- remove any records beyond the fifo size limit
-- whenever the limit is changed
create trigger fifo_resize after update on fifo_size
begin
   delete from fifo
   where id = (select max(id) from fifo) - (select max_entries from fifo_size);
end;

-- remove oldest record from the fifo
-- after each insert into the fifo
create trigger fifo_limit after insert on fifo
begin
   delete from fifo
   where id = (select max(id) from fifo) - (select max_entries from fifo_size);
end;

The following is a sample run of the second version using fifo sizes of 
5 and 8 for demonstration.


insert into fifo values(NULL, 'one');
insert into fifo values(NULL, 'two');
insert into fifo values(NULL, 'three');
select * from fifo;
1|one
2|two
3|three
insert into fifo values(NULL, 'four');
insert into fifo values(NULL, 'five');
insert into fifo values(NULL, 'six');
select * from fifo;
2|two
3|three
4|four
5|five
6|six
insert into fifo values(NULL, 'seven');
insert into fifo values(NULL, 'eight');
insert into fifo values(NULL, 'nine');
select * from fifo;
5|five
6|six
7|seven
8|eight
9|nine
update fifo_size set max_entries = 8;
insert into fifo values(NULL, 'ten');
insert into fifo values(NULL, 'eleven');
insert into fifo values(NULL, 'twelve');
select * from fifo;
5|five
6|six
7|seven
8|eight
9|nine
10|ten
11|eleven
12|twelve
insert into fifo values(NULL, 'thirteen');
insert into fifo values(NULL, 'fourteen');
insert into fifo values(NULL, 'fifteen');
select * from fifo;
8|eight
9|nine
10|ten
11|eleven
12|twelve
13|thirteen
14|fourteen
15|fifteen
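The session above can also be replayed programmatically. A sketch with Python's sqlite3 module follows; it uses `<=` instead of `=` in both triggers, which behaves identically in this run but also trims correctly if max_entries is ever reduced by more than one:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE fifo (id INTEGER PRIMARY KEY AUTOINCREMENT, data VARCHAR);
CREATE TABLE fifo_size (max_entries INTEGER);
INSERT INTO fifo_size VALUES (5);

-- `<=` rather than `=`, so shrinking the limit by several rows
-- still trims the fifo in one step
CREATE TRIGGER fifo_resize AFTER UPDATE ON fifo_size
BEGIN
    DELETE FROM fifo
    WHERE id <= (SELECT max(id) FROM fifo)
              - (SELECT max_entries FROM fifo_size);
END;

CREATE TRIGGER fifo_limit AFTER INSERT ON fifo
BEGIN
    DELETE FROM fifo
    WHERE id <= (SELECT max(id) FROM fifo)
              - (SELECT max_entries FROM fifo_size);
END;
""")

words = ["one", "two", "three", "four", "five", "six", "seven", "eight",
         "nine", "ten", "eleven", "twelve", "thirteen", "fourteen", "fifteen"]
for i, w in enumerate(words):
    if i == 9:                     # grow the fifo before inserting 'ten'
        con.execute("UPDATE fifo_size SET max_entries = 8")
    con.execute("INSERT INTO fifo VALUES (NULL, ?)", (w,))

print(con.execute("SELECT * FROM fifo").fetchall())
# ids 8..15: the eight newest rows, matching the session above
```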

For either case, the fifo table can eventually exhaust the available id 
values (after a very long time because of the 64 bit id values). If this 
happens you will receive an SQLITE_FULL error when you try to do an 
insert (because of the autoincrement constraint on the id). When this 
happens you will need to run the following update to reset the lowest id 
to 1 before repeating the failed insert.


update fifo
   set id = id - (select min(id) - 1 from fifo);

HTH
Dennis Cote



Re: [sqlite] limiting table size?

2005-12-22 Thread Clark Christensen
Looks like the experts have given good advice.  I'll try a different tack...

Have a look at http://www.sqlite.org/cvstrac/tktview?tn=1476  The author 
pointed me to this patch in response to my question about keeping a foreign 
sequence.  But, I think the patch could work equally well for you, and for the 
poster of the 'ring buffer table' question.  If you were to set up a sequence 
with a max of 10,000, and set it to cycle, you could then use this sequence 
to set the primary key in your log table.  

So, if your log table looked like:

create table log_10k (log_oid integer primary key, timestamp, reading);

Using the enhancement, and a sequence named sq10k, you could do everything in 
a single SQL statement:

insert or replace into log_10k values (nextval('sq10k'), '2005-12-22 00:00:01', 
12345);

It should add records until the sequence (nextval()) wrapped back to 1, then 
simply overwrite the lowest numbered ones.  I don't know the cost of using 
these enhanced functions versus the trigger suggestion given by others.  This 
method has the added benefit of allowing you to tailor the size of the log 
table per installation by simply changing the values in the sequence record.  
Plus, once implemented, the sequence funcs are always available.  The downside 
is you would have to fiddle with SQLite itself, and maintain a 'custom' build.

For my project/environment, a Perl/CGI app on Solaris where I don't have 
shell/build access, building/maintaining a special SQLite build was too much 
hassle.  I took the idea from the enhancement in ticket 1476, and built my own 
equivalent functions in Perl.  Probably not really fast, but reasonable for a 
Web app in my environment.
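In the same spirit as that Perl emulation, here is a sketch of an application-side cycling sequence in Python's sqlite3; the `seq` table and the `nextval`/`log_reading` helpers are illustrative names, not part of the patch in ticket 1476:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE log_10k (log_oid INTEGER PRIMARY KEY, timestamp, reading);
-- application-side stand-in for the patched nextval() sequence
CREATE TABLE seq (name TEXT PRIMARY KEY, value INTEGER, max_value INTEGER);
INSERT INTO seq VALUES ('sq10k', 0, 10000);
""")

def nextval(con, name):
    """Advance the named sequence, wrapping back to 1 past max_value."""
    con.execute("""UPDATE seq
                   SET value = CASE WHEN value >= max_value THEN 1
                                    ELSE value + 1 END
                   WHERE name = ?""", (name,))
    return con.execute("SELECT value FROM seq WHERE name = ?",
                       (name,)).fetchone()[0]

def log_reading(con, ts, reading):
    # INSERT OR REPLACE overwrites the oldest row once the sequence wraps.
    con.execute("INSERT OR REPLACE INTO log_10k VALUES (?, ?, ?)",
                (nextval(con, 'sq10k'), ts, reading))

for i in range(10005):             # five past the limit, so the oids wrap
    log_reading(con, '2005-12-22 00:00:01', i)

print(con.execute("SELECT count(*) FROM log_10k").fetchone()[0])  # 10000
```

Unlike the patched nextval(), this version needs two statements per insert and relies on the application serializing writers, but it requires no custom SQLite build.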

Good luck!

 -Clark



- Original Message 
From: Sean Machin [EMAIL PROTECTED]
To: sqlite-users@sqlite.org
Sent: Tuesday, December 20, 2005 11:02:39 AM
Subject: [sqlite] limiting table size?

Hi All,

I'm considering using SQLite for an embedded project.  Some of the data 
I'd like to store is
timestamped sensor readings.  I'd like to know if there is a way to 
configure a table
so that it acts like a fixed length FIFO queue, e.g. stores 10,000 
records then once full
drops off the oldest record each time a new one is inserted.

Can anyone tell me if we can do that with SQLite?

Thanks,
Sean





[sqlite] limiting table size?

2005-12-20 Thread Sean Machin

Hi All,

I'm considering using SQLite for an embedded project.  Some of the data 
I'd like to store is
timestamped sensor readings.  I'd like to know if there is a way to 
configure a table
so that it acts like a fixed length FIFO queue, e.g. stores 10,000 
records then once full

drops off the oldest record each time a new one is inserted.

Can anyone tell me if we can do that with SQLite?

Thanks,
Sean


Re: [sqlite] limiting table size?

2005-12-20 Thread John Stanton
Since you are using a time you could try using a trigger that deletes 
the row with the oldest (lowest) time.  You would need to have an index 
on the timestamp.  I guess something like "where timestamp > 0 LIMIT 1" might work.
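A sketch of that idea in Python's sqlite3, with one caveat: DELETE ... LIMIT requires a special compile-time option, so this portable variant deletes the minimum timestamp instead (table and trigger names are made up; the cap is 3 for the demo):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE readings (ts INTEGER, value REAL);
CREATE INDEX readings_ts ON readings(ts);

CREATE TRIGGER readings_cap AFTER INSERT ON readings
WHEN (SELECT count(*) FROM readings) > 3
BEGIN
    -- drops every row sharing the oldest timestamp, not just one
    DELETE FROM readings
    WHERE ts = (SELECT min(ts) FROM readings);
END;
""")

for ts in range(1, 6):
    con.execute("INSERT INTO readings VALUES (?, ?)", (ts, ts * 0.5))

print(con.execute("SELECT ts FROM readings ORDER BY ts").fetchall())
# [(3,), (4,), (5,)]
```

Two trade-offs versus the id-based triggers elsewhere in this thread: the count(*) in the WHEN clause is a full scan on every insert, and duplicate timestamps are deleted together.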


Sean Machin wrote:

Hi All,

I'm considering using SQLite for an embedded project.  Some of the data 
I'd like to store is
timestamped sensor readings.  I'd like to know if there is a way to 
configure a table
so that it acts like a fixed length FIFO queue, e.g. stores 10,000 
records then once full

drops off the oldest record each time a new one is inserted.

Can anyone tell me if we can do that with SQLite?

Thanks,
Sean