Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-28 Thread Patrick Proniewski
On 28 sept. 2011, at 13:38, Paul Linehan wrote:

>> The granularity I'm looking for is between 1 second and 10 seconds. Cron is not
>> an option here.
> 
> I woke up this morning and there is a way that cron *_could_* do what you
> want. You appear to have figured out a way that suits you, but cron could
> be used.
> 
> 10 second granularity.
> 
> You have 6 cron jobs, each launched on the minute.
> 
> The first launches iostat and puts data into SQLite.
> The second does a sleep 10, launches iostat and puts data into SQLite,
> the third a sleep 20, and so on.
> 
> I know it's an appalling hack, but could be useful to somebody?

That's appalling :)
Especially if you consider the fact that some systems can have crond launched 
with the -j flag (jitter: adds a random sleep before running each cron job).

patpro
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-28 Thread Paul Linehan
2011/9/27 Patrick Proniewski :


>> I can't see why you would want to do this more than once every minute
>> - or do you?

> The granularity I'm looking for is between 1 second and 10 seconds. Cron is not
> an option here.


I woke up this morning and there is a way that cron *_could_* do what you
want. You appear to have figured out a way that suits you, but cron could
be used.

10 second granularity.

You have 6 cron jobs, each launched on the minute.

The first launches iostat and puts data into SQLite.
The second does a sleep 10, launches iostat and puts data into SQLite,
the third a sleep 20, and so on.

I know it's an appalling hack, but could be useful to somebody?
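
Something like this crontab, roughly (log_iostat.sh is just a made-up name for
whatever script runs iostat once and inserts the row into SQLite):

# six entries, all firing on the minute, each one offset by a sleep
* * * * *  /usr/local/bin/log_iostat.sh
* * * * *  sleep 10; /usr/local/bin/log_iostat.sh
* * * * *  sleep 20; /usr/local/bin/log_iostat.sh
* * * * *  sleep 30; /usr/local/bin/log_iostat.sh
* * * * *  sleep 40; /usr/local/bin/log_iostat.sh
* * * * *  sleep 50; /usr/local/bin/log_iostat.sh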

Sincere regards.


Paul...


> patpro

-- 


lineh...@tcd.ie

Mob: 00 353 86 864 5772
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-28 Thread Michael Schlenker
On 27.09.2011 23:07, Patrick Proniewski wrote:
> On 27 sept. 2011, at 20:18, Gabor Grothendieck wrote:
> 
>> gawk has fflush()
> 
> 
> On 27 sept. 2011, at 20:29, Roger Andersson wrote:
> 
>> stdbuf? unbuffer?
> 
> 
> none of them is available out of the box on Mac OS X, or FreeBSD.
> gawk can be installed, but I'd rather use my "while true" loop
> instead of installing gawk.

Well 'unbuffer' is a trivial Expect script, and expect IS available on
OS X out of the box...
http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/man1/expect.1.html

See for the script:
http://expect.cvs.sourceforge.net/viewvc/expect/expect/example/unbuffer?revision=5.34&view=markup
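
With that script dropped somewhere on PATH, and assuming a version of unbuffer
recent enough to have the -p (pipeline) flag, the idea would be roughly:

# untested sketch: unbuffer gives awk a pseudo-tty, so awk goes back to line buffering
iostat -d -w 10 disk0 |\
unbuffer -p awk '!/[a-zA-Z]/ {print "INSERT INTO io VALUES(datetime(\"now\",\"localtime\"),"$1","$2","$3");"}' |\
sqlite3 iostat.db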

Michael

-- 
Michael Schlenker
Software Architect

CONTACT Software GmbH   Tel.:   +49 (421) 20153-80
Wiener Straße 1-3   Fax:+49 (421) 20153-41
28359 Bremen
http://www.contact.de/  E-Mail: m...@contact.de

Sitz der Gesellschaft: Bremen
Geschäftsführer: Karl Heinz Zachries, Ralf Holtgrefe
Eingetragen im Handelsregister des Amtsgerichts Bremen unter HRB 13215
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Patrick Proniewski
On 27 sept. 2011, at 23:11, Scott Hess wrote:

> On Tue, Sep 27, 2011 at 2:07 PM, Patrick Proniewski wrote:
> 
>> On 27 sept. 2011, at 20:18, Gabor Grothendieck wrote:
>>> gawk has fflush()
>> 
>> none of them is available out of the box on Mac OS X, or FreeBSD. gawk can
>> be installed, but I'd rather use my "while true" loop instead of installing
>> gawk.
>> 
> 
> Did you try it?

Nope, I don't have gawk, so I didn't even think about trying.

>  On my Mac fflush() fixes it.

indeed. Thanks. So it's not specific to gawk, that's great news! My problem is 
solved.

regards,
patpro
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Patrick Proniewski
On 27 sept. 2011, at 20:44, Paul Linehan wrote:

> 2011/9/27 Patrick Proniewski :
> 
>>> Take a look at a utility called dstat.
> 
>> no, it's linux only.
> 
> But it is written in Python - so it should be relatively
> transportable.

and it relies on /proc/; Mac OS X does not have a /proc/ filesystem.

patpro

___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Patrick Proniewski
On 27 sept. 2011, at 20:41, Paul Linehan wrote:

> 2011/9/27 Patrick Proniewski :
> 
>> That's what I do, but I think using a loop is ugly, and I would like to find a way
>> to feed data continuously into sqlite.
> 
> I can't see why you would want to do this more than once every minute
> - or do you?

The granularity I'm looking for is between 1 second and 10 seconds. Cron is not 
an option here.

> Why, exactly, do you want to do this anyway? I'm interested because I've
> done something similar.

I have a performance issue on a file server hooked to a RAID enclosure and 
exporting the corresponding volume via NFS.
The performance problem seems to be in the RAID itself, so I'm logging I/O 
performance during production to detect anomalies.
Sample: http://perso.univ-lyon2.fr/~pproniew/kbpt-2011-09-27-22.png (Bézier 
smoothing, 24 hours of data).
We will change the storage in a few days, and this iostat logging will help 
compare before/after performance.

regards,
patpro
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Scott Hess
On Tue, Sep 27, 2011 at 2:07 PM, Patrick Proniewski wrote:

> On 27 sept. 2011, at 20:18, Gabor Grothendieck wrote:
> > gawk has fflush()
>
> none of them is available out of the box on Mac OS X, or FreeBSD. gawk can
> be installed, but I'd rather use my "while true" loop instead of installing
> gawk.
>

Did you try it?  On my Mac fflush() fixes it.

-scott
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Patrick Proniewski
On 27 sept. 2011, at 20:18, Gabor Grothendieck wrote:

> gawk has fflush()


On 27 sept. 2011, at 20:29, Roger Andersson wrote:

> stdbuf?
> unbuffer?


None of them is available out of the box on Mac OS X or FreeBSD. gawk can be 
installed, but I'd rather keep my "while true" loop than install gawk.

patpro
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Paul Linehan
2011/9/27 Patrick Proniewski :

>> Take a look at a utility called dstat.

> no, it's linux only.


But it is written in Python - so it should be relatively
transportable. I've even
managed to modify the code myself - and if I can do it, anybody can!  8-)


Paul...


> patpro

-- 


Hmmm a "life": wonder where I can download one of those?


lineh...@tcd.ie

Mob: 00 353 86 864 5772
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread David Garfield
Patrick Proniewski writes:
> On 27 sept. 2011, at 20:14, David Garfield wrote:
> 
> > Any entry in a pipe could be buffering.  In a quick test here, awk is
> > buffering.  To find the buffering, try using the pieces up to a given
> > stage with " | cat " added at the end.  If this buffers, you've found
> > the problem.
> 
> as far as my understanding goes, the simple fact that my last output doesn't go 
> into a tty is enough to trigger buffering.

Actually, any program that uses stdout and doesn't explicitly do something
about buffering will get it.  Some turn block buffering off.  Some (like,
probably, iostat) explicitly flush their buffers.  Some don't use stdout at
all.  The others get buffered.

> >  Unbuffered output is usually slower, so it is normally
> > done only to a terminal.  I think the only easy way to externally
> > disable the buffer is to wrap the program in a pseudo-tty.
> 
> apparently... not so easy by the way :)

Well, I think there are three choices for arbitrary programs:
1) Wrap it in a pseudo-tty.  I think I've seen a program to do this,
   but I don't remember where/what.
2) Override isatty() through an LD_PRELOAD.
3) Change the source, either to the program or to libc.

> > Alternatively, look for an option that lets you explicitly unbuffer.
> > (for instance, in perl, do: $| = 1; )
> 
> nothing in awk, but I could try sed instead (-l  Make output line 
> buffered)
> 
> regards,
> patpro

--David Garfield

___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Paul Linehan
2011/9/27 Patrick Proniewski :

> That's what I do, but I think using a loop is ugly, and I would like to find a way
> to feed data continuously into sqlite.


I can't see why you would want to do this more than once every minute
- or do you?

If not,
==
cron iostat > myfile.

Parse myfile - get data out. Insert into SQLite.

Delete myfile.
=== Should take a max of 5 seconds ===

Then repeat the cron for the next min, 2 mins, 5/10/whatever...

Not elegant I know, but it will do the job.

Why, exactly, do you want to do this anyway? I'm interested because I've
done something similar.


Paul...





> patpro

-- 


Hmmm a "life": wonder where I can download one of those?


lineh...@tcd.ie

Mob: 00 353 86 864 5772
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Paul Linehan
2011/9/27 Patrick Proniewski :

> That's what I do, but I think using a loop is ugly, and I would like to find a
> way to feed data continuously into sqlite.


cron


Paul...

> patpro


-- 


Hmmm a "life": wonder where I can download one of those?


lineh...@tcd.ie

Mob: 00 353 86 864 5772
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Scott Hess
On Tue, Sep 27, 2011 at 11:18 AM, Gabor Grothendieck <
ggrothendi...@gmail.com> wrote:

> On Tue, Sep 27, 2011 at 2:14 PM, David Garfield
>  wrote:
> > Any entry in a pipe could be buffering.  In a quick test here, awk is
> > buffering.  To find the buffering, try using the pieces up to a given
> > stage with " | cat " added at the end.  If this buffers, you've found
> > the problem.  Unbuffered output is usually slower, so it is normally
> > done only to a terminal.  I think the only easy way to externally
> > disable the buffer is to wrap the program in a pseudo-tty.
> > Alternatively, look for an option that lets you explicitly unbuffer.
> > (for instance, in perl, do: $| = 1; )
> >
>
> gawk has fflush()


To clarify, you need to add "fflush();" at the end of your awk command.
 iostat is flushing, and awk is flushing IFF the output is to a terminal.
 But if it's to a pipe, it's not flushing, so you need to do it manually.
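
In other words, with the pipeline from the original message, something along
these lines:

iostat -d -w 10 disk0 |\
awk '!/[a-zA-Z]/ {print "INSERT INTO io VALUES(datetime(\"now\",\"localtime\"),"$1","$2","$3");"; fflush();}' |\
sqlite3 iostat.db
# fflush() after each print pushes the row to sqlite3 immediately instead of
# waiting for awk's stdio buffer to fill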

-scott
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Roger Andersson

 On 09/27/11 20:14, David Garfield wrote:

Any entry in a pipe could be buffering.  In a quick test here, awk is
buffering.  To find the buffering, try using the pieces up to a given
stage with " | cat " added at the end.  If this buffers, you've found
the problem.  Unbuffered output is usually slower, so it is normally
done only to a terminal.  I think the only easy way to externally
disable the buffer is to wrap the program in a pseudo-tty.
Alternatively, look for an option that lets you explicitly unbuffer.
(for instance, in perl, do: $| = 1; )

stdbuf?
unbuffer?
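
For instance, if GNU coreutils' stdbuf happened to be installed (it is not part
of stock OS X or FreeBSD), the idea would look roughly like:

# stdbuf -oL forces the wrapped command's stdout into line-buffered mode
iostat -d -w 10 disk0 |\
stdbuf -oL awk '!/[a-zA-Z]/ {print "INSERT INTO io VALUES(datetime(\"now\",\"localtime\"),"$1","$2","$3");"}' |\
sqlite3 iostat.db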

/Roger

___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Patrick Proniewski
On 27 sept. 2011, at 20:14, David Garfield wrote:

> Any entry in a pipe could be buffering.  In a quick test here, awk is
> buffering.  To find the buffering, try using the pieces up to a given
> stage with " | cat " added at the end.  If this buffers, you've found
> the problem.

as far as my understanding goes, the simple fact that my last output doesn't go 
into a tty is enough to trigger buffering.


>  Unbuffered output is usually slower, so it is normally
> done only to a terminal.  I think the only easy way to externally
> disable the buffer is to wrap the program in a pseudo-tty.

apparently... not so easy by the way :)

> Alternatively, look for an option that lets you explicitly unbuffer.
> (for instance, in perl, do: $| = 1; )

nothing in awk, but I could try sed instead (-l  Make output line buffered)
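
Untested, but the sed variant might look roughly like this (BSD sed assumed:
-l for line-buffered output, -n plus the p flag so only rewritten lines are
printed, -E for extended regexps):

iostat -d -w 10 disk0 |\
sed -n -l -E 's/^ *([0-9.]+) +([0-9]+) +([0-9.]+).*/INSERT INTO io VALUES(datetime("now","localtime"),\1,\2,\3);/p' |\
sqlite3 iostat.db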

regards,
patpro
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Gabor Grothendieck
On Tue, Sep 27, 2011 at 2:14 PM, David Garfield
 wrote:
> Any entry in a pipe could be buffering.  In a quick test here, awk is
> buffering.  To find the buffering, try using the pieces up to a given
> stage with " | cat " added at the end.  If this buffers, you've found
> the problem.  Unbuffered output is usually slower, so it is normally
> done only to a terminal.  I think the only easy way to externally
> disable the buffer is to wrap the program in a pseudo-tty.
> Alternatively, look for an option that lets you explicitly unbuffer.
> (for instance, in perl, do: $| = 1; )
>

gawk has fflush()
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Patrick Proniewski
On 27 sept. 2011, at 20:04, Paul Linehan wrote:

> 2011/9/27 Patrick Proniewski :
> 
> 
>> I'm facing a challenging problem. I want to log some data into an SQLite3 DB.
>> Data come from a system command (iostat) in an endless stream, one row every 
>> X seconds:
> 
> 
> Take a look at a utility called dstat. 

no, it's linux only.

patpro
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread David Garfield
Any entry in a pipe could be buffering.  In a quick test here, awk is
buffering.  To find the buffering, try using the pieces up to a given
stage with " | cat " added at the end.  If this buffers, you've found
the problem.  Unbuffered output is usually slower, so it is normally
done only to a terminal.  I think the only easy way to externally
disable the buffer is to wrap the program in a pseudo-tty.
Alternatively, look for an option that lets you explicitly unbuffer.
(for instance, in perl, do: $| = 1; )
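
To make the "| cat" test concrete (same iostat invocation as elsewhere in the
thread):

# if lines keep appearing every 10 seconds, the stage before cat is flushing;
# if they stall, that stage is the one doing the buffering
iostat -d -w 10 disk0 | cat
iostat -d -w 10 disk0 | awk '{print}' | cat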

--David Garfield

Patrick Proniewski writes:
> On 27 sept. 2011, at 18:31, Roger Andersson wrote:
> 
> > I do not know if tee makes any difference or if it's available on Mac?
> > http://unixhelp.ed.ac.uk/CGI/man-cgi?tee
> 
> tee is available, but no more luck here, as it doesn't let me disable the 
> buffer.
> 
> 
> > iostat -d -w 10 disk0 | tee -a logfile
> > and then
> > tail -f logfile | awk '!/[a-zA-Z]/ {print "INSERT INTO io 
> > VALUES(datetime(\"now\",\"localtime\"),"$1","$2","$3");"}' |\
> > sqlite3 iostat.db
> 
> same problem here ;)
> 
> patpro
> ___
> sqlite-users mailing list
> sqlite-users@sqlite.org
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users

___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Paul Linehan
2011/9/27 Patrick Proniewski :


> I'm facing a challenging problem. I want to log some data into an SQLite3 DB.
> Data come from a system command (iostat) in an endless stream, one row every X 
> seconds:


Take a look at a utility called dstat. I've twiddled with the source and
have its output go to a .csv. Basically, I've done what you want to do,
except with Oracle.

The plan then is to cron (or within Oracle - use the db scheduler) a job that
copies the data into the database - then at certain times delete down the
.csv file so that you're not continually rejecting records already in the db.

I have implemented this with Oracle - need to do a bit of work - but it's
part of what I believe *_should_* be easy -  i.e. put basic system metrics
directly into a database so that such data can be analysed over a long
period, rather than "Oh, what did iostat say yesterday?".

When I have it fully working with Oracle (XE 10), I plan to get it working
with SQLite - it should be reasonably easy using .csv and cron jobs.
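
For the SQLite side, the import step of that csv-plus-cron idea might look
roughly like this (file and table names are invented for the example, and note
that rows written between the import and the truncation would be lost):

# hypothetical cron job body: import accumulated samples, then empty the csv
sqlite3 iostat.db <<'EOF'
.separator ","
.import /var/log/iostat.csv io
EOF
: > /var/log/iostat.csv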


Paul...

-- 


Hmmm a "life": wonder where I can download one of those?


lineh...@tcd.ie

Mob: 00 353 86 864 5772
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Patrick Proniewski
On 27 sept. 2011, at 18:31, Roger Andersson wrote:

> I do not know if tee makes any difference or if it's available on Mac?
> http://unixhelp.ed.ac.uk/CGI/man-cgi?tee

tee is available, but no more luck here, as it doesn't let me disable the 
buffer.


> iostat -d -w 10 disk0 | tee -a logfile
> and then
> tail -f logfile | awk '!/[a-zA-Z]/ {print "INSERT INTO io 
> VALUES(datetime(\"now\",\"localtime\"),"$1","$2","$3");"}' |\
> sqlite3 iostat.db

same problem here ;)

patpro
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Roger Andersson

 On 09/27/11 07:48, Patrick Proniewski wrote:

I thought I could easily pipe data into SQLite:

iostat -d -w 10 disk0 |\
awk '!/[a-zA-Z]/ {print "INSERT INTO io 
VALUES(datetime(\"now\",\"localtime\"),"$1","$2","$3");"}' |\
sqlite3 iostat.db

but it won't work, because sqlite3 won't record any data until the iostat 
command ends. And of course, this iostat command will never end.
So I'm stuck with a working but very ugly script:

while true; do
  iostat -c 2 -d -w 10 disk0 |\
  tail -1 |\
  awk '!/[a-zA-Z]/ {print "INSERT INTO io 
VALUES(datetime(\"now\",\"localtime\"),"$1","$2","$3");"}' |\
  sqlite3 iostat.db
done

endless loop, forking iostat for 2 rows of data (-c 2), keeping only the last 
row because the first one is an artifact (tail -1).
I've tried various solutions with named pipes, file descriptors redirections… 
but none worked, because they all seem to require the data stream to end before 
feeding data into the DB.

Any idea?


I do not know if tee makes any difference or if it's available on Mac?
http://unixhelp.ed.ac.uk/CGI/man-cgi?tee

iostat -d -w 10 disk0 | tee -a logfile
and then
tail -f logfile | awk '!/[a-zA-Z]/ {print "INSERT INTO io 
VALUES(datetime(\"now\",\"localtime\"),"$1","$2","$3");"}' |\

sqlite3 iostat.db

/Roger
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Black, Michael (IS)
I love C myself... this does what you want, I think.  It only ever has 3 
processes running.
And since you're apparently not worried about SQL errors, there's no need to 
use the SQLite library.
A couple of changes to match your iostat output is all that's needed.
I assume you know C (a rather large assumption, I must admit).

#include <stdio.h>   /* popen/pclose, fgets, printf, ... */
#include <stdlib.h>  /* exit */
#include <string.h>

int main(int argc, char *argv[])
{
  FILE *pipe1,*pipe2;
  char buf[4096];
  char scan[4096];
  if (argc != 2) {
    fprintf(stderr,"Usage: %s [device]\n",argv[0]);
    exit(1);
  }
  /* command that produces the endless stream of samples */
  //sprintf(buf,"iostat -d -w 10 %s",argv[1]);
  sprintf(buf,"iostat %s 2",argv[1]);
  pipe1=popen(buf,"r");
  if (pipe1 == NULL) {
    perror("iostat");
    exit(1);
  }
  pipe2=popen("sqlite3 test.db","w");
  if (pipe2 == NULL) {
    perror("sqlite3");
    exit(1);
  }
  /* scan format becomes e.g. "disk0 %lf %lf %lf", so header lines are skipped */
  sprintf(scan,"%s %%lf %%lf %%lf",argv[1]);
  while(fgets(buf,sizeof(buf),pipe1)) {
    char sql[4096];
    double d1, d2, d3;
    int n=sscanf(buf,scan,&d1,&d2,&d3);
    if (n == 3) {
      printf("%s %f %f %f\n",argv[1],d1,d2,d3);
      sprintf(sql,"insert into io values(datetime('now','localtime'),%f,%f,%f);",d1,d2,d3);
      //fprintf(pipe2,"%s\n",sql);   /* uncomment to actually feed sqlite3 */
      puts(sql);
    }
  }
  pclose(pipe1);
  pclose(pipe2);
  return 0;
}
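
To try it, something along these lines (the file name is made up):

cc -o iostat2sqlite iostat2sqlite.c
./iostat2sqlite sda    # or disk0, after adjusting the iostat command line for OS X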


Michael D. Black
Senior Scientist
NG Information Systems
Advanced Analytics Directorate




From: sqlite-users-boun...@sqlite.org [sqlite-users-boun...@sqlite.org] on 
behalf of Patrick Proniewski [pat...@patpro.net]
Sent: Tuesday, September 27, 2011 6:10 AM
To: General Discussion of SQLite Database
Subject: EXT :Re: [sqlite] feed "endless" data into sqlite, thru a shell script


On 27 sept. 2011, at 08:31, Baptiste Daroussin wrote:

> You don't need awk :)
>
> iostat -d -w 10 disk0 | while read a b c; do case $a in *[a-zA-Z]*)
> continue ;; *) sqlite3 iostat.db "INSERT INTO io
> VALUES(datetime('now', 'localtime'), \"$a\", \"$b\", \"$c\");" ;;
> esac; done


Ok, this forks less, but still, you can't get rid of the loop ;) (I love awk)

thanks,
patpro

___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Simon Slavin

On 27 Sep 2011, at 1:07pm, Patrick Proniewski wrote:

> On 27 sept. 2011, at 13:44, Simon Slavin wrote:
> 
>> If you're using the OS X version, I don't think you need to run iostat as a 
>> continuous process.  Write a shell script with a timed loop which runs 
>> iostat without the '-w 10'.  So you could write a script which does
> 
> That's what I do, but I think using a loop is ugly, and I would like to find 
> a way to feed data continuously into sqlite. 

Can't be done using piping because of the problem pointed out earlier: piping 
buffers octets until it fills up a page.  That is a good solution for normal 
piping purposes and completely useless for anything that must be 
up-to-the-second.  I think the scripting solution will do better for you.

Simon.
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Patrick Proniewski
On 27 sept. 2011, at 13:44, Simon Slavin wrote:

> On 27 Sep 2011, at 12:03pm, Patrick Proniewski wrote:
> 
>> You're assuming I'm running Linux, but I'm running Mac OS X Server (or 
>> FreeBSD by the way), so no /proc here, and iostat is probably working 
>> differently too.
>> 
> 
> If you're using the OS X version, I don't think you need to run iostat as a 
> continuous process.  Write a shell script with a timed loop which runs iostat 
> without the '-w 10'.  So you could write a script which does

That's what I do, but I think using a loop is ugly, and I would like to find a 
way to feed data continuously into sqlite. 

patpro
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Simon Slavin

On 27 Sep 2011, at 12:03pm, Patrick Proniewski wrote:

> You're assuming I'm running Linux, but I'm running Mac OS X Server (or 
> FreeBSD by the way), so no /proc here, and iostat is probably working 
> differently too.
> 

If you're using the OS X version, I don't think you need to run iostat as a 
continuous process.  Write a shell script with a timed loop which runs iostat 
without the '-w 10'.  So you could write a script which does

while true
do 
iostat -d disk0 | ...
sleep 10
done

You should be able to feed just the result of the single iostat output to 
sqlite3 somehow.

Simon.
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Patrick Proniewski
On 27 sept. 2011, at 08:31, Baptiste Daroussin wrote:

> You don't need awk :)
> 
> iostat -d -w 10 disk0 | while read a b c; do case $a in *[a-zA-Z]*)
> continue ;; *) sqlite3 iostat.db "INSERT INTO io
> VALUES(datetime('now', 'localtime'), \"$a\", \"$b\", \"$c\");" ;;
> esac; done


Ok, this forks less, but still, you can't get rid of the loop ;) (I love awk)

thanks,
patpro

___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Patrick Proniewski
On 27 sept. 2011, at 08:02, Stephan Beal wrote:

> That's a tricky one, it seems. If you're not restricted to shell code, you
> could possibly do this using perl, PHP, or similar. You could open a pipe
> for iostat, read a line from the pipe, and feed that line to your db (not in
> the form of a raw text line but using the script language's sqlite3 API).
> Repeat until the pipe is eof or a signal is caught or whatever.

Changing languages could be an option, but I'd rather keep my ugly while loop 
than learn PERL :)

patpro
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Patrick Proniewski
On 27 sept. 2011, at 12:58, Simon Slavin wrote:

> On 27 Sep 2011, at 6:48am, Patrick Proniewski wrote:
> 
>> I've tried various solutions with named pipes, file descriptors 
>> redirections… but none worked, because they all seem to require the data 
>> stream to end before feeding data into the DB.
> 
> Most of your problems are caused because you're using iostat.  Can you 
> instead read the data directly out of /proc ?  Take a look at the end of the 
> 'man iostat' page for details.

You're assuming I'm running Linux, but I'm running Mac OS X Server (or FreeBSD 
by the way), so no /proc here, and iostat is probably working differently too.


patpro
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Simon Slavin

On 27 Sep 2011, at 6:48am, Patrick Proniewski wrote:

> I've tried various solutions with named pipes, file descriptors redirections… 
> but none worked, because they all seem to require the data stream to end 
> before feeding data into the DB.

Most of your problems are caused because you're using iostat.  Can you instead 
read the data directly out of /proc ?  Take a look at the end of the 'man 
iostat' page for details.

Simon.
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Patrick Proniewski
On 27 sept. 2011, at 08:21, Roger Binns wrote:

> The easiest solution is to just be patient and accept the data will be a
> little delayed.

that won't work for me, because my SQL command includes a datetime('now'). Any 
row input that is delayed won't be recorded with the proper datetime. That's 
one of the reasons why I must use tail -1 in my infinite loop. When I send more 
than one line, they all have the same datetime.


> Other solutions involve various helper programs such as using a pty so that
> the programs think they are using terminals:
> 
>  http://stackoverflow.com/questions/1000674/turn-off-buffering-in-pipe

I've neither unbuffer nor socat available on my system, but I'll read the full 
thread to grab info.

thanks,
patpro

___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Baptiste Daroussin
2011/9/27 Patrick Proniewski :
> Hello,
>
> I'm facing a challenging problem. I want to log some data into an SQLite3 DB. 
> Data come from a system command (iostat) in an endless stream, one row every X 
> seconds:
>
>         disk0
>   KB/t tps  MB/s
>   4.02 2318 9.09
>   4.00 1237  4.83
>   6.63 979  6.34
>  46.30  15  0.69
>  30.58  23  0.69
>  12.90  32  0.41
>  107.85  55  5.75
>
> I thought I could easily pipe data into SQLite:
>
> iostat -d -w 10 disk0 |\
> awk '!/[a-zA-Z]/ {print "INSERT INTO io 
> VALUES(datetime(\"now\",\"localtime\"),"$1","$2","$3");"}' |\
> sqlite3 iostat.db
>
> but it won't work, because sqlite3 won't record any data until the iostat 
> command ends. And of course, this iostat command will never end.
> So I'm stuck with a working but very ugly script:
>
> while true; do
>  iostat -c 2 -d -w 10 disk0 |\
>  tail -1 |\
>  awk '!/[a-zA-Z]/ {print "INSERT INTO io 
> VALUES(datetime(\"now\",\"localtime\"),"$1","$2","$3");"}' |\
>  sqlite3 iostat.db
> done
>
> endless loop, forking iostat for 2 rows of data (-c 2), keeping only the last 
> row because the first one is an artifact (tail -1).
> I've tried various solutions with named pipes, file descriptors redirections… 
> but none worked, because they all seem to require the data stream to end 
> before feeding data into the DB.
>
> Any idea?
>
> regards,
> patpro
> ___
> sqlite-users mailing list
> sqlite-users@sqlite.org
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
>

You don't need awk :)


iostat -d -w 10 disk0 | while read a b c; do case $a in *[a-zA-Z]*)
continue ;; *) sqlite3 iostat.db "INSERT INTO io
VALUES(datetime('now', 'localtime'), \"$a\", \"$b\", \"$c\");" ;;
esac; done

(the mail may have been split, but the above code should be one
single line :)
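
Re-wrapped for readability, it is the same command split across lines:

iostat -d -w 10 disk0 | while read a b c; do
  case $a in
    *[a-zA-Z]*) continue ;;
    *) sqlite3 iostat.db "INSERT INTO io VALUES(datetime('now', 'localtime'), \"$a\", \"$b\", \"$c\");" ;;
  esac
done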

regards,
Bapt
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Roger Binns

On 09/26/2011 10:48 PM, Patrick Proniewski wrote:
> but it won't work, because sqlite3 won't record any data until the iostat 
> command ends. 

UNIX tools using the standard I/O library will show this.  They detect that
standard output is not a terminal and buffer up output.  The buffer is
likely to be 4 or 8kb.  The buffer will be flushed each time it fills up.

The SQLite shell's standard input is likely to be doing the same thing, as
is awk.

Some explanation here:

  http://www.pixelbeat.org/programming/stdio_buffering/

The easiest solution is to just be patient and accept the data will be a
little delayed.

Other solutions involve various helper programs such as using a pty so that
the programs think they are using terminals:

  http://stackoverflow.com/questions/1000674/turn-off-buffering-in-pipe

Roger
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-27 Thread Stephan Beal
On Tue, Sep 27, 2011 at 7:48 AM, Patrick Proniewski wrote:

> while true; do
> ...

> endless loop, forking iostat for 2 rows of data (-c 2), keeping only the
> last row because the first one is an artifact (tail -1).
>

That's a tricky one, it seems. If you're not restricted to shell code, you
could possibly do this using perl, PHP, or similar. You could open a pipe
for iostat, read a line from the pipe, and feed that line to your db (not in
the form of a raw text line but using the script language's sqlite3 API).
Repeat until the pipe is eof or a signal is caught or whatever.

-- 
- stephan beal
http://wanderinghorse.net/home/stephan/
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


[sqlite] feed "endless" data into sqlite, thru a shell script

2011-09-26 Thread Patrick Proniewski
Hello,

I'm facing a challenging problem. I want to log some data into an SQLite3 DB. 
Data come from a system command (iostat) in an endless stream, one row every X 
seconds:

         disk0
   KB/t  tps  MB/s
   4.02 2318  9.09
   4.00 1237  4.83
   6.63  979  6.34
  46.30   15  0.69
  30.58   23  0.69
  12.90   32  0.41
 107.85   55  5.75

I thought I could easily pipe data into SQLite:

iostat -d -w 10 disk0 |\
awk '!/[a-zA-Z]/ {print "INSERT INTO io 
VALUES(datetime(\"now\",\"localtime\"),"$1","$2","$3");"}' |\
sqlite3 iostat.db

but it won't work, because sqlite3 won't record any data until the iostat 
command ends. And of course, this iostat command will never end.
So I'm stuck with a working but very ugly script:

while true; do
 iostat -c 2 -d -w 10 disk0 |\
 tail -1 |\
 awk '!/[a-zA-Z]/ {print "INSERT INTO io 
VALUES(datetime(\"now\",\"localtime\"),"$1","$2","$3");"}' |\
 sqlite3 iostat.db
done

endless loop, forking iostat for 2 rows of data (-c 2), keeping only the last 
row because the first one is an artifact (tail -1).
I've tried various solutions with named pipes, file descriptors redirections… 
but none worked, because they all seem to require the data stream to end before 
feeding data into the DB.

Any idea?

regards,
patpro
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users