Re: [sqlite] Attach databases using tons of memory (DBI Perl)

2009-09-25 Thread John Lynch
I will try what you have mentioned below.  I am 99% sure it isn't the Perl
script, since it doesn't have the same behavior without attaches.  I will
let you know what happens.

Thank You

2009/9/25 D. Richard Hipp 

>
> On Sep 25, 2009, at 11:02 AM, John Lynch wrote:
>
> > Page size is max (32768).  These SQLite files are around 100 GB
> > with 200 million rows.
> > I have query_cache set to 0.
> >
> > I am running single queries against these attached tables and then
> > putting the data in memcached.  The script runs thousands of
> > queries before it runs out of memory.  If I use a table of the same
> > width (a product of all 3 attached tables, flattened into 1 table)
> > and run it with the same script, I have no memory issues.  I am
> > hoping there is something I am missing with caching and attach.
> > Maybe a pragma or something?
> >
>
>
> SQLite's memory usage should be self-limiting.  No special pragma
> settings required.  You can reduce the amount of memory requested by
> using:
>
> PRAGMA main.cache_size=10;
>
> Repeat this for each of the attached databases.  But even if you don't
> do this, the default cache size is 2000 pages, which is only about
> 64 MB for each attached database.
>
> What do sqlite3_memory_used() and sqlite3_memory_highwater() tell you
> about SQLite's memory usage?  Are you sure that it is SQLite and not
> your Perl script that is using the memory?
>
> D. Richard Hipp
> d...@hwaci.com



-- 
--
John Lynch
(217) 953-0557
johntly...@gmail.com


Re: [sqlite] Attach databases using tons of memory (DBI Perl)

2009-09-25 Thread John Lynch
I only query for one row at a time.

2009/9/25 Simon Slavin 

>
> On 25 Sep 2009, at 1:10pm, Kees Nuyt wrote:
>
> > 4) Don't try to collect the full resultset into a Perl
> > array.
>
> LOL
>
> Simon.



-- 
--
John Lynch
(217) 953-0557
johntly...@gmail.com


Re: [sqlite] Attach databases using tons of memory (DBI Perl)

2009-09-25 Thread Simon Slavin

On 25 Sep 2009, at 1:10pm, Kees Nuyt wrote:

> 4) Don't try to collect the full resultset into a Perl
> array.

LOL

Simon.


Re: [sqlite] Attach databases using tons of memory (DBI Perl)

2009-09-25 Thread D. Richard Hipp

On Sep 25, 2009, at 11:02 AM, John Lynch wrote:

> Page size is max (32768).  These SQLite files are around 100 GB with
> 200 million rows.
> I have query_cache set to 0.
>
> I am running single queries against these attached tables and then
> putting the data in memcached.  The script runs thousands of queries
> before it runs out of memory.  If I use a table of the same width (a
> product of all 3 attached tables, flattened into 1 table) and run it
> with the same script, I have no memory issues.  I am hoping there is
> something I am missing with caching and attach.  Maybe a pragma or
> something?
>


SQLite's memory usage should be self-limiting.  No special pragma  
settings required.  You can reduce the amount of memory requested by  
using:

 PRAGMA main.cache_size=10;

Repeat this for each of the attached databases.  But even if you don't
do this, the default cache size is 2000 pages, which is only about
64 MB for each attached database.
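
In DBI terms that advice looks roughly like the sketch below.  It is
illustrative only: the file names (cust/acct/web) follow the original
post, and 10 pages is just the example value above.

    # Cap the page cache on the main database and on each attachment.
    # Every schema (main plus each ATTACHed database) keeps its own
    # page cache, so the pragma is issued once per schema name.
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:SQLite:dbname=cust.db', '', '',
                           { RaiseError => 1, AutoCommit => 1 });

    $dbh->do("ATTACH DATABASE 'acct.db' AS acct");
    $dbh->do("ATTACH DATABASE 'web.db'  AS web");

    for my $schema (qw(main acct web)) {
        $dbh->do("PRAGMA $schema.cache_size = 10");
    }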

What do sqlite3_memory_used() and sqlite3_memory_highwater() tell you
about SQLite's memory usage?  Are you sure that it is SQLite and not
your Perl script that is using the memory?
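
Those two are C-level calls.  From Perl, newer DBD::SQLite releases
expose DBD::SQLite::sqlite_status(), which reports the same counters.
A hedged sketch, assuming your installed version provides it:

    # sqlite_status() returns a hashref of SQLite's own allocation
    # counters, each entry carrying a current and a highwater value.
    # Older DBD::SQLite releases may not have it, hence the check.
    use strict;
    use warnings;
    use DBD::SQLite;

    if (DBD::SQLite->can('sqlite_status')) {
        my $status = DBD::SQLite::sqlite_status();
        printf "SQLite memory: current=%d, highwater=%d bytes\n",
            $status->{memory_used}{current},
            $status->{memory_used}{highwater};
    }
    else {
        warn "this DBD::SQLite does not expose sqlite_status()\n";
    }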

D. Richard Hipp
d...@hwaci.com





Re: [sqlite] Attach databases using tons of memory (DBI Perl)

2009-09-25 Thread John Lynch
Page size is max (32768).  These SQLite files are around 100 GB with
200 million rows.
I have query_cache set to 0.

I am running single queries against these attached tables and then
putting the data in memcached.  The script runs thousands of queries
before it runs out of memory.  If I use a table of the same width (a
product of all 3 attached tables, flattened into 1 table) and run it
with the same script, I have no memory issues.  I am hoping there is
something I am missing with caching and attach.  Maybe a pragma or
something?
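
The loop is essentially the pattern below (a sketch only; the table
and column names are hypothetical stand-ins, not the real schema):

    # One query per customer against the three attached tables, with
    # the single result row pushed into memcached.  Only one row is
    # ever held on the Perl side at a time.
    use strict;
    use warnings;
    use DBI;
    use Cache::Memcached;

    my $dbh = DBI->connect('dbi:SQLite:dbname=cust.db', '', '',
                           { RaiseError => 1 });
    $dbh->do("ATTACH DATABASE 'acct.db' AS acct");
    $dbh->do("ATTACH DATABASE 'web.db'  AS web");

    my $memd = Cache::Memcached->new(
        { servers => ['127.0.0.1:11211'] });

    # Prepare once, execute per key.
    my $sth = $dbh->prepare(q{
        SELECT c.name, a.balance, w.last_login
          FROM main.cust c
          JOIN acct.acct a ON a.cust_id = c.id
          JOIN web.web   w ON w.cust_id = c.id
         WHERE c.id = ?
    });

    my @ids = @ARGV;   # customer keys to warm
    for my $cust_id (@ids) {
        $sth->execute($cust_id);
        if (my $row = $sth->fetchrow_arrayref) {
            $memd->set("cust:$cust_id", join('|', @$row));
        }
        $sth->finish;
    }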

2009/9/25 P Kishor 

> On Thu, Sep 24, 2009 at 3:30 PM, John Lynch  wrote:
> > SQLite users,
> >
> > I am using SQLite for a very large customer database.  I have 3
> > separate DB files with a table in each file.  The files are cust,
> > acct, and web.
> >
> > Each DB table has ~200 million rows.  When I attach web and acct to
> > cust with Perl DBI/DBD::SQLite, the process runs out of memory on
> > Windows (2 GB 32-bit process limit).  The memory usage grows out of
> > control while querying all tables joined together via the attach.
> >
> > When I use a flattened table of the same data (i.e. CREATE TABLE ...
> > AS SELECT * FROM ..., built via the same attach), I have no
> > problems.
> >
>
> Are you running out of memory when you ATTACH, or when you actually
> perform a specific query? If the only difference between your two
> instances is "ATTACH," and you are doing everything exactly the same
> and running out of memory with ATTACH and not without ATTACH, then
> there is something going on at the point of ATTACH. More info from you
> will shed light on this.
>
> Besides providing more info on this here, you might also want to ask
> this question at perlmonks, in case this is a Perl-specific issue.
>
>
>
>
> >
> > Any ideas on how to reduce the memory consumption when using attach?
> >
> > Thank You
> >
> > --
> > --
> > John Lynch
> > (217) 953-0557
> > johntly...@gmail.com
>
>
>
> --
> Puneet Kishor http://www.punkish.org
> Carbon Model http://carbonmodel.org
> Charter Member, Open Source Geospatial Foundation http://www.osgeo.org
> Science Commons Fellow, http://sciencecommons.org/about/whoweare/kishor
> Nelson Institute, UW-Madison http://www.nelson.wisc.edu
> ---
> Assertions are politics; backing up assertions with evidence is science
> ===
> Sent from Madison, WI, United States



-- 
--
John Lynch
(217) 953-0557
johntly...@gmail.com


Re: [sqlite] Attach databases using tons of memory (DBI Perl)

2009-09-25 Thread P Kishor
On Thu, Sep 24, 2009 at 3:30 PM, John Lynch  wrote:
> SQLite users,
>
> I am using SQLite for a very large customer database.  I have 3
> separate DB files with a table in each file.  The files are cust,
> acct, and web.
>
> Each DB table has ~200 million rows.  When I attach web and acct to
> cust with Perl DBI/DBD::SQLite, the process runs out of memory on
> Windows (2 GB 32-bit process limit).  The memory usage grows out of
> control while querying all tables joined together via the attach.
>
> When I use a flattened table of the same data (i.e. CREATE TABLE ...
> AS SELECT * FROM ..., built via the same attach), I have no
> problems.
>

Are you running out of memory when you ATTACH, or when you actually
perform a specific query? If the only difference between your two
instances is "ATTACH," and you are doing everything exactly the same
and running out of memory with ATTACH and not without ATTACH, then
there is something going on at the point of ATTACH. More info from you
will shed light on this.

Besides providing more info on this here, you might also want to ask
this question at perlmonks, in case this is a Perl-specific issue.
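
One crude way to bisect it (a sketch; pause at each stage and watch
the process in Task Manager or perfmon before continuing):

    # Does memory grow at ATTACH time, or only once the queries run?
    use strict;
    use warnings;
    use DBI;

    sub pause {
        print "$_[0] -- check process memory, then press Enter\n";
        my $ignored = <STDIN>;
    }

    my $dbh = DBI->connect('dbi:SQLite:dbname=cust.db', '', '',
                           { RaiseError => 1 });
    pause('connected');

    $dbh->do("ATTACH DATABASE 'acct.db' AS acct");
    $dbh->do("ATTACH DATABASE 'web.db'  AS web");
    pause('attached');

    # ... now run the real query loop and watch the numbers again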




>
> Any ideas on how to reduce the memory consumption when using attach?
>
> Thank You
>
> --
> --
> John Lynch
> (217) 953-0557
> johntly...@gmail.com



-- 
Puneet Kishor http://www.punkish.org
Carbon Model http://carbonmodel.org
Charter Member, Open Source Geospatial Foundation http://www.osgeo.org
Science Commons Fellow, http://sciencecommons.org/about/whoweare/kishor
Nelson Institute, UW-Madison http://www.nelson.wisc.edu
---
Assertions are politics; backing up assertions with evidence is science
===
Sent from Madison, WI, United States


Re: [sqlite] Attach databases using tons of memory (DBI Perl)

2009-09-25 Thread Kees Nuyt
On Thu, 24 Sep 2009 15:30:40 -0500, John Lynch
 wrote:

>SQLite users,
>
>I am using SQLite for a very large customer database.  I have 3
>separate DB files with a table in each file.  The files are cust,
>acct, and web.
>
>Each DB table has ~200 million rows.  When I attach web and acct to
>cust with Perl DBI/DBD::SQLite, the process runs out of memory on
>Windows (2 GB 32-bit process limit).  The memory usage grows out of
>control while querying all tables joined together via the attach.
>
>When I use a flattened table of the same data (i.e. CREATE TABLE ...
>AS SELECT * FROM ..., built via the same attach), I have no problems.
>
>
>Any ideas on how to reduce the memory consumption when using attach?

1) Limit the size of the page cache with the appropriate
PRAGMAs. The cache takes page_size (bytes) * cache_size
(pages) plus some overhead; at your 32768-byte page size the
default 2000 pages comes to roughly 64 MB per database.

2) Optimize your queries, prevent full Cartesian products.

3) Limit the size of the resultset to what you really need.

e.g. http://www.sqlite.org/cvstrac/wiki?p=ScrollingCursor

4) Don't try to collect the full resultset into a Perl
array.
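
To make point 4 concrete, a small sketch of the two fetch styles
(the table name is a stand-in):

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:SQLite:dbname=cust.db', '', '',
                           { RaiseError => 1 });

    # Memory-hungry: selectall_arrayref() materialises the entire
    # resultset as Perl structures before returning a single row.
    # my $all = $dbh->selectall_arrayref('SELECT id, name FROM cust');

    # Memory-friendly: stream one row at a time; nothing accumulates.
    my $sth = $dbh->prepare('SELECT id, name FROM cust');
    $sth->execute;
    while (my $row = $sth->fetchrow_arrayref) {
        # process $row here; it is reused on the next iteration
    }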

>Thank You

HTH
-- 
  (  Kees Nuyt
  )
c[_]