On Sat, Mar 21, 2009 at 1:25 PM, Nicolas Williams
<nicolas.willi...@sun.com> wrote:
> On Sat, Mar 21, 2009 at 12:14:43PM -0500, P Kishor wrote:
>> If I can't improve 33 ms per query, then I can experiment with
>
> 33ms per-query sounds like you're not caching enough of the database in
> memory.  What's the cache size?  Can you jack it up?

hmmm... never thought of the cache size. From the docs...

"PRAGMA default_cache_size = Number-of-pages;

Query or change the maximum number of database disk pages that SQLite
will hold in memory at once. Each page uses 1K on disk and about 1.5K
in memory. This pragma works like the cache_size pragma with the
additional feature that it changes the cache size persistently. With
this pragma, you can set the cache size once and that setting is
retained and reused every time you reopen the database."

So, any suggestions on what cache size I should experiment with? And,
does this have to be set *before* the db is created? From the above
description it sounds like I can set the cache_size at any time.
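
If it is just a matter of issuing a pragma on the handle after connecting,
I am guessing it would look something like the following (100000 pages is
an arbitrary number to start experimenting with, and the filename is made up):

use DBI;

# connect to the existing on-disk db (path is just for illustration)
my $dbh = DBI->connect('dbi:SQLite:dbname=carbon.db', '', '',
                       { RaiseError => 1, AutoCommit => 1 });

# bigger page cache for this connection only -- roughly 150 MB at
# ~1.5K per page, per the docs quoted above. Using default_cache_size
# instead would persist the setting in the db file itself.
$dbh->do('PRAGMA cache_size = 100000');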

> Does the entire
> dataset fit in memory?  If so, why isn't it all in memory?  Or if it is
> all in memory, what's SQLite3 spending its time on?

How do I take a db on disk and load it all into memory? I have never
done that before. It seems I can create a db in memory with

my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:');

But how do I load an existing db on disk into that in-memory db? Also,
isn't there a 2 GB limit to the amount of RAM that a 32-bit process can
address?
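
The only thing I can think of (just a sketch -- file and table names are
made up) would be to ATTACH the disk file to the :memory: handle and copy
each table over, but I don't know if that is the right way to do it:

my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '', { RaiseError => 1 });

# attach the on-disk db and copy every user table into :memory:
$dbh->do(q{ATTACH DATABASE 'carbon.db' AS disk});

my $tables = $dbh->selectcol_arrayref(
    q{SELECT name FROM disk.sqlite_master
      WHERE type = 'table' AND name NOT LIKE 'sqlite_%'}
);
for my $t (@$tables) {
    # note: CREATE TABLE ... AS SELECT does not copy indexes, so any
    # indexes would have to be recreated on the in-memory copy
    $dbh->do(qq{CREATE TABLE main.$t AS SELECT * FROM disk.$t});
}
$dbh->do(q{DETACH DATABASE disk});

Is that reasonable, or is there a more direct way?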

By the way, even though I have a lot of computing horsepower, I would
like to work toward a solution that would work reasonably well even
without access to a cluster. While no one would expect lightning-fast
responses for model runs over millions of cells, it would be nice to
cut the time from several hours down to sub-hour levels. But that is a
longer road to tread.


>
> Can you profile your application?  You could use DTrace if on Solaris,
> FreeBSD or MacOS X (actually, there's an early port of DTrace to Linux
> too that I hear is usable).
>

I will do that eventually. For now, I don't really have an
application, just the very beginnings of a db. What you see in this
thread are the results of my profiling, albeit at a very coarse level,
using the Benchmark Perl module.
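
In case it helps, the timings came from nothing fancier than
Benchmark::timethis wrapped around a prepared query, roughly like this
(table and column names here are stand-ins):

use Benchmark qw(timethis);
use DBI;

my $dbh = DBI->connect('dbi:SQLite:dbname=carbon.db', '', '', { RaiseError => 1 });
my $sth = $dbh->prepare('SELECT * FROM cells WHERE cell_id = ?');

# run the query many times against random ids and let Benchmark
# report the wall-clock and CPU time per iteration
timethis(10_000, sub {
    $sth->execute(int rand 1_000_000);
    my $rows = $sth->fetchall_arrayref;
});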


> Nico
> --
>



-- 
Puneet Kishor http://www.punkish.org/
Nelson Institute for Environmental Studies http://www.nelson.wisc.edu/
Carbon Model http://carbonmodel.org/
Open Source Geospatial Foundation http://www.osgeo.org/
Sent from: Madison WI United States.