Ranjitha wrote:
> Hi all,
> 
> I'm relatively new to Python

And to databases?

> and am facing a problem with database
> access
> 
> I want to store my data in a database on the disk. I also want to be
> able to reload the tables into the RAM whenever I have a lot of disk
> accesses and commit the changes back to the database.

This should be the database's job, not yours. Serious RDBMSes are
highly optimized with respect to caching and file I/O, and there's very
little chance you can do better by hand.

> There is an
> option of storing the data in the RAM where you connect to :memory:
> instead of a DB file. The problem with this is that the data is lost
> every time you close the connection to the database.

Seems quite obvious !-)
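
That said, if you really want to work in :memory: and only persist on
exit, the sqlite3 module's Connection.iterdump() (Python 2.6 and later)
can snapshot the whole database as SQL. A minimal sketch (file and
table names are made up for the example):

    import sqlite3

    # Work entirely in RAM...
    mem = sqlite3.connect(":memory:")
    mem.execute("CREATE TABLE spam (id INTEGER PRIMARY KEY, name TEXT)")
    mem.execute("INSERT INTO spam (name) VALUES (?)", ("eggs",))
    mem.commit()

    # ...then dump a snapshot to a disk database before closing.
    disk = sqlite3.connect("snapshot.db")
    disk.executescript("\n".join(mem.iterdump()))
    disk.close()
    mem.close()

But keep in mind that you then lose the durability guarantees an
on-disk database gives you: a crash loses everything since the last
snapshot.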

> Could somebody
> suggest a way to load the tables into the RAM as tables and not as some
> lists or dictionaries?

There's nothing like a "table" in Python's builtin datatypes !-)

More seriously: don't bother.

Focus first on writing correct code. Then, *if* and *when* you *really*
have a performance problem, *first* use a profiler to check where the
*real* problem is. If it turns out that SQLite itself is the
bottleneck, try switching to a real RDBMS like PostgreSQL.
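
Something like this is enough to see where the time actually goes
(load_and_query is a made-up stand-in for your own code):

    import cProfile
    import pstats

    def load_and_query():
        # Stand-in for the real database code you suspect is slow.
        return sum(i * i for i in range(100000))

    # Run it under the profiler and list the ten most expensive calls.
    cProfile.run("load_and_query()", "profile.out")
    pstats.Stats("profile.out").sort_stats("cumulative").print_stats(10)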

Remember the 3 golden rules of optimisation:
1/ don't optimize
2/ don't optimize
3/ for the experts only: don't optimize


My 2 cents...
-- 
bruno desthuilliers
"Premature optimization is the root of all evil."
(some wise guy)