On Dec 18, 2011, at 1:56 PM, rivka wrote:

> For the sake of speed - I would like to work with my database in
> memory (disk access is a killer). So - I would like to load my
> database from disk to memory - without going through a query.all() and
> then - while working from memory - occasionally commit to the disk (or
> - when executing a commit - somehow have it commit to both the memory
> and disk versions).
If you use a SQLite :memory: database, you'll get the effect of the database being loaded into memory, but you will still have the Python overhead of the ORM, which adds a lot of automation to the task and thus takes time. Using SQLAlchemy Core will perform a lot faster, and using pysqlite directly will run in the blink of an eye, as it's 100% written in C.

The attached script illustrates the difference between ORM, Core, and raw cursor, using CPython:

querying:
done ! it took 50 sec
querying with Core instead of ORM:
done ! it took 12 sec
Querying with a raw sqlite cursor:
done ! it took 1 sec

Here's with PyPy 1.7; the results are slightly more egalitarian:

querying:
done ! it took 23 sec
querying with Core instead of ORM:
done ! it took 7 sec
Querying with a raw sqlite cursor:
done ! it took 4 sec

> Is it possible with using sqlite or do I need the big bucks to go get
> a copy of commercial grade db software?

You never need "commercial grade" anything these days; PostgreSQL is as industrial strength as anyone would need, and it's free. That said, SQLite is a very fast performer too, especially with the :memory: db.

-- 
You received this message because you are subscribed to the Google Groups "sqlalchemy" group.
To post to this group, send email to sqlalchemy@googlegroups.com.
To unsubscribe from this group, send email to sqlalchemy+unsubscr...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/sqlalchemy?hl=en.
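To see the ":memory:" behavior without any of the SQLAlchemy layers, here is a minimal sketch using only the stdlib sqlite3 module (the same driver "pysqlite" refers to above). The `object` table and tuple-per-row shape mirror the attached script; the row count of 1000 is just an illustrative choice:

```python
import sqlite3

# ":memory:" creates a database that lives entirely in RAM;
# it disappears when the connection is closed.
conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("create table object (id integer primary key)")
cursor.executemany(
    "insert into object (id) values (?)",
    [(i,) for i in range(1000)],
)
conn.commit()

# query with the raw cursor, the fastest path measured above
cursor.execute("select * from object")
rows = [(row[0],) for row in cursor.fetchall()]
print(len(rows))
```

With SQLAlchemy, the equivalent engine would be create_engine("sqlite://") - an empty path means :memory:.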
from sqlalchemy import create_engine
from sqlalchemy import *
from sqlalchemy.orm import *
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class MyObject(Base):
    __tablename__ = "object"
    id = Column(Integer, primary_key=True)

e = create_engine("sqlite:///file.db")
Base.metadata.drop_all(e)
Base.metadata.create_all(e)

s = Session(e)
for i in xrange(1200):
    # insert with ORM:
    #s.add_all([
    #    MyObject(id=i * 1000 + j) for j in xrange(1000)
    #])
    #s.flush()

    # insert with Core:
    s.execute(
        MyObject.__table__.insert(),
        [{} for j in xrange(1000)]
    )
    print "inserted %d" % (i * 1000)
s.commit()

import time

print "querying:"
now = time.time()
obj = s.query(MyObject).all()
print "done ! it took %d sec" % (time.time() - now)
assert len(obj) == 1200000, len(obj)

print "querying with Core instead of ORM:"
now = time.time()
obj = [(row.id,) for row in s.execute(MyObject.__table__.select())]
print "done ! it took %d sec" % (time.time() - now)
assert len(obj) == 1200000, len(obj)

print "Querying with a raw sqlite cursor:"
now = time.time()
conn = e.raw_connection()
cursor = conn.cursor()
cursor.execute("select * from object")
obj = [(row[0],) for row in cursor.fetchall()]
print "done ! it took %d sec" % (time.time() - now)
assert len(obj) == 1200000, len(obj)
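On the "occasionally commit to disk" part of the question: modern Python (3.7+) exposes SQLite's online backup API as Connection.backup(), which copies a whole :memory: database to a file in one call. That method did not exist in the pysqlite of this era, so treat this as a present-day sketch of the idea rather than what was available at the time; the table name and row count are illustrative:

```python
import os
import sqlite3
import tempfile

# in-memory working database
mem = sqlite3.connect(":memory:")
mem.execute("create table object (id integer primary key)")
mem.executemany(
    "insert into object (id) values (?)",
    [(i,) for i in range(100)],
)
mem.commit()

# periodically snapshot the entire in-memory database to disk
path = os.path.join(tempfile.mkdtemp(), "snapshot.db")
disk = sqlite3.connect(path)
mem.backup(disk)

# the on-disk copy now holds the same rows
disk_count = disk.execute("select count(*) from object").fetchone()[0]
print(disk_count)
disk.close()
mem.close()
```

Running the backup on a timer or every N commits gives the "work in memory, persist occasionally" pattern without routing every write through the disk file.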