I mentioned in my earlier post that I am operating in a thread-safe manner.

In simple terms, my application does the following:

There are 'n' tracks (an entity in the business logic context).
Every 5 minutes, one thread is started per track ---> each of those threads 
spawns 5 threads per minute ---> each of these threads fetches data from a 
web service and inserts it into the DB.


from sqlalchemy.orm import Session


class DBHandler(object):
    '''Handles generic DB related operations'''

    def __init__(self):
        global engine                        # engine is created once at module level
        self.session = Session(bind=engine)

    def execute_query(self, query):
        # runs the given query on self.session
        pass


def insert_data():
    # builds the insert statements and runs them through dbh
    # (arguments trimmed for brevity)
    dbh = DBHandler()


def fetch_insert_data(track, minute_value):
    fetch_data()       # pulls data from the web service
    insert_data()


def processing(track):
    # creates 5 threads per minute over a 5 minute interval and calls
    # fetch_insert_data(track, minute_value) in each of them
    pass


def main():
    # creates one thread per track and calls processing(track) in it
    pass
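
Spelled out, processing() and main() do roughly this (the spawn schedule 
below, 5 worker threads at the top of each of the 5 minutes, is a 
simplification of the real scheduling code, and 'tracks' stands in for 
however the track list is actually loaded):

import threading
import time

def processing(track):
    # simplified: start 5 worker threads at the top of each of the 5 minutes
    for minute_value in range(5):
        for _ in range(5):
            worker = threading.Thread(target=fetch_insert_data,
                                      args=(track, minute_value))
            worker.start()
        time.sleep(60)

def main():
    # one thread per track; 'tracks' is a placeholder for the real track list
    for track in tracks:
        threading.Thread(target=processing, args=(track,)).start()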
      
About 90% of the queries are inserts; the few fetches involve no complex 
joins or anything heavy. I understand it will be difficult to get the whole 
picture from these code snippets.
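
To be concrete about the close() I mentioned earlier, the insert path with 
the explicit close looks roughly like this (the table and column names here 
are placeholders, not the real schema):

from sqlalchemy import text

def insert_data(track, minute_value):
    dbh = DBHandler()
    try:
        # 'track_data' and its columns are stand-ins for the real table
        dbh.session.execute(
            text("INSERT INTO track_data (track, minute) VALUES (:t, :m)"),
            {"t": track, "m": minute_value})
        dbh.session.commit()
    finally:
        # the explicit close() I was referring to
        dbh.session.close()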



On Monday, September 22, 2014 12:53:45 PM UTC-5, Michael Bayer wrote:
>
>
> On Sep 22, 2014, at 1:40 PM, Milind Vaidya <kav...@gmail.com> wrote: 
>
> > is it true of sqlalchemy verison 0.7 (which BTW I am using)? 
>
> yes. 
>
> from sqlalchemy import * 
>
> e = create_engine("mysql://scott:tiger@localhost/test", echo=True) 
>
> m = MetaData() 
> t = Table('t', m, Column('x', Integer, primary_key=True), Column('y', 
> Integer)) 
>
> m.create_all(e) 
>
> with e.begin() as conn: 
>     result = conn.execute(t.insert(), x=1, y=2) 
>     assert result.closed 
>
>
> > If I close() explicitly only for insert queries, I can see that parallel 
> inserts happening equal to no. of threads that I have, but if I remove it, 
> no parallel inserts happen(I am checking this by using show processlist on 
> mysql prompt) 
>
> there should be nothing whatsoever done in “parallel” relative to a single 
> connection with the Python DBAPI, and especially with MySQLdb.    In your 
> code snippet, “self.session” should absolutely never be shared with more 
> than one thread at a time.    Plenty of docs on this at 
> http://docs.sqlalchemy.org/en/rel_0_9/orm/session.html#is-the-session-thread-safe
>  
>
> > pool_size = 150 : when close() is called. 
> > pool_size = 30 : when close() in not called. 
>
>
> it's hard to tell what would be causing this without example code that can 
> illustrate the full context.  the kind of pattern you have in “def 
> execute_query()” is one that is usually associated with problems like 
> these, though, because it's hard to have a single logical operation in your 
> program refer to multiple, ad-hoc transactions/connections that are each 
> established every time execute_query() is called.    the preferred system 
> is one that establishes a new session at the start of an entire operation 
> at the use-case level (like, a web request starts), makes that single 
> session available throughout the scope of an operation, then when the 
> use-case is fully complete the session is closed.     the pattern here is 
> turning that inside out. 
>
>
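
If I understand the suggestion correctly, restructuring along those lines 
would mean something like the sketch below: one Session opened per 
fetch_insert_data() call (the use-case level in my program), reused for all 
the inserts in that call, and closed when the call is fully done. This is 
just my reading of the advice, not my current code, and insert_data()'s 
signature is changed here purely for illustration:

def fetch_insert_data(track, minute_value):
    session = Session(bind=engine)       # one session for this whole operation
    try:
        data = fetch_data()              # web service call, no DB involved
        insert_data(session, data)       # all inserts reuse the same session
        session.commit()
    except Exception:
        session.rollback()
        raise
    finally:
        session.close()                  # connection returned to the pool here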
