On Saturday, December 22, 2012 12:43:54 PM UTC+1, Peter Otten wrote:
> wrote:
>
> > > Hello to all,
> > >
> > > I hope I can describe my problem correctly.
> > >
> > > I have written a project split up into one Main.py and different modules
> > > which are loaded using import, and here is also my problem:
> > >
> > > 1. Main.py executes:
> > > 2. Import modules
> > > 3. One of the modules is a SQLite DB datastore.
> > > 4. A second module creates an IPC socket.
> > > 5. Here is now my problem:
> > >    The IPC socket should run a sub that is stored at the SQLite DB and
> > >    returns all rows.
> > >
> > > Is there an elegant way to solve it, except a Queue? Is it possible to
> > > import modules multiple times?
> >
> > If you import a module more than once, code on the module level will be
> > executed the first time only. Subsequent imports will find the ready-to-use
> > module object in a cache (sys.modules).
> >
> > > I'm unsure because the open DB file is at another module.
> > >
> > > How is this solved in bigger projects?
> >
> > If I'm understanding you correctly, you have code on the module level that
> > creates a socket or opens a database. Don't do that!
> > Put the code into functions instead. That will give you the flexibility you
> > need for all sizes of projects. For instance:
> >
> > socket_stuff.py
> >
> > def send_to_socket(rows):
> >     socket = ...  # open socket
> >     for row in rows:
> >         ...  # do whatever it takes to serialize the row
> >     socket.close()
> >
> > database_stuff.py
> >
> > def read_table(dbname, tablename):
> >     if tablename not in allowed_table_names:
> >         raise ValueError
> >     db = sqlite3.connect(dbname)
> >     cursor = db.cursor()
> >     for row in cursor.execute("select * from %s" % tablename):
> >         yield row
> >     db.close()
> >
> > main.py
> >
> > import socket_stuff
> > import database_stuff
> >
> > if __name__ == "__main__":
> >     socket_stuff.send_to_socket(
> >         database_stuff.read_table("some_db", "some_table"))
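The sys.modules caching Peter describes can be seen directly. Below is a minimal, self-contained sketch: it writes a throwaway module (hypothetical name "demo_mod", not from the thread) whose module-level body bumps a counter, then imports it twice. The body runs once; the second import is served from the cache.

```python
import os
import sys
import tempfile

# Create a throwaway module whose module-level code counts how often
# it actually executes (it stashes the count on the sys module).
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "demo_mod.py"), "w") as f:
    f.write(
        "import sys\n"
        "sys.demo_import_count = getattr(sys, 'demo_import_count', 0) + 1\n"
    )

sys.path.insert(0, tmpdir)

import demo_mod        # first import: the module body runs
import demo_mod        # second import: served from the sys.modules cache

assert sys.demo_import_count == 1              # body ran exactly once
assert sys.modules["demo_mod"] is demo_mod     # the cached module object
```

This is exactly why several modules can all `import Datastore` and still share one database: they all receive the same cached module object.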
Hello, thanks for that answer. I think I have to describe it a bit more
broadly. This isn't real code, just an overview:

                         main.py
           /           /         \            \
    -------------------------------------------------------------
    Datastore | ModbusClient | DaliBusClient | Advanced Scheduler
    -------------------------------------------------------------

Main.py:

    import Datastore
    Datastore.startup()
    ...

(Everything should run in threads.)

Datastore.py
    # Opens a SQLite DB
    # Contains queries to serve data for the other modules
    # Central point for updating / querying etc.

ModbusClient.py
    # Opens a TCP connection to a Modbus device
    !!! Here it's interesting !!!
    This module contains predefined queries and functions.
    IF I START A FUNCTION, a SQL statement should be executed in Datastore.py.

DaliBusClient.py
    # Lamps etc. functions
    # Stored information should be kept in the Datastore

    # Advanced scheduler

I don't know, Python always looks to me like a one-script "file". But there
are big projects, like the model of an SQL server, using coordinators with
no problems running threads and exchanging data through a backbone. I have
searched a lot, but I haven't found anything about writing the "core" split
up into modules and getting over the scopes etc.

Have I forgotten anything?! Is everything unclear?? :-P
--
http://mail.python.org/mailman/listinfo/python-list
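For the layout above, Peter's advice translates directly: Datastore.py opens nothing at import time; Main.py calls startup() once, and ModbusClient.py just imports Datastore and calls its functions, because every importer gets the same cached module object. A minimal sketch, collapsed into one file for demonstration; the function names (startup, store_reading, all_readings, poll_device), the readings table, and the lock around the shared connection are my assumptions, not code from the thread:

```python
import sqlite3
import threading

# --- what Datastore.py could look like --------------------------------
_lock = threading.Lock()   # serializes access when several threads call in
_conn = None               # opened by startup(), NOT at import time

def startup(dbname=":memory:"):
    """Called once from Main.py, after the imports."""
    global _conn
    _conn = sqlite3.connect(dbname, check_same_thread=False)
    with _lock:
        _conn.execute(
            "CREATE TABLE IF NOT EXISTS readings (addr INTEGER, value INTEGER)")

def store_reading(addr, value):
    with _lock:
        _conn.execute("INSERT INTO readings VALUES (?, ?)", (addr, value))
        _conn.commit()

def all_readings():
    with _lock:
        return _conn.execute(
            "SELECT addr, value FROM readings ORDER BY addr").fetchall()

# --- what ModbusClient.py could do ------------------------------------
# In its own file this part would start with "import Datastore"; the
# sys.modules cache hands it the very same module object that Main.py
# already called startup() on.
def poll_device():
    # pretend we read register 40001 from the Modbus device
    store_reading(40001, 230)

# --- Main.py ----------------------------------------------------------
startup()
poll_device()
print(all_readings())
```

The lock plus check_same_thread=False is one simple way to let several threads share one sqlite3 connection; a Queue or one connection per thread would also work, but this keeps Datastore.py as the single central point you sketched.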