Hello, I have a design question that primarily comes up for me when a module needs to transition into a package. Let's assume I have a module that services a database. Initially the module just has some functions to pull data out of the database, format it, etc. As the project grows I add tables; some of the tables have static or initialization data stored in CSVs. I build functions that initialize the database, and a schema file is now used.

At this point my module is still the only Python code file I have, but there are these other data files the module needs, so it's time to make a package. Naturally I make a package folder called database, put an empty __init__.py in it, and move in my database module and all the resource files.

Here is where my primary design question comes in. Organized as described, to import and use the package I need
from database import database, or I have to put "from database import *" in the __init__.py. Either of these still leaves a database.database namespace lying about, and to me it just seems untidy.

So for a while now I have taken to converting my module, in this example database.py, into the __init__.py of the package. I remember reading years back that having code in __init__.py was bad practice, but I can't remember any specific reason being given. So I ask: is there in fact anything wrong with this practice? I have probably a dozen packages now, used internally in my organization, that follow this pattern, and I haven't encountered any issues. I was hoping to get feedback from a wider audience about whether there are issues with this kind of design that I just haven't run into yet, or to find out what other people do with this same problem.

Thanks,
Chris
-- 
https://mail.python.org/mailman/listinfo/python-list
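For readers following along, here is a minimal runnable sketch of the two layouts being compared. The names (fetch_rows, dbpkg) are illustrative only, not from Chris's actual code; the script builds two throwaway packages in a temp directory and shows the leftover submodule namespace he mentions.

```python
# Hypothetical sketch comparing the two package layouts discussed above.
# Names are illustrative, not from the original code:
#
#   database/                 dbpkg/
#       __init__.py               __init__.py   <- module body lives here
#       database.py
import sys
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())

# Layout 1: module kept as a submodule, re-exported from __init__.py
pkg1 = root / "database"
pkg1.mkdir()
(pkg1 / "database.py").write_text("def fetch_rows():\n    return ['row1']\n")
(pkg1 / "__init__.py").write_text("from .database import *\n")

# Layout 2 (the pattern asked about): the module body *is* __init__.py
pkg2 = root / "dbpkg"
pkg2.mkdir()
(pkg2 / "__init__.py").write_text("def fetch_rows():\n    return ['row1']\n")

sys.path.insert(0, str(root))
import database
import dbpkg

print(database.fetch_rows())          # works, but...
print(hasattr(database, "database"))  # True: the submodule namespace lingers

print(dbpkg.fetch_rows())             # same public API, and...
print(hasattr(dbpkg, "dbpkg"))        # False: no leftover inner namespace
```

Both layouts expose the same functions to callers; the difference is only whether the redundant database.database attribute hangs around after import.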