Eric S. Johansson wrote:
> As one would expect when creating a body of software, eventually you
> create a series of relatively generic components you find yourself using
> over and over again. As a result, I'm finding myself slightly bitten by
> the same problem I have faced multiple times in the past. Namely, how
> do you distribute "small" components that are used in a series of larger
> applications?
>
> Yes, I could turn each 300-line file into a Python module complete with
> its own source code control and installation framework. Seems like a
> bit of overkill, however. I also know I could cluster them together into
> a larger module, but I would end up with something as unwieldy as Twisted
> or Zope. I'm just trying to make a couple of nuts and bolts available,
> not a whole railcar full.
>
> Then there's the assembly problem: pulling together all of the disparate
> pieces into the application for distribution. Do you pull them together
> before you make a tarball, or do you pull them together during
> installation a la CPAN? There really should be some automated method for
> pulling together components when building an application. How many
> components is it reasonable to expect an admin to manually download?
> Two? Five? 100?
How about one? I bundle everything together. Sharing modules on the end
user's host is more difficult because you have to test many combinations.
Needless to say, end users also have a strange ability to create untested
combinations of modules :)

> I'd like to know if I'm missing something. Mini-module management and
> distribution should be possible without too much headache.

I organize projects in workspaces, where a workspace consists of several
projects pulled together from different source code repositories. I also
have small scripts to manage workspaces; for example, I don't use the
versioning system directly, like "svn update", but run a project-specific
update.py. If I want to share a mini-module, I modify update.py not to
update this module by default.

So it works like this: projects X and Y share module C. Project X improves
C and commits a new version of C, which can actually break project Y. But
since my update script won't fetch a new version of C by default, project Y
is not affected. When project Y is ready for the update, they run
"update.py C", test, fix C, and commit the fixed C. Project X updates C
when they are ready to do it.

Works like a charm for me.
--
http://mail.python.org/mailman/listinfo/python-list
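The update.py workflow described above could be sketched roughly like this. The module names (X_core, C) and the use of Subversion are assumptions for illustration; the post does not show the real script's layout:

```python
import subprocess

# Modules shared with other projects; frozen unless requested by name.
# "C" is the shared module from the post; "X_core" is a hypothetical
# project-private module.
SHARED = {"C"}
MODULES = ["X_core", "C"]  # everything this project checks out

def modules_to_update(modules, shared, requested):
    """Shared modules are skipped unless explicitly requested, so
    another project's commit to them can't break this project."""
    return [m for m in modules if m not in shared or m in requested]

def update(requested=()):
    for module in modules_to_update(MODULES, SHARED, set(requested)):
        # One working copy per module; svn stands in for any VCS here.
        subprocess.call(["svn", "update", module])
```

With this sketch, running update.py with no arguments would fetch only X_core, while "update.py C" would also pull the shared module C, matching the workflow above.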