Hello again,
Actually, I found a (pretty ugly) workaround to my problem: In every
__init__.py of the subdirectories I created, I add the following lines:
import os.path, sys
parent = os.path.normpath(os.path.join(os.path.dirname(__file__), '..'))
for directory, sub_dirs, files in os.walk(parent):
    sys.path.append(directory)
That adds all of the subdirectories of the parent directory to
sys.path. I can then move my files into whatever directories I want
without having to change a single line in them.
It may not be very elegant, but for the moment, it will do.
Actually, after thinking about it, I think that my problem is more a
problem with the OS than with Python. Subdirectories are the only
method most OSes provide for file organization. If there were a way
to, for example, attach labels to files and then browse files according
to those labels, I wouldn't have this problem. (OK, I could arrange that
with the names of the files, but it doesn't feel the same to me.)
Jason: Actually, unless I am wrong, you don't need to add the path
of the package with a .pth file. It should be done automatically when
you import it.
Giovanni
The usual way is to do import pkgN and then dereference
What do you mean by 'dereference'? Do you mean something like
'import spam.foo; foo = spam.foo'?
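If that is what 'dereference' means here (my guess, not Giovanni's
words), the idiom looks like this; since 'spam' is imaginary, os.path
stands in for the submodule:

```python
# "import pkg.sub" binds only the top-level name in your namespace;
# "dereferencing" then pulls out the attribute you actually want.
import os.path            # binds 'os' here, not 'os.path'
path = os.path.join       # dereference: bind the nested attribute locally

# the equivalent one-liner:
from os.path import join as path2

print(path is path2)      # prints True -- both name the same function object
```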
My preferred way is to have __init__.py just do from submodules
import *, and then each submodule defines __all__ to specify which are
its public symbols.
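A runnable sketch of that layout (the package name 'pkg' and the
function names are invented); it builds a throwaway package on disk
just to show the effect of __all__:

```python
import os, sys, tempfile, textwrap

# Create pkg/__init__.py and pkg/spam.py in a temporary directory.
root = tempfile.mkdtemp()
os.mkdir(os.path.join(root, 'pkg'))

with open(os.path.join(root, 'pkg', '__init__.py'), 'w') as f:
    f.write('from pkg.spam import *\n')

with open(os.path.join(root, 'pkg', 'spam.py'), 'w') as f:
    f.write(textwrap.dedent('''\
        __all__ = ['cook']          # only cook() is public
        def cook():
            return 'cooked'
        def scrub():                # defined, but not exported by import *
            return 'scrubbed'
    '''))

sys.path.insert(0, root)
import pkg

print(pkg.cook())                   # prints cooked
print(hasattr(pkg, 'scrub'))        # prints False -- hidden by __all__
```

Note that names starting with an underscore are skipped by import *
even without __all__; __all__ lets you hide ordinary names like scrub
as well.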
Your solution sounds interesting (although it doesn't do exactly what I
want in the present case). I would consider it when I have to build
hierarchical packages. However, I think I read somewhere that 'from
module import *' statements could cause (probably small) performance
issues. Do you think that could be the case here?
Anyway, thank you all for your very interesting replies. (And sorry for the double post.)
TokiDoki
--
http://mail.python.org/mailman/listinfo/python-list