Hi,

Most Python HOWTOs and similar resources suggest using 'pip', 'easy_install' or other tools to install Python modules or Python-based programs. The problem is that in PLD those tools install modules into /usr/{lib{64},share}/pythonX.Y/site-packages – the same place where Python modules from our RPM packages go.
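For illustration, this is where such installs land by default on a stock interpreter (a quick check, not PLD-specific; the exact paths depend on the Python version and on lib vs. lib64):

    # where distutils/setuptools-based installs go by default
    from distutils.sysconfig import get_python_lib
    # pure-Python modules, e.g. /usr/lib/python2.7/site-packages
    print(get_python_lib())
    # arch-specific modules, e.g. /usr/lib64/python2.7/site-packages
    print(get_python_lib(plat_specific=True))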
This is a mess and may destroy already installed packages – using pip to install a single innocent program may cause a chain reaction of installing dependency modules and overwriting old versions of those already in the system. virtualenv can help, but only if one chooses to use it.

I suggest patching python, python3 and, if necessary, other packages, so that distutils/setuptools/pip would install Python modules to /usr/local by default – like autoconf configure scripts do. Python would look for modules in /usr/local first and then in /usr.

Effects:

1. easy_install/pip/etc. would not overwrite distribution packages – that is what we want.

2. Modules installed with easy_install/pip/etc. would override those installed from RPM – that is what the user would expect when installing something manually (a rough sketch of the intended lookup order is at the end of this message).

3. No existing python-*.spec would build any more. All Python specs would need to be updated to force the proper installation directories. I would prepare %setup_py2 and %setup_py3 macros; those would use the proper Python interpreter and compiler flags too. I guess a 'sed' job on all the python-*.spec files would do the trick for most packages.

4. Existing packages (except, maybe, a few exceptions, like pip itself) would not have to be rebuilt immediately – the paths used by those packages would still be OK.

What do you think?

Jacek
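PS: a minimal sketch of the lookup-order half of the proposal, written as a sitecustomize.py-style hook purely for illustration – the real change would be patched into site.py/distutils in the python and python3 packages, and the directories below are only examples:

    # sitecustomize.py-style sketch: search /usr/local site-packages before /usr
    import sys
    import site
    from distutils.sysconfig import get_python_lib

    # pure-Python site-packages under /usr/local,
    # e.g. /usr/local/lib/python2.7/site-packages (lib64 handled the same way)
    local_pkgs = get_python_lib(prefix="/usr/local")
    site.addsitedir(local_pkgs)

    # make sure the /usr/local entry ends up before the /usr ones on sys.path
    if local_pkgs in sys.path:
        sys.path.remove(local_pkgs)
        sys.path.insert(0, local_pkgs)

The install-side half would then amount to defaulting the distutils install prefix to /usr/local (one way would be a distutils.cfg shipped with the interpreter), while RPM builds keep forcing the /usr directories explicitly.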