tags 652313 patch
thanks

On Wed, 2011-12-21 at 13:23 -0500, Adam C Powell IV wrote:
> retitle 652313 Needs mpich2 targets in debian/rules
> block 652313 by 652312
> thanks
> 
> Hi Julien,
> 
> On Mon, 2011-12-19 at 21:03 +0100, Julien Cristau wrote:
> > On Mon, Dec 19, 2011 at 08:43:12 -0500, Adam C Powell IV wrote:
> > > On Sat, 2011-12-17 at 17:32 +0100, Julien Cristau wrote:
> > > > On Fri, Dec 16, 2011 at 08:00:15 -0500, Adam C Powell IV wrote:
> > > > > I think blacs-mpi, scalapack and suitesparse make sense for no-change
> > > > > rebuilds.  But I've been procrastinating maintenance on the rest (just
> > > > > took care of spooles last night, working on hypre now) so this will
> > > > > motivate me to finish up hypre, scotch and mumps -- I'll take care of
> > > > > those three.
> > > > 
> > > > What are the archs where those rebuilds are needed?
> > > > 
> > > > Cheers,
> > > > Julien
> > > 
> > > armel, armhf, mips, mipsel, s390, s390x, sparc.
> > 
> > Can't do that for blacs-mpi, it's broken:
> > http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=650804
> 
> The bug is in the lam implementation, which does not include an f90
> wrapper.  When the rebuild happens, it will use mpich2, which does have
> an f90 wrapper, so it should work.  So please schedule this build.
I've just confirmed that blacs-mpi builds with libmpich2-dev installed,
and fails to build with lam4-dev installed, with the same failure mode
as on the buildds.  Simply rescheduling a build of blacs-mpi with the
new mpi-defaults (depending on mpich2 instead of lam) should therefore
close bug #650804.  Please schedule this build on armel, mips, mipsel,
s390, s390x and sparc.  (armhf apparently has openmpi.)

> > Scheduled rebuilds for scalapack.
> 
> Unfortunately scalapack depends on blacs-mpi, so blacs-mpi needs to go
> first.
> 
> Looking at the log of the scheduled build, scalapack failed because it
> doesn't have an mpich2 target, so that will need just a bit of work;
> please do not schedule a second build of scalapack.

I'm attaching a patch which fixes #652313.  But blacs-mpi needs to be
built before uploading scalapack with this fix.

Also, I thoughtlessly uploaded mumps earlier today, which depends on
scalapack and blacs-mpi. :-(  Oh, it's failing everywhere, and I think I
know why; I will fix and re-upload after the new scalapack goes in.

Thanks,
Adam
-- 
GPG fingerprint: D54D 1AEE B11C CE9B A02B C5DD 526F 01E8 564E E4B6

Engineering consulting with open source tools
http://www.opennovation.com/
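As an aside for anyone reading the attached patch: it drops the old
SCALAPACK_MPI detection hack in debian/rules (a readlink on
/etc/alternatives/mpi piped through sed) in favor of the mpi-defaults
include.  Here is a sketch of what the old hack computed, run against a
hypothetical sample link target rather than the real symlink:

```shell
# Sketch of the old debian/rules hack: derive the MPI implementation name
# by stripping path components from the /etc/alternatives/mpi link target.
# /usr/lib/mpich2 is a made-up sample target, not read from the system.
link=/usr/lib/mpich2
echo "$link" | sed s/usr//g | sed s/include//g | sed s/lib//g | sed s/\\///g
# -> mpich2
```

With the mpi-defaults approach, debian/rules instead reads
ARCH_DEFAULT_MPI_IMPL from /usr/share/mpi-default-dev/debian_defaults,
the per-architecture default chosen by the mpi-defaults package, so
nothing has to be guessed from a symlink path at build time.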
diff -ur scalapack-1.8.0.orig/debian/changelog scalapack-1.8.0/debian/changelog
--- scalapack-1.8.0.orig/debian/changelog	2011-09-18 11:09:36.000000000 -0400
+++ scalapack-1.8.0/debian/changelog	2011-12-22 16:58:13.000000000 -0500
@@ -1,3 +1,11 @@
+scalapack (1.8.0-8) unstable; urgency=low
+
+  [ Adam C. Powell, IV ]
+  * Non-maintainer upload.
+  * Works with mpich2 as mpi default.
+
+ -- Muammar El Khatib <muam...@debian.org>  Thu, 22 Dec 2011 16:58:01 -0500
+
 scalapack (1.8.0-7) unstable; urgency=low
 
   * Test command execution failure caused due to incompatible file location in
diff -ur scalapack-1.8.0.orig/debian/patches/01_SLmake.inc.patch scalapack-1.8.0/debian/patches/01_SLmake.inc.patch
--- scalapack-1.8.0.orig/debian/patches/01_SLmake.inc.patch	2011-09-18 10:28:24.000000000 -0400
+++ scalapack-1.8.0/debian/patches/01_SLmake.inc.patch	2011-12-22 16:04:43.000000000 -0500
@@ -1,7 +1,7 @@
 Index: scalapack-1.8.0/SLmake.inc
 ===================================================================
---- scalapack-1.8.0.orig/SLmake.inc	2007-04-07 06:37:26.000000000 +0200
-+++ scalapack-1.8.0/SLmake.inc	2011-09-18 15:49:24.971359670 +0200
+--- scalapack-1.8.0.orig/SLmake.inc
++++ scalapack-1.8.0/SLmake.inc
 @@ -33,15 +33,30 @@
 #
 # MPI setup; tailor to your system if using MPIBLACS
@@ -35,7 +35,7 @@
 BLACSFINIT = -lblacsF77init-lam
 BLACSCINIT = -lblacsCinit-lam
 BLACSLIB = -lblacs-lam
-@@ -56,7 +71,7 @@
+@@ -56,13 +71,28 @@
 BLACSCINIT = /usr/lib/libblacsCinit-mpich.a
 BLACSLIB = /usr/lib/libblacs-mpich.a
 else
@@ -44,7 +44,28 @@
 BLACSFINIT = -lblacsF77init-mpich
 BLACSCINIT = -lblacsCinit-mpich
 BLACSLIB = -lblacs-mpich
-@@ -96,10 +111,10 @@
+ endif
+ TESTINGdir = $(home)/TESTING
+ endif
++ifeq ($(MPI),mpich2)
++USEMPI = -DUsingMpiBlacs
++ifeq ($(BUILD),static)
++SMPLIB = -L/usr/lib/mpich2/lib/ -lmpich
++BLACSFINIT = /usr/lib/libblacsF77init-mpich2.a
++BLACSCINIT = /usr/lib/libblacsCinit-mpich2.a
++BLACSLIB = /usr/lib/libblacs-mpich2.a
++else
++SMPLIB = -L/usr/lib/mpich2/lib/ -lmpich
++BLACSFINIT = -lblacsF77init-mpich2
++BLACSCINIT = -lblacsCinit-mpich2
++BLACSLIB = -lblacs-mpich2
++endif
++TESTINGdir = $(home)/TESTING
++endif
+ ifeq ($(MPI),pvm)
+ USEMPI =
+ ifeq ($(BUILD),static)
+@@ -96,10 +126,10 @@
 #
 # The fortran and C compilers, loaders, and their flags
 #
@@ -57,14 +78,14 @@
 F77FLAGS = -Wall -O6 -funroll-all-loops -ffast-math $(NOOPT)
 CCFLAGS = -Wall $(FPIC) -O6 -funroll-all-loops -ffast-math
 SRCFLAG =
-@@ -117,7 +132,7 @@
+@@ -117,7 +147,7 @@
 # C preprocessor defs for compilation
 # (-DNoChange, -DAdd_, -DUpCase, or -Df77IsF2C)
 #
-@@ -129,7 +144,7 @@
+@@ -129,7 +159,7 @@
 # The archiver and the flag(s) to use when building archive (library)
 # Also the ranlib routine.  If your system has no ranlib, set RANLIB = echo
 #
 # The name of the libraries to be created/linked to
 #
 SCALAPACKLIB = $(home)/scalapack_$(MPI).a
diff -ur scalapack-1.8.0.orig/debian/rules scalapack-1.8.0/debian/rules
--- scalapack-1.8.0.orig/debian/rules	2011-09-18 10:28:24.000000000 -0400
+++ scalapack-1.8.0/debian/rules	2011-12-22 16:51:57.000000000 -0500
@@ -11,8 +11,8 @@
 
 topdir=$(shell pwd)
 
-# This little hack works on mpich, lam and openmpi as of the lenny release
-SCALAPACK_MPI=$(shell readlink /etc/alternatives/mpi | sed s/usr//g | sed s/include//g | sed s/lib//g | sed s/\\///g)
+include /usr/share/mpi-default-dev/debian_defaults
+SCALAPACK_MPI=$(ARCH_DEFAULT_MPI_IMPL)
 
 build: build-$(SCALAPACK_MPI) build-pvm
 
@@ -22,6 +22,8 @@
 
 build-mpich: build-stamp-mpich
 
+build-mpich2: build-stamp-mpich2
+
 build-pvm: build-stamp-pvm
 
 build-stamp-openmpi:
@@ -165,6 +167,55 @@
 
 	touch build-stamp-mpich
 
+build-stamp-mpich2:
+	dh_testdir
+
+# next is a clean
+	echo *** cleaning object files ***
+	BASEDIR=$(topdir) make clean
+
+# build the shared libraries
+	echo *** building shared libraries for mpich2 ***
+	BASEDIR=$(topdir) MPI=mpich2 FPIC=-fPIC make lib
+	mkdir -p tmp
+	set -e ;\
+	for i in scalapack ; do \
+	cd tmp ;\
+	ar x ../$${i}_mpich2.a ;\
+	cd .. ;\
+	gcc -shared -Wl,-soname=lib$$i-mpich2.so.$(version_major) -o \
+	lib$$i-mpich2.so.$(version) tmp/*.o -lblas -llapack -lblacsCinit-mpich2 -lblacs-mpich2 -lmpich -lgfortran;\
+	ln -snf lib$$i-mpich2.so.$(version) lib$$i-mpich2.so.$(version_major) ;\
+	ln -snf lib$$i-mpich2.so.$(version_major) lib$$i-mpich2.so ;\
+	rm tmp/* ;\
+	done
+	rmdir tmp
+# for i in $$(find -name "*.f"); do \
+# if grep '^[^\*].*TOTMEM *= *' $$i | grep -v 64000000 >/dev/null ; then \
+# cat $$i | sed 's,\(^[^\*].*TOTMEM *= *\)[0-9]*,\164000000,g' >tmp ;\
+# mv tmp $$i;\
+# fi;\
+# done
+
+# the testing binaries
+	echo *** building static testing binaries for mpich2 ***
+	BASEDIR=$(topdir) MPI=mpich2 BUILD=shared make exe
+	set -e ;\
+	cd TESTING ;\
+	for i in $$(find -name 'x*' -maxdepth 1 ! -name 'x*-openmpi' ! -name 'x*-mpich*' ! -name 'x*-pvm'); do \
+	mv $$i $$i-mpich2 ;\
+	done
+
+# next is a clean
+	echo *** cleaning object files ***
+	BASEDIR=$(topdir) make clean
+
+# build the static libraries
+	echo *** building static libraries for mpich2 ***
+	BASEDIR=$(topdir) MPI=mpich2 make lib
+
+	touch build-stamp-mpich2
+
 build-stamp-pvm:
 	dh_testdir
 
@@ -194,7 +245,7 @@
 	BASEDIR=$(topdir) MPI=pvm BUILD=shared make exe
 	set -e ;\
 	cd TESTING ;\
-	for i in $$(find -name 'x*' -maxdepth 1 ! -name 'x*-openmpi' ! -name 'x*-lam' ! -name 'x*-mpich'); do \
+	for i in $$(find -name 'x*' -maxdepth 1 ! -name 'x*-openmpi' ! -name 'x*-lam' ! -name 'x*-mpich*'); do \
 	mv $$i $$i-pvm ;\
 	done
 
@@ -244,6 +295,8 @@
 
 install-mpich: install-stamp-mpich
 
+install-mpich2: install-stamp-mpich2
+
 install-pvm: install-stamp-pvm
 
 install-stamp-openmpi: build-stamp-openmpi
@@ -315,6 +368,29 @@
 
 	touch install-stamp-mpich
 
+install-stamp-mpich2: build-stamp-mpich2
+	dh_testdir
+	dh_testroot
+
+	set -e ;\
+	for i in scalapack ; do \
+	cp -a lib$$i-mpich2.so.* \
+	`pwd`/debian/libscalapack-mpi1/usr/lib/ ;\
+	cp -a lib$$i-mpich2.so \
+	`pwd`/debian/libscalapack-mpi-dev/usr/lib/ ;\
+	done
+
+	install TESTING/x*-mpich2 \
+	`pwd`/debian/scalapack-mpi-test/usr/lib/scalapack
+
+	set -e ;\
+	for i in scalapack ; do \
+	install $${i}_mpich2.a \
+	`pwd`/debian/libscalapack-mpi-dev/usr/lib/lib$$i-mpich2.a ;\
+	done
+
+	touch install-stamp-mpich2
+
 install-stamp-pvm: build-stamp-pvm
 	dh_testdir
 	dh_testroot