A couple of suggestions:
- try building with gcc/gfortran - the compiler will likely flag issues
(warnings) in the sources - that might be the cause of some of the errors.
- try using the PetscInt datatype across all sources (i.e. use the .F90 suffix - and
include the petsc includes) - to avoid any integer-size mismatches [e.g. when
PETSc is built with 64-bit indices]. A minimal sketch follows below.
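e.g. a minimal .F90 source along these lines (illustrative only):

#include <petsc/finclude/petscsys.h>
      program main
      use petscsys
      implicit none
      PetscErrorCode ierr
      PetscInt n
      call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
      n = 100 ! PetscInt matches the index size PETSc was configured with
      call PetscFinalize(ierr)
      end program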
On Tue, 14 May 2024, Runjian Wu wrote:
> Yes, it is indeed cleaner. Thanks for your explanation!
>
> Now I have a question. If I stick to the old PETSc version (v3.16.2), I can
> manually remove the duplicated libraries in PETSC_LIB and keep the order of
> libraries at the same time,
-lmpifort -lmpi -lgfortran -lstdc++
Satish
You can try using the latest petsc version [3.21] - the list should be a bit
cleaner with it.
balay@pj01:~/petsc$ ./configure --download-mpich --download-fblaslapack
--download-hdf5 --with-hdf5-fortran-bindings --download-metis
--download-parmetis COPTFLAGS=-O3 CXXOPTFLAGS=-O3 FOPTFLAGS=-O3
include it in compile
commands.
Satish
On Mon, 13 May 2024, neil liu wrote:
> I also tried the 2nd way, it didn't work.
The attached configure.log is successful.
Configure Options: --configModules=PETSc.Configure
--optionsModule=config.compilerOptions --download-fblaslapack
--with-mpi-dir=/usr/lib64/openmpi
Compilers:
C
If you are using mpicc/mpif90 as compilers from your pre-installed MPI - you
don't need the --with-mpi-include, --with-mpi-lib options.
As mentioned - you can do this either with:
[if you have them in PATH] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
or:
--with-cc=MPI-DIR/bin/mpicc --with-cxx=MPI-DIR/bin/mpicxx --with-fc=MPI-DIR/bin/mpif90
You are misinterpreting --with-mpi-include --with-mpi-lib options.
Any particular reason you want to use these options instead of mpi compilers?
>>>
balay@p1 /home/balay
$ mpicc -show
gcc -I/home/balay/soft/mpich-4.0.1/include -L/home/balay/soft/mpich-4.0.1/lib
-Wl,-rpath
On Mon, 13 May 2024, neil liu wrote:
> Dear Petsc developers,
>
> I am trying to install Petsc with a preinstalled OpenMPI.
>
> ./configure --download-fblaslapack --with-mpi-dir=/usr/lib64/openmpi
--with-mpi-dir=DIR is a bit unique [wrt other pkg-dir options].
It means: use the MPI compiler wrappers from DIR/bin (mpicc, mpicxx, mpif90)
as the build compilers.
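So the command above would be roughly equivalent to (a sketch):

./configure --download-fblaslapack --with-cc=/usr/lib64/openmpi/bin/mpicc
--with-cxx=/usr/lib64/openmpi/bin/mpicxx --with-fc=/usr/lib64/openmpi/bin/mpif90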
What version of PETSc? What configure command? What do you have for
PETSC_EXTERNAL_LIB_BASIC?
You can send configure.log for your build to petsc-maint
Generally duplicates should not cause grief. [as one needs them to overcome
circular dependencies].
What issues are you seeing? [send relevant logs]
Try:
spack install slepc+hpddm ^petsc+hpddm
Satish
On Thu, 9 May 2024, Ng, Cho-Kuen via petsc-users wrote:
> Pierre,
>
> petsc and slepc libraries are found in the spack directory, but
> libhpddm_petsc is not. So it is not built during the spack install process.
>
> ... ...
>
1. Suggest reinstalling brew/gfortran - to make sure it's compatible with the
latest xcode you have.
https://petsc.org/release/install/install/#installing-on-macos
On Thu, 2 May 2024, Junchao Zhang wrote:
> I used cudatoolkit-standalone/12.4.1 and gcc-12.3.
>
> Be sure to use the latest petsc/main or petsc/release, which contains fixes
> for Polaris.
>
> --Junchao Zhang
Try:
module use /soft/modulefiles
Satish
On Thu, 2 May 2024, Vanella, Marcos (Fed) via petsc-users wrote:
> Hi all, it seems the modules in Polaris have changed (can't find
> cudatoolkit-standalone anymore).
> Does anyone have recent experience compiling the library with gnu and cuda on
> Polaris?
On Mon, 29 Apr 2024, Vanella, Marcos (Fed) wrote:
> Hi Satish,
> Ok, thank you for clarifying. I don't need to include Metis in the config
> phase then (I'm not using it anywhere else).
> Is there a way I can configure PETSc to not require X11 (Xgraph functions,
> etc.)?
If X is not installed - configure should not pick it up. You can also disable
it explicitly with the configure option --with-x=0.
Satish
# Other CMakeLists.txt files inside SuiteSparse are from dependent packages
# (LAGraph/deps/json_h, GraphBLAS/cpu_features, and CHOLMOD/SuiteSparse_metis
# which is a slightly revised copy of METIS 5.0.1) but none of those
# CMakeLists.txt files are used to build any package in SuiteSparse.
So this is the complexity with maintaining dependencies (and dependencies
of dependencies), and different build systems:
- It's not easy to keep the "defaults" in both builds exactly the same.
- And it's not easy to expose all "variants" or keep the same variants in both
builds.
- And each pkg has its own set of defaults and variants.
Or you can skip fortran - if you are not using PETSc from it [or any external
package that requires it], but you would need cxx for cuda
--with-fc=0 --download-f2cblaslapack --with-cxx=0 --with-cudac=0
or
--with-fc=0 --download-f2cblaslapack --with-cudac=nvcc LIBS=-lstdc++
Satish
>>>
Executing: mpifort -o /tmp/petsc-nopi85m9/config.compilers/conftest -v
-KPIC -O2 -g /tmp/petsc-nopi85m9/config.compilers/conftest.o
stdout:
Export
NVCOMPILER=/software/sse2/tetralith_el9/manual/nvhpc/23.7/Linux_x86_64/23.7
Export PGI=/software/sse2/tetralith_el9/manual/nvhpc/23.7
On Thu, 4 Apr 2024, Frank Bramkamp wrote:
> Dear PETSC Team,
>
> I found the following problem:
> I compile petsc 3.20.5 with Nvidia compiler 23.7.
>
>
> I use a pretty standard configuration, including
>
> --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpifort COPTFLAGS="-O2 -g"
>
With xcode-15.3 and branch "barry/2024-04-03/fix-chaco-modern-c/release" from
https://gitlab.com/petsc/petsc/-/merge_requests/7433
[and a patched openmpi tarball]
On Mon, 1 Apr 2024, Zongze Yang wrote:
> Thank you for your update.
>
> I found some links that suggest this issue is related to the Apple linker,
> which is causing problems with Fortran linking.
>
On Mon, 1 Apr 2024, Zongze Yang wrote:
>
> I noticed this in the config.log of OpenMPI:
> ```
> configure:30230: checking to see if mpifort compiler needs additional linker
> flags
> configure:30247: gfortran -o conftest -fPIC -ffree-line-length-none
> -ffree-line-length-0
On Sun, 31 Mar 2024, Zongze Yang wrote:
> > ---
> > petsc@npro petsc % ./configure --download-bison --download-chaco
> > --download-ctetgen --download-eigen --download-fftw --download-hdf5
> > --download-hpddm --download-hwloc --download-hypre --download-libpng
> > --download-metis
I'll just note - I can reproduce with:
petsc@npro petsc.x % ./configure --download-mpich --download-mumps
--download-scalapack && make && make check
And then - the following works fine for me:
petsc@npro petsc.x % ./configure --download-mpich --download-mumps
--download-scalapack
I'm able to reproduce this error on a slightly older xcode [but don't know why
this issue comes up]
> Apple clang version 15.0.0 (clang-1500.1.0.2.5)
Can you try using the additional configure options (along with
LDFLAGS=-Wl,-ld_classic) and see if it works?
COPTFLAGS=-O0 FOPTFLAGS=-O0
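i.e. something like this sketch - reusing the configure command from above:

./configure --download-mpich --download-mumps --download-scalapack
COPTFLAGS=-O0 FOPTFLAGS=-O0 LDFLAGS=-Wl,-ld_classic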
On Fri, 29 Mar 2024, Pfeiffer, Sharon wrote:
> I’d like to unsubscribe from this mailing list.
Done.
Note: every list e-mail provides this info [in headers]
List-Id: PETSc users list
List-Unsubscribe:
Could you:
- reinstall brew after the xcode upgrade (not just update)
https://petsc.org/main/install/install/#installing-on-macos
Satish
Delete your old build files - and retry. i.e
rm -rf /cygdrive/g/mypetsc/petsc-3.20.5/arch-mswin-c-opt
./configure
Satish
On Thu, 21 Mar 2024, 程奔 wrote:
> Hi, Satish. Thanks for your reply. I tried both ways you suggested in
> petsc-3.20.5, but encountered the same problem.
>
Configure Options: --configModules=PETSc.Configure
--optionsModule=config.compilerOptions --with-debugging=0
--with-cc=/cygdrive/g/mypetsc/petsc-3.20.2/lib/petsc/bin/win32fe/win_cl
--with-fc=/cygdrive/g/mypetsc/petsc-3.20.2/lib/petsc/bin/win32fe/win_ifort
Check
https://gitlab.com/petsc/petsc/-/jobs/6412623047
for a successful build of latest petsc-3.20 [i.e release branch in git] with
metis and parmetis
On Mon, 18 Mar 2024, Pierre Jolivet wrote:
>
> And here we go:
> https://gitlab.com/petsc/petsc/-/jobs/6420606887
>
> 20 minutes in, and still in the dm_* tests with
Ah - the compiler did flag code bugs.
> (current version is 0.3.26 but we can’t update because there is a huge
> performance regression which makes the pipeline timeout)
maybe we should retry - updating to the latest snapshot and see if this issue
persists.
Satish
On Mon, 18 Mar 2024, Zongze Yang wrote:
Glad you have a successful build! Thanks for the update.
Satish
On Tue, 12 Mar 2024, 程奔 wrote:
> Hi Satish, sorry for replying to your email so late. I followed your
> suggestion and it has been installed successfully. Thank you so much. Best
> wishes, Ben > > -----Original Message----- > From: "Satish Balay"
The website is now updated
Satish
Thanks for the report! The fix is at
https://gitlab.com/petsc/petsc/-/merge_requests/7343
Satish
On Fri, 8 Mar 2024, David Bold wrote:
> Dear all, I noticed
> make[3]: *** No rule to make target 'w'. Stop.
Try the following to overcome the above error:
make OMAKE_PRINTDIR=make all
However 3.13.6 is a bit old - so I don't know if it will work with these
versions of compilers.
Satish
On Wed, 6 Mar 2024, 程奔 wrote:
> Hello,
>
>
> Last time I
Looks like ex55 is the one to use - that links in with ex55k
But it needs a fix for a build from 'make'
>>
balay@petsc-gpu-01:/scratch/balay/petsc/src/snes/tutorials$ git diff
diff --git a/src/snes/tutorials/makefile b/src/snes/tutorials/makefile
index 672a62aa5a0..eed127f7eae 100644
---
diff --git a/src/vec/f90-mod/petscvecmod.F90 b/src/vec/f90-mod/petscvecmod.F90
index 4c54fbf63dc..8772f89e135 100644
--- a/src/vec/f90-mod/petscvecmod.F90
+++ b/src/vec/f90-mod/petscvecmod.F90
@@ -163,6 +163,7 @@
#include <../src/vec/f90-mod/petscvec.h90>
interface
#include
On Thu, 18 Jan 2024, Aaron Scheinberg wrote:
> Hello,
>
> I'm getting this error when linking:
>
> undefined reference to `petsc_allreduce_ct_th'
>
> The instances are regular MPI_Allreduces in my code that are not located in
> parts of the code related to PETSc, so I'm wondering what is
The usual xcode/clang + brew/gfortran should work.
https://gitlab.com/petsc/petsc/-/jobs/5895519334
https://gitlab.com/petsc/petsc/-/jobs/5895519414
There can be issues - not all CI builds work on M2 with the latest xcode [when
I tried this previously] - so some CI jobs are still on Intel/Mac.
Executing: mpicc -show
stdout: icc -I/opt/apps/cuda/11.4/include -I/opt/apps/cuda/11.4/include -lcuda
-L/opt/apps/cuda/11.4/lib64/stubs -L/opt/apps/cuda/11.4/lib64 -lcudart -lrt
-Wl,-rpath,/opt/apps/cuda/11.4/lib64 -Wl,-rpath,XORIGIN/placeholder
-Wl,--build-id -L/opt/apps/cuda/11.4/lib64/ -lm
Do you have a ~/.petscrc file - with -log_view enabled?
Satish
On Wed, 29 Nov 2023, Di Miao via petsc-users wrote:
> Hi,
>
> I tried to compile PETSc with the following configuration:
>
> ./configure --with-debugging=0 COPTFLAGS='-O3' CXXOPTFLAGS='-O3'
> FOPTFLAGS='-O3' --with-clean=1
>
Do you really need this combination of pkgs?
Matlab is distributed with ILP64 MKL - so it doesn't really work with the
LP64 blas/lapack that most external packages require - i.e. you can't
really use matlab together with other external packages.
[also it might not work with complex]
Can you do a simple build with only superlu-dist and see if the error persists?
./configure PETSC_ARCH=linux-slu --with-cc=/usr/local/gcc11/bin/gcc
--with-cxx=/usr/local/gcc11/bin/g++ --with-fc=gfortran --with-debugging=1
--with-scalar-type=complex --download-mpich --download-fblaslapack
replied on petsc-maint
xcode-15 changed considerably (fixes are in petsc-3.20) - so it's not easy to
backport all the needed patches to 3.17.0.
So the best bet for petsc-3.19 and older is to use linux (remotely or via VM) -
or downgrade to xcode-14.
Satish
On Mon, 20 Nov 2023, Jan Izak C. Vermaak
Suggest attaching text logs (copy/paste) - instead of screenshots.
Try:
./configure --with-cc=gcc-11 --with-cxx=g++-11 --with-fc=gfortran-11
--download-fftw --download-openmpi --download-fblaslapack --with-zlibs=yes
--with-szlib=no --with-c2html=0 --with-x=0 --download-hdf5-fortran-bindings=1
I guess the flag you are looking for is CUDAFLAGS
>>>
balay@petsc-gpu-01:/scratch/balay/petsc/src/vec/vec/tests$ make ex100
CUDAFLAGS="-Xcompiler -fopenmp" LDFLAGS=-fopenmp
/usr/local/cuda/bin/nvcc -o ex100.o -c
-I/nfs/gce/projects/petsc/soft/u22.04/mpich-4.0.2/include -ccbin mpicxx
On Wed, 25 Oct 2023, Qiyue Lu wrote:
> Hello,
> I have an in-house code enabled OpenMP and it works. Now I am trying to
> incorporate PETSc as the linear solver and build together using the
> building rules in $PETSC_HOME/lib/petsc/conf/rules. However, I found the
> OpenMP part doesn't work
Try using the additional option --with-64-bit-blas-indices=1
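i.e. (a sketch - keep the other configure options you are already using):

./configure --with-64-bit-indices=1 --with-64-bit-blas-indices=1 [other options]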
Satish
On Fri, 20 Oct 2023, Di Miao wrote:
> Hi,
>
> I found that when compiled with '--with-64-bit-indices=1' option, the
> following three definitions in petscconf.h will be removed:
>
> #define PETSC_HAVE_MKL_SPARSE 1
> #define
> Working directory: /home/tt/petsc-3.16.0
Use the latest petsc release - 3.20
> --with-fc=flang
I don't think this ever worked. Use --with-fc=gfortran instead
/opt/ohpc/pub/spack/opt/spack/linux-centos7-skylake_avx512/gcc-8.3.0/m4-1.4.19-lwqcw3hzoxoia5q6nzolylxaf5zevluk/bin/m4:
internal error
I'll note - current sundials release has some interfaces to petsc functionality
Satish
On Mon, 16 Oct 2023, Matthew Knepley wrote:
> On Mon, Oct 16, 2023 at 2:29 PM Vanella, Marcos (Fed) via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
> > Hi, we were wondering if it would be possible to
The same docs should be available in
https://web.cels.anl.gov/projects/petsc/download/release-snapshots/petsc-with-docs-3.20.0.tar.gz
Satish
On Wed, 11 Oct 2023, Richter, Roland wrote:
> Hei,
> Thank you very much for the answer! I looked it up, but petsc.org seems to
> be a bit unstable here,
Will note - OneAPI MPI usage is documented at
https://petsc.org/release/install/install/#mpi
Satish
On Mon, 9 Oct 2023, Barry Smith wrote:
>
> Instead of using the mpiicc -cc=icx style, use --with-cc=mpiicc (etc) and
>
> export I_MPI_CC=icx
> export I_MPI_CXX=icpx
> export I_MPI_F90=ifx
>
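Spelled out, that would be something like this sketch (mpiicpc and mpiifort
being the corresponding Intel MPI C++/Fortran wrappers):

export I_MPI_CC=icx
export I_MPI_CXX=icpx
export I_MPI_F90=ifx
./configure --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort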
On Fri, 6 Oct 2023, Qiyue Lu wrote:
> Hello,
> I am trying to configure PETSc(current release version) with NVCC, with
> these options:
> ./configure --with-cc=nvcc --with-cxx=nvcc --with-fc=0 --with-cuda=1
this usage is incorrect. You need:
--with-cc=mpicc --with-cxx=mpicxx
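i.e. (a sketch - keeping the other options from the command above):

./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=0 --with-cuda=1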
Here --download-cmake is failing [due to the old version of the c++ compiler].
You can try installing an older version of cmake manually instead of
--download-cmake.
Or you might be able to install a newer gcc/g++ easily [if it's not already
installed on your machine]. For ex:
git clone
Can you send us the complete configure.log file [as attachment] - perhaps to
petsc-maint@mcs.anl.gov
Satish
On Sat, 30 Sep 2023, Ivan Luthfi wrote:
> Hi team,
> I have an issue when I configure my petsc with the following script:
>
> ./configure --with-fc=0 --download-f2cblaslapack=1
>
petsc git repo main branch has fixes for xcode-15. Can you give it a try?
Satish
On Thu, 28 Sep 2023, Paul Tackley wrote:
> Hello,
>
> PETSc was working fine on my M1 Mac until I upgraded to Xcode 15.0 - now I
> can’t even configure it. There seems to be a problem related to C and C++ in
>
On Tue, 26 Sep 2023, Ivan Luthfi wrote:
> Sorry, in the petsc lib dir I only found libpetsc.so, but not
> libpetsc.a.
>
> Are they functionally the same?
yes.
You would run 'make check' after the build - to verify that PETSc examples are
able to compile and link correctly [to this library].
We generally recommend getting a basic build going for such a use case -
and then migrating it to the latest version, as old versions have more issues
[as you might be encountering now].
Also - you haven't responded to my follow-up regarding the issue you
are encountering.
If you are having build issues -
What are you looking at? Send 'ls' from PETSC_DIR/PETSC_ARCH/lib
Perhaps it's a shared library build - and you have libpetsc.so?
BTW: the current release is 3.19 - and you are attempting to build a super old
version 3.4.
We recommend using the latest version to avoid build and other issues.
Do you get this failure with petsc main branch as well?
Satish
On Thu, 21 Sep 2023, Blaise Bourdin wrote:
> FWIW, CLT 15.0 also seems to include changes to the linker, with incompatible
> options etc… I was able to rebuild mpich and petsc but I get many linker
> warnings and have not fully
It's a run-time option to the petsc (application) binary.
So you can either specify it via the command line - at run time - or add it to
the env variable "PETSC_OPTIONS" - or add it to the $HOME/.petscrc file - e.g.
as below.
Satish
On Tue, 19 Sep 2023, Thuc Bui wrote:
> Hi Barry,
>
>
>
> Thanks for getting back to me.
BTW: Can you check if you are using threaded MKL?
We default to:
Libraries: -L/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/2022.1.0/lib/intel64
mkl_intel_lp64_dll.lib mkl_sequential_dll.lib mkl_core_dll.lib
If using threaded MKL - try using the env variable "OMP_NUM_THREADS=1" and see
if that makes a difference.
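i.e. (in a bash shell - before rerunning the test):

export OMP_NUM_THREADS=1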
On Tue, 19 Sep 2023, Matthew Knepley wrote:
> On Tue, Sep 19, 2023 at 7:04 AM Thuc Bui wrote:
>
> > Hi Barry,
> >
> >
> >
> > Visual Studio 2022 is the problem! The code linked to Petsc 3.18.6 built
> > with VS 2022 also crashes at the same place. The same errors are shown
> > below. I don’t
BTW: Stepping back and looking at the error message:
> > > >> Error: The import statement 'import matlab.internal.engine.input'
> > cannot be found or cannot be imported. Imported names must end with '.*' or
> > be fully qualified.
Google suggests:
> > > >>> $ cd src/tao/leastsquares/tutorials/matlab/
> > > >>> $ make matlab_ls_test
> > > >>> /home/vit/sfw/linux/openmpi/4.1.4/bin/mpicc -fPIC -Wall
> > -Wwrite-strings
> > > >>> -W
To get additional info - and debug
> >>>>
> >>>> Satish
> >>>> --
> >>>>
> >>>> balay@compute-386-07:/scratch/balay/petsc$ cd
> >>>> src/tao/leastsquares/tutorials/matlab/
> >>>> balay@comput
1140850689
> > -2080374784 max tags = 268435455
> > [0] PetscMatlabEngineCreate(): Starting MATLAB engine with command
> > /nfs/gce/software/custom/linux-ubuntu20.04-x86_64/matlab/R2021a/bin/matlab
> > -glnxa64 -nodisplay -nosplash
> > [0] PetscMatlabEngineCreate(): Started MA
To get started, type doc.
For product information, visit www.mathworks.com.
>>
On Sat, 2 Sep 2023, Satish Balay via petsc-users wrote:
> Please don't cc: both petsc-users and petsc-maint [reverting thread to
> petsc-users only]
>
> I'm not sure what is happening here. Can you
>
> >>> > -with-blaslapack-dir=/path/to/matlab_dir
> >>> > --known-64-bit-blas-indices=1
> >>> >
> >>> > Is this what you are suggesting?
> >>> >
> >>> > On Fri, Sep 1, 2023, 20:42 Satish Balay wrote:
> >>> >
Also:
'-known-64-bit-blas-indices=1',
Note: most external packages might not work in this mode.
[we can't really overcome such dependency/conflicts across packages]
Satish
Here is the matlab test that runs in CI
https://gitlab.com/petsc/petsc/-/jobs/4904566768
config/examples/arch-ci-linux-matlab-ilp64.py
# Note: regular BLAS [with 32-bit integers] conflict with
# MATLAB BLAS - hence requiring -known-64-bit-blas-indices=1
Ah - so you need to use the ilp64 blas/lapack.
Please send the corresponding configure.log for this failure - perhaps to
petsc-maint [to avoid sending large files to the petsc-users mailing list]
BTW: We normally use xcode clang/clang++ with brew gfortran (with system
blas/lapack) for MacOS builds
Satish
On Fri, 1 Sep 2023, Giselle Sosa Jones
Well - you sent in the libmesh log, not petsc's configure.log/make.log for petsc-3.17.
Anyway - with petsc-3.13 - you have:
Matlab:
Includes: -I/usr/local/MATLAB/R2020b/extern/include
/usr/local/MATLAB/R2020b
MatlabEngine:
Library:
Send configure.log, make.log from both petsc-3.13 and 3.17 [or 3.19].
[you can gzip them to make the logs friendly to mailing list - or send them to
petsc-maint]
And does test suite work with 3.17? [or 3.19?]
Satish
On Tue, 29 Aug 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:
> I am
Also - the instructions don't say if matlab is required.
So perhaps you might want to try an install without matlab - and see if you are
able to get IBAMR working.
Satish
On Mon, 28 Aug 2023, Satish Balay via petsc-users wrote:
> https://ibamr.github.io/linux says petsc-3.17
>
>
https://ibamr.github.io/linux says petsc-3.17
Here you are using 3.13
Can you retry with petsc-3.17.5?
Satish
On Mon, 28 Aug 2023, INTURU SRINIVAS 20PHD0548 via petsc-users wrote:
> Hello,
>
> I want to build PETSc with MATLAB for working on the simulation using IBAMR
> open software. While
Check: https://lists.mcs.anl.gov/pipermail/petsc-users/2023-July/049115.html
Also - best not to cross-post to multiple lists.
Satish
On Wed, 23 Aug 2023, VAIBHAV BHANDARI wrote:
> Dear Sir/Mam,
>
> I hope this email finds you well. I am writing to request an invitation to
> join the PETSc
On Mon, 21 Aug 2023, meator wrote:
> Hi. I'm trying to package PETSc using the tarball with documentation
> (https://ftp.mcs.anl.gov/pub/petsc/release-snapshots/petsc-with-docs-3.19.4.tar.gz)
> and I've got some questions about the structure of PETSc.
>
> What are the contents of the
Can you try the update in branch "balay/amgx-cuda-12"?
Satish
On Fri, 18 Aug 2023, Zisheng Ye wrote:
> Dear PETSc team
>
> I am configuring AMGX package under the main branch with CUDA 12.1. But it
> can't get through. Can you help to solve the problem? I have attached the
> configure.log to
I think gfortran defaults to fixed-form for .F and free-form for .F90.
This can be changed with FFLAGS=-ffree-form - but yeah - switching the suffix
might be more suitable.
In addition - PETSc configure attempts to add in the "-ffree-line-length-none
-ffree-line-length-0" options - so that extra-long source lines are accepted.
Do you get this error when you compile a PETSc example [with the
corresponding PETSc makefile]?
If not - you'll have to check the difference in compiler options
between this example compile - and your application.
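e.g. with one of the bundled tutorials:

cd $PETSC_DIR/src/snes/tutorials
make ex19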
Satish
On Tue, 15 Aug 2023, maitri ksh wrote:
> I was earlier using petsc with
On Fri, 11 Aug 2023, Jed Brown wrote:
> Jacob Faibussowitsch writes:
>
> > More generally, it would be interesting to know the breakdown of installed
> > CUDA versions for users. Unlike compilers etc, I suspect that cluster
> > admins (and those running on local machines) are much more likely
Sure - but if 'module load icc' has gcc-4* in path - that's a bug in the icc
module spec [as that version is incompatible with its c++ support]. It should
also load a compatible gcc version [via PATH - or via module dependencies].
If it's implemented this way - then you won't have a broken icc.
If using modules - using 'module load gcc icc' [or equivalent] should normally
work - but if the modules are set up such that loading icc unloads gcc - then I
think that's a bug in this module setup.
[as icc has an (internal) dependency on gcc - so ignoring this dependency to
remove a gcc
It's easier to just add the newer version of gcc/g++ compilers to PATH - and
icc will pick it up [without requiring the -gcc-toolchain option]
export PATH=/location/of/newer/g++/bin:$PATH
./configure ...
make ...
Satish
On Tue, 8 Aug 2023, Victor Eijkhout wrote:
> Maybe an option for specifying
> gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-44)
Is it possible for you to use a newer version of the GNU compilers?
If not - your alternative is to build PETSc with the --with-cxx=0 option.
But then - you can't use --download-superlu_dist or any pkgs that need
c++ [you could try building them separately]
One way to boost performance [of MatVec etc] in sparse matrices with
blocks is by avoiding loading (from memory to cpu registers) the
row/col indices for the blocks - when possible. [the performance
boost here comes from the fact that the memory bandwidth requirements
get reduced]
So we have the BAIJ (blocked AIJ) matrix format for this - see the sketch below.
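If the application creates its matrix via MatSetFromOptions(), one way to try
the blocked format (a sketch - './app' and the block size are placeholders;
verify the option names with -help):

./app -mat_type baij -mat_block_size 3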
> >>> copied
> >>> into hypre/src/hypre/include. This is not done for a cmake build - I had
> >>> to do the copying myself. Maybe I missed one.
> >>>
> >>>
> >>> On shared vs. static - if there a clear way of telling which I've e
Were you able to try Jacob's fix - so you could build with cxx?
Wrt building external pkgs - one way:
- first build pkgs:
./configure PETSC_ARCH=arch-pkgs --prefix=$HOME/soft/petsc-pkgs --with-cc=icc
--with-cxx=icpc --with-fc=ifort --download-mpich --download-suitesparse
- now build PETSc with
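[presumably continuing along these lines - a sketch pointing PETSc at the
prefix install from the first step:]

./configure --with-mpi-dir=$HOME/soft/petsc-pkgs
--with-suitesparse-dir=$HOME/soft/petsc-pkgs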
which I linked in
> using the --cflags
> option (maybe --cc-linker-flags would have been neater, but it worked). I've
> tried both in order to try to get the above working.
>
>
> I can go into detail about the compile and linker commands if needed; I'd
> have to explain more
by default,
> > which I didn't change:
> >
> > # Configuration options
> > option(HYPRE_ENABLE_SHARED "Build a shared library" OFF)
> > option(HYPRE_ENABLE_BIGINT "Use long long int for HYPRE_Int" OFF)
> > option(HYPRE_ENABL