Re: [petsc-users] Communication during MatAssemblyEnd

2019-06-24 Thread Mills, Richard Tran via petsc-users
Hi Ale, I don't know if this has anything to do with the strange performance you are seeing, but I notice that some of your Intel MPI settings are inconsistent and I'm not sure what you are intending. You have specified a value for I_MPI_PIN_DOMAIN and also a value for I_MPI_PIN_PROCESSOR_LIST.
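For reference, the two variables target different pinning models (the values below are only illustrative, not taken from your settings):

  # hybrid MPI+OpenMP runs: pin each rank to a domain of logical cores
  export I_MPI_PIN_DOMAIN=omp

  # pure MPI runs: pin ranks to an explicit list of logical processors
  export I_MPI_PIN_PROCESSOR_LIST=0-15

If I remember the Intel MPI reference correctly, I_MPI_PIN_DOMAIN takes precedence when both are set and the processor list is then ignored; in any case, setting only the one that matches how you actually run makes the intent unambiguous.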

Re: [petsc-users] DMPlexDistributeField

2019-06-24 Thread Adrian Croucher via petsc-users
Hi, thanks Matt for the explanation about this. I have been trying a test which does the following (sketched in code below):

1) read in a DMPlex from file
2) distribute it, with overlap = 1, using DMPlexDistribute()
3) create FVM cell and face geometry vectors using DMPlexComputeGeometryFVM()
4) re-distribute, again
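In outline, the test looks roughly like this in C (a minimal sketch of steps 1-3 only, since step 4 is where my question starts; the filename is a placeholder, and note that DMPlexCreateFromFile has gained an extra mesh-name argument in later PETSc versions, so adjust for yours):

  #include <petscdmplex.h>

  int main(int argc, char **argv)
  {
    DM             dm, dmDist;
    PetscSF        sf = NULL;
    Vec            cellgeom, facegeom;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    /* 1) read the DMPlex from file ("mesh.h5" is a placeholder name) */
    ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.h5", PETSC_TRUE, &dm);CHKERRQ(ierr);
    /* 2) distribute with one layer of overlap cells; sf maps original points
       to their distributed locations */
    ierr = DMPlexDistribute(dm, 1, &sf, &dmDist);CHKERRQ(ierr);
    if (dmDist) { /* dmDist is NULL when run on a single process */
      ierr = DMDestroy(&dm);CHKERRQ(ierr);
      dm   = dmDist;
    }
    /* 3) build the FVM cell and face geometry vectors */
    ierr = DMPlexComputeGeometryFVM(dm, &cellgeom, &facegeom);CHKERRQ(ierr);
    /* 4) the second distribution would go here; omitted, as that is the open question */
    ierr = VecDestroy(&cellgeom);CHKERRQ(ierr);
    ierr = VecDestroy(&facegeom);CHKERRQ(ierr);
    if (sf) {ierr = PetscSFDestroy(&sf);CHKERRQ(ierr);}
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }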