[deal.II] Re: Installation error, unable to configure with p4est

2019-10-04 Thread vachan potluri
Okay, I found the error. Some time back, I remember changing
include/deal.II/base/config.h.in
to
include/deal.II/base/config.h
(i.e., I removed the .in). I don't remember exactly, but the reason I did this
was that some error popped up while compiling one of the initial tutorials.
This was my mistake.

The complete cmake error message (on the terminal) mentions that the
config.h.in file is missing. So now I made a copy, and the configuration
and installation went fine.

However,
make test
for p4est failed with the following message:
There are not enough slots available in the system to satisfy the 10 slots
that were requested by the application:
  ./p4est.debug

Either request fewer slots for your application, or make more slots available
for use.
It is true that my PC has only 4 cores (8 with hyper-threading), so only 4
slots are available. Can I ignore this error?

-- 
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en
--- 
You received this message because you are subscribed to the Google Groups 
"deal.II User Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to dealii+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/dealii/418ae277-bf9d-4b83-b987-a03a3d48a4be%40googlegroups.com.


[deal.II] Large Memory Consumption making BlockDynamicSparsityPattern

2019-10-04 Thread Matteo Frigo
Hello,

I'm currently upgrading my code, adding PETSc as an alternative to Trilinos
for the linear algebra package.
I'm implementing this option following tutorial step-55.
However, I'm running into some issues when I try to run massively parallel
simulations: in particular, large memory consumption occurs in the system
setup phase.
After some debugging, I was able to figure out that the part of the code
responsible for this is the generation of the sparsity pattern, i.e., the
following lines:

BlockDynamicSparsityPattern dsp(local_partitioning);
DoFTools::make_sparsity_pattern(dof_handler, scratch_coupling, dsp,
                                constraints, false, this_mpi_process);
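
To see on each MPI rank how much memory this call actually consumes, it can be
bracketed with process-level memory queries. The helper below is only a
debugging sketch: it assumes deal.II's Utilities::System::get_memory_stats()
from deal.II/base/utilities.h, and the function name print_memory and the
labels are made up for illustration.

#include <deal.II/base/utilities.h>
#include <iostream>
#include <string>

// Print the current resident set size (VmRSS) and its peak (VmHWM) of this
// process in kB, as reported by deal.II. Calling this immediately before and
// after DoFTools::make_sparsity_pattern() shows how much memory the sparsity
// pattern construction itself takes on each rank.
void print_memory(const std::string &label)
{
  dealii::Utilities::System::MemoryStats stats;
  dealii::Utilities::System::get_memory_stats(stats);
  std::cout << label << ": VmRSS = " << stats.VmRSS << " kB"
            << ", VmHWM = " << stats.VmHWM << " kB" << std::endl;
}

int main()
{
  print_memory("at startup");
  return 0;
}

In the actual code, print_memory("before make_sparsity_pattern") and
print_memory("after make_sparsity_pattern") would be placed around the two
lines above.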

I want to point out that this behavior does not depend on PETSc; it is
related only to the procedure by which we build the block sparsity pattern
(BSP). Indeed, I ran into the same issue with Trilinos if the above strategy
is selected.

In the previous version of the code, I used these lines to generate the BSP:

TrilinosWrappers::BlockSparsityPattern sp(local_partitioning, MPI_COMM_WORLD);
DoFTools::make_sparsity_pattern(dof_handler, matrix_coupling, sp,
                                constraints, false, this_mpi_process);
sp.compress();

In this last case, the amount of memory required to generate the BSP is much
lower than in the first case.
Any ideas what is going on? Am I doing something wrong?
Thank you very much for your support.

Matteo

-- 
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en
--- 
You received this message because you are subscribed to the Google Groups 
"deal.II User Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to dealii+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/dealii/42a40c37-eb6b-4f0c-994d-4e74e57c3764%40googlegroups.com.


[deal.II] Crowdsourcing some deal.II-related tasks

2019-10-04 Thread Wolfgang Bangerth


All,

as you know, we try to keep a list of publications using deal.II at
   https://dealii.org/publications.html
but given how many entries we now have to add to it every year, it has become 
an impossible task to keep it up to date.

So we need help:

* Timo has used a script to generate a list of candidate entries at
https://docs.google.com/document/d/1agTpMQWok0JxrbmMCpF0tnJY7RtCy2DfhT2xetMvojk/edit?usp=sharing

* We need *your* help identifying which of these are actually *using* deal.II. 
Instructions on how to do that are in the google document. For many of the 
articles, you will have to have access to journal websites (generally through 
your university) to download PDF files.

If you have ten or twenty or thirty minutes, we would be extremely grateful if 
you could go over there and help us out with a few entries. My experience is 
that it takes 3-5 minutes per entry; there are ~160 entries, so if only 20 of 
you were willing to help, we'd be done within a day or two!

Thank you in advance!
  Wolfgang

-- 

Wolfgang Bangerth  email: bange...@colostate.edu
www: http://www.math.colostate.edu/~bangerth/

-- 
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en
--- 
You received this message because you are subscribed to the Google Groups 
"deal.II User Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to dealii+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/dealii/6fba3653-d94d-d07c-d7e0-33a46b4a2fb9%40colostate.edu.


[deal.II] Error during refinement of a parallel distributed quarter hyperball

2019-10-04 Thread Stefan Käßmair
Dear all,

during the refinement of a quarter hyper ball in 3D in debug mode, I 
receive the following error when running on a single core (mpirun -np 1):




An error occurred in line <2764> of file  in function
  void dealii::parallel::distributed::Triangulation::copy_local_forest_to_triangulation() [with int dim = 3; int spacedim = 3]
The violated condition was:
  static_cast(parallel_forest->local_num_quadrants) == total_local_cells

When running on multiple cores I hit another error message (probably because
there are no ghost cells when using only one core):




An error occurred in line <2670> of file  in function
  void dealii::parallel::distributed::Triangulation::copy_local_forest_to_triangulation() [with int dim = 3; int spacedim = 3]
The violated condition was:
  num_ghosts == parallel_ghost->ghosts.elem_count

Currently I am using deal.II 9.0.1. 
This is the relevant part of the code (a full MWE is attached):

const unsigned int dim = 3;

parallel::distributed::Triangulation<dim> tria(
  mpi_communicator,
  typename Triangulation<dim>::MeshSmoothing(
    Triangulation<dim>::smoothing_on_refinement),
  parallel::distributed::Triangulation<dim>::default_setting);

GridGenerator::quarter_hyper_ball(tria);

for (unsigned int i_refinement = 0; i_refinement < 6; ++i_refinement)
  {
    auto cell = tria.begin_active();
    auto endc = tria.end();

    for (; cell != endc; ++cell)
      if (cell->is_locally_owned() && cell->at_boundary())
        cell->set_refine_flag();

    tria.prepare_coarsening_and_refinement();
    tria.execute_coarsening_and_refinement();
  }

Is there something wrong with my code? Or maybe with my installation (can
anyone confirm the error)?
The code works in 2D but crashes in 3D. When the number of refinement cycles
is reduced from 6 to 4, it also works.
Out of curiosity, I've tried using GridGenerator::hyper_cube instead of the
hyper ball; there the code works just fine in both 2D and 3D.
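
To narrow down the refinement cycle at which the assertion triggers, the loop
above can report the global mesh size after every cycle. The helper below is
only a debugging sketch built on the names used in the MWE (tria,
i_refinement); the function name report_mesh_size is made up for illustration.

#include <deal.II/base/conditional_ostream.h>
#include <deal.II/base/mpi.h>
#include <deal.II/distributed/tria.h>
#include <iostream>

// Print, on MPI rank 0 only, how many active cells the distributed
// triangulation has after a given refinement cycle, so that the cycle (and
// mesh size) at which the error appears can be identified.
template <int dim>
void report_mesh_size(
  const dealii::parallel::distributed::Triangulation<dim> &tria,
  const unsigned int                                        cycle)
{
  dealii::ConditionalOStream pcout(
    std::cout,
    dealii::Utilities::MPI::this_mpi_process(tria.get_communicator()) == 0);

  pcout << "cycle " << cycle << ": " << tria.n_global_active_cells()
        << " active cells" << std::endl;
}

In the loop of the MWE this would be called as report_mesh_size(tria,
i_refinement) right after tria.execute_coarsening_and_refinement().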

I would really appreciate it if anyone could help me with this.

Kind regards,
Stefan

-- 
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en
--- 
You received this message because you are subscribed to the Google Groups 
"deal.II User Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to dealii+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/dealii/8d86f847-dab7-4d49-9d45-e2c83b5f9fcd%40googlegroups.com.
#include <deal.II/base/mpi.h>
#include <deal.II/distributed/tria.h>
#include <deal.II/grid/grid_generator.h>
#include <iostream>


int main(int argc, char *argv[])
{
  try
    {
      using namespace dealii;

      Utilities::MPI::MPI_InitFinalize mpi_initialization(argc, argv, 1);

      const unsigned int dim = 3;

      MPI_Comm mpi_communicator = MPI_COMM_WORLD;

      parallel::distributed::Triangulation<dim> tria(
        mpi_communicator,
        typename Triangulation<dim>::MeshSmoothing(
          Triangulation<dim>::smoothing_on_refinement),
        parallel::distributed::Triangulation<dim>::default_setting);

      GridGenerator::quarter_hyper_ball(tria);

      for (unsigned int i_refinement = 0; i_refinement < 6; ++i_refinement)
        {
          auto cell = tria.begin_active();
          auto endc = tria.end();

          for (; cell != endc; ++cell)
            if (cell->is_locally_owned() && cell->at_boundary())
              cell->set_refine_flag();

          tria.prepare_coarsening_and_refinement();
          tria.execute_coarsening_and_refinement();
        }
    }
  catch (std::exception &exec)
    {
      std::cout << std::flush;
      std::cerr << "\n\n\n"
                << "Exception thrown:\n"
                << exec.what() << std::endl
                << "Aborting!\n"
                << std::endl;
      return 1;
    }
  catch (...)
    {
      std::cout << std::flush;
      std::cerr << "\n\n\n"
                << "Unknown exception!\n"
                << "Aborting!\n"
                << std::endl;
      return 1;
    }

  return 0;
}


[deal.II] Re: Installation error, unable to configure with p4est

2019-10-04 Thread vachan potluri
Sorry for the incomplete information; cmake exits with the following message.

###
#
#  deal.II configuration:
#CMAKE_BUILD_TYPE:   DebugRelease
#BUILD_SHARED_LIBS:  ON
#CMAKE_INSTALL_PREFIX:   /home/vachan/bin/dealii
#CMAKE_SOURCE_DIR:   /home/vachan/dealii-9.1.1
#(version 9.1.1)
#CMAKE_BINARY_DIR:   /home/vachan/build/dealii
#CMAKE_CXX_COMPILER: GNU 7.4.0 on platform Linux x86_64
#/usr/local/bin/mpicxx
#
#  Configured Features (DEAL_II_ALLOW_BUNDLED = ON, DEAL_II_ALLOW_AUTODETECTION = ON):
#  ( DEAL_II_WITH_64BIT_INDICES = OFF )
#  ( DEAL_II_WITH_ADOLC = OFF )
#  ( DEAL_II_WITH_ARPACK = OFF )
#  ( DEAL_II_WITH_ASSIMP = OFF )
#DEAL_II_WITH_BOOST set up with bundled packages
#  ( DEAL_II_WITH_COMPLEX_VALUES = OFF )
#  ( DEAL_II_WITH_CUDA = OFF )
#DEAL_II_WITH_CXX14 = ON
#DEAL_II_WITH_CXX17 = ON
#  ( DEAL_II_WITH_GINKGO = OFF )
#  ( DEAL_II_WITH_GMSH = OFF )
#  ( DEAL_II_WITH_GSL = OFF )
#  ( DEAL_II_WITH_HDF5 = OFF )
#  ( DEAL_II_WITH_LAPACK = OFF )
#  ( DEAL_II_WITH_METIS = OFF )
#DEAL_II_WITH_MPI set up with external dependencies
#DEAL_II_WITH_MUPARSER set up with bundled packages
#  ( DEAL_II_WITH_NANOFLANN = OFF )
#  ( DEAL_II_WITH_NETCDF = OFF )
#  ( DEAL_II_WITH_OPENCASCADE = OFF )
#DEAL_II_WITH_P4EST set up with external dependencies
#DEAL_II_WITH_PETSC set up with external dependencies
#  ( DEAL_II_WITH_SCALAPACK = OFF )
#  ( DEAL_II_WITH_SLEPC = OFF )
#  ( DEAL_II_WITH_SUNDIALS = OFF )
#  ( DEAL_II_WITH_SYMENGINE = OFF )
#DEAL_II_WITH_THREADS set up with bundled packages
#  ( DEAL_II_WITH_TRILINOS = OFF )
#  ( DEAL_II_WITH_UMFPACK = OFF )
#DEAL_II_WITH_ZLIB set up with external dependencies
#
#  Component configuration:
#  ( DEAL_II_COMPONENT_DOCUMENTATION = OFF )
#DEAL_II_COMPONENT_EXAMPLES
#  ( DEAL_II_COMPONENT_PACKAGE = OFF )
#  ( DEAL_II_COMPONENT_PYTHON_BINDINGS = OFF )
#
#  Detailed information (compiler flags, feature configuration) can be found in detailed.log
#
#  Run  $ make info  to print a help message with a list of top level targets
#
###
-- Configuring incomplete, errors occurred!
See also "/home/vachan/build/dealii/CMakeFiles/CMakeOutput.log".
See also "/home/vachan/build/dealii/CMakeFiles/CMakeError.log".

-- 
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en
--- 
You received this message because you are subscribed to the Google Groups 
"deal.II User Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to dealii+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/dealii/74b5c4d1-7098-417c-bca2-260b4ca48a10%40googlegroups.com.


[deal.II] Re: error during installation with spack on CentOS7

2019-10-04 Thread Denis Davydov
Hi Alberto,

Try reporting the issue on the Spack GitHub and ping @balay.

Denis.

On Friday, October 4, 2019 at 12:50:24 PM UTC+2, Alberto Salvadori wrote:
>
> Dear community
>
> I apologize for going on for so long about installing deal.II on a Linux
> machine running CentOS 7. I am having quite a lot of issues, perhaps
> related to the gcc compiler(?).
> The latest one, which I have been unable to solve so far, concerns SLEPc. I
> wonder if any of you has had a similar problem and could point me to its
> solution.

[deal.II] Re: error during installation with spack on CentOS7

2019-10-04 Thread Alberto Salvadori
Dear community

I apologize for going on for so long about installing deal.II on a Linux
machine running CentOS 7. I am having quite a lot of issues, perhaps related
to the gcc compiler(?).
The latest one, which I have been unable to solve so far, concerns SLEPc. I
wonder if any of you has had a similar problem and could point me to its
solution.

Here is the outcome of the installation via Spack:

==> Installing slepc
==> Searching for binary cache of slepc
==> Warning: No Spack mirrors are currently configured
==> No binary for slepc found: installing from source
==> Fetching http://slepc.upv.es/download/distrib/slepc-3.12.0.tar.gz  100.0%
==> Staging archive: /tmp/deal.ii/spack-stage/slepc-3.12.0-5md6u45rynyaqtcta4e5dmecqhkp2jkr/slepc-3.12.0.tar.gz
==> Created stage in /tmp/deal.ii/spack-stage/slepc-3.12.0-5md6u45rynyaqtcta4e5dmecqhkp2jkr
==> No patches needed for slepc
==> Building slepc [Package]
==> Executing phase: 'install'
==> Error: ProcessError: Command exited with status 1:
'./configure' '--prefix=/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/slepc-3.12.0-5md6u45rynyaqtcta4e5dmecqhkp2jkr' '--with-arpack-dir=/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/arpack-ng-3.7.0-i5fx7mowpxx7acbasidsfc4r3owcd2vx/lib' '--with-arpack-flags=-lparpack,-larpack'
See build log for details:
  /tmp/deal.ii/spack-stage/slepc-3.12.0-5md6u45rynyaqtcta4e5dmecqhkp2jkr/spack-build-out.txt


and the log(s):

==> Executing phase: 'install'

==> [2019-10-03-20:48:03.194513] './configure' '--prefix=/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/slepc-3.12.0-5md6u45rynyaqtcta4e5dmecqhkp2jkr' '--with-arpack-dir=/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/arpack-ng-3.7.0-i5fx7mowpxx7acbasidsfc4r3owcd2vx/lib' '--with-arpack-flags=-lparpack,-larpack'

Checking environment... done

Checking PETSc installation...

ERROR: Unable to link with PETSc

ERROR: See "installed-arch-linux2-c-opt/lib/slepc/conf/configure.log" file for details




Starting Configure Run at Thu Oct  3 20:48:03 2019

Configure Options: --prefix=/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/slepc-3.12.0-5md6u45rynyaqtcta4e5dmecqhkp2jkr --with-arpack-dir=/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/arpack-ng-3.7.0-i5fx7mowpxx7acbasidsfc4r3owcd2vx/lib --with-arpack-flags=-lparpack,-larpack

Working directory: /tmp/deal.ii/spack-stage/slepc-3.12.0-5md6u45rynyaqtcta4e5dmecqhkp2jkr/spack-src

Python version:
2.7.16 (default, Oct  3 2019, 20:40:41)
[GCC 9.2.0]

make: /usr/bin/gmake

PETSc source directory: /home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/petsc-3.12.0-7b3mdm63ap32riorneym2mtcmwjlb63s
PETSc install directory: /home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/petsc-3.12.0-7b3mdm63ap32riorneym2mtcmwjlb63s
PETSc version: 3.12.0

SLEPc source directory: /tmp/deal.ii/spack-stage/slepc-3.12.0-5md6u45rynyaqtcta4e5dmecqhkp2jkr/spack-src
SLEPc install directory: /home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/slepc-3.12.0-5md6u45rynyaqtcta4e5dmecqhkp2jkr
SLEPc version: 3.12.0



Checking PETSc installation...

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - 

Running command:

cd /tmp/slepc-7TxU8j;/usr/bin/gmake checklink TESTFLAGS=""

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - 

#include "petscsnes.h"

int main() {

Vec v; Mat m; KSP k;

PetscInitializeNoArguments();

VecCreate(PETSC_COMM_WORLD,&v);

MatCreate(PETSC_COMM_WORLD,&m);

KSPCreate(PETSC_COMM_WORLD,&k);

return 0;

}

/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/openmpi-3.1.4-4lzhe2gtz3nzhffn6efu2fzgochphcix/bin/mpicc -o checklink.o -c -fPIC
  -I/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/petsc-3.12.0-7b3mdm63ap32riorneym2mtcmwjlb63s/include
  -I/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/hypre-2.18.0-dbexk2cnwvnsjd5fm6ltw7o7q66ik3hy/include
  -I/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/superlu-dist-6.1.1-stsykz4xojdqtlnjavms2opkppopzush/include
  -I/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/hdf5-1.10.5-lt5jyi3ix6dbrbblku7ygutgek7wg5w2/include
  -I/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/parmetis-4.0.3-p3vaameiqho6enhkpjhcupk5lam6jvc6/include
  -I/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/metis-5.1.0-zepovp3vvqzcirbxoqyb33fg5mm26spe/include
  -I/home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/zlib-1.2.11-fa7l75havytsbgz77sh6yyzvqgmmm5dj/include
  `pwd`/checklink.c
/home/deal.ii/spack/opt/spack/linux-centos7-ivybridg

Re: [deal.II] Re: error during installation with spack on CentOS7

2019-10-04 Thread Denis Davydov
Great, happy to hear you sorted this out by tweaking the compilers.yaml
settings.

On Friday, October 4, 2019 at 12:38:20 PM UTC+2, Alberto Salvadori wrote:
>
> Hi Denis
>
> thanks for your note. I figured out that adding the path to the
> compilers.yaml file, in this way,
>
> - compiler:
>     environment:
>       append-path:
>         LD_LIBRARY_PATH: /usr/local/lib64
>     extra_rpaths: []
>     flags: {}
>     modules: []
>     operating_system: centos7
>     paths:
>       cc: /usr/local/bin/gcc
>       cxx: /usr/local/bin/g++
>       f77: /usr/local/bin/gfortran
>       fc: /usr/local/bin/gfortran
>     spec: gcc@9.2.0
>     target: x86_64
>
> sorts out the issue.

Re: [deal.II] Re: error during installation with spack on CentOS7

2019-10-04 Thread Alberto Salvadori
Hi Denis

thanks for your note. I figured out that adding the path to the
compilers.yaml file, in this way,

- compiler:
    environment:
      append-path:
        LD_LIBRARY_PATH: /usr/local/lib64
    extra_rpaths: []
    flags: {}
    modules: []
    operating_system: centos7
    paths:
      cc: /usr/local/bin/gcc
      cxx: /usr/local/bin/g++
      f77: /usr/local/bin/gfortran
      fc: /usr/local/bin/gfortran
    spec: gcc@9.2.0
    target: x86_64

sorts out the issue.


On Wednesday, October 2, 2019 at 7:26:06 PM UTC+2, Denis Davydov wrote:
>
> Hi Alberto,
>
> Looks like the issue is known to the Spack community:
> https://github.com/spack/spack/issues/11224 where there is also a
> possible source of the problem you can try as a fix (un-doing the one-line PR).
>
> Regards,
> Denis.
>
> On Wednesday, October 2, 2019 at 3:14:46 PM UTC+2, Bruno Turcksin wrote:
>>
>> Alberto, 
>>
>> So what happens is that spack is using gcc 9.2 like you want, but it
>> uses the libstdc++ from gcc 4.8.5. I usually use spack to install a new
>> compiler, and my compilers.yaml looks like yours *but* I have the path
>> to the correct libstdc++ in my LD_LIBRARY_PATH when I load the module.
>> So I guess you need to add that path somewhere in your compilers.yaml.
>>
>> Best, 
>>
>> Bruno 
>>
>> On Wed, Oct 2, 2019 at 08:51, Alberto Salvadori
>>  wrote:
>> > 
>> > Thank you, Bruno. In fact, my aim was to use my system compiler. 
>> > Here is my  .spack/linux/packages.yaml: 
>> > 
>> > packages:
>> >   all:
>> >     compiler: [gcc]
>> >     providers:
>> >       mpi: [openmpi]
>> >   openmpi:
>> >     version: [3.1.4]
>> >     paths:
>> >       openmpi@3.1.4%gcc@9.2.0: /usr/local/
>> >     buildable: False
>> >   perl:
>> >     paths:
>> >       perl@5.16.3%gcc@9.2.0: /usr
>> >   cmake:
>> >     version: [3.15.3]
>> >     paths:
>> >       cmake@3.15.3%gcc@9.2.0: /usr/local/
>> >   hdf5:
>> >     version: [1.8.12]
>> >     paths:
>> >       hdf5@1.8.12%gcc@9.2.0: /usr
>> >     variants: +hl+fortran
>> >   netcdf:
>> >     version: [7.2.0]
>> >     paths:
>> >       netcdf@7.2.0%gcc@9.2.0: /usr
>> >   netcdf-cxx:
>> >     version: [4.2.8]
>> >     paths:
>> >       netcdf-cxx@4.2.8%gcc@9.2.0: /usr
>> >   dealii:
>> >     variants: +optflags~python
>> > 
>> > 
>> > Shall I perhaps add something to the compiler flags (paths or so)?
>> > Here is also my .spack/linux/compilers.yaml:
>> > 
>> > compilers:
>> > - compiler:
>> >     environment: {}
>> >     extra_rpaths: []
>> >     flags: {}
>> >     modules: []
>> >     operating_system: centos7
>> >     paths:
>> >       cc: /usr/bin/gcc
>> >       cxx: /usr/bin/g++
>> >       f77: /usr/bin/gfortran
>> >       fc: /usr/bin/gfortran
>> >     spec: gcc@4.8.5
>> >     target: x86_64
>> > - compiler:
>> >     environment: {}
>> >     extra_rpaths: []
>> >     flags: {}
>> >     modules: []
>> >     operating_system: centos7
>> >     paths:
>> >       cc: /usr/local/bin/gcc
>> >       cxx: /usr/local/bin/g++
>> >       f77: /usr/local/bin/gfortran
>> >       fc: /usr/local/bin/gfortran
>> >     spec: gcc@9.2.0
>> >     target: x86_64
>> > 
>> > 
>> > 
>> > Alberto 
>> > 
>> > 
>> > Alberto Salvadori 
>> >  Dipartimento di Ingegneria Meccanica e Industriale (DIMI) 
>> >  Universita` di Brescia, via Branze 43, 25123 Brescia 
>> >  Italy 
>> >  tel 030 3711239 
>> >  fax 030 3711312 
>> > 
>> > e-mail: 
>> >  alberto@unibs.it 
>> > web-page: 
>> >  http://m4lab.unibs.it/faculty.html 
>> > 
>> > 
>> > 
>> > On Wed, Oct 2, 2019 at 2:41 PM Bruno Turcksin  wrote:
>> >> 
>> >> Alberto, 
>> >> 
>> >> On Wednesday, October 2, 2019 at 7:24:32 AM UTC-4, Alberto Salvadori wrote:
>> >>> 
>> >>> 
>> >>> Thank you so much W and D,
>> >>> As you pointed out, there seems to be a mistake in the most recent
>> >>> version of perl during installation.
>> >>> I will propagate this to the proper communities.
>> >>>
>> >>> As Denis proposed, I simply went on to tell Spack to use the Perl from
>> >>> the system:
>> >>>
>> >>> perl:
>> >>>   paths:
>> >>>     perl@5.26.2%gcc@9.2.0: /usr
>> >>> 
>> >>> but I bumped into another issue: 
>> >>> 
>> >>> [deal.ii@localhost spack]$ spack install dealii^cmake@3.9.4 
>> >>> 
>> >>> ==> libsigsegv is already installed in /home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/libsigsegv-2.11-brkulrpubdu66nzym2zt2j6c3h6nw463
>> >>>
>> >>> ==> m4 is already installed in /home/deal.ii/spack/opt/spack/linux-centos7-ivybridge/gcc-9.2.0/m4-1.4.18-23npyrcdfzqehgp4s2mhka4nknjjkbzt
>> >>>
>> >>> ==> perl@5.16.3 : externa