Re: [petsc-users] question about MatCreateRedundantMatrix

2019-09-18 Thread hong--- via petsc-users
Michael,
We do have support for MatCreateRedundantMatrix with dense matrices. For
example, see petsc/src/mat/examples/tests/ex9.c:
mpiexec -n 4 ./ex9 -mat_type dense -view_mat -nsubcomms 2
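
For illustration, here is a minimal sketch of the call itself (a hypothetical
standalone example, not the actual ex9.c; error checking omitted). Each of the
two subcommunicators receives its own full copy of the dense matrix:

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat      A, Ared;
  PetscInt n = 10;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* parallel dense matrix; fill and assemble as usual */
  MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n, NULL, &A);
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  /* nsubcomm = 2; MPI_COMM_NULL lets PETSc split the communicator itself */
  MatCreateRedundantMatrix(A, 2, MPI_COMM_NULL, MAT_INITIAL_MATRIX, &Ared);

  MatDestroy(&Ared);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}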

Hong

On Wed, Sep 18, 2019 at 5:40 PM Povolotskyi, Mykhailo via petsc-users <
petsc-users@mcs.anl.gov> wrote:

> Dear Petsc developers,
>
> I found that MatCreateRedundantMatrix does not support dense matrices.
>
> This causes the following problem: I cannot use the CISS eigensolver from
> SLEPc with dense matrices with parallelization over quadrature points.
>
> Is it possible for you to add this support?
>
> Thank you,
>
> Michael.
>
>
> p.s. I apologize if you received this e-mail twice; I sent it first from
> a different address.
>
>


Re: [petsc-users] question about MatCreateRedundantMatrix

2019-09-19 Thread Jose E. Roman via petsc-users
Michael,

I should have checked this better in my previous email: the CISS solver does
indeed work with dense matrices:

$ mpiexec -n 2 ./ex2 -n 30 -eps_type ciss -terse -rg_type ellipse 
-rg_ellipse_center 1.175 -rg_ellipse_radius 0.075 -eps_ciss_partitions 2 
-mat_type dense 

2-D Laplacian Eigenproblem, N=900 (30x30 grid)

 Solution method: ciss

 Number of requested eigenvalues: 1
 Found 15 eigenvalues, all of them computed up to the required tolerance:
 1.10416, 1.10416, 1.10455, 1.10455, 1.12947, 1.12947, 1.13426, 1.13426, 
 1.16015, 1.16015, 1.19338, 1.19338, 1.21093, 1.21093, 1.24413
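
The -eps_ciss_partitions option can also be set programmatically. A minimal
sketch (assuming an already-created EPS object eps, with all other CISS sizes
left at their defaults):

  /* split the CISS quadrature points over 2 subcommunicators (npart = 2);
     PETSC_DEFAULT keeps integration points, block sizes, etc. unchanged */
  EPSCISSSetSizes(eps, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT,
                  2, PETSC_DEFAULT, PETSC_FALSE);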


There might be something different in the way matrices are initialized in your 
code. Send me a simple example that reproduces the problem and I will track it 
down.

Sorry for the confusion.
Jose



> On 19 Sep 2019, at 6:20, hong--- via petsc-users
> <petsc-users@mcs.anl.gov> wrote:
> 
> Michael,
> We do have support for MatCreateRedundantMatrix with dense matrices. For
> example, see petsc/src/mat/examples/tests/ex9.c:
> mpiexec -n 4 ./ex9 -mat_type dense -view_mat -nsubcomms 2
> 
> Hong
> 
> On Wed, Sep 18, 2019 at 5:40 PM Povolotskyi, Mykhailo via petsc-users
> <petsc-users@mcs.anl.gov> wrote:
> Dear Petsc developers,
> 
> I found that MatCreateRedundantMatrix does not support dense matrices.
> 
> This causes the following problem: I cannot use the CISS eigensolver from
> SLEPc with dense matrices with parallelization over quadrature points.
> 
> Is it possible for you to add this support?
> 
> Thank you,
> 
> Michael.
> 
> 
> p.s. I apologize if you received this e-mail twice; I sent it first from
> a different address.
> 



Re: [petsc-users] question about MatCreateRedundantMatrix

2019-09-19 Thread Povolotskyi, Mykhailo via petsc-users
Hello Jose,

I have made a test case that reproduces my error.

1. You will need to download the file "matrix.bin" from the following link:

https://www.dropbox.com/s/6y7ro99ou4qr8uy/matrix.bin?dl=0

2. Here is the C++ code I use:

#include <iostream>
#include <fstream>
#include <vector>
#include <complex>
#include <string>
#include <stdexcept>
#include <cmath>
#include <sys/stat.h>

#include "mpi.h"
#include "petscmat.h"
#include "slepcsys.h"
#include "slepceps.h"
#include "slepcrg.h"
using namespace std;

void read_matrix(const std::string& filename,
                 int& matrix_size,
                 std::vector<std::complex<double> >& data)
{

   int file_size;

   struct stat results;

   if (stat(filename.c_str(), &results) == 0)
   {
     file_size = results.st_size;
   }
   else
   {
     throw runtime_error("Wrong file\n");
   }

   // each entry is a complex<double>; the file holds a square matrix
   int data_size = file_size / sizeof(std::complex<double>);

   int n1 = (int) sqrt(data_size);

   if (n1 * n1 == data_size)
   {
     matrix_size = n1;
   }
   else
   {
     throw runtime_error("Wrong file size\n");
   }

   data.resize(matrix_size*matrix_size);

   ifstream myFile (filename.c_str(), ios::in | ios::binary);
   myFile.read ((char*) data.data(), file_size);


}


int main(int argc, char* argv[])
{
   MPI_Init(NULL, NULL);
   PetscInitialize(&argc, &argv,(char*)0, (char* )0);
   SlepcInitialize(&argc, &argv,(char*)0, (char* )0);

   int rank;

   MPI_Comm_rank(MPI_COMM_WORLD, &rank);
   string filename("matrix.bin");


   int matrix_size;

   std::vector > data;

   read_matrix(filename, matrix_size,  data);
   if (rank == 0)
   {
     cout << "matrix size " << matrix_size << "\n";
   }
   Mat mat;

   MatCreateDense(MPI_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
                  matrix_size, matrix_size, NULL, &mat);

   int local_row_begin;
   int local_row_end;

   MatGetOwnershipRange(mat, &local_row_begin, &local_row_end);

   // fill the locally owned rows; the file stores the matrix column-major
   for (int i = local_row_begin; i < local_row_end; i++)
   {
     vector<complex<double> > v(matrix_size);
     vector<int> index(matrix_size);
     for (int j = 0; j < matrix_size; j++)
     {
       v[j] = data[j*matrix_size + i];
       index[j] = j;
     }
     MatSetValues(mat, 1, &i, matrix_size, index.data(), v.data(),
                  INSERT_VALUES);
   }

   MatAssemblyBegin(mat,MAT_FINAL_ASSEMBLY);
   MatAssemblyEnd(mat,MAT_FINAL_ASSEMBLY);



   // CISS integration region: circle of radius 100 centered at the origin
   complex<double> center(0, 0);
   double radius(100);
   double vscale(1.0);

   EPS eps;
   RG  rg;

   EPSCreate(MPI_COMM_WORLD, &eps);
   EPSSetOperators(eps, mat, NULL);
   EPSSetType(eps, EPSCISS);
   EPSSetProblemType(eps, EPS_NHEP);

   EPSSetFromOptions(eps);

   EPSGetRG(eps, &rg);
   RGSetType(rg, RGELLIPSE);
   RGEllipseSetParameters(rg, center, radius, vscale);
   EPSSolve(eps);

   if (rank == 0)
   {
     int nconv;
     EPSGetConverged(eps,&nconv);
     for (int i = 0; i < nconv; i++)
     {
       complex<double> a, b;
       EPSGetEigenvalue(eps, i, &a, &b);
       cout << a << "\n";
     }
   }

   // finalize in reverse order of initialization
   SlepcFinalize();
   PetscFinalize();
   MPI_Finalize();
}
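
For completeness, a minimal makefile sketch for building this test case,
assuming the standard SLEPc makefile setup (PETSC_DIR, SLEPC_DIR and
PETSC_ARCH set in the environment; the source file name "test_ciss.cpp" is a
placeholder):

include ${SLEPC_DIR}/lib/slepc/conf/slepc_common

test_ciss: test_ciss.o
	-${CLINKER} -o test_ciss test_ciss.o ${SLEPC_EPS_LIB}
	${RM} test_ciss.o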

3. If I run it as mpiexec -n 1 a.out -eps_ciss_partitions 1, it works well.
If I run it as mpiexec -n 2 a.out -eps_ciss_partitions 2,
I get the following error message:
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: No support for this operation for this object type
[0]PETSC ERROR: Mat type seqdense
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.8.4, Mar, 24, 2018
[0]PETSC ERROR: a.out on a linux-complex named 
brown-fe03.rcac.purdue.edu by mpovolot Thu Sep 19 14:02:06 2019
[0]PETSC ERROR: Configure options --with-scalar-type=complex --with-x=0 
--with-hdf5 --download-hdf5=1 --with-single-library=1 --with-pic=1 
--with-shared-libraries=0 --with-log=0 --with-clanguage=C++ 
--CXXFLAGS="-fopenmp -fPIC" --CFLAGS="-fopenmp -fPIC" --with-fortran=0 
--FFLAGS="-fopenmp -fPIC" --with-debugging=0 --with-cc=mpicc 
--with-fc=mpif90 --with-cxx=mpicxx COPTFLAGS= CXXOPTFLAGS= FOPTFLAGS= 
--download-metis=1 --download-parmetis=1 
--with-valgrind-dir=/apps/brown/valgrind/3.13.0_gcc-4.8.5 
--download-mumps=1 --with-fortran-kernels=0 --download-superlu_dist=1 
--with-blaslapack-lib="-L/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64
-lmkl_intel_lp64 -lmkl_gnu_thread -lmkl_core "
--with-blacs-lib=/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.so
--with-blacs-include=/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/include
--with-scalapack-lib="-Wl,-rpath,/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64
-L/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64
-lmkl_gf_lp64 -lmkl_gnu_thread -lmkl_core -lpthread
-L/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64
-lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64"
--with-scalapack-include=/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/include
[0]PETSC ERROR: #1 MatCreateMPIMatConcatenateSeqMat() line 10547 in 
/depot/kildisha

Re: [petsc-users] question about MatCreateRedundantMatrix

2019-09-19 Thread Zhang, Hong via petsc-users
Michael,

--
[0]PETSC ERROR: No support for this operation for this object type
[0]PETSC ERROR: Mat type seqdense
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.8.4, Mar, 24, 2018

This is an old version of PETSc. Can you update to the latest PETSc release?
Hong





Re: [petsc-users] question about MatCreateRedundantMatrix

2019-09-19 Thread Povolotskyi, Mykhailo via petsc-users
Hong,

do you have a specific reason in mind why the newer version should work, or is
it a general recommendation?

Which stable version would you recommend to upgrade to?

Thank you,

Michael.

On 09/19/2019 02:22 PM, Zhang, Hong wrote:
Michael,

--
[0]PETSC ERROR: No support for this operation for this object type
[0]PETSC ERROR: Mat type seqdense
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.8.4, Mar, 24, 2018

This is an old version of PETSc. Can you update to the latest PETSc release?
Hong






Re: [petsc-users] question about MatCreateRedundantMatrix

2019-09-20 Thread Jose E. Roman via petsc-users
I have tried with slepc-master and it works:

$ mpiexec -n 2 ./ex1 -eps_ciss_partitions 2
matrix size 774
(-78.7875,8.8022)
(-73.9569,-42.2401)
(-66.9942,-7.50907)
(-62.262,-2.71603)
(-58.9716,0.60)
(-57.9883,0.298729)
(-57.8323,1.06041)
(-56.5317,1.10758)
(-56.0234,45.2405)
(-54.4058,2.88373)
(-25.946,26.0317)
(-23.5383,-16.9096)
(-19.0999,0.194467)
(-18.795,1.15113)
(-15.3051,0.915914)
(-14.803,-0.00475538)
(-8.52467,10.6032)
(-4.36051,2.29996)
(-0.525758,0.796658)
(1.41227,0.112858)
(1.53801,0.446984)
(9.43357,0.505277)

slepc-master will become version 3.12 in a few days. I have not tried with 3.11 
but I think it should work.

It is always recommended to use the latest version. Version 3.8 is two years 
old.

Jose


> On 19 Sep 2019, at 20:33, Povolotskyi, Mykhailo
> wrote:
> 
> Hong,
> 
> do you have a specific reason in mind why the newer version should work, or
> is it a general recommendation?
> 
> Which stable version would you recommend to upgrade to?
> 
> Thank you,
> 
> Michael.