This I already did with mpiexec -n 20 ... and there the error occurred. I was also a little surprised that this error occurred. Our computation nodes have 20 cores with 6 GB RAM. Is PETSc/SLEPc storing the dense eigenvectors on one core?
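As a back-of-the-envelope check on the memory question: using the dimensions from the overflow message quoted below (4001 basis columns of length 768000, double precision), and assuming PETSc distributes the rows evenly across processes (a simplification, not a statement about PETSc internals):

```python
# Rough memory estimate for 4001 dense basis vectors of length 768000
# (the dimensions from the PetscInt overflow message), with 8-byte doubles.
n = 768_000          # vector dimension
m = 4_001            # number of BV columns (3000 requested + workspace)
bytes_per_double = 8

total_gib = n * m * bytes_per_double / 1024**3
per_proc_gib = total_gib / 20   # hypothetical even split over 20 processes

print(f"total: {total_gib:.1f} GiB")          # ~22.9 GiB
print(f"per process (n=20): {per_proc_gib:.2f} GiB")  # ~1.14 GiB
```

So the basis alone is on the order of 23 GiB in total; if it were held on a single process it would not fit in 6 GB of RAM, whereas an even split over 20 processes needs only about 1.1 GiB each.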
On Fri, 19 Oct 2018 at 12:52, Jan Grießer <griesser....@googlemail.com> wrote:
> This I already did with mpiexec -n 20 ... and there the error occurred. I
> was also a little surprised that this error occurred. Our computation
> nodes have 20 cores with 6 GB RAM.
> Is PETSc/SLEPc storing the dense eigenvectors on one core?
>
> On Fri, 19 Oct 2018 at 11:08, Jose E. Roman <jro...@dsic.upv.es> wrote:
>
>> No, I mean to run in parallel:
>>
>> $ mpiexec -n 8 python ex1.py
>>
>> Jose
>>
>>
>> > On 19 Oct 2018, at 11:01, Jan Grießer <griesser....@googlemail.com>
>> wrote:
>> >
>> > With more than 1 MPI process, do you mean I should use spectrum slicing
>> to divide the full problem into smaller subproblems?
>> > The --with-64-bit-indices option is not a possibility for me, since I
>> configured PETSc with MUMPS, which does not allow the 64-bit version
>> (at least this was the error message when I tried to configure PETSc).
>> >
>> > On Wed, 17 Oct 2018 at 18:24, Jose E. Roman <jro...@dsic.upv.es> wrote:
>> > To use BVVECS just add the command-line option -bv_type vecs
>> > This causes a separate Vec to be used for each column, instead of a
>> single long Vec of size n*m. But it is considerably slower than the
>> default.
>> >
>> > Anyway, for such large problems you should consider using more than 1
>> MPI process. In that case the error may disappear because the local size
>> is smaller than 768000.
>> >
>> > Jose
>> >
>> >
>> > > On 17 Oct 2018, at 17:58, Matthew Knepley <knep...@gmail.com> wrote:
>> > >
>> > > On Wed, Oct 17, 2018 at 11:54 AM Jan Grießer <
>> griesser....@googlemail.com> wrote:
>> > > Hi all,
>> > > I am using slepc4py and petsc4py to solve for the smallest real
>> eigenvalues and eigenvectors.
>> > > For my test cases with a matrix A of size 30k x 30k, solving for the
>> smallest solutions works quite well, but when I increase the dimension of
>> my system to around A = 768000 x 768000 or 3 million x 3 million and ask
>> for the smallest 3000 real eigenvalues and eigenvectors (the number
>> increases with increasing system size), I get the output (for the 768000
>> case):
>> > > The product 4001 times 768000 overflows the size of PetscInt;
>> consider reducing the number of columns, or use BVVECS instead
>> > > I understand that the requested number of eigenvectors and
>> eigenvalues is causing an overflow, but I do not understand the solution
>> to the problem stated in the error message. Can someone tell me what
>> exactly BVVECS is and how I can use it? Or is there any other solution to
>> my problem?
>> > >
>> > > You can also reconfigure with 64-bit integers: --with-64-bit-indices
>> > >
>> > > Thanks,
>> > >
>> > > Matt
>> > >
>> > > Thank you very much in advance,
>> > > Jan
>> > >
>> > >
>> > >
>> > > --
>> > > What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which
>> their experiments lead.
>> > > -- Norbert Wiener
>> > >
>> > > https://www.cse.buffalo.edu/~knepley/
>> >
>>
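For reference, the overflow in the quoted error message can be reproduced with plain integer arithmetic: the default BV keeps all m columns in one contiguous Vec of length n*m, and that product must fit in a 32-bit PetscInt. This is a sketch of the size check, not PETSc's actual code:

```python
# The default PetscInt is a signed 32-bit integer; the contiguous BV
# storage needs an index range of n*m entries.
PETSC_INT_MAX = 2**31 - 1           # 2147483647

n = 768_000   # local vector length reported in the error
m = 4_001     # number of BV columns

product = n * m
print(product)                      # 3072768000
print(product > PETSC_INT_MAX)      # True: overflows a 32-bit PetscInt

local_n = n // 8                    # local size with mpiexec -n 8
print(local_n * m > PETSC_INT_MAX)  # False: the local product fits
```

This matches the advice in the thread: `-bv_type vecs` avoids the single length-n*m Vec entirely, and running with enough MPI processes shrinks the local size until n_local * m stays below the 32-bit limit.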