Re: [petsc-users] VecSetSizes hangs in MPI

2017-01-04 Thread Manuel Valera
Thanks, I had no idea how to debug and read those logs; that solved this issue at least (I was sending a message from root to everyone else, but trying to receive on everyone else including root). Until next time, many thanks, Manuel. On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley
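The mismatch Manuel describes (root sends to every other rank, but every rank, root included, posts a receive) is a classic point-to-point deadlock: root blocks on a receive that no rank will ever match. A minimal sketch of the bug and two fixes in plain MPI C follows; the variable names and the value 42 are illustrative, not from the thread:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, size, value = 0;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Bug pattern described above: root sends to ranks 1..size-1, but
       EVERY rank (including root) posts MPI_Recv. Root's receive never
       matches anything, so the program hangs. */

    /* Fix 1: only non-root ranks receive. */
    if (rank == 0) {
        value = 42;
        for (int dest = 1; dest < size; dest++)
            MPI_Send(&value, 1, MPI_INT, dest, 0, MPI_COMM_WORLD);
    } else {
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
    }

    /* Fix 2 (simpler): MPI_Bcast is collective, so every rank, root
       included, makes the same call and the mismatch cannot occur. */
    MPI_Bcast(&value, 1, MPI_INT, 0, MPI_COMM_WORLD);

    printf("rank %d has value %d\n", rank, value);
    MPI_Finalize();
    return 0;
}
```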

Re: [petsc-users] VecSetSizes hangs in MPI

2017-01-04 Thread Matthew Knepley
On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera wrote: > I did a PetscBarrier just before calling the vicariate routine and I'm > pretty sure I'm calling it from every processor; code looks like this: From the gdb trace. Proc 0: Is in some MPI routine you call yourself,

Re: [petsc-users] VecSetSizes hangs in MPI

2017-01-04 Thread Manuel Valera
I did a PetscBarrier just before calling the vicariate routine and I'm pretty sure I'm calling it from every processor; the code looks like this:

  call PetscBarrier(PETSC_NULL_OBJECT,ierr)
  print*,'entering POInit from',rank
  !call exit()
  call PetscObjsInit()

And the output gives: entering POInit

Re: [petsc-users] VecSetSizes hangs in MPI

2017-01-04 Thread Dave May
Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s)? These functions cannot be inside if statements like if (rank == 0){ VecCreateMPI(...) }. On Wed, 4 Jan 2017 at 23:34, Manuel Valera wrote: > Thanks Dave for the quick answer, appreciate it, > > I just
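Dave's point is that PETSc object creation is collective over the object's communicator: every rank in that communicator must reach the call, or the ranks that do get there wait forever. A minimal sketch in PETSc C (assuming a standard PETSc build; error checking with PetscCall/CHKERRQ is omitted for brevity, and the global size 100 is illustrative):

```c
#include <petscvec.h>

int main(int argc, char **argv) {
    Vec x;
    PetscMPIInt rank;

    PetscInitialize(&argc, &argv, NULL, NULL);
    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

    /* WRONG: only rank 0 enters the collective call; the other ranks
       never reach it and the program hangs.
    if (rank == 0) {
        VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 100, &x);
    }
    */

    /* RIGHT: every rank in PETSC_COMM_WORLD makes the same collective
       call unconditionally. */
    VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 100, &x);

    VecDestroy(&x);
    PetscFinalize();
    return 0;
}
```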

Re: [petsc-users] VecSetSizes hangs in MPI

2017-01-04 Thread Matthew Knepley
On Wed, Jan 4, 2017 at 4:21 PM, Manuel Valera wrote: > Hello all, happy new year, > > I'm working on parallelizing my code; it worked and provided some results > when I just called more than one processor, but created artifacts because I > didn't need one image of the

Re: [petsc-users] VecSetSizes hangs in MPI

2017-01-04 Thread Barry Smith
> On Jan 4, 2017, at 4:21 PM, Manuel Valera wrote: > > Hello all, happy new year, > > I'm working on parallelizing my code; it worked and provided some results > when I just called more than one processor, but created artifacts because I > didn't need one image of the

Re: [petsc-users] VecSetSizes hangs in MPI

2017-01-04 Thread Manuel Valera
Thanks Dave for the quick answer, appreciate it. I just tried that and it didn't make a difference; any other suggestions? Thanks, Manuel. On Wed, Jan 4, 2017 at 2:29 PM, Dave May wrote: > You need to swap the order of your function calls. > Call VecSetSizes() before

Re: [petsc-users] VecSetSizes hangs in MPI

2017-01-04 Thread Dave May
You need to swap the order of your function calls. Call VecSetSizes() before VecSetType(). Thanks, Dave. On Wed, 4 Jan 2017 at 23:21, Manuel Valera wrote: Hello all, happy new year, I'm working on parallelizing my code; it worked and provided some results when I just
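The ordering Dave recommends fits PETSc's usual create/configure/type sequence. A minimal sketch in PETSc C (assuming a standard PETSc build; PetscCall/CHKERRQ error checking omitted, and the global size 100 is illustrative):

```c
#include <petscvec.h>

int main(int argc, char **argv) {
    Vec v;

    PetscInitialize(&argc, &argv, NULL, NULL);

    VecCreate(PETSC_COMM_WORLD, &v);
    VecSetSizes(v, PETSC_DECIDE, 100); /* set sizes first ...        */
    VecSetType(v, VECMPI);             /* ... then choose the type   */
    /* (VecSetFromOptions(v) can replace VecSetType to let the type
       be chosen at run time via command-line options.) */

    VecDestroy(&v);
    PetscFinalize();
    return 0;
}
```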

[petsc-users] VecSetSizes hangs in MPI

2017-01-04 Thread Manuel Valera
Hello all, happy new year, I'm working on parallelizing my code; it worked and provided some results when I just called more than one processor, but created artifacts because I didn't need one image of the whole program in each processor, conflicting with each other. Since the pressure solver is