Re: [deal.II] compress error for vector of PETScWrappers::MPI::SparseMatrix

2017-06-07 Thread Weixiong Zheng
Daniel and Dr Bangerth,

Thanks for the help. I got it resolved by modifying the dsp.

That was indeed my misunderstanding, haha.

Best,
Weixiong

On Wednesday, June 7, 2017 at 10:33:57 AM UTC-7, Wolfgang Bangerth wrote:
>
> On 06/07/2017 11:14 AM, Weixiong Zheng wrote: 
> > 
> > I see your point. I will double-check later. The reason I didn't use 
> > locally relevant dofs is that I didn't design the code to do h-refinement, 
> > so I didn't see the point of using relevant dofs. 
>
> Then you misunderstand the difference between locally active and locally 
> relevant dofs :-) The difference has nothing to do with h-refinement. 
>
> Best 
>   W. 
>
> --
> Wolfgang Bangerth          email: bang...@colostate.edu
>                            www:   http://www.math.colostate.edu/~bangerth/
>
>
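
A minimal sketch (not from the original code) of how the two index sets 
Wolfgang refers to are obtained in deal.II 8.5; dof_handler follows the code 
posted further down in this thread:

// DoFs owned by this MPI rank.
IndexSet locally_owned_dofs = dof_handler.locally_owned_dofs ();

// Owned DoFs plus the ghost DoFs on cells adjacent to locally owned cells --
// needed whenever a process touches rows/columns it does not own,
// independently of whether h-refinement is used.
IndexSet locally_relevant_dofs;
DoFTools::extract_locally_relevant_dofs (dof_handler, locally_relevant_dofs);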



Re: [deal.II] compress error for vector of PETScWrappers::MPI::SparseMatrix

2017-06-07 Thread Weixiong Zheng
Daniel,

I see your point. I will double-check later. The reason I didn't use locally 
relevant dofs is that I didn't design the code to do h-refinement, so I didn't 
see the point of using relevant dofs.

Thanks,
Weixiong

On Wednesday, June 7, 2017 at 6:35:44 AM UTC-7, Daniel Arndt wrote:
>
> Weixiong,
>
> [...]
>> SparsityTools::distribute_sparsity_pattern (dsp,
>>                                             dof_handler.n_locally_owned_dofs_per_processor (),
>>                                             mpi_communicator,
>>                                             locally_owned_dofs);
>>
> This looks a bit suspicious. Normally, you should use the set of locally 
> relevant dofs here [1].
> You also might want to compare with step-40 [2].
>
> Best,
> Daniel
>
> [1] 
> https://www.dealii.org/8.5.0/doxygen/deal.II/namespaceSparsityTools.html#ae2c7bdbdb62642f60d60087e4cb6195f
> [2] 
> https://www.dealii.org/8.5.0/doxygen/deal.II/step_40.html#LaplaceProblemsetup_system
>
>
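
A sketch of the setup Daniel suggests, following the step-40 pattern [2]; it 
assumes locally_relevant_dofs has already been filled via 
DoFTools::extract_locally_relevant_dofs, and otherwise mirrors the code posted 
below in this thread:

DynamicSparsityPattern dsp (locally_relevant_dofs);
DoFTools::make_sparsity_pattern (dof_handler, dsp);
// (or DoFTools::make_flux_sparsity_pattern for the DFEM branch)
SparsityTools::distribute_sparsity_pattern (dsp,
                                            dof_handler.n_locally_owned_dofs_per_processor (),
                                            mpi_communicator,
                                            locally_relevant_dofs);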



Re: [deal.II] compress error for vector of PETScWrappers::MPI::SparseMatrix

2017-06-06 Thread Weixiong Zheng
This is how I set up my sparsity pattern:
if (discretization=="DFEM")
  fe = new FE_DGQ<dim> (p_order);
else
  fe = new FE_Q<dim> (p_order);

dof_handler.distribute_dofs (*fe);
locally_owned_dofs = dof_handler.locally_owned_dofs ();

DynamicSparsityPattern dsp (locally_owned_dofs);

if (discretization=="DFEM")
  DoFTools::make_flux_sparsity_pattern (dof_handler, dsp);
else
  DoFTools::make_sparsity_pattern (dof_handler, dsp);

SparsityTools::distribute_sparsity_pattern (dsp,
                                            dof_handler.n_locally_owned_dofs_per_processor (),
                                            mpi_communicator,
                                            locally_owned_dofs);



On Tuesday, June 6, 2017 at 12:07:18 PM UTC-7, Wolfgang Bangerth wrote:
>
> On 06/06/2017 12:10 PM, Weixiong Zheng wrote: 
> > It runs in serial. The error occurs when using multiple processors. 
>
> And it really happens when you call `compress()`? 
>
> The only thing I can think of is that you didn't set up the sparsity 
> pattern correctly. Can you show us the code that sets up the sparsity 
> pattern of these matrices? 
>
> Best 
>   W. 
>
> --
> Wolfgang Bangerth          email: bang...@colostate.edu
>                            www:   http://www.math.colostate.edu/~bangerth/
>



Re: [deal.II] compress error for vector of PETScWrappers::MPI::SparseMatrix

2017-06-06 Thread Wolfgang Bangerth

On 06/06/2017 12:10 PM, Weixiong Zheng wrote:

It runs in serial. The error occurs when using multiple processors.


And it really happens when you call `compress()`?

The only thing I can think of is that you didn't set up the sparsity 
pattern correctly. Can you show us the code that sets up the sparsity 
pattern of these matrices?


Best
 W.

--
Wolfgang Bangerth          email: bange...@colostate.edu
                           www:   http://www.math.colostate.edu/~bangerth/



Re: [deal.II] compress error for vector of PETScWrappers::MPI::SparseMatrix

2017-06-06 Thread Weixiong Zheng
It runs in serial. The error occurs when using multiple processors.



Re: [deal.II] compress error for vector of PETScWrappers::MPI::SparseMatrix

2017-06-06 Thread Wolfgang Bangerth

On 06/06/2017 03:13 AM, Weixiong Zheng wrote:


Now I get the following error: compressing goes through for direction 0 but not 
for direction 1; the error message is at the end of this post.

Any ideas how to fix it?

Thanks in advance!
Weixiong

[0]PETSC ERROR: - Error Message 
--


[0]PETSC ERROR: Argument out of range

[0]PETSC ERROR: Inserting a new nonzero at global row/column (6, 15) 
into matrix


You say this happens during the compress() call? That is strange. Are 
you running in parallel, or does this also happen if you run on one 
processor?


Best
 W.

--
Wolfgang Bangerth          email: bange...@colostate.edu
                           www:   http://www.math.colostate.edu/~bangerth/



[deal.II] compress error for vector of PETScWrappers::MPI::SparseMatrix

2017-06-06 Thread Weixiong Zheng
Dear All,

I am doing a radiation transport problem, which has multiple individual 
equations (their number is determined by the number of directions we input). As 
the number of directions is arbitrary, I keep the sparse matrices in a 
std::vector of raw pointers (using the LA namespace alias for PETScWrappers):
std::vector<LA::MPI::SparseMatrix*> sys_mats;
When initializing:
for (unsigned int i=0; i<n_dir; ++i)   // n_dir: the number of directions
{
  sys_mats.push_back (new LA::MPI::SparseMatrix);
  sys_mats[i]->reinit (locally_owned_dofs,
                       locally_owned_dofs,
                       dsp,
                       mpi_communicator);
}

When assembling:
for (; cell!=endc; ++cell)
{
  if (cell->is_locally_owned())
  {
    // local matrices for all directions on this cell
    // (FullMatrix<double> assumed as the local matrix type)
    std::vector<FullMatrix<double> > local_mats;
    // do local matrix and boundary term assembly for all directions in one cell
    cell->get_dof_indices (local_dof_indices);

    // Map local matrices to global matrices for all directions
    for (unsigned int i=0; i<n_dir; ++i)
    {
      sys_mats[i]->add (local_dof_indices,
                        local_mats[i]);
    }
  }
}

Then we compress:
for (unsigned int i=0; i<n_dir; ++i)
{
  sys_mats[i]->compress (VectorOperation::add);
  pcout << "we have compressed Dir " << i << std::endl;
}

Now I get the following error: compressing goes through for direction 0 but not 
for direction 1; the error message is at the end of this post.
Any ideas how to fix it?

Thanks in advance!
Weixiong

[0]PETSC ERROR: - Error Message 
--

[0]PETSC ERROR: Argument out of range

[0]PETSC ERROR: Inserting a new nonzero at global row/column (6, 15) into 
matrix

[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
trouble shooting.

[0]PETSC ERROR: Petsc Release Version 3.6.3, unknown 


[0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 613 in 
/usr/local/src/d2-parallel-package/petsc/src/mat/impls/aij/mpi/mpiaij.c

[0]PETSC ERROR: #2 MatAssemblyEnd_MPIAIJ() line 731 in 
/usr/local/src/d2-parallel-package/petsc/src/mat/impls/aij/mpi/mpiaij.c

[0]PETSC ERROR: #3 MatAssemblyEnd() line 5098 in 
/usr/local/src/d2-parallel-package/petsc/src/mat/interface/matrix.c

ERROR: Uncaught exception in MPI_InitFinalize on proc 0. Skipping 
MPI_Finalize() to avoid a deadlock.



Exception on processing: 




An error occurred in line <263> of file 
<../source/lac/petsc_matrix_base.cc> in function

void dealii::PETScWrappers::MatrixBase::compress(const 
VectorOperation::values)

The violated condition was: 

ierr == 0

The name and call sequence of the exception was:

ExcPETScError(ierr)

Additional Information: 

An error with error number 63 occurred while calling a PETSc function




Aborting!



---

Primary job  terminated normally, but 1 process returned

a non-zero exit code.. Per user-direction, the job has been aborted.

---

--

MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD 

with errorcode 59.


NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.

You may or may not see output from other processes, depending on

exactly when Open MPI kills them.

--

[1]PETSC ERROR: 


[1]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the 
batch system) has told this process to end

[1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger

[1]PETSC ERROR: or see 
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind

[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X 
to find memory corruption errors

[1]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and 
run 

[1]PETSC ERROR: to get more information on the crash.

[1]PETSC ERROR: - Error Message 
--

[1]PETSC ERROR: Signal received

[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
trouble shooting.

[1]PETSC ERROR: Petsc Release Version 3.6.3, unknown 

[1]PETSC ERROR: dg-ep-proto on a uly named my_hostname by GrillCheese Tue 
Jun  6 01:18:23 2017

[1]PETSC ERROR: Configure options --with-make-np=8 --with-debugging=0 
--prefix=/Applications/deal.II.app/Contents/Resources/opt/petsc-3e25e16 
--with-mpi-dir=/Applications/deal.II.app/Contents/Resources/opt/openmpi-1.10.2