Re: [deal.II] compress error for vector of PETScWrappers::MPI::SparseMatrix

2017-06-07 Thread Weixiong Zheng
Daniel and Dr Bangerth,

Thanks for the help. I got it resolved by modifying the dsp.

That was indeed my misunderstanding, hahaha.
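
In case it helps someone else, the modified dsp setup now roughly follows
step-40; a minimal sketch only (locally_relevant_dofs is the step-40 name,
not necessarily the exact code I use):

    IndexSet locally_relevant_dofs;
    DoFTools::extract_locally_relevant_dofs (dof_handler, locally_relevant_dofs);

    // Build the dynamic sparsity pattern over the locally relevant dofs,
    // not only the locally owned ones.
    DynamicSparsityPattern dsp (locally_relevant_dofs);
    DoFTools::make_sparsity_pattern (dof_handler, dsp);

    // The last argument of distribute_sparsity_pattern is the set of
    // locally relevant dofs as well.
    SparsityTools::distribute_sparsity_pattern (dsp,
                                                dof_handler.n_locally_owned_dofs_per_processor (),
                                                mpi_communicator,
                                                locally_relevant_dofs);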

Best,
Weixiong

On Wednesday, June 7, 2017 at 10:33:57 AM UTC-7, Wolfgang Bangerth wrote:
>
> On 06/07/2017 11:14 AM, Weixiong Zheng wrote:
> >
> > I see your point. I will double check later. The reason I didn't use
> > locally relevant dofs is that I didn't design the code to do
> > h-refinement, so I didn't see the point of using relevant dofs.
>
> Then you misunderstand the difference between locally active and locally 
> relevant dofs :-) The difference has nothing to do with h-refinement. 
>
> Best 
>   W. 
>
> --
> Wolfgang Bangerth  email: bang...@colostate.edu
>                    www:   http://www.math.colostate.edu/~bangerth/
>



Re: [deal.II] compress error for vector of PETScWrappers::MPI::SparseMatrix

2017-06-07 Thread Weixiong Zheng
Daniel,

I see your point. I will double check later. The reason I didn't use 
locally relevant dofs is that I didn't design the code to do h-refinement, 
so I didn't see the point of using relevant dofs.

Thanks,
Weixiong

On Wednesday, June 7, 2017 at 6:35:44 AM UTC-7, Daniel Arndt wrote:
>
> Weixiong,
>
> [...]
> > SparsityTools::distribute_sparsity_pattern (dsp,
> >                                             dof_handler.n_locally_owned_dofs_per_processor (),
> >                                             mpi_communicator,
> >                                             locally_owned_dofs);
> >
> This looks a bit suspicious. Normally, you should use the set of locally 
> relevant dofs here [1].
> You also might want to compare with step-40 [2].
>
> Best,
> Daniel
>
> [1] 
> https://www.dealii.org/8.5.0/doxygen/deal.II/namespaceSparsityTools.html#ae2c7bdbdb62642f60d60087e4cb6195f
> [2] 
> https://www.dealii.org/8.5.0/doxygen/deal.II/step_40.html#LaplaceProblemsetup_system
>



Re: [deal.II] compress error for vector of PETScWrappers::MPI::SparseMatrix

2017-06-06 Thread Weixiong Zheng
This is how I set up my sparsity pattern:

  // Choose the finite element: discontinuous for DFEM, continuous otherwise.
  if (discretization == "DFEM")
    fe = new FE_DGQ<dim> (p_order);
  else
    fe = new FE_Q<dim> (p_order);

  dof_handler.distribute_dofs (*fe);
  locally_owned_dofs = dof_handler.locally_owned_dofs ();

  DynamicSparsityPattern dsp (locally_owned_dofs);

  // DG couples neighboring cells across faces, hence the flux sparsity pattern.
  if (discretization == "DFEM")
    DoFTools::make_flux_sparsity_pattern (dof_handler, dsp);
  else
    DoFTools::make_sparsity_pattern (dof_handler, dsp);

  SparsityTools::distribute_sparsity_pattern (dsp,
                                              dof_handler.n_locally_owned_dofs_per_processor (),
                                              mpi_communicator,
                                              locally_owned_dofs);


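For context, a matrix built on this pattern would typically be initialized
as in step-40; a sketch only (system_matrix is a placeholder name, not taken
from my code):

  PETScWrappers::MPI::SparseMatrix system_matrix;

  // Rows and columns owned by this process, nonzero layout taken from the
  // distributed dynamic sparsity pattern.
  system_matrix.reinit (locally_owned_dofs,
                        locally_owned_dofs,
                        dsp,
                        mpi_communicator);

  // After assembly, off-process contributions are exchanged with compress().
  system_matrix.compress (VectorOperation::add);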

On Tuesday, June 6, 2017 at 12:07:18 PM UTC-7, Wolfgang Bangerth wrote:
>
> On 06/06/2017 12:10 PM, Weixiong Zheng wrote: 
> > It runs in serial. The error occurs when using multiple processors. 
>
> And it really happens when you call `compress()`? 
>
> The only thing I can think of is that you didn't set up the sparsity 
> pattern correctly. Can you show us the code that sets up the sparsity 
> pattern of these matrices? 
>
> Best 
>   W. 
>
> --
> Wolfgang Bangerth  email: bang...@colostate.edu
>                    www:   http://www.math.colostate.edu/~bangerth/
>



Re: [deal.II] compress error for vector of PETScWrappers::MPI::SparseMatrix

2017-06-06 Thread Wolfgang Bangerth

On 06/06/2017 12:10 PM, Weixiong Zheng wrote:

It runs in serial. The error occurs when using multiple processors.


And it really happens when you call `compress()`?

The only thing I can think of is that you didn't set up the sparsity 
pattern correctly. Can you show us the code that sets up the sparsity 
pattern of these matrices?


Best
 W.

--
Wolfgang Bangerth  email: bange...@colostate.edu
                   www:   http://www.math.colostate.edu/~bangerth/



Re: [deal.II] compress error for vector of PETScWrappers::MPI::SparseMatrix

2017-06-06 Thread Weixiong Zheng
It runs in serial. The error occurs when using multiple processors.



Re: [deal.II] compress error for vector of PETScWrappers::MPI::SparseMatrix

2017-06-06 Thread Wolfgang Bangerth

On 06/06/2017 03:13 AM, Weixiong Zheng wrote:


Now I get an error: compress() goes through for direction 0 but not for 
direction 1, failing with the message at the end of this post.

Any ideas how to fix it?

Thanks in advance!
Weixiong

[0]PETSC ERROR: --------------------- Error Message ---------------------

[0]PETSC ERROR: Argument out of range

[0]PETSC ERROR: Inserting a new nonzero at global row/column (6, 15) into matrix


You say this happens during the compress() call? That is strange. Are 
you running in parallel, or does this also happen if you run on one 
processor?
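
For what it's worth, that particular message is what PETSc typically emits 
when a value is written to an entry that the sparsity pattern handed to 
reinit() does not contain. A rough illustration of that situation (the names 
below are placeholders, not taken from your code):

  PETScWrappers::MPI::SparseMatrix matrix;
  matrix.reinit (locally_owned_dofs, locally_owned_dofs, dsp, mpi_communicator);

  // If (6,15) is not an entry of dsp, PETSc refuses the new nonzero. For a
  // row owned by another process, the value is cached locally first, so the
  // failure typically only shows up once compress() exchanges the data.
  matrix.add (6, 15, 1.0);
  matrix.compress (VectorOperation::add);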


Best
 W.

--
Wolfgang Bangerth  email: bange...@colostate.edu
                   www:   http://www.math.colostate.edu/~bangerth/

--
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en
--- 
You received this message because you are subscribed to the Google Groups "deal.II User Group" group.

To unsubscribe from this group and stop receiving emails from it, send an email 
to dealii+unsubscr...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.