> On 15 Sep 2020, at 5:40 PM, Abhyankar, Shrirang G 
> <shrirang.abhyan...@pnnl.gov> wrote:
> 
> Pierre,
>    You are right. There are a few MatMultTransposeAdd calls that may need 
> conforming layouts for the equality/inequality constraint vectors and the 
> equality/inequality constraint Jacobian matrices; I need to check whether that 
> is the case. We only have the ex1 example at the moment and need to add more. 
> We are currently working on making PDIPM more robust, and will add another 
> example along the way.
>  
> Very naive question, but given that I have a single constraint, how do I 
> split a 1 x N matrix column-wise? I thought it was not possible.
>  
> When setting the size of the constraint vector, set the local size to 1 on one 
> rank and to 0 on all the others. For the Jacobian, the local row size is 
> likewise 1 on that rank and 0 on all the others. The column layout of the 
> Jacobian should follow the layout of the vector x, so each rank sets the local 
> column size of the Jacobian to the local size of x.
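> 
> For concreteness, here is a minimal, untested sketch of that setup for a single 
> global constraint (my own code, not taken from ex1; n and N stand for the local 
> and global sizes of x, and error checking is omitted for brevity):
> 
>   #include <petsc.h>
> 
>   static PetscErrorCode SetUpSingleConstraint(PetscInt n, PetscInt N, Vec *ce, Mat *Ae)
>   {
>     PetscMPIInt rank;
> 
>     MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
>     /* Constraint vector: local size 1 on rank 0, 0 elsewhere, global size 1 */
>     VecCreate(PETSC_COMM_WORLD, ce);
>     VecSetSizes(*ce, rank ? 0 : 1, 1);
>     VecSetFromOptions(*ce);
>     /* Jacobian: row layout follows the constraint vector, column layout follows x */
>     MatCreate(PETSC_COMM_WORLD, Ae);
>     MatSetSizes(*Ae, rank ? 0 : 1, n, 1, N);
>     MatSetFromOptions(*Ae);
>     MatSetUp(*Ae);
>     return 0;
>   }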

That is assuming I don’t want x to follow the distribution of the Hessian, 
which is not my case.
Is there some plan to make PDIPM handle different layouts?
I hope I’m not the only one who thinks that having a centralized Hessian just 
because there is a single constraint is not scalable.

Thanks,
Pierre

> Shri
>  
>> On 15 Sep 2020, at 2:21 AM, Abhyankar, Shrirang G 
>> <shrirang.abhyan...@pnnl.gov> wrote:
>>  
>> Hello Pierre,
>>    PDIPM works in parallel, so you can have the Hessian, Jacobians, 
>> constraints, variables, and gradients distributed in any layout you want. If 
>> you are using a DM, you can have it generate the Hessian. 
>  
> Could you please show an example where this is the case?
> pdipm->x, which I’m assuming is a working vector, is used as input for both the 
> Hessian and the Jacobian functions, e.g., 
> https://gitlab.com/petsc/petsc/-/blob/master/src/tao/constrained/impls/ipm/pdipm.c#L369
> (Hessian) and 
> https://gitlab.com/petsc/petsc/-/blob/master/src/tao/constrained/impls/ipm/pdipm.c#L473
> (Jacobian).
> I thus doubt that it is possible to have different layouts?
> In practice, I end up with the following error when I try this (2 processes, 
> distributed Hessian with centralized Jacobian):
> [1]PETSC ERROR: --------------------- Error Message 
> --------------------------------------------------------------
> [1]PETSC ERROR: Nonconforming object sizes
> [1]PETSC ERROR: Vector wrong size 14172 for scatter 0 (scatter reverse and 
> vector to != ctx from size)
> [1]PETSC ERROR: #1 VecScatterBegin() line 96 in 
> /Users/jolivet/Documents/repositories/petsc/src/vec/vscat/interface/vscatfce.c
> [1]PETSC ERROR: #2 MatMultTransposeAdd_MPIAIJ() line 1223 in 
> /Users/jolivet/Documents/repositories/petsc/src/mat/impls/aij/mpi/mpiaij.c
> [1]PETSC ERROR: #3 MatMultTransposeAdd() line 2648 in 
> /Users/jolivet/Documents/repositories/petsc/src/mat/interface/matrix.c
> [0]PETSC ERROR: Nonconforming object sizes
> [0]PETSC ERROR: Vector wrong size 13790 for scatter 27962 (scatter reverse 
> and vector to != ctx from size)
> [1]PETSC ERROR: #4 TaoSNESFunction_PDIPM() line 510 in 
> /Users/jolivet/Documents/repositories/petsc/src/tao/constrained/impls/ipm/pdipm.c
> [0]PETSC ERROR: #5 TaoSolve_PDIPM() line 712 in 
> /Users/jolivet/Documents/repositories/petsc/src/tao/constrained/impls/ipm/pdipm.c
> [1]PETSC ERROR: #6 TaoSolve() line 222 in 
> /Users/jolivet/Documents/repositories/petsc/src/tao/interface/taosolver.c
> [0]PETSC ERROR: #1 VecScatterBegin() line 96 in 
> /Users/jolivet/Documents/repositories/petsc/src/vec/vscat/interface/vscatfce.c
> [0]PETSC ERROR: #2 MatMultTransposeAdd_MPIAIJ() line 1223 in 
> /Users/jolivet/Documents/repositories/petsc/src/mat/impls/aij/mpi/mpiaij.c
> [0]PETSC ERROR: #3 MatMultTransposeAdd() line 2648 in 
> /Users/jolivet/Documents/repositories/petsc/src/mat/interface/matrix.c
> [0]PETSC ERROR: #4 TaoSNESFunction_PDIPM() line 510 in 
> /Users/jolivet/Documents/repositories/petsc/src/tao/constrained/impls/ipm/pdipm.c
> [0]PETSC ERROR: #5 TaoSolve_PDIPM() line 712 in 
> /Users/jolivet/Documents/repositories/petsc/src/tao/constrained/impls/ipm/pdipm.c
> [0]PETSC ERROR: #6 TaoSolve() line 222 in 
> /Users/jolivet/Documents/repositories/petsc/src/tao/interface/taosolver.c
>  
> I think this can be reproduced with ex1.c by simply distributing the Hessian 
> instead of centralizing it on rank 0.
> 
> 
>> Ideally, you want the layout below, to minimize the movement of matrix/vector 
>> elements across ranks:
>> - The layout of the vector x, the bounds on x, and the gradient is the same.
>> - The row layout of the equality/inequality Jacobian is the same as the layout 
>>   of the equality/inequality constraint vector.
>> - The column layout of the equality/inequality Jacobian is the same as that of x.
>  
> Very naive question, but given that I have a single constraint, how do I 
> split a 1 x N matrix column-wise? I thought it was not possible.
>  
> Thanks,
> Pierre
> 
> 
>> - The row and column layouts of the Hessian are the same as the layout of x 
>>   (see the sketch below).
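>> 
>> A rough, untested sketch of those rules in terms of size calls (my notation, 
>> not from ex1: n/N are the local/global sizes of x, me/Me and mi/Mi those of 
>> the equality/inequality constraint vectors):
>> 
>>   VecSetSizes(x,  n,  N);           /* x, bounds on x, gradient: one layout */
>>   MatSetSizes(H,  n,  n,  N,  N);   /* Hessian: rows and columns follow x   */
>>   VecSetSizes(ce, me, Me);          /* equality constraint vector           */
>>   MatSetSizes(Ae, me, n,  Me, N);   /* rows follow ce, columns follow x     */
>>   VecSetSizes(ci, mi, Mi);          /* inequality constraint vector         */
>>   MatSetSizes(Ai, mi, n,  Mi, N);   /* rows follow ci, columns follow x     */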
>>  
>> The tutorial example ex1 is extremely small (only 2 variables), so its 
>> implementation is very simplistic; in parallel, I think it ships the 
>> constraints etc. off to rank 0. It is not an ideal example with respect to 
>> demonstrating a parallel implementation. We aim to add more examples as we 
>> develop PDIPM. If you have an example to contribute, it would be most welcome, 
>> and we would help you add it.
>>  
>> Thanks,
>> Shri
>> From: petsc-dev <petsc-dev-boun...@mcs.anl.gov> on behalf of Pierre Jolivet 
>> <pierre.joli...@enseeiht.fr>
>> Date: Monday, September 14, 2020 at 1:52 PM
>> To: PETSc Development <petsc-dev@mcs.anl.gov>
>> Subject: [petsc-dev] PDIPDM questions
>>  
>> Hello,
>> In my quest to help users migrate from Ipopt to Tao, I have a new question.
>> Looking at src/tao/constrained/tutorials/ex1.c, it seems that almost 
>> everything is centralized on rank 0 (local sizes are 0 except on rank 0).
>> I’d like to have my Hessian distributed more naturally, as in (almost?) all 
>> other SNES/TS examples, but still keep the Jacobian of my equality 
>> constraint, which is of dimension 1 x N (N >> 1), centralized on rank 0.
>> Is this possible?
>> If not, is it possible to supply the transpose of the Jacobian, of dimension 
>> N x 1, which could then be distributed row-wise like the Hessian?
>> Or maybe use some trick to distribute a MatAIJ/MatDense of dimension 1 x N 
>> column-wise? Use a MatNest with as many blocks as processes?
>>  
>> So, just to sum up, how can I have a distributed Hessian with a Jacobian 
>> with a single row?
>>  
>> Thanks in advance for your help,
>> Pierre
