Thank you very much for your reply. Given this, when using MUMPS in 
parallel, can I still obtain the factor matrix (via the getFactorMatrix 
method of a PC object) and use it for matrix multiplication (e.g., via the 
matMult method of the factor matrix)? I would also like to confirm that the 
factor matrix returned is actually triangular, so that multiplying it with 
another matrix gives the intended result.
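
For concreteness, here is a minimal sketch of the usage I have in mind 
(the small SPD test matrix and all the names below are mine, standing in 
for my actual problem; whether a MUMPS-factored Mat supports mult/matMult 
at all is exactly what I would like to confirm):

import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

# Small SPD test matrix (1D Laplacian), a stand-in for my actual matrix
size = 16
C = PETSc.Mat().createAIJ(size)
C.setOption(PETSc.Mat.Option.SYMMETRIC, True)
C.setUp()
rstart, rend = C.getOwnershipRange()
for i in range(rstart, rend):
    C.setValue(i, i, 2.0)
    if i > 0:
        C.setValue(i, i - 1, -1.0)
    if i < size - 1:
        C.setValue(i, i + 1, -1.0)
C.assemble()

# Cholesky factorization through a PC with the MUMPS backend
pc = PETSc.PC().create()
pc.setOperators(C)
pc.setType(PETSc.PC.Type.CHOLESKY)
pc.setFactorSolverType('mumps')
pc.setUp()
F = pc.getFactorMatrix()  # the factored Mat

# The operations in question: matrix-vector and matrix-matrix products
x = C.createVecRight()
y = C.createVecLeft()
x.setRandom()
F.mult(x, y)      # should this give L x (or L^T x)?
P = F.matMult(C)  # and is a matrix-matrix product supported at all?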

> On Nov 16, 2025, at 08:59, Barry Smith <[email protected]> wrote:
> 
>   It appears that only MATSOLVERMKL_CPARDISO currently provides a 
> parallel backward solve. 
> 
>   The only separation of forward and backward solves in MUMPS appears 
> to be provided by the following (from its users manual):
> 
> A special case is the one where the forward elimination step is 
> performed during factorization (see Subsection 3.8), instead of during 
> the solve phase. This allows accessing the L factors right after they 
> have been computed, with a better locality, and can avoid writing the L 
> factors to disk in an out-of-core context. In this case (forward [...]
> 
> 
> 
>> On Nov 15, 2025, at 9:17 AM, Yin Shi via petsc-users 
>> <[email protected]> wrote:
>> 
>> Dear Developers,
>> 
>> In short, I need to explicitly call A.solveBackward(b, x) in parallel 
>> with petsc4py, where A is a Cholesky-factored matrix, but it seems that 
>> this is not supported (e.g., for the mumps and superlu_dist 
>> factorization solver backends). Is it possible to work around this?
>> 
>> In detail, I need to generate a set of correlated random numbers 
>> (denoted by a vector w) from an uncorrelated set (denoted by a vector 
>> n). Denote the covariance matrix of n by C (symmetric). One first 
>> factorizes C = L L^T and then solves the linear system L^T w = n for w 
>> in parallel. Is it possible to reformulate this problem so that it can 
>> be implemented with petsc4py?
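>> 
>> For concreteness, a minimal sketch of what I am trying to do (the small 
>> SPD matrix and the names below are mine, standing in for my actual 
>> covariance matrix; the last line is the call that appears unsupported):
>> 
>> import sys
>> import petsc4py
>> petsc4py.init(sys.argv)
>> from petsc4py import PETSc
>> 
>> # Stand-in for the covariance matrix C (small SPD 1D Laplacian)
>> size = 16
>> C = PETSc.Mat().createAIJ(size)
>> C.setUp()
>> rstart, rend = C.getOwnershipRange()
>> for i in range(rstart, rend):
>>     C.setValue(i, i, 2.0)
>>     if i > 0:
>>         C.setValue(i, i - 1, -1.0)
>>     if i < size - 1:
>>         C.setValue(i, i + 1, -1.0)
>> C.assemble()
>> 
>> # Cholesky factorization C = L L^T via a PC with the MUMPS backend
>> pc = PETSc.PC().create()
>> pc.setOperators(C)
>> pc.setType(PETSc.PC.Type.CHOLESKY)
>> pc.setFactorSolverType('mumps')
>> pc.setUp()
>> A = pc.getFactorMatrix()  # A is the Cholesky-factored Mat
>> 
>> b = C.createVecRight()
>> b.setRandom()  # stand-in for the uncorrelated samples n
>> w = C.createVecRight()
>> A.solveBackward(b, w)  # solve L^T w = n; this fails in parallel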
>> 
>> Thank you!
>> Yin
> 
