Conversation
tupek2 left a comment
Do the new preconditioners need to set height/width in their SetOperator implementations?
```cpp
void BlockTriangularPreconditioner::SetOperator(const mfem::Operator& jacobian)
{
  // Cast the supplied jacobian to a block operator object
  block_jacobian_ = dynamic_cast<const mfem::BlockOperator*>(&jacobian);
```
what will happen if someone passes a non-BlockOperator to the triangular preconditioner? ideally it should fall back to just a normal preconditioner somehow.
We can work on that in a follow-up, if you agree with that suggestion. There may be other ways to allow this flexibility.
Just tried passing a SparseMatrix in one of my tests, and I get a seg fault. The code does work when passing a single-block BlockOperator, so if we wanted we could add something in SetOperator along the lines of `BlockOperator block_jacobian(offsets); block_jacobian.SetBlock(0, 0, jacobian);`.
```cpp
mfem_solvers_[0]->SetOperator(*A11);

// Extract the diagonal of A11 (no inversion!)
mfem::HypreParVector* Md = new mfem::HypreParVector(MPI_COMM_WORLD, A11->GetGlobalNumRows(), A11->GetRowStarts());
```
can we get the communicator here from some other source? COMM_WORLD may or may not be appropriate.
```cpp
delete A21DinvA12;

// Set the Schur complement preconditioner for block (1,1)
mfem_solvers_[1]->SetOperator(*S_approx_);
```
do we need to call delete or something on S_approx_ to avoid a memory leak?
```cpp
BlockOperator J(block_offsets);

J.SetBlock(0, 0, A_hypre);
J.SetBlock(1, 1, new HypreParMatrix(*A_hypre)); // Deep copy
```

```cpp
auto pv_writer = smith::createParaviewWriter(*mesh, sols, physics_name);
pv_writer.write(0, 0.0, sols);
```
Is there an assert anywhere in this test? for example to check that the solver converged?
Add block preconditioners to be used in LinearDifferentiableBlockSolver. This includes block diagonal (additive fieldsplit) and block triangular (multiplicative fieldsplit) preconditioners for arbitrary N x N block systems, as well as Schur complement factorizations for 2x2 systems with a standard Schur complement approximation (selfp).