
Add multi-system to FEProblemSolve and deploy on Physics syntax #29021

Merged: 24 commits merged into idaholab:next from the PR_physics_multisys branch on Nov 22, 2024

Conversation

GiudGiud (Contributor) commented Nov 7, 2024

closes #29019

Includes:

  • support for setting system names in Physics, which the Physics classes can then use to assign their variables to the right systems. There is no matching of variables to systems directly from the input. I don't think all options are needed for the main class.
  • a reworked FEProblemSolve that includes the fixed point iteration (or once-through loop) using a Convergence object for each system (see the sketch after this list)
  • fixes for residual debugging with multiple systems
  • support for using the dynamic pressure when solving the NS equations. This is a small but consequential change, and I need it for the same project as this.
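
A minimal sketch of what that loop amounts to conceptually. This is an illustration only, not the actual FEProblemSolve code: the function name, the std::function-based signature, and the convergence callback are all assumptions made for the example.

#include <functional>
#include <vector>

// Sketch only: a fixed point (or once-through) sweep over multiple systems.
// Every name here is an assumption for illustration, not MOOSE API.
bool solveMultiSystemFixedPoint(const std::vector<std::function<bool()>> & system_solves,
                                const std::function<bool(unsigned int)> & converged,
                                unsigned int max_iterations)
{
  for (unsigned int it = 0; it < max_iterations; ++it)
  {
    bool all_solved = true;

    // Solve each system in turn, in the order given
    for (const auto & solve : system_solves)
      all_solved = solve() && all_solved;

    // With a single iteration this degenerates to the "once-through" loop
    if (max_iterations == 1)
      return all_solved;

    // Otherwise, ask a fixed-point convergence check whether to stop
    if (all_solved && converged(it))
      return true;
  }
  return false;
}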

Next items we need (TODO but not in this PR?)

  • Nonlinear convergence objects for each NL system that don't fight over the EquationSystems parameter (currently an issue).
    This requires this libMesh PR: Add a parameter object on the system, use it in derived classes libMesh/libmesh#4007
  • A decent convergence object for multi-system fixed point. Right now, if you tried to use the nonlinear residual (as for MultiApp iterations), the iteration would always be considered converged: you just solved that nonlinear system, and you did not recompute its residual after solving the other nonlinear systems. (See the sketch below.)
  • Relatedly, an execute_on for multi-system fixed-point convergence checks (needed to update any Postprocessor during that convergence check) or for post-system-1 solves (a la post-Executor style)

For these last two tasks, I think the solutions will be fairly close to what gets used for MultiAppFixedPointConvergence when that gets figured out.
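
To make the second bullet concrete, here is a hypothetical sketch (names and signature invented for illustration; this is not a proposed MOOSE API) of what a meaningful multi-system fixed-point check has to do: re-evaluate every system's residual at the current coupled solution after the full sweep, rather than reuse the residual each system cached at the end of its own solve, which is small by construction.

#include <functional>
#include <vector>

// Hypothetical check, for illustration only. Each entry of residual_evals must
// RE-EVALUATE ||R_i(u)|| for system i at the current coupled solution. Reusing
// the residual a system cached at the end of its own nonlinear solve would make
// this check trivially pass, since that system was just converged in isolation.
bool multiSystemFixedPointConverged(const std::vector<std::function<double()>> & residual_evals,
                                    const double tolerance)
{
  for (const auto & recompute_residual : residual_evals)
    if (recompute_residual() > tolerance) // evaluated after ALL systems solved
      return false;
  return true;
}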

GiudGiud self-assigned this Nov 7, 2024
GiudGiud force-pushed the PR_physics_multisys branch 6 times, most recently from bda2b80 to 665b214 on November 12, 2024 at 17:05
joshuahansel (Contributor) left a comment:

It's looking good!

Inline review threads (resolved):
  • framework/src/executioners/FEProblemSolve.C
  • framework/include/executioners/FEProblemSolve.h
  • framework/include/physics/PhysicsBase.h
  • framework/src/problems/FEProblemBase.C
  • framework/src/actions/AddDefaultConvergenceAction.C
  • framework/src/executioners/MultiSystemSolveObject.C
  • framework/src/physics/PhysicsBase.C (2 threads)
GiudGiud force-pushed the PR_physics_multisys branch 4 times, most recently from abbcc10 to b2069b1 on November 12, 2024 at 22:56
moosebuild (Contributor) commented Nov 13, 2024

Job Documentation, step Docs: sync website on e8e9ecc wanted to post the following:

View the site here

This comment will be updated on new commits.

idaholab deleted a comment from moosebuild Nov 13, 2024
GiudGiud force-pushed the PR_physics_multisys branch from a64d0c1 to 7dc574e on November 14, 2024 at 18:23
moosebuild (Contributor) commented Nov 16, 2024

Job Coverage, step Generate coverage on e8e9ecc wanted to post the following:

Framework coverage

        8a5faa (Total)   #29021 e8e9ec (Total)   +/-      New
Rate    85.14%           85.13%                  -0.01%   82.14%
Hits    107324           107450                  +126     207
Misses  18727            18766                   +39      45

Diff coverage report

Full coverage report

Modules coverage

Contact

        8a5faa (Total)   #29021 e8e9ec (Total)   +/-      New
Rate    90.41%           90.38%                  -0.03%   66.67%
Hits    4957             4959                    +2       4
Misses  526              528                     +2       2

Diff coverage report

Full coverage report

Heat transfer

        8a5faa (Total)   #29021 e8e9ec (Total)   +/-      New
Rate    88.81%           88.81%                  +0.00%   100.00%
Hits    4404             4406                    +2       2
Misses  555              555                     -        0

Diff coverage report

Full coverage report

Navier stokes

        8a5faa (Total)   #29021 e8e9ec (Total)   +/-      New
Rate    84.68%           84.78%                  +0.10%   96.74%
Hits    17182            17384                   +202     356
Misses  3108             3120                    +12      12

Diff coverage report

Full coverage report

Optimization

        8a5faa (Total)   #29021 e8e9ec (Total)   +/-      New
Rate    88.75%           88.73%                  -0.02%   85.71%
Hits    2035             2039                    +4       6
Misses  258              259                     +1       1

Diff coverage report

Full coverage report

Scalar transport

        8a5faa (Total)   #29021 e8e9ec (Total)   +/-      New
Rate    88.86%           88.89%                  +0.03%   100.00%
Hits    367              368                     +1       1
Misses  46               46                      -        0

Diff coverage report

Full coverage report

Full coverage reports: Reports

Warnings

  • framework new line coverage rate 82.14% is less than the suggested 90.0%
  • contact new line coverage rate 66.67% is less than the suggested 90.0%
  • optimization new line coverage rate 85.71% is less than the suggested 90.0%

This comment will be updated on new commits.

GiudGiud (Author) commented:

Still need to add documentation and fix the odd straggler tests, but otherwise this is ready to be looked at.

GiudGiud force-pushed the PR_physics_multisys branch from 6d60fe9 to 8f0ff16 on November 18, 2024 at 18:39
Inline review threads (resolved):
  • framework/src/executioners/FEProblemSolve.C
  • framework/include/executioners/FEProblemSolve.h (2 threads)
  • framework/src/executioners/MultiSystemSolveObject.C
  • test/tests/minimal_app/tests
lindsayad (Member) left a comment:

I think this PR should be split into two. Can we change the title to highlight some of the more fundamental additions/changes in this PR? The things I'm concerned about are separate from the Physics classes.

I also do not like that we have hard-coded a certain type of fixed point iteration (for which, conceptually, we would hope to have something like a FixedPointSolveObject electable from the input file) in an FEProblemSolve object.

grmnptr (Contributor) commented Nov 18, 2024

I think the same issues will come up with this:
#28821

Maybe the excuse there is that currently there is no executioner in the framework that can run a linear system on its own.

GiudGiud changed the title from "Add multi-system to Physics syntax" to "Add multi-system to FEProblemSolve and deploy on Physics syntax" Nov 18, 2024
GiudGiud (Author) commented Nov 18, 2024

> Can we change the title to highlight some of the more fundamental additions/changes in this PR? The things I'm concerned about are separate from the Physics classes.

Done.

> I also do not like that we have hard-coded a certain type of fixed point iteration (for which conceptually we would hope to have something like a FixedPointSolveObject electable from the input file) in a FEProblemSolve object

I have hard-coded the simplest one, both conceptually and for a user (no parameters). "Electing it from the input file" is the plan for Executors.
If you recall, I reached out before transforming FEProblemSolve. I'm happy to rework this if, in the end, we don't like that FEProblemSolve took on a multi-system solve (even though conceptually FEProblem does not mean single-system). There are 3 options:

  • create 2 new executioners, SteadyMultiSystem and TransientMultiSystem, which create a different solve object. Lots of code duplication down that path, imo. Even with lots of refactoring of the executioners to limit the code duplication, just the solve objects would be tons of duplication.
  • what I did (minimal duplication, no extra option)
  • modify Steady and Transient so they do NOT always create an FEProblemSolve, but instead offer some options for the solve object, selectable from the input file. So turn this:
Steady::Steady(const InputParameters & parameters)
  : Executioner(parameters),
    _problem(_fe_problem),
    _feproblem_solve(*this),
    _system_time(getParam<Real>("time")),
    _time_step(_problem.timeStep()),
    _time(_problem.time()),
    _output_iteration_number(0)
{
  _fixed_point_solve->setInnerSolve(_feproblem_solve);
}

into this:

Steady::Steady(const InputParameters & parameters)
  : Executioner(parameters),
    _problem(_fe_problem),
    _system_time(getParam<Real>("time")),
    _time_step(_problem.timeStep()),
    _time(_problem.time()),
    _output_iteration_number(0)
{
  if (_using_multi_system)
    _inner_solve = std::make_shared<MultiSystemSolve>(...);
  else
    _inner_solve = std::make_shared<FEProblemSolve>(...);
}

Is that preferred? That's the design I used for MultiApp fixed point iterations, so I am familiar with it.
I opted against this design because of:

  • the duplication of the FEProblemSolve code
  • it's not the right design for handling parameters. It worked fine for Secant/Steffensen because they use the same parameters as Picard, but we got lucky
  • the options for algorithms other than Picard for multiapp coupling / fixed point are hardly used, and they caused some unhappiness and complaints during the executor rework. The deeper we expand this solve-object design, the more complex the code is to re-create as Executors. What I added here, however, is quite simple: just a loop
  • what I did is all we need for MSFR, and all that was needed for SAM too. It is not enough for coding SIMPLE, but then again for SIMPLE we are creating new executioners, not just new solve objects, so until we decide we ONLY want a new SolveObject, the effort is not there

Or do you have a 4th option?

GiudGiud force-pushed the PR_physics_multisys branch from ab87457 to 762be82 on November 18, 2024 at 23:41
idaholab deleted a comment from moosebuild Nov 18, 2024
GiudGiud force-pushed the PR_physics_multisys branch 2 times, most recently from e9ac599 to d4eedcc on November 19, 2024 at 00:15
- remove boolean that enables legacy behavior
- rename a parameter and re-type it to forward the selection of the forward solve to the list of nonlinear systems

Co-authored-by: Alex Lindsay <[email protected]>
GiudGiud (Author) commented Nov 19, 2024

@lewisgross1296 this is the PR I mentioned

grmnptr (Contributor) left a comment:

Very nice generalization!

Inline review threads (resolved):
  • framework/src/executioners/MultiSystemSolveObject.C
  • framework/src/problems/FEProblemBase.C
  • framework/src/executioners/FEProblemSolve.C
- fix indexing of linear systems
- doco fix
GiudGiud force-pushed the PR_physics_multisys branch from edf3b27 to e8e9ecc on November 22, 2024 at 02:02
GiudGiud requested a review from grmnptr November 22, 2024 16:34
grmnptr (Contributor) left a comment:

Looks good to me, thanks! Failure is unrelated.

I will add the example for a non-physics-based multi-linear-system problem in
#28821

GiudGiud merged commit 367b3c0 into idaholab:next Nov 22, 2024 (50 of 51 checks passed)
GiudGiud deleted the PR_physics_multisys branch November 22, 2024 18:39
GiudGiud (Author) commented:

Thanks for the reviews! I'll work on linear scalar asap.

Linked issue closed by this pull request: Add support for multi-system to physics classes (#29019)