Allowing separated InterfaceKernels using periodic boundaries #24140
Reposting some text to start the discussion. From working on interface methods for a while, my opinion is that there are three somewhat well-delineated categories:
These 3 categories increase in generality of problem class from top to bottom, while also increasing in implementation/architecture complexity and projection errors. I've implemented all 3 categories before in my MATLAB FEM codes and in FEAP. Within MOOSE, I see these 3 mapping onto

I finally got some time to work on this within the past week. The source code is here: https://github.com/Truster-Research-Group/moose/tree/topological-neighbor-PBC and https://github.com/ttruster/libmesh/tree/weak-enforcement-PBC. Perhaps we can open an issue to continue the discussion of this? I had also heard that the discussion of stateful mortar constraints was continuing, and I noticed a new issue was started for that: #24030. Some of the discussions to occur in the issue include
@lindsayad by glancing over these, I believe this approach would (have a good chance to) fix all three of these. From #21161, I agree that the traditional theory of CZM is that the two material points that "talk to each other" (have force interaction) remain fixed until complete fracture/separation; this is the difference from contact mechanics with friction, where the interacting points change during sliding.

The approach I'm proposing here, using the terminology of #21161, is to use the existing periodic boundaries system with a "weak-enforcement" flag to serve as a data container, or at least a means to look up the "fake neighbor" (after breaking the mesh, or just from two pairs of boundaries that were never broken). I haven't dealt with distributed meshes yet, but I would hope that by removing the "hack" and using "supported" functions from libmesh such as

Whether the ex-neighbor information is stored when the mesh is broken in

Because that's the drawback of this: while it makes the handling of CZM more robust, it also makes it "slower". Though I have no idea what fraction of time this is compared to solving Kd=F or computing K and F; maybe it doesn't "matter"...
I also forgot: I'm including in here the ability of the two sides to not be physically adjacent; the two boundaries just need to be offset by a constant vector. That's why, to me, this fits "logically" with periodic boundary conditions, and why it motivates doing something "different", i.e. not just forcing the CZM stuff to fit within the existing

In short, the difference is:

Because the "result" (second phrase) is the same for both cases, it's logical to me to use an
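To make the constant-offset idea concrete, here is a minimal standalone sketch (the struct and function names are made up for illustration and are not the actual MOOSE/libMesh API) of how a weak periodic-boundary pair could record two boundary IDs plus a constant translation vector, and use them to locate the "fake neighbor" point on the separated side:

```cpp
#include <array>
#include <cassert>

// Hypothetical model of a weak-enforcement periodic boundary pair:
// two boundary IDs related by a constant translation vector.
struct WeakPeriodicPair
{
  int myboundary;               // primary-side boundary ID
  int pairedboundary;           // separated "fake neighbor" boundary ID
  std::array<double, 3> offset; // constant vector from primary to paired side
};

// Map a physical point on the primary boundary to the matching point
// on the paired boundary. For topologically conforming but physically
// separated surfaces, this is just a rigid translation.
std::array<double, 3> map_to_paired_side(const WeakPeriodicPair & pair,
                                         const std::array<double, 3> & x)
{
  return {x[0] + pair.offset[0], x[1] + pair.offset[1], x[2] + pair.offset[2]};
}
```

In the real implementation the lookup would go through libMesh's periodic-boundary machinery; this sketch only shows the geometric relationship between the two sides.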
One more idea, from looking over those 90+ comments at #12033, and based on my list of bullets: these are a "little" more complicated to describe in the input file, because the user has to "know" the list of paired boundaries to mention in the Periodic BC (PBC) object. And then they need to add an

Maybe we can write a composite action that writes both the PBC and the IK. Then the boundaries are only mentioned once. The downside is: how to pass all the IK parameter values, since that action needs to potentially handle all possible types of IK. So we are probably stuck with just needing to add a list of the PBC using the existing

Anyways, for my physical problems, I'll be generating my
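As a rough model of what such a composite action would buy us (all names here are hypothetical, not the actual MOOSE action API): the user supplies each boundary pair exactly once, and the action expands that list into both a PBC entry and an InterfaceKernel entry:

```cpp
#include <cassert>
#include <string>
#include <utility>
#include <vector>

// Hypothetical parameter blocks a composite action would emit.
struct PBCSpec { std::string primary, secondary; };
struct IKSpec  { std::string boundary, neighbor, kernel_type; };

// Expand a single list of boundary pairs into both object types,
// so each pair is only spelled out once in the input file.
void expand_pairs(const std::vector<std::pair<std::string, std::string>> & pairs,
                  const std::string & kernel_type,
                  std::vector<PBCSpec> & pbcs,
                  std::vector<IKSpec> & iks)
{
  for (const auto & p : pairs)
  {
    pbcs.push_back({p.first, p.second});
    iks.push_back({p.first, p.second, kernel_type});
  }
}
```

The open question from the comment above remains: the action would still need some mechanism to forward arbitrary IK parameters, which this toy ignores.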
I think an action makes sense. We could leverage the existing cohesive zone action
I can work on modifying the existing master action to create both the Interface Kernel and the periodic boundary. I will also try out Distributed Meshes on my own. Are we going to enforce that all meshes generated with the
Yes, absolutely. Allowing the application of
You will be a hero to me if I can remove the hacks in #24018
@lindsayad and @roystgnr: Well I have some interesting information to report about the current version of the source code with these weak periodic boundaries. I'll have to be very specific, because I will need your help to know what in the rest of the code will need to change. If you want to see the files I can push them into my branch.
So in summary:
Thoughts on how to get those ghosts written out?
Before I go to bed, two uninformed ideas to get the ghosts written:
The second case makes sense to me, as right now I'm never checking that the periodic boundary condition is actually assigned to the variable that the Interface Kernel is acting on; I'm only using it to look up boundary pairs.
I've forgotten: what is MOOSE currently doing with PeriodicBoundary objects on a DistributedMesh? You're exactly right about the problem. DistributedMesh only knows how to delete remote elements and repartition/redistribute already-properly-ghosted elements; it doesn't yet know how to get improperly-deleted elements back if the required ghosting level is increased after the fact (by something like the post facto addition of a new periodic boundary). But IIRC we hit this issue years ago, and although MOOSE was unable to make the obvious "add all periodic BCs before
When there are "late" geometric ghosting functors to be added, like in the case of periodic boundaries, we tell the mesh to not delete any remote elements until the periodic boundaries have been added
So basically we keep the mesh serialized ... (although we probably don't do any checks for a pre-split mesh ...)
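A toy model of this "don't delete remote elements until late ghosting functors are added" pattern (a standalone simplification for illustration, not the actual libMesh `MeshBase` implementation):

```cpp
#include <cassert>
#include <iterator>
#include <set>

// Toy distributed mesh: holds element IDs plus the set this processor
// is required to keep (local elements and ghosting requirements).
class ToyMesh
{
public:
  std::set<int> elems;    // all elements currently held
  std::set<int> required; // local elements plus ghosting requirements
  bool allow_removal = true;

  // While false, delete_remote_elements() is a no-op, so elements
  // needed by a late-added periodic boundary are not lost.
  void allow_remote_element_removal(bool flag) { allow_removal = flag; }

  void delete_remote_elements()
  {
    if (!allow_removal)
      return; // deferred: keep the mesh effectively serialized
    for (auto it = elems.begin(); it != elems.end();)
      it = required.count(*it) ? std::next(it) : elems.erase(it);
  }
};
```

Typical use in this model: disable removal, register the extra ghosting needs of the periodic boundary (extend `required`), then re-enable removal and delete, so only genuinely unneeded remote elements are dropped.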
That works (from a correctness standpoint, if not an efficiency one). So we'd just need to add another special case for this type of PeriodicBoundary adder?
Yea although wouldn't this also use the
I quickly ran a test on the

Would storing the
You mean when the mesh is read back in from a CheckpointIO file? I don't see anything disabling distributed or disabling restart testing on those periodic BC regression tests, and at least some of them are running transient and are large enough not to pull in periodic neighbors "by accident", so I'd expect to be failing exodiffs in the "Distributed mesh and recover" recipe if we had a problem there. Looking at checkpoint_io.C I see we're just writing all semilocal elements (which would include all periodic neighbors) in the distributed mesh case, and we're writing everything the ghosting functors tell us we need (which should also include all periodic neighbors) in the serialized/Replicated mesh case. And then at read time I wouldn't expect to need anything except for the same "disable deleting remote elements until we've definitely redefined 'remote' correctly" workaround.
Yea some clarification on what you mean by "automatic mesh splitting" would be helpful
Using my "next" branch of compiled MOOSE with the test case above that I mention:

```
(moose_docs) ttruster@designstar:/_Prog_Repo/MOOSE/mooseCLMI/test/tests/bcs/periodic$ ../../../moose_test-opt -i periodic_bc_test.i --split-mesh 2 --split-file two Mesh/parallel_type=replicated
```

Then the error message I receive is:

I receive the same message when calling moose-test-debug. I don't see any tests in that folder "bcs/periodic" using distributed meshes, and I don't see any files in the "mesh/splitting" folder that use PBC. Is there another place to look?
Yea this is the case I referred to in #24140 (comment). If you load a pre-split mesh, which did not have any ghosting functors from periodic bcs attached to it at the time that the mesh was prepared before writing out, then we don't do any communication to recover the elements that are needed. If you do single step with
Okay, so I reran both

They both run with the opt version and can also be recovered. So if we are not worried about enabling pre-split meshes and periodic BCs together right now, then the current approach seems to pass tests. Namely, we have addressed my 4th bullet at the top: "What happens if the two sides of an "interface" live on two processors?"

For debug mode, the error message comes from libMesh in mesh/mesh_base from the line
What is the difference if I instead use
I wasn't aware of the
@hugary1995 you didn't know about
I've been presplitting my large meshes since my day 1 of MOOSE :). As Tim said, we probably should mention the existence of that option in the split mesh doco page. Anyways,
is exactly what I need to know. So yeah, I'm pretty sure for all of our problems, there will be enough memory to hold the serial mesh prior to solving the system.
So our semester is finally ending, and I was also working on interface kernels with coupled scalar variables with @ambehnam. That got me ready to post a closure to this issue. My one-line request that I'm seeking an answer for is at the bottom.

With the question of distributed vs. serialized meshes "addressed", I have collected the comments above and made a list of a dozen tasks to do for this development, and I am ready to open it as a pull request, except to settle on one point I raised above: holding the data for the weak-enforced PBC within the

The benefits of having a paired-boundaries container handled up in the
However, the drawbacks are:
So if we agree that the drawbacks outweigh the benefits of adding a paired-boundary container, then I'll move forward with opening a pull request to begin developments within the
@roystgnr is more qualified to give his opinions on the "vs
We would handle this using MOOSE restartable data if implemented at the MOOSE level
I don't honestly see the drawbacks of keeping the pairing connections in DofMap. In DofMap, MOOSE can access it because MOOSE knows about libMesh, so MOOSE can do anything. In MOOSE, libMesh can't access it because libMesh doesn't know about MOOSE, so better make sure you also define a custom GhostingFunctor if you want to be able to repartition a distributed mesh without losing connected elements.
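To illustrate why the pairing container needs an accompanying GhostingFunctor-style hook (again a standalone model for illustration, not the actual libMesh `GhostingFunctor` interface): given element-to-processor assignments and the boundary pairing, a processor owning one side of a coupled pair must be told to keep the paired-side elements it couples to, or repartitioning will delete them:

```cpp
#include <cassert>
#include <map>
#include <set>
#include <utility>
#include <vector>

// elem_owner: element ID -> owning processor
// couplings:  pairs of (primary elem, paired "fake neighbor" elem)
// Returns, for a given processor, the remote elements it must ghost
// so that interface assembly across the pairing stays possible
// after repartitioning.
std::set<int> ghosted_for_proc(int proc,
                               const std::map<int, int> & elem_owner,
                               const std::vector<std::pair<int, int>> & couplings)
{
  std::set<int> ghosts;
  for (const auto & c : couplings)
  {
    const bool owns_first = elem_owner.at(c.first) == proc;
    const bool owns_second = elem_owner.at(c.second) == proc;
    if (owns_first && !owns_second)
      ghosts.insert(c.second);  // keep the fake neighbor's element
    if (owns_second && !owns_first)
      ghosts.insert(c.first);   // and vice versa on the other side
  }
  return ghosts;
}
```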
I'm all for putting this as far upstream as possible 👍
Sounds good, we will keep with the DofMap approach as planned. This is my first time doing this:

As a related question, I had previously been able to change .gitmodules to point to my repo and execute
It's easiest to get the libMesh PR merged first. Alternatively, if you really want to see MOOSE testing before the libMesh PR is merged, @roystgnr or I could potentially push your fork's branch to libmesh/libmesh, and then you could have that libmesh update hash in your MOOSE PR. It's possible you could keep it in your own fork and update
and then carry on in MOOSE without running the full-blown
Reason

Meshes written by `--mesh-only` that contain split blocks from `BreakMeshByBlock` cannot be utilized for physical analyses with `InterfaceKernels` in a subsequent input file which reads the mesh, due to the missing element-neighbor connectivity information. This issue impacts the ability to perform cohesive zone modeling for distributed meshes. Also, there is no mechanism for inputting element-neighbor pairs other than the automatically detected pairs due to adjacency (shared nodes).

Design
- Generalize Periodic Boundary Condition (PBC) input to allow weak and strong enforcement. Strong enforcement uses DOF constraint equations (current approach); weak enforcement would use surface integrals as part of the weak form of the boundary value problem.
- Provide the names of the two topologically conforming (potentially separated) boundaries for weak enforcement by using the `AddPeriodicBCAction`, so that the pair is registered in the DOF map but does not get a constraint matrix.
- Add a method `onPeriodicBoundary` within the thread loops of assembly, with analogous behavior to `onInterface`, which handles the boundary ID of an element that lacks an adjacent neighbor but does have a separated neighbor.
- Utilize the existing `topological_neighbor` and `inverse_map` methods to handle reinitialization, and existing `InterfaceKernels` for physics.

Impact
Allows the PBC class to help with solving interface problems like cohesive zone modeling, and to treat periodic conforming meshes using interface kernels instead of nodal constraints, allowing Nitsche/DG/mortar/penalty treatments.
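For concreteness, the distinction between the two enforcement modes can be sketched as follows, where $\Gamma^-$/$\Gamma^+$ denote the paired boundaries, $\mathbf{d}$ the constant offset vector, and $\gamma$ a penalty parameter; the exact weak term depends on whether a Nitsche, DG, mortar, or penalty treatment is chosen, so the penalty form below is only one illustrative choice:

```latex
% Strong enforcement (current PBC behavior): constraint equations in the DOF map
u(\mathbf{x} + \mathbf{d}) = u(\mathbf{x}) \qquad \forall\, \mathbf{x} \in \Gamma^-

% Weak enforcement: a surface-integral contribution to the residual instead,
% shown here for a simple penalty variant with parameter \gamma
R(u, v) \;{+}{=}\; \int_{\Gamma^-} \gamma
  \bigl( u(\mathbf{x} + \mathbf{d}) - u(\mathbf{x}) \bigr)
  \bigl( v(\mathbf{x} + \mathbf{d}) - v(\mathbf{x}) \bigr)\, \mathrm{d}\Gamma
```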
References
Originated from Discussion #22948 (reply in thread)
Requires enhancements of libMesh from PR libMesh/libmesh#3525