
Stress Calculation in ForceRegressionTask #302

Merged: 14 commits into IntelLabs:main on Oct 10, 2024
Conversation

melo-gonzo (Collaborator)

This PR adds the ability to perform stress calculations inside the ForceRegressionTask, and is inspired by the implementation here: https://github.com/ACEsuit/mace/blob/main/mace/modules/utils.py. This functionality is useful in downstream tasks that require this parameter, and will be helpful for future MatSciML tasks such as a StressRegressionTask or similar.
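For context, the general approach (not shown in full in this conversation) is to differentiate the predicted energy with respect to both the atomic positions and a symmetric strain applied to the cell, giving forces and virials in a single autograd call; the stress is then the virial scaled by the cell volume. Below is a minimal sketch of that idea, loosely modeled on the linked MACE utility; the function name `compute_forces_stress` and the `displacement` argument are hypothetical, and batched `(N, 3, 3)` cell tensors are assumed.

```python
import torch


def compute_forces_stress(energy, positions, displacement, cell):
    # Sketch only: gradients of the total energy with respect to the atomic
    # positions and the symmetric displacement (strain) tensor.
    grad_pos, grad_disp = torch.autograd.grad(
        outputs=[energy],
        inputs=[positions, displacement],
        grad_outputs=[torch.ones_like(energy)],
        create_graph=True,
    )
    forces = -grad_pos
    virials = grad_disp
    # Normalize the virial by the cell volume to obtain the stress tensor.
    volume = torch.linalg.det(cell).abs()
    stress = virials / volume.view(-1, 1, 1)
    return forces, virials, stress
```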

melo-gonzo requested a review from laserkelvin on October 4, 2024 at 22:59
laserkelvin (Collaborator)

I personally prefer StressfulTask

matsciml/models/base.py (review thread, resolved)
```python
volume = torch.linalg.det(cell).abs().unsqueeze(-1)
stress = virials / volume.view(-1, 1, 1)
# Zero out entries whose magnitude exceeds a hard-coded threshold.
stress = torch.where(
    torch.abs(stress) < 1e10, stress, torch.zeros_like(stress)
)
```
laserkelvin (Collaborator)
What is this hard-coded value?

For something like this, I don't actually know whether it's more performant to just clip the values instead of using torch.where, which requires allocating a zeros_like tensor.
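For illustration, here is a minimal sketch of the two strategies being compared (the 1e10 threshold is taken from the diff above; note the semantics differ, since clamping saturates at the bound rather than zeroing the entry):

```python
import torch

stress = torch.randn(4, 3, 3) * 1e12  # dummy values, some beyond the 1e10 threshold

# Current PR: zero out entries whose magnitude exceeds the threshold.
# torch.where materializes a zeros_like tensor for the "else" branch.
masked = torch.where(stress.abs() < 1e10, stress, torch.zeros_like(stress))

# Suggested alternative: clip into [-1e10, 1e10] in a single call, avoiding
# the extra allocation. Clamping saturates at the bound instead of zeroing.
clipped = torch.clamp(stress, min=-1e10, max=1e10)
```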

melo-gonzo (Collaborator, Author)

This is an inspired bit from here. I'm assuming it is helpful in tasks which compute losses from stress. Happy to remove the line and reconsider adding it back if/when we do stress-related training.

matsciml/models/base.py (review thread, resolved)
pyproject.toml (review thread, outdated, resolved)
laserkelvin (Collaborator)

For now this is fine, but I have to say that ForceRegressionTask is now reaching unmaintainable levels, now that we have both frame averaging and stress/virial logic spread out across process_embedding and other functions. We really need to think about reabstracting stuff where it makes sense. Not saying we need to do it for this PR, but sooner rather than later.

melo-gonzo (Collaborator, Author)

> For now this is fine, but I have to say that ForceRegressionTask is now reaching unmaintainable levels, now that we have both frame averaging and stress/virial logic spread out across process_embedding and other functions. We really need to think about reabstracting stuff where it makes sense. Not saying we need to do it for this PR, but sooner rather than later.

I agree, it could use some organizing.

laserkelvin (Collaborator) left a review:

LGTM, merge when ready

melo-gonzo merged commit 88218c9 into IntelLabs:main on Oct 10, 2024
2 of 4 checks passed
2 participants