
[bugfix] delete parameter not initialized for triton2.2 #462

Merged (2 commits) — Feb 26, 2025
Conversation

StrongSpoon (Collaborator)

PR Category

Other

Type of Change

Bug Fix

Description

In the definition of triton.Config in Triton 2.2, the attribute enable_persistent is not accepted as an initialization parameter. It is nevertheless stored in FlagGems' config cache, which causes an error during preload when the cached entries are passed back to the constructor.
Since enable_persistent is left at its default value in most cases, I simply drop it when reconstructing triton.Config.
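To illustrate the failure mode and the fix, here is a minimal, self-contained sketch. The `Config` class below is a stand-in for triton.Config (not the real Triton code), and the cache layout of `(key, repr(value))` pairs is an assumption modeled on the preload loop in the diff:

```python
class Config:
    # Stand-in for triton.Config in Triton 2.2: enable_persistent is a stored
    # attribute but NOT an accepted __init__ parameter, so passing it raises
    # a TypeError.
    def __init__(self, kwargs, num_warps=4, num_stages=3):
        self.kwargs = kwargs
        self.num_warps = num_warps
        self.num_stages = num_stages
        self.enable_persistent = False  # set internally, never via __init__


# Hypothetical cached tuning record, stored as (key, repr(value)) pairs.
cached = [("num_warps", "8"), ("num_stages", "2"), ("enable_persistent", "False")]

# Rebuild the keyword dict the way preload does, via eval on the cached reprs.
numargs = {k: eval(v) for k, v in cached}

# The fix: drop the key that the constructor does not accept.
numargs.pop("enable_persistent", None)

cfg = Config(kwargs={}, **numargs)
print(cfg.num_warps, cfg.num_stages)  # 8 2
```

Without the `pop`, the `Config(kwargs={}, **numargs)` call would fail with an unexpected-keyword-argument error, which is exactly the preload crash this PR addresses.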

Issue

Progress

  • Change is properly reviewed (1 reviewer required, 2 recommended).
  • Change responds to an issue.
  • Change is fully covered by a UT.

Performance

@@ -93,6 +93,7 @@ def preload(self):
         kwargs[k] = eval(v)
     for k, v in cfg_ls[-attrs:]:
         numargs[k] = eval(v)
+    numargs.pop("enable_persistent", None)
Contributor:

Could you add a comment line here describing this config option for reference?

Collaborator (Author):

done

@tongxin (Contributor) left a comment:

LG

@StrongSpoon StrongSpoon merged commit 7acdae4 into master Feb 26, 2025
@StrongSpoon StrongSpoon deleted the tunerfix branch February 26, 2025 07:50