
Optional normalization for the ranknet loss. #11272

Merged · 8 commits merged into dmlc:master from ltr-pairwise-norm on Feb 25, 2025

Conversation

trivialfis (Member) commented on Feb 21, 2025

The old implementation uses the seed (iter + 1) * 1111, while the current implementation uses iter. Otherwise, everything is the same when the set of hyper-parameters in the new document is used.
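A minimal sketch of the seeding difference described above; the function names are illustrative only and do not reflect XGBoost's actual internals.

```python
def pair_sampling_seed_old(iteration: int) -> int:
    # Old implementation: seed the pair-sampling RNG with (iter + 1) * 1111.
    return (iteration + 1) * 1111


def pair_sampling_seed_new(iteration: int) -> int:
    # Current implementation: seed directly with the iteration number.
    return iteration
```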

trivialfis requested a review from hcho3 on February 21, 2025 18:50
trivialfis changed the title from "Avoid normalization for the ranknet loss." to "Optional normalization for the ranknet loss." on Feb 24, 2025
trivialfis (Member, Author) commented:
Added a simple test.
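A hypothetical sketch (not the test added in this PR) of what such a test could look like: train the pairwise ranker with normalization on and off and check that both runs succeed and produce different models. The parameter name `lambdarank_normalization` is an assumption here.

```python
import numpy as np
import xgboost as xgb


def test_pairwise_normalization_toggle():
    rng = np.random.default_rng(1994)
    X = rng.normal(size=(256, 8))
    y = rng.integers(0, 4, size=256)
    qid = np.sort(rng.integers(0, 16, size=256))

    scores = {}
    for norm in (True, False):
        ranker = xgb.XGBRanker(
            objective="rank:pairwise",
            lambdarank_normalization=norm,  # assumed flag name
            n_estimators=4,
        )
        ranker.fit(X, y, qid=qid)
        scores[norm] = ranker.predict(X)

    # The two settings should generally yield different predictions.
    assert not np.allclose(scores[True], scores[False])
```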

trivialfis (Member, Author) commented:
Keeping the referenced issue open for now since there are unresolved issues with the GPU build. I suspect it's the RNG, but I will wait for more updates before closing.

trivialfis merged commit be83eb6 into dmlc:master on Feb 25, 2025
58 of 59 checks passed
trivialfis deleted the ltr-pairwise-norm branch on February 25, 2025 18:13