Fix Loss Function to Improve Model Convergence for AutoEncoder
#1460
Conversation
/ok to test

af11a73 to 01517bb
Reviewed the changes and have made a few commits to improve the readability and remove redundant calculations. The loss now equally weights all types of features. Once CI is passing and @hsin-c is good with the changes, we can merge.
Thanks for including the use of `.sum()`. Looks good now.
/merge
Description
This PR addresses an issue in the dfencoder model related to its convergence behavior. Previously, the model exhibited difficulty in converging when trained exclusively with numerical features.
This PR fixes the way different loss types are combined in the model's loss function to ensure that backpropagation works correctly.
Note: This may alter the exact values resulting from calling `fit()` on the model. Previously, categorical features were weighted much more heavily than binary or numerical features: all numerical features shared a combined weight of 1, all binary features shared a combined weight of 1, and each categorical feature had a weight of 1. Now all features are weighted equally, which may impact the trained weights.

Closes #1455
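The weighting change described above can be sketched in plain Python. This is an illustrative sketch only, not the actual dfencoder code: the function names and the use of plain floats (rather than tensors) are assumptions made for clarity. It contrasts the old per-group averaging with the new per-feature summation.

```python
def old_combined_loss(numerical, binary, categorical):
    """Previous behaviour (illustrative): numerical and binary losses were
    averaged within their groups, so each group shared a combined weight
    of 1, while every categorical feature kept its own weight of 1."""
    num = sum(numerical) / len(numerical) if numerical else 0.0
    bin_ = sum(binary) / len(binary) if binary else 0.0
    return num + bin_ + sum(categorical)


def new_combined_loss(numerical, binary, categorical):
    """Fixed behaviour (illustrative): per-feature losses are summed
    directly, so every feature carries equal weight regardless of type."""
    return sum(numerical) + sum(binary) + sum(categorical)


# With two numerical, one binary, and one categorical feature loss,
# the old scheme averages the numerical group while the new scheme
# lets each feature contribute its full loss.
num_losses = [1.0, 2.0]
bin_losses = [0.5]
cat_losses = [1.5]

old = old_combined_loss(num_losses, bin_losses, cat_losses)  # 1.5 + 0.5 + 1.5 = 3.5
new = new_combined_loss(num_losses, bin_losses, cat_losses)  # 1.0 + 2.0 + 0.5 + 1.5 = 5.0
```

In the real model the per-feature losses are scalar tensors, so summing them (e.g. with `.sum()` on a stacked tensor) keeps gradients flowing through every term with equal weight during backpropagation.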