Improve GP/SGP API #132

Merged Feb 1, 2024 · 26 commits

Changes from 1 commit (726c087)

Commits (26):
92f0c33
Add experts getter
relf Jan 25, 2024
01230ca
Add experts API
relf Jan 25, 2024
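These first two commits expose the trained experts of the mixture. A fragment sketch of the intended usage, mirroring the fit pattern in the test diff further down this page; the type and method names (GpMixture, experts()) are inferred from the commit messages, not verified against this PR:

    // Fragment sketch (names assumed): fit a mixture, then iterate over
    // its trained experts via the new getter.
    let moe = GpMixture::params()
        .fit(&Dataset::new(xt, yt))
        .expect("MOE fitted");
    for expert in moe.experts() {
        // Each expert's Display shows mean, corr, theta, variance and
        // likelihood, as in the assertion string in the diff below.
        println!("{}", expert);
    }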
acc5644
Add sparse method choice in py API
relf Jan 26, 2024
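For context, sparse GP implementations usually offer a choice between the FITC and VFE approximations; a hedged sketch of what such a selector can look like on the Rust side (the enum and variant names are assumptions, the actual egobox names may differ):

    /// Sketch of a sparse-method selector, inferred from the commit
    /// message. FITC (Fully Independent Training Conditional) and VFE
    /// (Variational Free Energy) are the two standard sparse GP
    /// approximations.
    pub enum SparseMethod {
        Fitc,
        Vfe,
    }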
3a689f6
Manage traces, add initial_theta and sparse_method
relf Jan 26, 2024
47fb019
Make GP n_start configurable
relf Jan 27, 2024
54bb51c
Add n_start argument to control hyperparams optimization restarts
relf Jan 27, 2024
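n_start sets how many restarts the internal hyperparameter optimization performs: more restarts improve the odds of escaping local optima of the likelihood, at extra cost. A fragment sketch of the builder-style usage these two commits suggest (Kriging and n_start() are assumed names, not verified against this PR):

    // Fragment sketch (names assumed): request 10 multistart restarts
    // of the likelihood optimization when fitting the GP.
    let gp = Kriging::params()
        .n_start(10)
        .fit(&Dataset::new(xt, yt))
        .expect("GP fitted");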
4ec0e22
Add theta tuning interface for SGP
relf Jan 29, 2024
8e1a754
Fix theta tuning initialization, refactor cobyla params
relf Jan 30, 2024
4c118cd
Renaming sparse_algorithm|parameters
relf Jan 30, 2024
b36c359
Renaming theta_init
relf Jan 30, 2024
bff20f5
Rename guess to init
relf Jan 30, 2024
f266045
Add theta_tuning in GP API
relf Jan 30, 2024
8ce0dc4
Add theta_tuning to GP
relf Jan 30, 2024
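Taken together, commits 4ec0e22 through 8ce0dc4 introduce a theta tuning setting on both SGP and GP, with an initial value for the search (init, formerly guess/theta_init). A speculative sketch of the shape such an interface can take; this is inferred from the commit messages only, and the real egobox type may well differ:

    /// Speculative sketch of a theta tuning setting (not the verified
    /// egobox definition).
    pub enum ThetaTuning {
        /// Optimize theta, starting the search from `init`.
        Optimized { init: Vec<f64> },
        /// Keep theta fixed at the given values (no optimization).
        Fixed(Vec<f64>),
    }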
79299cd
Fix cobyla maxeval parameter
relf Jan 31, 2024
b4991ea
Parallelize multistart optimizations for SGP
relf Jan 31, 2024
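Commit b4991ea parallelizes the multistart runs for SGP (ae62ce6 later does the same for GP). The general pattern is sketched below as a self-contained toy using rayon; the objective and the crude local search stand in for the actual reduced-likelihood optimization (e.g. COBYLA) and are not the library's code:

    use rayon::prelude::*;

    // Toy stand-in for the negative reduced likelihood.
    fn objective(theta: f64) -> f64 {
        (theta.ln() - 0.5).powi(2)
    }

    // Crude derivative-free local search standing in for the real
    // optimizer; each multistart run calls it from a different start.
    fn local_search(mut theta: f64) -> f64 {
        let mut step = theta * 0.5;
        for _ in 0..100 {
            for cand in [theta - step, theta + step] {
                if cand > 0.0 && objective(cand) < objective(theta) {
                    theta = cand;
                }
            }
            step *= 0.7;
        }
        theta
    }

    fn main() {
        // Multistart: run the local optimizations in parallel and keep
        // the best result across all starting points.
        let starts = vec![0.01, 0.1, 1.0, 10.0];
        let (best_theta, best_obj) = starts
            .par_iter()
            .map(|&t0| {
                let t = local_search(t0);
                (t, objective(t))
            })
            .min_by(|a, b| a.1.partial_cmp(&b.1).unwrap())
            .unwrap();
        println!("best theta = {best_theta}, objective = {best_obj}");
    }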
3029bc5
Add SparseGpx basic tutorial
relf Jan 31, 2024
8894fb6
Trained GP model stores reduced likelihood value
relf Jan 31, 2024
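Storing the reduced likelihood on the trained model is presumably what lets it surface in the improved Display output of the next commit; the likelihood=… fields in the assertion string of the diff further down this page come from that display.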
d6da0df
Improve display of trained model info
relf Jan 31, 2024
f139e01
Fix moe display test
relf Jan 31, 2024
4c9ea54
Make GP/SGP computation interruptible
relf Feb 1, 2024
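Interruptibility here presumably relies on the ctrlc crate (commit f826103 below deals with its multiple-handlers error). The usual pattern, sketched as a self-contained toy rather than the library's actual code, is a shared atomic flag set by the Ctrl-C handler and polled by the long-running fit loop:

    use std::sync::Arc;
    use std::sync::atomic::{AtomicBool, Ordering};

    fn main() {
        // Shared flag: set by the Ctrl-C handler, polled by the loop.
        let interrupted = Arc::new(AtomicBool::new(false));
        let flag = interrupted.clone();
        // ctrlc::set_handler can only be installed once per process; a
        // second call fails with ctrlc::Error::MultipleHandlers, which
        // is plausibly what commit f826103 guards against.
        ctrlc::set_handler(move || flag.store(true, Ordering::SeqCst))
            .expect("Ctrl-C handler installed");

        for iter in 0..1_000_000u64 {
            if interrupted.load(Ordering::SeqCst) {
                eprintln!("interrupted at iteration {iter}");
                break;
            }
            // ... one step of the (hyper)parameter optimization ...
        }
    }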
a5c3084
Fix double import
relf Feb 1, 2024
36ee40a
Cleanup
relf Feb 1, 2024
726c087
Remove fallible assertion in moe display test
relf Feb 1, 2024
f826103
Avoid ctrlc multiple handlers errors
relf Feb 1, 2024
cce24f7
Relax sgp noise test tolerance
relf Feb 1, 2024
ae62ce6
Add parallel multistart to GP
relf Feb 1, 2024
Remove fallible assertion in moe display test
relf committed Feb 1, 2024
commit 726c0879dc80e11f3f429e3a835eef644ee205ca
moe/src/gp_algorithm.rs — 4 changes: 3 additions & 1 deletion

@@ -1220,6 +1220,8 @@ mod tests {
         .with_rng(rng)
         .fit(&Dataset::new(xt, yt))
         .expect("MOE fitted");
-        assert_eq!("Mixture[Hard](Constant_SquaredExponentialGP(mean=ConstantMean, corr=SquaredExponential, theta=[0.03871601282054056], variance=[0.276011431746834], likelihood=454.17113736397033), Constant_SquaredExponentialGP(mean=ConstantMean, corr=SquaredExponential, theta=[0.07903503494417609], variance=[0.0077182164672893756], likelihood=436.39615700140183), Constant_SquaredExponentialGP(mean=ConstantMean, corr=SquaredExponential, theta=[0.050821466014058826], variance=[0.32824998062969973], likelihood=193.19339252734846))", moe.to_string());
+        // Values may vary depending on the platforms and linalg backends
+        // assert_eq!("Mixture[Hard](Constant_SquaredExponentialGP(mean=ConstantMean, corr=SquaredExponential, theta=[0.03871601282054056], variance=[0.276011431746834], likelihood=454.17113736397033), Constant_SquaredExponentialGP(mean=ConstantMean, corr=SquaredExponential, theta=[0.07903503494417609], variance=[0.0077182164672893756], likelihood=436.39615700140183), Constant_SquaredExponentialGP(mean=ConstantMean, corr=SquaredExponential, theta=[0.050821466014058826], variance=[0.32824998062969973], likelihood=193.19339252734846))", moe.to_string());
+        println!("Display moe: {}", moe);
     }
 }
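As the added comment says, the exact theta, variance and likelihood values depend on the platform and linear algebra backend, so pinning them in an assert_eq! made the test flaky. If some assertion is still wanted, a backend-independent alternative (a sketch, not part of this PR) is to check only the stable part of the display string:

    // Sketch: assert on the stable structure, not on exact float values.
    assert!(moe.to_string().starts_with("Mixture[Hard]("));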