
Find loss value on end point when bound limit is reached #14

Open
TorkelE opened this issue Apr 18, 2024 · 4 comments
Labels
enhancement New feature or request

Comments


TorkelE commented Apr 18, 2024

So, if you look at the output of the get_interval function, you can actually find the exact parameter points (and loss function values) where the interval was reached (in profilePoints). However, this does not seem to be possible when the edge of the scan boundary is reached instead?

E.g. for something like this (profile in blue, critical loss value in red)
[attached image: the profile plot]
We do not find a value on the left, but just note that the scan boundary was reached.

Is there still a way to recover the best value of the loss function here (essentially the end point of the blue line)?

My reasoning is that it is not always trivial to know what critical value to use. Here, I would first run one scan (with loss_crit = Inf) to find the maximum values at the boundaries (if these are indeed maximum values, but in my case I think so). Next, I could use this (and the loss value at the starting point) to select a good loss_crit for a follow-up scan.

@ivborissov (Collaborator)

That's a good remark. We don't save it, but indeed it would be useful. Currently you can only extract the best value of the parameter of interest, i.e. the value that was reached during the search: result[1].supreme.

As for the loss_crit value: how do you determine which value to use? Usually for likelihood-based CIs the threshold is set as optimal_loss + a quantile of the chi-squared distribution.
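The likelihood-ratio threshold mentioned here can be computed directly. A minimal Python sketch, assuming scipy is available (the function name likelihood_threshold is just for illustration, not part of any package discussed here):

```python
from scipy.stats import chi2

def likelihood_threshold(optimal_loss, alpha=0.95, df=1):
    """Critical loss value for a likelihood-based confidence interval:
    the optimal loss plus the alpha-quantile of the chi-squared
    distribution with df degrees of freedom (df=1 for a pointwise
    interval on a single parameter)."""
    return optimal_loss + chi2.ppf(alpha, df)

# For a 95% pointwise CI on one parameter, the offset over the
# optimal loss is chi2.ppf(0.95, 1), roughly 3.84.
loss_crit = likelihood_threshold(optimal_loss=10.0)
```

With df=1 and alpha=0.95 this reproduces the familiar "optimum + 3.84" rule; simultaneous intervals over k parameters would use df=k instead.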

ivborissov added the enhancement label Apr 19, 2024

TorkelE commented Apr 19, 2024

Yes. I think there is some other useful information that can be obtained from it. E.g. if the right min-loss is way higher than the left, and the left is similar-ish to the optimum, you probably have a diagram like the one above, which indicates practical non-identifiability.

Right now I have access to the true loss value (since I am working with simulated data), so I can use it (because of noise, it is typically a bit worse than the optimum, but not by much). I mostly use it because it is an unbiased value that more or less works all of the time. "optimal_loss + quantile of the chisq distribution" sounds like something that makes sense though; maybe I should use that instead.

@ivborissov (Collaborator)

Yes, that's a commonly used approach. E.g. in https://academic.oup.com/bioinformatics/article/25/15/1923/213246#394197733


TorkelE commented Apr 22, 2024

Thanks for the reference, I'll read that one.
