fix: correct anomaly detection to compare original data vs bounds #70
Fixes #67.
Claude's commit description below:
- **Fixed `detectAnomalies()`:** now compares the actual values vs bounds instead of the predictions vs bounds (see the sketch after this list)
- **Added null handling:** properly skips out-of-sample predictions (null values)
- **Updated type signatures:** support null values in the `originalValues` array
- **Extracted testable function:** made `detectAnomalies()` independently testable
- **Added comprehensive test suite** with 11 test cases
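For reference, a minimal sketch of the corrected logic. The parameter shapes, `Bounds`/`Anomaly` types, and threshold handling here are assumptions for illustration; only `detectAnomalies()`, the original-values-vs-bounds comparison, and the null-skipping behavior come from this PR.

```ts
// Hypothetical shapes for illustration; the real signatures may differ.
interface Bounds {
  lower: number;
  upper: number;
}

interface Anomaly {
  index: number;
  value: number;
}

/**
 * Flags points in the original series that fall outside the model's bounds.
 * Null entries (out-of-sample points with no original value) are skipped.
 */
function detectAnomalies(
  originalValues: (number | null)[],
  bounds: Bounds[],
): Anomaly[] {
  const anomalies: Anomaly[] = [];
  for (let i = 0; i < originalValues.length; i++) {
    const value = originalValues[i];
    if (value === null) continue; // out-of-sample prediction, nothing to compare
    const b = bounds[i];
    if (!b) continue;
    // Compare the ORIGINAL data point to the bounds, not the prediction.
    if (value < b.lower || value > b.upper) {
      anomalies.push({ index: i, value });
    }
  }
  return anomalies;
}
```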
**Before:** anomaly detection was essentially broken; it would never detect anomalies in real data when models produced good predictions.

**After:** anomaly detection works correctly, detecting actual anomalies in the original time series data regardless of model prediction quality.
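As a rough illustration of the before/after behavior (the numbers are made up; only the idea that predictions sit inside the bounds while the original data contains a spike is from this PR):

```ts
// Original series with an obvious spike at index 2, plus one out-of-sample point.
const original: (number | null)[] = [10, 11, 50, 12, null];

// Bounds from a well-fitting model; the predictions themselves stay inside them.
const bounds = [
  { lower: 8, upper: 13 },
  { lower: 9, upper: 14 },
  { lower: 9, upper: 14 },
  { lower: 10, upper: 15 },
  { lower: 10, upper: 15 },
];

// Comparing predictions vs bounds would report nothing here;
// comparing the original data finds the spike.
console.log(detectAnomalies(original, bounds));
// => [{ index: 2, value: 50 }]
```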
📦 Published PR as canary version: `0.5.1--canary.70.15561052406.0`

✨ Test out this PR locally via: