Releases: OasisLMF/OasisPlatform
Release 1.28.6
Oasis Release v1.28.6
Docker Images (Platform)
- coreoasis/api_server:1.28.6
- coreoasis/model_worker:1.28.6
- coreoasis/model_worker:1.28.6-debian
- coreoasis/piwind_worker:1.28.6
Docker Images (User Interface)
Components
Changelogs
OasisPlatform Changelog - 1.28.6
- #962 - Remove pip from 2nd stage server build
- #939 - Fix Azure support for workers (platform 1)
- #987 - Fixed CVE issues 1.28.6
OasisLMF Changelog - 1.28.6
- #1460 - Occurrence file not found when requesting output from ktools component aalcalcmeanonly
- #1446 - Add missing loc_id check
- #1447 - use correct error_model in back allocation
- #1448 - ensure header is written in keys.csv
- #1350 - model settings - correlation settings - allow optional hazard or damage correlation value
- #1385 - Missing parquet library dependencies for gulmc in 1.27
- #1430 - FM acceptance tests failing with pandas==2.2.0
- #1467 - fix for 1 loc with no account fm terms
- #1407 - Added tests for condition coverages 1-5 financial terms
- #80 - Added fields to analysis settings schema
Release Notes
OasisPlatform Notes
Remove pip from 2nd stage server build - (PR #962)
- Fix #961
Fix Azure support for workers (platform 1) - (PR #939)
Two fixes for platform 1 workers (1.27.x, 1.28.x) running on Azure via Kubernetes:
- CeleryDB passwords with special characters (like `#`) can crash workers because these are not escaped correctly.
- Workers fail to store results in the shared-fs file mount due to `chmod` causing an exception.
Fixed CVE issues - (PR #987)
| Library | Vulnerability | Severity | Status | Installed Version | Fixed Version | Title |
|---|---|---|---|---|---|---|
| cryptography (METADATA) | CVE-2023-50782 | HIGH | fixed | 41.0.6 | 42.0.0 | python-cryptography: Bleichenbacher timing oracle attack against RSA decryption - incomplete fix for... https://avd.aquasec.com/nvd/cve-2023-50782 |
| cryptography (METADATA) | CVE-2024-26130 | HIGH | fixed | 41.0.6 | 42.0.4 | cryptography is a package designed to expose cryptographic primitives ... https://avd.aquasec.com/nvd/cve-2024-26130 |
OasisLMF Notes
Fix missing occurrence file error when aalcalcmeanonly output requested - (PR #1463)
When only `aalcalcmeanonly` output is requested and an identifier is used to select the occurrence file, a symbolic link to that file is created in the run's static directory. This fixes an issue where the symbolic link was not created in that scenario.
Added check for missing loc_ids after lookup returns keys - (PR #1446)
- Safeguard to ensure all location rows are processed by the lookup class: a return for each `loc_id` should exist in either keys.csv or keys_error.csv.
- Fixed platform testing after the release of 2.3.0.
Fix potential ZeroDivisionError during fmpy back allocation - (PR #1447)
Fix missing header in keys.csv - (PR #1448)
This fix makes sure a header is written in keys.csv even if the first block of results sent to the keys writer contains only failures.
Hazard and damage correlation values in model settings now optional - (PR #1456)
The correlation value for either damage or hazard is now optional and defaults to zero if not entered.
Fixed package requirements - (PR #1459)
- Moved PyArrow to a required package; it is needed for gulmc, which is now the default.
- Set maximum pandas version to 2.1.x (OasisLMF/OasisLMF#1466).
Adapt code to be compatible with pandas 2.2 and previous versions - (PR #1431)
- Removed some deprecated idioms.
- Fixed an issue with acc_idx when loading the account file.
fix issue with 1 location and no account terms - (PR #1467)
Makes sure the information in the account file is merged even if no financial terms are present.
Added tests for condition coverages 1-5 financial terms - (PR #1407)
Completed all units in validation/insurance_policy_coverages
Release 1.27.8
Oasis Release v1.27.8
Docker Images (Platform)
- coreoasis/api_server:1.27.8
- coreoasis/model_worker:1.27.8
- coreoasis/model_worker:1.27.8-debian
- coreoasis/piwind_worker:1.27.8
Docker Images (User Interface)
Components
Changelogs
OasisPlatform Changelog
- #962 - Remove pip from 2nd stage server build
- #939 - Fix Azure support for workers (platform 1)
- #987 - Fixed CVE issues 1.27.8
OasisLMF Changelog
- #1420 - Fix/double condtag
- #1446 - Add missing loc_id check
- #1448 - ensure header is written in keys.csv
- #1447 - use correct error_model in back allocation
- #1456 - Allow optional hazard or damage correlation value
Release Notes
OasisPlatform Notes
Remove pip from 2nd stage server build - (PR #962)
- Fix #961
Fix Azure support for workers (platform 1) - (PR #939)
Two fixes for platform 1 workers (1.27.x, 1.28.x) running on Azure via Kubernetes:
- CeleryDB passwords with special characters (like `#`) can crash workers because these are not escaped correctly.
- Workers fail to store results in the shared-fs file mount due to `chmod` causing an exception.
Fixed CVE issues - (PR #988)
| Library | Vulnerability | Severity | Status | Installed Version | Fixed Version | Title |
|---|---|---|---|---|---|---|
| Django (METADATA) | CVE-2023-46695 | HIGH | fixed | 3.2.20 | 3.2.23, 4.1.13, 4.2.7 | python-django: Potential denial of service vulnerability in UsernameField on Windows https://avd.aquasec.com/nvd/cve-2023-46695 |
| cryptography (METADATA) | CVE-2023-50782 | HIGH | fixed | 41.0.2 | 42.0.0 | python-cryptography: Bleichenbacher timing oracle attack against RSA decryption - incomplete fix for... https://avd.aquasec.com/nvd/cve-2023-50782 |
| cryptography (METADATA) | CVE-2024-26130 | HIGH | fixed | 41.0.2 | 42.0.4 | cryptography is a package designed to expose cryptographic primitives ... https://avd.aquasec.com/nvd/cve-2024-26130 |
| pyarrow (METADATA) | CVE-2023-47248 | CRITICAL | fixed | 12.0.0 | 14.0.1 | PyArrow: Arbitrary code execution when loading a malicious data file https://avd.aquasec.com/nvd/cve-2023-47248 |
Release 2.3.0
Oasis Release v2.3.0
Docker Images (Platform)
- coreoasis/api_server:2.3.0
- coreoasis/model_worker:2.3.0
- coreoasis/model_worker:2.3.0-debian
- coreoasis/piwind_worker:2.3.0
Docker Images (User Interface)
Components
Changelogs
OasisPlatform Changelog - 2.3.0
- #898 - Fix ods-tools changelog call
- #869 - Worker Controller crashing under heavy load
- #897 - Collected WebSocket bug fixes
- #912 - Fix syntax in flower chart template
- #913 - Add ENV var to disable http in websocket pod
- #918 - Fix worker_count_max assignment
- #920 - ODS Tools link in release notes points to OasisLMF repo
- #929 - Platform 2 - Keycloak DB reset on restart or redeployment.
- #893 - Support Platform 1 workers on the v2 server
- #942 - Updated Package Requirements: oasislmf==1.28.5 ods-tools==3.1.4
- #928, #681 - Added chunking options to analysis level
- #905, #786 - Fixed generate and run endpoint
- #818 - Update/remote trig python tests
- #910 - Add post analysis hook to platform 2 workflow
- #890 - Fetch a model's versions when auto-registration is disabled
- #903 - File linking OED from sub-directories fails to link inside workers
- #953 - Platform 2.1.3 - No free channel ids error
- #955 - Revert "Always post model version info on worker startup (platform 2)…
- #951 - Allow 'single instance' execution from v2 api
- #952 - Cleaner split between v1 and v2 OpenAPI schemas
- #960 - Update external images & python packages (2.3.0 release)
- #961 - Remove the python3-pip from production server images
- #966 - Fix broken swagger calls when SUB_PATH_URL=True
- #968 - Fix model registration script for v1 workers
- #872 - Investigate flower error in monitoring chart
- #871 - Handle exceptions from OedExposure on file Upload
- #702 - Fix worker controller stability
OasisLMF Changelog - 2.3.0
- #1409 - Fix server-version flag for API client runner
- #1410 - Support for AccParticipation
- #1412 - use category for peril_id in keys.csv, improve write_fm_xref_file
- #1408, #1414 - Replace single vulnerabilities through additional adjustments settings or file
- #1416 - fix useful columns when extra aggregation level is needed
- #1417 - Update CI job triggers - only test on PR or commit to main branches
- #1418 - Set ktools to 3.11.1
- #1421 - add test with location with 1 empty and 1 level 2 condtag
- #1423 - add acc participation only
- #1425 - Customise specific vulnerabilities (without providing full replacement data)
- #140 - Implement OED peril fields
- #1429 - franchise deductible
- #1430 - FM acceptance tests failing with pandas==2.2.0
- #1422 - Adjust log levels separately for modules
- #1435 - Fix/update defaults
- #1441 - Feature/lot3 merge
- #1443 - Set package versions for 2.3.0
- #1249 - Discuss documentation strategy
- #1340 - collect_unused_df in il preparation
- #1326 - Update the `KeyLookupInterface` class to have access to the `lookup_complex_config_json`
- #140 - Implement OED peril fields
- #1322 - Step policies: Allow BI ground up loss through to gross losses
- #1293 - Multiple footprint file options
- #1357 - fix permissions for docs deploy
- #1360 - Add docs about gulmc
- #1366 - Update fm supported terms document
- #1347 - Add runtime user supplied secondary factor option to plapy
- #1317 - Add post-analysis hook
- #1372 - Incorrect TIV in the summary info files
- #1377 - Clean up 'runs' dir in repo
- #1378 - Support output of overall average period loss without standard deviation calculation
- #1292 - Parquet format summary info file
- #1382 - Change vulnerability weight data type from 32-bit integer to 32-bit float in gulmc
- #1381 - Converting exposure files to previous OED version before running model
- #1394 - Net RI losses do not use -z in summarycalc
- #1398 - Allow disaggregation to be disabled
- #1399 - Fixed loading booleans from oasislmf.json
- #1088 - Correlation options for the user
- #1405 - Fix/non compulsory condtag
- #1403 - Vulnerability File Option
- #1407 - Added tests for condition coverages 1-5 financial terms
ODS_Tools Changelog - 3.2.0
- #44 - add check for conditional requirement
- #49 - Reorganize Branches
- #50 - Update CI for stable 3.1.x
- #52 - Fix/improve check perils
- #54 - Add footprint file suffix options
- #58 - Validation crash after converting account file from csv to parquet
- #60 - Add options to enable/disable post loss amplification, and set secondary and uniform post loss amplification factors
- #61 - Model_settings, allow additional properties under 'data_settings'
- #62 - Add fields for running aalcalcmeanonly ktools component
- #64 - Backward compatibility when adding new codes in OED
- #68 - Define relationships between event and footprint sets
- #70 - Fix/forex case error
- [#73](OasisLMF/ODS_Tools#73...
Release 1.28.5
Oasis Release v1.28.5
Docker Images (Platform)
- coreoasis/api_server:1.28.5
- coreoasis/model_worker:1.28.5
- coreoasis/model_worker:1.28.5-debian
- coreoasis/piwind_worker:1.28.5
Docker Images (User Interface)
Components
Changelogs
OasisLMF Changelog - 1.28.5
- #1409 - Fix server-version flag for API client runner
- #1410 - Support for AccParticipation
- #1412 - use category for peril_id in keys.csv, improve write_fm_xref_file
- #1416 - fix useful columns when extra aggregation level is needed
- #1417 - Update CI job triggers - only test on PR or commit to main branches
- #1418 - Set ktools to 3.11.1
- #1347 - Add runtime user supplied secondary factor option to plapy
- #1405 - Fix/non compulsory condtag
- #1403 - Vulnerability File Option
- #1407 - Added tests for condition coverages 1-5 financial terms
ODS_Tools Changelog - 3.1.4
- #76 - aalcalcmeanonly and ORD alt_meanonly are missing in model settings valid metrics
- #80 - Added fields to analysis settings schema
Release Notes
OasisLMF Notes
Fixed flag in APIclient to set the server version - (PR #1409)
Use by setting `oasislmf api run --server-version v1` or `oasislmf api run --server-version v2`.
Support for AccParticipation - (PR #1411)
Adds support for AccParticipation at all account levels.
Introduces new calcrules with a positive share term for each direct calcrule.
These "duplicated" calcrules have an id corresponding to their no-share-term calcrule plus 100
(e.g. deductible and limit, id 1 => deductible, limit and share, id 101).
Note that calcrules with the same terms can have different ids depending on whether they are applied at "direct" levels or at "direct layer" levels, because at "direct" levels the share is applied on top of a policy that may have to keep track of deductible, underlimit and overlimit.
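As a rough illustration of the id convention described above, a minimal sketch (the function name and constant are assumptions for illustration only, not the oasislmf API):

```python
# Illustrative sketch: a direct calcrule id maps to its "with share" variant by
# adding 100, e.g. deductible and limit (id 1) -> deductible, limit and share (id 101).
SHARE_OFFSET = 100  # assumed constant name, matches the convention described above

def calcrule_with_share(base_calcrule_id: int) -> int:
    """Return the id of the equivalent calcrule that also applies a share term."""
    return base_calcrule_id + SHARE_OFFSET

assert calcrule_with_share(1) == 101
```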
Improve memory usage when reading keys.csv - (PR #1412)
Uses the category dtype for peril_id when reading keys.csv.
Uses the index directly when creating the fm_xref file.
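A minimal pandas sketch of the kind of memory saving this refers to (illustrative only; the file path and column layout are assumptions, not the actual oasislmf reader):

```python
import pandas as pd

# Reading peril_id as a categorical dtype stores each distinct peril code once
# and references it by an integer code per row, which is where the saving comes from.
keys_df = pd.read_csv(
    "keys.csv",                           # hypothetical path for illustration
    dtype={"peril_id": "category"},       # e.g. "WTC", "WSS" stored once each
)
print(keys_df["peril_id"].memory_usage(deep=True))
```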
Fix in IL file generation with missing columns when final level of aggregation is needed - (PR #1416)
When account-level aggregation is performed but there are no terms, some needed columns were not taken from the account file, leading to an error in get_xref_df:
KeyError: "['acc_idx', 'PolNumber'] not in index"
This fixes the issue by using all useful columns when the account file is merged.
Set ktools to 3.11.1 - (PR #1418)
Add options to enable Post Loss Amplification and provide secondary and uniform factors - (PR #1369)
The requirement for an amplifications file generated by the MDK as a trigger for the execution of Post Loss Amplification (PLA) has been replaced with the `pla` flag in the analysis settings file. This allows a user to enable or disable (the default) the PLA component `plapy`.
Additionally, a secondary factor in the range [0, 1] can be specified from the command line with the argument `-f` when running `plapy`:
$ plapy -f 0.8 < gul_output.bin > plapy_output.bin
The secondary factor is applied to the deviation of the loss factor from 1. For example:
event_id | factor from model | relative factor from user | applied factor |
---|---|---|---|
1 | 1.10 | 0.8 | 1.08 |
2 | 1.20 | 0.8 | 1.16 |
3 | 1.00 | 0.8 | 1.00 |
4 | 0.90 | 0.8 | 0.92 |
Finally, an absolute, uniform, positive amplification/reduction factor can be specified from the command line with the argument `-F`:
$ plapy -F 0.8 < gul_output.bin > plapy_output.bin
This factor is applied to all losses, thus loss factors from the model (those in `lossfactors.bin`) are ignored. For example:
event_id | factor from model | uniform factor from user | applied factor |
---|---|---|---|
1 | 1.10 | 0.8 | 0.8 |
2 | 1.20 | 0.8 | 0.8 |
3 | 1.00 | 0.8 | 0.8 |
4 | 0.90 | 0.8 | 0.8 |
The absolute, uniform factor is incompatible with the relative, secondary factor. Therefore, if both are given by the user, a warning is logged and the secondary factor is ignored.
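The arithmetic behind the two tables above can be sketched as follows (a pure-Python illustration of the calculation, not the plapy implementation):

```python
def apply_relative_factor(model_factor: float, secondary: float) -> float:
    """Relative (-f) mode: scale the deviation of the model loss factor from 1."""
    return 1.0 + secondary * (model_factor - 1.0)

def apply_uniform_factor(model_factor: float, uniform: float) -> float:
    """Absolute/uniform (-F) mode: the model loss factor is ignored entirely."""
    return uniform

# Reproduces the example rows above with a user factor of 0.8
for model_factor in (1.10, 1.20, 1.00, 0.90):
    print(model_factor,
          round(apply_relative_factor(model_factor, 0.8), 2),   # 1.08, 1.16, 1.00, 0.92
          apply_uniform_factor(model_factor, 0.8))              # 0.8 in every case
```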
Fix allow CondTag column to be optional - (PR #1405)
Fixes an issue where CondTag was required in the location file if it was present in the account file, forcing users to add an empty CondTag column.
Vulnerability file options can be selected using the apposite field in analysis_settings - (PR #1406)
If "vulnerability_set" contains an identifier, the corresponding vulnerability file will be used.
Added tests for condition coverages 1-5 financial terms - (PR #1407)
Completed all units in validation/insurance_policy_coverages
ODS_Tools Notes
Fix missing model settings valid metrics - (PR #77)
- Added missing values `aalcalcmeanonly`, `alt_meanonly`, `alct_confidence` to the valid_metrics sections in model_settings.
- Added testing so missing values are caught next time.
Added fields correlation_settings, vulnerability_set, and vulnerability_adjustments to the schema - (PR #80)
Added fields used in PRs OasisLMF/OasisLMF#1401 and OasisLMF/OasisLMF#1406.
Release 1.23.21
Oasis Release v1.23.21
Docker Images (Platform)
- coreoasis/api_server:1.23.21
- coreoasis/model_worker:1.23.21
- coreoasis/model_worker:1.23.21-debian
- coreoasis/piwind_worker:1.23.21
Docker Images (User Interface)
Components
Changelogs
Release Notes
- Security patches for docker images
Release 1.15.31
Oasis Release v1.15.31
Docker Images (Platform)
- coreoasis/api_server:1.15.31
- coreoasis/model_worker:1.15.31
- coreoasis/model_worker:1.15.31-debian
- coreoasis/piwind_worker:1.15.31
Docker Images (User Interface)
Components
Changelogs
Release Notes
- Security patches for docker images
- Updated base server image to `ubuntu:22.04`
OasisLMF Notes
Assign output zeros flag to summarycalc for all reinsurance loss computes - (PR #1397)
The ktools component `summarycalc` does not output zero loss events by default. These zero loss events are required when net loss is calculated in `fmpy`. Currently, net loss is calculated in all reinsurance instances, so the `-z` flag has been assigned to all executions of `summarycalc` when computing reinsurance losses.
Release 1.28.4
Oasis Release v1.28.4
Docker Images (Platform)
- coreoasis/api_server:1.28.4
- coreoasis/model_worker:1.28.4
- coreoasis/model_worker:1.28.4-debian
- coreoasis/piwind_worker:1.28.4
Docker Images (User Interface)
Components
Changelogs
OasisLMF Changelog - 1.28.4
- #1292 - Parquet format summary info file
- #1382 - Change vulnerability weight data type from 32-bit integer to 32-bit float in gulmc
- #1381 - Converting exposure files to previous OED version before running model
- #140 - Implement OED peril fields
- #1394 - Net RI losses do not use -z in summarycalc
- #1398 - Allow disaggregation to be disabled
- #1399 - Fixed loading booleans from oasislmf.json
- #1347 - Add runtime user supplied secondary factor option to plapy
ODS_Tools Changelog - 3.1.3
- #64 - Backward compatibility when adding new codes in OED
- #68 - Define relationships between event and footprint sets
- #70 - Fix/forex case error
- #73 - Feature/peril filter
ktools Changelog - v3.11.0
- #353 - Add runtime user supplied secondary factor option to placalc
- #342 - aalcalc Performance Improvements
- #304 - CALT estimated standard error in AAL overstates observed sampling error
- #359 - CSV to BIN conversion tool for aggregate weights and vulnerability definitions.
- #361 - The `vulnerability.bin` file can be written with the wrong data types
Release Notes
OasisLMF Notes
Write summary info files in same format as ORD output reports - (PR #1380)
Summary info files are now written in the same format as the ORD output reports. Therefore, should a user request ORD output reports in parquet format, the summary info files will also be in parquet format.
Change vulnerability weight data type to 32-bit float in gulmc - (PR #1386)
The data type for vulnerability weights that are read from the binary file `weights.bin` by `gulmc` has been changed from 32-bit integer to 32-bit float.
If supported OED versions are reported in the model settings, exposure files are converted to the latest compatible OED version before running the model.
Support OED Peril terms and coverage-specific terms for all levels - (PR #1299)
- Support OED Peril terms (adding a filter so only the losses from the correct perils are part of the policy).
- Full revamp of the fm file generation step in order to preserve memory.
- Support coverage-specific terms for conditions.
- Condition logic can now handle graph structures (not just tree structures).
Also, to be able to run the tests using exposure run, perils need to be taken from LocPerilCovered.
Exposure run now has an option to use LocPerilCovered for the peril id and to use only certain perils.
Previously, during an exposure run the perils used were determined based on num_subperils and their ids were 1 to num_subperils.
With this change the user can specify the perils covered by the deterministic model via --model-perils-covered; if nothing is given, all perils in LocPerilCovered will be attributed a key and will receive a loss from the model.
It is also now possible to specify extra summary columns, so they can be seen in the loss summary at the end of an exposure run, using --extra-summary-cols.
example:
oasislmf exposure run -s ~/test/peril_test -r ~/OasisLMF/runs/peril_test --extra-summary-cols peril_id --model-perils-covered WTC
Assign output zeros flag to summarycalc for all reinsurance loss computes - (PR #1397)
The ktools component `summarycalc` does not output zero loss events by default. These zero loss events are required when net loss is calculated in `fmpy`. Currently, net loss is calculated in all reinsurance instances, so the `-z` flag has been assigned to all executions of `summarycalc` when computing reinsurance losses.
Fixed loading booleans from oasislmf.json - (PR #1399)
The function str2bool(var) converts "False" (str) to False (bool) but was not being called correctly when loading the oasislmf.json file.
So setting a boolean flag with:
{
    "do_disaggregation": "False"
}
evaluated to True, because the type is str and not bool:
> (self.do_disaggregation)
'False'
> bool(self.do_disaggregation)
True
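A minimal sketch of the kind of conversion the fix needs to apply when loading oasislmf.json (illustrative; not the exact oasislmf helper):

```python
def str2bool(var) -> bool:
    """Interpret common string spellings of booleans; pass real bools through."""
    if isinstance(var, bool):
        return var
    if isinstance(var, str):
        return var.strip().lower() in ("true", "1", "yes", "y", "t")
    return bool(var)

# With the conversion applied, a JSON value of "False" no longer evaluates as truthy:
assert str2bool("False") is False
assert str2bool(True) is True
```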
Add options to enable Post Loss Amplification and provide secondary and uniform factors - (PR #1369)
The requirement for an amplifications file generated by the MDK as a trigger for the execution of Post Loss Amplification (PLA) has been replaced with the `pla` flag in the analysis settings file. This allows a user to enable or disable (the default) the PLA component `plapy`.
Additionally, a secondary factor in the range [0, 1] can be specified from the command line with the argument `-f` when running `plapy`:
$ plapy -f 0.8 < gul_output.bin > plapy_output.bin
The secondary factor is applied to the deviation of the loss factor from 1. For example:
event_id | factor from model | relative factor from user | applied factor |
---|---|---|---|
1 | 1.10 | 0.8 | 1.08 |
2 | 1.20 | 0.8 | 1.16 |
3 | 1.00 | 0.8 | 1.00 |
4 | 0.90 | 0.8 | 0.92 |
Finally, an absolute, uniform, positive amplification/reduction factor can be specified from the command line with the argument `-F`:
$ plapy -F 0.8 < gul_output.bin > plapy_output.bin
This factor is applied to all losses, thus loss factors from the model (those in `lossfactors.bin`) are ignored. For example:
event_id | factor from model | uniform factor from user | applied factor |
---|---|---|---|
1 | 1.10 | 0.8 | 0.8 |
2 | 1.20 | 0.8 | 0.8 |
3 | 1.00 | 0.8 | 0.8 |
4 | 0.90 | 0.8 | 0.8 |
The absolute, uniform factor is incompatible with the relative, secondary factor. Therefore, if both are given by the user, a warning is logged and the secondary factor is ignored.
ODS_Tools Notes
Model setting option to set dependency between event set and footprint files - (PR #69)
- Added `valid_footprint_ids` per event set section
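A hypothetical model settings fragment illustrating the idea; only the field name `valid_footprint_ids` comes from the note above, the surrounding nesting and identifiers are assumptions:

```python
# Hypothetical sketch: each event set option lists the footprint sets it may be
# combined with. Keys and identifiers other than "valid_footprint_ids" are assumed.
model_settings = {
    "model_settings": {
        "event_set": {
            "options": [
                {"id": "P", "desc": "Probabilistic", "valid_footprint_ids": ["fp_1", "fp_2"]},
                {"id": "H", "desc": "Historic", "valid_footprint_ids": ["fp_1"]},
            ],
        },
    },
}
```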
fix case issue in forex conversion - (PR #70)
When the column case changed from lowercase to the schema case, the forex module did not follow and ran into issues when finding the columns to convert. With this fix the correct case is now used.
Add function to check if a peril is part of a peril group as defined in peril columns - (PR #73)
For example, when oed_schema.peril_filtering is run, WTC is part of all of these peril groups: ['WW2', 'WTC,WSS', 'QQ1;WW2', 'WTC'], whereas XLT would be part of none of them.
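A small pure-Python illustration of that membership check (not the ods_tools implementation; the peril group expansions used here are assumptions for the example):

```python
# Illustrative only: expand peril group codes into single perils, then test membership.
PERIL_GROUPS = {
    "WW2": {"WTC", "WEC", "WSS", "ORF"},                 # assumed expansion
    "QQ1": {"QEQ", "QFF", "QTS", "QSL", "QLF", "QLS"},   # assumed expansion
}

def expand(code: str) -> set:
    """Split a filter like 'WTC,WSS' or 'QQ1;WW2' and expand any group codes."""
    singles = set()
    for part in code.replace(";", ",").split(","):
        singles |= PERIL_GROUPS.get(part, {part})
    return singles

groups = ["WW2", "WTC,WSS", "QQ1;WW2", "WTC"]
print([g for g in groups if "WTC" in expand(g)])  # all four groups match
print([g for g in groups if "XLT" in expand(g)])  # none match
```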
ktools Notes
Add runtime user-supplied relative, secondary factor option to placalc - (PR #354)
An optional, relative flat factor can be specified by the user and applied to all loss factors with the command line argument `-f`. For example, to apply a relative secondary factor of 0.8 the following can be entered:
$ placalc -f 0.8 < gulcalc_output.bin > placalc_output.bin
The relative secondary factor must lie within the range [0, 1] and is applied to the deviation of the factor from 1. For example:
event_id | factor from model | relative factor from user | applied factor |
---|---|---|---|
1 | 1.10 | 0.8 | 1.08 |
2 | 1.20 | 0.8 | 1.16 |
3 | 1.00 | 0.8 | 1.00 |
4 | 0.90 | 0.8 | 0.92 |
Add runtime user-supplied absolute, uniform factor option to placalc
Alternatively, an absolute, uniform post loss amplification/reduction factor can be applied to all losses by the user with the command line argument `-F`. For example, to specify a uniform factor of 0.8 across all losses, the following can be entered:
$ placalc -F 0.8 < gulcalc_output.bin > placalc_output.bin
If specified, the loss fac...
Release 1.27.7
Oasis Release v1.27.7
Docker Images (Platform)
- coreoasis/api_server:1.27.7
- coreoasis/model_worker:1.27.7
- coreoasis/model_worker:1.27.7-debian
- coreoasis/piwind_worker:1.27.7
Docker Images (User Interface)
Components
Changelogs
OasisLMF Changelog - 1.27.7
- #1397 - Add output zeros flag to summarycalc for all reinsurance loss computes
- #1219 - Fix flaky checks in TestGetDataframe
- #1390 - Backport - Post analysis hook
- #1335 - Update CI - 1.27
ODS_Tools Changelog - 3.0.8
ktools Changelog - v3.11.0
- #353 - Add runtime user supplied secondary factor option to placalc
- #342 - aalcalc Performance Improvements
- #358 - Release/3.10.1
- #304 - CALT estimated standard error in AAL overstates observed sampling error
- #359 - CSV to BIN conversion tool for aggregate weights and vulnerability definitions.
- #361 - The `vulnerability.bin` file can be written with the wrong data types
- #351 - Introduce components for Post Loss Amplification
Release Notes
OasisLMF Notes
Assign output zeros flag to summarycalc for all reinsurance loss computes - (PR #1397)
The ktools component `summarycalc` does not output zero loss events by default. These zero loss events are required when net loss is calculated in `fmpy`. Currently, net loss is calculated in all reinsurance instances, so the `-z` flag has been assigned to all executions of `summarycalc` when computing reinsurance losses.
Flaky test failures - (PR #1327)
Fixed intermittent testing failures:
- Fixed NaN errors from `utils/test_data.py` - CI failure 9996230083
- Removed deadline from `test_lookup.py` - CI failure 9996208518
Implement post analysis hook - Backport 1.27.x - (PR #1390)
Model vendors can supply a custom Python module that will be run after the analysis has completed. This module will have access to the run directory, model data directory and analysis settings. It could for instance modify the output files, parse logs to produce user-friendly reports or generate plots.
The two new Oasis settings required to use this feature are similar to the ones used for the pre analysis hook.
- `post_analysis_module`: Path to the Python module containing the class.
- `post_analysis_class_name`: Name of the class.
The class must have a constructor that takes kwargs `model_data_dir`, `model_run_dir` and `analysis_settings_json`, plus a `run` method with no arguments. For example:
class MyPostAnalysis:
    def __init__(self, model_data_dir=None, model_run_dir=None, analysis_settings_json=None):
        self.model_data_dir = model_data_dir
        self.model_run_dir = model_run_dir
        self.analysis_settings_json = analysis_settings_json

    def run(self):
        # do something, e.g. modify output files or generate reports
        pass
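A hedged usage sketch based only on the constructor kwargs and `run` method described above; the paths are hypothetical and the calling code is illustrative, not the platform's actual runner:

```python
# Illustrative driver only: the platform loads the module/class named in the
# settings and invokes it roughly like this.
hook = MyPostAnalysis(
    model_data_dir="/data/model",                                    # hypothetical path
    model_run_dir="/runs/analysis_1",                                # hypothetical path
    analysis_settings_json="/runs/analysis_1/analysis_settings.json",
)
hook.run()
```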
ODS_Tools Notes
fix case issue in forex conversion - (PR #70)
When the column case changed from lowercase to the schema case, the forex module did not follow and ran into issues when finding the columns to convert. With this fix the correct case is now used.
ktools Notes
Add runtime user-supplied relative, secondary factor option to placalc - (PR #354)
An optional, relative flat factor can be specified by the user and applied to all loss factors with the command line argument `-f`. For example, to apply a relative secondary factor of 0.8 the following can be entered:
$ placalc -f 0.8 < gulcalc_output.bin > placalc_output.bin
The relative secondary factor must lie within the range [0, 1] and is applied to the deviation of the factor from 1. For example:
event_id | factor from model | relative factor from user | applied factor |
---|---|---|---|
1 | 1.10 | 0.8 | 1.08 |
2 | 1.20 | 0.8 | 1.16 |
3 | 1.00 | 0.8 | 1.00 |
4 | 0.90 | 0.8 | 0.92 |
Add runtime user-supplied absolute, uniform factor option to placalc
Alternatively, an absolute, uniform post loss amplification/reduction factor can be applied to all losses by the user with the command line argument `-F`. For example, to specify a uniform factor of 0.8 across all losses, the following can be entered:
$ placalc -F 0.8 < gulcalc_output.bin > placalc_output.bin
If specified, the loss factors from the model (those in `lossfactors.bin`) are ignored. This factor must be positive and is applied uniformly across all losses. For example:
event_id | factor from model | uniform factor from user | applied factor |
---|---|---|---|
1 | 1.10 | 0.8 | 0.8 |
2 | 1.20 | 0.8 | 0.8 |
3 | 1.00 | 0.8 | 0.8 |
4 | 0.90 | 0.8 | 0.8 |
The absolute, uniform factor is incompatible with the relative, secondary factor given above. Therefore, if both are given by the user, a warning is issued and the relative, secondary factor is ignored:
$ placalc -f 0.8 -F 0.8 < gulcalc_output.bin > placalc_output
WARNING: Relative secondary and absolute factors are incompatible
INFO: Ignoring relative secondary factor
Add tests for Post Loss Amplification (PLA) components
Acceptance tests for `placalc`, `amplificationstobin`, `amplificationstocsv`, `lossfactorstobin` and `lossfactorstocsv` have been included.
New component aalcalcmeanonly - (PR #357)
A new component `aalcalcmeanonly` calculates the overall average period loss. Unlike `aalcalc`, it does not calculate the standard deviation from the average. Therefore, it has a quicker execution time and uses less memory.
Remove ANOVA fields from Convergence Average Loss Table (CALT) - (PR #360)
The standard error in the Convergence Average Loss Table (CALT) has been found to overstate the observed sampling error. This is because the random effects model used to partition the variance into vulnerability and hazard factors requires those contributions to be random. However, the hazard element in the Oasis framework is fixed, not random: event occurrences are assigned to years in a fixed timeline. Therefore, the hazard element in the variance of the Average Annual Loss (AAL) does not reduce with increasing samples, leading to a larger standard error.
As the ANOVA (ANalysis Of VAriance) fields are not helpful in predicting AAL convergence, they have been dropped. The standard error is calculated as s / sqrt(IM), where s is the standard deviation of the annual losses, I is the total number of periods and M is the number of samples.
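Expressed as a formula, with s, I and M as defined above:

```latex
\mathrm{SE} = \frac{s}{\sqrt{I\,M}}
```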
Introduce csv <---> binary Conversion Tools for Aggregate Vulnerabilities and Weights - (PR #362)
The following components have been introduced to convert aggregate vulnerability tables between binary and csv formats:
- aggregatevulnerabilitytobin
- aggregatevulnerabilitytocsv
These can be executed from the command line as follows:
$ aggregatevulnerabilitytobin < aggregate_vulnerability.csv > aggregate_vulnerability.bin
$ aggregatevulnerabilitytocsv < aggregate_vulnerability.bin > aggregate_vulnerability.csv
Additionally, the following components have been introduced to convert vulnerability weight tables between binary and csv formats:
- weightstobin
- weightstocsv
These can be executed from the command line as follows:
$ weightstobin < weights.csv > weights.bin
$ weightstocsv < weights.bin > weights.csv
Add validation check to validatevulnerability that Vulnerability ID does not exceed maximum signed integer value - (PR #363)
A validation check has been added to `validatevulnerability`, which outputs an error message should the vulnerability ID exceed the maximum signed integer value. For example:
$ validatevulnerability < vulnerability.csv
Error: Vulnerability ID 1100000000000 on line 5 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie un...
Release 1.26.9
Oasis Release v1.26.9
Docker Images (Platform)
- coreoasis/api_server:1.26.9
- coreoasis/model_worker:1.26.9
- coreoasis/model_worker:1.26.9-debian
- coreoasis/piwind_worker:1.26.9
Docker Images (User Interface)
Components
Changelogs
- #856 - Update CI 1.26
OasisLMF Changelog - 1.26.9
- #1338 - Update CI - 1.26
- #1397 - Add output zeros flag to summarycalc for all reinsurance loss computes
ktools Changelog - v3.11.0
- #353 - Add runtime user supplied secondary factor option to placalc
- #342 - aalcalc Performance Improvements
- #358 - Release/3.10.1
- #304 - CALT estimated standard error in AAL overstates observed sampling error
- #359 - CSV to BIN conversion tool for aggregate weights and vulnerability definitions.
- #361 - The `vulnerability.bin` file can be written with the wrong data types
- #346 - Release/3.9.7
- #344 - Incorrect Values from Wheatsheaf/Per Sample Mean with Period Weights in leccalc/ordleccalc
- #351 - Introduce components for Post Loss Amplification
Release Notes
OasisLMF Notes
Assign output zeros flag to summarycalc for all reinsurance loss computes - (PR #1397)
The ktools component `summarycalc` does not output zero loss events by default. These zero loss events are required when net loss is calculated in `fmpy`. Currently, net loss is calculated in all reinsurance instances, so the `-z` flag has been assigned to all executions of `summarycalc` when computing reinsurance losses.
ktools Notes
Add runtime user-supplied relative, secondary factor option to placalc - (PR #354)
An optional, relative flat factor can be specified by the user and applied to all loss factors with the command line argument `-f`. For example, to apply a relative secondary factor of 0.8 the following can be entered:
$ placalc -f 0.8 < gulcalc_output.bin > placalc_output.bin
The relative secondary factor must lie within the range [0, 1] and is applied to the deviation of the factor from 1. For example:
event_id | factor from model | relative factor from user | applied factor |
---|---|---|---|
1 | 1.10 | 0.8 | 1.08 |
2 | 1.20 | 0.8 | 1.16 |
3 | 1.00 | 0.8 | 1.00 |
4 | 0.90 | 0.8 | 0.92 |
Add runtime user-supplied absolute, uniform factor option to placalc
Alternatively, an absolute, uniform post loss amplification/reduction factor can be applied to all losses by the user with the command line argument `-F`. For example, to specify a uniform factor of 0.8 across all losses, the following can be entered:
$ placalc -F 0.8 < gulcalc_output.bin > placalc_output.bin
If specified, the loss factors from the model (those in `lossfactors.bin`) are ignored. This factor must be positive and is applied uniformly across all losses. For example:
event_id | factor from model | uniform factor from user | applied factor |
---|---|---|---|
1 | 1.10 | 0.8 | 0.8 |
2 | 1.20 | 0.8 | 0.8 |
3 | 1.00 | 0.8 | 0.8 |
4 | 0.90 | 0.8 | 0.8 |
The absolute, uniform factor is incompatible with the relative, secondary factor given above. Therefore, if both are given by the user, a warning is issued and the relative, secondary factor is ignored:
$ placalc -f 0.8 -F 0.8 < gulcalc_output.bin > placalc_output
WARNING: Relative secondary and absolute factors are incompatible
INFO: Ignoring relative secondary factor
Add tests for Post Loss Amplification (PLA) components
Acceptance tests for `placalc`, `amplificationstobin`, `amplificationstocsv`, `lossfactorstobin` and `lossfactorstocsv` have been included.
New component aalcalcmeanonly - (PR #357)
A new component `aalcalcmeanonly` calculates the overall average period loss. Unlike `aalcalc`, it does not calculate the standard deviation from the average. Therefore, it has a quicker execution time and uses less memory.
Remove ANOVA fields from Convergence Average Loss Table (CALT) - (PR #360)
The standard error in the Convergence Average Loss Table (CALT) has been found to overstate the observed sampling error. This is because the random effects model used to partition the variance into vulnerability and hazard factors requires those contributions to be random. However, the hazard element in the Oasis framework is fixed, not random: event occurrences are assigned to years in a fixed timeline. Therefore, the hazard element in the variance of the Average Annual Loss (AAL) does not reduce with increasing samples, leading to a larger standard error.
As the ANOVA (ANalysis Of VAriance) fields are not helpful in predicting AAL convergence, they have been dropped. The standard error is calculated as s / sqrt(IM), where s is the standard deviation of the annual losses, I is the total number of periods and M is the number of samples.
Introduce csv <---> binary Conversion Tools for Aggregate Vulnerabilities and Weights - (PR #362)
The following components have been introduced to convert aggregate vulnerability tables between binary and csv formats:
- aggregatevulnerabilitytobin
- aggregatevulnerabilitytocsv
These can be executed from the command line as follows:
$ aggregatevulnerabilitytobin < aggregate_vulnerability.csv > aggregate_vulnerability.bin
$ aggregatevulnerabilitytocsv < aggregate_vulnerability.bin > aggregate_vulnerability.csv
Additionally, the following components have been introduced to convert vulnerability weight tables between binary and csv formats:
- weightstobin
- weightstocsv
These can be executed from the command line as follows:
$ weightstobin < weights.csv > weights.bin
$ weightstocsv < weights.bin > weights.csv
Add validation check to validatevulnerability that Vulnerability ID does not exceed maximum signed integer value - (PR #363)
A validation check has been added to `validatevulnerability`, which outputs an error message should the vulnerability ID exceed the maximum signed integer value. For example:
$ validatevulnerability < vulnerability.csv
Error: Vulnerability ID 1100000000000 on line 5 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Error: Vulnerability ID 1100000000000 on line 6 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Error: Vulnerability ID 1100000000000 on line 7 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Error: Vulnerability ID 1100000000000 on line 8 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Some checks have failed. Please edit input file.
Release 3.9.7 - (PR #346)
- [#343 - Empty leccalc output files when using period weights and return periods file](https://github.com/OasisLMF/ktools/releases/tag/v3.9.7)
Fix Per Sample Mean (Wheatsheaf Mean) with Period Weights Output from leccalc/ordleccalc - (PR #349)
When a period weights file was supplied by the user, the Per Sample Mean (i.e. Wheatsheaf Mean) from `leccalc` and `ordleccalc` was incorrect. After sorting the loss vector in descending order, the vector was then reorganised by period number, nullifying the sorting. This would only yield the correct results in the very rare cases when the loss value decreased with increasing period number.
As the return periods are determined by the period weights, in order to calculate the mean losses the data would need to be traversed twice: once to determine the return periods, and a second time to fill them. However, if the return periods are known in advance, i.e. when the user supplies a return periods file, the first iteration is unnecessary.
As the per sample mean with period weights does not appear to be a very useful metric, this option is only supported when a return periods file is present. Should a return periods file be missing, the following message will be written to the log file:
WARNING: Return periods file must be present if you wish to use non-uniform period weights for Wheatsheaf mean/per sample mean output.
INFO: Wheatsheaf ...
Release 1.23.20
Oasis Release v1.23.20
Docker Images (Platform)
- coreoasis/api_server:1.23.20
- coreoasis/model_worker:1.23.20
- coreoasis/model_worker:1.23.20-debian
- coreoasis/piwind_worker:1.23.20
Docker Images (User Interface)
Components
Changelogs
OasisLMF Changelog - 1.23.20
- #1337 - Update CI - 1.23
- #1397 - Add output zeros flag to summarycalc for all reinsurance loss computes
ktools Changelog - v3.11.0
- #353 - Add runtime user supplied secondary factor option to placalc
- #342 - aalcalc Performance Improvements
- #358 - Release/3.10.1
- #304 - CALT estimated standard error in AAL overstates observed sampling error
- #359 - CSV to BIN conversion tool for aggregate weights and vulnerability definitions.
- #361 - The `vulnerability.bin` file can be written with the wrong data types
- #346 - Release/3.9.7
- #344 - Incorrect Values from Wheatsheaf/Per Sample Mean with Period Weights in leccalc/ordleccalc
- #351 - Introduce components for Post Loss Amplification
Release Notes
OasisLMF Notes
Assign output zeros flag to summarycalc for all reinsurance loss computes - (PR #1397)
The ktools component `summarycalc` does not output zero loss events by default. These zero loss events are required when net loss is calculated in `fmpy`. Currently, net loss is calculated in all reinsurance instances, so the `-z` flag has been assigned to all executions of `summarycalc` when computing reinsurance losses.
ktools Notes
Add runtime user-supplied relative, secondary factor option to placalc - (PR #354)
An optional, relative flat factor can be specified by the user and applied to all loss factors with the command line argument `-f`. For example, to apply a relative secondary factor of 0.8 the following can be entered:
$ placalc -f 0.8 < gulcalc_output.bin > placalc_output.bin
The relative secondary factor must lie within the range [0, 1] and is applied to the deviation of the factor from 1. For example:
event_id | factor from model | relative factor from user | applied factor |
---|---|---|---|
1 | 1.10 | 0.8 | 1.08 |
2 | 1.20 | 0.8 | 1.16 |
3 | 1.00 | 0.8 | 1.00 |
4 | 0.90 | 0.8 | 0.92 |
Add runtime user-supplied absolute, uniform factor option to placalc
Alternatively, an absolute, uniform post loss amplification/reduction factor can be applied to all losses by the user with the command line argument `-F`. For example, to specify a uniform factor of 0.8 across all losses, the following can be entered:
$ placalc -F 0.8 < gulcalc_output.bin > placalc_output.bin
If specified, the loss factors from the model (those in `lossfactors.bin`) are ignored. This factor must be positive and is applied uniformly across all losses. For example:
event_id | factor from model | uniform factor from user | applied factor |
---|---|---|---|
1 | 1.10 | 0.8 | 0.8 |
2 | 1.20 | 0.8 | 0.8 |
3 | 1.00 | 0.8 | 0.8 |
4 | 0.90 | 0.8 | 0.8 |
The absolute, uniform factor is incompatible with the relative, secondary factor given above. Therefore, if both are given by the user, a warning is issued and the relative, secondary factor is ignored:
$ placalc -f 0.8 -F 0.8 < gulcalc_output.bin > placalc_output
WARNING: Relative secondary and absolute factors are incompatible
INFO: Ignoring relative secondary factor
Add tests for Post Loss Amplification (PLA) components
Acceptance tests for `placalc`, `amplificationstobin`, `amplificationstocsv`, `lossfactorstobin` and `lossfactorstocsv` have been included.
New component aalcalcmeanonly - (PR #357)
A new component `aalcalcmeanonly` calculates the overall average period loss. Unlike `aalcalc`, it does not calculate the standard deviation from the average. Therefore, it has a quicker execution time and uses less memory.
Remove ANOVA fields from Convergence Average Loss Table (CALT) - (PR #360)
The standard error in the Convergence Average Loss Table (CALT) has been found to overstate the observed sampling error. This is because the random effects model used to partition the variance into vulnerability and hazard factors requires those contributions to be random. However, the hazard element in the Oasis framework is fixed, not random: event occurrences are assigned to years in a fixed timeline. Therefore, the hazard element in the variance of the Average Annual Loss (AAL) does not reduce with increasing samples, leading to a larger standard error.
As the ANOVA (ANalysis Of VAriance) fields are not helpful in predicting AAL convergence, they have been dropped. The standard error is calculated as s / sqrt(IM), where s is the standard deviation of the annual losses, I is the total number of periods and M is the number of samples.
Introduce csv <---> binary Conversion Tools for Aggregate Vulnerabilities and Weights - (PR #362)
The following components have been introduced to convert aggregate vulnerability tables between binary and csv formats:
- aggregatevulnerabilitytobin
- aggregatevulnerabilitytocsv
These can be executed from the command line as follows:
$ aggregatevulnerabilitytobin < aggregate_vulnerability.csv > aggregate_vulnerability.bin
$ aggregatevulnerabilitytocsv < aggregate_vulnerability.bin > aggregate_vulnerability.csv
Additionally, the following components have been introduced to convert vulnerability weight tables between binary and csv formats:
- weightstobin
- weightstocsv
These can be executed from the command line as follows:
$ weightstobin < weights.csv > weights.bin
$ weightstocsv < weights.bin > weights.csv
Add validation check to validatevulnerability that Vulnerability ID does not exceed maximum signed integer value - (PR #363)
A validation check has been added to `validatevulnerability`, which outputs an error message should the vulnerability ID exceed the maximum signed integer value. For example:
$ validatevulnerability < vulnerability.csv
Error: Vulnerability ID 1100000000000 on line 5 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Error: Vulnerability ID 1100000000000 on line 6 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Error: Vulnerability ID 1100000000000 on line 7 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Error: Vulnerability ID 1100000000000 on line 8 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Some checks have failed. Please edit input file.
Release 3.9.7 - (PR #346)
- [#343 - Empty leccalc output files when using period weights and return periods file](https://github.com/OasisLMF/ktools/releases/tag/v3.9.7)
Fix Per Sample Mean (Wheatsheaf Mean) with Period Weights Output from leccalc/ordleccalc - (PR #349)
When a period weights file was supplied by the user, the Per Sample Mean (i.e. Wheatsheaf Mean) from `leccalc` and `ordleccalc` was incorrect. After sorting the loss vector in descending order, the vector was then reorganised by period number, nullifying the sorting. This would only yield the correct results in the very rare cases when the loss value decreased with increasing period number.
As the return periods are determined by the period weights, in order to calculate the mean losses the data would need to be traversed twice: once to determine the return periods, and a second time to fill them. However, if the return periods are known in advance, i.e. when the user supplies a return periods file, the first iteration is unnecessary.
As the per sample mean with period weights does not appear to be a very useful metric, this option is only supported when a return periods file is present. Should a return periods file be missing, the following message will be written to the log file:
WARNING: Return periods file must be present if you wish to use non-uniform period weights for Wheatsheaf mean/per sample mean output.
INFO: Wheatsheaf mean/per sample mean output will not be produced.
As outl...