
Releases: OasisLMF/OasisPlatform

Release 1.15.31

06 Dec 12:58

Oasis Release v1.15.31

Docker Images (Platform)

Docker Images (User Interface)

Components

Changelogs

Release Notes

  • Security patches for docker images
  • Updated base server image to ubuntu:22.04

OasisLMF Notes

Assign output zeros flag to summarycalc for all reinsurance loss computes - (PR #1397)

The ktools component summarycalc does not output zero-loss events by default. These zero-loss events are required when net losses are computed in fmpy. As net loss is currently computed in all reinsurance runs, the -z flag has been assigned to all executions of summarycalc when computing reinsurance losses.
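
As a purely illustrative sketch (the companion flags shown are assumptions based on standard ktools summarycalc usage and are not taken from this note; only -z is the new part), a reinsurance summary step might now look like:

$ summarycalc -f -z -1 ri_summary1.bin < ri_fm_output.bin

Here -f reads an fm/fmpy loss stream, -1 names the output for summary set 1, and -z keeps zero-loss events in the stream so that fmpy can compute net losses downstream.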

Release 1.28.4

20 Nov 12:24

Oasis Release v1.28.4

Docker Images (Platform)

Docker Images (User Interface)

Components

Changelogs

OasisLMF Changelog - 1.28.4

  • #1292 - Parquet format summary info file
  • #1382 - Change vulnerability weight data type from 32-bit integer to 32-bit float in gulmc
  • #1381 - Converting exposure files to previous OED version before running model
  • #140 - Implement OED peril fields
  • #1394 - Net RI losses do not use -z in summarycalc
  • #1398 - Allow disaggregation to be disabled
  • #1399 - Fixed loading booleans from oasislmf.json
  • #1347 - Add runtime user supplied secondary factor option to plapy

ODS_Tools Changelog - 3.1.3

  • #64 - Backward compatibility when adding new codes in OED
  • #68 - Define relationships between event and footprint sets
  • #70 - Fix/forex case error
  • #73 - Feature/peril filter

ktools Changelog - v3.11.0

  • #353 - Add runtime user supplied secondary factor option to placalc
  • #342 - aalcalc Performance Improvements
  • #304 - CALT estimated standard error in AAL overstates observed sampling error
  • #359 - CSV to BIN conversion tool for aggregate weights and vulnerability definitions.
  • #361 - The vulnerability.bin file can be written with the wrong data types

Release Notes

OasisLMF Notes

Write summary info files in same format as ORD output reports - (PR #1380)

Summary info files are now written in the same format as the ORD output reports. Therefore, should a user request ORD output reports in parquet format, the summary info files will also be in parquet format.

Change vulnerability weight data type to 32-bit float in gulmc - (PR #1386)

The data type for vulnerability weights that are read from the binary file weights.bin by gulmc has been changed from 32-bit integer to 32-bit float.

If supported OED versions are reported in the model settings, exposure files are converted to the latest compatible OED version before running the model.

Support OED peril terms and coverage-specific terms for all levels - (PR #1299)

  • support OED peril terms (adding a filter so that only losses from the correct perils contribute to the policy)
  • full revamp of the fm file generation step in order to conserve memory
  • support coverage-specific terms for conditions
  • allow the condition logic to handle graph structures (not just tree structures)

Also, to be able to run our tests using exposure run, perils need to be taken from LocPerilsCovered, so an option has been added to exposure run to use LocPerilsCovered for the peril id and to restrict the run to certain perils.
Previously, the perils used during an exposure run were determined by num_subperils, with ids running from 1 to num_subperils.
With this change, the user can specify the perils covered by the deterministic model via --model-perils-covered; if nothing is given, every peril in LocPerilsCovered is attributed a key and receives a loss from the model.

It is also now possible to specify extra summary columns so that they appear in the loss summary at the end of an exposure run, using --extra-summary-cols.

Example:

oasislmf exposure run -s ~/test/peril_test -r ~/OasisLMF/runs/peril_test --extra-summary-cols peril_id --model-perils-covered WTC

Assign output zeros flag to summarycalc for all reinsurance loss computes - (PR #1397)

The ktools component summarycalc does not output zero-loss events by default. These zero-loss events are required when net losses are computed in fmpy. As net loss is currently computed in all reinsurance runs, the -z flag has been assigned to all executions of summarycalc when computing reinsurance losses.

Fixed loading booleans from oasislmf.json - (PR #1399)

The function str2bool(var) converts "False" (str) to False (bool), but it was not being correctly applied to values read from the oasislmf.json file.

So setting a boolean flag with:

{
  "do_disaggregation": "False"
}

evaluated to True, because the value is a non-empty string rather than a bool:

> (self.do_disaggregation)
'False'
> bool(self.do_disaggregation)
True
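
A minimal sketch of the intended behaviour, assuming a helper equivalent to the str2bool described above (the helper and the default value shown here are illustrative, not the exact oasislmf implementation):

import json

def str2bool(var):
    # Illustrative stand-in for the str2bool helper described above:
    # map common string spellings of booleans to real bools.
    if isinstance(var, bool):
        return var
    return str(var).strip().lower() in ("true", "1", "yes", "y")

with open("oasislmf.json") as f:
    settings = json.load(f)

# Before the fix, the raw string "False" was used directly and, being a
# non-empty string, behaved as True. Passing it through str2bool gives False.
do_disaggregation = str2bool(settings.get("do_disaggregation", True))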

Add options to enable Post Loss Amplification and provide secondary and uniform factors - (PR #1369)

The requirement for an amplifications file generated by the MDK to trigger the execution of Post Loss Amplification (PLA) has been replaced with the pla flag in the analysis settings file. This allows a user to enable or disable (default) the PLA component plapy.

Additionally, a secondary factor in the range [0, 1] can be specified from the command line with the argument -f when running plapy:

$ plapy -f 0.8 < gul_output.bin > plapy_output.bin

The secondary factor is applied to the deviation of the loss factor from 1. For example:

event_id   factor from model   relative factor from user   applied factor
1          1.10                0.8                         1.08
2          1.20                0.8                         1.16
3          1.00                0.8                         1.00
4          0.90                0.8                         0.92

Finally, an absolute, uniform, positive amplification/reduction factor can be specified from the command line with the argument -F:

$ plapy -F 0.8 < gul_output.bin > plapy_output.bin

This factor is applied to all losses, thus loss factors from the model (those in lossfactors.bin) are ignored. For example:

event_id   factor from model   uniform factor from user   applied factor
1          1.10                0.8                        0.8
2          1.20                0.8                        0.8
3          1.00                0.8                        0.8
4          0.90                0.8                        0.8

The absolute, uniform factor is incompatible with the relative, secondary factor. Therefore, if both are given by the user, a warning is logged and the secondary factor is ignored.
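
The arithmetic behind the two options can be sketched as follows (a simple illustration that reproduces the tables above; it is not the plapy implementation itself):

# Reproduce the applied factors from the tables above.
model_factors = [1.10, 1.20, 1.00, 0.90]

def apply_secondary(factor, secondary):
    # Relative secondary factor (-f): scale the deviation of the factor from 1.
    return 1.0 + secondary * (factor - 1.0)

def apply_uniform(factor, uniform):
    # Absolute uniform factor (-F): the model factor is ignored entirely.
    return uniform

for f in model_factors:
    print(round(apply_secondary(f, 0.8), 2), apply_uniform(f, 0.8))
# Secondary: 1.08, 1.16, 1.0, 0.92 -- uniform: 0.8 in every case.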

ODS_Tools Notes

Model setting option to set dependency between event set and footprint files - (PR #69)

  • Added valid_footprint_ids per event set section

Fix case issue in forex conversion - (PR #70)

When column names were changed from lowercase to the schema case, the forex module did not follow suit and had issues finding the columns to convert. With this fix, the correct case is now used.

Add function to check if a peril is part of a peril group as defined in peril columns - (PR #73)

For example, when oed_schema.peril_filtering is run:

  • WTC is part of all of these peril groups: ['WW2', 'WTC,WSS', 'QQ1;WW2', 'WTC']
  • XLT is part of none of them
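
A hedged sketch of the kind of check this enables (the group expansion map below is purely illustrative and is not the OED schema; the real logic lives in oed_schema.peril_filtering):

import re

# Illustrative only: an assumed expansion of one OED peril group, not the real schema.
ILLUSTRATIVE_GROUPS = {"WW2": {"WTC", "WEC"}}

def peril_covered(peril, entry, groups=ILLUSTRATIVE_GROUPS):
    # An entry may be a single peril, a peril group, or a ','/';'-separated list.
    covered = set()
    for code in re.split("[,;]", entry):
        code = code.strip()
        covered |= groups.get(code, {code})
    return peril in covered

entries = ["WW2", "WTC,WSS", "QQ1;WW2", "WTC"]
print(all(peril_covered("WTC", e) for e in entries))  # True
print(any(peril_covered("XLT", e) for e in entries))  # False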

ktools Notes

Add runtime user-supplied relative, secondary factor option to placalc - (PR #354)

An optional, relative flat-factor can be specified by the user and applied to all loss factors with the command line argument -f. For example, to apply a relative secondary factor of 0.8 the following can be entered:

$ placalc -f 0.8 < gulcalc_output.bin > placalc_output.bin

The relative secondary factor must lie within the range [0, 1] and is applied to the deviation of the factor from 1. For example:

event_id   factor from model   relative factor from user   applied factor
1          1.10                0.8                         1.08
2          1.20                0.8                         1.16
3          1.00                0.8                         1.00
4          0.90                0.8                         0.92

Add runtime user-supplied absolute, uniform factor option to placalc

Alternatively, an absolute, uniform post loss amplification/reduction factor can be applied to all losses by the user with the command line argument -F. For example, to specify a uniform factor of 0.8 across all losses, the following can be entered:

$ placalc -F 0.8 < gulcalc_output.bin > placalc_output.bin

If specified, the loss fac...


Release 1.27.7

09 Nov 13:50

Oasis Release v1.27.7

Docker Images (Platform)

Docker Images (User Interface)

Components

Changelogs

OasisLMF Changelog - 1.27.7

  • #1397 - Add output zeros flag to summarycalc for all reinsurance loss computes
  • #1219 - Fix flaky checks in TestGetDataframe
  • #1390 - Backport - Post analysis hook
  • #1335 - Update CI - 1.27

ODS_Tools Changelog - 3.0.8

  • #51 - Update CI for stable 3.0.x
  • #70 - Fix/forex case error

ktools Changelog - v3.11.0

  • #353 - Add runtime user supplied secondary factor option to placalc
  • #342 - aalcalc Performance Improvements
  • #358 - Release/3.10.1
  • #304 - CALT estimated standard error in AAL overstates observed sampling error
  • #359 - CSV to BIN conversion tool for aggregate weights and vulnerability definitions.
  • #361 - The vulnerability.bin file can be written with the wrong data types
  • #351 - Introduce components for Post Loss Amplification

Release Notes

OasisLMF Notes

Assign output zeros flag to summarycalc for all reinsurance loss computes - (PR #1397)

The ktools component summarycalc does not output zero-loss events by default. These zero-loss events are required when net losses are computed in fmpy. As net loss is currently computed in all reinsurance runs, the -z flag has been assigned to all executions of summarycalc when computing reinsurance losses.

Flaky test failures - (PR #1327)

Fixed intermittent test failures.

Implement post analysis hook - Backport 1.27.x - (PR #1390)

Model vendors can supply a custom Python module that will be run after the analysis has completed. This module will have access to the run directory, model data directory and analysis settings. It could for instance modify the output files, parse logs to produce user-friendly reports or generate plots.

The two new Oasis settings required to use this feature are similar to the ones used for the pre analysis hook.

  • post_analysis_module: Path to the Python module containing the class.
  • post_analysis_class_name: Name of the class.

The class must have a constructor that takes kwargs model_data_dir, model_run_dir and analysis_settings_json, plus a run method with no arguments. For example:

class MyPostAnalysis:
    def __init__(self, model_data_dir=None, model_run_dir=None, analysis_settings_json=None):
        self.model_data_dir = model_data_dir
        self.model_run_dir = model_run_dir
        self.analysis_settings_json = analysis_settings_json

    def run(self):
        # do something
        pass
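
The corresponding settings might look as follows (the module path and class name here are hypothetical; only the two keys are taken from this note):

{
    "post_analysis_module": "/path/to/my_post_analysis.py",
    "post_analysis_class_name": "MyPostAnalysis"
}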

ODS_Tools Notes

Fix case issue in forex conversion - (PR #70)

When column names were changed from lowercase to the schema case, the forex module did not follow suit and had issues finding the columns to convert. With this fix, the correct case is now used.

ktools Notes

Add runtime user-supplied relative, secondary factor option to placalc - (PR #354)

An optional, relative flat-factor can be specified by the user and applied to all loss factors with the command line argument -f. For example, to apply a relative secondary factor of 0.8 the following can be entered:

$ placalc -f 0.8 < gulcalc_output.bin > placalc_output.bin

The relative secondary factor must lie within the range [0, 1] and is applied to the deviation of the factor from 1. For example:

event_id   factor from model   relative factor from user   applied factor
1          1.10                0.8                         1.08
2          1.20                0.8                         1.16
3          1.00                0.8                         1.00
4          0.90                0.8                         0.92

Add runtime user-supplied absolute, uniform factor option to placalc

Alternatively, an absolute, uniform post loss amplification/reduction factor can be applied to all losses by the user with the command line argument -F. For example, to specify a uniform factor of 0.8 across all losses, the following can be entered:

$ placalc -F 0.8 < gulcalc_output.bin > placalc_output.bin

If specified, the loss factors from the model (those in lossfactors.bin) are ignored. This factor must be positive and is applied uniformly across all losses. For example:

event_id   factor from model   uniform factor from user   applied factor
1          1.10                0.8                        0.8
2          1.20                0.8                        0.8
3          1.00                0.8                        0.8
4          0.90                0.8                        0.8

The absolute, uniform factor is incompatible with the relative, secondary factor given above. Therefore, if both are given by the user, a warning is issued and the relative, secondary factor is ignored:

$ placalc -f 0.8 -F 0.8 < gulcalc_output.bin > placalc_output
WARNING: Relative secondary and absolute factors are incompatible
INFO: Ignoring relative secondary factor

Add tests for Post Loss Amplification (PLA) components

Acceptance tests for placalc, amplificationstobin, amplificationstocsv, lossfactorstobin and lossfactorstocsv have been included.

New component aalcalcmeanonly - (PR #357)

A new component aalcalcmeanonly calculates the overall average period loss. Unlike aalcalc, it does not calculate the standard deviation from the average. Therefore, it has a quicker execution time and uses less memory.

Remove ANOVA fields from Convergence Average Loss Table (CALT) - (PR #360)

The standard error in the Convergence Average Loss Table (CALT) has been observed to overestimate the observed sampling error. This is because the random effects model used to partition the variance into vulnerability and hazard factors requires those contributions to be random. However, the hazard element in the Oasis framework is fixed, not random: event occurrences are assigned to years in a fixed timeline. Therefore, the hazard element in the variance of the Average Annual Loss (AAL) does not reduce with increasing samples, leading to a larger standard error.

As the ANOVA (ANalysis Of VAriance) fields are not helpful in predicting AAL convergence, they have been dropped. The standard error is calculated as s / sqrt(IM), where s is the standard deviation of the annual losses, I is the total number of periods and M is the number of samples.
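
For instance, the new standard error can be computed directly from the annual losses (a worked illustration of the s / sqrt(IM) formula with invented numbers; it is not the aalcalc implementation):

import math

# Illustrative annual losses, one value per period and sample (I periods x M samples).
annual_losses = [1200.0, 950.0, 0.0, 430.0, 800.0, 0.0]
I, M = 3, 2  # assumed: 3 periods and 2 samples, giving the 6 values above

# Sample standard deviation of the annual losses, s.
mean = sum(annual_losses) / len(annual_losses)
s = math.sqrt(sum((x - mean) ** 2 for x in annual_losses) / (len(annual_losses) - 1))

standard_error = s / math.sqrt(I * M)
print(round(standard_error, 2))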

Introduce csv <---> binary Conversion Tools for Aggregate Vulnerabilities and Weights - (PR #362)

The following components have been introduced to convert aggregate vulnerability tables between binary and csv formats:

  • aggregatevulnerabilitytobin
  • aggregatevulnerabilitytocsv

These can be executed from the command line as follows:

$ aggregatevulnerabilitytobin < aggregate_vulnerability.csv > aggregate_vulnerability.bin
$ aggregatevulnerabilitytocsv < aggregate_vulnerability.bin > aggregate_vulnerability.csv

Additionally, the following components have been introduced to convert vulnerability weight tables between binary and csv formats:

  • weightstobin
  • weightstocsv

These can be executed from the command line as follows:

$ weightstobin < weights.csv > weights.bin
$ weightstocsv < weights.bin > weights.csv

Add validation check to validatevulnerability that Vulnerability ID does not exceed maximum signed integer value - (PR #363)

A validation check has been added to validatevulnerability, which outputs an error message should the vulnerability ID exceed the maximum signed integer value. For example:

$ validatevulnerability < vulnerability.csv
Error: Vulnerability ID 1100000000000 on line 5 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie un...

Release 1.26.9

09 Nov 14:03

Oasis Release v1.26.9

Docker Images (Platform)

Docker Images (User Interface)

Components

Changelogs

  • #856 - Update CI 1.26

OasisLMF Changelog - 1.26.9

  • #1338 - Update CI - 1.26
  • #1397 - Add output zeros flag to summarycalc for all reinsurance loss computes

ktools Changelog - v3.11.0

  • #353 - Add runtime user supplied secondary factor option to placalc
  • #342 - aalcalc Performance Improvements
  • #358 - Release/3.10.1
  • #304 - CALT estimated standard error in AAL overstates observed sampling error
  • #359 - CSV to BIN conversion tool for aggregate weights and vulnerability definitions.
  • #361 - The vulnerability.bin file can be written with the wrong data types
  • #346 - Release/3.9.7
  • #344 - Incorrect Values from Wheatsheaf/Per Sample Mean with Period Weights in leccalc/ordleccalc
  • #351 - Introduce components for Post Loss Amplification

Release Notes

OasisLMF Notes

Assign output zeros flag to summarycalc for all reinsurance loss computes - (PR #1397)

The ktools component summarycalc does not output zero-loss events by default. These zero-loss events are required when net losses are computed in fmpy. As net loss is currently computed in all reinsurance runs, the -z flag has been assigned to all executions of summarycalc when computing reinsurance losses.

ktools Notes

Add runtime user-supplied relative, secondary factor option to placalc - (PR #354)

An optional, relative flat-factor can be specified by the user and applied to all loss factors with the command line argument -f. For example, to apply a relative secondary factor of 0.8 the following can be entered:

$ placalc -f 0.8 < gulcalc_output.bin > placalc_output.bin

The relative secondary factor must lie within the range [0, 1] and is applied to the deviation of the factor from 1. For example:

event_id   factor from model   relative factor from user   applied factor
1          1.10                0.8                         1.08
2          1.20                0.8                         1.16
3          1.00                0.8                         1.00
4          0.90                0.8                         0.92

Add runtime user-supplied absolute, uniform factor option to placalc

Alternatively, an absolute, uniform post loss amplification/reduction factor can be applied to all losses by the user with the command line argument -F. For example, to specify a uniform factor of 0.8 across all losses, the following can be entered:

$ placalc -F 0.8 < gulcalc_output.bin > placalc_output.bin

If specified, the loss factors from the model (those in lossfactors.bin) are ignored. This factor must be positive and is applied uniformly across all losses. For example:

event_id   factor from model   uniform factor from user   applied factor
1          1.10                0.8                        0.8
2          1.20                0.8                        0.8
3          1.00                0.8                        0.8
4          0.90                0.8                        0.8

The absolute, uniform factor is incompatible with the relative, secondary factor given above. Therefore, if both are given by the user, a warning is issued and the relative, secondary factor is ignored:

$ placalc -f 0.8 -F 0.8 < gulcalc_output.bin > placalc_output
WARNING: Relative secondary and absolute factors are incompatible
INFO: Ignoring relative secondary factor

Add tests for Post Loss Amplification (PLA) components

Acceptance tests for placalc, amplificationstobin, amplificationstocsv, lossfactorstobin and lossfactorstocsv have been included.

New component aalcalcmeanonly - (PR #357)

A new component aalcalcmeanonly calculates the overall average period loss. Unlike aalcalc, it does not calculate the standard deviation from the average. Therefore, it has a quicker execution time and uses less memory.

Remove ANOVA fields from Convergence Average Loss Table (CALT) - (PR #360)

The standard error in the Convergence Average Loss Table (CALT) has been observed to overestimate the observed sampling error. This is because the random effects model used to partition the variance into vulnerability and hazard factors requires those contributions to be random. However, the hazard element in the Oasis framework is fixed, not random: event occurrences are assigned to years in a fixed timeline. Therefore, the hazard element in the variance of the Average Annual Loss (AAL) does not reduce with increasing samples, leading to a larger standard error.

As the ANOVA (ANalysis Of VAriance) fields are not helpful in predicting AAL convergence, they have been dropped. The standard error is calculated as s / sqrt(IM), where s is the standard deviation of the annual losses, I is the total number of periods and M is the number of samples.

Introduce csv <---> binary Conversion Tools for Aggregate Vulnerabilities and Weights - (PR #362)

The following components have been introduced to convert aggregate vulnerability tables between binary and csv formats:

  • aggregatevulnerabilitytobin
  • aggregatevulnerabilitytocsv

These can be executed from the command line as follows:

$ aggregatevulnerabilitytobin < aggregate_vulnerability.csv > aggregate_vulnerability.bin
$ aggregatevulnerabilitytocsv < aggregate_vulnerability.bin > aggregate_vulnerability.csv

Additionally, the following components have been introduced to convert vulnerability weight tables between binary and csv formats:

  • weightstobin
  • weightstocsv

These can be executed from the command line as follows:

$ weightstobin < weights.csv > weights.bin
$ weightstocsv < weights.bin > weights.csv

Add validation check to validatevulnerability that Vulnerability ID does not exceed maximum signed integer value - (PR #363)

A validation check has been added to validatevulnerability, which outputs an error message should the vulnerability ID exceed the maximum signed integer value. For example:

$ validatevulnerability < vulnerability.csv
Error: Vulnerability ID 1100000000000 on line 5 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Error: Vulnerability ID 1100000000000 on line 6 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Error: Vulnerability ID 1100000000000 on line 7 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Error: Vulnerability ID 1100000000000 on line 8 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Some checks have failed. Please edit input file.

Release 3.9.7 - (PR #346)

Fix Per Sample Mean (Wheatsheaf Mean) with Period Weights Output from leccalc/ordleccalc - (PR #349)

When a period weights file was supplied by the user, the Per Sample Mean (i.e. Wheatsheaf Mean) from leccalc and ordleccalc was incorrect. After sorting the loss vector in descending order, the vector was then reorganised by period number, nullifying the sorting. This would only yield the correct results in the very rare cases when the loss value decreased with increasing period number.

As the return periods are determined by the period weights, calculating the mean losses would require traversing the data twice: once to determine the return periods, and a second time to fill them. However, if the return periods are known in advance, i.e. when the user supplies a return periods file, the first iteration is unnecessary.

As the per sample mean with period weights does not appear to be a very useful metric, this option is only supported when a return periods file is present. Should a return periods file be missing, the following message will be written to the log file:

WARNING: Return periods file must be present if you wish to use non-uniform period weights for Wheatsheaf mean/per sample mean output.
INFO: Wheatsheaf ...

Release 1.23.20

09 Nov 14:07

Oasis Release v1.23.20

Docker Images (Platform)

Docker Images (User Interface)

Components

Changelogs

OasisLMF Changelog - 1.23.20

  • #1337 - Update CI - 1.23
  • #1397 - Add output zeros flag to summarycalc for all reinsurance loss computes

ktools Changelog - v3.11.0

  • #353 - Add runtime user supplied secondary factor option to placalc
  • #342 - aalcalc Performance Improvements
  • #358 - Release/3.10.1
  • #304 - CALT estimated standard error in AAL overstates observed sampling error
  • #359 - CSV to BIN conversion tool for aggregate weights and vulnerability definitions.
  • #361 - The vulnerability.bin file can be written with the wrong data types
  • #346 - Release/3.9.7
  • #344 - Incorrect Values from Wheatsheaf/Per Sample Mean with Period Weights in leccalc/ordleccalc
  • #351 - Introduce components for Post Loss Amplification

Release Notes

OasisLMF Notes

Assign output zeros flag to summarycalc for all reinsurance loss computes - (PR #1397)

The ktools component summarycalc does not output zero-loss events by default. These zero-loss events are required when net losses are computed in fmpy. As net loss is currently computed in all reinsurance runs, the -z flag has been assigned to all executions of summarycalc when computing reinsurance losses.

ktools Notes

Add runtime user-supplied relative, secondary factor option to placalc - (PR #354)

An optional, relative flat-factor can be specified by the user and applied to all loss factors with the command line argument -f. For example, to apply a relative secondary factor of 0.8 the following can be entered:

$ placalc -f 0.8 < gulcalc_output.bin > placalc_output.bin

The relative secondary factor must lie within the range [0, 1] and is applied to the deviation of the factor from 1. For example:

event_id   factor from model   relative factor from user   applied factor
1          1.10                0.8                         1.08
2          1.20                0.8                         1.16
3          1.00                0.8                         1.00
4          0.90                0.8                         0.92

Add runtime user-supplied absolute, uniform factor option to placalc

Alternatively, an absolute, uniform post loss amplification/reduction factor can be applied to all losses by the user with the command line argument -F. For example, to specify a uniform factor of 0.8 across all losses, the following can be entered:

$ placalc -F 0.8 < gulcalc_output.bin > placalc_output.bin

If specified, the loss factors from the model (those in lossfactors.bin) are ignored. This factor must be positive and is applied uniformly across all losses. For example:

event_id   factor from model   uniform factor from user   applied factor
1          1.10                0.8                        0.8
2          1.20                0.8                        0.8
3          1.00                0.8                        0.8
4          0.90                0.8                        0.8

The absolute, uniform factor is incompatible with the relative, secondary factor given above. Therefore, if both are given by the user, a warning is issued and the relative, secondary factor is ignored:

$ placalc -f 0.8 -F 0.8 < gulcalc_output.bin > placalc_output
WARNING: Relative secondary and absolute factors are incompatible
INFO: Ignoring relative secondary factor

Add tests for Post Loss Amplification (PLA) components

Acceptance tests for placalc, amplificationstobin, amplificationstocsv, lossfactorstobin and lossfactorstocsv have been included.

New component aalcalcmeanonly - (PR #357)

A new component aalcalcmeanonly calculates the overall average period loss. Unlike aalcalc, it does not calculate the standard deviation from the average. Therefore, it has a quicker execution time and uses less memory.

Remove ANOVA fields from Convergence Average Loss Table (CALT) - (PR #360)

The standard error in the Convergence Average Loss Table (CALT) has been observed to overestimate the observed sampling error. This is because the random effects model used to partition the variance into vulnerability and hazard factors requires those contributions to be random. However, the hazard element in the Oasis framework is fixed, not random: event occurrences are assigned to years in a fixed timeline. Therefore, the hazard element in the variance of the Average Annual Loss (AAL) does not reduce with increasing samples, leading to a larger standard error.

As the ANOVA (ANalysis Of VAriance) fields are not helpful in predicting AAL convergence, they have been dropped. The standard error is calculated as s / sqrt(IM), where s is the standard deviation of the annual losses, I is the total number of periods and M is the number of samples.

Introduce csv <---> binary Conversion Tools for Aggregate Vulnerabilities and Weights - (PR #362)

The following components have been introduced to convert aggregate vulnerability tables between binary and csv formats:

  • aggregatevulnerabilitytobin
  • aggregatevulnerabilitytocsv

These can be executed from the command line as follows:

$ aggregatevulnerabilitytobin < aggregate_vulnerability.csv > aggregate_vulnerability.bin
$ aggregatevulnerabilitytocsv < aggregate_vulnerability.bin > aggregate_vulnerability.csv

Additionally, the following components have been introduced to convert vulnerability weight tables between binary and csv formats:

  • weightstobin
  • weightstocsv

These can be executed from the command line as follows:

$ weightstobin < weights.csv > weights.bin
$ weightstocsv < weights.bin > weights.csv

Add validation check to validatevulnerability that Vulnerability ID does not exceed maximum signed integer value - (PR #363)

A validation check has been added to validatevulnerability, which outputs an error message should the vulnerability ID exceed the maximum signed integer value. For example:

$ validatevulnerability < vulnerability.csv
Error: Vulnerability ID 1100000000000 on line 5 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Error: Vulnerability ID 1100000000000 on line 6 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Error: Vulnerability ID 1100000000000 on line 7 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Error: Vulnerability ID 1100000000000 on line 8 exceeds maximum permissible value 2147483647. Please reassign vulnerability IDs to lie under this maximum
Some checks have failed. Please edit input file.
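
A minimal sketch of the equivalent check in Python (the CSV layout assumed here, with vulnerability_id as the first column, is an assumption for illustration; the real check is implemented inside validatevulnerability):

import csv
import sys

INT32_MAX = 2147483647  # maximum signed 32-bit integer, as in the messages above

def check_vulnerability_ids(path):
    ok = True
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for line_no, row in enumerate(reader, start=2):
            vulnerability_id = int(row[0])  # assumes vulnerability_id is the first column
            if vulnerability_id > INT32_MAX:
                print(f"Error: Vulnerability ID {vulnerability_id} on line {line_no} "
                      f"exceeds maximum permissible value {INT32_MAX}.", file=sys.stderr)
                ok = False
    return ok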

Release 3.9.7 - (PR #346)

Fix Per Sample Mean (Wheatsheaf Mean) with Period Weights Output from leccalc/ordleccalc - (PR #349)

When a period weights file was supplied by the user, the Per Sample Mean (i.e. Wheatsheaf Mean) from leccalc and ordleccalc was incorrect. After sorting the loss vector in descending order, the vector was then reorganised by period number, nullifying the sorting. This would only yield the correct results in the very rare cases when the loss value decreased with increasing period number.

As the return periods are determined by the period weights, calculating the mean losses would require traversing the data twice: once to determine the return periods, and a second time to fill them. However, if the return periods are known in advance, i.e. when the user supplies a return periods file, the first iteration is unnecessary.

As the per sample mean with period weights does not appear to be a very useful metric, this option is only supported when a return periods file is present. Should a return periods file be missing, the following message will be written to the log file:

WARNING: Return periods file must be present if you wish to use non-uniform period weights for Wheatsheaf mean/per sample mean output.
INFO: Wheatsheaf mean/per sample mean output will not be produced.

As outl...


Release 1.15.30

09 Nov 14:01

Oasis Release v1.15.30

Docker Images (Platform)

Docker Images (User Interface)

Components

Changelogs

  • #854 - Update CI 1.15

OasisLMF Changelog - 1.15.30

  • #1336 - Update CI - 1.15
  • #1397 - Add output zeros flag to summarycalc for all reinsurance loss computes

Release Notes

OasisLMF Notes

Assign output zeros flag to summarycalc for all reinsurance loss computes - (PR #1397)

The ktools component summarycalc does not output zero-loss events by default. These zero-loss events are required when net losses are computed in fmpy. As net loss is currently computed in all reinsurance runs, the -z flag has been assigned to all executions of summarycalc when computing reinsurance losses.

Release 1.23.19

02 Nov 17:29

Oasis Release v1.23.19

Docker Images (Platform)

Docker Images (User Interface)

Components

Changelogs

OasisPlatform Changelog - 1.23.19

  • #915 - Platform 1.23.x CVE
  • #855 - Update CI 1.23

Release Notes

OasisPlatform Notes

Fix CVE issues in platform 1.23.18 - (PR #916)

Release 1.28.3

06 Oct 13:34

Oasis Release v1.28.3

Docker Images (Platform)

Docker Images (User Interface)

Components

Changelogs

  • #891 - Release 1.28.2 (Staging)

OasisLMF Changelog - 1.28.3

  • #1377 - Clean up 'runs' dir in repo
  • #1378 - Support output of overall average period loss without standard deviation calculation
  • #1366 - Update fm supported terms document
  • #1347 - Add runtime user supplied secondary factor option to plapy
  • #1317 - Add post-analysis hook
  • #1372 - Incorrect TIV in the summary info files

ODS_Tools Changelog - 3.1.2

  • #53 - Release 3.1.1 (staging)
  • #62 - Add fields for running aalcalcmeanonly ktools component

ktools Changelog - v3.10.1

  • #353 - Add runtime user supplied secondary factor option to placalc
  • #342 - aalcalc Performance Improvements

Release Notes

OasisLMF Notes

Support output of overall average period loss without standard deviation calculation - (PR #1378)

The new ktools component aalcalcmeanonly (see PR OasisLMF/ktools#357) calculates the overall average period loss but does not include the standard deviation. As a result, it has a faster execution time and uses less memory than aalcalc.

Support for executing this component as part of a model run has been introduced through the aalcalc_meanonly (legacy output) and alt_meanonly (ORD output) flags in the analysis settings file.
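
As a hedged sketch only, the legacy flag would sit alongside the existing per-summary output flags in the analysis settings; the exact placement shown below (inside a gul_summaries entry) is an assumption based on how the existing aalcalc flag is configured, and alt_meanonly (ORD output) is expected to be configured analogously:

{
    "gul_output": true,
    "gul_summaries": [
        {
            "id": 1,
            "aalcalc_meanonly": true
        }
    ]
}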

Financial terms supported document update - (PR #1367)

The document has been updated to reflect recent additional financial fields that are supported, including

  • Currency fields
  • Account level terms

In addition, a 'Version introduced' field has been included to identify the version of OasisLMF in which the field was first supported, if later than v1.15 LTS.

Add options to enable Post Loss Amplification and provide secondary and uniform factors - (PR #1369)

The requirement for an amplifications file generated by the MDK to trigger the execution of Post Loss Amplification (PLA) has been replaced with the pla flag in the analysis settings file. This allows a user to enable or disable (default) the PLA component plapy.

Additionally, a secondary factor in the range [0, 1] can be specified from the command line with the argument -f when running plapy:

$ plapy -f 0.8 < gul_output.bin > plapy_output.bin

The secondary factor is applied to the deviation of the loss factor from 1. For example:

event_id   factor from model   relative factor from user   applied factor
1          1.10                0.8                         1.08
2          1.20                0.8                         1.16
3          1.00                0.8                         1.00
4          0.90                0.8                         0.92

Finally, an absolute, uniform, positive amplification/reduction factor can be specified from the command line with the argument -F:

$ plapy -F 0.8 < gul_output.bin > plapy_output.bin

This factor is applied to all losses, thus loss factors from the model (those in lossfactors.bin) are ignored. For example:

event_id   factor from model   uniform factor from user   applied factor
1          1.10                0.8                        0.8
2          1.20                0.8                        0.8
3          1.00                0.8                        0.8
4          0.90                0.8                        0.8

The absolute, uniform factor is incompatible with the relative, secondary factor. Therefore, if both are given by the user, a warning is logged and the secondary factor is ignored.

Implement post analysis hook - (PR #1371)

Model vendors can supply a custom Python module that will be run after the analysis has completed. This module will have access to the run directory, model data directory and analysis settings. It could for instance modify the output files, parse logs to produce user-friendly reports or generate plots.

The two new Oasis settings required to use this feature are similar to the ones used for the pre analysis hook.

  • post_analysis_module: Path to the Python module containing the class.
  • post_analysis_class_name: Name of the class.

The class must have a constructor that takes kwargs model_data_dir, model_run_dir and analysis_settings_json, plus a run method with no arguments. For example:

class MyPostAnalysis:
    def __init__(self, model_data_dir=None, model_run_dir=None, analysis_settings_json=None):
        self.model_data_dir = model_data_dir
        self.model_run_dir = model_run_dir
        self.analysis_settings_json = analysis_settings_json

    def run(self):
        # do something
        pass

Fix TIV calculation when NumberOfBuildings is > 1 in the location file - (PR #1373)

The TIV calculated in the output summaries was incorrect because the granularity changed after the implementation of stochastic disaggregation (when NumberOfBuildings > 1).
Only 'loc_id' and 'coverage_type_id' were taken into account when detecting duplicates, leading to a lower TIV than expected.
With this change, 'building_id' and 'risk_id' are added to the summary_map, and building_id is included in the key used to detect duplicates when calculating the TIV.
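
A hedged pandas sketch of the idea (the column names follow the note above, but the rows and the actual summary_map construction in oasislmf are simplified for illustration):

import pandas as pd

# Illustrative summary_map rows: one location split into two buildings by disaggregation.
summary_map = pd.DataFrame({
    "loc_id":           [1, 1],
    "coverage_type_id": [1, 1],
    "building_id":      [1, 2],
    "risk_id":          [1, 2],
    "tiv":              [100.0, 100.0],
})

# Old key: the two rows look like duplicates, so the TIV is understated (100).
old_tiv = summary_map.drop_duplicates(["loc_id", "coverage_type_id"])["tiv"].sum()

# New key: building_id is included, both rows are kept and the TIV is correct (200).
new_tiv = summary_map.drop_duplicates(["loc_id", "coverage_type_id", "building_id"])["tiv"].sum()

print(old_tiv, new_tiv)  # 100.0 200.0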

ODS_Tools Notes

Add fields for running aalcalcmeanonly ktools component - (PR #62)

The aalcalcmeanonly ktools component calculates the overall average period loss, skipping the calculation of the standard deviation from this value. The following boolean fields have been introduced to the analysis settings file:

  • aalcalc_meanonly: if true, output table in legacy format.
  • alt_meanonly: if true, output table in ORD format.

ktools Notes

Add runtime user-supplied relative, secondary factor option to placalc - (PR #354)

An optional, relative flat-factor can be specified by the user and applied to all loss factors with the command line argument -f. For example, to apply a relative secondary factor of 0.8 the following can be entered:

$ placalc -f 0.8 < gulcalc_output.bin > placalc_output.bin

The relative secondary factor must lie within the range [0, 1] and is applied to the deviation of the factor from 1. For example:

event_id   factor from model   relative factor from user   applied factor
1          1.10                0.8                         1.08
2          1.20                0.8                         1.16
3          1.00                0.8                         1.00
4          0.90                0.8                         0.92

Add runtime user-supplied absolute, uniform factor option to placalc

Alternatively, an absolute, uniform post loss amplification/reduction factor can be applied to all losses by the user with the command line argument -F. For example, to specify a uniform factor of 0.8 across all losses, the following can be entered:

$ placalc -F 0.8 < gulcalc_output.bin > placalc_output.bin

If specified, the loss factors from the model (those in lossfactors.bin) are ignored. This factor must be positive and is applied uniformly across all losses. For example:

event_id   factor from model   uniform factor from user   applied factor
1          1.10                0.8                        0.8
2          1.20                0.8                        0.8
3          1.00                0.8                        0.8
4          0.90                0.8                        0.8

The absolute, uniform factor is incompatible with the relative, secondary factor given above. Therefore, if both are given by the user, a warning is issued and the relative, secondary factor is ignored:

$ placalc -f 0.8 -F 0.8 < gulcalc_output.bin > placalc_output
WARNING: Relative secondary and absolute factors are incompatible
INFO: Ignoring relative secondary factor

Add tests for Post Loss Amplification (PLA) components

Acceptance tests for placalc, amplificationstobin, amplificationstocsv, lossfactorstobin and lossfactorstocsv have been included.

New component aalcalcmeanonly - (PR #357)

A new component aalcalcmeanonly calculates the overall average period loss. Unlike aalcalc, it does not calculate the standard deviation from the average. Therefore, it has a quicker execution time and uses less memory.

Release 1.28.2

18 Sep 13:40

Oasis Release v1.28.2

Docker Images (Platform)

Docker Images (User Interface)

Components

Changelogs

OasisPlatform Changelog - 1.28.2

  • #868 - Fixes for OasisPlatform Publish
  • #841 - Release 1.28.0
  • #867 - Fix cryptography CVE
  • #880 - Analysis settings compatibility fix 1.15.x workers
  • #871 - Handle exceptions from OedExposure on file Upload
  • #853 - Update CI 1.28
  • #892 - CI Increase compatibility testing to check LTS workers

OasisLMF Changelog - 1.28.2

  • #1344 - Release/1.28.1 (staging)
  • #1326 - Update the KeyLookupInterface class to have access to the lookup_complex_config_json
  • #140 - Implement OED peril fields
  • #1349 - Fix removal of handlers to logger + give logfiles unique names
  • #1322 - Step policies: Allow BI ground up loss through to gross losses
  • #1293 - Multiple footprint file options
  • #1249 - Discuss documentation strategy
  • #1324 - Release/1.28.0
  • #1357 - fix permissions for docs deploy
  • #1360 - Add docs about gulmc
  • #1334 - Update CI - 1.28
  • #1347 - Add runtime user supplied secondary factor option to plapy
  • #1340 - collect_unused_df in il preparation
  • #1341 - Bug in latest platform2 release

ODS_Tools Changelog - 3.1.1

  • #39 - Release 3.1.0 - for next stable oasislmf release
  • #44 - add check for conditional requirement
  • #50 - Update CI for stable 3.1.x
  • #52 - Fix/improve check perils
  • #54 - Add footprint file suffix options
  • #58 - Validation crash after converting account file from csv to parquet
  • #60 - Add options to enable/disable post loss amplification, and set secondary and uniform post loss amplification factors
  • #61 - Model_settings, allow additional properties under 'data_settings'

Release Notes

OasisPlatform Notes

CI release workflow fixes - (PR #865)

  • Add release tag to target piwind branch on release
  • Add new latest tags: 2-latest for platform 2 and 1-latest for platform 1
  • Extract previous component versions from the last released worker image
  • Added option to override the cve_severity value

Fix cryptography CVE-2023-38325 - platform 1 - (PR #874)

Add back settings compatibility workaround - (PR #878)

A bug was introduced when the analysis settings compatibility mapping (needed for 1.15.x workers) was moved to ods-tools.
The settings validation has been reverted to the platform 1.27.1 version:

https://github.com/OasisLMF/OasisPlatform/blob/bcc3a28a19ab52ff5582c6c13851996aa74a46fe/src/server/oasisapi/schemas/serializers.py#L207-L220

This stores both the older and newer keys in the analysis settings file when it is posted to the server.

Handle ODS exceptions when validating exposure files (Plat 1) - (PR #884)

Fix for issue #871: catch exceptions and raise a validation error so that the server returns 400 Bad Request instead of 500 Server Error.

Improve Stable image compatibility testing - (PR #894)

Added image testing for all the main worker stable versions

OasisLMF Notes

Add complex model config into model config if both present - (PR #1345)

If both the complex model config and the model config are present, the JSON dict from the complex config is added into the model config, as below:

config['complex_config_dir'] = complex_config_dir
config['complex_config'] = complex_config

Update all fm tests to use AA1 as the peril in all peril columns - (PR #1346)

Work is in progress to support peril columns such as LocPerilsCovered, LocPeril, etc. in oasislmf. This change switches all perils to AA1, as these are generic tests; more tests specific to the perils covered will be added later with the feature.

It also improves the split/combine scripts used to add fm unit tests by adding support for reinsurance files.

Fixed the removal of log handlers in logging redirect wrapper - (PR #1349)

  • Log handlers were not correctly removed when exiting from log redirect
  • Added log redirect to plapy
  • Fixed open file leaks in testing

Step policies: Allow BI ground up loss through to gross losses - (PR #1351)

OasisLMF/OasisLMF#1322

Support multiple identifiers for footprint files - (PR #1352)

To enable the storage of footprints in multiple files rather than a single master file, optional identifiers in the form of footprint file suffixes are now supported. This is executed in a similar way to that currently in place to distinguish multiple events and event occurrences files. The footprint_set model settings option in the analysis settings file can be set to the desired file suffix for the footprint files to be used. A symbolic link to the desired footprint set is created in the static/ directory within the model run directory. Footprint file priorities are identical to those set by modelpy and gulmc, which in order of descending priority are: parquet; zipped binary; binary; and csv.
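
For example, to point a run at footprint files with a given suffix, the analysis settings might include the following (the suffix value and the nesting under model_settings are assumptions for illustration):

{
    "model_settings": {
        "footprint_set": "10k"
    }
}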

Revamp the oasislmf package documentation - (PR #1320)

This PR fixes #1249 by revamping the oasislmf package documentation.
The complete documentation of the full Python API of oasislmf is automatically generated using sphinx-autoapi. There is no need to manually update the docs pages whenever the oasislmf package is updated: sphinx-autoapi dynamically finds the changes and generates the docs for the latest oasislmf version.
The documentation is built using the build-docs.yml GH Actions workflow on all PRs targeting main, and is built and deployed to the gh-pages branch for all commits on main.

Add extensive docs about gulmc - (PR #1360)

This PR adds extensive documentation about gulmc.

Add options to enable Post Loss Amplification and provide secondary and uniform factors - (PR #1369)

The requirement for an amplifications file generated by the MDK to trigger the execution of Post Loss Amplification (PLA) has been replaced with the pla flag in the analysis settings file. This allows a user to enable or disable (default) the PLA component plapy.

Additionally, a secondary factor in the range [0, 1] can be specified from the command line with the argument -f when running plapy:

$ plapy -f 0.8 < gul_output.bin > plapy_output.bin

The secondary factor is applied to the deviation of the loss factor from 1. For example:

event_id   factor from model   relative factor from user   applied factor
1          1.10                0.8                         1.08
2          1.20                0.8                         1.16
3          1.00                0.8                         1.00
4          0.90                0.8                         0.92

Finally, an absolute, uniform, positive amplification/reducti...


Release 1.27.6

30 Aug 09:01

Oasis Release v1.27.6

Docker Images (Platform)

Docker Images (User Interface)

Components

Changelogs

OasisPlatform Changelog - 1.27.6

  • #868 - Fixes for OasisPlatform Publish
  • #867 - Fix cryptography CVE
  • #852 - Update CI 1.27
  • #878 - revert settings compatibility to older version

Release Notes

OasisPlatform Notes

CI release workflow fixes - (PR #865)

  • Add release tag to target piwind branch on release
  • Add new latest tags: 2-latest for platform 2 and 1-latest for platform 1
  • Extract previous component versions from the last released worker image
  • Added option to override the cve_severity value

Fix cryptography CVE-2023-38325 - platform 1 - (PR #874)

Add back settings compatibility workaround - (PR #878)

A bug was introduced when the analysis settings compatibility mapping (needed for 1.15.x workers) was moved to ods-tools.
The settings validation has been reverted to the platform 1.27.1 version:

https://github.com/OasisLMF/OasisPlatform/blob/bcc3a28a19ab52ff5582c6c13851996aa74a46fe/src/server/oasisapi/schemas/serializers.py#L207-L220

This stores both the older and newer keys in the analysis settings file when it is posted to the server.