As many in the insurance industry understand, catastrophe modelling was once seen as the original Insurtech. Born in the shadow of the large catastrophes of the late eighties and early nineties, catastrophe modelling looked to apply scientific knowledge and cutting-edge technology to generate a quantitative assessment of hazard, and forever changed our views of peril-based risk transfer.

What emerged from the development of this innovative approach were procedures, skills and infrastructure, forming a framework that has remained in place ever since. And whilst the underlying hazard science continues to evolve as perils are better understood, the platforms that facilitate the transfer of scientific knowledge to risk carriers have remained largely unchanged.

As is typical of information architectures from the late nineties and early 2000s, heavy reliance on large, static systems causes an organic superstructure to develop around the process. This adds complexity, as data must be moved out of the fixed system and into the many and varied forms in which users wish to consume it.

Fast forward a couple of decades and we find leading risk carriers utilising cutting-edge methodologies, infrastructure and technology to manage all aspects of their business. With the explosion of distributed computing and the subsequent fall in the cost of processing power and storage, these organisations are leveraging efficiencies across their operations. Yet when it comes to exposure to catastrophic risk, the model put in place 30 years ago is still commonly employed.

As initiatives like distributed ledger technology and smart contracts gather pace, the catastrophe risk assessment and pricing process risks becoming a brake on the automation of the insurance value chain. What was once the forefront of technological advancement in the insurance industry could be the very thing that slows it down.

The proliferation of application programming interfaces (APIs), which transfer information from machine to machine with well-defined parameters and without human intervention, has provided vast opportunity to speed up encumbered processes. Their use is already delivering efficiencies throughout the global re/insurance market, and catastrophe modelling is about to follow suit.
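To make the idea concrete, here is a minimal sketch of what consuming such an API might look like on the risk carrier's side. Everything below is illustrative: the response schema, field names, and the `HazardEstimate` record are assumptions for the example, not a description of any real service.

```python
import json
from dataclasses import dataclass

# Hypothetical JSON payload as it might arrive from a hazard API.
# The endpoint and schema are illustrative assumptions only.
SAMPLE_RESPONSE = json.dumps({
    "location": {"lat": 25.76, "lon": -80.19},
    "peril": "tropical_cyclone",
    "return_period_windspeeds": {"100": 62.0, "250": 71.5},  # m/s
})


@dataclass
class HazardEstimate:
    """A typed hazard record, ready to feed into downstream
    pricing or portfolio-management systems."""
    lat: float
    lon: float
    peril: str
    windspeeds: dict  # return period (years) -> wind speed (m/s)


def parse_hazard(payload: str) -> HazardEstimate:
    # Decode the machine-readable response and map it onto a
    # structure the consuming system understands -- no manual
    # re-keying or spreadsheet handoffs in between.
    data = json.loads(payload)
    return HazardEstimate(
        lat=data["location"]["lat"],
        lon=data["location"]["lon"],
        peril=data["peril"],
        windspeeds={int(k): v for k, v in
                    data["return_period_windspeeds"].items()},
    )


estimate = parse_hazard(SAMPLE_RESPONSE)
print(estimate.windspeeds[100])  # 62.0
```

The point of the sketch is the well-defined contract: because both ends agree on the parameters and the shape of the response, the hazard assessment can flow straight into automated pricing without a human in the loop.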

Here at reask we’re building solutions that consider the re/insurance business process from the ground up, along with the broad set of hazard-assessment requirements that individual organisations have.

We aren’t building another view of risk. We’re helping you build your view of risk.
