
Technology & Catastrophe Modeling: Beyond the Hype

As computing power increases, so does the sophistication -- and footprint -- of catastrophe modeling software, which raises important questions about how to deploy, manage, and support these new, more resource-intensive modeling environments.

Today's insurance risk management landscape is growing increasingly complex. Companies are searching for a path to continued profitable growth under challenging market conditions and ever-stricter regulatory regimes that require them to demonstrate a deep understanding of their risk.

Users of catastrophe models are applying advanced analytics more frequently to inform their business decisions. They want to run risk analyses faster or, equally important, to run more of them, whether to dive deeper into the drivers of risk, to test the sensitivity of the models, or to satisfy regulatory reporting requirements.
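
For illustration only, here is a minimal sketch of what "running more analyses" can look like in practice: fanning a sensitivity sweep out across local cores. The run_analysis function, the portfolio data, and the demand-surge parameter are hypothetical stand-ins, not any vendor's actual API.

```python
from concurrent.futures import ProcessPoolExecutor

def run_analysis(portfolio, demand_surge):
    """Hypothetical stand-in for a catastrophe model run; returns an
    average annual loss for the portfolio under one surge assumption."""
    base_loss = sum(site["insured_value"] for site in portfolio) * 0.002
    return base_loss * (1.0 + demand_surge)

portfolio = [{"insured_value": 25_000_000}, {"insured_value": 40_000_000}]
surge_scenarios = [0.0, 0.1, 0.2, 0.3]  # sensitivity sweep on one assumption

if __name__ == "__main__":
    # Each scenario is an independent model run, so they parallelize cleanly.
    with ProcessPoolExecutor() as pool:
        losses = list(pool.map(run_analysis,
                               [portfolio] * len(surge_scenarios),
                               surge_scenarios))
    for surge, loss in zip(surge_scenarios, losses):
        print(f"demand surge {surge:+.0%}: modeled AAL ${loss:,.0f}")
```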

Catastrophe modeling firms are harnessing new technologies to drive the continuing evolution of catastrophe models and to deliver significant performance improvements over previous-generation platforms. As that computing power grows, so do the sophistication and footprint of the software, raising important questions about how to deploy, manage, and support these more resource-intensive modeling environments.

One solution that comes immediately to mind is the cloud, which has received a lot of hype over the past few years across many industries. But the reality of cloud computing is that it’s not a one-size-fits-all solution, certainly not yet. While some companies have embraced cloud solutions and have already adapted their catastrophe modeling workflows, many others have concerns.

A recent Wall Street Journal article reports that many companies are not ready to move mission-critical applications to the cloud, a finding backed by a recent AIR client survey: 65% of respondents indicated that their organizations will not be moving their data into the cloud in the near future. Although the flexibility and agility of the cloud may be attractive to some, many companies are hesitant to entrust their data and critical operations to a cloud service provider -- and in some countries, regulators forbid it outright.

Modeling companies can best serve their clients by offering a choice in deployment strategy. Cloud computing has unquestionably opened up new operational possibilities, but it is not the only way to deliver significant performance in catastrophe modeling solutions. 

Performance is often a result of the product architecture, including its data and computing technology platform and the efficiency of the underlying core algorithms. Portable and affordable new-generation technologies can enable high-performance modeling whether they are deployed on-premises or in the cloud.

One such technology is the analytical clustered database, often referred to as a massively parallel processing (MPP) database. This class of database provides high performance and horizontal scalability and can be deployed on commodity hardware. MPP databases are fully SQL-compliant relational databases that provide unrestricted direct query access to the stored data.
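
Because MPP databases speak standard SQL, the direct query access described above looks like ordinary database code. The sketch below assumes an ODBC driver for the MPP engine is installed; the DSN, table, and column names are all hypothetical.

```python
import pyodbc  # assumes an ODBC driver for the MPP database is installed

# "mpp_loss_warehouse" is a hypothetical data source name.
conn = pyodbc.connect("DSN=mpp_loss_warehouse")
cursor = conn.cursor()

# Aggregate modeled losses by peril directly in the database; the MPP
# engine distributes the scan and aggregation across its nodes.
cursor.execute("""
    SELECT peril, SUM(gross_loss) AS total_gross_loss
    FROM event_losses
    GROUP BY peril
    ORDER BY total_gross_loss DESC
""")
for peril, total in cursor.fetchall():
    print(peril, total)
conn.close()
```

The same query could be issued from any SQL client or reporting tool, which is the practical payoff of full SQL compliance.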

The industry is also seeing growing interest in hybrid cloud solutions from companies that want on-demand elasticity -- that is, the ability to expand their compute resources as needed without purchasing additional hardware or compromising their data security. A significant aspect of hybrid cloud technologies is that companies do not have to move their entire modeling platform into the cloud.

Certain types of data, including mission-critical or highly confidential data, can remain on-premises permanently, while the company taps additional compute capacity through cloud bursting as needed. Cloud bursting allows a company to keep its modeling system and data within its own data centers while offloading some or all of the computational processing to the cloud during periods of high activity.
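
A rough sketch of the bursting decision itself, assuming a fixed pool of on-premises slots: work stays local until the cluster is saturated, and only the overflow is sent to rented capacity. The slot count and submit functions are illustrative placeholders, not a real scheduler.

```python
ON_PREM_SLOTS = 16  # local cluster capacity (hypothetical)

def submit_on_prem(job):
    print(f"running {job} on the in-house cluster")

def submit_to_cloud(job):
    # In practice this would call a cloud provider's batch/compute API;
    # the data stays on-premises and only the computation is offloaded.
    print(f"bursting {job} to rented cloud capacity")

def dispatch(jobs):
    """Keep work local until the cluster is saturated, then burst."""
    for i, job in enumerate(jobs):
        if i < ON_PREM_SLOTS:
            submit_on_prem(job)
        else:
            submit_to_cloud(job)

dispatch([f"analysis-{n}" for n in range(20)])
```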

Research firm Gartner predicts that by 2017, 50% of all firms will be applying hybrid cloud technology, using both cloud and on-premises solutions. Companies pay the cloud provider only for the compute resources they actually use, which gives them direct control over costs. And because bursting happens only on demand, companies have the flexibility to make it a permanent part of their modeling workflow or to reserve it for busy periods such as renewals or special projects.
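
The pay-for-what-you-use arithmetic is straightforward to sketch. Every figure below is an assumption chosen for illustration, not a quoted price.

```python
# Illustrative comparison of bursting costs vs. buying hardware outright.
cloud_rate_per_node_hour = 0.50      # assumed on-demand price, USD
nodes, hours_per_burst = 64, 12      # capacity needed during a renewal crunch
bursts_per_year = 8                  # e.g., quarterly renewals plus projects

annual_burst_cost = (cloud_rate_per_node_hour * nodes
                     * hours_per_burst * bursts_per_year)
on_prem_capex = nodes * 5_000        # assumed purchase cost per node

print(f"annual bursting spend: ${annual_burst_cost:,.0f}")
print(f"up-front hardware buy: ${on_prem_capex:,.0f}")
```

Under assumptions like these, occasional bursting costs a small fraction of owning equivalent peak capacity outright, which is exactly why the economics favor it when demand is spiky rather than constant.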

Ultimately, when it comes to the deployment of catastrophe modeling software, insurance and reinsurance executives are faced with important strategic and operational decisions. The ability to weigh the costs and benefits of emerging technologies -- and to dig beyond the hype -- is a required skill today.

In light of this, catastrophe modeling providers should offer a choice in deployment strategy, whether it’s a public or private cloud, an on-premises installation, a hybrid cloud solution, or an integrated part of the model user’s own internal systems. Modelers should be fully committed to explaining the choices and helping companies establish an environment that works best for them.


Peter Lewis is senior vice president of Technical Services at AIR Worldwide, a Verisk Analytics business.
