News & Commentary

Stuart Rose, SAS

Adopting an Insurance Data Model For a Single Version of the Truth

To combat the existing silo approach and to alleviate problems with data quality, a growing number of insurers are undertaking enterprisewide data management projects, and fundamental to these data initiatives should be an Insurance Data Model.

An explosion of new data is turning the insurance business model on its head. There is more data, and more access to data, than there has ever been -- and it's growing. The challenge for insurers is how to take advantage of all this data to price better, expand their markets and improve the business of underwriting risk and handling claims.

In the past, insurance companies relied on "old data" -- information from policy administration solutions, claims management applications and billing systems, often supplemented by third-party data such as census data, motor vehicle records (MVRs), medical reports and Dun & Bradstreet data, to name but a few sources. Today, however, insurance companies are incorporating "new data" such as credit scoring, social media content from Facebook and Twitter, telematics from in-car data recording devices, and geospatial information such as Google Maps. Unfortunately, many insurers are drowning in this vast amount of data and struggling to digest it for meaningful insight. Hence an insurance company's core asset is no longer data gathering; it is its ability to handle and analyze data.

However, many insurance carriers are struggling to organize their data in a way that supports better analytics and yields business intelligence. To combat the existing silo approach and to alleviate problems with data quality, a growing number of insurers are undertaking enterprisewide data management projects, and fundamental to these data initiatives should be an Insurance Data Model.

In its broadest definition, a data model serves as a single version of the truth for an enterprise data warehouse covering all key insurance subject areas. It ensures consistency not only in the data, but also in the terminology. For example, if you are measuring the productivity of a line of business or a retention rate, you should use the same metrics and processes to establish the measurement.
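As a minimal sketch of this idea (the function name and figures are hypothetical, not from the article), a metric such as retention rate can be defined once and shared, so that every line of business reports a comparable number:

```python
# Hypothetical sketch: one shared definition of "retention rate" that all
# departments call, instead of each team computing the metric its own way.

def retention_rate(policies_at_start: int, policies_renewed: int) -> float:
    """Retention rate = renewed policies / policies in force at period start."""
    if policies_at_start == 0:
        return 0.0
    return policies_renewed / policies_at_start

# Personal lines and commercial lines both use the same definition,
# so their results are directly comparable.
print(retention_rate(1200, 1050))  # 0.875
```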

The Insurance Data Model stores comprehensive, accurate, consolidated and historical information related to the insurance industry. It serves as the primary data store for all incoming data from various source systems -- for example, policy, claims, billing and agency management -- acting as a single version of the truth. It should provide a common language, with a logical and physical construction that helps business, IT and analytics experts map and extract data to support analytical model development and deployment.
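A highly simplified sketch of what such common subject areas might look like follows; the entity and field names here are illustrative assumptions only, and a real insurance data model covers far more entities and attributes:

```python
# Illustrative sketch: core insurance subject areas consolidated into a
# single data store. All names and fields are hypothetical.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Policy:
    policy_id: str
    line_of_business: str      # consistent terminology across source systems
    written_premium: float
    effective_date: date

@dataclass
class Claim:
    claim_id: str
    policy_id: str             # links back to the policy subject area
    paid_amount: float
    reserve: float

@dataclass
class DataWarehouse:
    """Single version of the truth: every source system loads into one model."""
    policies: dict = field(default_factory=dict)
    claims: dict = field(default_factory=dict)

    def load_policy(self, p: Policy) -> None:
        self.policies[p.policy_id] = p   # one consolidated record per policy

    def load_claim(self, c: Claim) -> None:
        self.claims[c.claim_id] = c

    def loss_ratio(self) -> float:
        """Incurred losses / written premium, computed from one data store."""
        premium = sum(p.written_premium for p in self.policies.values())
        losses = sum(c.paid_amount + c.reserve for c in self.claims.values())
        return losses / premium if premium else 0.0
```

Because policy and claims data land in one consolidated model, a measure such as loss ratio is computed from a single place rather than stitched together from silos.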

Although most insurance carriers claim that the way they do business is unique, due to regulations and general business practices the majority of insurance companies perform 80 percent of their activities the same way their competitors do. For example, all insurance companies process policy transactions, write and earn premiums, pay claims and manage reserves. The differentiator among insurers is in how they define lines of business and how they market and sell their products, which makes up the remaining 20 percent. A standard insurance data model is no different. It should be comprehensive enough to fit 80 percent of an insurance carrier's data requirements, but flexible and customizable enough to support the remaining 20 percent that are unique to that specific insurance company.
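One way to picture the 80/20 split (a sketch under assumed names, not a vendor's actual design) is a standard entity that carries the attributes common to all carriers, extended with carrier-specific attributes:

```python
# Hypothetical sketch of the 80/20 split: a standard Policy entity holds the
# attributes every carrier needs; a carrier extends it for its unique 20%.
from dataclasses import dataclass

@dataclass
class StandardPolicy:                         # the ~80% common to all carriers
    policy_id: str
    written_premium: float

@dataclass
class TelematicsAutoPolicy(StandardPolicy):   # one carrier's specific ~20%
    miles_driven: int = 0
    safe_driving_score: float = 0.0
```

The standard part of the model stays stable and shared, while each carrier's extensions live alongside it without breaking the common structure.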

The foundation of a successful analytics operation is quality data, superior data management and an insurance data model. There are many examples of data defects and inaccessible data resulting in increased operational costs, customer dissatisfaction and even missed revenue opportunities. Clearly there is a business case for creating a single, unified environment for integrating, sharing and centrally managing data for business analytics. Unleashing the full power of business analytics should be on the short list for every insurer, and the path to maximizing that investment is data. An overused quote says that "data is the lifeblood of an insurance company," but more often than not it's not about how much data you have; it's about how smart you are with the data. Data management and data quality are no longer optional components of an analytical environment -- they are essential, and fundamental to these initiatives is an Insurance Data Model.

About the Author: Stuart Rose is global insurance marketing manager at Cary, N.C.-based SAS. Rose, a 20-year veteran of the insurance industry, began his career as an actuary. He has worked for a global insurance carrier in both its life and property divisions and for several software vendors, where he was responsible for marketing, product management and application development. He has driven successful development and implementation of enterprise systems with insurance companies in the U.S., the U.K., South Africa and Continental Europe.
