
Critical Compliance Technologies for the Insurance Industry: Part I - The Compliance Challenges

The root of the problem is that many insurers rely on a decentralized IT infrastructure, with different systems for different lines of business, or a patchwork of systems built up over time as a result of mergers and acquisitions.

By Ravi Shankar, Siperian

To say that the insurance industry is tightly regulated would be an understatement. The recent introduction of the National Insurance Consumer Protection Act (NICPA) bill and the push for an optional federal charter suggest that many insurers believe a federal alternative to the existing state-based system of regulation would simplify compliance. However, it's possible that federal regulation would further complicate the insurance compliance model at the state level.

Today, the compliance challenges insurers face are uniquely complex: every state in the United States maintains its own distinct insurance regulations, and these laws and policies vary widely across different insurance lines of business. Many states regulate virtually every aspect of insurance company operations, including the amount of financial reserves a company must maintain, how it can market its products, and how much brokers and agents can charge for their services. Moreover, most states impose strict reporting requirements under which insurers must regularly document their compliance with various statutes. Depending on the policy type - life, home, auto, health coverage, etc. - reporting guidelines can require a confusing array of quarterly and annual reports, including audited and unaudited financial statements, actuarial opinions, claims data, evaluations of securities on deposit, and disclosures of material transactions.

The Interstate Insurance Compact, supported by the Interstate Insurance Product Regulation Commission (IIPRC), is an effort underway over the past few years to implement standard filing guidelines and reporting requirements across all the states. To date, 33 states have adopted the standards. If the effort succeeds, insurers will have an easier time filing their compliance reports, but they will still be on the hook for scores or hundreds of filings in the states where they do business - even if all 50 states sign on. The actual filing might be easier, but the hard work of compiling data and creating accurate reports remains unchanged. And that can be a very difficult job indeed.

Decentralized Data = Difficult Compliance Reporting

The root of the problem is that many insurers rely on a decentralized IT infrastructure, with different systems for different lines of business, or a patchwork of systems built up over time as a result of mergers and acquisitions. The typical IT approach within most large insurers has been to design a systems infrastructure centered around agent/broker needs or specific policy offerings, as opposed to creating a customer-centric or policy-centric approach. For instance, it's common for insurance firms to organize their IT infrastructure and data sources according to line of business - home, auto, commercial and surety - with each line maintaining its own system for claims, billing, online customer service and so forth. The benefit of this type of design is that insurers can develop and maintain a clear understanding of the hierarchy and history of their agent and broker networks. However, the drawback is that compliance officers have to create mandated reports by compiling data from multiple systems across the organization, rather than from a single source.

From a compliance reporting standpoint, this approach causes data quality issues because information often gets duplicated from system to system. A single customer who maintains both homeowners and auto policies with the company would have records residing in two separate systems. Similarly, investment account information might be housed in multiple data stores. This means when it's time to compile quarterly or annual reports for state-level insurance regulators, compliance managers may have a difficult time determining which systems contain the correct and up-to-date records. IT systems that are organized around separate lines of business complicate regulatory reporting, making it difficult to map operations, processes and information to specific reporting requirements. Under these circumstances, how can insurers best prepare themselves to ensure compliance with stricter regulations and to manage risk appropriately? Master data management (MDM).
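The duplication problem described above can be illustrated with a minimal sketch. The record layouts, field names, and the crude name-plus-date-of-birth matching rule below are all hypothetical, chosen only to show how the same customer can surface as two different records in two line-of-business systems; real MDM match engines use far more sophisticated probabilistic matching.

```python
# Hypothetical extracts from two line-of-business systems; the same
# customer holds a homeowners policy and an auto policy.
home_system = [
    {"name": "John Q. Smith", "dob": "1970-03-12", "addr": "12 Elm St"},
]
auto_system = [
    {"name": "Jon Smith", "dob": "1970-03-12", "addr": "12 Elm Street"},
]

def normalize(rec):
    """Reduce a record to comparable keys: DOB plus lowercased surname."""
    surname = rec["name"].split()[-1].lower()
    return (rec["dob"], surname)

def find_duplicates(sys_a, sys_b):
    """Flag record pairs across systems that likely describe one customer."""
    index = {normalize(r): r for r in sys_a}
    return [(index[normalize(r)], r) for r in sys_b if normalize(r) in index]

pairs = find_duplicates(home_system, auto_system)
for home_rec, auto_rec in pairs:
    print(f"Possible duplicate: {home_rec['name']!r} <-> {auto_rec['name']!r}")
```

Even this toy example shows why a compliance officer compiling a regulator's report cannot simply concatenate extracts from each system: without a matching step, "John Q. Smith" and "Jon Smith" would be counted as two customers.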

Using the right MDM platform, agents, brokers and carriers can create the most reliable master reference data, review complex account relationships and hierarchies and obtain real-time, unified views of clients, producers, employees, agents, policies, claims and risk exposures.
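At the heart of that "master reference data" idea is record consolidation: merging the conflicting copies of a customer into one golden record. The sketch below is an assumption-laden illustration, not Siperian's implementation; it uses a simple survivorship rule (newest non-null value wins per field) to show the general technique.

```python
from datetime import date

# Hypothetical source records for one customer. The "updated" timestamp
# and the newest-non-null survivorship rule are illustrative only.
sources = [
    {"system": "home", "updated": date(2008, 6, 1),
     "addr": "12 Elm St", "phone": "555-0100"},
    {"system": "auto", "updated": date(2009, 2, 15),
     "addr": "98 Oak Ave", "phone": None},
]

def build_golden_record(records, fields):
    """Merge field by field: take the newest non-null value across sources."""
    golden = {}
    for field in fields:
        candidates = [r for r in records if r.get(field) is not None]
        if candidates:
            newest = max(candidates, key=lambda r: r["updated"])
            golden[field] = newest[field]
    return golden

master = build_golden_record(sources, ["addr", "phone"])
print(master)  # newest address wins; phone survives from the older record
```

The point of the per-field rule is that the "best" value for each attribute may live in a different system: here the auto system has the current address, while only the home system ever captured a phone number.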

In Part II of Critical Compliance Technologies for the Insurance Industry, subtitled Master Data Management to the Rescue, the author will address the importance of selecting the right MDM technology and share ten key requirements for choosing the most effective MDM solution. By taking the time to build the right MDM foundation, insurers can increase revenue per client through cross-sell and up-sell, automate agency and broker processes, improve the customer experience, increase revenue through producer management, streamline client onboarding and claims processing, and improve corporate governance through enterprise risk management. Part II will appear tomorrow.

About the Author: Ravi Shankar is senior director of product marketing at Siperian, Inc., a provider of a flexible master data management platform.
