From Automation to Precision: Closing the Productivity Gap

Properly understood, legacy system replacement is both more challenging and more valuable than mere process automation; it should ask more penetrating questions about how to improve underwriting, claims, and customer/producer satisfaction outcomes.

By Marcus Ryu, Guidewire

Observers of the insurance industry rightly identify inefficiency as one of its greatest challenges. Complex, stove-piped organizations replete with manual processes call out for the benefits of automation. They also invite comparisons to the manufacturing industry and advocacy for a more "industrial" approach to insurance processing. No one would dispute that the insurance industry could gain greater benefit from automation. However, it is worth asking how much insurers can ultimately gain through greater efficiency, and what precisely they can learn from manufacturing.

The differences between insurance business processes and manufacturing are stark: instead of material inputs moving in a linear supply chain and accumulating in inventory, insurance professionals move information in multiple directions and interact with policyholders. Insurance carriers have neither inventory nor a supply chain, and yet there is a deep similarity: high-volume, repetitive, time-sensitive tasks completed by organizations of hundreds or thousands of people with similar job roles.

This similarity is intriguing because it suggests that the insurance industry can perhaps learn from the efficiency achievements of manufacturing industries, which on average have increased their productivity (output per work-hour) by over 500% in the last 50 years. In contrast, the insurance industry's labor productivity has increased by only about 50%. Had an average insurer kept its number of employees constant over those 50 years, it would have been able to grow total premiums in real dollars by a factor of less than 1.5.
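The arithmetic behind that comparison is simple: with constant headcount, output capacity scales directly with productivity, so a percentage productivity gain translates into an output multiple. A minimal sketch, using only the illustrative figures from the article:

```python
# Illustrative arithmetic only; the 500% and 50% figures are the
# approximate 50-year gains cited in the article.

def output_multiplier(productivity_increase_pct: float) -> float:
    """Convert a percentage productivity increase into an output multiple,
    assuming headcount (work-hours) stays constant."""
    return 1 + productivity_increase_pct / 100

manufacturing = output_multiplier(500)  # ~500% gain -> 6.0x output
insurance = output_multiplier(50)       # ~50% gain  -> 1.5x output

print(manufacturing)  # 6.0
print(insurance)      # 1.5
```

In other words, the same workforce that could write 6× the volume in manufacturing terms can write only 1.5× the real premium in insurance terms, which is the gap the rest of the article seeks to explain.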

Prior to WWII, automation was the primary driver of productivity gains in manufacturing. Manufacturers emphasized standardization and role specialization, exemplified by the mass-production assembly line, driving volume-oriented metrics such as total output, output per worker, total cycle time, and total cost of production. In the last several decades, however, a different kind of theory has governed productivity improvements in manufacturing. Under the rubric of "lean," "just-in-time," and "total quality management," manufacturers have shifted from an emphasis on pure automation to a precision orientation. This shift involved dramatic increases in planning, coordination, and process optimization, and a corresponding shift in metrics: to inventory turnover, waste levels, internal stage cycle times, and quality thresholds.

The key point here is that automation, a perennial focus of insurers, and productivity are not synonymous. You can pump water harder, but if you have a leaky pipe, you will get better returns from fixing that pipe. Likewise, reducing loss adjustment or underwriting expense is a valuable goal, but it is much more valuable to minimize indemnity leakage and make good underwriting decisions that control the loss ratio.

Achieving those goals requires systematic improvements that go beyond automation to precision-oriented metrics. Unfortunately for insurance, legacy core systems have stymied this advance because of their inherent limitations. Legacy core systems do not capture nearly enough structured, high-quality data; they cannot distribute work to specialists at key points in the underwriting or claims process; they cannot support the complex product attributes and variations required by the market; and they cannot engage the full spectrum of available data and business logic in underwriting and claims decisions. In short, they do not provide the necessary "shop floor" for precision-oriented insurance work.

In planning technology-based improvements, insurers have tended to focus their system replacement projects primarily on basic processing tasks: data entry and data management, clerical file management, answering queries, and writing standard correspondence. Only more recently have they focused on tasks requiring more human skill: interaction tasks like issue resolution, high-touch customer service, negotiation, and vendor or broker performance management. Still, few carriers have targeted underwriting and claims judgment, liability judgment, or medical evaluation.

The potential tragedy here is trying to justify the cost and effort of legacy system replacement on the slender foundation of "time savings." Properly understood, legacy system replacement is both more challenging and more valuable than mere process automation; it should ask more penetrating questions about how to improve underwriting, claims, and customer/producer satisfaction outcomes.

Freed from legacy system limitations and focused beyond automation, insurers can experiment with different processes and business logic. They can scrutinize which activities really drive better decision-making. They can observe business use of the system and streamline the interface to avoid wasted motion. And perhaps most important, they can initiate a continuous process of self-improvement.

About the Author: Marcus Ryu is Vice President, Strategy and Products at Guidewire Software, a provider of core systems for the property/casualty market.
