Insurers have long seen data as a source of competitive advantage. But data alone is worthless -- it's the insights derived from the data that matter, says Kimberly Holmes, SVP of strategic analytics with XL Group (Bermuda). And with the emergence of big data, she notes, the possibilities for deriving insights are expanding dramatically.
Yet, for those insights to have an impact on the business, they have to have the attention of senior underwriters. "The data analytics mean nothing without the decision makers embracing it," Holmes insists. "We see a lot of our competitors create models that don't have an impact because the underwriters don't use them."
To foster a more collaborative approach to the analysis of large volumes of data, XL ($45.1 billion in total assets) is currently implementing Cary, N.C.-based SAS's Visual Analytics technology. Holmes describes the solution as a powerful communication tool for creating a partnership of exploration between the analytics team and underwriters. She says the tool will bring the expertise of the decision makers and other stakeholders more deeply into the analytics process by demonstrating the meaning of data more readily and inspiring further exploration and insight.
"It's really a demonstration of the expression that 'every picture is worth a thousand words,'" Holmes adds. "The key to getting people to embrace new insights and change how they make decisions is that they believe in their gut that this insight is true."
Rewriting the Rules of the Game
The world of insurance is changing at an exponential rate as volumes of available data rapidly expand and sources of data proliferate, Holmes asserts. As a result, roles within the insurance enterprise will change, along with the terms of competition. "Commercial insurance will become more efficient by creating more automation in decision making and how we access our customers," Holmes predicts. "Those changes will happen more rapidly in smaller-account business, but we need the right technology and data to take advantage of that."
Holmes characterizes XL as one of the few carriers in the commercial insurance domain to act on this vision. "We expect investments such as SAS Visual Analytics to create enormous competitive advantage and shareholder value for XL," she relates.
But few insurers are at the point where they are ready to talk openly about their big data-related initiatives, acknowledges Benjamin Moreland, a Hartford-based senior analyst with Celent. The world of big data constitutes a paradigm shift for carriers, many of which continue to struggle with issues in their traditional transactional data, he notes.
"Carriers continue to have trust issues with internal data," Moreland reports. "Many insurers are not used to using data for operational status and decision support because of their skepticism. Also, business-line-specific data orientation has resulted in inconsistencies in reports, leaving C-level officers to ask, 'Which report should I believe?'"
Insurers that can take advantage of large amounts and types of data early on will be able to do better on pricing and customer segmentation, Moreland says. Their challenge will be driving data into the decision-making process. "Senior leadership often makes decisions on anecdotal evidence," he notes. "Their instincts may be strong, but they have to determine the worth of those instincts based on whether the data supports it."
Big data isn't just a matter of the volume and source of data, but also the speed at which it is processed, Moreland emphasizes. In the past carriers could crunch numbers over time, distribute reports and then make decisions; today many more decisions need to be made in or near real time. Whereas traditionally underwriters may have reviewed overexposure in a given area in hindsight, Moreland observes, "The task for IT today is to bring opportunities to underwriters and other decision makers to support decisioning as events are happening." Effective handling of big data, he suggests, will also enable an increasing range of automated underwriting decisions in near real time, such as preventing the writing of new business in an area with the potential to be struck by a developing weather event.
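The kind of automated, event-aware underwriting gate Moreland describes can be illustrated with a minimal sketch. All class names, thresholds, and the bounding-box geometry below are hypothetical; a production system would draw on live catastrophe feeds and proper geospatial tooling rather than a simple rectangle check:

```python
from dataclasses import dataclass

@dataclass
class WeatherEvent:
    """Rough footprint of a developing event as a lat/lon bounding box."""
    name: str
    min_lat: float
    max_lat: float
    min_lon: float
    max_lon: float

@dataclass
class Submission:
    """A new-business submission with the insured location's coordinates."""
    policy_id: str
    lat: float
    lon: float

def in_event_area(sub: Submission, event: WeatherEvent) -> bool:
    # Treat the event footprint as a rectangle; real systems would use
    # polygon geometry from a weather-service feed.
    return (event.min_lat <= sub.lat <= event.max_lat
            and event.min_lon <= sub.lon <= event.max_lon)

def triage(sub: Submission, active_events: list[WeatherEvent]) -> str:
    """Hold any submission whose risk sits inside an active event footprint;
    everything else proceeds to normal underwriting."""
    if any(in_event_area(sub, e) for e in active_events):
        return "hold"
    return "proceed"

# Example: a hypothetical storm developing over the Gulf coast.
gulf_storm = WeatherEvent("Gulf storm", 25.0, 31.0, -98.0, -88.0)
print(triage(Submission("P-1001", 29.76, -95.37), [gulf_storm]))   # Houston
print(triage(Submission("P-1002", 39.74, -104.99), [gulf_storm]))  # Denver
```

The design point is that the gate runs at submission time, as the event is unfolding, rather than as a hindsight exposure report distributed after the fact.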
Company size will be a factor in how -- and how quickly -- a given insurer will adopt big data-related capabilities for underwriting and other purposes, implies Martina Conlon, principal, Novarica (New York). Larger insurers, she says, have made far more progress in the use of large volumes of data -- for example, telematics, geo-spatial data, mobile information, social media data, automated information from weather services and even click streams of visitors to their websites.
"Very large carriers are leveraging big data and operationalizing automated analytics in their business processes," Conlon relates. "Below the top tier, most are dealing with more basic issues, such as implementing solid core systems and trying to establish a baseline business intelligence infrastructure for an integrated view of their data, as well as trying to marry-in structured external data."
The greatest big data barrier for small carriers is cost of entry, as both initial costs and maintenance are high, according to Conlon. Second-tier and smaller carriers also struggle to find the right talent to adopt big data capabilities, she adds. "Lower-tier carriers don't have the resources to determine whether such initiatives are worth it," Conlon explains. "Bigger firms can afford to invest in the analysis to make the business case."
Vendors will help smaller carriers punch above their R&D budget weight, suggests Conlon's Novarica colleague Greg Wittenbrook. Vendor products will begin including big data-related functionality or enabling capabilities, Wittenbrook says, and more data providers will emerge.
Anthony O'Donnell has covered technology in the insurance industry since 2000, when he joined the editorial staff of Insurance & Technology. As an editor and reporter for I&T and the InformationWeek Financial Services of TechWeb he has written on all areas of information ...