Feeding the Beast: Three Data Value Considerations
David Lukens | November 24, 2014
Insurance carriers have traditionally used predictive models for pricing, relying on credit-based insurance scores and generalized linear model (GLM)-based pricing plans to determine rates. Now that sophistication has expanded into virtually all aspects of the business, from creating underwriting and claims workflows and developing marketing programs to establishing pay-plan options and even deciding which underwriting reports to buy.
These models are ravenous for data. The more data we feed them, the better the results, so carriers are looking outside their walls for more sources of data to fuel their models. At the same time, expense pressure is pushing them to cut costs wherever possible.
For the carrier, these needs are in opposition. How much data can you afford to buy (or not buy), and, more importantly, how can you best make that decision? Let's examine the factors to consider in data purchasing decisions and some best practices for building the cost-benefit analyses (CBAs) that drive them.
Data cost is usually obvious: there is a price either for access to the data (subscription-based) or per data item purchased (transactional). Data value is harder to pin down, and has three components. First is completeness, generally measured as hit rate: for a given inquiry, how often will I get information back? Second is accuracy: when I do get a hit, how certain am I that it is correct? This is generally measured in terms of false positives (I received a result, but it was not for this inquiry subject) and false negatives (I show no result for this inquiry subject, but there actually was one). Third is compliance: how certain am I that the data I am purchasing meets the compliance requirements for my use? Is it governed by the FCRA, the DPPA, or another regulation?
Considering Data Completeness
A hit on a given data requirement clearly has value; if it didn't, we wouldn't be using the data in our models or pricing. When evaluating a data source, hit rate is one of the key drivers of the buying decision, and the critical metric to have in mind is: what is the value of a hit?
Having a predetermined dollar figure for the value of a hit allows you to make an informed decision about whether a more expensive data source with a better hit rate provides a better return than a less expensive alternative with a lower hit rate. A corollary to this metric is the cost of a no-hit, which is not necessarily the same for every data application. For a credit-based insurance score, it is the cost of missed segmentation in pricing a risk; for an accident history or motor vehicle record (MVR), a no-hit could mean hundreds of dollars in lost premium. Establishing these metrics up front allows a robust CBA when comparing alternative data sources.
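To make this concrete, here is a minimal sketch in Python of how a per-inquiry expected-value comparison between two sources might look. All prices, hit rates, and hit values below are hypothetical illustrations, not figures from any real data source:

```python
def expected_net_value(price, hit_rate, hit_value):
    """Expected net value per inquiry: the probability-weighted
    value of a hit minus the per-transaction price."""
    return hit_rate * hit_value - price

# Hypothetical numbers: source A is pricier but hits more often.
source_a = expected_net_value(price=2.00, hit_rate=0.90, hit_value=4.00)
source_b = expected_net_value(price=1.25, hit_rate=0.70, hit_value=4.00)

# Despite costing 60 percent more per transaction, source A wins here:
# 0.90 * 4.00 - 2.00 = 1.60  versus  0.70 * 4.00 - 1.25 = 1.55
```

The point is not the specific numbers but that a predetermined dollar value for a hit turns "which source is the better buy?" into simple arithmetic.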
Considering Data Accuracy
Data accuracy is the second component in assessing data quality, usually measured in terms of false positives or false negatives (also called false clears). False positives can be very costly in terms of customer relations, and they typically arise in driver discovery-type solutions during the underwriting process. The real cost of driver discovery is the time and effort it takes to follow up on potential leads, rather than the data cost itself. A false positive creates unnecessary work for underwriters or rate pursuit teams and can ultimately upset potential customers.
In an accident history situation, a false positive can create artificial rate increases that result in lost risks (e.g. if the customer buys elsewhere), or a poor customer experience if the policyholder has to follow up to correct the error. False positives are especially important to consider when balancing with hit rate. A high hit rate is generally preferable to a low hit rate, but not if it creates a lot of false positives. Establishing the cost associated with a false positive result allows you to include this in your CBA as well.
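Extending the expected-value idea, here is a hypothetical sketch of how a false-positive cost can flip the comparison between a high-hit-rate source and a cleaner, lower-hit-rate one. Every figure is invented for illustration:

```python
def net_value(price, hit_rate, fp_rate, hit_value, fp_cost):
    """Expected net value per inquiry, where a fraction fp_rate of
    hits are false positives that cost money to chase down."""
    true_hits = hit_rate * (1 - fp_rate)
    false_hits = hit_rate * fp_rate
    return true_hits * hit_value - false_hits * fp_cost - price

# A noisy source with a 95 percent hit rate but 20 percent false positives...
noisy = net_value(price=1.50, hit_rate=0.95, fp_rate=0.20,
                  hit_value=4.00, fp_cost=8.00)
# ...versus a cleaner source with a 75 percent hit rate and 2 percent
# false positives, at the same price per transaction.
clean = net_value(price=1.50, hit_rate=0.75, fp_rate=0.02,
                  hit_value=4.00, fp_cost=8.00)
# The cleaner source comes out well ahead: about 1.32 versus 0.02 per inquiry.
```

The higher hit rate loses once the follow-up cost of chasing bad leads is priced in, which is exactly why the false-positive cost belongs in the CBA.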
False clears, or false negatives, are particularly difficult to assess. As carriers look to decrease expense loads, they consider ways to avoid ordering expensive underwriting reports, such as MVRs, whenever possible. The tradeoff is that any predictive model used to decide whether to order an MVR (or another report) will not be right 100 percent of the time: you will miss violations.
In assessing whether such a model is worthwhile, return to the cost of a false clear. Typically this is the rate load that a chargeable violation would have added to the premium charged, not usually a trivial number. False clears can be insidious because it is tempting to simply turn on a model that reduces orders and start realizing expense savings. This can be misleading: once you stop ordering the reports, you no longer see the violations you missed, so you won't really know how well your model is performing.
As a best practice in testing these kinds of predictors, establish the cost of a no-hit or value of a hit (in this scenario they are the same) and continue ordering reports on 100 percent of risks for some period of time while you run the predictive model in parallel. Then you can determine whether the expense savings outweigh the potential premium loss.
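That test can be sketched as a simple back-test: keep ordering reports on everything during the trial period, record what the model would have done, and compare the expense saved on suppressed orders against the premium lost on false clears. The report cost, rate load, and trial counts below are all hypothetical:

```python
def model_net_benefit(records, report_cost, violation_rate_load):
    """records: (model_would_skip, violation_found) pairs, gathered
    while still ordering reports on 100 percent of risks."""
    skipped = sum(1 for skip, _ in records if skip)
    false_clears = sum(1 for skip, violation in records if skip and violation)
    savings = skipped * report_cost
    premium_loss = false_clears * violation_rate_load
    return savings - premium_loss

# 1,000 risks in the trial: the model would have skipped 600 orders,
# 15 of which actually carried a chargeable violation.
trial = ([(True, False)] * 585 + [(True, True)] * 15 +
         [(False, True)] * 50 + [(False, False)] * 350)
net = model_net_benefit(trial, report_cost=7.00, violation_rate_load=200.00)
# Savings of $4,200 against $3,000 in lost premium: the model pays off here.
```

Only because reports were still ordered on 100 percent of risks during the trial are the false clears even observable; turn the model on blind and the premium-loss side of this calculation disappears from view.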
Considering Compliance
With the CFPB's arrival, it is an opportune time to readdress compliance. Data sources we have historically used, such as credit-based insurance scores, claims histories, and MVRs, have well-established compliance rules that should not present any surprises. The past couple of years, however, have introduced many new data sources, such as court-based activity and event-driven triggers, and the compliance specifics surrounding them are not as well established.
Make sure you thoroughly review the compliance concerns around not only the sources of the data, but also how you use it. A new, non-FCRA data source can still be subject to FCRA compliance limitations depending on the actions you take with the data. Because you and your data vendors share responsibility for abiding by these rules, make sure you are comfortable that each vendor understands the rules around the data it sells, has appropriate consumer disclosure processes in place to handle calls, and has guidelines in place covering the specific use cases for the data it provides.
If you feed the beast, you can get better results. But remember the cost balance: are you paying less or more at the end of the day just to keep it going? Weighing all three dimensions of data value (completeness, accuracy, and compliance) will make your CBA, and the purchasing decisions it drives, far stronger.
(David Lukens is director, insurance, telematics, for the risk solutions business of LexisNexis. He is responsible for telematics and mobile solutions for the auto insurance market. Since joining LexisNexis in 2010, Lukens has also led several key data and analytics initiatives, including building out solutions for identity risk management, driver discovery and policyholder retention.)