Predictive Models: Right Place + Right Time = Better Decisions
JJ Jagannathan | April 06, 2015
Predictive modeling is a powerful tool that can help make better business decisions in every segment of the P&C insurance value chain. Daily, we come across all kinds of industry statistics on how P&C carriers plan to increase their investment in predictive analytics and where they plan to make their investments.
The ‘how we do it’ is just as important as the ‘what we do.’ How efficiently a predictive analytics project is run matters more than being able to say, “Yes, we do predictive analytics because it is a recommended industry best practice.” When I stepped back to think about how we run predictive modeling initiatives today, I realized there is plenty we can do to reduce total cost of ownership and improve the impact on business decisions. Over the last several years, I have watched multiple P&C insurance predictive analytics projects suffer from the following efficiency gaps:
Time to Get Modeling Data
The time it takes for the modeling teams to pull policy, claims, billing, and external data from the core systems and data warehouses to build the model far exceeds the time that the teams actually spend on data preparation and model development work. The inefficiencies around pulling partial data, wrong data, and historical data from multiple systems with completely different data formats can impact the quality of the model and the time to create the initial model.
Most teams spend at least two to three months extracting the raw transaction data. Any delays in extracting data force teams to spend very little time on the more important areas of validating target data availability, understanding sample bias, and identifying good data surrogates.
Time to Deploy the Initial Model
After the teams overcome the data challenge, and even if they are quick to develop a version 1 candidate model ready for deployment, the next big delay is the long wait for IT resources. In most situations, the IT team’s priorities are filled 18 to 24 months in advance, leaving you with the following options: just wait; demonstrate a significant ROI multiple and hope for some luck; or have the business leader with the biggest title and influence champion your project up the IT prioritization queue.
You might have the most sophisticated algorithm, but if you cannot operationalize your model quickly, it delivers no benefit to the organization. Division leaders who want to move quickly to transform their business operations with analytics struggle with this IT resource scarcity. Even after teams get the necessary IT resources aligned, it typically takes about four months to deploy the initial model.
Whenever predictive models are deployed in a model-hosting application silo that is completely disconnected from the core systems, the models cannot take full advantage of the real-time data stream or the opportunity to embed analytics directly into the business workflow to drive better decisions. Note that carriers on legacy core systems are often forced into analytics application silos because their core systems don’t provide the necessary support for predictive modeling initiatives.
The modeling teams usually develop multiple models and test them against a holdout dataset to select the one model that performs best and is easiest to implement. Because of all the process constraints around model deployment, the teams are forced to choose just one model for the initial rollout. This approach of banking on one model for the initial rollout limits the probability of success.
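The holdout comparison described above can be sketched in a few lines. This is an illustration only, not the author's actual process: the model names, features, and the choice of accuracy as the metric are all hypothetical, and real teams would use their own scoring logic (lift, Gini, loss ratio relativity, and so on).

```python
# Sketch: score several candidate "models" on one holdout set and pick
# the best performer. Everything here (model names, features, data,
# accuracy as the metric) is a hypothetical illustration.

def accuracy(model, holdout):
    """Fraction of holdout records the model classifies correctly."""
    hits = sum(1 for features, label in holdout if model(features) == label)
    return hits / len(holdout)

# Three toy candidates: each maps a feature dict to a 0/1 prediction.
candidates = {
    "glm_v1":  lambda f: 1 if f["claim_count"] > 2 else 0,
    "tree_v1": lambda f: 1 if f["claim_count"] > 2 or f["late_payments"] > 3 else 0,
    "rules":   lambda f: 1 if f["late_payments"] > 5 else 0,
}

# Toy holdout set: (features, actual outcome) pairs.
holdout = [
    ({"claim_count": 3, "late_payments": 0}, 1),
    ({"claim_count": 0, "late_payments": 4}, 1),
    ({"claim_count": 1, "late_payments": 1}, 0),
    ({"claim_count": 4, "late_payments": 6}, 1),
    ({"claim_count": 0, "late_payments": 0}, 0),
]

scores = {name: accuracy(model, holdout) for name, model in candidates.items()}
best = max(scores, key=scores.get)
```

The point of the sketch is that the comparison itself is cheap; it is the deployment constraints, not the analytics, that force teams to bet on a single model.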
You can find a number of stray models within the organization that never get deployed and integrated into the regular business workflow, but are technically considered to be in ‘production’ and are run daily from an expert’s personal desktop. This behavior is primarily driven by the process hassles around model deployment and internal system limitations.
Time to Enhance the Model
After the version 1 model has been in production for some time, and based on its performance in the field, the project teams will decide to enhance it: adding external datasets, adjusting cut points, overlaying new business rules, or adjusting model coefficients. The time it takes to pull the gap data (the data needed to refresh or rebuild the model), especially when it must come from a completely different transaction system than the one used to develop the version 1 model, can hurt model quality and cause delays.
Next, the project teams have to go through one more round of delays to get IT resources before deploying the enhanced version 2 model. Typically the model refresh, enhancement, and redeployment efforts can take about six to eight months to complete, but there are plenty of carriers that choose to do this once a year.
Time to Learn Model Performance Results
In most cases, the business leadership and modeling teams don’t have a view into the model’s field performance for several months, and in some cases up to a year (and I am not talking about rating models here). Modeling teams try to share as much information as possible with the business teams, but they often get stuck waiting for access to the field performance data. Rolling out just one model during the initial deployment and then waiting for answers extends the learning curve.
Moving from Predictive Analytics 1.0 to Predictive Analytics 2.0
I believe we have to move beyond the Predictive Analytics 1.0 phase of getting started in this transformative analytics journey and need next-generation capabilities to overcome the current efficiency gaps. Compressing the time-to-action on the four areas of a predictive modeling project that were outlined above is critical to drive success in terms of lowering the total cost of delivering a predictive analytics project and the ability to make better business decisions.
Here are some options to consider:
- Look at secure and reliable public or private cloud-hosted analytics platforms that can be integrated into your core systems and data warehouses to overcome data extraction and model deployment delays. An instant-on predictive analytics capability is required to effectively compete in the market and your IT department leaders will be able to help you select the right solution architecture.
- Deploy the predictive models closely integrated with your core systems to take advantage of the real-time data stream in the core systems and make your analytics more actionable. The ability to have the right data at the right time is critical and embedding predictive analytics models within the core systems can get you closer to the optimized state.
- Eliminate stray models run from an expert’s personal desktop for production usage. These are single points of failure, and it is critical that these models get integrated into the mainstream business workflow. Wrong business decisions driven by human error in these suboptimal processes can cause financial losses and an unpleasant customer experience.
- Create a data-driven testing culture and the ability to vet multiple models at the same time in limited pilot rollouts to identify the best model for wider production usage. We can take some cues from how some of the world-class e-commerce organizations are taking advantage of A/B testing and multivariate testing to quickly run experiments on customer call-to-action response to decide the best course of action to maximize results. The mindset of rapid iteration and learning is key.
- Democratize the information on model performance results so all the stakeholders have a full view of the predictive analytics program’s performance. Failing fast and learning quickly is what we need here. Not knowing whether or not your model is working for the next several months and hoping for the best outcome is not the way to run a predictive analytics project. It is much easier to track model performance results when it is integrated with the core systems workflow. Monthly dashboards that track model performance metrics can be extremely helpful.
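A monthly dashboard of the kind suggested above only needs scored predictions joined to actual outcomes. The sketch below rolls such records up into a monthly hit rate; the field names and the choice of hit rate as the metric are hypothetical, standing in for whatever metric the program actually tracks.

```python
from collections import defaultdict

# Sketch: roll (month, predicted, actual) records up into a monthly
# hit-rate table, the kind of metric a monthly model-performance
# dashboard might display. Field names and data are hypothetical.

def monthly_hit_rate(records):
    """records: iterable of (month, predicted, actual) tuples."""
    totals = defaultdict(lambda: [0, 0])  # month -> [hits, count]
    for month, predicted, actual in records:
        totals[month][0] += int(predicted == actual)
        totals[month][1] += 1
    return {month: hits / count for month, (hits, count) in totals.items()}

records = [
    ("2015-01", 1, 1), ("2015-01", 0, 1), ("2015-01", 1, 1), ("2015-01", 0, 0),
    ("2015-02", 1, 0), ("2015-02", 0, 0),
]
rates = monthly_hit_rate(records)
```

The hard part, as the article notes, is not the computation but getting timely access to the field outcome data; deployment inside the core systems workflow is what makes that join cheap.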
- Include the right external datasets in your version 1 modeling dataset; don’t rely solely on internal data and push external data evaluation to a later phase. The right external datasets can boost your version 1 model’s performance, compress the development timeline, and help create a competitive differentiator. There are plenty of new data streams, from smart home sensors to social media exhaust, that have demonstrated predictive signal and are already flowing into the carrier’s business workflow. It is critical to have the right data management framework in place to support quicker adoption of these newer datasets in your modeling initiatives.
By no means is this a comprehensive list of recommendations, but it is a start. Fixing efficiency gaps in predictive analytics projects is about getting a solid foundation in place before we go after bigger business problems and more powerful analytic techniques. Having the right data and the right model in the right place is key to making better business decisions. I invite readers to share what has worked well for you in the past and what we should be doing as an industry.
JJ Jagannathan is senior director of product management at Guidewire Software. He is responsible for product strategy for Guidewire Live, a cloud-hosted platform of instant-on P&C analytics apps. He can be reached via email at email@example.com