Accelerating the claims response to the LA wildfires
Drawing on data from multiple sources was instrumental in enabling insurers to provide timely support to affected policyholders and mitigate financial exposure
When wildfires erupted across Los Angeles on January 7, few could have predicted the catastrophic scale of destruction that followed.
Within two weeks, the fires were classified as among the most severe natural disasters in US history, with damage and economic losses estimated by some at more than $250bn.
Fuelled by hurricane-force winds and arid conditions, the Palisades, Eaton and Hurst fires also claimed more than 25 lives as they left a trail of devastation across the region.
Amid the chaos, insurers faced an immense challenge in responding swiftly to policyholder claims while ensuring accurate damage assessments. Meeting that challenge was made possible by multi-source data strategies, which enabled insurers to gain crucial insights in real time and allocate resources efficiently.
Wildfires present unique challenges for data collection and analysis. Their dynamic nature, combined with thick smoke plumes and rapidly evolving conditions, often hinders traditional assessment methods.
However, by employing multi-source data – including satellite imagery, aerial reconnaissance and ground reports – insurers were able to overcome these obstacles and accelerate their claims response.
In the first 24 hours of the fires, geospatial intelligence specialists integrated data from emergency response agencies, satellite sensors and open-source reporting. This resulted in an “exposure layer”, mapping fire perimeters and identifying active hotspots. With this intelligence, insurers gained immediate situational awareness, allowing them to anticipate claims volumes and strategically deploy claims adjusters where possible.
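By way of illustration, the sketch below shows one way such an exposure layer might be assembled, assuming the open-source geopandas library and hypothetical file and column names; the actual pipeline used by the specialists is not public.

```python
# Illustrative sketch only: building a simple "exposure layer" by joining
# fire perimeters and hotspots against insured property locations.
# File names and column names are hypothetical.
import geopandas as gpd

# Fire perimeters (polygons) from emergency response agencies,
# active hotspots (points) from satellite sensors.
perimeters = gpd.read_file("fire_perimeters.geojson")
hotspots = gpd.read_file("hotspots.geojson")

# Insured properties as point geometries with policy identifiers.
properties = gpd.read_file("insured_properties.geojson")

# Flag properties that fall inside any current fire perimeter.
hits = gpd.sjoin(properties, perimeters[["geometry"]], predicate="within")
properties["in_perimeter"] = properties.index.isin(hits.index)

# Flag properties within 1 km of an active hotspot. Distances need a
# projected CRS in metres; EPSG:3310 (California Albers) covers LA.
props_m = properties.to_crs(epsg=3310)
hot_m = hotspots.to_crs(epsg=3310)
near = gpd.sjoin_nearest(props_m, hot_m[["geometry"]], max_distance=1000)
properties["near_hotspot"] = properties.index.isin(near.index)

# The resulting layer lets claims teams anticipate likely claims volumes.
exposed = properties[properties.in_perimeter | properties.near_hotspot]
print(f"{len(exposed)} policies potentially exposed")
```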
Overcoming data challenges
Speed and accuracy are both critical in catastrophe response, but the Los Angeles wildfires highlighted the importance of balancing these factors. Thick smoke plumes initially obscured high-resolution optical satellite imagery, a key component in assessing building-level damage.
To address this, some intelligence analysts turned to synthetic aperture radar (SAR) on behalf of their insurer clients; the technology can penetrate smoke and provide valuable insight into fire damage. However, SAR’s limited ability to resolve fine detail necessitated a hybrid approach, combining SAR with optical imagery and human intelligence expertise once conditions improved.
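The sketch below illustrates how such a hybrid might work in principle, assuming per-building SAR backscatter values and an optical smoke mask are already co-registered; the thresholds and the fusion rule are hypothetical, not McKenzie Intelligence Services’ actual method.

```python
# Simplified fusion of SAR change detection with optical assessment.
# Values are per building footprint; the threshold is hypothetical and
# would be tuned by analysts in practice.
import numpy as np

def fuse_damage_signals(sar_pre, sar_post, optical_damage, smoke_mask,
                        sar_threshold=3.0):
    """Return per-building damage flags plus an analyst-review flag.

    sar_pre, sar_post : mean SAR backscatter per building (dB)
    optical_damage    : bool, damage visible in optical imagery
    smoke_mask        : bool, True where smoke obscures the optical view
    """
    # SAR change detection: a large drop in backscatter after the fire
    # is a coarse indicator of structural loss.
    sar_change = (sar_pre - sar_post) > sar_threshold

    # Where the optical view is clear, trust the optical classification;
    # where smoke obscures it, fall back to the SAR signal.
    damage = np.where(smoke_mask, sar_change, optical_damage)

    # Smoke-obscured buildings and any disagreement between the two
    # sensors are routed to human analysts for review.
    needs_review = smoke_mask | (optical_damage != sar_change)
    return damage.astype(bool), needs_review

# Example: four buildings, the last two obscured by smoke.
pre = np.array([-8.0, -7.5, -9.0, -8.2])
post = np.array([-14.0, -8.0, -13.5, -8.5])
optical = np.array([True, False, False, False])
smoke = np.array([False, False, True, True])
print(fuse_damage_signals(pre, post, optical, smoke))
```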
By January 10, clearer 40 cm optical satellite imagery became available, enabling the release of the first “claims layer” within 72 hours. This layer provided insurers with a broad overview of damage across the affected areas, guiding their claims triage process and resource allocation.
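As a rough sketch of how a claims layer can drive triage, the example below buckets claims by assessed severity and aggregates them by area so adjusters are routed to the worst-hit neighbourhoods first; the severity categories and field names are hypothetical.

```python
# Illustrative triage against a hypothetical claims layer.
import pandas as pd

SEVERITY_ORDER = {"destroyed": 0, "major": 1, "minor": 2, "unaffected": 3}

claims = pd.DataFrame({
    "policy_id": ["P1", "P2", "P3", "P4", "P5"],
    "area":      ["Pacific Palisades", "Altadena", "Altadena",
                  "Sylmar", "Pacific Palisades"],
    "severity":  ["destroyed", "major", "destroyed", "minor", "unaffected"],
})

# Rank individual claims: total losses first, then major damage, and so on.
claims["priority"] = claims["severity"].map(SEVERITY_ORDER)
triaged = claims.sort_values("priority")

# Aggregate severe damage by area to decide where to deploy adjusters.
by_area = (claims[claims["priority"] <= 1]
           .groupby("area").size()
           .sort_values(ascending=False))
print(triaged)
print(by_area)
```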
Understanding the distinction between modelled and observed data proved critical for insurers navigating the disaster’s aftermath. While industry estimates projected insured losses between $20bn and $45bn, insurers required high-precision, observed data to make informed decisions.
By January 11, newly available high-resolution imagery enabled specialists at McKenzie Intelligence Services to conduct 17,000 initial building-level damage assessments. This level of granular intelligence empowered insurers to identify total losses, prioritise severely affected areas and communicate effectively with policyholders.
Within a week, nearly 57,000 properties across Los Angeles had been assessed, revealing 12,362 buildings destroyed and an additional 17,500 at risk of major internal damage. These insights enabled insurers to act decisively, providing critical support without unnecessary delays.
Lessons learned and implications
The response to the wildfires underscored several key lessons for insurers and intelligence specialists:
The value of a multi-source data approach: integrating diverse data streams – from satellite and SAR imagery to ground-level reports – provides insurers with a more comprehensive overview than relying solely on traditional ground assessments or single-source data.
Balancing data with human expertise: while multiple data sources are invaluable in accelerating the natural catastrophe claims process, expert human analysis is often needed in the case of wildfires to achieve the best outcomes and reduce the risk of claims disputes.
The role of emerging technologies: advancements in analytics driven by artificial intelligence (AI) are part of a new wave of technologies with exciting potential to support damage assessments. However, these tools do not yet deliver the required results, so the expertise of human intelligence analysts remains the most effective method for now.
The Los Angeles wildfires serve as a stark reminder of the devastating impact natural disasters can have on communities and economies.
For insurers, adopting a multi-source data strategy was instrumental in accelerating claims response, providing timely support to affected policyholders and mitigating financial exposure.
The combination of multi-source intelligence with expert analysis is crucial to enhancing insurers’ ability to respond to future catastrophic events efficiently and effectively. While machine learning and AI tools will inevitably grow in value as they evolve, they are not yet sophisticated enough to deal with major natural catastrophe events in isolation.
David Heathcote is intelligence delivery manager at McKenzie Intelligence Services