(Carl Lambert, Vice-President of Business Intelligence at The Co-operators)
In June 2013, Canada suffered one of its most severe floods in recorded history. Thirty-two towns in southern Alberta were flooded, with total damages exceeding C$5 billion. At the time, the insurance industry did not offer flood insurance; sewer backup losses were covered, however, and the total cost to the industry was C$1.7 billion. Yet, in early 2015, Canada remained the only G7 country where residential flood insurance coverage was not available.
At The Co-operators, we were already working on launching a flood product. Those events reinforced the demand for residential flood coverage and put more pressure on the industry to develop a solution.
The Co-operators was the first insurer in Canada to launch such coverage. Significant effort was required across the organization to ensure we implemented the right solution, one that would answer an unmet need while making Canadian communities more resilient to flooding. This blog will focus on only one piece of the work: the development of the risk assessment and the pricing. The BI-Research team and the Actuarial Pricing team collaborated on a non-traditional pricing solution.
The Approach - Research
Learn & Partner with Canadian Universities
We started by reaching out to our network of partners in Canadian universities. This helped us better understand important concepts around flood hazards, flood plains, and damage functions. Our statisticians and actuaries learned to work with hydrologists, geologists, hydraulic engineers, and civil engineers.
Seek out Third Party Vendors
We then started a long process of seeking out and assessing existing flood models and data sources. We learned how to engage with modeling firms and gradually built enough expertise internally to assess the credibility and value of third-party vendors.
Leverage Open Data & Big Data
We then sought out externally available data. A lot of information is available, and the challenge was to identify the sources that are usable for this purpose. By usability of information, we mean reliability, predictability, and frequency of updates.
We tested dozens of external sources, and a significant number of them ended up being used. For example, we used elevation data at a 5-meter resolution Canada-wide (30 meters in rural areas). We also used soil type across Canada to better model water dispersion and evaluate how long a flood will last. We also used historical river flows, with readings from numerous gauge stations on rivers across Canada, available every minute for at least 50 years. We even used a database showing historical tectonic plate movements.
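To give a concrete feel for how such layers come together, here is a minimal, illustrative sketch in Python of assembling per-location features from an elevation layer, a soil-type layer, and river gauge readings. All column names, values, and summary statistics are invented for the example; this is not our actual data pipeline.

```python
import numpy as np
import pandas as pd

# Hypothetical per-location attributes drawn from the kinds of layers above:
# a ~5 m (30 m rural) elevation model, a national soil-type layer, and the
# distance to the nearest river. Values are synthetic.
locations = pd.DataFrame({
    "location_id": [1, 2, 3],
    "elevation_m": [668.2, 1045.7, 712.4],
    "soil_drainage": ["clay", "loam", "sand"],
    "dist_to_river_m": [120.0, 2400.0, 35.0],
})

# Synthetic annual peak flows (m^3/s) at the nearest gauge, 5 years each.
gauge_peaks = pd.DataFrame({
    "location_id": [1] * 5 + [2] * 5 + [3] * 5,
    "annual_peak_flow": np.random.default_rng(0).gamma(5.0, 40.0, 15),
})

# Summarize the flow record into simple features: mean and 99th-percentile peak.
flow_features = (
    gauge_peaks.groupby("location_id")["annual_peak_flow"]
    .agg(mean_peak="mean", p99_peak=lambda s: s.quantile(0.99))
    .reset_index()
)

features = locations.merge(flow_features, on="location_id")
print(features)
```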
Assessment of the Risk
There were three sources of flood risk to model: fluvial flooding, pluvial flooding, and coastal flooding. Each of them has its own specificities and therefore requires its own model.
It was important for us to provide adequate and flexible coverage for all Canadians, whether they are in a high-risk zone or not, at a price that accurately reflects the true risk. For that reason, we needed a model that was accurate, precise, and consistent.
Our model is customized to use different sources of insight that complement each other. Vendor models sometimes fail our quality standards, and most of them also ignore a significant number of local flood defense structures such as dikes and reservoirs. On the other hand, our internal models were not always based on enough data to be fully credible.
With extensive R&D efforts, we
were able to leverage the large amount of data available in Canada to bridge
that gap and create a national flood risk model that meets our standards.
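The post does not say how the vendor and internal views were combined, but one standard actuarial way to bridge a credibility gap is a credibility-weighted blend. The sketch below uses a Buhlmann-style weight Z = n / (n + k); the function name, the constant k, and the example numbers are illustrative assumptions, not the method actually used.

```python
def blend_estimates(internal_rate: float, vendor_rate: float,
                    n_internal_obs: int, k: float = 500.0) -> float:
    """Credibility-weighted blend: Z = n / (n + k), applied to the internal view.

    With little internal data (small n) the blend leans on the vendor model;
    as internal experience grows, the internal estimate dominates.
    """
    z = n_internal_obs / (n_internal_obs + k)
    return z * internal_rate + (1.0 - z) * vendor_rate

# Example: only 60 internal observations, so the vendor rate carries most weight.
print(blend_estimates(internal_rate=0.012, vendor_rate=0.018, n_internal_obs=60))
```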
Assessing the risk means developing models for the following three phenomena:
Model Flood Water Amounts
Hydrological
models are used to determine the probability that a water body will flood.
Model Water Dispersion
Hydraulic
models are used to determine how those water volumes flood the landscape.
Model the "Submersion Depth"
Submersion depth models take the results of the water dispersion models and combine them with other sources of information to estimate the depth of water at the insured location. Furthermore, the required use of rooftop geocoding of the exact location of the insured building complicated the availability of information, since many possible sources did not have that geocode. A simplified sketch of how the three stages chain together follows.
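This sketch is illustration-only: every function body is a toy placeholder standing in for a real hydrological or hydraulic model, and all the numbers are invented.

```python
import numpy as np

def flood_probability(annual_peak_flows: np.ndarray, bankfull_flow: float) -> float:
    """Stage 1 (hydrological): chance in a given year that the river exceeds its banks."""
    return float(np.mean(annual_peak_flows > bankfull_flow))

def flood_level(peak_flow: float, bankfull_flow: float, terrain_slope: float) -> float:
    """Stage 2 (hydraulic): toy proxy for how high water spreads above the bank (m)."""
    excess = max(peak_flow - bankfull_flow, 0.0)
    return 0.01 * excess / max(terrain_slope, 1e-3)

def submersion_depth(level_m: float, building_elev_above_bank_m: float) -> float:
    """Stage 3: water depth at the rooftop-geocoded building location (m)."""
    return max(level_m - building_elev_above_bank_m, 0.0)

rng = np.random.default_rng(42)
peaks = rng.gamma(5.0, 40.0, 50)          # 50 years of synthetic annual peak flows
p = flood_probability(peaks, bankfull_flow=350.0)
level = flood_level(peaks.max(), bankfull_flow=350.0, terrain_slope=0.02)
depth = submersion_depth(level, building_elev_above_bank_m=0.8)
print(round(p, 3), round(level, 2), round(depth, 2))
```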
Convert the "Submersion Depth" into Building & Content Damage
Many factors impact the amount of damage: the submersion depth, the type of building, the expected duration of the flood, the temperature of the water, and many more. In order to build both content and building predictive models, we used text mining on notes from past sewer backup claims and integrated that with external probabilistic models.
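As a rough illustration of a depth-damage function, the sketch below linearly interpolates invented damage ratios for building and contents and applies a toy duration adjustment. None of the breakpoints or ratios are the actual curves described above.

```python
import numpy as np

# Invented breakpoints: submersion depth (m) -> fraction of value damaged.
DEPTH_POINTS_M  = np.array([0.0, 0.3, 1.0, 2.0, 3.0])
BUILDING_RATIOS = np.array([0.00, 0.10, 0.30, 0.55, 0.70])
CONTENTS_RATIOS = np.array([0.00, 0.15, 0.45, 0.70, 0.85])

def expected_damage(depth_m: float, building_value: float, contents_value: float,
                    duration_factor: float = 1.0) -> float:
    """Interpolate damage ratios at the given depth and apply a crude adjustment
    for flood duration (e.g. 1.1 for a long-lasting flood)."""
    building = np.interp(depth_m, DEPTH_POINTS_M, BUILDING_RATIOS) * building_value
    contents = np.interp(depth_m, DEPTH_POINTS_M, CONTENTS_RATIOS) * contents_value
    return duration_factor * (building + contents)

# Example: 0.8 m of water in a $400,000 home with $80,000 of contents.
print(round(expected_damage(0.8, building_value=400_000, contents_value=80_000), 2))
```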
Pricing Policies
Flood models estimate the flood risk, but they don't calculate an insurance premium. For example, third-party damage curves work well at estimating flood damage but cannot be directly applied to insurance claims, because the latter include elements of client behavior as well as the effect of limits and deductibles. Furthermore, our comprehensive water insurance product offers our clients unprecedented flexibility regarding their water coverage, which also presented pricing challenges. At a time when the buzzword in technology and science is "minimum viable product", in the case of flood insurance the bar for a viable product is very high.
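To make the gap between a damage estimate and a premium concrete, here is a small sketch that applies a deductible and a limit to simulated ground-up losses and scales by a crude client-behavior factor before averaging into a pure premium. All the numbers, the behavior factor, and the loss distribution are assumptions for illustration, not our pricing model.

```python
import numpy as np

def insured_loss(ground_up_loss: float, deductible: float, limit: float) -> float:
    """Apply the deductible, then cap the remainder at the policy limit."""
    return float(np.clip(ground_up_loss - deductible, 0.0, limit))

def pure_premium(simulated_losses: np.ndarray, deductible: float, limit: float,
                 behavior_factor: float = 0.9) -> float:
    """Average insured loss over simulated years, scaled by a rough adjustment
    for client behavior (mitigation, claims not filed, etc.)."""
    net = np.array([insured_loss(x, deductible, limit) for x in simulated_losses])
    return behavior_factor * float(net.mean())

rng = np.random.default_rng(7)
# Simulated annual ground-up flood losses for one dwelling: a flood in ~2% of years.
losses = np.where(rng.random(10_000) < 0.02, rng.gamma(2.0, 25_000.0, 10_000), 0.0)
print(round(pure_premium(losses, deductible=1_000, limit=30_000), 2))
```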
In the end, a key to our success was taking a scientific approach to modeling the flood risk for each and every house, farm, and building. That is what allows us to provide insurance at the right price, for everyone. It is that analytical mindset, combined with a great deal of determination and innovation, that is and will continue to be The Co-operators' advantage.
This blog post was written by Carl Lambert, Vice-President of Business Intelligence at The Co-operators. Carl completed a Master's degree in Actuarial Science in 1994. He joined The Co-operators in 2009, where he launched a Research team that now consists of over 65 professionals in mathematics, statistics, IT, and actuarial science. The team is responsible for the development of analytics throughout the organization.