Risk requires science

Lower Loss Ratios

Rapid Organic Growth

Better Portfolio Construction

Our Science

An insurance company’s primary responsibility is to measure & control risk with precision.

We have developed Skynet, the world’s most precise risk technology for property & casualty insurance, by going back to the scientific principles behind the events that cause insurance claims.

We built a Cannon
To kill a mosquito

The Problem

Property insurance is simple.
Pricing it properly isn’t.

  • The foundations & shortcomings of insurance pricing
  • Developing a new approach to evaluating risk
  • Engineering the science needed to implement our approach
The Solution

Skynet – Automated, Scientific Underwriting.
Done live at the point of sale.

The foundations & shortcomings of insurance pricing

In today’s status quo, insurance pricing is founded upon the practice of grouping common risks.

This concept needed to be abolished entirely in favour of ground-up analysis based on the scientific properties of the events that cause insurance claims.

Developing a new approach to evaluating risk

Grouping common risks is an oversimplified way to deal with a very complex problem. But if you want to underwrite property insurance risk with pinpoint accuracy, the problem can’t be solved just by using big data and artificial intelligence either. This isn’t like every other data science problem.

Building centroid based refinement of a regular 1km grid

Lake surge exposures for Lake Ontario, Canada

The risks in property insurance are driven by natural hazards (such as weather) which are capable of producing loss events that are far outside the parameters of any actual event in recorded history. For this reason, physics-based frameworks capturing the underlying dynamics are necessary to model the impact of all plausible events.

As such, underwriting property risk requires a brand new approach. That’s what we’ve built.

Here are the principles we followed:

We Must Evaluate
each Peril independently

This isn’t a novel concept, but following it strictly is crucial to our approach. It is also something preached but rarely practiced properly in today’s industry.

The evaluation of flood risk should be independent from that of house fires. Which should be completely separate from earthquakes. And so on.

We Must Analyze the Risk specific to a single policy. No groupings or averages.

Pricing starts with data about the thing being insured.

  • [who you are] – what are the attributes of the policyholder?
    Age, income, claims history, etc.
  • [what you are] – what are the physical attributes of the house?
    Construction type, roof age, stories, basement, etc.
  • [where you are] – what is the property’s location, elevation, etc.?

Collecting this data isn’t unusual in insurance. What is unusual is actually using all of it to determine price.
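
To make that concrete, here is a minimal sketch of what a single policy’s risk record might look like. The structure and field names are purely illustrative, not our actual schema:

```python
from dataclasses import dataclass

@dataclass
class PolicyRiskData:
    # [who you are] - attributes of the policyholder (illustrative fields)
    policyholder_age: int
    prior_claims: int

    # [what you are] - physical attributes of the house
    construction_type: str  # e.g. "wood frame", "masonry"
    roof_age_years: int
    stories: int
    has_basement: bool
    has_wood_fireplace: bool

    # [where you are] - the property itself
    latitude: float
    longitude: float
    elevation_m: float
```

Every one of these fields should move the price, not merely gate eligibility.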

Normally all this data is used as a filter – companies don’t want to insure anyone under 25. Or anyone with more than 3 claims. Or a house with a wood burning fireplace. Or one located north of a certain latitude.

The industry is missing the point.

There Is No Bad Risk, Only Mispriced Risk.

Our Precision creates
Sustainable Growth

Our precision lets us offer lower prices on an enormous portion of all property insurance policies, while maintaining low loss ratios.

Normally insurance companies sacrifice performance in order to grow. We don’t.

Insurance Policies

Through extensive scientific analysis, we found that as much as

70%

of all home insurance policies are materially mispriced

Insurance Market

>$10B

Canadian Homeowner’s Insurance Market

Insurance Market

>$100B

United States Homeowner’s Insurance Market

Insurance Market

>$200B

Global Homeowner’s Insurance Market

Call stack of an ember lofting routine run on a wildfire in northern Alberta, Canada

Vegetation photoplot locations for Central British Columbia, Canada

Catastrophe Risk

It’s hard to analyze the risk of an event that’s never happened.

But black swan events do happen – earthquakes, tsunamis, wildfires, etc. – all with devastating impact.

The industry needs to stop pricing for what has happened, and start pricing for what could happen. We call this exposure-based pricing.

Insurers have long held the view that catastrophe risk is about risk transfer to the reinsurance market, and that the ultimate responsibility for establishing a proper price for such risks lies in the hands of the reinsurers.

But catastrophic events usually cause 10-50% of all personal property claims, depending on the market.

The risk of those claims happening to a specific home can impact:

  • The prices an insurer can offer to consumers
  • The reinsurance costs an insurer will incur, and
  • The composition & volatility of the insurer’s portfolio

Catastrophe risk is an unavoidable element of a property insurer’s business.

In our opinion, primary insurance carriers need to start understanding and pricing for the catastrophe risks inherent in every policy they write, irrespective of whether the risk is being reinsured out.

History of Catastrophe Modelling

Pioneers in catastrophe modelling have advanced the insurance industry’s understanding of catastrophic events by integrating actuarial, meteorological and mathematical sciences to create highly accurate models that use computational fluid dynamics to simulate and estimate losses from weather events.

1800s

Residential insurers covering fire and lightning risk use pins on maps to visualize concentrations of exposure.

1845

John Thomas Romney Robinson invents the first anemometer, used to measure wind speeds, a critical component to understanding hurricane activity.

1880

John Milne, James Alfred Ewing and Thomas Gray invent the modern seismograph while studying earthquakes in Japan.

1906

The San Francisco earthquake wipes out all insurer profits of the preceding 47 years, spurring an abundance of scientific & engineering research into better understanding, and provisioning for, infrequent large events.

1987

AIR introduces the first fully probabilistic catastrophe model to the insurance industry – a U.S. hurricane model.

1992

Hurricane Andrew makes landfall in Florida. Within hours, AIR estimates $13B in damages, an estimate that is immediately dismissed by industry experts. After months of cleanup, the final total comes in at a very close $15B in losses, driving the broader acceptance and adoption of catastrophe modelling.

2000

AIR's extratropical cyclone model becomes the first cat model to incorporate a physical, as opposed to purely statistical, approach to modelling.

Flood watch / warning zones for northern Alberta, Canada for Monday, May 9th, 2022

Mean annual temperature at a 300 arcsecond resolution for Canada

Non-Catastrophe Risk

The risk of a home burning from an unattended oven is not the same as that from a wildfire.

We isolated pure non-catastrophic risk, and applied extremely selective machine learning algorithms to consider the fine details of each policy.

The purity of our approach, combined with the sophistication of our non-catastrophe models, results in much richer and more precise recognition of the components of risk, giving us unprecedented insight into the driving forces that cause claims.

Classic frameworks that use territories combined with classes of risk and discounts cannot match our capabilities.

We Must Model Every Possible Loss Event.
No simplified, single-output models.

To accurately price a risk, we must be able to model the underlying scientific properties of the events that cause claims, using full-scale stochastic models for all perils.

We do everything the hard way so that we can pick up every detail that might impact the risk:

  • We run full catastrophic event catalogues of at least 10,000 years, if not 100,000 years, for every single catastrophic peril which could affect any policy we look to insure. We do this live at the point of sale, in less than 2 seconds. It is the only way to accurately know the entire loss distribution (a simplified sketch of the idea follows this list).
  • We built the most sophisticated full-scale stochastic wildfire model ever for Canada.

    The model incorporates a plethora of data sources and algorithms that underlie the very physics of wildfire spread. This creates a complete set of potential loss events, which are modelled with the risk data to determine intensities, vulnerabilities and insured losses.
  • We run 4 million years of live simulations at the time of quote for each non-catastrophe peril. Oftentimes, there are inter-relationships in the data that exist in dimensions in which the human brain cannot operate. Our models employ an extensive neural-net framework to capture all unseen interactions, inter-relationships, higher-order factors and volatilities/dispersions.
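
Here is a heavily simplified sketch of the event-catalogue idea referenced above: run every simulated year’s events against a single policy, collect the annual losses, and read the loss distribution directly off the results. The event frequencies, vulnerability curve and all figures below are toy stand-ins, not our production models:

```python
import random

random.seed(42)

# Toy stand-in for a stochastic event catalogue: each simulated year
# produces zero or more events, each with a hazard intensity at the home.
def simulate_year():
    return [{"intensity": random.random()}
            for _ in range(random.choice([0, 0, 0, 1, 1, 2]))]

# Toy vulnerability curve: the loss grows with hazard intensity.
def damage(intensity, insured_value=500_000):
    return insured_value * max(0.0, intensity - 0.6) ** 2

YEARS = 10_000
annual_losses = sorted(
    sum(damage(e["intensity"]) for e in simulate_year())
    for _ in range(YEARS)
)

aal = sum(annual_losses) / YEARS            # average annual loss
pml_100 = annual_losses[int(YEARS * 0.99)]  # 1-in-100-year annual loss
print(f"AAL: ${aal:,.0f} | 1-in-100-year loss: ${pml_100:,.0f}")
```
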
Loss by Event Graph
Wildfire burn scar near Ashcroft, British Columbia, Canada from 2021

We Must Control for Concentration Risk, Live at the Point of Sale

As noted, primary insurance companies largely act as though it is not their responsibility to properly price for catastrophe risk within a single home insurance policy, since they are not the ones ultimately bearing that risk.

Here are the issues though:

  1. Aggregated exposure to catastrophes can break you in this business.

    If an insurer writes too many policies that are exposed to a single catastrophic event, and that event happens, they don’t want to be counting on luck to ensure they have enough reinsurance coverage.

  2. Even if you are passing off the risk, reinsurance still has a cost.

    An insurer shouldn’t wait until the annual reinsurance renewal to find out whether all those policies they wrote in the last year have similar exposures, because if they do, their reinsurance rates will go up unexpectedly, eroding returns.

Most insurers check their aggregations once a year when they do their reinsurance renewals. But this is a reactive approach. To facilitate a rapidly growing portfolio and control for aggregation risk, insurers must have an immediate and precise understanding of the accumulation of their risks to individual loss events.

As such we:

  • model all catastrophe risks live at the point of sale for each policy, and
  • calculate the resulting accumulation risk for each and every event to ensure the overall portfolio remains balanced in terms of risk profile and price (a simplified sketch follows below).
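
To illustrate the second point (all names, event IDs and thresholds here are hypothetical, not our production logic): before binding a policy, every catalogue event that could touch it is checked against the portfolio’s accumulated exposure to that same event.

```python
from collections import defaultdict

# Hypothetical running tally of accumulated insured exposure per
# catalogue event, maintained across the whole portfolio.
event_exposure = defaultdict(float)
EVENT_LIMIT = 50_000_000  # illustrative per-event accumulation limit

def can_bind(policy_exposure, event_ids):
    """Check, live at the point of sale, that binding this policy keeps
    every event accumulation under the portfolio limit."""
    return all(event_exposure[eid] + policy_exposure <= EVENT_LIMIT
               for eid in event_ids)

def bind(policy_exposure, event_ids):
    for eid in event_ids:
        event_exposure[eid] += policy_exposure

# Usage - the event IDs are made-up catalogue identifiers:
if can_bind(750_000, ["wildfire-0113", "flood-2201"]):
    bind(750_000, ["wildfire-0113", "flood-2201"])
```
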
Multi-day composite of NASA MODIS data at 250 metre resolution

A Quote Involves Billions of Calculations in 2 Seconds

Personal property policies are the most highly commoditized insurance products that a consumer can buy. Convenience cannot be sacrificed for precision.

Legacy rating methods show a price instantaneously due to their simplistic nature. We cannot fail to deliver that same experience.

We needed to pack billions of calculations into 2 seconds to keep the same customer experience.

Furthermore, there are no shortcuts. You can’t pre-run models. Policyholders often update their risk data at the time of quote, and small changes in data can yield large impacts on risk, and therefore price.

1 in 3 people

on our team works
in Scientific Research
& Engineering Roles

20TB

of extensive research data
is used to support our
proprietary pricing approach

10,000 years

of loss event simulations, at minimum, are run for each and every peril to price a single insurance policy

Engineering the Science needed to implement our approach

The layers of modelling, computational breakthroughs, integration of a vast array of pertinent data and proper risk packaging make this an extraordinarily complex problem, which even the best data scientists are unequipped to attack.

Here’s how we did it:

Rasterized version of a vegetation photo plot used in wildfire fuel mapping

Wildfire Fuel Map for Canada

We Developed Everything Ourselves

This was our vision. We were the ones who had to execute on it. We did not want compromises.

We wrote hundreds of thousands of lines of code.

We compiled one of the industry’s largest research datasets.

We hired a team and developed a culture that was dedicated to only the purest & most sophisticated risk analysis.

If you want something done right, you have to do it yourself.

We Engineered Scientific Breakthroughs

At times, we honestly questioned whether our approach to analyzing risk was technically feasible.

Standard catastrophe models can take at least 15 minutes to run. We needed models that could run in 2 seconds.

We had to analyze the risk on a single home. Existing science is not built for this level of precision.

We needed to analyze every peril including wildfire. Wildfire had never been modelled before in such detail for the applications we required.

Again and again, we hit roadblocks where the existing weather science, data science and computer science were not advanced enough to support the no-compromises approach we’d envisioned.

But one by one, we engineered ourselves over those roadblocks in the pursuit of the purest risk analysis.

Bind times for a Policy

Broker Timeline
Color-coded map of the terrestrial ecozones of Canada

Total burn footprints of the Lytton Creek fire (left, $65M in property damage) and the White Rock Lake fire (right, $60M in property damage) which occurred in British Columbia, Canada in July & August 2021.

We Carefully Constructed our Systems Architecture

The components of our architecture were meticulously selected to optimize the performance, robustness and longevity of the system. Even if it meant being unconventional.

There were tens of thousands of considerations along the way.
These are just a few:

  • Our databases are designed to keep speed with our risk models.
  • Our data tables are structured with the ability to instantly change & introduce variables.
  • Our code is clean and efficient, easy to update and resilient to changes.
  • Our core processing logic is programmed to work agnostically with any platform.
  • Our task managers are engineered to optimize speed by concurrently using many CPUs (sketched after this list).
  • Our network is structured with enough redundancies to ensure we’re always fast & online.
  • Our system security strictly administers authenticated access.
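
As one illustration of that task fan-out, here is a minimal sketch using Python’s standard library; the per-peril model function is a stand-in, not our actual code:

```python
from concurrent.futures import ProcessPoolExecutor

# Stand-in for a per-peril model run; in reality each call would be a
# full stochastic simulation, not a toy function.
def run_peril_model(peril):
    return peril, f"loss distribution for {peril}"

PERILS = ["wildfire", "flood", "earthquake", "hail", "fire", "water"]

if __name__ == "__main__":
    # Fan the independent per-peril models out across CPU cores, so the
    # slowest single peril, not the sum of all perils, bounds quote time.
    with ProcessPoolExecutor() as pool:
        results = dict(pool.map(run_peril_model, PERILS))
    print(results["wildfire"])
```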

The Solution

Skynet – Automated, Scientific Underwriting. Done live at the point of sale.

The last decade has produced all the data & computational power necessary to enable a quantum leap in precision risk analysis. To harness these resources, the insurance industry requires a fundamental shift in the methods it uses to price for and manage risk.

We have re-engineered the entire pricing & risk evaluation process for property & casualty insurance.

In less than 2 seconds, live at the point of sale for a single insurance policy, we run:

  • At least 10,000 years of catastrophic events per peril through a multitude of physics-based models
  • At least 4,000,000 years of simulations via machine learning models for non-catastrophe perils (combined into a price as sketched below)
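
In pricing terms, one deliberately simplified and hypothetical way to view the output: each peril’s simulations collapse to an average annual loss, and the technical premium is their sum plus loadings. Every figure below is illustrative only:

```python
# Hypothetical per-peril average annual losses (in dollars) produced by
# the simulations above; none of these numbers are real outputs.
aal_by_peril = {
    "wildfire": 210.0,
    "flood": 145.0,
    "earthquake": 60.0,
    "fire (non-cat)": 320.0,
    "water (non-cat)": 280.0,
}

RISK_LOAD = 0.15     # illustrative margin for volatility & reinsurance cost
EXPENSE_LOAD = 0.25  # illustrative expenses, commissions and profit

pure_premium = sum(aal_by_peril.values())
technical_premium = pure_premium * (1 + RISK_LOAD + EXPENSE_LOAD)
print(f"Pure premium: ${pure_premium:,.2f} -> "
      f"technical premium: ${technical_premium:,.2f}")
```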

The result is unrivalled rating precision. This drives:

Rapid organic growth

by offering lower risk-adjusted prices to an enormous array of policyholders.

Lower loss ratios

for a portfolio of risks, even while growing.

Better portfolio construction

meaning improved resilience to catastrophic events & reinsurance optimization.

Automation of policy administration

since the complexities of human-centric underwriting now live in the computations of Skynet.

However, all this technical capability & all these opportunities are wasted if someone doesn’t deploy the technology properly.

But here’s the thing...

We apply our own technology to insure real homes.

We launched the first implementation in our own insurance company.

We partner with industry leaders to proliferate the global impact of Scientific Underwriting.

About Our Business