How insurers skim your personal online data to price insurance

There’s little transparency around how insurance firms use personal data to establish insurance costs, write Kayleen Manwaring, Zofia Bednarz and Kimberlee Weatherall

What if your insurer was tracking your online data to price your car insurance? Seems far-fetched, right?

Yet there is predictive value in the digital traces we leave online. And insurers may use data collection and analytics tools to find our data and use it to price insurance services. For instance, some studies have found a correlation between whether an individual uses an Apple or Android phone and their likelihood of exhibiting certain personality traits.

In one example, US insurance broker Jerry analysed the driving behaviour of some 20,000 people to conclude Android users are safer drivers than iPhone users. What’s stopping insurers from referring to such reports to price their insurance?

Our latest research shows Australian consumers have no real control over how data about them, and posted by them, might be collected and used by insurers. Looking at several examples from customer loyalty schemes and social media, we found insurers can access vast amounts of consumer data under Australia’s weak privacy laws.

Research shows Australian consumers have no real control over how certain data might be collected and used by insurance firms. Image: Shutterstock

Your data is already out there

Insurers are already using big data to price consumer insurance through personalised pricing, according to evidence gathered by industry regulators in the United Kingdom, European Union and United States.

Consumers often “agree” to all kinds of data collection and privacy policies, such as those used in loyalty schemes (who doesn’t like freebies?) and by social media companies. But they have no control over how their data is used once it’s handed over.

There are far-reaching inferences that can be drawn from data collected through loyalty programs and social media platforms – and these may be uncomfortable, or even highly sensitive. Researchers using data analytics and machine learning have claimed to build models that can guess a person’s sexual orientation from pictures of their face, or their suicidal tendencies from posts on Twitter.

Read more: Three useful things to know about data, AI and the privacy debate

Think about all the details revealed from a grocery shopping history alone: diet, household size, addictions, health conditions and social background, among others. In the case of social media, a user’s posts, pictures, likes, and links to various groups can be used to draw a precise picture of that individual.

What’s more, Australia has a Consumer Data Right which already requires banks to share consumers’ banking data (at the consumer’s request) with another bank or app, such as to access a new service or offer. The regime is actively being expanded to other parts of the economy, including the energy sector, the idea being that competitors could use information on energy usage to make competitive offers.

The Consumer Data Right is advertised as empowering for consumers – enabling access to new services and offers, and providing people with choice, convenience and control over their data. In practice, however, it means insurance firms accredited under the program can require you to share your banking data in exchange for insurance services.

The previous Coalition government also proposed “open finance”, which would expand the Consumer Data Right to include access to your insurance and superannuation data. This hasn’t happened yet, but it’s likely the new Albanese government will look into it.

The outputs of artificial intelligence tools employed in mass data analytics can be inaccurate and discriminatory. Image: Shutterstock

Why more data in insurers’ hands may be bad news

There are plenty of reasons to be concerned about insurers collecting and using increasingly detailed data about people for insurance pricing and claims management. For one, large-scale data collection provides incentives for cyber attacks. Even if data is held in anonymised form, it can be re-identified with the right tools.

Also, insurers may be able to infer (or at least think they can infer) facts about an individual which they want to keep private, such as their sexual orientation, pregnancy status or religious beliefs.

There’s plenty of evidence the outputs of artificial intelligence tools employed in mass data analytics can be inaccurate and discriminatory. Insurers’ decisions may then be based on misleading or untrue data. And these tools are so complex it’s often difficult to work out if, or where, errors or bias are present.

Although insurers are meant to pool risk and compensate the unlucky, some might use data to offer affordable insurance only to very low-risk people. Vulnerable consumers may face exclusion. A more widespread use of data, especially via the Consumer Data Right, will particularly disadvantage those who are unable or unwilling to share data with insurers. These people may be low risk, but if they can’t or won’t prove this, they’ll have to pay more than a fair price for their insurance cover.

They may even pay more than they would have in a pre-Consumer Data Right world. So insurance may move further from a fair price as more personal data becomes available to insurance firms.

Read more: Can the law truly protect consumers from data profiling?

We need immediate action

Our previous research demonstrated that, apart from anti-discrimination laws, there are inadequate constraints on how insurers are allowed to use consumers’ data, including data taken from online sources.

The more insurers base their assessments on data a consumer didn’t directly provide, the harder it will be for that person to understand how their “riskiness” is being assessed. If an insurer requests your transaction history from the last five years, would you know what they are looking for? Such problems will be exacerbated by the expansion of the Consumer Data Right.

Interestingly, insurance firms themselves might not know how collected data translates into risk for a specific consumer. If their approach is to simply feed data into a complex and opaque artificial intelligence system, all they’ll know is they’re getting a supposedly “better” risk assessment with more data.

Recent reports of retailers collecting shopper data for facial recognition have highlighted how important it is for the Albanese government to urgently reform our privacy laws, and take a close look at other data laws, including proposals to expand the Consumer Data Right.

Dr Kayleen Manwaring is a Senior Lecturer at UNSW Law & Justice and Senior Research Fellow in the UNSW Allens Hub for Technology, Zofia Bednarz is a lecturer in Commercial Law at University of Sydney and Kimberlee Weatherall is a Professor of Law at University of Sydney. A version of this post first appeared on The Conversation.

