Facebook lets advertisers target users based on sensitive interests

16 May 2018

Facebook allows advertisers to target users it thinks are interested in subjects such as homosexuality, Islam or liberalism, despite religion, sexuality and political beliefs explicitly being marked out as sensitive information under new data protection laws.

The social network gathers information about users based on their actions on Facebook and on the wider web, and uses that data to infer their interests. These can be mundane – football, Manhattan or dogs, for instance – or more esoteric.

A Guardian investigation in conjunction with the Danish Broadcasting Corporation found that Facebook is able to infer extremely personal information about users, which it allows advertisers to use for targeting purposes. Among the interests found in users’ profiles were communism, social democrats, Hinduism and Christianity.

The EU’s general data protection regulation (GDPR), which comes into effect on 25 May, explicitly labels such categories of information as so sensitive, with such a risk of human rights breaches, that it mandates special conditions around how they can be collected and processed. Among those categories are information about a person’s race, ethnic origin, politics, religion, sex life and sexual orientation.

The Information Commissioner’s Office says: “This type of data could create more significant risks to a person’s fundamental rights and freedoms, for example, by putting them at risk of unlawful discrimination.”

Organisations must cite one of 10 special dispensations to process such information, such as “preventive or occupational medicine”, “to protect the vital interests of the data subject”, or “the data subject has given explicit consent to the processing of those personal data for one or more specified purposes”.

Facebook already applies those special categories elsewhere on the site. As part of its GDPR-focused updates, the company asked every user to confirm whether or not “political, religious, and relationship information” they had entered on the site should continue to be stored or displayed. But while it offered those controls for information that users had explicitly given it, it gathered no such consent for information it had inferred about users.

The data means an advertiser can target messages at, for instance, people in the UK who are interested in homosexuality and Hinduism – about 68,000 people, according to the company’s advertising tools.

Facebook does demonstrate some understanding that the information is sensitive and prone to misuse. The company gives advertisers the ability to exclude users based on their interests, but not based on sensitive interests: an advertiser can advertise to people interested in Islam, for instance, but cannot advertise to everyone except those interested in Islam.

The company requires advertisers to agree to a set of policies that, among other things, bar them from “using targeting options to discriminate against, harass, provoke or disparage users, or to engage in predatory advertising practices.”

In a statement, Facebook said classifying a user’s interests was not the same as classifying their personal traits. “Like other internet companies, Facebook shows ads based on topics we think people might be interested in, but without using sensitive personal data. This means that someone could have an ad interest listed as gay pride because they have liked a Pride-associated page or clicked a Pride ad, but it does not reflect any personal characteristics such as gender or sexuality.”

The company also said it provided some controls to users on its ad preferences screen. “People are able to manage their ad preferences tool, which clearly explains how advertising works on Facebook and provides a way to tell us if you want to see ads based on specific interests or not. When interests are removed, we show people the list of removed interests so that they have a record they can access, but these interests are no longer used for ads.”

It added: “Our advertising complies with relevant EU law and, like other companies, we are preparing for the GDPR to ensure we are compliant when it comes into force.”

The findings are reminiscent of Facebook’s previous attempts to draw a line between profiling users and profiling their interests. In 2016 it was revealed that the company had created a tool for “racial affinity targeting”.

At the time, Facebook repeatedly argued that the tool “is based on affinity, not ethnicity”. Discussing a person who was in the African American affinity group, for instance, the company said: “They like African American content. But we cannot and do not say to advertisers that they are ethnically black.”

Almost a year later, after it was revealed that advertisers could use the ethnic affinity tools to unlawfully discriminate against black Facebook users in housing adverts, Facebook agreed to limit how those tools could be used.
