Why we place too much trust in machines

20 October 2021

While many people might claim to be sceptical of autonomous technology, we may have a deeply ingrained trust of machines that traces back to our evolutionary past.

As Air France Flight 447 hurtled belly-first towards the Atlantic Ocean at nearly 300km per hour (186mph), pilot Pierre-Cédric Bonin wrestled with the controls. He and his crew had taken over after the autopilot suddenly switched itself off, apparently due to a build-up of ice on the aircraft. It was a situation that demanded manual intervention.

The pilots, unfamiliar with this scenario, struggled to steady the plane. Confusing messages and alarms from the aircraft's computer bombarded them, suggesting the craft was not stalling when in fact it was. Bonin's last known words, captured on the flight recorder, were, "We're going to crash – this can't be true. But what's happening?"

All 228 passengers and crew on board perished that day, 1 June 2009. When accidents like this occur, involving humans and machines, there are usually multiple factors or causes at work. But analysts have blamed the tragedy of Flight 447 partly on an excessive reliance on, and trust in, machines. They pointed to the flight crew's expectation that the autopilot would stay switched on, and that the plane's information systems would provide accurate information. It is far from the only incident in which an over-reliance on technology has contributed to fatalities.

It is a well-studied phenomenon known as automation bias, which sometimes also leads to automation complacency, where people are less able to spot malfunctions when a computer is running the show. But what is perhaps surprising is that our tendency to "overtrust" machinery may be directly influenced by millions of years of evolution.

"Technology overtrust is an error of staggering proportion," writes Patricia Hardré of the University of Oklahoma in a book chapter on why we sometimes put too much faith in machines. She argues that people generally lack the ability to judge how reliable a specific technology is. This can actually go both ways. We might dismiss the help of a computer in situations where it would benefit us – or blindly trust such a device, only for it to end up harming us or our livelihoods.

The behaviour of one of our closest relatives, the chimpanzee, may contain a clue about why we are so bad at assessing the trustworthiness of machines. It could be because we are primed to evaluate other members of our species instead.

In a recent experiment, researchers set up an apparatus in which chimpanzees at a sanctuary in Kenya could pull on a rope to retrieve a food reward. One rope offered a basic food reward – one piece of banana. But they were also presented with a second option – a bigger reward of two pieces of banana and a slice of apple that could be dished out by either a machine or a fellow chimpanzee.

Sometimes it was the machine on the other end of the rope; other times it was another chimp, but never both. However, sometimes the machine would not deliver the reward, and sometimes the other chimp would choose not to share. So while pulling the second rope offered a potentially bigger reward, it was the less certain choice.

The chimpanzee participant was thus faced with either a social or a non-social condition. They'd need to trust either the machine or the other chimp in order to have the chance of a bigger food reward.

The study found that, when a fellow chimpanzee presided over the uncertain option, the primates were less likely to go for it. They refused to engage with the social trials 12% of the time, but displayed this aversion only 4% of the time in non-social trials where a machine presided over the reward. In other words, they had greater trust in the machine.

"They were much more hesitant… when the partner was another chimpanzee," says Lou Haux at the Max Planck Institute for Human Development, who designed and ran the experiment with her colleagues. It's one of just a few studies revealing that social risk plays a big part in how chimpanzees and humans navigate the world.

It's called "betrayal aversion", says Haux: "The fear of being duped by another human [or chimpanzee], which is thought to cause stronger emotions." She likens this to putting money into a vending machine only for it to fail to dispense the soft drink you requested. That could prompt irritation, no doubt, but imagine how you'd feel if a bartender took your cash and then proceeded to drink your cola right in front of you. You'd probably be livid. Of course, the vending machine didn't make a decision to dupe you – it just failed to deliver – while the bartender decided to drink your order despite knowing how that might make you feel.

The research team didn't stop there, however. They went on to perform another experiment involving chimpanzees who had already gained an understanding of how likely they were to get a better food reward when selecting the uncertain option, thanks to having participated in the first experiment. The uncertain option was, in truth, no longer completely uncertain – the chimps now possessed a sense of what kind of risk they were taking.

And that's when a surprise emerged. The chimpanzees stopped discriminating between the social and non-social options – they no longer seemed to trust the machine more than the fellow chimpanzee.

"That's why we think it's an exciting finding, that they distinguish between the social world and the non-social world in cases where there's still a lot of uncertainty around," says Haux.

It makes sense when you think about how important it is for primates to negotiate their social environment, says Darby Proctor, a psychologist at the Florida Institute of Technology.

"With a machine, there's no future implications," she explains. "You don't have that additional potential social cost." After all, chimpanzees in these experiments often have to go and spend time with their fellow participants once the experiment is over – any displeasure caused by taking part could have onward consequences for their relationships.

Proctor and colleagues previously carried out similar tests and also found that chimpanzees were more likely to trust objects than other chimpanzees when seeking food rewards. Proctor mentions that when one of the primates was seemingly let down by another chimpanzee who failed to give out a hearty food reward, the dejected chimp made their feelings known by spitting water at the partner. "A common display of unhappiness," says Proctor.

Proctor questions whether the chimpanzees in these experiments really trusted the machines more. It could also be described as individuals simply having a more muted reaction to a bad deal when a social partner isn't involved.

"It's not that we have confidence that the machine will give us a good payout, it might be that we don't see it as emotionally salient, so we might just be more inclined to gamble or take risks with this inanimate object," she hypothesises.

But either way, evolution seems to have influenced primates' willingness to engage with uncertainty, based on whether we feel we're taking a social risk or not.

Evolution hasn't really prepared us for the fact that it can be quite costly to be betrayed by a machine, argues Francesca de Petrillo at the Institute for Advanced Study in Toulouse, who studies primates. For millions of years, there was no need to develop an ability to assess machines as carefully as we assess members of the same species. But today, when technology can have a huge impact on people's lives, there arguably is.

There are other factors involved here. Evolution aside, our willingness to trust technology is also influenced by personal knowledge about a machine or device, and cultural expectations. A 2019 study found that people were, on average, 29% more likely to give away their credit card details during a text-based chat if they thought they were speaking to a computer versus another human being. The researchers found that this effect was even more pronounced among those who had a pre-existing expectation that machines were more secure or trustworthy than humans.

Then again, sometimes people report a strong aversion to trusting technology. Many surveys have suggested that people are often uncomfortable with the idea of self-driving cars or handing over work responsibilities to machines. There are lots of reasons why suspicions about new technology can take hold. People might fear losing a part of their identity if an automaton takes over. Or they might simply feel sceptical that a computer will approach certain tasks with the required caution and dexterity. When you've seen a hundred videos of robots falling over, or experienced an obstinate computer refusing to function correctly, that's not necessarily surprising.

Among those who have studied what can influence an individual's willingness to trust a particular technological system is Philipp Kulms, a social psychologist at Bielefeld University in Germany. He and a colleague came up with a Tetris-style puzzle game in which participants collaborated with a computer partner that also had control over some of the pieces. When the computer played the game well (demonstrating competence) and gave the human player access to high-value pieces that netted them extra points (demonstrating warmth), the participants were more likely to report that they trusted the computer player. They were also more likely to exchange puzzle pieces with it in a collaborative effort, expressing that confidence in-game as well. It was trust formed by mechanics, if you like.

"To our surprise, this very limited set of variables that we could manipulate was apparently enough," says Kulms.

If we accept that people are generally ill-equipped to assess the trustworthiness of machines because, from an evolutionary standpoint, we are primed to judge trustworthiness based on social cues, then this makes perfect sense. It also chimes with other research that suggests gamblers will gamble more when playing on slot machines that have been designed to display human-like properties.

In other words, we aren't just bad at weighing up the trustworthiness of a machine, we're also easily seduced by mechanical objects when they start behaving a bit like a social partner who has our best interests at heart.

So, in contrast to the person who has been stung by unresponsive computers and therefore distrusts them all, the individual who has learned to put their faith in certain systems – such as aircraft autopilots – may struggle to understand how they could be wrong, even when they are.

De Petrillo notes that she's experienced a feeling of confidence in computers when interacting with voice-activated assistants like Apple's Siri or Amazon's Alexa.

"I assume that they are acting in my best interests, so I don't need to question them," she says. So long as they appear competent and reasonably warm, Kulms' study suggests, that will continue to be the case, for de Petrillo and many others. Kulm points out that this is why it's so important for designers of technology to make sure their systems are ethical and their functionality transparent.

The great irony of all this is that behind a seemingly trustworthy machine may lie a malicious human with nefarious intentions. Undue trust in a faulty machine is dangerous enough, never mind one designed to deceive.

"If we were taking the same type of risk with a human, we would be much more attuned to what the potential negative outcomes could be," says Proctor. She and Haux agree that more work is needed to tease out how much the chimpanzees in their studies genuinely trust machines, and to what extent that reveals truths about human behaviour.

But there's a hint here that our occasional, sometimes disastrous, overtrust in technology has been influenced by a simple fact: we evolved to be social animals in a world where machines did not exist. Now they do and we put our money, our personal data and even our lives under their supervision all the time. That's not necessarily the wrong thing to do – it's just that we're often pretty bad at judging when it's right.

 

BBC

