Users intimidated by human-like virtual assistants, claim researchers

People don't want to appear stupid in front of so-called smart assistants, suggest researchers

Smart device users may avoid human-like virtual assistants for fear of looking "dumb" when asking simple questions, according to research by psychologists.

In recent years, virtual assistants such as Apple's Siri and Amazon's Alexa have boomed in popularity, with the tools pre-loaded onto smartphones and other devices.

But psychologists have suggested that some people may be intimidated, rather than helped, by them. The more human-like the assistants are made, they suggest, the less likely people are to use them to ask questions.

The technologies are intended to simplify apps and help users with everyday tasks. However, Daeun Park of Chungbuk National University claims that more human-like assistants may deter people from using them.

Users may end up asking themselves questions such as "Will I look dumb for asking this?", according to the researcher. People, Park says, are particularly self-conscious when using apps that measure achievement. The findings were published in the journal Psychological Science.

"We demonstrate that anthropomorphic features may not prove beneficial in online learning settings, especially among individuals who believe their abilities are fixed and who thus worry about presenting themselves as incompetent to others," said Park.

"Our results reveal that participants who saw intelligence as fixed were less likely to seek help, even at the cost of lower performance."

In the past, research has suggested that people view virtual assistants as "social beings", and this can make them "seem less intimidating and more user-friendly".

But Park and co-authors Sara Kim and Ke Zhang disagree with this claim, suggesting that people may instead feel such systems are competing with their knowledge, particularly where performance is concerned.

"Online learning is an increasingly popular tool across most levels of education and most computer-based learning environments offer various forms of help, such as a tutoring system that provides context-specific help," said the researcher.

"Often, these help systems adopt human-like features. However, the effects of these kinds of help systems have never been tested."

It may, though, also be related to the knowledge, or fear, that virtual assistants are slurping up data every time they are used, while the research might only be exposing the embarrassment of looking ignorant in front of the research team.

The test involved giving 187 people a task that supposedly measured their intelligence: shown three words, they had to come up with a fourth related to all of them.

If they ran into difficulty, they could seek help from either a plain on-screen computer icon or a human-like "helper". The research indicated that participants were "embarrassed" if they had to use the human-like helper rather than the icon.

"Educators and program designers should pay special attention to unintended meanings that arise from humanlike features embedded online learning features," concluded Park.

"Furthermore, when purchasing educational software, we recommend parents review not only the contents but also the way the content is delivered."