
Would you trust a robot with personal information? UEA scientists trying to make chatbots 'trustworthy'

PUBLISHED: 05:30 29 May 2019 | UPDATED: 07:27 29 May 2019

Would you trust a robot with your personal details? The University of East Anglia is launching a research project to find out how chatbots could be made more "trustworthy", to make them work more effectively. Picture: Anthony Devlin/PA Wire

From banking and insurance to shopping and dating, chatbots are becoming increasingly ingrained in our ever more digital society.

Dr Oliver Buckley, of the University of East Anglia's (UEA) school of computing sciences, is lead researcher on a project by the UEA and other universities to explore how chatbots could be made more "trustworthy". Picture: University of East Anglia

But organisations using these human-like computer programs are coming up against a serious barrier: talking to a robot does not inspire the same trust as talking to a person, meaning users are reluctant to disclose sensitive information to them.

Now, a team of researchers at the University of East Anglia (UEA) are launching a project to make chatbots more trustworthy, examining how their personality and even appearance can affect how users perceive them.

Lead researcher Dr Oliver Buckley, from UEA's school of computing sciences, said businesses and governments were increasingly turning to chatbots to feed user demand for fast, reliable and accurate information.

"Chatbots are all around us, particularly in customer support roles. They're speaking to us on the phone, emailing us, and responding to text messages - with answers to queries and even providing advice and guidance," he said.

"They are very convincing but a big problem is that people don't trust them with sensitive or private information, for example to do with their health, or banking.

"We want to know how chatbots can become even more personable to encourage people to disclose sensitive or confidential information."

The PRoCEED (A Platform for Responsive Chatbot to Enhance Engagement and Disclosure) project, which has received £500,000 of funding from the Engineering and Physical Sciences Research Council, will involve researchers from the UEA's school of computing sciences and school of psychology, the University of Kent, Oxford Brookes and Cranfield University.

Looking at three key sectors for chatbot use and sensitive information sharing - healthcare, defence and security, and technology - the team will investigate the implicit trust a user places in a chatbot, and how the context in which information is provided can shape its perceived sensitivity.

Dr Buckley said: "In order to fully understand the use of chatbots, it is essential to properly understand the nature of personal, sensitive information and also their perceived trustworthiness."
