
Study: Chatbots Found to Restrict Information, Echo User Biases, and Hinder Independent Thought

A new study from Johns Hopkins University finds that chatbots provide a narrower range of information than traditional search and reinforce the ideologies users already hold. The research argues that relying on AI chatbots discourages people from forming their own opinions, which can leave them vulnerable to manipulation. When people ask ChatGPT to summarize a book, a research paper, or any other piece of content, it draws only on the information already stored in it, which makes it harder for users to discern what is accurate and what is not.

Ziang Xiao, the study's lead author, says chatbots aren't deliberately designed to be biased, but their answers reflect both the biases of the people who built them and the way users phrase their questions. In effect, chatbots tend to give users the answers they want to hear. Xiao wanted to study how chatbots influence online searches and how differently people respond to answers from search platforms versus chatbots. For the study, the researchers asked 272 participants to write about topics such as healthcare and student loans, and then to look for additional material using either ChatGPT or a conventional search engine built specifically for the study.

After searching, participants rewrote their essays and answered questions about their topic. The researchers also had them read two contradictory articles on the same subject and asked which one they considered more accurate. The results showed that chatbots surfaced a narrower range of information than the ordinary search engine and largely mirrored the attitudes of their users. Xiao noted that people tend to seek out information that matches their existing viewpoints.

There is also a difference in how people seek answers from a search engine versus a chatbot. People tend to type only keywords into a search box, but they pose full questions to chatbots. Chatbots pick up on the clues and attitudes embedded in those questions and tailor their answers to what the user appears to want. The researchers also built a chatbot designed to agree with whatever the user said, and its echo-chamber effect was even stronger than that of the other chatbots. They built another chatbot designed to disagree with the user, but it was not persuasive enough to change people's opinions.

Image: DIW-Aigen

