The Scientific Reason Why ChatGPT Leads You Down Rabbit Holes


According to a new study, the chatbot just tells you what you want to believe.

Whether you use a traditional search engine like Google or a conversational AI tool like OpenAI's ChatGPT, you tend to use search terms that reflect your existing biases and perceptions, according to a new study published in the Proceedings of the National Academy of Sciences. Worse, search engines and chatbots often deliver results that reinforce those beliefs, even when your intention is to learn more about the topic.

For example, say you're trying to learn about the health effects of drinking coffee every day. If you enjoy a couple of cups of joe each morning, you might search for something like "is coffee healthy?" or "health benefits of coffee." If you're already skeptical (perhaps you're a tea purist), you might instead search "is coffee bad for you?" The researchers found that framing the question one way could skew the results toward one set of answers, while the opposite framing could surface a very different set.


"When people are looking up information, whether on Google or ChatGPT, they tend to use search terms that reflect what they already believe," said Eugina Leung, an assistant professor at Tulane University and the study's lead author.

The abundance of AI chatbots, and the confident, tailored answers they provide, can make it easier to fall down a rabbit hole and harder to realize you're in one. There's never been a more important time to think carefully about how you get information on the internet.

The question is: How do you get the best answers?

Asking the wrong questions

The researchers conducted 21 studies with nearly 10,000 participants on topics including the health effects of caffeine, gas prices, crime rates, COVID-19 and nuclear energy. The search tools included Google, ChatGPT, and custom-built search engines and AI chatbots.

The results demonstrated what the researchers call the "narrow search effect," a function of both how people ask questions and how technology platforms answer them. People have a habit of asking questions in a narrow way, using search terms that reflect what they already think, while search engines and chatbots are designed to deliver highly relevant answers to exactly those narrow queries. "The answers end up mostly confirming what they believed in the first place," Leung said.


The researchers also checked to see whether participants changed their beliefs after searching. When a narrow query returned a narrow range of answers, participants were less likely to show significant changes in their beliefs. But when the researchers used a custom-built search engine and chatbot designed to deliver a broader range of results, participants' beliefs shifted more.

Leung said platforms could offer users an option to broaden their searches in situations where they're trying to find a wider range of sources. "Our research is not trying to say that search engines or algorithms should always broaden their results," she said. "I think there is a lot of value, in certain situations, for focused and very narrow search results."

3 ways to ask the right questions

If you want answers that draw on a wider range of perspectives, there are a few things you can do, Leung said.

Be precise: Think carefully about what exactly you're trying to learn. Leung used the example of deciding whether to invest in a particular company's stock. If you ask whether it's a good stock to buy, you'll get more positive news than negative; if you ask whether it's a bad stock to buy, you'll get the reverse. Instead, try a single, more neutral search term. Or ask both ways and weigh the results of each.

Get other views: You can simply ask an AI chatbot directly for a broader range of perspectives. If you're wondering whether you should keep drinking two cups of coffee a day, ask the chatbot for various viewpoints and the evidence behind them. The researchers tried this in one of their experiments and found that it broadened the results. "We asked ChatGPT to answer the queries from participants and to return as much evidence for different claims as possible," Leung said.

Be careful with follow-up questions: Follow-up questions on their own didn't help much, Leung said. Unless those questions explicitly ask for a broader range of answers, they can have the opposite effect, producing even narrower, more confirmatory results. In many cases, people who asked lots of follow-up questions "fell deeper into the rabbit hole."




