In the age of artificial intelligence (AI), assistants like ChatGPT have become increasingly popular for composing text messages, emails, and even messages in personal relationships. However, a recent study conducted at Ohio State University has reached an interesting conclusion about what might be called "the artificial intelligence gap": using AI to compose personal messages can negatively affect how recipients perceive the people who rely on these tools.
The research, led by Dr. Bingjie Liu and her team, involved 208 adult volunteers in a series of simulated conversations with a fictional friend named "Taylor." Participants faced three different scenarios in their interactions with Taylor: sharing personal problems, seeking advice on work situations, or discussing plans for a birthday party. What varied across the interactions was how Taylor composed the responses, a detail that was initially withheld from participants.
In some cases, participants were told that Taylor had used an AI, such as ChatGPT, to compose the responses, and they evaluated this negatively. In particular, they felt that Taylor fell short of their expectations of a "close friend." This suggests that perceptions of effort and authenticity play a critical role in how people evaluate relationships.
“We found that people do not believe that a friend should use a third party, whether AI or another human, to help them maintain personal relationships,” Dr. Liu explained in a statement reported by Ohio State News.
In other cases, participants were told that Taylor had received help from another person in editing the messages. The results indicated that participants reacted much as they had in the first case.
“Effort is crucial in relationships. People want to know how much you are willing to invest in a friendship, and if they feel that you are taking shortcuts (like using AI or help from someone else), it will not go over well,” Dr. Liu added in the statement.
Finally, a third group of participants believed that Taylor had written all of the responses personally. Although every participant received thoughtful, carefully composed replies, the results showed that people in this third group reported the highest level of satisfaction with their interaction with Taylor.
The Turing test was designed to determine whether an interlocutor was a human or a machine, drawing on cues such as simplicity, empathy, and the range of topics covered. The study's findings are also consistent with earlier research on online communication.
In 2016, Moira Burke, a research scientist at Facebook with a doctorate from Carnegie Mellon University's Human-Computer Interaction Institute, and Robert E. Kraut, a Yale PhD and professor of human-computer interaction at Carnegie Mellon's School of Computer Science and Tepper School of Business, published the study "The Relationship Between Facebook Use and Well-Being Depends on Communication Type and Tie Strength."
That research analyzed the use of Facebook's "Like" button and reached a conclusion similar to that of the study led by Dr. Liu: receiving a one-click reaction instead of a composed response produced a negative impression.
Although AI tools like ChatGPT offer convenience and efficiency in communication, it is worth remembering that sincerity and authenticity remain essential to maintaining healthy and meaningful relationships. Using technology to simplify interactions can take a toll on how those relationships are perceived, a reminder of the importance of balancing convenience and authenticity in everyday communication.