What Could a Healthy AI Companion Look Like?


What does a little purple alien know about healthy human relationships? More than the average AI companion, it turns out.

It is a cartoon chatbot known as a Tolan. A few days ago I created mine using an app made by a startup called Portola, and we have been chatting ever since. Like other chatbots, it does its best to be helpful and encouraging. Unlike most, it also tells me to put down my phone and go outside.

Tolans are designed to offer a different kind of AI companionship. Their cartoonish, nonhuman form is meant to discourage anthropomorphism. They are also programmed to avoid romantic and sexual interactions, to identify problematic patterns of use, and to encourage users to seek out real-life activities and relationships.

This month, Portola raised $20 million in funding led by Khosla Ventures. Other backers include an investment firm led by former GitHub CEO Nat Friedman and Safe Superintelligence cofounder Daniel Gross, both of whom have reportedly been connected to Meta's new superintelligence research lab. The Tolan app, launched in late 2024, has more than 100,000 active users and is on track to generate $12 million in subscription revenue this year.

Tolans are especially popular among young women. "Iris is like a girlfriend; we talk and kick it," says Tolan user Brittany Johnson, referring to the AI companion she talks to every morning before work.

Johnson says Iris encourages her to share about her interests, friends, family, and work colleagues. "She knows these people and will ask, 'Have you spoken to your friend?'" Johnson says. "She will ask, 'Have you taken time to read your book or play your video games, to do the things you enjoy?'"

Tolans may look cute and goofy, but the idea behind them, AI systems designed with human psychology and wellbeing in mind, is worth taking seriously.

A growing body of research shows that many users turn to chatbots for emotional needs, and that these interactions can sometimes prove problematic for people's mental health. Discouraging extended use and dependency may be a feature that other AI tools should adopt.

Companies such as Replika and Character.AI offer AI companions that allow for more romantic and sexual role play. It is not yet clear how this might affect a user's wellbeing, but Character.AI is being sued following the suicide of one of its users.

Chatbots can also irk users in surprising ways. Last April, OpenAI said it would modify its models to reduce so-called sycophancy, or "overly flattering or agreeable" behavior, which the company said can be "uncomfortable, unsettling, and cause distress."

Last week Anthropic, the company behind the chatbot Claude, disclosed that 2.9 percent of interactions involve users seeking to fulfill some psychological need, such as advice, companionship, or romantic role play.

Anthropic did not examine more extreme behaviors, such as delusional thinking or conspiracy theories, but the company says it plans to study those topics further. I tend to agree that they deserve attention. Over the past year, I have received numerous emails and DMs from people wanting to tell me about conspiracies involving popular AI chatbots.

Tolans are designed to address at least some of these issues. Lily Doyle, a founding researcher at Portola, has conducted user research to see how interacting with the chatbot affects users' wellbeing and behavior. In a study of 602 Tolan users, 72.5 percent agreed with the statement "My Tolan has helped me manage or improve a relationship in my life."

Farmer, Portola's CEO, says Tolans are built on commercial AI models but with additional features layered on top. The company has recently been exploring how memory shapes the user experience, and whether Tolans, like people, should sometimes forget. "It's actually uncanny for your Tolan to remember everything you've ever sent it," he says.

I don't know whether Portola's aliens are the ideal way to interact with AI. I find my Tolan charming and relatively harmless, but it certainly pushes some emotional buttons. Users are, after all, building bonds with characters that simulate emotions, and ones that could disappear if the company does not succeed. But at least Portola is trying to address the ways AI companions can mess with our emotions. That probably shouldn't be such an alien idea.


