April 21, 2024

AI Girlfriends Are A Threat To Your Privacy, Research Says 

New research has found that popular AI girlfriend and boyfriend tools deliberately collect private information about their users and have some of the worst privacy practices on the market.

Researchers at the Mozilla Foundation tested 11 popular AI chatbots, including Eva AI and Replika, and found that none of them met the foundation's basic privacy standards. Many don't even encrypt their customers' data.

In fact, the results were so bad that the researchers labeled these chatbots among the worst products, in terms of privacy, that they have ever reviewed.

"Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you." — Misha Rykov, researcher

What's concerning is the growing popularity of these products. The fast pace of modern life has plunged this generation into new depths of loneliness, pushing many people toward online applications to meet their social and emotional needs.

Last year alone, Google recorded more than 3 billion searches related to AI girlfriends. The numbers have risen steadily ever since ChatGPT gained popularity.

What Makes These AI Chatbots A Threat?

The foundation also surveyed the rest of the apps on the market and found that 73% of them are not transparent about how they handle security vulnerabilities, 45% allow users to set weak passwords, and every app except Eva AI Chat Bot & Soulmate shares or sells its users' personal data.

If such sensitive personal data, including health information, is sold to the wrong people, the consequences could be grave.

Speaking of Eva AI Chat Bot & Soulmate: although the app has a good privacy policy, it often pushes users to divulge more personal information. What it does with that data (since it doesn't sell it) remains unknown.

Researchers also warned that, given the app's pushy nature, there's no guarantee Eva AI's policies won't change in the future. It may not sell user data now, but if that changes, it could sell all the data it has been collecting over the years.

Another popular AI chatbot, CrushOn.AI, was also found to collect information about users' sexual health, gender-affirming care, and prescription medications.

Violent & Problematic Nature Of The AIs

Data sharing isn't the only problem with these AI chatbots. Some were also found to promote violence and underage abuse, themes that appear openly in the apps' character descriptions.

Apps like these also have a history of recommending unsafe actions.

For instance, the AI chatbot Replika had encouraged a user to attempt to assassinate the late Queen Elizabeth II.

Other apps have also been known to encourage suicide. While Eva AI Chat Bot & Soulmate and CrushOn.AI did not comment on the topic, a Replika representative told Business Insider that the company neither sells user data nor supports advertising, and that it only uses user data to train its app to hold better conversations.

The researchers at the Mozilla Foundation have only one piece of advice for users: don't trust these apps, and don't share any personal information. They are not your friends.

