AI Girlfriends Are Collecting Your Data And Are Going To Break Your Heart, Researchers Say

Researchers with Mozilla's "*Privacy Not Included" project have found that apps offering AI romantic partners, whether girlfriends or boyfriends, routinely collect and sell users' personal data.

Though chatting with these bots may seem harmless, users face distinct risks from the exploitation of their personal information. The study examined 11 romantic AI chatbot apps, including Chai and Anima, and found that the companies behind them disclaim responsibility for anything their chatbots say or do.

This lack of accountability extends even to apps marketed as "self-help," "wellbeing," or "mental health" services, a troubling gap given how sensitive users' conversations with these bots can be.

The conversation prompts these chatbots initiate can lead users to divulge highly confidential information, from romantic feelings to mental health struggles. However empathetic or understanding a virtual companion may appear, it is ultimately an algorithm designed to elicit data, and in most cases that data is sold to third-party brokers.

The study also revealed concerning security practices: a significant share of the apps publish no information about how they handle security vulnerabilities or whether they encrypt user data. Many offer no way for users to delete their data and accept weak passwords, further compromising user privacy and security.

Based on these findings, all 11 chatbots received Mozilla's "*Privacy Not Included" warning label, flagging the privacy pitfalls of using them. Visitors to the project's site can also rate each product's creepiness, and every one of the chatbots has crossed at least the "somewhat creepy" threshold. The labels are meant to inform potential users of the privacy risks involved and encourage more informed decisions about these apps.

In short, the study highlights troubling practices among AI chatbot apps offering romantic companionship: they exploit user data and provide little transparency about privacy and security. However innocuous they may seem, these apps pose significant risks to user privacy and should be approached with caution.
