Mozilla warns against AI romance apps
A recent report from Mozilla’s “Privacy Not Included” buyer’s guide urges caution when engaging with AI chatbots designed for virtual romance. The report examined 11 AI soulmate apps and found that all of them failed to provide adequate privacy, security, and safety protections for user data. The apps lacked essential security measures, such as strong password requirements and vulnerability management, and their privacy policies offered little information on how user conversations are used to train the AI models.
The researchers noted that users have little control over their personal data, creating potential risks for manipulation, abuse, and mental health consequences. The apps collect vast amounts of personal information to build connections with users and act as their soulmates. This manipulation can be dangerous, especially for vulnerable individuals. The report also highlighted that most app makers did not provide users with the option to opt out of having their intimate chats used to train AI models.
James E. Lee, COO of the Identity Theft Resource Center, urged consumers to review any company’s data collection practices and exercise their rights to opt in or out of data collection. Retained information could become a target for cybercriminals, leading to ransomware attacks or identity theft.
The report found that the use of AI romance apps is skyrocketing, with an estimated 100 million downloads from the Google Play Store alone over the past year. A recent study also found that 20% of Americans admit to flirting with chatbots, a figure that rises to more than 50% among those aged 35 to 44. The surge in AI romance chatbot use can be attributed to societal shifts and advances in technology, as people seek companionship and emotional support.
As AI becomes smarter, these chatbots feel more realistic and engaging, attracting more users. Many individuals turn to chatbots for companionship and romance due to loneliness and the convenience they offer. The pandemic and isolation have further contributed to the rise in their usage.
The report also highlighted deceptive marketing practices by romance bot apps. Some apps claim to offer mental health and well-being benefits on their websites while disclaiming those same benefits in their terms and conditions, an inconsistency that can mislead users.
AI-powered romance chatbots pose a unique privacy threat because they engage in personal and intimate conversations with users, potentially collecting sensitive personal data.