Don’t take AI on a Valentine’s Day date – there’s a hefty bill to pay that you’d never expect

Being single on Valentine’s Day can be depressing, but finding solace in conversations with an AI assistant is no less so. Not only do they lack a personality, but their one true desire is your personal data.

Privacy experts at Surfshark found that four of the five most popular AI companion apps on the Apple App Store may track your personal data and use it for profit.

“Instead of being there for us, they can feel more like surveillance tools,” said Surfshark’s cybersecurity expert, Miguel Fornés, pointing out how AI companions’ tracking can shake users’ trust while invading their privacy.

AI companions: which are the most data-hungry?

The team at Surfshark carefully inspected the data collection practices of the five AI companion services. These details were sourced from the Apple App Store and include the number, type, and handling of the data types collected by each app.

Among the apps analyzed – Kindroid, Nomi, Replika, EVA, and Character AI – 80% “may use data to track their users.”

Tracking, experts explain, refers to linking user or device data collected from the app with user or device data collected from other apps and websites for targeted advertising purposes. Tracking also involves the sharing of user or device data with data brokers.

“This detailed data can lead to companies influencing your choices, which can have negative effects, like overwhelming ads, financial risks, or other unexpected issues,” said Surfshark’s cybersecurity expert.

Character AI was the service most in love with users’ data. While the average app collected 9 of the 35 unique data types, Character AI rises above its competitors by collecting up to 15 of them. EVA was the second most data-hungry of the lot, gathering 11 types of data. Worse still, both of these applications collect users’ approximate location to deliver targeted ads.

Nomi was the only application that stood apart, claiming not to collect data for tracking purposes.

The data these services collect isn’t the only problem, though. App developers, Surfshark explains, could also access the information you willingly share during your conversations with the AI chatbot.

The danger here is that AI companion apps are designed to simulate human-like interactions such as friendship and love, so you may be willing to disclose even more sensitive information than you would with ChatGPT-like chatbots.

“This may lead to unprecedented consequences, particularly as AI regulations are just emerging,” experts note.

This is why Surfshark strongly advises taking some precautions when using AI companion services to keep your personal data safe and minimize misuse.

Fornés said: “Make sure you are frequently checking what permissions these apps have and be mindful of what information you are sharing.”
