This new scarily realistic AI-call scam is targeting Gmail users

For a long time, email phishing scams have been easy to spot: poorly worded, typo-ridden, desperate pleas for funds that will, of course, be paid back tenfold. Now that our guard is down, AI is here to make sure we don’t get too comfortable.

A new, hyper-realistic scam is hitting Gmail users, and its AI-powered deceptions are capable of fooling even the most tech-savvy among us. In this new wave of fraud, the classic ‘Gmail account recovery’ phishing attack is paired with an ultra-realistic voice call designed to panic victims into complying.

In a recent blog post, Microsoft solutions consultant Sam Mitrovic explained how he almost fell victim to the elaborate scam, recounting an account recovery notification that was followed by a very real-sounding phone call from ‘Google Assistant’.

Don’t get caught out

Mitrovic revealed that the repeated emails and calls came from seemingly legitimate addresses and numbers, and that he only cottoned on to the scam by manually checking his recent account activity in Gmail.

This is part of a worrying larger trend of deepfakes, which are targeting businesses and consumers more than ever. Criminals can use ultra-realistic video or audio footage to trick unsuspecting users into handing over funds or information.

Almost half of businesses have already reported encountering deepfake fraud in 2024, and the trend looks set to continue.

The key to staying safe from this type of scam is to stay vigilant and take your time. Criminals will almost always try to rush you into a decision or into handing over money or details; by stepping back to evaluate, you can gain perspective and even get a second opinion from someone you trust.

Via Forbes
