Imagine you meet someone new. Be it on a dating app or social media, you chance across each other online and get to talking. They're genuine and relatable, so you quickly take it out of the DMs to a platform like Telegram or WhatsApp. You exchange photos and even video call each other. You start to get comfortable. Then, suddenly, they bring up money.
They need you to cover the cost of their Wi-Fi access, maybe. Or they're trying out this new cryptocurrency. You should really get in on it early! And then, only after it's too late, you realize that the person you were talking to was in fact not real at all.
They were a real-time AI-generated deepfake hiding the face of someone running a scam.
This scenario might sound too dystopian or science-fictional to be true, but it has happened to countless people already. With the spike in the capabilities of generative AI over the past few years, scammers can now create realistic fake faces and voices to mask their own in real time. And experts warn that those deepfakes can supercharge a dizzying variety of online scams, from romance to employment to tax fraud.
David Maimon, the head of fraud insights at identity verification firm SentiLink and a professor of criminology at Georgia State University, has been tracking the evolution of AI romance scams and other kinds of AI fraud for the past six years. "We're seeing a dramatic increase in the volume of deepfakes, especially in comparison to 2023 and 2024," Maimon says.
"It wasn't a whole lot. We're talking about maybe four or five a month," he says. "Now, we're seeing hundreds of these on a monthly basis across the board, which is mind-boggling."
Deepfakes are already being used in a variety of online scams. One finance worker in Hong Kong, for example, paid $25 million to a scammer posing as the company's chief financial officer in a deepfaked video call. Some deepfake scammers have even posted instructional videos on YouTube, carrying disclaimers that they are for "pranks and educational purposes only." Those videos usually open with a romance scam call, in which an AI-generated handsome young man talks to an older woman.
More traditional deepfakes, such as a pre-rendered video of a celebrity or politician rather than a live fake, have also become more prevalent. Last year, a retiree in New Zealand lost around $133,000 to a cryptocurrency investment scam after seeing a Facebook advertisement featuring a deepfake of the country's prime minister encouraging people to buy in.
Maimon says SentiLink has started to see deepfakes used to create bank accounts in order to lease an apartment or engage in tax refund fraud. He says an increasing number of companies have also seen deepfakes in video job interviews.
"Anything that requires folks to be online and which supports the opportunity of swapping faces with someone, that will be available and open for fraud to take advantage of," Maimon says.
