
This AI-generated grandma thwarts scammers with long stories about her cat

by News7

An AI-generated rendition of the “Daisy” AI bores a scammer with descriptions of her fictional cat. Credit: O2

Scammers worldwide are making off with record sums. Last year alone, the Federal Trade Commission estimates, US consumers lost a record $10 billion to fraud, a 14% increase from just one year prior. More and more, scammers are targeting older, vulnerable people over the phone. More than two-thirds of UK residents over the age of 75 surveyed in a recent study said they had experienced at least one fraud attempt in the past six months, and 40% of those respondents faced frequent attempts.

Now, an AI-generated UK grandmother named “Daisy” is trying to make scammers’ jobs a bit more tedious. UK mobile operator Virgin Media O2 created Daisy to speak with bad actors and waste as much of their time as possible. Using ChatGPT-like large language models, Daisy rambles on about her passion for knitting and tells long-winded, fabricated stories about family members, all with the goal of keeping scammers on the line. In theory, every minute a scammer spends in frustrating conversation with Daisy about her made-up family or daily chores is one less minute spent targeting a real person.

“The newest member of our fraud-prevention team, Daisy, is turning the tables on scammers–outsmarting and outmaneuvering them at their own cruel game simply by keeping them on the line,” Virgin Media O2 Director of Fraud Murray Mackenzie said in a blog post. 

AI grandmother was trained on real scam calls 

O2 says it worked with professional scam network disruptors to have phone numbers linked to the AI added to known lists of numbers targeted by scammers. If a scammer calls one of those numbers, they immediately start interacting with Daisy. Recordings of conversations posted by O2 show Daisy trolling exasperated scammers by talking about her fictional cat “Fluffy” and generally dancing around their questions. Daisy will also provide scammers with false personal information and bogus banking details to make them think they are actually defrauding a real person. These conversations can rattle scammers: in clips provided by O2, frustrated callers can be heard yelling expletives at the AI on the other end.

“Stop calling me ‘dear’ you stupid [expletive],” one scammer can be heard saying. 

“Got it, dear,” Daisy responds. 

To do all of this, Daisy first uses a speech-to-text AI model to transcribe the scammer’s speech. It then runs that transcript through a second model, a large language model that drafts a relevant response. A text-to-speech model then vocalizes that response in the voice of an older woman. All of this processing happens within seconds, so scammers believe they are speaking with a real person. Daisy was trained using real recordings of “scam baiters” collected by O2.
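O2 has not published Daisy’s code, but the three-stage loop described above can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration: the OpenAI Python SDK, the specific model names, the persona prompt, and the handle_scammer_turn helper are all assumptions chosen for the example, standing in for whichever models and infrastructure O2 actually uses.

```python
# Hypothetical sketch of Daisy's three-stage loop: transcribe the caller,
# draft an in-persona reply with an LLM, then voice it as an older woman.
# Model names and the OpenAI SDK are stand-ins; O2 has not published its code.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PERSONA = (
    "You are Daisy, a chatty British grandmother. Ramble about knitting, "
    "your cat Fluffy, and your family. Never share real personal data, and "
    "answer questions slowly and indirectly to keep the caller on the line."
)

def handle_scammer_turn(audio_path: str, history: list[dict]) -> bytes:
    """Process one turn of the call and return audio of Daisy's reply."""
    # 1. Speech-to-text: transcribe the scammer's latest utterance.
    with open(audio_path, "rb") as f:
        transcript = client.audio.transcriptions.create(
            model="whisper-1", file=f
        ).text

    # 2. Large language model: draft a long-winded, in-character response.
    history.append({"role": "user", "content": transcript})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "system", "content": PERSONA}, *history],
    ).choices[0].message.content
    history.append({"role": "assistant", "content": reply})

    # 3. Text-to-speech: vocalize the reply in an elderly-sounding voice.
    speech = client.audio.speech.create(
        model="tts-1", voice="shimmer", input=reply
    )
    return speech.content  # raw audio bytes to play back into the call
```

In a real deployment the transcription and synthesis steps would likely stream audio rather than process whole files, since keeping the response delay to a few seconds is what makes the call feel like a conversation with a real person.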

O2 deliberately gave Daisy the persona of an older woman because older people are disproportionately targeted by scams. Daisy was programmed to engage in meandering, long-winded conversations designed to keep scammers talking; the model has reportedly already kept numerous scammers on the line for over 40 minutes.

Callers in the UK targeted by scammers can divert their assailants to the AI by forwarding the call they receive to the number 7726, which routes it to the Daisy hotline. O2 says it is hopeful Daisy can make a meaningful difference amid a surge in fraudulent phone activity: around one in five British respondents surveyed by O2 in its recent research reported being targeted by a scam every week.

AI is also contributing to new scams 

While Daisy is tasked with stopping fraud, scammers are using similar AI tools to launch a variety of new attacks. So-called AI “voice clones,” which use snippets of audio to mimic a person’s voice, have been used in recent years to commit bank and wire fraud. In several extreme examples, scammers have even used AI to trick people into believing their loved ones had been kidnapped or held hostage. The victims, believing their son or daughter is in imminent danger, then pay a ransom for a kidnapping that never happened. Scams like these are becoming more common: one in four respondents recently surveyed by cybersecurity firm McAfee said they or someone they knew had been targeted by an AI voice-clone scam. Tools like Daisy could theoretically help stem that tide by sending other AI scam bots down winding rabbit holes.

“Let’s face it dear,” Daisy said in one recording. “I’ve got all the time in the world.”

 


Source: Popular Science
