Friday, November 22, 2024

AI bot cloned my voice in minutes & was so believable a pal agreed to send £1k

WHAT if a criminal could take a tiny snippet of your voice from your social media, clone it using AI technology and then trick your friends and family into handing over cash?

It’s a chilling thought, but anyone can fall victim – hoaxers need just a three-second clip of your voice to clone it.

Fabulous writer Miranda Knox sent out an AI-generated plea for money, which was a clone of her own voice. Credit: INSTAGRAM/MIRANDA KNOX

Almost a quarter of Brits have already experienced an AI voice scam or know someone who has, according to research by online protection company McAfee – and a staggering 78 per cent of victims lost money as a result.

And scammers needing such a small voice clip is particularly eye-opening when you consider how much of our lives we now share on social media.

F-Secure Threat Intelligence Lead Laura Kankaala is an expert in security threats, and in just minutes she showed how incredibly easy it is for scammers to take on your identity and use it to extract money for criminal gain.

Using an AI audio generator tool, Laura cloned my voice – and it sounded so close to my own that even my close family and friends, those you’d think would know better, fell for it.

It’s tech anyone can use, and is readily available online, with dozens of free and paid-for sites.

Laura says: “Voice cloning is happening all the time and people don’t often realise how easy it is to copy our likeness – our voice and pictures.

“Right now it’s a big issue as there have been a lot of new tools that have emerged over the last year or so to copy someone’s voice [and] they’re very easy to use.”

One pal distractedly said they were happy to transfer £1k to pay for flights. Credit: Supplied

Freaking out

I’m obviously not a scamming pro, but once my two clips were ready, it was alarmingly quick and easy to fool those closest to me.

I sent two voice notes to close friends and family asking for money, loosely replicating a cruel hoax common with scammers.

To one friend, I sent: “Hi, it’s Miranda! About to book flights but I’ve lost my credit card – really need to book now. Can you send me a grand and I’ll pay you back? I’ll really owe you one!”

We’d chatted previously about going on holiday together and were at the point of booking flights when I sent the voice note.

“I can, but just in a meeting,” she quickly replied – making it obvious just how easy it would be to fall victim when distracted.

Needless to say, I came clean before she got to the point of asking for bank details.

And in a second note, which I sent to friends and family, the cloned voice said:

“Hi, it’s Miranda – having a nightmare and lost my bag and phone! Could I borrow some money so I can get home please?

“Promise I’ll pay you back!”

“Freaking out! Have you actually lost everything?” one pal replied, contacting our mutual friends to see if I was OK.

My brother-in-law did question it initially, but then asked: “How much do you need?” before sending me £30 – which I sent back immediately.

Thankfully there was no fooling my mum, although she admitted it did sound exactly like me and had I “been a better scammer” she could potentially have fallen for it.

Jennifer DeStefano and her daughter were targeted by an elaborate AI voice scam. Credit: Supplied

Emotions take over

It sounds far-fetched that you’d fall for a robot mimicking your pal’s voice – but it’s a lot more believable than people realise.

Scammers rely on emotional manipulation, high-stress scenarios and tight deadlines to apply pressure, so they essentially want all logic to go out the window.

McAfee Senior Security Researcher Oliver Devane explains: “Scammers are using AI voice cloning technology to dupe parents into handing over money, by replicating their children’s voices via call, voicemail or WhatsApp voice note.

“In most of the cases we’re hearing of, the scammer says they have been in a bad accident, such as a car crash, and need money urgently.

“The cybercriminal is betting on a parent becoming worried about their child, letting emotions take over, and sending money to help.”  

While recipients did question the message, they were quick to jump to the aid of ‘Miranda’. Credit: Supplied

It’s not hard to see why people fall for voice scams, and scammers rely on you wanting to help a friend or family member without question. Credit: Supplied

‘The sky’s the limit’

Explaining the process of using voice cloning tech further, F-Secure’s Laura Kankaala says: “In the guidelines and terms it will always say to only use your own voice or a voice that you have permission to use [but] of course cyber criminals are not following any laws so that’s meaningless.

“Some only need three to 10 seconds of audio [to create a clone], and the one I used needs about one minute.

“On social media, a lot of us post videos talking to a camera, which would be one way to obtain the audio needed for this.

“The problem is not that we’re living our lives online and posting things online, as that’s just how the world operates.

“The problem is there are people who want to take advantage of this technology and use it against us to commit crime and scam people out of their money.

“[When it comes to scams], the sky is the limit really.”
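
To put Laura’s point in perspective, below is a minimal sketch of how short the voice-cloning pipeline has become – assuming the open-source Coqui TTS library and its XTTS v2 model, an illustrative choice rather than the specific tool used in this experiment. As the tools’ own terms insist, it should only ever be pointed at your own voice, or one you have explicit permission to use.

# A minimal sketch of consumer-grade voice cloning, assuming the
# open-source Coqui TTS library (pip install TTS) and its XTTS v2 model.
# Illustrative only – not the specific tool used for this article.
from TTS.api import TTS

# Load the multilingual voice-cloning model (downloads on first run).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# "my_voice_sample.wav" is a hypothetical reference clip of your OWN
# voice – a few seconds to a minute of clean speech, as Laura describes.
tts.tts_to_file(
    text="Hi, it's Miranda – having a nightmare and lost my bag and phone!",
    speaker_wav="my_voice_sample.wav",
    language="en",
    file_path="cloned_voice_note.wav",
)

One reference clip in, a convincing voice note out – which is why the terms-of-service warnings Laura mentions are the only real guard-rail, and why they mean nothing to criminals.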

F-Secure Threat Intelligence Lead Laura Kankaala wants to raise awareness about the prevalence of scams, and how people can keep themselves safe online. Credit: Supplied

Scammers are able to take advantage of victims’ genuine concern. Credit: Supplied

Take a step back

I confessed quickly so it didn’t escalate, but what should my friends have done when they received my voice note?

Most, to their credit, did try calling me immediately – which is one way to find out if you’re being tricked, and several people called my husband too.

Laura says: “If you receive anything from anyone suddenly, out of the blue, it’s always good to step back, especially if they’re asking you to send money or click on a link.

“Sit back and try to contact that person directly through a different means, for example, calling them by phone to ask if they’re actually in trouble.”

HOW TO STAY SAFE FROM AI SCAMMERS

Here, F-Secure Threat Intelligence Lead Laura Kankaala shares her top tips to spot a scam, and what you need to be aware of to avoid falling victim…

  • Any type of content can now be fake: AI tools used by scammers rely on data being available on the web. It takes as little as a few seconds of audio to create a convincing AI-generated voice note. Similarly, images can easily be taken from social media and altered. For example, a scammer may use an AI image generator to change the background to a dangerous scenario or add in today’s front page of a newspaper in minutes.
  • Ask a security question or agree a safe word: AI is intelligent but it can’t replicate personal relationships. If suspicious, ask something personal and unique. For example, ‘what’s the name of your first teddy bear?’ or ‘how many bones have you broken and how?’. Avoid asking things like ‘what’s the first line of your address?’ or ‘what is the name of your first pet?’, as scammers can access this type of information, which is often entered online for purchases and account recovery.
  • Agree on a password with family members: It should be noted however that in the case of a real emergency, your child may not remember or say the password out loud, so this might not be the most reliable option.
  • Phone numbers can be spoofed: Even if the call you receive looks like it’s from your child’s phone number, phone numbers can be cloned. This is known as spoofing, and is a common tactic used by scammers as it encourages the recipient to trust the caller. Using a phone number alone as verification is not sufficient.
  • Hang up and tell them you’ll call them back: If it’s a call you’ve received and you suspect it’s not genuinely them, say you’ll hang up and contact them. A phone number can be faked for incoming calls, but not outgoing. So use your phone’s contact app and call your child that way, to confirm it was them.
  • Never give any bank details over the phone: Never share bank details over the phone or via messages/email even in legitimate circumstances, as it increases the risk of someone misusing or accidentally leaking your sensitive information. It’s normal for your child to ask for money, but if any suspicions arise, give them a call and make sure they are okay.
  • Educate and inform: Education is the best form of prevention – by being aware of such scams you can be alert to the warning signs and act accordingly. Share this knowledge with family members – particularly older relatives who may be less technologically savvy. You should also educate your children, even if they’re older and comfortable with technology.

Laura Kankaala is the Threat Intelligence Lead at F-Secure – a cyber security company that helps protect more than 30m people worldwide.

Robot ‘mimicking teen’s cries & screams’

AI voice scams are an increasingly alarming issue, and one that’s sure to become more prevalent as the capabilities of AI technology continue to advance.

When mum-of-four Jennifer DeStefano picked up the phone in January 2023, her blood ran cold as her terrified teen daughter Briana sobbed and screamed for her help – only for it to be an elaborate AI voice scam.

It wasn’t Jennifer’s daughter at all, but an AI robot perfectly mimicking her cries and voice, seemingly as part of an elaborate ploy to scam Jennifer out of tens of thousands of pounds.

To this day, she still has no idea how her daughter’s voice was cloned – or who was responsible – and she had never even heard of AI scams before being subjected to the attempt to trick her into handing over thousands to save Briana from supposed kidnappers.

Mum Jennifer DeStefano was told by scammers that her daughter Briana had been kidnapped and they demanded thousands of pounds for her release. Credit: Facebook/Jennifer DeStefano

Shedding light on ‘a whole new world’

While Jennifer didn’t hand over any money, it was a shocking experience.

Since sharing her story, she has called for greater legislation around AI technology.

She says: “I’ve had so many people reach out to say they’d experienced a similar thing, whether it’s a call about a kidnapping or an accident or they’re in trouble and in prison – there are loads of different scenarios, and the deep fake videos that are coming out now too are so scary. 

“We need more legislation around AI, and AI being used to aid crime. There needs to be penalties and consequences to misusing it.

“It’s a relief my daughter is safe and that situation wasn’t real, however it also sheds light on a whole new world and reality that I had no idea even existed – and that is terrifying.” 

Hoping to raise awareness of this sort of scam and to prevent people falling victim online, Laura adds: “These things sound scary, but I want to talk about this as it’s a tricky world we’re living in right now.

“We’re so dependent on the internet but there are so many ways our data can be weaponised against us – and this is just one example.”
