Speaking at the US Federal Reserve on Tuesday, OpenAI boss Sam Altman said the world is on the brink of a ‘fraud crisis’.
Altman warned that artificial intelligence has caught up with the scam prevention techniques used by banks, and that fraudsters could use platforms such as his own ChatGPT to scam unsuspecting victims.
He said: ‘AI has fully defeated most of the ways that people authenticate currently, other than passwords.’
‘I am very nervous that we have an impending, significant […] fraud crisis.’
Scammers using our own voices and images against us is a worrying prospect, and it is already translating into a tangible problem in the UK.
Almost a third, 28 per cent, of people in the UK think they have been targeted by an AI voice cloning scam in the past year, according to 2024 data from Starling Bank.
This is when a scammer uses technology to impersonate someone’s voice on the phone, for example to persuade a family member to transfer them money.
Financial fraud in the UK is rising, with offences jumping 46 per cent in the last year alone.
One estimate from the Global Anti-Scam Alliance suggests a staggering $1.03 trillion was lost to scammers across the globe in 2024.
Not only can AI be used to create voice clones and ‘deepfake’ videos, it also allows scammers to commit fraud at immense scale, contacting countless victims and rapidly harvesting their data.
How to spot an AI cloning scam
Just a third of people know how to look out for an AI cloning scam, according to Starling Bank, so many could sleepwalk into fraudsters’ traps and not realise they are on the hook until it is too late.
Most commonly, these cloning scams look to replicate the voice of someone you know well, be it a child, parent or friend.
What this also means, though, is that your knowledge of that person may give you the upper hand in a call with a scammer.
Sarah Lennette, financial crime specialist at Starling, said: ‘Think about the tone with which the “person” is speaking.
‘Does their voice sound flatter? Is there any background noise that might come with a usual voice clip? It’s also worth listening for any sudden shifts in pronunciation and unnatural pauses.
‘If the voice is meant to be that of someone you know, I’d also recommend thinking about whether this sounds like something they would really say, as well as how they are “saying” it.’
Of course, not all cloning scams will use the voice of someone you know well enough to discern the difference in sound.
Lennette warns that any caller asking you for money, or for personal or financial details, should be a red flag you don’t ignore.
How can you protect yourself?
Lennette suggests setting up a safe phrase with your close family and friends.
This is a phrase known only to you and them, and can be used in much the same way as a password.
To ensure it is secret, Lennette says it should be ‘completely separate from any other passwords, memorable data or recognisable attributes like your favourite football team.’
If you haven’t had the forethought to set up a safe phrase, the next best thing is to rely on knowledge shared only between you and the person you think you are speaking to.
This means asking a question that only they would know the answer to, and that isn’t available online.
To protect others from being scammed, and yourself from being cloned, it is wise to prevent a scammer from having access to the data they need to clone your voice in the first place.
Lennette warns that scammers need only three seconds of audio to clone a voice, so it is best to avoid a public social media presence and to accept only friends and followers you know.