Bank fraud | New tool: voice deepfakes

Last spring, Florida investor Clive Kabatznik called his local Bank of America representative to discuss a large money transfer he was planning to make. Then he called back.

Except the second call was not from Kabatznik. Software had artificially generated his voice and tried to trick the banker into transferring the money elsewhere.

Mr. Kabatznik and his banker were the targets of a cutting-edge scam attempt that is attracting the attention of cybersecurity experts: the use of artificial intelligence to generate voice deepfakes, that is, vocal renderings that mimic the voices of real people.

The problem is still new enough that there is no comprehensive count of how often these scams occur. But an expert whose company, Pindrop, monitors audio traffic for a significant number of major U.S. banks says he has seen an increase in the phenomenon this year, and in the sophistication of scammers’ voice fraud attempts. Another major voice authentication vendor, Nuance, saw its first successful deepfake attack against a financial services customer late last year.

In Mr. Kabatznik’s case, the fraud was detectable. But the speed of technological development, the falling costs of generative artificial intelligence programs, and the wide availability of voice recordings on the internet have created the perfect conditions for voice-related AI scams.

Customer data such as bank details, stolen by hackers and widely available on underground markets, help scammers carry out these attacks. The attacks are even easier to mount against wealthy clients, whose public appearances, including speeches, are often widely available on the internet. Finding audio samples of ordinary customers can also be as simple as searching online – for example on social media apps like TikTok and Instagram – for the name of someone whose banking details the scammers already have.

“There’s a lot of audio content,” says Vijay Balasubramaniyan, CEO and founder of Pindrop, which reviews automatic voice verification systems from eight of the top ten US lenders.
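Systems of this kind typically work by comparing the voice on a call against a voiceprint enrolled for the account holder and accepting the caller only if the match score clears a threshold. The sketch below is a minimal, purely conceptual illustration of that idea, not Pindrop’s technology: the embedding function is a placeholder and the similarity threshold is an assumed value.

```python
# Purely illustrative voiceprint check (not Pindrop's or any vendor's method).
# A real system would use a trained speaker-embedding network plus liveness
# detection and risk scoring; embed() here is only a stand-in.
import numpy as np

def embed(audio: np.ndarray) -> np.ndarray:
    """Placeholder speaker embedding: maps audio to a fixed-length vector."""
    rng = np.random.default_rng(abs(hash(audio.tobytes())) % (2**32))
    return rng.standard_normal(192)

def caller_matches_enrollment(call_audio: np.ndarray,
                              enrolled_voiceprint: np.ndarray,
                              threshold: float = 0.7) -> bool:
    """Accept the caller if cosine similarity to the enrolled voiceprint
    clears an assumed threshold."""
    e = embed(call_audio)
    similarity = float(np.dot(e, enrolled_voiceprint) /
                       (np.linalg.norm(e) * np.linalg.norm(enrolled_voiceprint)))
    return similarity >= threshold
```

A convincing deepfake can fool a naive matcher like this one, which is why the artifact-based anti-spoofing checks described further below matter as a second line of defense.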

Over the past ten years, Pindrop has reviewed the records of more than 5 billion calls received by the call centers of the financial companies it serves. These centers manage products such as bank accounts, credit cards and other services offered by major retail banks. All call centers receive calls from fraudsters, usually between 1,000 and 10,000 per year. According to Mr. Balasubramaniyan, it is not uncommon to receive 20 calls from fraudsters a week.

Most of the fake voice attacks Pindrop has observed have taken place in credit card service call centers where human representatives deal with customers who need help with their card.

Mr. Balasubramaniyan played a reporter an anonymized recording of one such call, which took place in March. Although it is a very rudimentary example – the voice here is robotic, more reminiscent of an e-reader than a person – the call illustrates how such scams can unfold as AI makes it easier to imitate human voices.

A banker is heard greeting the customer. Then the voice, which sounds automated, says: “My card has been declined.”

The banker replies, “May I ask whom I have the pleasure of speaking with?”

“My card was declined,” the voice repeats.

The banker asks for the customer’s name again. A silence follows, during which a faint tapping sound is heard. According to Mr. Balasubramaniyan, the number of keystrokes corresponds to the number of letters in the customer’s name: the fraudster is typing words into a program that then reads them aloud.

In this case, the caller’s stilted, artificial delivery led the employee to transfer the call to another department and flag it as potentially fraudulent, says Mr. Balasubramaniyan.

“Synthetic speech leaves artifacts behind, and many anti-spoofing algorithms rely on these artifacts,” says Peter Soufleris, CEO of voice biometrics technology provider IngenID.
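Anti-spoofing checks of this sort inspect the audio signal itself for statistical fingerprints of synthesis. The sketch below is a toy illustration of that general idea, not IngenID’s or Pindrop’s algorithm: the assumption that synthetic speech carries thin high-frequency energy, the band edge, and the threshold are all illustrative choices.

```python
# Toy artifact check: flag a call when high-frequency energy looks
# suspiciously thin, one (weak) possible sign of synthesized speech.
# Band edge and threshold are arbitrary assumptions for illustration.
import numpy as np
from scipy.signal import spectrogram

def high_band_ratio(audio: np.ndarray, sample_rate: int,
                    band_edge_hz: float = 6000.0) -> float:
    """Fraction of spectral energy above band_edge_hz."""
    freqs, _, sxx = spectrogram(audio, fs=sample_rate)
    total = sxx.sum()
    return float(sxx[freqs >= band_edge_hz].sum() / total) if total > 0 else 0.0

def flag_for_review(audio: np.ndarray, sample_rate: int,
                    ratio_threshold: float = 0.01) -> bool:
    """Route the call to a human fraud analyst if the artifact score trips."""
    return high_band_ratio(audio, sample_rate) < ratio_threshold
```

In production, a signal like this would be only one of many features feeding a classifier, precisely because, as described next, this is an arms race in which newer generators leave fewer obvious artifacts.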

But as with many security measures, this is an arms race between attackers and defenders, and it has recently shifted. A scammer can now simply speak into a microphone or type a prompt and have the words rendered almost instantly in the voice of the person they are impersonating.

Mr. Balasubramaniyan noted that one generative AI system, Microsoft’s VALL-E, can create a voice deepfake that says whatever a user wishes from an audio sample of just three seconds.

While scary deepfake demonstrations are commonplace at security conferences, actual attacks remain exceedingly rare, says Brett Beranek, general manager of security and biometrics at Nuance, a voice technology provider that Microsoft acquired in 2021. The single successful breach at a Nuance customer, in October, required more than a dozen attempts by the attacker.

What concerns Mr. Beranek most is not attacks on call centers or automated systems, such as the voice biometrics systems deployed by many banks. He worries about scams in which a caller reaches a person directly – a company’s CEO, say – while pretending to be someone else.

This is what happened in the case of Mr. Kabatznik. According to the banker’s description, the caller seemed to be trying to get her to transfer money to a new destination, but the voice was repetitive, talked over her, and used garbled phrases. The banker hung up.

After receiving two more such calls in quick succession, the banker reported the matter to Bank of America’s security team, Mr. Kabatznik said. Concerned about the security of Mr. Kabatznik’s account, she stopped answering his calls and emails, even those from the real Mr. Kabatznik. It took about ten days for the two to re-establish contact, when Mr. Kabatznik arranged to visit her at her office.

“We regularly train our team to identify and recognize scams and help our customers avoid them,” said William Halldin, a Bank of America spokesperson, adding that he could not comment on specific clients or their experiences.

Although the attacks are increasingly sophisticated, they stem from a basic cybersecurity threat that has existed for decades: a data breach that reveals the personal information of a bank’s customers.

Once they harvest a batch of numbers, hackers sift through the information and match it to real people. Those who steal the information are almost never the ones who end up using it; instead, the thieves put it up for sale. Specialists can then use one of many readily available programs to spoof the phone numbers of targeted customers – which is likely what happened in Mr. Kabatznik’s case.

It is easy to find recordings of his voice. On the internet, there are videos of him speaking at a conference and participating in a fundraiser.

“I think it’s pretty scary,” Mr. Kabatznik said. “The problem is, I don’t know what to do. Should we just go underground and disappear?”
