Innovation Insights: More than Meets the Eye – Deception by Deepfakes

By: Brian Laverdure, AAP, Director, Emerging Payments Education

On July 20, 1969, the Apollo Lunar Module Eagle safely touched down in the Sea of Tranquility, and astronauts Neil Armstrong and Buzz Aldrin became the first men to land on the moon. Their accomplishment, famously described by Armstrong as “one small step for [a] man, one giant leap for mankind,” was broadcast live around the world to millions of captivated television viewers. With all those witnesses, surely there can be no doubt about the success of the Apollo mission, right? Well, viewers of a new video released by the Massachusetts Institute of Technology (MIT) may not be so certain! Don’t worry—this article is not an attempt to propagate conspiracy theories about NASA. Instead, it is a cautionary tale about a troubling new technological development that is quickly finding its way into financial crimes: complete deepfakes!

What is a deepfake?
The term “deepfake” is a combination of “deep learning” and “fake,” and it describes the use of artificial intelligence to create synthetic media that blends existing audio or visual material with manipulated speech, text or likenesses of people. A “complete” deepfake is one that fuses several fake elements to create a recording with both artificial audio and artificial video. You may have played with simple apps on your phone that quickly swap your face with those of your friends or even your pets, but deepfakes can be much more sophisticated, unsettling and threatening. MIT’s Center for Advanced Virtuality released its deepfake of President Nixon announcing the failure of the Apollo 11 mission to highlight just how far the underlying technology has advanced in the past few years. To create the clip, researchers combined historical video footage of President Nixon with a video of a voice actor reading a real speech drafted by William Safire in case the astronauts crashed or were unable to return home. Look closely at the video—does it look and sound authentic? Now imagine what criminal enterprises could do with the same technology.

Deepfakes and Financial Crime
Using programs readily available today, criminals can steal pictures from social media accounts and construct fake images to blackmail or embarrass unsuspecting victims. But cybersecurity experts have growing concerns about the dangers that deepfakes may pose to financial systems more broadly. Jon Bateman, a researcher at the Carnegie Endowment for International Peace, published a white paper in July that identified specific synthetic media techniques that could potentially harm the financial services industry, including:

  1. Vishing – Artificial intelligence programs can create cloned voices for social engineering calls. For example, criminals can use vishing to impersonate senior leaders to direct the accounts payable department to originate fraudulent ACH credits.
  2. Synthetic social botnets – Fake social media accounts can support a range of criminal activities, from synthetic identity fraud to romance scams to investment fraud. As artificial intelligence programs mature, the images and videos attached to fake accounts will become much more lifelike, and therefore much more convincing to their audiences.
  3. Fabricated private remarks – Deepfake videos or audio clips that falsely portray a public figure for malicious purposes can be incredibly damaging. Imagine what could happen if bad actors target your financial institution by creating a deepfake video showing your CEO talking about the bank or credit union’s imminent failure! The false speech could quickly spread on social media and lead to a ruinous bank run.

What happens next?
Synthetic media, combined with synthetic identity fraud, will likely challenge existing customer/member identification and authentication processes and may help criminals evade current fraud monitoring systems. Although no one can predict exactly how deepfakes may disrupt our lives or the integrity of our financial systems, financial institutions and account holders should remain vigilant and informed about new developments. To prepare for the uncertainty of the future, financial institution employees need the latest news and up-to-date training to remain one step ahead of bad actors. EPCOR members now have access to many different resources to help them learn about new fraud trends, such as:

And, for more information on deepfakes, check out the following sources: