Deepfake Scams and KYC: Challenges for Web3

Is that person you’re seeing online real, or a digital creation? Advances in computer-generated imagery (CGI) have led to increasingly realistic fake photos and videos. The same type of technology that lets us enjoy realistic dinosaurs and giant Transformers at the movies can also be used to create ‘deepfakes’ that impersonate real people. Where photo or video biometrics are required for identity verification, deepfakes can enable thieves to access your secure financial accounts.

Anyone who owns a smartphone that unlocks with face recognition can understand how easily a fabricated video snippet that appears to be your face could compromise your secure data. Deepfakes are a growing concern, even for web3 users who may trade crypto or participate in blockchain projects in trustless environments with no personal-ID requirements. That’s because at certain points in crypto transactions, Know Your Customer (KYC) regulations may apply and demand ID verification.

To better understand the challenge deepfakes present to KYC identification and how such attacks can be prevented, let’s go over some fundamentals that underlie the biometric ID verification process. We’ve also got tips for you on how to avoid being the victim of a deepfake attack.

What are KYC requirements?

KYC rules are intended to prevent fraud and money laundering by confirming the identity of financial customers. KYC is part of the anti-money laundering regulations that TradFi institutions must follow. KYC has three components:  

  • Customer identification: Institutions must obtain four identifiers from new customers: name, address, date of birth, and an ID number such as a driver’s license number.
  • Customer due diligence: In this phase, those credentials are reviewed for authenticity and approved or denied.
  • Enhanced due diligence: Customers flagged as higher-risk for ties to terrorism, money laundering, or other criminal activity receive additional, ongoing monitoring.

Once a customer is initially verified and sets up their financial account, there’s the question of how they log in. A simple username and password was once the standard, but that proved too easy to hack, and multi-factor authentication became the norm. In addition to a username and password, many financial accounts now require another step, such as entering a code sent via email or phone.
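
To make the idea concrete, here is a minimal sketch of a one-time code acting as a second factor. It uses the open-source pyotp library and time-based codes; the emailed or texted codes described above rest on the same principle of a short-lived code only the legitimate user should possess at that moment. The variable names are illustrative.

```python
# A minimal sketch of a one-time-code second factor, using the open-source
# pyotp library. Time-based codes are shown; SMS/email codes follow the
# same idea of a short-lived, single-use secret.
import pyotp

# At enrollment, the service and the user's device share this secret.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

code = totp.now()             # the code the user's device displays right now
print(totp.verify(code))      # True: the server accepts a current, matching code
print(totp.verify("000000"))  # False (almost surely): a guessed code is rejected
```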

But multi-factor authentication proved vulnerable too: steal someone’s phone and you receive their code. Security norms have since shifted toward biometric ID. At first, a fingerprint was the commonly accepted biometric, but the standard soon moved on to facial recognition. And that’s where our deepfake troubles begin.

What is a deepfake scam?

In a deepfake scam, an altered photo or video is used for deception and theft. The attacker procures a photo or video of you online, often from your social-media profiles, then combines the stolen imagery with other login credentials they’ve stolen to access your accounts and steal your assets. The login system is fooled by the faked photo or video.

How deepfake scams attack the KYC function

How exactly is a deepfake created? If you’ve ever seen a giggling baby with Elon Musk’s head, you know that existing or ‘native’ videos can easily be altered to insert a different person’s face. Popular apps such as FaceApp and Reface use AI-powered face swapping to replace a face in a video with one from an uploaded image. As with turning Musk into a baby, these tools are often used for humor or simple amusement.

When it comes to using deepfakes for ID fraud, more effort and skill are needed. Deepfake apps use deep-learning AI to refine their output until the fake images can often fool biometric ID systems. This is accomplished using generative adversarial networks (GANs).

GANs have two components: one model generates deepfakes, while the adversarial ‘discriminator’ model judges whether the image it’s presented with is real or fake. This information is fed back to the generator, guiding it on how to do better. In this way, the two sides of the GAN create a feedback loop that helps deepfake creators steadily improve their deception.
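
As a rough illustration of that feedback loop, here is a toy GAN training loop in PyTorch. The models are deliberately tiny and the ‘real’ data is just a 1-D distribution; real deepfake systems train far larger image models, but the generator/discriminator dynamic is the same.

```python
# A toy GAN feedback loop in PyTorch. The models are deliberately tiny
# and the "real" data is just a 1-D distribution; this only illustrates
# the generator/discriminator dynamic described above.
import torch
import torch.nn as nn

latent_dim = 8
generator = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))
discriminator = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)

for step in range(1000):
    real = torch.randn(64, 1) * 0.5 + 3.0         # stand-in for genuine images
    fake = generator(torch.randn(64, latent_dim))  # the generator's forgeries

    # The discriminator learns to label real as 1 and fake as 0...
    d_loss = loss_fn(discriminator(real), torch.ones(64, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(64, 1))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # ...and its judgment is the feedback the generator trains against,
    # pushing the forgeries to be scored as real.
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()
```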

Deepfakes are highly successful

Why have thieves gravitated towards deepfakes? Because they’re a highly effective way to defeat biometric ID requirements, a 2022 Sensity study found. The security-software company’s researchers developed a Deepfake Offensive Toolkit that produces deepfaked photos and videos of users, then tested those deepfakes against popular, commercially used KYC systems.

The result: The team’s deepfaked IDs deceived the KYC systems 86% of the time, compared to just 17.3% for traditional KYC spoofing techniques.

Deepfakes can execute live verification commands

Of course, not all biometric ID systems are created equal. Some simply take a quick look at your photo, or you stick your face in front of your phone for a moment, and you're verified. These simple steps are fairly easy to defeat with deepfakes.

A more sophisticated version of facial ID is ‘live’ verification, in which the biometric ID system issues random commands for the subject to look to the left, blink, look down, and such. The idea is to ensure that the account owner is actually present. Live verification is obviously a more difficult biometric hurdle for deepfakes–but apparently, not an insurmountable one.
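
To sketch the mechanics, a liveness check of this kind boils down to an unpredictable challenge with a short deadline. The snippet below is a hypothetical illustration, not any vendor’s actual protocol: the command list, timings, and the check_user_performed hook are all invented for the example.

```python
# A hypothetical sketch of a 'live' verification challenge: random
# commands in an unpredictable order, with a short deadline. The command
# list, timing, and check_user_performed hook are invented for illustration.
import secrets
import time

COMMANDS = ["look left", "look right", "look up", "look down", "blink"]

def issue_challenge(length=3, deadline_seconds=5.0):
    """Pick an unpredictable command sequence the subject must perform live."""
    challenge = [secrets.choice(COMMANDS) for _ in range(length)]
    return challenge, time.monotonic() + deadline_seconds

def verify(challenge, deadline, check_user_performed):
    """Pass only if every command is performed before the deadline expires."""
    for command in challenge:
        if time.monotonic() > deadline:
            return False  # too slow: a live human should respond in seconds
        if not check_user_performed(command):
            return False  # command not performed on camera
    return True
```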

It appears some deepfakes are already complying with these random live-verification commands, Binance chief security officer Jimmy Su recently told Cointelegraph:

“Some of the verification requires the user, for example, to blink their left eye or look to the left or to the right, look up or look down,” Su said. “The deepfakes are advanced enough today that they can actually execute those commands.”

The speed at which AI improves deepfakes means that as security measures advance, it may be only a short time before deepfakes catch up again.

“Both face recognition and liveness detection are far from perfect technologies,” the Sensity report concluded. “Active liveness is weak against deepfake attacks.”

Their recommendation? Operators of KYC systems that use facial recognition should constantly test those systems internally with deepfakes to identify ways to improve detection.

How much money can deepfake fraudsters make off with?

The sky’s the limit. For instance, the Sensity report cites a 2021 case in which Chinese deepfake scammers created a sham corporation and successfully submitted $76.2 million in fraudulent tax-refund invoices to the Chinese government. Verifying their identity to claim the loot involved a $250 specialized phone, black-market facial images, and software that let them hijack a mobile camera to spoof the required face-recognition and liveness tests.

Why are deepfakes a problem in web3?

If traditional forms of ID aren’t required in web3, why are deepfakes a concern? Because there are still several points at which web3 users may be targeted using deepfakes.

The crypto-to-fiat conversion

When converting crypto to fiat currency, biometric ID may be required by your financial institution as part of its compliance with anti-money laundering regulations. This offers an opportunity for an attacker to insert a deepfake during the transfer of funds and gain access to your account.

Crypto exchanges typically enable transfers from crypto into fiat currency, so they usually have KYC requirements for those transactions. If a platform uses face recognition as one of its forms of ID, those transactions may be vulnerable to deepfake attacks. Yes, it’s a compromise away from pure web3, but in situations like this, regulatory compliance demands it.

Impersonation scams

Major figures in the crypto scene are finding deepfakes of themselves online pitching bogus investment opportunities. The videos appear to be associated with established, legitimate web3 platforms, but it’s all a scam. It’s not exactly KYC; more like ‘know your expert.’

For instance, an AI-generated video of what appears to be Binance CEO Changpeng ‘CZ’ Zhao circulated on Twitter in February 2023. In the video, ‘Zhao’ urged viewers to join a fake Binance Telegram channel. Luring users to the channel gives attackers a venue for obtaining Binance users’ names and wallet addresses. Armed with this data, attackers can find photos of the wallet owners online to use in creating deepfakes.

Tips to avoid deepfake KYC scams

How can you avoid being the victim of a deepfake scam?

Keep your holdings, wallet location, and keys private

Remember, to accomplish a deepfake deception, an attacker would first need to identify you as a holder of substantial crypto and learn where you keep a hot wallet. Then they’d need your private login keys. Armed with these details, they could find a photo of you on Facebook or elsewhere online, feed it into a deepfake app, and use the result to pass the final identification requirement and access your account.

This means you can head off deepfake thieves early on by not sharing your real name, wallet location, or private keys with anyone you haven’t verified as trustworthy. As you saw with the fake CZ video, don’t trust what you stumble upon in social media.

Remember, scammers often impersonate legitimate organizations to trick users into disclosing sensitive information. And deepfakes have now advanced to the point where even videos of trusted thought leaders may not be real. Rather than clicking a link you’re sent or see online, always navigate directly to the site you think is reaching out to you and verify that the offer is legitimate.

Also, avoid mentioning your crypto holdings on social media; that’s like waving a red flag at thieves.

Consider self-custody

Storing your crypto assets in a cold, physical wallet you keep mostly offline greatly reduces your exposure to potential attackers. With a self-custodial wallet, you can avoid biometric login requirements and minimize the risk of being hacked with a deepfake video.

If you find cold storage impractical, consider adopting a multi-wallet approach. It’s a bit like having both a checking and a savings account at your bank. You can keep the bulk of your crypto assets in a cold wallet for security, while keeping a hot wallet on a major platform for daily transactions.

Use stealth addresses

Stealth addresses are one-time wallet addresses that enhance anonymity in cryptocurrency transactions. They act as proxies for a user’s actual wallet address, making it difficult for attackers to identify your real wallet address. And without your actual address, it would be difficult to launch a deepfake attack to access your crypto.
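
To make the mechanism concrete, here is an educational sketch of the dual-key stealth-address idea using textbook secp256k1 arithmetic. Everything below is a simplified illustration: real protocols add important details such as view tags and address encoding, and production wallets use audited cryptographic libraries rather than hand-rolled curve math.

```python
# An educational sketch of the dual-key stealth-address idea using
# textbook secp256k1 math. For illustration only; do not use in production.
import hashlib
import secrets

# secp256k1 parameters: field prime, group order, and base point G.
P = 2**256 - 2**32 - 977
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def add(a, b):
    """Elliptic-curve point addition (None is the point at infinity)."""
    if a is None: return b
    if b is None: return a
    if a[0] == b[0] and (a[1] + b[1]) % P == 0: return None
    if a == b:
        lam = (3 * a[0] * a[0]) * pow(2 * a[1], -1, P) % P
    else:
        lam = (b[1] - a[1]) * pow(b[0] - a[0], -1, P) % P
    x = (lam * lam - a[0] - b[0]) % P
    return (x, (lam * (a[0] - x) - a[1]) % P)

def mul(k, point=G):
    """Scalar multiplication via double-and-add."""
    result, addend = None, point
    while k:
        if k & 1: result = add(result, addend)
        addend = add(addend, addend)
        k >>= 1
    return result

def h(point):
    """Hash a point to a scalar (simplified for the sketch)."""
    return int.from_bytes(hashlib.sha256(str(point).encode()).digest(), "big") % N

# Recipient publishes two public keys: one for scanning, one for spending.
scan_priv, spend_priv = secrets.randbelow(N - 1) + 1, secrets.randbelow(N - 1) + 1
scan_pub, spend_pub = mul(scan_priv), mul(spend_priv)

# Sender: one ephemeral key per payment yields a fresh one-time address.
eph_priv = secrets.randbelow(N - 1) + 1
eph_pub = mul(eph_priv)                     # published alongside the payment
shared = h(mul(eph_priv, scan_pub))         # ECDH with the recipient's scan key
one_time_pub = add(spend_pub, mul(shared))  # unlinkable to the recipient's keys

# Recipient scans the chain: the same shared secret lets them recognize
# (and, with spend_priv + shared, spend from) the one-time address.
shared2 = h(mul(scan_priv, eph_pub))
assert add(spend_pub, mul(shared2)) == one_time_pub
```

The key point: each payment lands at a fresh address that outside observers cannot link to the recipient’s published keys, yet the recipient can still find and spend it, which is exactly what denies a deepfake attacker the wallet address they would need.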

Employ robust tools

Organizations focused on web3 security are racing to offer solutions that help biometric ID systems better detect deepfakes. For instance, NorthRow’s RemoteVerify combines liveness detection and biometric authentication for a more robust solution that’s harder for deepfakes to fool, and Qoobis is also at work on advanced KYC/AML solutions. If your web3 project requires KYC and face ID must be used, explore the best available detection technology.

In the traditional banking industry, iProov’s solution employs ‘controlled illumination’–a unique, one-time sequence of flashing colors the login device must display for verification. For now, it’s thought this color-sequence approach would be impossible for a deepfake video to anticipate.

Avoid verification via face ID

The simplest way to avoid being a victim of deepfake fraud is to reject login schemes that require face ID. If you have a choice of web3 platforms and banks that require KYC, consider doing business with institutions that use fingerprints for biometric verification, or that use other forms of ID that don’t rely on facial recognition.

Stay informed

With AI advancing at an astonishing pace, it’s important to have a network that keeps you in the know about new threats. At Dragonscale, we have developed Iris Collective, a decentralized and inclusive community that promotes transparency and trust across web3 by validating entities to distinguish trusted actors from malicious ones. Participating in a community that provides real-time information on emerging threats keeps your knowledge current and helps you avoid being a victim as deepfakes continue to improve their ability to defeat security systems.
