Surge in deepfake “Face Swap” attacks puts remote identity verification at risk

By neub9

New research indicates a staggering 704% surge in deepfake “face swap” attacks between the first and second halves of 2023.

A report from biometric firm iProov cautions that fraudsters are increasingly utilizing off-the-shelf tools to create manipulated images and videos in “face-swapping” scams.

iProov’s analysts are monitoring more than 100 face swap apps and repositories, pointing to a wide array of affordable, readily accessible generative AI tools capable of producing deepfakes convincing enough to deceive both humans and some remote identity verification solutions that perform a “liveness” test.

A “liveness” test typically requires an individual to look into a webcam, and possibly turn their head from side to side, to prove they are a real person and to allow their appearance to be compared against identity documents.
According to the report, the malicious actors most frequently rely on face swap tools such as SwapFace, DeepFaceLive, and Swapstream.

Google Trends demonstrates a consistent uptick in searches for these tools over the past year.

The face-swapping software generates a highly realistic synthetic video, which is then fed to a virtual camera that mimics a genuine webcam. This deceives a remote identity verification system into accepting the subject as live and trusting their claimed identity.

Most face swap tools offer a free tier, allowing users to experiment with the technology at no cost, making it more appealing to malicious actors.

As deepfake technology becomes more popular among identity fraudsters, an “arms race” is likely to ensue: security firms will work to detect synthetic media, while perpetrators strive to evade detection.

Editor’s Note: The opinions expressed in this guest author article are solely those of the contributor and do not necessarily reflect those of Tripwire.
