Researchers use the CycleGAN tool to fool facial recognition systems at airports and railway stations into misidentifying people
You must have seen many movies in which the villain gets caught by face recognition tech installed at an airport or a railway station. This tech is a reality and is increasingly used at airports, malls, railway stations, and similar venues. Facial recognition tech uses artificial intelligence to identify the people teeming through airports and railway stations.
However, security researchers at McAfee have discovered a way to fool facial recognition artificial intelligence into misidentifying a target. The researchers have written a research paper titled “Dopple-ganging up on Facial Recognition Systems.” Their research shows how the artificial intelligence powering facial recognition systems used throughout the world can easily be fooled.
The research team of Steve Povolny and Jesse Chick found that they could confuse the facial recognition AI into thinking it was seeing a completely different person, thereby fooling the system. To trick the face recognition algorithm, the researchers at McAfee used CycleGAN, an image-translation algorithm that can transform a picture into a totally different image.
Cycle Generative Adversarial Network, or CycleGAN, is a technique in which deep convolutional neural networks are trained for image-to-image translation. CycleGAN translates images using their higher-level features, such as the shape of the head, eye placement, and body size.
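The core training idea behind CycleGAN is cycle consistency: translating an image to the other domain and back should reproduce the original. Here is a minimal sketch of that objective, with toy stand-in functions (`G`, `F`, and their arithmetic “translations” are hypothetical illustrations, not McAfee's or CycleGAN's actual models):

```python
import numpy as np

# Toy stand-ins for CycleGAN's two generators (hypothetical, for illustration):
# G "translates" domain A -> B, F "translates" domain B -> A.
def G(image_a):
    return image_a * 0.5 + 0.1  # pretend translation to domain B

def F(image_b):
    return (image_b - 0.1) * 2.0  # pretend translation back to domain A

def cycle_consistency_loss(image_a, image_b):
    """L1 cycle loss: A -> B -> A and B -> A -> B should reconstruct the inputs."""
    forward = np.mean(np.abs(F(G(image_a)) - image_a))
    backward = np.mean(np.abs(G(F(image_b)) - image_b))
    return forward + backward

a = np.random.rand(64, 64, 3)  # stand-in "photo" from domain A
b = np.random.rand(64, 64, 3)  # stand-in "photo" from domain B

# Because F exactly inverts G in this toy, the cycle loss is ~0.
# Real CycleGAN training minimizes this term alongside adversarial losses.
print(round(cycle_consistency_loss(a, b), 6))  # -> 0.0
```

In the real system, `G` and `F` are deep convolutional networks trained jointly with discriminators; the cycle loss is what lets CycleGAN learn translations without paired before/after images.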
McAfee researchers fed nearly 1,500 photos to CycleGAN, and after hundreds of attempts, CycleGAN created an image that the facial recognition system identified as someone other than the person the human eye perceived.
But there are two caveats to the study. First, the researchers tested a facial recognition system similar to, but not the same as, those used in airport security. “I think for an attacker that is going to be the hardest part to overcome, where [they] don’t have access to the target system,” said Povolny. Second, CycleGAN takes time to create such an image, and the software requires high-end hardware to run.
The researchers’ aim in the study was to point out the vulnerability of facial recognition systems and the dangers of relying solely on these checks.
“AI and facial recognition are incredibly powerful tools to assist in the pipeline of identifying and authorizing people,” Povolny says. “But when you just take them and blindly replace an existing system that relies entirely on a human without having some kind of a secondary check, then you all of a sudden have introduced maybe a greater weakness than you had before.”