New artificial intelligence tool can synthesize fake fingerprints

Scientists have developed an artificial intelligence tool that can synthesize fake human fingerprints and potentially mislead biometric authentication systems.

Fingerprint authentication systems are a ubiquitous and highly trusted form of biometric authentication, deployed in billions of smartphones and other devices around the world.

However, researchers from New York University (NYU) in the United States have revealed a surprising level of vulnerability in these systems.

Using a neural network, they developed a fake fingerprint that could fool the biometric authentication systems of one in five people.

In the same way that a master key can unlock every door in a building, these “DeepMasterPrints” use artificial intelligence to match a large number of the impressions stored in fingerprint databases, and could therefore, in theory, unlock a large number of devices.

The work builds on earlier research led by Nasir Memon, a professor at NYU. Memon, who coined the term “MasterPrint,” described how fingerprint-based systems match partial, rather than full, fingerprints to confirm identity.

These devices generally allow users to enroll several different finger images, and a match against any saved partial print is sufficient to confirm identity.

Partial fingerprints are less likely to be unique than full prints, and Memon’s work demonstrated that there are enough similarities between partial prints to create MasterPrints capable of matching many of the partials stored in a database.
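The intuition can be sketched with a toy simulation. Everything here is an illustrative assumption, not the researchers’ actual matcher: partial prints are modeled as short bit vectors that share population-wide structure (as real ridge patterns do), a “match” is simply high bit agreement, and a single probe built from the common structure is tested against every enrolled user.

```python
import random

random.seed(0)

BITS = 64             # toy length of a partial-print template (assumption)
USERS = 1000          # simulated user population
PRINTS_PER_USER = 3   # devices typically store several enrolled partials
THRESHOLD = 0.80      # fraction of agreeing bits needed to "match" (assumption)

# Each bit is 1 with probability p_common, so all templates share
# population-wide structure -- a crude stand-in for common ridge patterns.
P_COMMON = 0.7

def partial_print():
    return [1 if random.random() < P_COMMON else 0 for _ in range(BITS)]

users = [[partial_print() for _ in range(PRINTS_PER_USER)]
         for _ in range(USERS)]

def matches(probe, template):
    agree = sum(a == b for a, b in zip(probe, template))
    return agree / BITS >= THRESHOLD

# A "master" probe that simply follows the population bias: all ones.
master = [1] * BITS

# A user is fooled if the master probe matches ANY of their stored partials.
fooled = sum(any(matches(master, t) for t in prints) for prints in users)
print(f"one master probe matched {fooled} of {USERS} users")
```

Even this crude probe matches a meaningful fraction of users, because matching against *any* of several loosely unique partials is far easier than matching one full print; the real attack goes further by using machine learning to optimize the probe.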

The researchers took this concept further, training a machine learning algorithm to generate synthetic fingerprints that function as MasterPrints.

“Fingerprint-based authentication is still a strong way to protect a device or a system, but at this point, most systems do not verify whether a fingerprint or other biometric is from a real person or a replica,” said Philip Bontrager, a doctoral student.

“These experiments demonstrate the need for multi-factor authentication, and they should be a wake-up call for device manufacturers about the possibility of artificial fingerprint attacks,” he said.