Flaws in biometrics implementation

Here are two recent examples of well-established companies failing to implement biometrics as safely as they could have, where the benefits of the method are arguably outweighed by risks introduced by design:

Samsung S10: https://www.bbc.co.uk/news/technology-50080586

Google Pixel 4: https://www.macrumors.com/2019/10/18/google-pixel-4s-face-unlock-works-eyes-closed/


I guess we shouldn’t dismiss biometric device security altogether because of these flaws, but they are certainly serious, and it is quite worrying that something as simple as a screen protector can sidestep an entire authentication mechanism.


I think biometrics are more security theatre than real security. Most of the value of these systems comes from the fact that authenticating requires possession of your phone, rather than the fact that it requires a fingerprint.

You leave fingerprints everywhere but quickly notice if you leave your phone behind.

Wired have an article on using lasers to mimic voice commands for Alexa, Google Home and Siri. To summarise, the technique pulses a laser at the device’s microphone, mimicking a voice and triggering a response. These devices do offer speaker recognition, either for all commands or only for sensitive ones such as ‘unlock the front door’, so your voice is your biometric security here.

But we speak in public all the time, and over phone lines that can be tapped, so it wouldn’t be hard to steal your voice, particularly if you were in the public eye like a politician or pop star. The attack would simply process your stolen voice into a command and send it to the device with a laser (so effectively silently). Hmmm.
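To make the modulation step concrete, the attack works because MEMS microphones respond to rapid changes in light intensity as if they were sound pressure, so the command audio is amplitude-modulated onto the laser’s brightness. Here is a minimal numpy sketch of that idea; the `bias` and `depth` values and the 1 kHz test tone are purely illustrative assumptions, not details from the article:

```python
import numpy as np

def modulate_onto_laser(audio, bias=0.5, depth=0.4):
    """Amplitude-modulate an audio waveform onto laser intensity.

    `bias` is the laser's steady operating point and `depth` the
    modulation depth (both illustrative values). The returned array
    is the normalised brightness the laser would be driven with.
    """
    audio = np.asarray(audio, dtype=float)
    peak = np.max(np.abs(audio))
    if peak > 0:
        audio = audio / peak          # normalise command audio to [-1, 1]
    intensity = bias + depth * audio  # brightness tracks the waveform
    return np.clip(intensity, 0.0, 1.0)

# A 1 kHz tone sampled at 16 kHz stands in for a recorded voice command.
t = np.linspace(0.0, 0.01, 160, endpoint=False)
command = np.sin(2 * np.pi * 1000 * t)
laser = modulate_onto_laser(command)
```

The microphone’s diaphragm then reacts to the intensity swings around the bias point just as it would to the original spoken command, which is why no audible sound is needed.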