Your Face Isn’t a Password: Fuzzy Commitments Fall Short for Deep‑Learning Biometrics


New research shows that a popular way to “lock” face-recognition data—called fuzzy commitments—doesn’t keep it truly safe when used with modern deep‑learning systems.

Why? The face templates these systems produce don’t have enough randomness (entropy). That lets attackers reverse a protected template back into a realistic face image.
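To get a feel for why entropy matters, here is a rough back-of-the-envelope sketch. The bit counts are illustrative assumptions, not measurements from the paper: the point is only that brute-force effort scales as 2^entropy, so a template with far less usable randomness than a real cryptographic key is far cheaper to guess.

```python
def expected_guesses(entropy_bits: float) -> float:
    """Average brute-force attempts to hit a secret with the given entropy."""
    return 2 ** entropy_bits / 2

# Illustrative values (assumptions for this sketch, not from the paper):
# a proper 128-bit key vs. a face template whose *effective* entropy is
# much lower, because embeddings of real faces cluster tightly.
key_bits = 128
template_bits = 32  # hypothetical effective entropy of a face template

print(f"128-bit key:   ~{expected_guesses(key_bits):.2e} guesses")
print(f"face template: ~{expected_guesses(template_bits):.2e} guesses")
```

The second number is within reach of commodity hardware, which is the gap the paper exploits.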

How bad is it? From a protected template alone, the authors reconstruct faces that often pass as the real user: in the simplest setup, 78% of reconstructions unlock accounts even when the system is tuned to a 0.1% false‑accept rate. Across different recognition systems, the reconstructed faces are still 50–120× more likely to be accepted than chance.
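To put the "50–120× more likely than chance" figure in context, a quick calculation, reading "chance" as the system's false-accept rate for a random impostor (an interpretation assumed here for illustration):

```python
far = 0.001  # 0.1% false-accept rate: a random face is accepted 1 in 1000 times
lift_low, lift_high = 50, 120  # reported acceptance lift for reconstructed faces

print(f"random impostor accepted:     {far:.1%}")
print(f"reconstructed face accepted:  {lift_low * far:.1%} to {lift_high * far:.1%}")
```

Even at the low end, that turns a 1-in-1000 fluke into a 1-in-20 shot per attempt.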

What this means: biometric “encryption” isn’t the same as password security. If your template leaks, you can’t change your face.

What to do now: prefer systems with cancelable/high‑entropy templates, combine biometrics with passkeys or tokens, limit template reuse, and keep data local when possible.

Paper: http://arxiv.org/abs/2012.13293v1

Register: https://www.AiFeta.com

#AI #Security #Biometrics #FaceRecognition #Privacy #Cryptography
