The popular view of biometric security often evokes fingerprint readers, iris or retinal scans, and voice-activated systems. However, almost any unique human characteristic, whether the shape of one's ears, the whole face, the pattern of blood vessels in the back of the hand, walking gait, heart rhythm, or even how one types at a keyboard, might be used to provide a secure login signature. Some traits are easier to analyze than others, and some, such as fingerprints, can be spoofed.
Research published in the International Journal of Biometrics has taken an amusing trait to demonstrate how the way a person laughs might be used in biometrics.
Comfort Oluwaseyi Folorunso, Olumuyiwa Sunday Asaolu, and Oluwatoyin Popoola of the Systems Engineering Department at the University of Lagos in Akoka, Lagos, Nigeria, point out that people can recognize others by the unique character of their laughter, perhaps even more readily than by their speaking voice. Moreover, while many people are adept at impersonating the voices of others, mimicking someone's laugh is far more difficult. The team has now used statistical analyses of the various audible frequencies present in a person's laugh to create a digital signature for each laugh.
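The paper's exact feature set is not described here, but the general idea of summarizing a laugh's audible frequencies statistically can be sketched as follows. Everything in this snippet, from the band count to the use of log-energies and cosine similarity, is an illustrative assumption, not the authors' published algorithm; the synthetic sine-wave "laughs" merely stand in for real recordings.

```python
import numpy as np

def laugh_signature(samples, rate, n_bands=16):
    # Illustrative spectral signature: normalized log-energy per
    # frequency band. `samples` is a 1-D array of audio samples.
    spectrum = np.abs(np.fft.rfft(samples))
    bands = np.array_split(spectrum, n_bands)      # split spectrum into bands
    energy = np.array([np.log1p(np.mean(b**2)) for b in bands])
    return energy / np.linalg.norm(energy)         # unit-length signature

def similarity(sig_a, sig_b):
    # Cosine similarity between two unit signatures (1.0 = identical).
    return float(np.dot(sig_a, sig_b))

# Synthetic audio standing in for two speakers' laughs (hypothetical data).
rate = 16_000
t = np.arange(rate) / rate
laugh_a = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)
laugh_b = np.sin(2 * np.pi * 1500 * t) + 0.5 * np.sin(2 * np.pi * 3200 * t)

same = similarity(laugh_signature(laugh_a, rate), laugh_signature(laugh_a, rate))
diff = similarity(laugh_signature(laugh_a, rate), laugh_signature(laugh_b, rate))
```

Comparing a recording against itself yields a similarity near 1.0, while the two different synthetic laughs, whose energy falls in different bands, score much lower.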
Tests of the approach show their prototype recognition algorithm to be 90 percent accurate, which compares very favorably with the 65 percent accuracy of a conventional Gaussian model. Moreover, combining their algorithm with the Gaussian approach can boost overall accuracy by more than 5 percent.
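The article does not say how the two models are combined. One common way to merge two biometric matchers is weighted score-level fusion, sketched below; the weight value and the stand-in scores are made up for illustration.

```python
def fuse(score_custom, score_gaussian, weight=0.7):
    # Weighted average of two match scores in [0, 1]. `weight` favours the
    # stronger model and would normally be tuned on validation data; 0.7 is
    # an arbitrary illustrative choice.
    return weight * score_custom + (1 - weight) * score_gaussian

# Hypothetical per-sample match scores from the two models:
fused = fuse(0.90, 0.65)
```

The fused score lands between the two inputs, closer to the more heavily weighted model.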
“Laughter has thus been shown to be a viable biometric feature for person identification which can be embedded into artificial intelligence systems in diverse applications,” the team concludes.