Smile, you’re on camera! We all are nowadays: CCTV, phones and drones in every zone. We pay for it, but did you know you could be paid by it? I’m talking about a camera that can look at a person and determine how they are feeling, and do it correctly, ethically, and with human-level accuracy. Enter Ethical Facial Analysis.
Let’s clear a few things up first. Recognition is often a misused, all-encompassing term, so forget that and everything you know about the spy films that feature it. Facial recognition and detection are very different to analysis. To put it simply, 'Facial Detection' is spotting that a face is present within a frame. 'Facial Recognition' is matching a detected face to an identity in a database. The brilliant 'Facial Analysis', also known as attribution, is reading the emotion on a detected face, with no identity involved.
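For the technically curious, here’s a rough Python sketch of the difference. The detection step uses OpenCV’s standard bundled face detector, which is real; recognise and analyse are hypothetical placeholders standing in for whatever engine a given system uses, not any particular vendor’s tooling.

```python
# A minimal sketch of the three terms. Only the detection step is a real
# library call (OpenCV's bundled Haar cascade); the other two functions are
# placeholders that show what each stage would take in and give back.
import cv2

# Facial Detection: find where faces are in a frame (bounding boxes only).
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def detect_faces(frame):
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return detector.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)

def recognise(face_crop, database):
    """Facial Recognition: match a detected face to an identity in a database."""
    ...  # placeholder: would return a name or ID, which analysis never needs

def analyse(face_crop):
    """Facial Analysis: estimate emotion from a detected face, no identity involved."""
    ...  # placeholder: would return e.g. {"happy": 0.8, "neutral": 0.2}
```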
For the longest time, recognition and detection belonged solely to the security sector. Think airports and passport control, cameras constantly jotting down the who’s who from Malibu to Timbuktu. Once they know, they remember. Why? Because the face is the most effective biometric. It’s contactless, easy to operate and implement, cheaper than fingerprint or iris scanners, and with today’s technology, almost impossible to cheat. The face is a complex and unique surface, ever-changing and growing. The London police use it in their cameras to recognise criminals live against their database. Taylor Swift’s security team uses it to stop stalkers.
So if a camera can detect a change in a face, surely it can detect a change in emotion? Not quite. This is the issue we debated in episode four of our AI series. Just because you’re smiling, it doesn’t mean you’re happy; just because you’re frowning, it doesn’t mean you’re sad. It’s complicated. People can be happy, free, confused, and lonely at the same time. It’s complex, human. This is the limit of traditional facial recognition technology. But with new advancements in AI and deep learning models, the game has changed.
Besides swifter speeds, AI has enabled greater accuracy in sentiment analysis and far higher capacity: as many as 500 faces can be read at one time. The analysis models are pre-trained on huge face datasets and get smarter with every application. This is what separates cutting-edge tech from a Snapchat filter. Oh, and it’s totally ethical.
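To make that concrete, here’s a hedged sketch of what per-frame analysis might look like: every face in shot is detected, each crop goes through a pre-trained emotion classifier, and the results are tallied. The classify_emotion function and the label set are assumptions, stand-ins for whichever deep learning model a real system ships with.

```python
# Sketch: tally the emotion of every face in a single frame.
from collections import Counter

import cv2

# Assumed label set; real models vary.
EMOTIONS = ["happy", "sad", "surprised", "angry", "neutral"]

def classify_emotion(face_crop):
    """Hypothetical pre-trained model call; would return one label from EMOTIONS."""
    ...

def analyse_frame(frame, detector):
    # `detector` is an OpenCV CascadeClassifier, as in the earlier sketch.
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    boxes = detector.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
    tally = Counter()
    for (x, y, w, h) in boxes:  # every face in shot, not just one
        tally[classify_emotion(frame[y:y + h, x:x + w])] += 1
    return tally  # e.g. Counter({"happy": 312, "neutral": 54})
```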
Ethical, how? Because no footage is recorded and no face is saved. Faces are read live as measurements, not images: effectively hashed data. Even with access to the measurements, the data could never be attributed to an individual. Nothing is traceable, recognisable, or personal. It’s ethically compliant.
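If it helps to see that claim spelled out, here’s a sketch under the same assumptions as above: the frame and the face crops only ever exist in memory while the measurements are taken, and the only thing that leaves the camera is an anonymous aggregate that can’t be tied to anyone.

```python
# Illustrative only: turn one frame into anonymous, aggregate measurements.
import json
import time

def publish(frame, detector):
    tally = analyse_frame(frame, detector)  # measurements only, see the sketch above
    del frame                               # illustrative: no footage kept, no face saved
    return json.dumps({
        "timestamp": time.time(),
        "faces_in_shot": sum(tally.values()),
        "emotions": dict(tally),            # counts only; nothing here identifies anyone
    })
```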
So, you’ve got the lowdown on all things AI and facial analysis. If you’re interested in finding out how we can use our AI capabilities for your next event or internal comms project, email anythings.possible@drpgroup.com and we will be in touch.