Most of us take it for granted that we can read another person’s emotions through subtleties such as body language, yet this is a real struggle for many people. Enter emotion AI. Researchers at Stanford University modified Google’s augmented reality glasses to read emotions in others and notify the wearer. The glasses detect someone’s mood through their eye contact, facial expressions and body language, and then tell the wearer what emotions they’re picking up. “Emotion AI taps into the individual,” explains Zabeth Venter, CEO and co-founder of Averly. “If you think about facial recognition, which is a kind of emotion AI, I can pick up whether you like what I’m saying by whether your smile is a smirk or a real, genuine smile.” Such nuances go deeper. Another example is polling: what is your favouri...
As the global need for inclusive biometric security in facial identification grows, developers must not overlook the need for an unbiased solution. Teki Akuetteh Falconer, from the African Digital Rights’ Hub, says racial profiling is a major concern in facial identification software. This presents a challenge for Africa: if an individual’s identity cannot be reliably processed by both the public and private sectors, their ability to function in a digital society is hampered. The World Bank and the United Nations anticipate that by 2030 all Africans will have some form of digital identity, which will prove critical when it comes to accessing essential services such as housing, schooling, healthcare and banking, to name a few. Gur Geva, Founder and CEO of iiDE...