The tech world is demanding new laws to limit the scope of emotion-detecting technology, which claims to read emotions through micro-expressions and voice tone. Big software companies are using this technology to build emotion-detection products, but there is a catch, as highlighted by the U.S. research center AI Now.
Expressions Are Not Fixed; They Keep Changing
While many people welcome emotion detection as a new trend in technology, the U.S. research center AI Now has called for restricting its scope. The AI Now Institute says that emotion-detection technology is "built on markedly shaky foundations."
According to the researchers, such technology should not be used to make important decisions that can turn out unfavorably for people, affecting their lives and the opportunities they gain or lose as a result.
In its annual report, AI Now explains the problem: emotion-detecting technology claims to read the micro-expressions on our faces, the way we talk, and the pitch of our voice, and to infer a person's emotion from these observations. But expressions are ever-changing, not stable, so decisions based on them can be wrong.
In the words of Prof Kate Crawford, co-founder of AI Now, “It’s being used everywhere, from how do you hire the perfect employee through to assessing patient pain, through to track which students seem to be paying attention in class. At the same time as these technologies are being rolled out, large numbers of studies are showing that there is… no substantial evidence that people have this consistent relationship between the emotion that you are feeling and the way that your face looks.”
Further explaining the issue, Prof Crawford added that some of the firms developing such software base it on research from the 1960s by Paul Ekman, which claimed that the human face expresses only six basic emotions. Later research has found far more variation, which emotion-detection technology completely ignores.
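To see why this matters, here is a minimal, hypothetical sketch of the kind of rigid mapping the researchers criticize. The feature names, thresholds, and rules below are invented for illustration and are not taken from any real product; the point is only that a classifier built on Ekman's six fixed labels must always return exactly one of them, with no room for context or ambiguity.

```python
# Hypothetical sketch of an Ekman-style emotion classifier.
# Feature names and rules are invented for illustration only;
# real commercial systems are proprietary and far more complex.

EKMAN_SIX = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def classify(features: dict) -> str:
    """Map crude facial/voice features to a single fixed label.

    The flaw the researchers highlight: the same signal can mean
    different things in different contexts, but this mapping cannot
    express that ambiguity -- it always returns exactly one label.
    """
    if features.get("brow_lowered") and features.get("voice_pitch", 0) > 0.7:
        return "anger"
    if features.get("mouth_corners_up"):
        return "happiness"  # but a smile can be mere politeness or stress
    if features.get("eyes_widened"):
        return "surprise"   # widened eyes can also signal fear or attention
    return "sadness"        # everything else collapses into a default bucket

# A polite smile during a stressful interview is still labeled "happiness".
print(classify({"mouth_corners_up": True}))
```

However sophisticated the feature extraction becomes, the output space stays locked to a handful of labels, which is exactly the assumption the later research called into question.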
What’s The Context?
AI Now also lifts the curtain on the fact that many companies are already selling software based on this technology. Oxygen Forensics, for instance, has sold such software to police, giving investigators the ability to detect anger, stress, or anxiety on suspects' faces. Another example is HireVue, which sold software that companies use to shortlist candidates for interview.
Before counting on emotion-detection technology, the industry should accumulate more evidence that the software detects the right emotions, and does so consistently and effectively, in all situations. Otherwise, it can adversely affect human lives through wrong decisions based on ambiguous technology. Ultimately, to make the world a better place to live, such technologies should be used responsibly.
For this, the researchers agree that one needs to understand the context in which emotional expressions are being detected. The same expression can carry a different meaning in different situations. So context is the key to making decisions based on the expressions detected by emotion-detection technology.