According to a report in the magazine Nature, although the pandemic has caused people and authorities to shift their focus to fighting the coronavirus, some technology companies have tried to use the situation as an excuse to introduce “unverified” artificial intelligence (AI) tools into workplaces and schools. Amid heated debate about the potential for abuse of these technologies, several emotion-reading tools are being sold for the remote monitoring of children and workers, claiming to predict their emotions and performance. These tools capture emotions in real time and are marketed as helping organizations and schools better understand their employees and students.

For example, one of these tools decodes facial expressions and sorts them into categories such as happiness, sadness, anger, disgust, surprise, and fear.

The program, called 4 Little Trees and developed in Hong Kong, claims to be able to assess children’s emotions during class. Kate Crawford, an academic researcher and author of Atlas of AI, wrote in Nature that such technologies need to be regulated for sounder policy development and greater public trust.
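
As a rough illustration of the classification step such tools imply, here is a purely hypothetical sketch in Python. It is not the actual implementation of 4 Little Trees or any other product; the function name, the score values, and the assumption that an upstream face-analysis model produces per-category scores are all invented for illustration.

```python
# Hypothetical sketch: map per-category scores from some upstream
# face-analysis model onto one of the six emotion labels named in
# the article. Not the actual code of any real product.

EMOTIONS = ["happiness", "sadness", "anger", "disgust", "surprise", "fear"]

def classify_emotion(scores: dict[str, float]) -> str:
    """Return the emotion label with the highest model score."""
    return max(EMOTIONS, key=lambda emotion: scores.get(emotion, 0.0))

# Example: a video frame where the (hypothetical) model leans "surprise".
frame_scores = {
    "happiness": 0.10, "sadness": 0.05, "anger": 0.02,
    "disgust": 0.01, "surprise": 0.70, "fear": 0.12,
}
print(classify_emotion(frame_scores))  # -> "surprise"
```

Note that this sketch deliberately omits the face-detection and feature-extraction stages; it is in those upstream stages that the questions of bias and reliability discussed below actually arise.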

The polygraph (often referred to as a “lie detector”), invented in the 1920s, offers a cautionary precedent for AI: the US investigative agency, the FBI, and the US military used the method for decades until its use was eventually banned.

Before AI is used for indiscriminate surveillance of the general public, reliable oversight should be established. Crawford wrote: “This may also help establish norms in response to excessive corporate and government intervention.”

The article also cited a tool developed by psychologist Paul Ekman that standardized six human emotions to fit computer vision. After the 9/11 attacks in 2001, Ekman sold his system to US authorities to identify airline passengers showing fear or stress, in order to investigate whether they were involved in terrorist acts. The system has been severely criticized for racial bias and a lack of reliability.

If these technologies are not independently reviewed, job seekers could be treated unfairly because their facial expressions do not match those of existing employees, and students could be flagged at school because a machine judges them to be angry. Crawford called for legislation to regulate these tools and guard against their unproven use.