The Information Commissioner’s Office (ICO) has warned organisations against using biometric data for emotion-analysis technology. These systems process data on behavioural and physical characteristics such as facial expressions and heartbeats.

The ICO is concerned about the risks of using emotion tech to make impactful decisions about people. The data privacy watchdog worries that organisations are making critical decisions about individuals without appreciating that there is no scientific evidence the technology works, the BBC reported.

“The inability of algorithms which are not sufficiently developed to detect emotional cues means there’s a risk of systemic bias, inaccuracy and even discrimination,” the ICO said in a statement.

Artificial intelligence (AI) is central to emotion tech, but it is only as good as the underlying data it is trained on.

“AI can contribute to perpetuate inherent biases, for example associating speech and facial expressions to personality traits,” Laura Petrone, principal analyst at GlobalData, tells Verdict. “The inaccuracy of AI models is still a thorny issue for businesses and regulators. In the future further evidence is likely to push for more regulation [like the UK’s new AI rulebook].”

“Humans aren’t able to make robust links between inner emotions and biometric markers; it is improbable AI could do this,” Caroline Carruthers, CEO and co-founder of data consultancy Carruthers and Jackson, tells Verdict.


Sceptics say emotion tech is still in its infancy and should be used cautiously.

“It can be very tempting for businesses to believe there’s a silver bullet which will let them use this technology to ‘peek behind the curtain’ to see, for example, what potential hires think,” Alistair Dent, chief strategy officer at AI company Profusion, tells Verdict.

“Not only does this approach severely undermine trust – it’s like asking someone to take a polygraph – it’s also incredibly difficult to interpret results.”

Dent says there is a place for emotion tech in assessing welfare or security, but that using it to monitor staff is risky, and companies should be mindful of the ethical ramifications.

The ICO plans to issue further guidance on the use of biometric technologies in spring 2023.

GlobalData is the parent company of Verdict and its sister publications.