Essex Police Pause Facial Recognition Use After Study Finds Racial Bias

Essex police have suspended the use of live facial recognition (LFR) technology after a study found the cameras were significantly more likely to identify black people than people from other ethnic groups.

💡 Why it matters

This incident highlights the critical need to address racial bias in AI-powered surveillance technologies used by law enforcement.

Key Points

  • Study found racial bias in the facial recognition technology used by Essex police
  • Black people were 'significantly more likely' to be identified by the AI-enabled systems
  • Essex police have paused the use of live facial recognition cameras as a result
  • At least 13 police forces in the UK have deployed facial recognition so far

Details

A study by academics found that the live facial recognition (LFR) technology used by Essex police exhibited significant racial bias: black people were 'significantly more likely' to be identified by the AI-enabled cameras than people from other ethnic groups. In response, Essex police have suspended the use of the technology pending further review. The pause was disclosed by the Information Commissioner's Office, which oversees how UK police forces use facial recognition. At least 13 police forces across the country have adopted facial recognition systems so far, raising concerns about privacy and algorithmic bias.
