
Artificial Intelligence Shows Signs of Racism


by Natalie Morris, Metro

Mike says the growth of AI can have much bigger, systemic ramifications for the lives of people of color in the UK. The implications of racist technology go far beyond who does and who doesn’t get to use hand soap.

AI is involved in decisions about where to deploy police officers and in predicting who is likely to commit crime or re-offend. He says that in the future we will increasingly see AI playing a part in things like hospital admissions, school exclusions and HR hiring processes.

Perpetuating racism in these areas has the potential to cause serious, long-lasting harm to minorities. Mike says it’s vital that more black and minority people enter this sector to diversify the pool of talent and help to eradicate the problematic biases.

‘If we don’t have a system that can see us and give us the same opportunities, the impact will be huge. If we don’t get involved in this industry, our long-term livelihoods will be impacted,’ explains Mike.

‘It’s no secret that within six years, pretty much 98% of human consumer transactions will go through machines. And if these machines don’t see us, minorities, then everything will be affected for us. Everything.’

An immediate concern for many campaigners, equality activists and academics is the roll-out of facial recognition as a policing power.

In February, the Metropolitan Police began operational use of facial recognition CCTV, with vans stationed outside a large shopping centre in east London, despite widespread criticism of the method.

A paper last year found that using artificial intelligence to fight crime could raise the risk of profiling bias. The research warned that algorithms might judge people from disadvantaged backgrounds as ‘a greater risk.’
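
The mechanism behind that warning can be sketched in a few lines. The toy example below is not from the paper; the area names and figures are invented. The point is that if arrest records reflect where officers patrolled rather than where offences actually happened, a model trained on those records simply learns the patrol pattern and reports it back as ‘risk’.

    # Toy illustration of profiling bias (invented figures, not real data).
    # Arrests pile up where police already patrol, so an algorithm trained
    # on arrest counts inherits that pattern as its notion of "risk".
    past_arrests = {"area A": 90, "area B": 10}    # heavily vs lightly patrolled
    true_offending = {"area A": 50, "area B": 50}  # assume identical real rates

    total_arrests = sum(past_arrests.values())
    total_offending = sum(true_offending.values())

    for area in past_arrests:
        predicted = past_arrests[area] / total_arrests
        actual = true_offending[area] / total_offending
        print(f"{area}: predicted risk {predicted:.0%}, "
              f"actual share of offending {actual:.0%}")

Both areas offend at the same rate here, yet the model rates area A nine times riskier; acting on that score sends more officers to area A, produces more arrests there, and hardens the bias on the next round of training.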

‘The Metropolitan Police is the largest police force outside of China to roll it out,’ explains Kimberly McIntosh, senior policy officer at the Runnymede Trust. ‘We all want to stay safe but giving the green light to letting dodgy tech turn our public spaces into surveillance zones should be treated cautiously.’

Kimberly points to research that shows that facial recognition software has trouble identifying the faces of women and black people.
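
Findings like the ones Kimberly cites come from auditing a system’s error rate separately for each demographic group rather than in aggregate. Here is a minimal sketch of that audit logic, with hypothetical group names and made-up results (none of these numbers come from the research mentioned above):

    from collections import defaultdict

    # Hypothetical evaluation records: (demographic group, was the match correct?)
    results = [
        ("lighter-skinned men", True), ("lighter-skinned men", True),
        ("lighter-skinned men", True), ("lighter-skinned men", False),
        ("darker-skinned women", True), ("darker-skinned women", False),
        ("darker-skinned women", False), ("darker-skinned women", False),
    ]

    totals, errors = defaultdict(int), defaultdict(int)
    for group, correct in results:
        totals[group] += 1
        if not correct:
            errors[group] += 1

    # Report the error rate per group; a single aggregate rate would hide the gap.
    for group in totals:
        rate = errors[group] / totals[group]
        print(f"{group}: error rate {rate:.0%} ({errors[group]}/{totals[group]})")

An aggregate accuracy figure averages the groups together and can look respectable even when one group fails far more often; splitting the evaluation by group is what exposes the disparity.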
Read more: https://metro.co.uk/2020/04/01/race-problem-artificial-intelligence-machines-learning-racist-12478025/?ito=cbshare

Twitter: https://twitter.com/MetroUK | Facebook: https://www.facebook.com/MetroUK/


© 2020 Associated Newspapers Limited
