Microsoft rejected sales of facial recognition tech over human rights worries

Microsoft Logo at Ignite (Image credit: Windows Central)

Microsoft President Brad Smith revealed this week that the company recently turned down a California law enforcement agency's bid to equip officers with its facial recognition tech. Speaking at a Stanford University AI conference, Smith said Microsoft's primary concern was the potential impact on human rights, Reuters reports.

According to Reuters, Microsoft concluded that equipping officers' body cameras and cars with the technology would lead to minorities and women being "disproportionately held for questioning." That concern stems from a known flaw in the AI behind facial recognition: the systems are trained disproportionately on images of white and male subjects, which makes them less accurate for everyone else.

"Anytime they pulled anyone over, they wanted to run a face scan," Smith said at the conference. While he didn't name the agency involved, Smith said Microsoft decided against providing the technology after weighing its impact.

Smith said that Microsoft also declined to provide facial recognition technology for cameras in the capital city of an unnamed country, Reuters reports. However, the company did allow the tech to be used in a prison after deciding its use would be sufficiently limited and would improve safety.

The human rights implications of facial recognition are an issue Microsoft has raised before, urging governments to regulate the technology. "Facial recognition technology raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression," Smith has previously said on the subject.

Dan Thorp-Lancaster

Dan Thorp-Lancaster is the former Editor-in-Chief of Windows Central. He began working with Windows Central, Android Central, and iMore as a news writer in 2014 and is obsessed with tech of all sorts. You can follow Dan on Twitter @DthorpL and Instagram @heyitsdtl.