Microsoft Corp recently rejected a California law enforcement agency’s request to install facial recognition technology in officers’ cars and body cameras due to human rights concerns, company President Brad Smith said on Tuesday. Microsoft concluded the technology would lead to innocent women and minorities being disproportionately held for questioning, because the artificial intelligence had been trained mostly on pictures of white men.


Facial recognition AI misidentifies women and minorities more often than white men, multiple research projects have found. “Anytime they pulled anyone over, they wanted to run a face scan” against a database of suspects, Smith said, without naming the agency. After thinking through the uneven impact, “we said this technology is not your answer.”

Speaking at a Stanford University conference on “human-centered artificial intelligence,” Smith said Microsoft had also declined a deal to install facial recognition on cameras blanketing the capital city of an unnamed country that the nonprofit Freedom House had deemed not free. Smith said it would have suppressed freedom of assembly there.


Microsoft said in December it would be open about shortcomings in its facial recognition technology and asked customers to be transparent about how they intended to use it, while stopping short of ruling out sales to police. Smith has called for greater regulation of facial recognition and other uses of artificial intelligence, and he warned on Tuesday that without such regulation, companies amassing the most data might win the contest to develop the best AI in a “race to the bottom.”

He shared the stage with the United Nations High Commissioner for Human Rights, Michelle Bachelet, who urged tech companies to refrain from building new tools without weighing their impact. “Please embody the human rights approach when you are developing technology,” said Bachelet, a former president of Chile.