Published: Feb. 15, 2021

Banner image: Morgan Klaus Scheuerman, an Information Science PhD student, has been awarded the Microsoft Research Fellowship for 2021. He studies how and why facial recognition technologies get it wrong. Credit: Casey Cass/CU Boulder

Growing up in a traditional blue-collar family in one of the most conservative counties in Maryland, Morgan Klaus Scheuerman knew early on what it’s like to feel marginalized.

He opted to wait until his mid-20s to come out. High school, he recalls, was filled with dark days.

“There were things said, without knowing I was queer, that were really upsetting. I felt pretty hopeless at times.”

Four-year college seemed out of reach, as no one in his family had gone before and money was tight. So, after graduation, Scheuerman took a customer service job at the local Best Buy, started saving, and tried to imagine a future beyond a hometown that many peers never left.

“I honestly thought that would be me,” he says.

Instead, Scheuerman, now 29, is among the most coveted young minds in the field of social computing. With stints at Google and Facebook already under his belt, and his facial analysis research earning international accolades, he was just awarded Microsoft’s prestigious 2021 Research Fellowship. That includes two paid years to finish his PhD in Information Science at CU Boulder and a chance to collaborate with Microsoft researchers.


Information Science student Morgan Klaus Scheuerman | Credit: Casey Cass/CU Boulder

“We have labor laws and advertising laws and housing laws against racial and gender discrimination, but there are no laws specific to embedding discrimination into algorithms.”
–Morgan Klaus Scheuerman

His work, as he puts it, seeks one fundamental goal: to show tech companies marginalized people matter.

To do so, he studies, literally, how computers see us, focusing on the facial analysis software ubiquitous in everything from cell phones and computers to surveillance cameras at airports and malls. Already, his work and that of others have found that such platforms frequently misidentify those who are not white, male and cisgender. He wants to understand why.

Where in the making of such products do things go wrong? Can they be improved? And should, he dares to ask, some technologies not be made at all?

“We have labor laws and advertising laws and housing laws against racial and gender discrimination, but there are no laws specific to embedding discrimination into algorithms,” Scheuerman says. “The only way we discover such discrimination is when it happens to us.”

When the security camera gets it wrong

In January 2020, Detroit police pulled into the driveway of a Black man named Robert Williams, handcuffed him in front of his two young daughters and hauled him to jail. Police determined later that a facial recognition service had incorrectly matched his driver’s license photo to a still image from a security video of a shoplifting incident.

When police asked him if the video image was him, Williams responded: “No. You think all Black men look alike?”

All charges were dropped. But according to press reports, at least three Black men are known to have been wrongly arrested in the United States based on glitches in facial recognition software, and countless others have been profiled or harassed.

“The fear that a lot of people have had over this technology is already being realized,” says Scheuerman. While facial recognition software can be remarkably accurate when assessing the gender of white men, it misidentifies women of color one-third of the time, recent research shows.

In one particularly egregious case of mistaken identity, Google’s photo-categorization software began labeling Black people as “gorillas” in 2015. Google promptly apologized and took steps to fix the problem.

Scheuerman’s own research has shown that, although the systems are adept at identifying cisgender women (those assigned female at birth and identifying as such) or cisgender men, they falter when faced with people who don’t neatly fit those binary categories.

“While there are many different types of people out there, these systems have an extremely limited view of what gender looks like,” says Scheuerman, who identifies as male.

Notably, when he submitted his own picture, with his long, blue-tinted hair framing his high cheekbones, to several facial recognition platforms, half got his gender wrong. Such mistakes could potentially lead to real harm, he warns. A matchmaking app could set someone up on a date with the wrong gender. A mismatch between what a facial recognition program sees and the documentation a person carries could prevent someone from clearing airport security.

Facial analysis software could also be used as a tool for discrimination. For instance, the Chinese government has reportedly been using facial analysis to identify and target members of the Uighur ethnic minority group.

Then, there are more subtle effects, Scheuerman notes.

“These systems run the risk of reinforcing stereotypes of what you should look like if you want to be recognized as a man or a woman, and that impacts everyone.”

Diversifying the pipeline

So, how does an artificial intelligence platform come to recognize one person as a Black woman while pegging another as a white man? Why do they get it wrong? “It is a very human problem,” Scheuerman explains.

Through online crowdsourcing services like Amazon Mechanical Turk, tech companies often recruit workers to sort photo after photo into categories, including Black or white, male or female.

That data is then used to train computer algorithms that in turn teach our smartphones.

But each of those human annotators comes with his or her own cultural biases, and sometimes the so-called “training data” itself lacks diversity, his research suggests.

Elsewhere in the process, computer scientists and engineers make other decisions, considering what technology should be developed, how it could be used, and who might be hurt by it.


Facial recognition software tends to work better in identifying white cisgender individuals, research shows. Illustration: Morgan Klaus Scheuerman

Often, marginalized communities are left out of those discussions.

“I want to know when and where decisions are made in the pipeline and how researchers, practitioners and policymakers can intervene to shape a more equitable future,” he says.

‘Queer is your superpower’

Scheuerman’s advisor Jed Brubaker, an assistant professor in the Department of Information Science who recruited Scheuerman to join his Identity Lab, says that people from marginalized communities tend to bring wholly unique perspectives to the field of social computing.

“In a way, queerness is your superpower,” Brubaker said. “Having been on the outside lets you see things from multiple angles and consider things that are invisible to most people.”

Scheuerman’s unique blend of technical knowledge and social science expertise has also served him well, Brubaker says.

“It is a rare student who can bridge both.”

After Best Buy, Scheuerman worked as a barista to help pay his way through college. He credits the early classes he took in gender studies at Goucher College in Baltimore for emboldening him to come out to his family. They surprised him with their warm acceptance.

His master’s work at the University of Maryland, studying how different computer systems impact teens, the visually impaired and racial minorities, led him in 2018 to CU Boulder, home to one of the earliest information science departments in the country.

Only 10 students nationwide, out of more than 1,000 applicants, were awarded the Microsoft Fellowship.

“This entire area just felt so inaccessible to me growing up,” he says. “I just feel so thankful.”

Scheuerman stresses that his intent is not to do away with technology. He uses facial analysis software every day when he logs into his phone or tags people on social media. But he would like to play some small role in bringing a new mindset to the field.

“I would like to see companies begin to prioritize people over technological innovation,” he says. “I don’t want the systems we design to hurt anyone, whether it’s intentional or not.”

This article is featured in the Spring 2021 digital issue of CMCI Now magazine.