Facial recognition software has a gender problem

Futurizonte Editor’s Note: We just published a story about AI being biased against Latinos. Now, another story of AI being biased against women. Do you see the trend or do you need more examples?

Screen capture of CU Boulder video. See link below to watch the full video.

Original author and publication date: Lisa Marshall – Oct. 8, 2019

With a brief glance at a single face, emerging facial recognition software can now categorize the gender of many men and women with remarkable accuracy. But if that face belongs to a transgender person, such systems get it wrong more than one third of the time, according to new CU Boulder research.

“We found that facial analysis services performed consistently worse on transgender individuals, and were universally unable to classify non-binary genders,” said lead author Morgan Klaus Scheuerman, a PhD student in the Information Science department. “While there are many different types of people out there, these systems have an extremely limited view of what gender looks like.”

The study comes at a time when facial analysis technologies—which use hidden cameras to assess and characterize certain features about an individual—are becoming increasingly prevalent, embedded in everything from smartphone dating apps and digital kiosks at malls to airport security and law enforcement surveillance systems.

Previous research suggests they tend to be most accurate when assessing the gender of white men, but misidentify women of color as much as one-third of the time.

“We knew there were inherent biases in these systems around race and ethnicity and we suspected there would also be problems around gender,” said senior author Jed Brubaker, an assistant professor of Information Science. “We set out to test this in the real world.”

For some gender identities, accuracy is impossible
Researchers collected 2,450 images of faces from Instagram, each labeled by its owner with a hashtag indicating the owner’s gender identity. The pictures were then divided into seven groups of 350 images (#women, #man, #transwoman, #transman, #agender, #agenderqueer, #nonbinary) and analyzed by four of the largest providers of facial analysis services (IBM, Amazon, Microsoft and Clarifai).

“As our vision and our cultural understanding of what gender is has evolved, the algorithms driving our technological future have not. That’s deeply problematic.”

– Jed Brubaker

Notably, Google was not included because it does not offer gender recognition services.
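The evaluation the researchers describe reduces to simple bookkeeping: send each labeled image to a service and tally, group by group, how often the returned label matches the owner’s hashtag. The sketch below illustrates that tallying step in Python; classify_gender and the toy dataset are hypothetical stand-ins for illustration, not any vendor’s actual API.

# Minimal sketch of per-group accuracy scoring, assuming a hypothetical
# classify_gender() in place of a real call to IBM, Amazon, Microsoft,
# or Clarifai. Here it is mocked to always return a binary label, which
# is all such services output.

def classify_gender(image_path: str) -> str:
    # Placeholder: a real implementation would send the image to the
    # service and return its predicted label ("male" or "female").
    return "female"

def per_group_accuracy(dataset: dict[str, list[tuple[str, str]]]) -> dict[str, float]:
    """Map each hashtag group to the fraction of its (image_path,
    expected_label) pairs that the classifier labels correctly."""
    accuracy = {}
    for group, samples in dataset.items():
        correct = sum(1 for path, expected in samples
                      if classify_gender(path) == expected)
        accuracy[group] = correct / len(samples) if samples else 0.0
    return accuracy

if __name__ == "__main__":
    # Toy stand-in for the study's 350 images per hashtag group.
    toy = {
        "#women":     [("a.jpg", "female"), ("b.jpg", "female")],
        "#transman":  [("c.jpg", "male")],
        "#nonbinary": [("d.jpg", "nonbinary")],  # can never match a binary output
    }
    print(per_group_accuracy(toy))

Because the output vocabulary is limited to two labels, any group whose expected label falls outside that pair scores 0% no matter what the model does, which is the structural limitation the results below describe.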

On average, the systems were most accurate with photos of cisgender women (those born female and identifying as female), getting their gender right 98.3% of the time. They categorized cisgender men accurately 97.6% of the time.

But trans men were wrongly identified as women up to 38% of the time. And those who identified as agender, genderqueer or nonbinary, indicating that they identify as neither male nor female, were mischaracterized 100% of the time.

“These systems don’t know any other language but male or female, so for many gender identities it is not possible for them to be correct,” said Brubaker.

READ the complete article here.