A study conducted by the University of Colorado Boulder has revealed just how bad AIs are at recognising non-cisgender people.
The worrying problems AIs have with recognising racial minorities are increasingly well-documented, but this new study is among the first to evaluate how these systems classify gender.
AI systems categorise people based on what they can “see” and often rely on stereotypical parameters (e.g. that males don’t have long hair and that females don’t have facial hair).
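To make that failure mode concrete, here is a deliberately simplistic sketch of the kind of stereotyped rules such a system can effectively encode. The feature names are hypothetical, and real services use learned models rather than explicit rules, but a model trained on stereotyped data can behave in much the same way:

```python
# A deliberately simplistic illustration of stereotyped gender rules.
# The feature names are hypothetical; real services use learned models,
# but training on stereotyped data can produce similar behaviour.

def classify_gender(features: dict) -> str:
    """Toy classifier built on stereotypical visual cues."""
    if features.get("facial_hair"):   # stereotype: facial hair implies male
        return "male"
    if features.get("long_hair"):     # stereotype: long hair implies female
        return "female"
    return "male"                     # arbitrary fallback

# A man with long hair is immediately misclassified:
print(classify_gender({"long_hair": True, "facial_hair": False}))  # -> "female"
```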
There are a huge number of gender categories: Facebook, for example, offers its users around 71 options. It’s perhaps too much to expect AI ever to categorise everyone exactly, but the researchers found concerning miscategorisations which have the potential to cause serious distress.
Morgan Klaus Scheuerman, a researcher who worked on the study, identifies as male. In the image below, Microsoft’s AI (left) correctly identifies him as male, but IBM’s (right) identifies him as female:
Scheuerman says his photos were misclassified about half the time in testing, despite him identifying as a “cisnormative” gender.
As long as the categorisation is approximately correct, it’s unlikely to cause too much distress. However, imagine someone who has spent years feeling they were the wrong gender, has potentially faced bullying and harassment during their transition, has perhaps even undergone surgeries and/or hormone treatments, and is then categorised by an AI as the gender they were assigned at birth.
Scheuerman and his team tested 10 existing facial analysis and image labelling services. As you can see, most of the services currently stick to attempting to identify only the cisnormative genders.
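For context, services like these are generally queried by uploading an image and reading predicted attributes back from a JSON response. The sketch below shows that general pattern only; the endpoint, key, and response schema are hypothetical placeholders, not any specific vendor’s real API:

```python
# Minimal sketch of querying a cloud facial-analysis service.
# The endpoint, key, and response schema are hypothetical placeholders,
# not any specific vendor's real API.
import requests

def detect_gender(image_path: str) -> str:
    with open(image_path, "rb") as f:
        resp = requests.post(
            "https://api.example-vision.com/v1/analyze",   # hypothetical endpoint
            headers={"Authorization": "Bearer YOUR_API_KEY"},
            files={"image": f},
            data={"attributes": "gender"},
        )
    resp.raise_for_status()
    return resp.json()["faces"][0]["gender"]   # hypothetical response schema
```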
On average, the facial analysis systems performed best on images of cisgender women and worst on images of transgender men.
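A simple way such per-group results can be tallied is shown below; this is an illustrative sketch with made-up numbers, not the study’s actual code or data:

```python
# Sketch of computing per-group accuracy for a gender classifier.
# The numbers below are made up for illustration only.
from collections import defaultdict

def accuracy_by_group(results):
    """results: list of (group, was_correct) pairs, one per labelled image."""
    totals = defaultdict(lambda: [0, 0])   # group -> [correct, total]
    for group, was_correct in results:
        totals[group][0] += was_correct
        totals[group][1] += 1
    return {g: c / t for g, (c, t) in totals.items()}

results = [("cisgender woman", True), ("transgender man", False),
           ("cisgender woman", True), ("transgender man", True)]
print(accuracy_by_group(results))
# -> {'cisgender woman': 1.0, 'transgender man': 0.5}
```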
Here are the results for each service when categorising the genders:
Fashion trends evolve over time, and hairstyles in particular go through many phases. In some eras men have opted for long hair (think of ’70s/’80s bands like Whitesnake, Guns N’ Roses, and Aerosmith), while today successful female models like Ruth Bell rock buzzcuts traditionally associated with men.
In a decade or so, it might even be popular for males to have a shaved face and for females to have beards. AIs trained on today’s images would struggle to adapt to and categorise such changes, which poses yet another issue.
At the moment there are a lot of problems and very few solutions, but solutions are badly needed. LGBT communities are already at higher risk of experiencing poor mental health due to factors like societal discrimination and inequality. Automating those problems will have devastating consequences.