Understanding Model Bias

Learn how models can become biased, using historical examples of bias in algorithms.

By far the most dangerous, and the most widely publicized, type of bias is model bias. It is even subtler than data bias and has tremendous consequences if not handled properly during development. This is because model bias is the exacerbation of data bias, but inside a model that most people assume to be objective.

Historical examples

Model bias is far easier to understand through examples, so in this lesson, we’ll cover some of the most notorious mishaps faced by big tech companies who didn’t control for this type of bias in their pipelines.

Google Photos

Google Photos is one of the most widely used photo-storage applications and includes features like auto-tagging and image recognition. In 2015, it was reported that the Google Photos image-recognition algorithm labeled Black people as gorillas, a particularly offensive label because it echoes centuries of racist tropes. This is learned racism, and Google, along with other big tech companies, has never fully figured out how to prevent these types of incidents. This one was most likely caused by data bias: the underrepresentation of Black individuals in the training data. Appallingly, Google's fix was to stop tagging primates altogether, so Google Photos no longer labels pictures of gorillas as gorillas. For a company like Google, one of the forerunners in AI, this was a surprising display of a lack of clarity around how to fix model bias.
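One practical lesson from this incident is to check group representation in the training data before training at all. The sketch below is illustrative only: the group names, counts, and the 10% threshold are hypothetical choices, not values from any real pipeline.

```python
from collections import Counter

def representation_report(group_labels, threshold=0.10):
    """Flag demographic groups whose share of the training data
    falls below a minimum threshold (the threshold is illustrative)."""
    counts = Counter(group_labels)
    total = sum(counts.values())
    report = {}
    for group, count in counts.items():
        share = count / total
        report[group] = {
            "share": round(share, 3),
            "underrepresented": share < threshold,
        }
    return report

# Hypothetical training-set annotations: one heavily dominant group
labels = ["group_a"] * 850 + ["group_b"] * 120 + ["group_c"] * 30
print(representation_report(labels))
```

A report like this does not fix bias by itself, but it surfaces the kind of skew that can lead a model to perform far worse on the groups it rarely saw during training.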

The popular photo website Flickr also suffered similar bias concerns.

Twitter cropping

In 2020, users discovered that Twitter's image-cropping algorithm tended to crop out people with darker skin. When a user uploaded a picture containing both a white individual and a darker-skinned individual, the automatic crop would often cut the darker-skinned person out. While image cropping isn't necessarily a high-stakes application, the incident shows that Twitter had not adequately tested its algorithm (despite claims that it had) across diverse subgroups.
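Testing across diverse subgroups can be as simple as measuring the same outcome rate per group and comparing the gap. The sketch below assumes a hypothetical audit set where each record notes whether the subject was retained by the crop; the group names and numbers are invented for illustration.

```python
def subgroup_rates(outcomes):
    """Rate of a favorable outcome (here: being kept in the
    auto-crop) for each demographic subgroup."""
    return {group: sum(results) / len(results)
            for group, results in outcomes.items()}

def max_disparity(rates):
    """Largest gap in favorable-outcome rate between any two groups."""
    values = list(rates.values())
    return max(values) - min(values)

# Hypothetical audit: 1 = subject retained by the crop, 0 = cropped out
audit = {
    "lighter_skin": [1, 1, 1, 1, 0, 1, 1, 1],
    "darker_skin":  [1, 0, 0, 1, 0, 1, 0, 1],
}
rates = subgroup_rates(audit)
print(rates, max_disparity(rates))
```

A large disparity on an audit like this is the signal Twitter's pre-release testing should have caught: the algorithm behaves differently depending on which subgroup the subject belongs to.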
