
Cameras Couldn’t Possibly Be Racist, Right?

All tech can inadvertently contain the biases of its creators. And when it comes to cameras, dark-skinned photo subjects bear the brunt of this, often in surprising and disturbing ways. Mozilla Foundation’s Xavier Harding provides a primer on understanding and fighting filmic bias.


All tech is not created equal. As well-intentioned as developers, engineers, and inventors may be, the biases they carry tend to find a way to seep into the things they create. That includes search engines, virtual assistants, and algorithms that determine who can take out a bank loan. It even affects “older” technologies like cameras.

These days, we’re far past the time when anyone would question why you’d put a camera on a phone; they’re a ubiquitous and essential part of smartphones and laptops. Cameras are simple on the surface. They work roughly the way our eyes do: the iris, like a camera’s aperture, lets in light, which the retina registers the way film (or, nowadays, a digital sensor) does. Here’s the problem: the idea that someone’s eyes would only let them see people with lighter skin tones is utterly ridiculous. When it comes to cameras, that isn’t always too far off.

Ultimately, cameras are technology, so it tracks that they carry their own biases. This was true even back in the 1950s, when cameras were calibrated in a way that prioritized light skin tones. Even in the pre-internet era, the ripple effects of that were massive: cameras did a better job of photographing white people than Black people, which affected who actually looked good on TV, on a magazine cover, or even in a family photo. Fast-forward to the internet era, where facial recognition algorithms can be more likely to mistake one Black person for another. You can imagine what flubs like that could mean when such software is used by law enforcement.

Seeing is understanding; we say “I see” when we finally comprehend something. What message does it send when people create a camera and that camera can’t see you? What does it mean when a piece of glass is connected to a supercomputer and the entire internet and it still can’t discern your selfie at the bar? This reading list is mostly a viewing list, in hopes that you’ll see what I mean. —Xavier Harding

How ‘Insecure’ Lights Its Actors So Well

Xavier Harding
Mic

Xavier Harding: “Back in 2017, I wrote this article because I was consistently impressed by how freaking amazing the actors on Insecure looked. Cinematographers like Ava Berkofsky have to put in effort to compensate for the shortcomings of cameras. Insecure is a testament to the fact that dark-skinned people can look beautiful on film, though it still takes extra effort. The question is, should it?”

Amaya’s Flashlight [WATCH]

Mozilla
Twitter

XH: “In 2021, I spoke with a student named Amaya. During the pandemic, many students like Amaya were forced to resume schooling remotely. She told me about her experience trying to take a lab quiz and struggling to get the test proctoring software, the kind used to prevent cheating, to see her. The lab quiz was 30 minutes long; it took her 45 just to get the sample quiz to see her, which didn’t bode well for the actual test. Amaya was savvy enough to get the software to recognize her eventually, but her lighter-skinned classmates had the privilege of focusing on the quiz itself. We made this video to show that this sort of bias isn’t just an inconvenience. Unchecked, it can result in real learning loss.”

Gender Shades [WATCH]

MIT Media Lab
YouTube

XH: “In her MIT thesis, computer scientist Joy Buolamwini showed how certain facial detection software failed to recognize her face but detected the faces of her lighter-skinned colleagues just fine. What finally did the trick? A white mask. Joy’s research kicked off a renewed interest in bias and ethics in computing and led to her founding the Algorithmic Justice League.”

Coded Bias [WATCH]

Netflix

XH: “If you’re reading this, odds are you’ve seen or at least heard about Netflix’s The Social Dilemma. And if you watched that, you’ll want to watch this too. Joy Buolamwini, Deb Raji, Dr. Timnit Gebru, and others join forces to explain just how bad bias in tech can get and what we can do to solve it.”

Are We Automating Racism? [WATCH]

Vox
YouTube

XH: “While Coded Bias is a feature-length documentary, this video serves as a brief primer on ethics and AI. The team at Vox shows a myriad of ways in which computers and software can inadvertently prioritize white features over Black ones. Even in well-lit photos! Watch this video, then check out Vox’s great explanation of Shirley cards.”

Wrongfully Accused by an Algorithm

Kashmir Hill
The New York Times

XH: “Designing facial recognition algorithms that don’t properly account for people of color can lead to devastating consequences. Robert Julian-Borchak Williams was held in custody for 30 hours because a computer said he was guilty of larceny. It was a false positive. Williams isn’t alone; this happens more than it should, even though facial recognition tech is known to be less effective at telling Black faces apart than white ones.”

Does Snapchat’s Beauty Filter Want You to Look More White?

Veronica Wells
madamenoire.com

XH: “Tech companies don’t have the cleanest track record when it comes to racial bias in their products. Snapchat has been accused multiple times of shipping “beauty” filters that simply apply white features to photo subjects, making skin lighter and slimming down the nose. And Google has had its own embarrassing oversights, including searches for “Black girls” surfacing porn site results, as Dr. Safiya U. Noble wrote about for Time.”

Google Built the Pixel 6 Camera to Better Portray People With Darker Skin Tones. Does It? [PAYWALL]

Nicole Nguyen and Dalvin Brown
The Wall Street Journal

XH: “So what are companies doing to combat this? On the Google side, the company claims its latest phone puts extra effort into making sure Black subjects look good on camera. (This WSJ piece has examples of how it performs.) On the Snapchat side, the company has said it will do better at spotting biased assumptions, like the assumption that smaller noses equal better noses.

Take both with a grain of salt—ultimately, both companies will say whatever makes them look less racist and do whatever’s best for their bottom line. And yet, seeing major tech companies start to address these issues is a step in the right direction.”

Self-Driving Cars May Be More Likely to Hit You if You Have Dark Skin

Karen Hao
MIT Technology Review

XH: “It’s one thing if your computer’s webcam can’t see you during a test; it’s another if an autonomous vehicle’s cameras can’t see you out on the road. A study from the Georgia Institute of Technology found that people with dark skin could be more at risk in a world filled with self-driving cars. You can read the full paper if you want more; this is one topic I hope to never see on video.”

Xavier Harding

Xavier Harding is a content writer for Mozilla, sometimes-host of Mozilla’s Dialogues & Debates, and a former journalist. Prior to Mozilla, Xavier wrote for publications like The Markup, Mic, and Popular Science. At Mic, Xavier won a Webby Award for his story on HBO’s Insecure and how the show’s cinematographer properly lights dark-skinned actors, despite most camera tech being calibrated for light skin.