A new documentary on Netflix is well-timed to help us understand the ground that is shifting beneath our feet when it comes to our society’s relationship with technology. Check out Coded Bias and meet the endearing MIT student whose experience trying to code a simple game led her to a revelation that most algorithms are fundamentally biased. Joy Buolamwini was using facial analysis software when she realized it could see a white face much better than a Black face…and a man’s face much better than a woman’s. As a Black woman, Joy had two counts against her from the start and decided to do something about it.
Joy’s story is the engrossing frame for the doc’s larger exploration of surveillance capitalism: how algorithms are coded and fed to machine learning systems that inform fundamental tools we all use, as well as how the data collected on our phones every day is used by corporations and governments. Shalini Kantayya’s documentary rates 100% on Rotten Tomatoes and will offer pithy talking points for your Memorial Day weekend gatherings.
Joy founded the “Algorithmic Justice League” to help spread the word about how bias in coding affects everyone in society, debuting her arguments in a compelling TED Talk here in 2016. Her arguments are direct and pure, and she has an appealing confidence – in a short time, her work has achieved global recognition. She’s testified before Congress, is taking on bias at Amazon, and her effortless blend of activism and science provides an excellent role model for kids. Get to know her by listening to a recent podcast with Kara Swisher here (April 2021).
Joy’s journey frames the documentary, which introduces us to other academics (mostly female) and human rights activists who explain today’s buzzwords (algorithm, artificial intelligence, and machine learning) in straightforward, practical terms. If you follow the breadcrumbs, you get an unsettling picture of the insidious ways power is being wielded through the data collected on our phones. We learn how cameras are being used in China and the UK to track people with facial recognition based on what is most likely faulty data. In China, citizens receive credit scores based on financial stability and behavior – one’s face is used as a passport to everything in society, so your face quite literally represents the state of your credit. In our society, similar data is used to improve our shopping experiences. Smart teenagers will recognize shades of “Brave New World” and “Minority Report.”
After watching the film, you’ll see Joy as a superhero worthy of the Justice League. You’ll understand that we can’t have true justice without understanding how these algorithms operate. Bonus: she writes beat poetry and recites it over the course of the film!