
Big Brother is watching you. And listening to your conversations. And checking your search history, credit score, college transcript, driving record, fertility, and social status. And no one is doing all that much to stop it. That’s the chilling message at the heart of Coded Bias, Shalini Kantayya’s extremely timely documentary about the power of AI and facial recognition technology.

Using research by MIT computer scientist/digital activist Joy Buolamwini as a jumping-off point, the film explores the inherent bias in the algorithms used by many tech companies. Designed largely by white men, it turns out that a lot of facial-recognition programs are best at recognizing…white men. So when those technologies are used to scan the faces of women and people of color, they’re far more likely to make incorrect associations and conclusions. Inconvenient if you want to use your face to open a biometric lock; potentially devastating if a database suggests you’re a terrorist.

And it’s not just facial-recognition programs that are problematic. Algorithms used to scan candidate listings for hiring have a tendency to eliminate women and people of color. Software used to assess job performance can unfairly penalize people based on their race and socioeconomic status. Meanwhile, governments around the world are deploying advanced technology to help them keep an eye on the populace. In the UK, an organization called Big Brother Watch works hard to raise citizens’ awareness of the cameras following their every move; in China, it’s just a fact of life. One of the film’s most sobering moments is when a young Chinese woman talks about the benefits of everyone being tracked, recognized, and earning “social credit” for “behaving”: “We can choose to immediately trust someone based on their credit score instead of relying on [our] own senses to figure it out. This saves me time from having to get to know a person.” Yikes.

But if Buolamwini and her fellow crusaders for algorithmic justice have their way, tech companies and their ambitious, flawed algorithms will become subject to oversight and regulation. As she pointedly says, “People who have been marginalized will be further marginalized if we’re not looking at ways of making sure the technology we’re creating doesn’t propagate bias.” Because it’s not just The Man you have to stay ahead of now; it’s The Machine, too. — Betsy Bozdech

Team #MOTW’s comments:

Sandie Angulo Chen: Director Shalini Kantayya’s documentary about the ways that AI and algorithms can be biased against people of color, women, and the poor is a must-see cautionary tale about unregulated technology. It’s important for people of all backgrounds to understand that automation technology has not only limitations but serious, far-reaching consequences. The film, via interviews with computer scientists like MIT’s Joy Buolamwini and other experts, reveals how there are proven flaws with the AI used in facial recognition software, or for human resources, employment, and credit-card application screenings. The film offers a thoughtful examination of a subject most people probably don’t take into consideration.

Pam Grady: Welcome to the brave new world of artificial intelligence, algorithms, facial recognition, and the surveillance state. Shalini Kantayya’s chilling documentary lays out the Pandora’s box unleashed by advancing technology. Computer scientist and activist Joy Buolamwini, a Black woman, identifies how software designed by white men is inherently biased against women and people of color. In England, activists work against Britain’s surveillance state, in which the police are ever more reliant on software – which is frequently wrong – to identify suspects. Good luck getting a job, a train ticket, or even a Pepsi in China without facial recognition and the social credit that accrues with total social control. Algorithms are simply mathematical equations, but ones that already have grave implications in our lives, affecting not just our Netflix recommendations or what we see on Facebook, but also our credit and our very identity. Kantayya’s film lays out the blueprint for a battle, just beginning, that pits control against freedom. With behemoth companies like Facebook, Amazon, and Google further developing the technology, the film lays bare a technological arms race we all abet with every computer click and post on social media.

Marina Antunes: There’s a myth floating around that AI isn’t racist, but in reality AI is only as unbiased as its architects – and even they don’t always know how the AI is reaching its conclusions. Where does that leave the more than half of the world’s population that is left out, or unfairly judged, by “blind” tech? With her documentary Coded Bias, filmmaker Shalini Kantayya introduces us to the researchers and advocates who are speaking out against the use of AI for everything from facial recognition to algorithms devised to help streamline systems but which are instead dividing society further. A must-watch.

Leslie Combemale: Director Shalini Kantayya brings us Coded Bias, a documentary that rightly feeds fears on a subject many have blithely ignored: the increasing control technology exerts over the world, in every aspect of life. Worse, that technology is pervasively anchored in bias, expanding disparity in wealth, education, health, safety, and so much more. The film is an eye-opening examination of just how little the public is aware of how and when they are being watched, categorized, pigeonholed, and discarded, often because of their gender, race, or background – and more often than not erroneously. Read full review.

MaryAnn Johanson: There’s a maxim in computer programming that goes: “Garbage in, garbage out.” Meaning: whatever computers can do for us is only as good as the instructions we give them. Here, Shalini Kantayya reminds us that the inescapable corollary of that is: “Bigotry in, bigotry out.” And she does that in the slyest possible way, via an array of talking-head experts who are, as we slowly come to realize, almost exclusively women, and women of color to boot. Hooray! Which only serves to underscore the reality that when it’s only a *very* narrow slice of white men who are responsible for all of the systems that dictate our lives — and that are on track to dictate even more, and more insidiously, in the near future — injustice is in the cards for the rest of us. We can do better… but only if we understand what is happening, and what is at stake. This film does an excellent job of showing us both.

Nikki Baughan: This fascinating, insightful, and timely documentary takes a deep dive into the artificial intelligence revolution, and examines how software like facial recognition and algorithmic learning is proving to be — like the world around it — systemically biased against anyone who isn’t white or male. That’s unsurprising, given that the programmers who dominate this space are both white and male, but it’s still shocking to see facial recognition software failing to compute Black faces, or the extent to which police forces are using these flawed algorithms to infringe on basic human rights. But, alongside these dire facts, director Shalini Kantayya retains an optimistic tone, giving the floor to the brilliant, inspirational mathematicians and activists – all female — who are fighting to raise awareness of these practices on both sides of the Atlantic.

Nell Minow: There’s an old expression in computer data: Garbage in, garbage out. And yet, we trust computers as though they are not subject to the biases and mistakes and hubris of the people who build and program them. Coded Bias is a powerful reminder that for computers as well as humans, representation matters.

Loren King: Coded Bias is a fascinating and compelling exposé about how Joy Buolamwini, a Media Lab researcher at the Massachusetts Institute of Technology, discovered that facial-recognition software does not reliably register the faces of darker-skinned people and is also less likely to register women’s visages. Investigating her findings, Buolamwini found that this was likely a reflection of a high-tech industry dominated by white men; in other words, the data system is biased. Filmmaker Shalini Kantayya examines the real-world implications of this. Besides the threat to privacy, there is the potential for abuse by governments that use facial-recognition systems for security and surveillance. Among the experts interviewed is Silkie Carlo, director of Big Brother Watch in London. We see her monitoring the efforts of British police to identify criminals on the street using the flawed technology, which results in widespread misidentification and a violation of civil liberties. Kantayya makes all this complex information more accessible to general audiences by including references to artificial intelligence in popular culture. There are clips from movies including Steven Spielberg’s dystopian Minority Report (2002) and Stanley Kubrick’s prescient 2001: A Space Odyssey (1968) that draw parallels between science fiction and here-and-now reality.

Jennifer Merin: We might all crave escapist comedies during the ongoing pandemic’s social restrictions, but this compelling documentary is an important and timely film. Knowledge is power, and Coded Bias delivers knowledge that can help us to resist and reverse the further deterioration of democracy and of human civilization as we know it. Read full review.


Kathia Woods: Racism is prevalent, including in science. An experiment at MIT reveals a flaw in facial-recognition technology as it applies to people of color — specifically Black people, who don’t exist when it comes to the “Aspire Mirror.” Joy Buolamwini, the creator of the MIT experiment, discovered that the software couldn’t read her face because she’s Black. The idea of the software is to project fun filters to start your day. Coded Bias shows that the odds are against people of color and women when it comes to STEM. The film informs the viewer of the social and political challenges that are present in the institutions of science. On the upside, I felt immense pride seeing women of color — Black and Brown women — showing their knowledge of STEM. They turned a negative situation into a positive one. We still can’t ignore the fact that the system was made to keep people of color from actively participating and benefitting. Coded Bias sounds a critical alarm that the scientific community still ignores minorities when creating products and when it comes to realizing those products.

Susan Wloszczyna: In Coded Bias, the charismatic Joy Buolamwini, a student at MIT, calls out the lack of legislation and oversight of invasive facial recognition and other algorithmic tools, and the ways governments use such tech to unfairly judge people. For instance, Amazon employed an algorithm to make a first cut of job applicants, only to find it picked only white males. That reflects the culture of Silicon Valley, which is controlled primarily by white males. Read full review.

Cate Marquis: In the eye-opening documentary Coded Bias, the assumption that decisions made by computer programs are free of the biases of humans is upended. That process begins when a young Black woman, an MIT computer programming student, notices that facial-recognition software has a hard time seeing her — a curious fact that leads to the surprising discovery that the artificial intelligence it runs on was trained mostly on male and light-skinned faces, creating a hidden, built-in bias against faces that look “different.” But what seems, at first, like it could be an isolated glitch is soon revealed as one of many unconscious biases in AI programs, along with a fundamental lack of understanding by the tech pros of how exactly this kind of software works, even as it becomes a pervasive tool of governments, banks, and hiring departments with the power to impact all our lives. Far from a Luddite’s complaint, director Shalini Kantayya’s Coded Bias brings in expert AI and tech voices to warn the public that widespread use of this technology, by business and government, can actually reinforce discrimination or lead to other unintended and potentially dangerous consequences for society.


Title: Coded Bias

Director: Shalini Kantayya

Release Date: November 11, 2020

Running Time: 90 minutes

Language: English

Screenwriter: Shalini Kantayya (documentary)

Distribution Company: 7th Empire Media


Official Website

AWFJ Movie of the Week Panel Members: Sandie Angulo Chen, Marina Antunes, Nikki Baughan, Betsy Bozdech, Leslie Combemale, Pam Grady, MaryAnn Johanson, Loren King, Cate Marquis, Jennifer Merin, Nell Minow, Liz Whittemore, Susan Wloszczyna, Kathia Woods

Previous #MOTW Selections

Other Movies Opening This Week

Edited by Jennifer Merin


Jennifer Merin

Jennifer Merin is the Film Critic for Women’s eNews, contributes the CINEMA CITIZEN blog, and is managing editor for Women on Film, the online magazine of the Alliance of Women Film Journalists, of which she is President. She has served as a regular critic and film-related interviewer for The New York Press, and has written about entertainment for USA Today, The L.A. Times, US Magazine, Ms. Magazine, Endless Vacation Magazine, Daily News, New York Post, SoHo News and other publications. After receiving her MFA from Tisch School of the Arts (Grad Acting), Jennifer performed at the O'Neill Theater Center's Playwrights Conference, Long Wharf Theater, American Place Theatre and La MaMa, where she worked with renowned Japanese director Shuji Terayama. She subsequently joined Terayama's theater company in Tokyo, where she also acted in films. Her journalism career began when she was asked to write about Terayama for The Drama Review. She became a regular contributor to the Christian Science Monitor after writing an article about Marketta Kimbrell's Theater For The Forgotten, with which she was performing at the time. She was an O'Neill Theater Center National Critics' Institute Fellow, and then became the institute's Coordinator. While teaching at the Universities of Wisconsin and Rhode Island, she wrote "A Directory of Festivals of Theater, Dance and Folklore Around the World," published by the International Theater Institute. Denmark's Odin Teatret's director, Eugenio Barba, wrote his manifesto in the form of a letter to "Dear Jennifer Merin," which has been published around the world, in languages as diverse as Farsi and Romanian. Jennifer's culturally-oriented travel column began in the LA Times in 1984, then moved to The Associated Press, LA Times Syndicate, Tribune Media, Creators Syndicate and (currently) Arcamax Publishing.
She's been news writer/editor for ABC Radio Networks, on-air reporter for NBC, CBS Radio and, currently, for Westwood One's America In the Morning. She is a member of the Critics Choice Association in the Film, Documentary and TV branches and a voting member of the Black Reel Awards. For her AWFJ archive, type "Jennifer Merin" in the Search Box (upper right corner of screen).