"Coded Bias" is a documentary film directed by Shalini Kantayya that explores the hidden biases embedded in artificial intelligence (AI) algorithms and their impact on society. The film follows the story of MIT researcher Joy Buolamwini, who discovers that facial recognition software has difficulty recognizing her face because the algorithm was not trained on a diverse set of faces.
The film begins with Buolamwini's personal experience, in which facial recognition software failed to recognize her face because of her dark skin tone. This experience led her to investigate further, and she found that it was not an isolated incident. In fact, the bias was built into the algorithm itself, which had been trained primarily on white male faces.
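To make that finding concrete for myself, here is a small, purely illustrative Python sketch of the kind of audit the film describes: comparing how often a system succeeds for different demographic groups. The groups and detection results below are invented for demonstration, not data from the film or from Buolamwini's research.

```python
# Illustrative sketch only: measuring how success rates can differ across
# demographic groups. All records below are made up for demonstration.
from collections import defaultdict

# Hypothetical evaluation records: (group label, whether the face was detected)
results = [
    ("lighter-skinned male", True), ("lighter-skinned male", True),
    ("lighter-skinned male", True), ("lighter-skinned male", True),
    ("darker-skinned female", True), ("darker-skinned female", False),
    ("darker-skinned female", False), ("darker-skinned female", False),
]

counts = defaultdict(lambda: [0, 0])  # group -> [detected, total]
for group, detected in results:
    counts[group][0] += int(detected)
    counts[group][1] += 1

# Report the per-group detection rate; a large gap signals a skewed system.
for group, (detected, total) in counts.items():
    print(f"{group}: {detected}/{total} detected ({detected / total:.0%})")
```

Even this toy audit makes the point: an overall accuracy number can look fine while one group bears almost all of the errors.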
Buolamwini's discovery raised questions about the ethical implications of AI and how algorithms are designed, tested, and implemented. The documentary highlights that the development of AI is often driven by profit motives and corporate interests, which can lead to the exclusion of minority groups.
The film explores several case studies that illustrate the harmful impact of biased AI algorithms. For example, it examines the use of predictive policing software, which uses historical crime data to predict where crimes are likely to occur. However, because the data used to train the algorithm is often biased against minority groups, the software perpetuates and amplifies existing inequalities in the criminal justice system.
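To see why this becomes a feedback loop, here is a hedged toy simulation (every number is invented, not taken from the film): patrols are allocated in proportion to previously recorded crime, and more patrols produce more recorded incidents, so an initial skew in the historical record never corrects itself even when the underlying rates are identical.

```python
# Toy feedback-loop sketch with invented numbers: two areas with the SAME
# true incident rate, but a biased historical record that drives patrol
# allocation, which in turn drives what gets recorded next.
recorded = {"neighborhood A": 60, "neighborhood B": 40}  # skewed historical data
true_rate = 50        # identical underlying incidents per period in both areas
total_patrols = 10

for period in range(5):
    total_recorded = sum(recorded.values())
    for area in recorded:
        # Patrols follow the recorded data, not the true rate.
        patrols = total_patrols * recorded[area] / total_recorded
        # Each patrol observes a fixed fraction of the true incidents.
        recorded[area] += int(true_rate * min(1.0, 0.1 * patrols))
    print(period, recorded)
```

Running it shows neighborhood A's recorded count staying permanently ahead of B's, even though both have the same true rate: the original bias is simply reproduced period after period.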
Another example the film explores is the use of AI algorithms in the hiring process. The algorithms are designed to filter job applications based on certain criteria, but they often perpetuate gender and racial biases. For example, an algorithm might penalize a job candidate for taking maternity leave, or it might exclude candidates who don't have a certain educational background, which could disproportionately affect minority groups.
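As a purely hypothetical illustration of how "neutral-looking" screening criteria can act as proxies, consider a toy filter like the one below. Every name, school, and rule here is invented; it is only meant to show how a criterion such as "no employment gaps" quietly excludes someone who took parental leave.

```python
# Hypothetical automated resume screen: the rules look neutral, but they
# function as proxies that filter out particular groups of candidates.
PREFERRED_SCHOOLS = {"State Tech", "Capital University"}  # invented names

def passes_screen(candidate: dict) -> bool:
    """Return True if the candidate clears the automated filter."""
    no_gap = candidate["employment_gap_months"] == 0
    right_school = candidate["school"] in PREFERRED_SCHOOLS
    return no_gap and right_school

candidates = [
    {"name": "A", "employment_gap_months": 0, "school": "State Tech"},
    {"name": "B", "employment_gap_months": 9, "school": "State Tech"},        # e.g. maternity leave
    {"name": "C", "employment_gap_months": 0, "school": "Community College"},
]

for c in candidates:
    print(c["name"], "passes" if passes_screen(c) else "filtered out")
```

Candidate B is rejected for an employment gap and candidate C for their school, even though neither rule says anything explicit about gender or race.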
The film also highlights the lack of regulation and oversight of AI technology. The industry is largely self-regulated, which means that companies are not required to disclose how their algorithms work or what data they use to train them. This lack of transparency and accountability raises serious concerns about the potential for AI to be used for discriminatory purposes.
My favorite part of the film was the hopeful ending. Despite the alarming findings, the film closes by highlighting the work of activists and researchers who continue to push for more transparency and accountability in AI development. Buolamwini, for example, founded the Algorithmic Justice League, which advocates for more inclusive and ethical AI. I feel like this part of the movie gave me and other viewers somewhat of a "way out," showing that these problems are not inevitable and can still be challenged.
Overall, "Coded Bias" is an important and thought-provoking film that sheds light on the hidden biases in AI algorithms and their impact on society. The film raises important questions about the ethical implications of AI and the need for greater transparency and accountability in its development. It is a must-watch for anyone interested in technology, social justice, and the future of our society.