Review: Artificial Unintelligence in ‘Coded Bias’

Anyone who spends too much time online – which is virtually everyone nowadays – has likely come across discussion of “algorithms”: a catch-all term for automated computer processes, particularly those generated through machine learning on large datasets. The word has infiltrated internet pop culture, whether in Netflix’s decidedly specific catalogue of 76,000+ micro-genres or the spurious “I forced a bot” Twitter scriptwriting meme. My own Twitter bio is “sentient and cute screenwriting algorithm”, a response to Screen Queensland’s recent announcement that it would use a Wattpad algorithm to decide which projects to invest in. Automated processes for categorising data have never been trendier.

But the applications of modern automated computing carry far larger consequences than which films a streaming service recommends to you (or which films get made at all). Focusing primarily on the US and the UK, Coded Bias breaks down how predictive algorithms and facial recognition software have been haphazardly implemented by law enforcement, how they remain in use even when proven inaccurate, how little real legal oversight exists, and how they ultimately facilitate the school-to-prison pipeline.

(For what it’s worth, Australia is also developing a national facial recognition database, while Victoria Police has been using facial recognition software within police stations. My jokes at the time about wearing a face mask in public have not aged well.)

Director Shalini Kantayya (who also produced and co-edited the film) approaches the subject in a way that feels straightforward but never simplistic. Centring on MIT computer scientist Joy Buolamwini, whose research documents bias against black women in facial recognition software, Kantayya carefully balances over a dozen experts and interview subjects. After introducing the relevant technologies, the film unfolds into a methodical examination of how computing perpetuates discrimination, particularly classist misogynoir.

The choice of experts featured in Coded Bias is refreshing: until UK human rights lawyer Ravi Naik appears half an hour in, not a single male talking head is seen. It speaks to one of the film’s subtler suggestions: as with any skewed data set, correcting under-representation requires so-called “over-representation” until the numbers even out overall. Structural inequality is generational, so an equitable future demands conscious decisions to mitigate inherited power. Computer engineering has historically been dominated by white men; the film showcases the field’s increasing diversity instead. Coded Bias centres experts who are almost entirely female, many of them women of colour, and all of them eloquent, knowledgeable, and insightful. It’s a welcome decision, and one the film is stronger for.

The film also manages to avoid scaremongering in its coverage of China’s trial social credit system, in which facial recognition is one element used to enforce a numerical ranking based on permitted or banned behaviour. A brief segment in Hangzhou shows the technology as part of everyday life, explaining that people with low scores have been banned from taking public transport. But Kantayya sidesteps the often inaccurate reporting around this subject: featured mathematician Cathy O’Neil immediately provides context, making clear that the US has similar processes, such as credit scores and algorithms that target poor people with predatory financial products – processes owned and run by corporations, not government.

Moments where the film moves away from the technical are no less valuable. Buolamwini grapples with the knowledge that while her work is anti-racist, facial recognition software free of bias would only be a more effective tool for oppressing marginalised people. While at the hairdresser’s, she recites part of her spoken word piece “AI, Ain’t I a Woman”:

    Can machines ever see my queens as I view them?
    Can machines ever see our grandmothers as we knew them?

A discussion of the fallout from Buolamwini’s work going viral likewise exposes the pushback against her and fellow researchers. “I’m underestimated so much,” she states after sharing criticism from an Amazon representative. As well as documenting the artificial hurdles imposed by poorly constructed tech, Coded Bias provides a snapshot of the entirely conscious racism and sexism aimed at black female scientists.

In an otherwise focused documentary, there are minor distractions. Brief interludes featuring a red orb (à la 2001: A Space Odyssey’s HAL 9000) as a fictional self-aware AI with a female voice provide a break between talking heads, but little actual information. The film’s focus means an exploration of why digital assistants such as Siri are coded as female would be slightly off-track, but the use of a fictional one makes that feel like an omission. And given the film’s well-argued point that artificial intelligence is not intelligent at all, merely repeating biases built in by humans, including a fictional AI narrator feels unwarranted. Similarly missing are concrete statistics for many of the points discussed – though it’s clear much of that data would require freedom of information requests.

Culminating in a rapid-fire edit of a congressional hearing at which Buolamwini testifies (and Alexandria Ocasio-Cortez asks the questions), the film ends with some optimism about the future of US federal oversight and regulation of facial recognition. But it also shows the concrete work of dozens of activists, how hard-fought each win is, and that without the battle there would be no wins at all. At the end of the day, it is as the faux-AI narrator states: there is no algorithm to decide what is just.

Coded Bias screens as part of the 2020 Melbourne International Film Festival from 6-23 August.

View the MIFF site for more info.

**********

Agnes Forrester is a screenwriter and critic based in Melbourne, Australia. She has even more opinions on Twitter at @cartridgepink.
