Counterproductive

Counterproductive is a short film about the consequences of algorithmic bias in artificial intelligence. It was made as a thesis project in collaboration with Ravin Raori at dfpi.

We started submitting this film to festivals less than two weeks ago. So far (as of 7 February 2021) it has won one award, received three nominations, and been selected as a semi-finalist in one festival.

People increasingly depend on AI in their daily lives. As they co-inhabit the world with these technologies, they need to understand how the underlying algorithms work and to what extent those algorithms can affect their lives. The film investigates the algorithm as a human creation, questioning its perceived neutrality. Biases in algorithms take several forms and manifest in different ways. The work focuses on biases in training and classification data, how this data is transferred to specific contexts, and the consequences for gender and gender identity in particular.

IALab official link

The research found an increasing dependency on algorithms that sort and classify people, and this classification is assumed to be objective. The resulting film is a metaphorical extrapolation of the consequences of that assumption in a world where living with AI-powered devices has become normalised. The production involved designing and building the devices that serve as characters in the film and that ‘act’ in real time during it. The film was written and produced with a full crew during the challenging conditions of the Covid-19 lockdown. The documentation and interviews with participants also serve as a companion film about contemporary gender relations.

Concept


Biases by Gender

We chose to focus on biases by gender. To represent these biases in machines, we first had to look at the real world to get an idea of what our own gendered biases are.

We then asked ourselves how we could represent these biases in iterations of smart technology. “What if the everyday household objects that are slowly gaining more and more so-called ‘smart’ capabilities were somehow permeated by some of these biases?”

The biased interactions we chose to translate were:

  1. The phenomenon of women being asked to smile more often.
  2. The expectation that women should have sweeter, softer voices, and the connotations associated with voice.
  3. The difference in the way men and women look at each other on the street and the idea of masculine surveillance.

Research and Theory

The work of Joy Buolamwini is especially relevant to our research, as it highlights how the data behind algorithms can exclude people who belong to oppressed communities. Her work stems from her own master’s project, in which she investigated why a facial recognition algorithm would not recognise people of colour. She concluded that the problem lay in the lack of diversity in the data used to train the model.

Prop-Making

All props for the film were designed and manufactured at the Interactive Architecture Lab, The Bartlett, UCL at Here East.


Overview




The Lamp


The lamp used a machine learning model trained to recognise how wide the user opened their hand as a gesture to turn the light on; we then built on this to explore what a future smart lamp could potentially do. The idea of masculine surveillance was explored through the lamp.
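
A minimal sketch of how this kind of hand-openness gesture could be detected is shown below, assuming a webcam and the MediaPipe Hands landmark model; the openness metric and the threshold are illustrative assumptions, not the setup used for the actual prop.

```python
# Hedged sketch: estimate how wide a hand is open from MediaPipe Hands
# landmarks and use it as an on/off gesture for the lamp.
# OPEN_THRESHOLD and the openness metric are hypothetical tuning choices.

import cv2
import mediapipe as mp

FINGERTIPS = [4, 8, 12, 16, 20]   # thumb, index, middle, ring, pinky tips
WRIST, MIDDLE_MCP = 0, 9          # used to normalise by palm size
OPEN_THRESHOLD = 1.6              # hypothetical tuning value

def openness(landmarks):
    """Rough 'how wide is the hand open' score: mean fingertip-to-wrist
    distance divided by palm length."""
    wrist = landmarks[WRIST]
    palm = ((landmarks[MIDDLE_MCP].x - wrist.x) ** 2 +
            (landmarks[MIDDLE_MCP].y - wrist.y) ** 2) ** 0.5
    spread = sum(((landmarks[t].x - wrist.x) ** 2 +
                  (landmarks[t].y - wrist.y) ** 2) ** 0.5 for t in FINGERTIPS)
    return (spread / len(FINGERTIPS)) / max(palm, 1e-6)

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            score = openness(result.multi_hand_landmarks[0].landmark)
            print(f"openness={score:.2f}  lamp={'ON' if score > OPEN_THRESHOLD else 'OFF'}")
finally:
    cap.release()
```

In a physical prop, the printed on/off state would instead drive the lamp’s relay or dimmer.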

“What if the lamp’s gaze could follow you everywhere you went?”

While shooting, the lamp was eventually also connected to a controller that allowed the film crew to operate it manually from behind the scenes as the actor interacted with it in front of the camera.
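
The sketch below shows one way such a manual override could sit alongside the autonomous behaviour, assuming a USB gamepad read via pygame and an Arduino-driven pan servo that accepts angle values over serial; the serial port, the protocol, and the tracked_angle() placeholder are all hypothetical rather than the rig used on set.

```python
# Hedged sketch: a crew-operated gamepad overrides the lamp's autonomous
# tracking whenever the stick is moved; otherwise the tracked angle is used.
# Port, baud rate, and serial protocol are assumptions.

import time
import pygame
import serial

arduino = serial.Serial("/dev/ttyUSB0", 9600)   # hypothetical port and firmware

pygame.init()
pygame.joystick.init()
pad = pygame.joystick.Joystick(0)
pad.init()

def tracked_angle():
    """Placeholder for the autonomous tracking angle (0-180 degrees)."""
    return 90

while True:
    pygame.event.pump()
    stick = pad.get_axis(0)                  # -1.0 .. 1.0
    if abs(stick) > 0.1:                     # crew takes over the lamp
        angle = int(90 + stick * 90)
    else:                                    # lamp 'acts' on its own
        angle = tracked_angle()
    arduino.write(f"{angle}\n".encode())
    time.sleep(0.05)
```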


The Kettle


The kettle used a model trained to recognise whether a face was smiling or not, based on tracked points on the lips. This information was mapped to visual feedback showing the kettle’s willingness to boil water whenever someone smiled at it.
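
A minimal sketch of this kind of smile detection is shown below, assuming a webcam and the MediaPipe Face Mesh model; the chosen landmarks, the smile metric, and the threshold are illustrative assumptions rather than the film’s actual lip-tracking setup.

```python
# Hedged sketch: a crude smile score from face-mesh lip landmarks, mapped to
# a 0-1 'willingness to boil' value. Landmark choice and threshold are
# assumptions for illustration.

import cv2
import mediapipe as mp

MOUTH_L, MOUTH_R = 61, 291     # mouth corners
EYE_L, EYE_R = 33, 263         # outer eye corners, used for normalisation
SMILE_THRESHOLD = 0.58         # hypothetical tuning value

def dist(a, b):
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5

mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
cap = cv2.VideoCapture(0)
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_face_landmarks:
            lm = result.multi_face_landmarks[0].landmark
            smile = dist(lm[MOUTH_L], lm[MOUTH_R]) / max(dist(lm[EYE_L], lm[EYE_R]), 1e-6)
            willingness = min(max((smile - SMILE_THRESHOLD) * 5, 0.0), 1.0)
            print(f"smile={smile:.2f}  willingness={willingness:.2f}  "
                  f"{'boiling' if willingness > 0 else 'waiting'}")
finally:
    cap.release()
```

The willingness value is what would be mapped to the kettle’s visual feedback, for example the brightness of an indicator.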


The Toaster


The toaster was a manifestation of what interaction would look like if a user received visual feedback on their voice. Based on our interview research, we concluded that this is something women face on a daily basis and that it shapes how seriously people take them. The toaster only toasts bread if the user’s voice stays under a certain decibel level.
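
The sketch below illustrates one way such a voice-level gate could work, assuming a microphone and the sounddevice library; loudness here is dBFS (decibels relative to full scale) rather than calibrated sound pressure, and the threshold is a hypothetical value.

```python
# Hedged sketch: measure microphone level in dBFS and only 'toast' while the
# voice stays below a threshold. The threshold is an assumption.

import numpy as np
import sounddevice as sd

THRESHOLD_DBFS = -25.0     # hypothetical: louder than this and the toaster refuses
SAMPLE_RATE = 16000
BLOCK_SECONDS = 0.25

def callback(indata, frames, time_info, status):
    rms = np.sqrt(np.mean(indata[:, 0] ** 2))
    dbfs = 20 * np.log10(max(rms, 1e-10))
    toasting = dbfs < THRESHOLD_DBFS
    print(f"level={dbfs:6.1f} dBFS  {'toasting' if toasting else 'too loud, refusing'}")

with sd.InputStream(channels=1, samplerate=SAMPLE_RATE,
                    blocksize=int(SAMPLE_RATE * BLOCK_SECONDS),
                    callback=callback):
    sd.sleep(10_000)   # listen for ten seconds
```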

Storyboards


Story Board Breakdown 1


Story Board Breakdown 2

Production

The film Counterproductive was shot in 4 days over 5 locations during the November 2020 Covid-19 lockdown.




References

https://www.stylist.co.uk/life/stop-telling-women-to-smile-feminism-sexism-victoria-beckham-brie-larson-experiment-psychology-opinion/229587
https://www.istockphoto.com/photos/human-mouth-gag-adhesive-tape-women?phrase=human%20mouth%20gag%20adhesive%20tape%20women&sort=best
https://www.tn2magazine.ie/feminist-film-series-the-male-gaze/
http://www.poetofcode.com/