Libby Odai

Digital Projects

Teach the Machine

Harry demoing the software
Audience member exploring AI
Audience demoing the software

“Aren’t you…?”

“Do you know…?”

“Aren’t you related to…?”

“You look like Beyoncé”

Like a child, you teach the machine the world. Your world.

Teach the machine: This is a woman, this is a black woman, this is a queer black woman.

Most POC (people of colour) living in majority-white countries have experienced being mistaken or confused for someone else. Psychological phenomena like the cross-race effect (1), where people less readily recognise faces from unfamiliar races, seep into daily interactions. Questions like “Aren’t you so-and-so?” and “Are you sure?” can be a regular part of daily life. Often, celebrities are used as a reference point. As flattering as the comparison is, I most definitely do not look like Beyoncé.

Increasingly, our lives are dictated by the whims of Artificial Intelligence. From the route you take to work in the morning to the clothes you buy and even who you date, these are no longer purely human choices. Instead, opaque and mysterious algorithmic decisions drive the machines that we progressively rely on for our daily comings and goings.

However, these robots are not fully autonomous agents. The lack of diversity in the tech sector is echoed in the software and hardware it creates. Take, for example, facial recognition software. Increasingly used in security systems, in wider contexts like CCTV as well as in personal devices like phones, it is a daily part of modern life. For white men it has an error rate of less than 1%; for black women, however, the error rate is 35%.

This can also take a sinister turn. Researchers have claimed that facial recognition software can infer someone’s sexuality from a photograph, with potentially dire consequences in the wrong hands. The British Home Office already routinely refuses LGBT+ persecution asylum claims with a “culture of disbelief”. There is a worry that, given the Home Office’s eagerness to use facial recognition, marginalised people could be further discriminated against by a system that does not work for them.

Racialised questions like “Aren’t you so-and-so?” and “You look like Beyoncé” are no longer asked only by humans. In a world of machines where QTPOC (queer/trans people of colour) do not exist, it is imperative that we make our voices heard.

Teach the Machine is a digital experience that uses dance to explore the disparities in tech. It is a protest against machines and the tech sector being the “experts”: instead, marginalised groups dictate the decisions of an AI programme. It aims to empower the QTPOC community through vogue, an expressive dance medium. Using the idea of “walking the ball”, individuals can express and showcase their personality. By forcing machines to learn from us, the piece serves as a symbol recognising the beauty and diversity of the community.

The piece involves an interactive dance wall where participants will be able to “teach” an AI programme. The participant is able to input a dance move or sequence of moves and link it to their own projected text, image and sound.

To start, vogue artists Claricia Kruithof, Frankie Mullholland and Harry provided the starting point for the learning algorithm. By inputting vogueing dances and their text, QTPOC culture becomes the “default” against which all future participants are judged. When the AI judges a dance to be similar to the inputted dances, it displays the text “bedroom vogue” and “high femme”.

Over the course of the piece the AI network learns more moves and people, becoming not only more accurate but also richer with participatory input. People added dances and images from their own experiences, widening the cultural scope of the piece. Audience education is an important part of the work: for many people, this is the first time AI has been made explicit in their lives. Talking through the software, how it works and the disparities in technology is fundamental to the overall piece.


The piece uses a Kinect camera to feed skeleton-tracking data into a Processing sketch, which forwards it to Wekinator.
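As an illustrative sketch only (not the project’s actual code), the data flow can be mimicked: each tracked Kinect joint has x/y/z coordinates, which are flattened into one feature vector and sent as an OSC message to Wekinator, which by default listens on UDP port 6448 at the address /wek/inputs. The joint names and helper functions below are hypothetical; the real Processing sketch would use a Kinect library and an OSC library such as oscP5 rather than hand-encoding packets.

```python
import struct

WEKINATOR_ADDRESS = "/wek/inputs"  # Wekinator's default OSC input address
WEKINATOR_PORT = 6448              # Wekinator's default listening port

def flatten_skeleton(joints):
    """Flatten a {joint_name: (x, y, z)} dict into the flat list of
    floats Wekinator expects as its input feature vector."""
    features = []
    for name in sorted(joints):  # stable order so features line up run to run
        features.extend(joints[name])
    return features

def osc_encode(address, floats):
    """Encode a minimal OSC message: null-padded address string,
    type-tag string (one 'f' per argument), then big-endian float32s."""
    def pad(raw):
        # OSC strings are null-terminated and padded to a 4-byte boundary
        return raw + b"\x00" * (4 - len(raw) % 4)
    packet = pad(address.encode())
    packet += pad(("," + "f" * len(floats)).encode())
    for value in floats:
        packet += struct.pack(">f", value)
    return packet

# Hypothetical two-joint skeleton frame, just to show the shape of the data.
skeleton = {"head": (0.0, 1.7, 2.0), "hand_left": (-0.3, 1.2, 1.9)}
features = flatten_skeleton(skeleton)
packet = osc_encode(WEKINATOR_ADDRESS, features)
# packet could now be sent over UDP to ("localhost", WEKINATOR_PORT)
```

In the installation itself the Processing sketch plays this role continuously, sending one such feature vector per camera frame so Wekinator can match the live dancer against the moves it was taught.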