Libby Odai

Digital Projects

Teach the Machine



Harry demoing the software
Audience member exploring AI
Audience demoing the software



“Aren’t you…?”

“Do you know…?”

“Aren’t you related to…?”

“You look like Beyoncé”

--------------

Like a child, you teach the machine the world. Your world.

Teach the machine: This is a woman, this is a black woman, this is a queer black woman.

-------------------------

Most POC (people of colour) living in majority-white countries have had instances of being mistaken for, or assumed to be, someone else. Psychological phenomena like the cross-race effect (1), where people less readily recognise faces from unfamiliar races, seep into daily interactions. Questions like “Aren’t you so-and-so?” and “Are you sure?” can be a regular part of daily life. Often, celebrities are used as a reference point. As flattering as the comparison is, I most definitely do not look like Beyoncé.

Increasingly, our lives are dictated by the whims of Artificial Intelligence. The route you take to work in the morning, the clothes you buy, even who you date: these are no longer purely human choices. Instead, opaque and mysterious algorithmic decisions drive the machines that we progressively rely on for our daily comings and goings.

However, these robots are not fully autonomous agents. The lack of diversity in the tech sector (2) is echoed in the software and hardware it creates. Take, for example, facial recognition software. It is increasingly used in security systems, from wider contexts like CCTV to personal devices like phones, and is a daily part of modern life. For white men it has an error rate of less than 1%; for black women the error rate can be as high as 35% (3).

This can also take a more sinister turn. Researchers claim facial recognition software can infer someone’s sexuality from a photograph (4), with potentially dire consequences in the wrong hands. The British Home Office already routinely refuses asylum claims based on LGBT+ persecution with a “culture of disbelief” (5). There is a worry that, given the Home Office’s eagerness to use facial recognition (6), marginalised people could be further discriminated against by a system that does not work for them.

Racialised questions like “Aren’t you so-and-so?” and “You look like Beyoncé” are now not only asked by humans. In a world of machines where QTPOC (queer/trans people of colour) do not exist, it is imperative that we make our voices heard.

-------------------------------

The Piece:

The piece is a digital experience that uses dance to explore the disparities in tech. It is a protest against machines and the tech sector being the “experts”: instead, marginalised groups dictate the decisions of an AI programme. It aims to empower the QTPOC community through vogue, an expressive dance form. Using the idea of “walking the ball”, individuals can express and showcase their personality. By forcing machines to learn from us, the piece becomes a symbol of recognition for the beauty and diversity of the community.

The piece involves an interactive dance wall where participants can “teach” an AI programme: a participant inputs a dance move or sequence of moves and links it to their own projected text, image and sound.

To start, vogue artists Claricia Kruithof, Frankie Mullholland and Harry provided the seed material for the learning algorithm. By inputting vogueing dances and their accompanying text, QTPOC culture becomes the “default” against which all future participants are judged. When the AI judges a dance to be similar to the inputted dances, it displays the texts “bedroom vogue” and “high femme”.
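As a rough illustration of how such a display sketch could work (a minimal sketch, not the piece’s full code), the Processing example below listens for Wekinator’s gesture matches over OSC using the oscP5 library. It assumes Wekinator’s default output port (12000); the per-gesture addresses (“/output_1”, “/output_2” here) depend on how the Wekinator project is configured.

// Display-side sketch: shows a projected text when Wekinator reports
// that the incoming dance matches a taught gesture.
import oscP5.*;

OscP5 osc;
String label = "";
int shownAt = -10000;   // time of the last match, in milliseconds

void setup() {
  size(1280, 720);
  textAlign(CENTER, CENTER);
  textSize(72);
  osc = new OscP5(this, 12000);   // assumed: Wekinator's default output port
}

void draw() {
  background(0);
  fill(255);
  // keep the projected text on screen for two seconds after a match
  if (millis() - shownAt < 2000) text(label, width / 2, height / 2);
}

void oscEvent(OscMessage m) {
  // assumed per-gesture addresses; check the Wekinator project's OSC settings
  if (m.checkAddrPattern("/output_1")) {
    label = "bedroom vogue";
    shownAt = millis();
  } else if (m.checkAddrPattern("/output_2")) {
    label = "high femme";
    shownAt = millis();
  }
}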

Over the course of the piece the AI network learns more moves and more people, becoming not only more accurate but richer with participatory input. People added dances and images from their own experiences, increasing the cultural scope of the piece. An important part of the piece is audience education. For many people it is the first time AI has been made explicit in their lives. Talking through the software, how it works and the disparities in technology is fundamental to the overall piece.

Technical

The piece uses a Kinect camera to capture skeleton-tracking data in a Processing sketch, which sends the data over OSC to Wekinator. Wekinator uses Dynamic Time Warping to match incoming dances against the recorded models, and its output is displayed by a second Processing sketch.
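A minimal sketch of the input side follows (the display side was sketched earlier). It assumes Wekinator’s defaults, listening for input on port 6448 at the address “/wek/inputs”, and uses the oscP5 library. getJointPositions() is a hypothetical stand-in for the real skeleton-tracking call (e.g. via the KinectPV2 or SimpleOpenNI library), and Wekinator would need to be configured to expect the same number of inputs.

// Input-side sketch: sends one feature vector of joint positions per frame.
import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress wekinator;
int NUM_JOINTS = 15;   // joints per frame; Wekinator expects NUM_JOINTS * 2 inputs

void setup() {
  size(640, 480);
  osc = new OscP5(this, 9000);                    // local port, only needed to send
  wekinator = new NetAddress("127.0.0.1", 6448);  // assumed: Wekinator's default input port
}

void draw() {
  background(0);
  float[] joints = getJointPositions();           // hypothetical Kinect call: x, y per joint
  OscMessage msg = new OscMessage("/wek/inputs"); // assumed: Wekinator's default input address
  for (float v : joints) msg.add(v);
  osc.send(msg, wekinator);
}

// Placeholder so the sketch runs without a Kinect attached.
float[] getJointPositions() {
  float[] fake = new float[NUM_JOINTS * 2];
  for (int i = 0; i < fake.length; i++) fake[i] = random(1);
  return fake;
}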

---------------------------

(1) https://en.wikipedia.org/wiki/Cross-race_effect

(2) https://technation.io/insights/diversity-and-inclusion-in-uk-tech-companies/

(3) https://www.nytimes.com/2018/02/09/technology/facial-recognition-race-artificial-intelligence.html

(4) https://www.theguardian.com/technology/2018/jul/07/artificial-intelligence-can-tell-your-sexuality-politics-surveillance-paul-lewis

(5) https://www.theguardian.com/uk-news/2019/sep/02/home-office-refused-thousands-of-lgbt-asylum-claims-figures-reveal

(6) https://www.ukauthority.com/articles/home-office-backs-facial-recognition-trials/