MyFACE consists of several phases; the first will be carried out in collaboration with Bits of Freedom. The project's kick-off, in the summer of 2022, collects signatures for Reclaim Your Face, a European initiative by Bits of Freedom to ban biometric mass surveillance practices. MyFACE walks the line between activism in public space and research into the relations between identity, AI and the weaponization of biometric data.
The project consists of a series of performances in public spaces in which people wear silicone copies of the artist Laura A Dima's face. During the performance, the masked performers post material on social media, producing online content that social media algorithms will recognize as Laura A Dima, who thus appears to be in multiple places at once. The aim is threefold: to create a glitch between a registered identity and the software managing that data, to find out how masks change one's behaviour in public, and to learn how an audience reacts to masked performers in public space.
The performances involve a group of people who venture into the city of Amsterdam, all wearing silicone masks. Some masks are identical to the artist's face, while others are alterations of it. The volunteers are assigned specific routes through Amsterdam: some performers go by public transport, others walk or cycle, and others stop now and then at a shop or café. By assigning participants different routes, the performance covers different areas of the city and therefore gains more exposure. It also explores different aspects of public life; Dutch law, for example, prohibits face-covering clothing (such as a burqa) on public transport and in certain public buildings. But does that ban apply to performers wearing silicone masks?
During their assigned route, the participants shoot videos and pictures with their mobile phones and upload this content to their personal social media accounts. Researchers have demonstrated how easily people can be recognized and tracked using real-time data from social media, and thus how such data can be used to surveil us. If the system is fooled, social media algorithms will identify Laura A Dima as being in multiple places at once, creating a glitch between her identity and what so-called Big Tech and Big Data consider to be Laura A Dima.
The first performance took place on 15 June 2022 at Sociëteit SEXYLAND, a cultural clubhouse and ‘free space’ for creatives in Amsterdam that gives artists the opportunity to produce their own programs.
In Europe, the AVG (the Dutch implementation of the GDPR) protects us against the misuse of (biometric) personal data. Biometric personal data is personal data resulting from specific technical processing of the physical, physiological or behavioural characteristics of an individual. The law has several loopholes, and there are not enough civil servants to check compliance with it. Many companies and organizations appear to misuse this sensitive data, and biometric data contains more information than is strictly necessary for identification. Jevons Paradox proposes that an increase in the efficiency with which a resource is used leads to an overall increase in its consumption, not a decrease. Making facial recognition, surveillance techniques and data mining cheaper and more efficient might therefore lead to more facial recognition, not less.
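The rebound effect that Jevons Paradox describes can be sketched with a toy constant-elasticity demand model. All numbers below (the elasticity value, the baseline scale, the halved cost) are illustrative assumptions, not data from the project or from any real surveillance system:

```python
# Toy illustration of Jevons Paradox: when demand for a technology is
# sufficiently elastic, making each use cheaper increases total use.
# Model and numbers are hypothetical, chosen only to show the mechanism.

def total_use(cost_per_scan: float, elasticity: float = 1.5, scale: float = 1000.0) -> float:
    """Constant-elasticity demand: use = scale * cost^(-elasticity)."""
    return scale * cost_per_scan ** (-elasticity)

before = total_use(cost_per_scan=1.0)  # baseline cost per face scan
after = total_use(cost_per_scan=0.5)   # the technology becomes twice as efficient

# Each scan is now cheaper, yet total scanning more than doubles.
print(before, after, after / before)
```

With an elasticity above 1, halving the cost per scan raises total use by more than a factor of two, which is the sense in which "economising" facial recognition could produce more of it rather than less.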
Combining algorithms and biometric personal data can produce discriminatory and racist outcomes. People of colour are more often suspected and convicted, and end up in criminal databases more easily than white people, and as our technology gets smarter, the problem of wrongly convicting marginalised groups will likely worsen. The technology alone, however, is not the cause of the problem; so are the developers behind it, since the data is selected and implemented by people with (mostly) unconscious biases. As technology gets smarter, data harvesting and surveillance will actually increase. While some people are willing to sacrifice their privacy for the sake of our common security, an ethical problem arises: we will have to ask ourselves why and when we use automatic facial recognition, and for whom. In any case, it is certain that facial recognition systems lead to more inequality in our society.
Blurring the boundaries of privacy with AI facial recognition software has implications worldwide, and there are already a number of extreme examples. In China, facial recognition has supplanted passwords, and the government collects data to publicly shame its citizens, awarding and deducting social credits as a means of assigning value to them. People are constantly surveilled, and if we are not careful the AI can draw its own conclusions, which in turn can lead to a cynical control of society based on opaque values that are no longer shared by the entire population.
The artist uses technology in a critical way to show its dangers. We need to think about how we as individuals relate to today's technology and how we want it to relate to us. Besides showing how technology can be abused, MyFACE asks: how does an individual's behaviour change if one is no longer responsible for one's own identity, but for a borrowed one?
In the future the artist aims to expand the MyFACE project by sending the masks to people abroad, so that Laura A Dima's registered identity appears in multiple places at once not only in the Netherlands, but all over the world, as a symbol of resistance.