MISGENDERING - AUTOMATED CENSORSHIP

Which bodies are in the club on platforms like Instagram? What positions do they hold within the hierarchies, and which are not represented at all? Over the years, several studies have shown how mainstream platforms, which in many ways define the global public sphere, have systematically excluded and censored non-binary identities.

Ada Ada Ada is a Copenhagen-based artist. She researches how computers understand gender, and what automated censorship does to human bodies. “My work sits in the tension between the digital and the analog, because you can put as many mobiles in our hands as you want, but we are still analog bodies.” Ada’s latest work, IN TRANSITU (2022), questions the relationship between power structures and bodily identities in digital space. “We only use parts of the body when we communicate digitally: fingers, eyes, ears. It creates a special bodily starting point in the meeting between the digital and the analog.” Ada is interested in how computers read bodies. The selfie, for example, is a strategic and staged communication genre: “Something you don’t experience in the same way in the analog world—and what I find interesting is what it does to us, and how we reflect on each other. Computers interact with people and read bodies via technologies such as machine learning, and thereby through predefined images and text—and that’s not a particularly good thing. A computer’s only tools are images and text, and naturally there are limits to what you can use text and images for when it comes to bodies. It’s comparable to using another tool—like a hammer—to understand the world. It would mean we would have to make all problems look like nails, because otherwise they couldn’t be solved.”

On Instagram, I post an update from my gender transition every Thursday. The question is: When will Instagram ban my nipples? It’s not just about nipples; it’s about when I have the dubious pleasure of being censored, because I go from being categorized as a man to being categorized as a woman, and so am excluded from the community.
— Ada Ada Ada

FROM CULTURE WAR TO AUTOMATED CENSORSHIP

Among Ada’s inspirations is Os Keyes of the University of Washington, who researches the role technology plays in constructing the world in The Misgendering Machines: Trans/HCI Implications of Automatic Gender Recognition. Automatic Gender Recognition (AGR) is a subfield of facial recognition that aims to algorithmically identify the gender of individuals from photographs or videos. IN TRANSITU (2022) is an example of a computer’s limitations, and of the problem of inferring gender from predefined and categorized images: “As a trans woman, the system you encounter on a platform like Instagram is a very heterosexual culture. When a computer understands gender based on images selected, defined, and categorized by a few people—which in Instagram’s case has not only a clearly binary but also an American starting point—an exclusion automatically arises.
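Technically, an AGR service of this kind returns nothing richer than a binary label plus a confidence score. The Python sketch below parses a response whose shape is modeled on Amazon Rekognition’s DetectFaces output (the sample values are invented for illustration); note that the schema itself only admits two possible labels, which is exactly the structural limitation Keyes describes.

```python
# Sketch of an Automatic Gender Recognition (AGR) response and how a
# platform might read it. The response shape is modeled on Amazon
# Rekognition's DetectFaces output; the sample values are invented.

sample_response = {
    "FaceDetails": [
        {
            "Gender": {"Value": "Female", "Confidence": 96.5},
            "BoundingBox": {"Width": 0.4, "Height": 0.5, "Left": 0.3, "Top": 0.2},
        }
    ]
}

def read_gender(response):
    """Extract the binary gender label and confidence for each detected face.

    The schema only allows "Male" or "Female" -- there is no third value
    the service could return, regardless of who is in the photo.
    """
    return [
        (face["Gender"]["Value"], face["Gender"]["Confidence"])
        for face in response["FaceDetails"]
    ]

print(read_gender(sample_response))  # [('Female', 96.5)]
```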

The global information system can be accessed by an incredible number of different bodies and cultures, but the dominating understanding is extremely narrow. An example of the cultural aspect: in Denmark and many other countries, nipples are generally not a problem, but they are a huge problem in the USA.” Like the culture war that erupted when Janet Jackson’s nipple appeared for a split second during the Super Bowl halftime show in 2004: “If the culture wars could have a 9/11, it’s February 1, 2004,” says one of the interviewees in the documentary Malfunction: The Dressing Down of Janet Jackson. The film is part of The New York Times Presents series, which is also behind Controlling Britney Spears. The official synopsis states: “In 2004, a culture war was brewing when the Super Bowl halftime show audience saw a white man expose a Black woman’s breast for 9/16ths of a second. A national furor ensued.”

THE HIDDEN, DISCRIMINATORY SYSTEMS

Ada describes how IN TRANSITU magnifies the hidden, discriminatory systems that we all too rarely talk about—because they are invisible. A “gender recognition neural network” is a technology for defining and categorizing gender which, according to Ada, is found in countless places—both at companies (such as Amazon’s Rekognition service on AWS) and in various open-source editions. “It’s incredibly easy to get a computer to give a rating,” says Ada, who goes on to explain how the work also examines and demonstrates how this assessment happens, thereby revealing the biases—and thus the automatic discrimination—baked into the technology. Every week, Ada sends a photo to various services, such as Face++, Azure, and AWS, “and they get confused. None of them can agree on the allocation of percentages—whether it should be called feminine or masculine. It’s frightening how many of these kinds of predefined censorship mechanisms exist, even despite much criticism. Companies like IBM and Google have chosen to remove theirs, but plenty still remain, and they are easy to find.” Ada says the problem is that there is no transparency: “It is a classic deep learning problem: you gather a lot of images that are to form templates for training a computer with machine learning. And when it comes to gender there are, for example, a lot of pictures of beards, and so the computer learns that a beard means a man and that long hair means a woman. It is only a few people, and their biases, who sort and define.”
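Ada’s point about beards and long hair can be made concrete with a toy sketch—this is not any real service’s code, just a minimal illustration of how a classifier trained only on annotators’ stereotyped examples has no way to output anything beyond them.

```python
# Toy illustration (not any real system) of the labeling problem Ada
# describes: if the annotators' examples equate beards with "man" and
# long hair with "woman", the model can only reproduce that bias.

# Hypothetical training set: (features, label) pairs chosen by annotators.
training = [
    ({"beard", "short_hair"}, "man"),
    ({"beard", "glasses"}, "man"),
    ({"long_hair", "makeup"}, "woman"),
    ({"long_hair"}, "woman"),
]

def classify(features):
    """1-nearest-neighbour by feature overlap; ties go to the first example."""
    best = max(training, key=lambda ex: len(ex[0] & features))
    return best[1]

# Any long-haired, beardless person is classified "woman" regardless of
# their actual gender -- and no input can ever yield a third label.
print(classify({"long_hair"}))  # prints "woman"
```

The design point is that the bias lives in the training data, not in the matching rule: swapping in a real neural network would not help, because the only labels it could ever emit are the ones the annotators defined.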

WHAT IS FEMININITY AND MASCULINITY?

Is it the same mechanism with a service like the Google search engine—that it’s a predefined system? “Google doesn’t offer a service to categorize gender, but may still use a gender-binary system for its search engine. The problem arises with the data—a person sits and assesses whether it is one gender or the other. And if that person doesn’t know or recognize anything other than the binary system, then the service disregards other data sets. Another problem with this form of gender recognition is that Black women are judged to be more masculine than white women. No one has addressed the basic question: What is femininity and masculinity? The systems themselves are one thing, but it is very wrong if those who build the systems live in filter bubbles—for example, if they only understand gender as binary.” Although it has been known for a long time, and despite many people having problematized how discrimination and exclusion of certain bodies have been built into the systems that many use online, not much has happened. In fact, the entire “freethenipple” hashtag has been completely removed from Instagram. Ada is inspired by the trans activist Courtney Demone: “I can tell you a fun fact, or maybe rather a depressing fact—another trans woman made a similar project back in 2016. She made an HBO mini-doc—and nothing has happened. Nothing about Instagram’s nipple policy.”

“It’s fine that the world is being divided into spaces, but it’s starting to become a problem when a platform acquires so much power— because who is really setting the rules? Over time, the Internet has become a major exclusionary mechanism: There are those who fit into the predefined categories, and there are those—often minorities—who do not. As a transwoman, you have to fight against the algorithms. Of course, there are platforms that are inclusive, but not Instagram, because it is not a particularly trans-friendly place.”



SOURCE:

https://ada-ada-ada.art/projects/in-transitu

Keyes, O. (2018). The Misgendering Machines: Trans/HCI Implications of Automatic Gender Recognition. Proceedings of the ACM on Human-Computer Interaction, Vol. 2, No. CSCW. Jersey City, NJ.


