Who owns my body?
In an age of posthumanist aspirations weaving humans and technology ever closer, one might begin to question where a person begins and ends. What is a person? What delimits their personhood over time, and across platforms? For the feminist philosopher Judith Butler, identity is formed through ongoing construction, through the actions of everyday life, rather than being fixed by the biological body. So where does that place us in a world where our lives, identities and likenesses are ever more melded into technology? Virtual avatars, in all their 3D CGI glory, particularly when intended as a representation of a ‘real’ flesh-and-blood human, are a perfect example.
Virtual avatars are an opportunity to expand and negotiate dualities of personhood. Can your person include a cyberspace lookalike? Is an avatar of you also you? What if this avatar is acting in real time in the metaverse? Talking, sleeping, behaving as you. Is it you? By questioning the meaning of somatic existence, this technology could be an opportunity to rewrite ingrained myths about the self and the other. How does it feel to act through your avatar? An out-of-body embodiment. Will you think and act differently? Avatar creation can be an experimental, boundary-pushing art form and experience, a mode of participatory art. Sounds great, so far. But things take a much darker turn when you consider the misuses and abuses these avatars are vulnerable to, and, importantly, the risks they pose to real humans.
Many of the ethical quandaries around virtual avatars are not especially novel; it is their extreme, intensive nature that makes them so vulnerable to dystopian dilemmas. It is an extreme environment of endless possibilities, and with them come endless possible risks. With increasingly serious attention being paid to avatars and the metaverse, we need an ethical roadmap.
The creation of virtual avatars carries a very real danger of harming marginalised groups, in particular people of colour and women. Tech, and specifically the AI and metaverse spaces, continues to be dominated by heteronormative white cis men. These are spaces that could easily be swallowed by male fantasy, shaped by the ponderings of the male gaze and its constructed myths: a misogynist, racist cyber dystopia. The creative process behind a virtual avatar built from a real human’s identity is woven with deep power imbalances. The person being avatar-ified, let’s call them the model, is often in the most vulnerable position. They can be subject to flippant objectification by, for example, the 3D artist, the stylist, or the hair and makeup artists, all under the guise of “creative direction”. Should the avatar be physically altered to be more interesting, more appealing, more… perfect? In fact, how can it not be altered if it’s CGI? These questions of digital editing are not new, and are easily compared to the well-known photoshopping of fashion editorials. What is alarming is the extreme potential for mishandling virtual avatars, which are currently at the mercy of individual artists. Altering the appearance of an avatar of a real person is not without consequence. In a mostly male-dominated field, these alterations can slip into sexualised objectification of women, filtered through the male gaze. You could call this digital abuse, with real consequences for both the model and the observer, warping imaginations and visions of the body. What’s to stop an artist from augmenting a model’s breast size, lifting her eyelids or shaving a few centimetres off her waist, perhaps? Avatars offer the opportunity for a creator, and a viewer, to invest in them their own fantasies. It becomes a private indulgence in another person’s self.
The question then arises: who is responsible for these avatars? Who decides how they look or how they act? And, in fact, who owns an avatar? In a context of gendered power imbalances, ownership can quickly become an eerie fantasy of female objectification. An avatar’s likeness can be deeply interwoven with your identity, if you consider it an extension of your being. So who will own your body? If not you, have you sold a part of yourself? Has your personhood been stolen? How many times can it be used? How can the avatar’s use be ethically and sustainably compensated? Not to mention, what happens when someone dies?
This is a matter of intersectional feminist concern. True to form for the tech industry, avatar creation risks hoarding work away from real, marginalised people. It is a space primed for exclusion; whether conscious or not, the biases of the creators involved must be acknowledged. These racial biases extend beyond the human element into the AI software used to create avatars. It is well documented that AI’s normative baseline is Western, white bodies. Even in the creation of this spread, Soo Joo’s features were automatically Euro-warped by the software before being manually corrected.
The grave danger is that there are no repercussions for the mishandling of virtual avatars. There are, so far, no regulations, not even boundaries delineating the ethics of the space, its mode of existence, or its creation. It is all too possible that these innovative technologies become another medium for upholding existing systems of oppression. The architects of virtual avatars and their environments need to be cognisant of, committed to, and active in their ethical responsibilities.
It is without question that we need to follow a feminist ethic of care: a commitment to considering individuals in their contexts, from a perspective of empathy rather than that of a sterile, omniscient observer. A human-led, anti-sexist, anti-racist ethical standard. Before things get out of hand, there must be reflective ethical protection of digital representations of real humans. The central focus must be the care of the humans involved and impacted.
The first step is to acknowledge that these avatars and their environments are dynamic, embodied experiences. They are not objective or autonomous. Avatars are drawn from and closely connected to real people, with lives and communities. They are created by real people who make decisions about how and where the avatars are represented. These decisions, consciously or not, are embedded with personal and systemic biases.