Woven Threads: My Issue with Moralism in the Conversation About Art and AI

Woven Threads is a series of blog posts where I compile and expand on tweets and threads I’ve posted on Twitter. They may cover a number of topics that I often ruminate on. 

It’s always jarring to me when my extremely leftist and trauma-informed community erupts into moralistic debates as whatever popular news item unearths some deeply controversial topic. I personally feel like every time I’ve tried to apply absolute moral judgement to anything, the universe has taken that as an invitation to prove me wrong. 

Which is why I’ve shied away from sharing an opinion on the unsettling proliferation of AI in creative fields. It’s not that I don’t see a problem with the implementation of AI by entertainment tech and finance executives to undermine professional artists. It’s that I don’t think that AI-generation is the root of the problem. 

Ultimately the blame falls with capitalism, but what has made the cultural discourse particularly difficult for me to navigate is the lack of consensus on societal understandings of art and artists. 

I staunchly believe that getting precious about what and how we define art will only invalidate and create barriers for those with the least access to creative resources. To me, AI-generation is merely a tool, like any other medium is a creative tool. I believe it has the potential to open up possibilities for emerging disabled and low-income artists, letting them engage in forms of creative expression they have historically had to sacrifice in order to prioritize their basic survival. 

But my issue is how we define an “artist” and their role in our society. 

First I need to contextualize that I think there’s a difference between an artist and a person who makes art. It's not related to the frequency with which you make it or if you get paid to do it or even directly correlated to your technical proficiency. It's all to do with your intent and the way you engage with the arts as a whole.

You can be technically proficient but not an artist; you can be a professional hobbyist. I'd define a hobbyist as someone who makes or engages with art because it's fun or they're good at doing a particular thing, but that just about summarizes the depth of their relationship to it. However, an artist tries to use art to serve a purpose.

In my experience speaking and working with other artists, an Artist makes Art to process, to understand and interpret the world in a way that makes sense to them, or to communicate something beyond themselves and make a connection with someone or something else. 

The art is a tool they use, but it’s not their ultimate goal. They study and practice and refine and critique in order to find the best ways to meet that goal.

It's not about whether the art is "good" or "bad", it's about whether it served the purpose you intended. And I think art can be anything used to communicate to or interpret each other. Anything you do or create can be art if that's what you intend it to be.

Which is why I don’t think you can apply universal judgement to using AI. But what everyone is sounding the alarm about isn’t whether or not what AI produces is considered art. That’s why it doesn’t matter if we think AI art is technically “good or bad,” or if we define someone who uses AI to make art as an artist or not. 

Because AI isn’t “replacing” artists. Artists will continue to create and make the art they want—as long as they’re able to do it and survive at the same time. The issue with AI is it reveals the fundamental issue with capitalism and what it defines as “productive” and “socially valuable.” It reveals what happens when we reduce art to “cultural products” that can be mass produced as a service to consumers rather than valuing the artist for creating products that impact us on a personal level. 

This wasn’t a problem created by AI. In retrospect, I had similar misconceptions about art and artists growing up, even before technology had completely transformed our creative industries. I remember feeling, as a fan (or what I’d now describe as a consumer), that my favourite bands, musicians, visual artists, filmmakers, TV writers, whoever, owed it to me to create work that met whatever my expectations or desires or motivations were. I genuinely felt betrayed when those artists would grow and evolve in their practice to explore new and different ways to create and express. 

Now that I actually consider myself an artist, I sometimes try to imagine how I would react if anyone had expressed they felt that way about my art and it makes my skin crawl from embarrassment. It’s another reason I’m glad that I didn’t grow up with the ability and the expectation of getting to interact directly with my idols.

Capitalism has been reducing art to “cultural products” for centuries now. It hasn’t mattered whether the final products are considered “good” or “bad” as long as it brings in profits. But before AI-generated art, those entertainment and business executives still had to rely on human beings to create those cultural products. 

Now, if you have any experience with creative industries, or if you even just pay attention to the growing public scrutiny of streaming and broadcast networks’ shady finance structures, it’s obvious these executives have never cared about art or artists. Whether it’s AI or unsafe working conditions, they will always find a way to pinch pennies where they can. 

However, if you can run a program that can create something that serves its marketing purpose? How much leverage will union strikes hold then? 

Working for these companies and on these projects isn’t usually the true goal of the artists in those industries. Usually it’s just how they pay their bills, hopefully with enough left over to pursue the projects they actually want to do. 

This is also made more complicated by colonial definitions of intellectual property rights and how capitalism has made them necessary for artists’ survival. I take issue with the way the concept of intellectual property attempts to commodify knowledge and art, which often impedes our ability to collaborate, interrogate, and engage with other knowledge and creative producers. I believe that knowledge and creativity should be shared openly, freely, and critically. 

When dorky little computer nerds began creating open source bots and experimenting with AI chats and generators, they weren’t using it for profit. They were using it to poke fun at and explore our cultural conceptions of creation and language and connection. There was no intent to monetize it. But then again, as we learned from nuclear energy and the atomic bomb, we live in a system that figures out how to turn any potential resource to improve society into a weapon against it before we even have a chance to understand it. 

But I’m a creator living in a capitalist society that forces me to monetize and commodify my “knowledge”, “experience”, or “expertise” in order to survive. So when programmers and coders uninterested in mutual collaboration scrape your work to derive some magical formula that distinguishes your work as Great and Valuable, with no regard for its intentionality or emotional value, in a grab for money that could’ve gone into your pocket while you’re already struggling to pursue your own practice, suddenly those intellectual property rights start making a lot of sense. 

[Sidenote: There’s something vaguely ironic about the infamous Oppenheimer quote about his role on the Manhattan Project being a translation of a line from the Bhagavad Gita, a text my own family was convinced was “backwards.” 

“I am become Death, destroyer of worlds.”]

So while the issue isn’t AI but the capitalist system that hinges our survival on our ability to get paid for what we produce, I understand the trepidation around its emergence. But I worry that by creating cultural moral absolutes on AI-integrated practices, it’ll be poor and disabled artists who suffer the most as a result. 

When we rely on moralistic absolutes to define who is helping or hurting movements, rather than approaching each instance within its broader context of power, impact, and purpose, the voices that are already being ignored rarely get the chance to explain themselves, locking them further out of these conversations. Meanwhile, the intended targets typically barely notice an impact on their bottom line, which is all that matters to them.

Maybe instead of immediately jumping into a preemptive defense against the ways AI could be used to take away our power, we need to start asking how we can use AI to take our power back for ourselves.  
