Creators of Popular AI Tools Are Facing Lawsuits for Copyright Violations
Wars against robots and AI are well-explored territory in sci-fi books and movies. Now, though, the conflict is no longer confined to fiction: artists and companies are fighting back against the latest wave of AI development.
In mid-January 2023, Getty Images, a prominent visual media company, launched legal proceedings against Stability AI. Stability AI is the company behind Stable Diffusion, a popular text-to-image AI tool released in 2022.
Source: Stable Diffusion
Getty versus Stability AI is one of the first cases involving AI tools and copyright issues. It’s also a big deal because it has the potential to reshape copyright law and finally give us a clearer picture of what’s permissible and what isn’t.
This legal dispute was filed in the UK, a country with some of the most stringent copyright laws. But what led to it, and is it the only lawsuit against Stability AI?
Well, text-to-image AI models are trained on billions of images scraped from the internet, absorbing patterns they can recombine into a unique output for each request. The problem is that Getty Images claims the company behind Stable Diffusion unlawfully collected and processed Getty’s images for its own benefit, without obtaining consent or a license.
Stability AI did not seek any such license from Getty Images and instead, we believe, chose to ignore viable licensing options and long‑standing legal protections in pursuit of their standalone commercial interests.
Getty Images said
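To make the dispute a little more concrete, here is a minimal sketch, using the open-source diffusers library, of how a text-to-image model like Stable Diffusion is typically invoked. This is not Stability AI’s own code, and the model ID, prompt, and file name are purely illustrative; the point is simply that every prompt produces an image sampled from whatever the model absorbed during training.

```python
# A minimal, illustrative sketch (not Stability AI's own code) of calling a
# text-to-image model through the open-source `diffusers` library.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # an illustrative public Stable Diffusion checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # generation is far faster on a GPU

# Every prompt yields a new image sampled from what the model learned during
# training -- which is exactly where the training-data dispute begins.
image = pipe("a photograph of a lighthouse in a storm").images[0]
image.save("generated_lighthouse.png")
```

From the user’s side, that really is the whole interaction: type a prompt, get a picture. The legal fight is about everything that happened before this point, when the model was trained.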
But wait, we aren’t done here.
Getty isn’t the only party suing Stability AI, nor is Stability AI the only company being sued over copyright issues. At around the same time, Matthew Butterick and a team of litigators filed a lawsuit against Stability AI, Midjourney, and DeviantArt on behalf of Kelly McKernan, Karla Ortiz, and Sarah Andersen, three plaintiffs from the art community. Butterick is pursuing the case because “AI needs to be fair and ethical to everyone,” and he is likely hoping the lawsuit will lead to meaningful change and give power back to the artists.
Photo illustration: Freepik
According to the court documents, the companies used the copyrighted work of thousands of artists as training data without providing compensation or asking for artists’ consent.
The lawsuits set the inevitable in motion: what was supposed to be a fun and useful app is now tangled up in serious legal questions. One of the most critical is who really owns the generated images, and who is the real artist? And, of course, which corners of the internet are these bots allowed to roam if unsupervised data processing can result in copyright infringement?
As the lawsuits unfold, we’ll hopefully get concrete answers to these questions.
What Is AI Art and How Is It Produced?
AI artwork refers to pieces of art generated with the assistance of artificial intelligence. It comes in many shapes and sizes: it could be anything from a story written by ChatGPT to a mashup of your favorite songs to a generated picture of the Mona Lisa decorated with three layers of Snapchat filters. The options are endless. In other words, as long as the art was created using artificial intelligence, it counts as AI artwork.
Source: DALL-E
AI art has been flooding the market lately, generating rarely seen hype around revolutionary tools that promise users endless possibilities in the art world.
Indeed, this technology is changing the way we create and how we think about creativity. We can use it to work more efficiently, overcome creative blocks, and express our imagination, even if we don’t necessarily have the skills to do it by hand.
But is it necessary?
And more importantly, is it ethical?
Is AI Blurring What Constitutes an Artist?
Creating, borrowing, and recycling are the foundation of every artist’s evolution. There isn’t an artist who was never inspired by someone else’s work. It might have been a song, a building, a garden, or a painting: we encounter artistic influence all the time, often without realizing it. Some may even argue that the entire art community is a never-ending loop of recycled creativity, but that’s a debate for another day.
Of course, that’s not necessarily a bad thing. Art movements with specific styles or philosophies are adopted by different groups, allowing artists to use the same foundations and concepts to create something entirely unique. If you’re an artist, you know what I’m talking about: this was common long before the Renaissance and will continue generation after generation.
Photo illustration: Freepik
However, modern times bring modern methods, and there is no longer a clear line between what is and isn’t allowed within the art community. We all know it’s unethical to steal someone’s work and slap our signature on it, but with all the emerging technologies, do we have to redefine stealing?
This is where AI artwork comes into the picture. Training involves exposing an AI model to vast amounts of example data so it can learn the patterns behind a task. So, if you want to create an AI tool that generates artwork, you must first expose your model to art, and a lot of it, so it has enough data to produce a convincing output.
Now, here comes the tricky part. Many AI tools work so well and produce such high-quality output precisely because their developers trained them on images of other people’s artwork. Without the artists’ consent, of course.
Photo illustration: Freepik
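For readers curious about what “training on artwork” means mechanically, here is a deliberately tiny, hypothetical sketch in PyTorch. The model, data, and shapes are stand-ins and bear no resemblance to Stable Diffusion’s real architecture; the point it illustrates is that the images themselves aren’t stored, yet every training step adjusts the model’s parameters based on them.

```python
# A toy sketch of the idea behind "training on scraped images": the model never
# keeps the pictures, but its weights are shaped by them at every step.
# Everything here is illustrative, not Stable Diffusion's actual architecture.
import torch
from torch import nn

# Stand-ins for scraped (image, caption) pairs -- in reality this would be
# billions of web images paired with their captions or alt text.
fake_images = torch.rand(8, 3, 64, 64)               # 8 tiny RGB "images"
fake_caption_ids = torch.randint(0, 1000, (8, 16))   # 8 tokenized "captions"

class TinyTextToImage(nn.Module):
    """A deliberately tiny model: embeds a caption and predicts pixels."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(1000, 32)
        self.decode = nn.Linear(32, 3 * 64 * 64)

    def forward(self, caption_ids):
        text = self.embed(caption_ids).mean(dim=1)     # pool the caption tokens
        return self.decode(text).view(-1, 3, 64, 64)   # "generate" an image

model = TinyTextToImage()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    predicted = model(fake_caption_ids)
    # The loss compares generated pixels to the scraped images -- this is the
    # step where the source artwork directly shapes the model's parameters.
    loss = nn.functional.mse_loss(predicted, fake_images)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Real systems repeat this at the scale of billions of images, which is exactly why the question of consent looms so large.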
If we go back to the first argument and reiterate that people have been recycling each other’s art for centuries, we could reason that this approach is just the latest version of “I’m learning from the great masters” or “I saw it on Pinterest, so I’ll draw it, too.”
Still, we must consider that AI tool developers are using other people’s work without their consent to train new technology and profit from it. To many artists, this is devastating news.
Taking the time to create something only to find that someone else is profiting from it is a special form of capitalism’s betrayal of the working class. And since this is a relatively new issue, the line between what is allowed and what is lawsuit-worthy is razor-thin.
Some artists are even worried their jobs are at stake, since people will no longer have to pay hundreds or thousands of dollars for art commissions. Instead, they could pay a fraction of the cost and let the AI do the rest in a fraction of the time.
Another potential issue is that these AI tools allow people to enter the market even though they aren’t, strictly speaking, the creators of the art they’re selling. On top of that, AI offers newcomers convenience and the capacity for mass production, potentially disrupting the current art market and hurting working artists financially.
With that in mind, we’re eager to see how the latest lawsuits against AI tool creators unfold. In theory, we could have the best of both worlds, with artists and AI creators finding common ground and both benefiting from the latest inventions. In practice, that’s unlikely to happen. But these lawsuits are just the beginning, and it may be too early to speculate.