Generative AI and creators are set to square off — here’s why they should compromise

Originally appeared on Medium | Written by Dan Neely, Co-Founder and CEO of Vermillio

In the past several weeks, three lawsuits — one filed by a group of artists, two by media giant Getty Images — made the first attempt at putting the brakes on generative AI. The plaintiffs allege that the tech platforms named in the suits infringe upon copyrights by using creative works to train image-generation algorithms without permission. Generative AI companies, on the other hand, maintain that such training is protected by the fair use doctrine and view the suits as hurdles to innovation.

The fight could be existential — with a “win” for one side spelling the demise of the other. But in reality, the outcomes of this debate need not — and should not — be the binary result that has often been presented.

No matter one’s allegiances, it’s a good thing for these issues to be examined in a legal setting.

Generative AI holds enormous potential to enhance human creativity. Consider, for example, advertising agencies that are using text-to-image generation to speed up and enrich the storyboarding process for campaigns; designers who are using the technology to dream up new products; or a child who finds delight in using it to craft personalized bedtime stories.

Yet, generative AI’s potential will never be fully realized unless a workable solution is found to questions about content creation and ownership.

These questions represent new legal ground. Cases need to be brought so that courts can determine where the rights of creators and generative AI platforms fall within existing intellectual property law. These foundational cases could also reveal gaps in current statutes, which lawmakers should then fill.

Hashing out these disputes will not be without its perils. There’s a scenario in which any use of generative AI — even to make private creations — is deemed copyright infringement. Intellectual property holders — especially large corporations — would be incentivized to withhold content from generative AI platforms that are accessible to the public, limiting creativity.

At the other extreme, as the artists fear, one could envision courts determining that generative AI companies have no obligation to compensate copyright holders for use of their work, dealing creative professionals a crushing blow.

Neither of these outcomes, of course, is ideal for democratizing access to new tools and maximizing creativity. So what is the way forward?

At Vermillio, we’ve spent the past three years having conversations with people on all sides of this debate. What we’ve learned is that balancing the interests of content owners big and small with those of generative AI firms is not only possible — but essential.

Creators should be fairly credited and compensated for their work in the digital world — period. Recent history has shown that this can be done; consider the series of legal disputes that set a new status quo for the digital music industry in the early 2000s. Though the current state of affairs is not perfect, music piracy was curtailed and everyone is all the better for it: streaming platforms pay artists, and music is widely available to listeners at a reasonable cost.

It should also be said that simply owning content does not equate to an ability to create generative AI models. The algorithms need context — or a base model — to understand the data they’re being fed. Regardless of the outcomes of the legal cases, generative AI companies both new and incumbent will continue to build these base-layer models using publicly accessible data.

As the legal framework develops, creators and tech platforms will need mechanisms to ensure seamless compliance with laws and ethical standards. Blockchain technology provides one fix for tracing the entire lineage of creations — underpinning systems like ours at Vermillio, which also analyzes and extracts digital signatures from every fragment, or “atom,” that makes up a piece of content to track and authenticate its use in AI-generated derivatives. These collections of atoms serve as matching data that all generative AI participants can use to determine lineage and ownership, and to enable fair compensation across all contributors.

Creators could then, through these digital signatures, have more control over how their works are used. Tracking the AI derivatives of their work can help artists and content owners to curb creations that don’t align with their values or intent — such as disinformation or other malicious material.
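To make the fragment-matching idea concrete, here is a minimal sketch in Python of how a work could be split into “atoms,” fingerprinted, and later matched against a derivative to attribute its contributors. It is a hedged illustration only, not Vermillio’s TraceID implementation: the helper names (register_work, attribute_derivative) are hypothetical, and the exact-match SHA-256 hashing stands in for the perceptual or model-based signatures a production system would need, with the resulting registry potentially anchored on a blockchain as described above.

```python
# Hypothetical sketch of fragment-level content fingerprinting and matching.
# This illustrates the general idea described above, not Vermillio's TraceID:
# the function names and the exact-match SHA-256 hashing are assumptions made
# for brevity. A real system would likely use perceptual or model-based
# signatures and could anchor its registry on a blockchain.

import hashlib
from collections import defaultdict

FRAGMENT_SIZE = 1024  # bytes per "atom"; an arbitrary choice for the sketch


def fingerprint_fragments(data: bytes) -> list[str]:
    """Split a work into fixed-size fragments ("atoms") and hash each one."""
    return [
        hashlib.sha256(data[i:i + FRAGMENT_SIZE]).hexdigest()
        for i in range(0, len(data), FRAGMENT_SIZE)
    ]


# Registry mapping each fragment signature to the owners who registered it.
registry: dict[str, set[str]] = defaultdict(set)


def register_work(owner: str, data: bytes) -> None:
    """Record every fragment signature of an original work under its owner."""
    for sig in fingerprint_fragments(data):
        registry[sig].add(owner)


def attribute_derivative(data: bytes) -> dict[str, int]:
    """Count how many fragments of a derivative match each registered owner,
    giving a rough basis for determining lineage and sharing compensation."""
    matches: dict[str, int] = defaultdict(int)
    for sig in fingerprint_fragments(data):
        for owner in registry.get(sig, ()):
            matches[owner] += 1
    return dict(matches)


# Example: register an original work, then attribute a derivative reusing part of it.
original = b"a creator's original artwork, serialized to bytes " * 200
register_work("artist_a", original)
derivative = original[:4096] + b"newly generated material"
print(attribute_derivative(derivative))  # e.g. {'artist_a': 4}
```

In practice, exact hashes would miss transformed or partially regenerated content, which is why real provenance systems lean on robust signatures rather than byte-level matches; the sketch only shows the bookkeeping shape of the problem.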

The answer encompasses far more than just us — it means continuing to collaborate with creators, generative AI platforms, and regulators in the space to strike the right balance. Hammering out a solution may put even the most sophisticated legal minds to the test, but the headaches will be worth it.
