First appeared in Boston Herald
By Aron Solomon
These days, we read a lot about what people “create” on ChatGPT and, following naturally from that, whether they actually own what they have created.
The idea of creation on ChatGPT and similar AI tools might be a bit of a stretch. These are language models that, at least in theory, improve continuously as they are trained on more data.
But the idea of us having intellectual property rights over anything that we create in ChatGPT makes me think about, of all things, a Big Mac.
I think about a Big Mac because there is a popular YouTube series in which a famous chef is given fast food by his adult son and asked to create something gourmet from it. The result usually differs dramatically from whatever fast food item the chef was given.
But when I think about what we do with ChatGPT and similar tools, I think about being given a Big Mac, disassembling it, and putting it back together in a different order. That doesn’t give me any ownership over the Big Mac, nor is it particularly creative. Fundamentally, all I’m doing is using whatever control I have over the ingredients to make modifications that don’t really alter what they are.
So when we use ChatGPT to create something for publication, I think we need to be very aware that we are simply borrowing the output rather than creating it or ultimately owning it.
Of course, no one I’ve met who is even vaguely familiar with ChatGPT simply puts in a query and cuts and pastes the response. There are many reasons for that, ranging from simple honesty to the quality of the output we receive. Almost always, we take a paragraph of research from a tool such as ChatGPT, put it into our own words, and in doing so change the output and its meaning.
But doing this absolutely does not mean that we can then assert intellectual property ownership over what we have “created” because, well, we haven’t created a thing.
The European Commission recently asked ChatGPT itself who owns the content created on it, then published the answer on an official EU site:
I do not own the content that I generate. I am a machine learning model developed and owned by OpenAI, and the content generated by me is subject to OpenAI’s license and terms of use.
As the same EU report highlighted:
As ChatGPT itself states, while the content generated may be protected by copyright, it will not be owned by the AI itself. Indeed, under European (and US) law AI cannot own copyright, as it cannot be recognised as an author and does not have the legal personality which is a pre-requisite for owning (intangible) assets.
The license and terms of use for something that’s completely open source are very different from those for something that’s for-profit.
CopyAI is a for-profit service offering AI writing output much like ChatGPT’s. Its terms of service address the IP issue as clearly as possible:
Ownership of Intellectual Property.
We may make available through the Services content that is subject to intellectual property rights, including Generated Content. We and our licensors (as applicable) retain all rights to that content.
But it gets even more interesting. Not only do you not own what you create on CopyAI; others can also access, change, and use it:
Permissions to Your User Content.
By making any User Content available through the Services you hereby grant to Copy.AI, its licensors, and their affiliates a non-exclusive, transferable, perpetual, irrevocable, worldwide, royalty-free, fully paid-up license, with the right to sublicense, to use, host, store, copy, communicate, modify, create derivative works based upon, distribute, publish, publicly display, and publicly perform your User Content in connection with Copy.AI, its licensors, and their affiliate providing, operating, securing, and improving their services.
Richard DiTomaso, a Philadelphia lawyer, points out that part of the problem here is expectation vs. reality:
“In early 2023, few people who use artificial intelligence to generate content understand what can and can’t be done with what they create and who ultimately owns it. This will change but not overnight.”
My own belief is that whatever rights creation confers here ultimately belong to the creators of the software. But that’s simply a legal hypothesis until litigation proves it true or false.
What’s clear is that the intellectual property laws on the books today aren’t going to cut it. It’s going to take litigation to drive change, because the laws we currently have simply can’t deal with the kinds of change we’re already seeing.
About Aron Solomon
A Pulitzer Prize-nominated writer, Aron Solomon, JD, is the Chief Legal Analyst for Esquire Digital and the Editor-in-Chief for Today’s Esquire. He has taught entrepreneurship at McGill University and the University of Pennsylvania, and was elected to Fastcase 50, recognizing the top 50 legal innovators in the world. Aron has been featured in Forbes, CBS News, CNBC, USA Today, ESPN, TechCrunch, The Hill, BuzzFeed, Fortune, Venture Beat, The Independent, Fortune China, Yahoo!, ABA Journal, Law.com, The Boston Globe, YouTube, NewsBreak, and many other leading publications.