AI vs. copyright
Last year, I noted that OpenAI’s view on copyright is that it’s fine and dandy to copy, paste, and steal people’s work. OpenAI is far from alone. Anthropic, Google, and Meta all trot out the same tired old arguments: AI must be free to use copyrighted material under the legal doctrine of fair use so that they can deliver top-notch AI programs.
Further, they all claim that if the US government doesn’t let them strip-mine the work of writers, artists, and musicians, someone else will do it instead, and won’t that be awful?
Of course, the AI companies could just, you know, pay people for access to their work instead of stealing it under the cloak of improving AI, but that might slow down their leaders’ frantic dash to catch up with Elon Musk and become the world’s first trillionaire.
Horrors!
In the meantime, the median pay for a full-time writer, according to the Authors Guild, is barely enough to live on, and artists and musicians fare little better. Those figures are on the high side, by the way. They're for full-time professionals, and there are far more part-timers in these fields than people who make, or try to make, a living as creatives.
What? You think we’re rich? Please. For every Stephen King, Jeff Koons, or Taylor Swift, there are a thousand people whose names you’ll never know. And, as hard as these folks have it now, AI firms are determined that creative professionals will never see a penny from their work being used as the ore from which the companies will refine billions.
Some people are standing up for their rights. Publishing companies such as the New York Times and Universal Music, as well as nonprofit organizations like the Independent Society of Musicians, are all fighting for creatives to be paid. Publishers, in particular, are not always aligned with writers and musicians, but at least they’re trying to force the AI giants to pay something.
At least part of the US government is also standing up for copyright. “Making commercial use of vast troves of copyrighted works to produce expressive content that competes with them in existing markets, especially where this is accomplished through illegal access, goes beyond established fair use boundaries,” the US Copyright Office declared in a recent report.
Personally, I’d use a lot stronger language, but it’s something.
Of course, President Donald Trump immediately fired the head of the Copyright Office. Her days were probably numbered anyway. Earlier, the office had declared that copyright should only be granted to AI-assisted works based on the “centrality of human creativity.”
“Wait, wait,” I hear you saying, “why would that tick off Trump’s AI allies?” Oh, you see, while the AI giants want to use your work for free, they want their own “works” protected.
Remember the Chinese AI company DeepSeek, which scared the pants off OpenAI for a while? OpenAI claimed DeepSeek had “inappropriately distilled” its models. “We take aggressive, proactive countermeasures to protect our technology and will continue working closely with the US government to protect the most capable models being built here,” the company said.
In short, OpenAI wants to have it both ways. The company wants to be free to hoover up your work, but you can’t take its “creations.”
OpenAI recently spelled out its preferred policy in a fawning letter to Trump’s Office of Science and Technology Policy. In it, OpenAI says, “we must ensure that people have freedom of intelligence, by which we mean the freedom to access and benefit from AGI, protected from both autocratic powers that would take people’s freedoms away, and layers of laws and bureaucracy that would prevent our realizing them.”
For laws and bureaucracy, read copyright and the right of people to be paid for their intellectual work.
As with so many things in US government these days, we won’t be able to depend on government agencies to protect writers, artists, and musicians, with Trump firing any and all who disagree with him. Instead, we must rely on court rulings.
In some cases, such as Thomson Reuters v. ROSS Intelligence, courts have already found that wholesale copying of copyrighted material for AI training can constitute infringement, especially when it harms the market for the original works and is not sufficiently transformative. Hopefully, other lawsuits against companies like Meta, OpenAI, and Anthropic will show that their AI outputs are unlawfully competing with original works.
As lawsuits proceed and new regulations are debated, the relationship between AI and copyright law will continue to evolve. If it comes out the right way, AI can still be useful and profitable, even as the AI companies do their damnedest to avoid paying anyone for the work their large language models run on.
If the courts can’t hold the wall for true creativity, we may wind up drowning in pale imitations of it, with each successive wave farther from the real thing.
This potential watering down of creativity is a lot like the erosion of independent thinking that science fiction writer Neal Stephenson noted recently: “I follow conversations among professional educators who all report the same phenomenon, which is that their students use ChatGPT for everything, and in consequence learn nothing. We may end up with at least one generation of people who are like the Eloi in H.G. Wells’s The Time Machine, in that they are mental weaklings utterly dependent on technologies that they don’t understand and that they could never rebuild from scratch were they to break down.”
#copyright