AI: The Copyright and Plagiarism Story of 2022 and 2023

Typically, when I do these year-end reviews, I cover a wide variety of stories that happened and separate out the copyright and plagiarism.

Simply put, copyright and plagiarism are two different things: the former deals with the legal rights a creator has, while the latter deals with the ethics of attribution and reuse. Their interests often overlap, but year-to-year the concerns and stories are often very different.

However, that is not true this year.

In 2022, and looking ahead to 2023, there is one story that is dominating the narrative of both copyright and plagiarism matters: artificial intelligence.

It doesn’t matter if you’re an artist looking to protect your work, a teacher worried about students cheating on assignments, or just an observer of both worlds: AI has dominated the headlines, and for good reason.

2022 was a breakout year for artificial intelligence in creative fields, and 2023 will likely be one too. Not only are we still living in the fallout from what happened last year, but new developments are also almost certainly on the horizon.

With that in mind, let’s recap what we learned about artificial intelligence in 2022 and what may be coming in 2023.

Copyright and AI

When looking at the copyright issues that AI has created, there are two key questions that are currently on everyone’s mind.

  1. The Rights of Human Creators: What rights do human creators have when their work is ingested and used to train AI? What happens when an AI produces a work that is too similar to that source material?
  2. Ownership of AI Works: Can work created by an artificial intelligence be protected by copyright? If so, who has the rights in question?

The answer to the first question is still very much up in the air. Backlash against AI was strong in 2022, especially from visual artists. For example, in December, artists on ArtStation began to protest against AI-generated images, flooding the site with protest images. 

We saw similar backlashes on DeviantArt and many sites, such as Getty Images, have banned AI artwork for the time being.

At least one of these disputes, one dealing with the use of AI to write code, has already resulted in a lawsuit. However, that lawsuit is only two months old, and no rulings have been made.

We simply don’t know what the courts will say here, and even when we do get answers, they will likely vary from situation to situation. The scenario of someone who had their own site scraped by an AI is different from that of someone who uploaded an image to DeviantArt and had the site use it.

In short, we’re unlikely to get solid legal answers on this issue in 2023, though the battle lines should become clearer. We may also see the beginnings of social norms around AI ingestion.

On the second question, the U.S. Copyright Office (USCO) has given us some clearer guidance. It has ruled that AI-generated content does not qualify for copyright protection and that any use of AI in a work needs to be disclosed when filing a registration so that material can be excluded.

The logic follows the “monkey selfie” case, where the USCO made it very clear that copyright requires human authorship: not a macaque, and not an AI. However, those using AI are going to create new edge cases, including hybrid works, and there are even questions about whether AI prompts can be protected by copyright even if the output can’t.

In short, 2022 was a big year for copyright and AI, and 2023 appears to be set to be the sequel.

Plagiarism and AI 

When turning to the ethics side of things, the focus becomes singular: How can we be sure a human wrote the piece that they are claiming as theirs?

This hit especially hard in December 2022 with the launch of ChatGPT, a text-generating system created by OpenAI that is both high-quality and free to use (as of this writing).

Educators felt the shock most profoundly, with some saying that the essay is dead as an assessment tool. I took a more nuanced view, saying that assessment was going to change, but that the essay isn’t dead.

The good news is that education, along with other spaces, has actually been working on this problem for years. Tools such as Turnitin’s Authorship Investigation tool and Unicheck’s Emma were developed to combat essay mills. Fortunately, their approach, detecting changes in a student’s writing, can work for AI as well.

There’s also some hope from AI itself. Tools such as the RoBERTa Base OpenAI Detector do at least a decent job of detecting works written by AI models, including the one ChatGPT is based upon.

In my testing of it today (a more thorough analysis will likely come in the future), it correctly distinguished all the text I had written from text written by ChatGPT, though it was more confident when given blocks of text longer than four paragraphs.

However, there’s a real problem in applying these tools, especially punitively. Where a traditional plagiarism analysis gives you a transparent report that humans can analyze and draw their own conclusions from, these authorship and anti-AI tools don’t.

There’s no indication of how the tools reached the conclusions that they did, no evidence to examine, and not even a good indicator of what is a normal baseline for such tests.

This means that, even if these tools do show a student or other author likely misused an AI, acting on that information can be difficult. 

This will likely be the tension over the next year, as AI checking becomes more common and disputes over whether an author did or did not plagiarize become more frequent.

Bottom Line

In the seventeen years I’ve run Plagiarism Today, I have never seen the issues of copyright and plagiarism so closely align. If you come away with only one story from both worlds, it should be AI.

However, it’s important to remember that, in both cases, we are still in the very early stages. It’s unlikely that we’ll know the long-term impacts of AI in 2023 or even 2024.

What is taking place right now is merely the initial reactions. The realization has hit that AI is entering creative spaces that, previously, were solely the domain of humans. Our legal and social systems are not prepared.

Whatever happens over the next 12 months will be important to watch and may dictate both the law and the ethics of content creation for generations to come. 
