How a Fake AI-Generated Book List Made it Into Major Papers

On May 18, the Chicago Sun-Times published a special summer section entitled Heat Index. The section is a “best of summer” special that highlights various activities, products and other seasonal guides.
As part of that special section, the paper published a 2025 summer reading guide featuring 15 book recommendations. However, there was a serious problem with that guide: only five of the books actually exist.
The other ten are fabricated titles attributed to real authors. Simply put, the authors exist, but the books don’t.
The error was first noticed on social media platforms and was initially reported by 404 Media.
The list doesn’t carry a byline. However, freelancer Marco Buscaglia claims to have written the entire Heat Index section. He admits to using AI to generate the list but says he failed to notice that it was inaccurate. “I can’t blame anybody else,” he said.
So, how did this article make it into the Chicago Sun-Times (and at least one edition of the Philadelphia Inquirer)? What can be done to keep it from happening again? To understand, we first have to look at how newspapers are written.
How the Sausage is Made

Though newspapers can seem like a monolith of content, they are far from it. Only a fraction of a newspaper’s content is written by its staff or freelancers working directly for it.
Though the percentage varies from publication to publication, much of the content in the average newspaper comes from outside sources.
One of the most common sources is wire services (or news agencies) such as the Associated Press (AP), Reuters and Agence France-Presse (AFP). These services focus on national and international news. Since local papers are unlikely to have reporters stationed around the world, wire copy is how papers of every size cover national and international stories.
Wire services have existed since the 1830s. Historically, they have not been controversial, but in recent years, they have been criticized for publishing press release content as news articles. One glaring example happened in 2020, when the AP published an article touting essay writing services.
However, as newspaper consolidation has grown, conglomerates have sought to syndicate content among their publications. This allows them to cut costs by having one person write articles for multiple publications.
That is what happened here. Buscaglia does not work for any of the publications involved. Instead, he works for King Features, a content syndicate owned by Hearst that sold the section to the papers that published it.
According to the King Features website, the company focuses on “Delivering fun and engaging content loved by readers and publishers.” This includes comics, puzzles and written content. In this case, the company produced the Heat Index section.
That, in turn, is how an AI article with rampant errors ended up in at least two major newspapers.
Fixing the Problem
For the past two centuries, the newspaper industry has generally mixed self-produced content with wire service articles and other outside material.
Outside material is sometimes used to craft new articles, but it is often not checked for factuality or authenticity before publication. The reason, historically, is that wire services were seen as a gold standard in journalism. News agencies hold themselves to a standard as high as, if not higher than, the newspapers they serve.
This is why the AP Stylebook is the leading style guide in journalism. The AP’s work is heavily trusted and relied upon.
However, King Features is not the AP. It is a syndication service that focuses mainly on comics and puzzles. While that doesn’t make it inherently unethical, it doesn’t have the history of the AP and other news agencies.
Clearly, neither King Features nor the papers fact-checked this column. Even a cursory fact check would have prevented this. However, publications are not in the habit of checking outside content.
Buscaglia highlighted this, telling NPR, “Huge mistake on my part and has nothing to do with the Sun-Times. They trust that the content they purchase is accurate and I betrayed that trust.”
That trust is not tenable in 2025, especially when the content comes from something other than a wire service. AI content like this article can shatter the public’s confidence in a major newspaper’s brand. It doesn’t matter how an AI-generated article with 10 fake books made it into the Sun-Times. All that matters is that it did.
Unfortunately, this problem is unlikely to get better. If anything, it is likely just the beginning.
Why it Can’t Be Fixed
The obvious fix is for newspapers to start vetting the outside content they publish. However, there is a serious problem with that: for newspapers, this is the worst possible time for such a shift.
Newspapers all over the country have been laying off staff. This includes the Sun-Times, which laid off 20% of its staff in March. If newspapers ever had the resources to check outside content, they do not now.
At no point in the past two hundred years have newspapers had a greater need to vet outside content or a weaker ability to do so. That double whammy virtually guarantees that stories like this one will happen again.
Ultimately, Buscaglia is to blame for this incident. To his credit, he acknowledges and admits this. But that doesn’t mean that King Features and Hearst should escape blame.
King Features clearly lacks an editorial process for catching these kinds of issues. Even a cursory fact check would have spotted the problems. King Features essentially allowed AI slop to be sent directly to major newspapers without serious examination.
Buscaglia may be the one who used AI inappropriately, but King Features and Hearst enabled it.
They may have even contributed to it, to some degree. Buscaglia hasn’t commented on how the project came about, but authors like him often face unreasonable deadlines and content demands. AI may have been a practical necessity.
To be clear, I’m not saying that’s what happened in this case. However, as publications pressure fewer reporters to generate more content, such shortcuts become inevitable.
Bottom Line
Newspapers are in a bind. Their single most significant asset in the digital age is their history and reputation. As stalwarts of traditional media, they carry a degree of recognition and trust.
But stories like this one do irreparable damage to that trust. Even worse, shrinking newsrooms increase reliance on outside content while weakening papers’ ability to vet it, making more stories like this one inevitable.
There is no simple solution here. It’s easy to say that syndication services like King Features must hold their work to the same standards as news agencies and have robust and transparent ethics policies. However, the resources and the motivation to enforce those standards are lacking.
But all of this was true before the rise of generative AI, which has added another layer to the problem. News publications were never in a great position to police third-party content, but they’re even worse off now.
For any change to happen, responsibility will have to come from the top down. Hearst will have to hold King Features responsible, which will, in turn, hold its writer responsible.
But this responsibility goes well beyond this one case. It will include new policies, new editorial processes and other checks along the way. The question is whether Hearst has the resources and the will to implement those changes.
Newspapers cannot fix this alone. But they will ultimately pay the price if it isn’t addressed.