A search expert at Google has underlined the need for human review to ensure the quality of online content generated using artificial intelligence (AI).
In an exclusive interview, Gary Illyes from Google spoke about various AI-related subjects, including the relationship between ‘quality’ content and anything created using AI.
When asked about whether Google punishes AI-generated content, Illyes responded:
“So basically when we say that it’s human, I think the word human created is wrong. Basically, it should be human curated. So basically someone had some editorial oversight over their content and validated that it’s actually correct and accurate.”
His answer applies both to users creating content for search engine optimisation (SEO) purposes and to anyone training AI models of their own. Quality content is not necessarily content written by a human; rather, it is content that has been checked and approved as accurate by a human editor or reviewer.
Illyes also commented that he does not believe AI-generated content is polluting the large language models (LLMs) that companies in the AI industry, such as Google and OpenAI, are training to power their generative AI tools.
He did admit, however, that model trainers should learn how to exclude AI-generated content where necessary. This is to avoid what he calls a “training loop”, in which a model is trained on content that was, in effect, generated by models like itself.
Online content creators should not rely solely on AI for quality content. At Engage Web, our journalist-trained humans specialise in ensuring our clients’ online content contributes to their digital growth. Learn how we can help you by speaking to our team today.