Google talks about how Googlebot handles AI-generated content

Google’s Martin Splitt was asked how Googlebot’s crawling and rendering are adapting to the increase in content generated by artificial intelligence.

Martin’s answer provided insight into how Google handles AI-generated content and the role of quality control.

How Googlebot renders a web page

Web page rendering is the process of creating a web page in a browser by downloading the HTML, images, CSS, and JavaScript and then assembling them all into a web page.

Google's crawler, Googlebot, also downloads the HTML, image, CSS, and JavaScript files in order to render the web page.
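As a rough illustration of that first step, the sketch below (plain standard-library Python, not Google's actual code) scans a page's HTML for the image, CSS, and JavaScript subresources a renderer would need to fetch before it can assemble the page:

```python
from html.parser import HTMLParser

# Collects the URLs of subresources referenced by <img>, <script src>,
# and <link rel="stylesheet"> tags -- the files a renderer downloads
# in addition to the HTML itself.
class SubresourceCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.resources.append(("image", attrs["src"]))
        elif tag == "script" and "src" in attrs:
            self.resources.append(("javascript", attrs["src"]))
        elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.resources.append(("css", attrs["href"]))

html = """
<html><head>
  <link rel="stylesheet" href="/style.css">
  <script src="/app.js"></script>
</head><body><img src="/logo.png"></body></html>
"""
collector = SubresourceCollector()
collector.feed(html)
print(collector.resources)
# [('css', '/style.css'), ('javascript', '/app.js'), ('image', '/logo.png')]
```

A real renderer goes further, of course: it executes the JavaScript and applies the CSS, which is what makes rendering expensive relative to simply downloading the HTML.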

How Google handles AI-generated content

The context for Martin's comments was a webinar called Exploring the Art of Rendering with Google's Martin Splitt, which was produced by Duda.

An attendee asked whether the large amount of AI-generated content affects Google's ability to render pages at crawl time.

Martin answered the question, but he also explained how Google determines, at crawl time, whether a web page is low quality, and what Google does after making that determination.

The question, submitted by Ammon Johns, was read aloud by Ulrika Viberg.

Here is the question:

"So, we have one from Ammon as well, and that is something that gets talked about a lot.

I see it a lot.

They said, content production is ramping up because of AI, which increases crawling and rendering loads.

Is it likely that rendering processes will need to be simplified?"

It seems that what Ammon wants to know is whether anything special happens in response to AI content in order to handle the crawling and rendering overhead.

Martin Splitt replied:

“No, I don’t think so, because my best guess is…”

Martin then addressed the obvious issue of detecting AI-generated content, in the context of search engine optimization (SEO).

Martin continued:

"So we do quality detection, or quality control, at multiple stages, and most low-quality content doesn't necessarily need JavaScript to show us how low quality it is.

So, if we catch that it's low-quality content before rendering, we skip rendering it. What's the point?

If we see, well, this looks like absolute… we can be pretty sure it's rubbish, and JavaScript might just add more rubbish, then goodbye.

If it's an empty page, we might say, we don't know.

People usually don't put empty pages here, so let's at least try rendering it.

And then, when rendering comes back with rubbish, we say, yeah well, fair enough, that was rubbish.

So, this is already happening. This is nothing new.

AI may increase the scale, but it doesn't change that much. Rendering is not the culprit here."
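Martin's description can be read as a staged decision flow: check quality before rendering, skip rendering when the HTML is already clearly bad, give empty pages the benefit of the doubt, and re-check after rendering. The sketch below is a loose reading of that flow, not Google's actual pipeline; the function and the quality labels are invented for illustration:

```python
# Hypothetical sketch of the staged quality checks Martin describes.
# pre_render_quality: "low", "ok", or "empty" (a blank page before JS runs).
# rendered_quality: quality observed after rendering, if rendering happened.
def crawl_decision(pre_render_quality, rendered_quality=None):
    if pre_render_quality == "low":
        # Clearly low quality before rendering; JS would only add more junk.
        return "skip rendering"
    # "ok" and "empty" pages get rendered; empty pages get the benefit
    # of the doubt, since people usually don't publish blank pages.
    if rendered_quality == "low":
        return "discard"  # "fair enough, that was rubbish"
    return "keep"

print(crawl_decision("low"))           # skip rendering
print(crawl_decision("empty", "low"))  # discard
print(crawl_decision("ok"))            # keep
```

The point of the staging is cost: rendering is the expensive step, so filtering obvious junk beforehand avoids spending rendering resources on it.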

Quality detection applies to AI content

Martin Splitt did not say that Google applies AI detection to content.

He said that Google uses quality detection at multiple stages.

And this is very interesting because Search Engine Journal published an article about a quality-detection algorithm that also detects low-quality AI content.

The algorithm was not built to find low-quality machine-generated content, but the researchers discovered that it detected it anyway.

Much of what this algorithm does tracks with everything Google has announced about its helpful content system, which is designed to identify content written by people.

Danny Sullivan wrote about the helpful content algorithm:

"…we're rolling out a series of improvements to Search to make it easier for people to find helpful content made by, and for, people."

And he didn't mention content written by people just once. His article announcing the helpful content system mentioned it three times.

The algorithm that was designed to detect automatically generated content also detects low-quality content in general.

The research paper is titled Generative Models Are Unsupervised Predictors of Page Quality: A Large-Scale Study.

The researchers note:

"This paper posits that detectors trained to discriminate human vs. machine-written text are effective predictors of web pages' language quality, outperforming a baseline supervised spam classifier."
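The paper's hypothesis can be sketched as inverting a detector's output: the more machine-like a page's text scores, the lower its predicted language quality. Everything below is a toy illustration; `toy_detector` is an invented stand-in, not one of the detectors the paper actually trained:

```python
# Reuse a machine-text detector, unsupervised, as a quality signal:
# higher P(machine-written) maps to lower predicted language quality.
def page_quality_score(text, detect_machine_probability):
    return 1.0 - detect_machine_probability(text)

# Toy stand-in detector: treats highly repetitive text as machine-like.
# (Real detectors are trained models; this is only for illustration.)
def toy_detector(text):
    words = text.lower().split()
    if not words:
        return 1.0
    return 1.0 - len(set(words)) / len(words)  # repetition ratio

spammy = "buy cheap pills buy cheap pills buy cheap pills"
normal = "Martin Splitt explained how rendering fits into crawling"
print(page_quality_score(spammy, toy_detector))  # low score
print(page_quality_score(normal, toy_detector))  # high score
```

The key idea is that no quality labels are needed: the detector was trained for a different task (human vs. machine text), yet its score correlates with page quality.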

Going back to what Martin Splitt said:

"…we do quality detection, or quality control, at multiple stages…

So, this is already happening. This is nothing new.

AI may increase the scale, but it doesn't change that much."

What Martin appears to be saying is:

  1. Nothing new is being applied to AI content
  2. Google uses the same quality detection for both human-written content and AI content

Watch Duda's webinar featuring Martin Splitt at the 35:50 minute mark:

Exploring the Art of Rendering with Google's Martin Splitt
