Google has announced Google-Extended, a new control for managing how site content is used in AI products such as Bard and Vertex AI. Despite the name, Google-Extended is not a separate crawler; it is a standalone user agent token that website owners can reference in their existing robots.txt file to indicate whether their pages may be used to improve these AI models.
Because Google-Extended is honored through the standard robots.txt mechanism, website owners can write rules for it the same way they would for any other user agent. For example, the following blocks it site-wide:
User-agent: Google-Extended
Disallow: /
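Rules can also be scoped to part of a site rather than blocking everything. As a hypothetical example (the /drafts/ path is a placeholder, not from Google's announcement), the following keeps a single directory out of AI use while leaving regular Search crawling explicitly allowed:

User-agent: Google-Extended
Disallow: /drafts/

User-agent: Googlebot
Allow: /

Blocking Google-Extended does not affect how pages are crawled, indexed, or ranked for Google Search, since Googlebot is matched as a separate user agent.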
This announcement reflects Google's continuing effort to give webmasters more control over how their sites are accessed and used by automated agents. Because Google-Extended plugs into the familiar robots.txt framework, it can be managed with existing tools and workflows.
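To confirm that a rule behaves as intended before relying on it, standard robots.txt matching can be checked programmatically. Below is a minimal sketch using Python's standard-library urllib.robotparser; the domain and paths are placeholders, not anything from Google's announcement:

from urllib.robotparser import RobotFileParser

# Hypothetical domain; substitute your own site when testing.
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# can_fetch() applies the same user-agent matching a compliant bot would.
print(rp.can_fetch("Google-Extended", "https://example.com/drafts/post.html"))  # False under the rule above
print(rp.can_fetch("Googlebot", "https://example.com/drafts/post.html"))        # True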
In summary, Google has introduced Google-Extended, a robots.txt token built specifically for its AI products. Website owners can now decide whether to grant it access to their content via the robots.txt file, a development that underscores Google's commitment to giving webmasters control over how their sites are used.
TLDR: Google has unveiled Google-Extended, a robots.txt token for its AI products, allowing website owners to control whether their content is used for AI. This gives webmasters greater customization over how their sites are accessed.