Anthropic Separates Claude Crawlers, Introducing More Precise Robots.txt Controls | Spherical Coder

Anthropic now categorizes its web crawlers into three distinct roles: one for model training, one for search indexing, and one for fetching pages in response to user-initiated requests. ClaudeBot collects content used to train models, Claude-SearchBot handles indexing for AI-powered search results, and Claude-User retrieves pages when a user asks Claude about a specific site. Because each crawler identifies itself with its own user-agent string, site operators can allow or block each activity independently in robots.txt.
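As a sketch of what this finer-grained control could look like, the robots.txt fragment below blocks the training crawler while permitting search indexing and user-driven fetches. The user-agent tokens match the crawler names above; the path choices are illustrative, not a recommendation.

```
# Block content collection for model training
User-agent: ClaudeBot
Disallow: /

# Allow indexing for AI-powered search results
User-agent: Claude-SearchBot
Allow: /

# Allow fetches triggered by individual user requests
User-agent: Claude-User
Allow: /
```

Under the previous single-crawler setup, blocking training traffic meant blocking everything; the split lets operators opt out of training without disappearing from Claude's search or user-initiated lookups.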