
Blocking ClaudeBot and other spider crawlers from fetching your pages with Apache and IIS rules

If the visitor is a legitimate search-engine spider, blocking it is not recommended: your site's indexing and ranking in Baidu and other search engines would be lost, costing you traffic and customers. However, a large number of AI-model crawlers also scrape your pages, and for those there is little benefit in allowing access, so it makes sense to block these spiders.

Rules file under Linux: .htaccess (create the .htaccess file manually in the site root)

<IfModule mod_rewrite.c>
RewriteEngine On
#Block spider
RewriteCond %{HTTP_USER_AGENT} "ClaudeBot|Apache-HttpClient|SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" [NC]
RewriteRule !(^robots\.txt$) - [F]
</IfModule>
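A quick way to check which User-Agent strings a rule like this catches is to replay the alternation in Python. This is a sketch using a representative subset of the entries from the pattern above; Apache's [NC] flag corresponds to re.IGNORECASE:

```python
import re

# Representative subset of the User-Agent alternation from the
# RewriteCond above; [NC] means case-insensitive, hence re.IGNORECASE.
BLOCK_PATTERN = re.compile(
    r"SemrushBot|AhrefsBot|MJ12bot|YandexBot|curl|perl|Python|Wget",
    re.IGNORECASE,
)

def is_blocked(user_agent: str) -> bool:
    """Return True if the rule would reject this User-Agent."""
    return BLOCK_PATTERN.search(user_agent) is not None

print(is_blocked("curl/8.4.0"))                               # True
print(is_blocked("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # False
```

Note that `RewriteRule !(^robots\.txt$) - [F]` exempts robots.txt from the 403, so even blocked crawlers can still read your crawl policy.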

Rules file under Windows Server 2008, 2012, or later: web.config (create the web.config file manually in the site root)

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
   <rewrite>
    <rules>
     <rule name="Block spider">
      <match url="(^robots\.txt$)" ignoreCase="false" negate="true" />
      <conditions>
      <add input="{HTTP_USER_AGENT}" pattern="ClaudeBot|Apache-HttpClient|SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" ignoreCase="true" />
      </conditions>
       <action type="AbortRequest"/>
     </rule>
    </rules>
   </rewrite>
  </system.webServer>
</configuration>

Note: the line containing "{HTTP_USER_AGENT}" lists the names of unidentified spiders; add more as needed, separated by "|".
  The rules block some unidentified spiders by default; to block other spiders, add them the same way. User-Agent names of the major spiders:
  Google spider: googlebot
  Baidu spider: baiduspider
  Baidu mobile spider: baiduboxapp
  Yahoo spider: slurp
  Alexa spider: ia_archiver
  MSN spider: msnbot
  Bing spider: bingbot
  AltaVista spider: scooter
  Lycos spider: lycos_spider_(t-rex)
  AllTheWeb spider: fast-webcrawler
  Inktomi spider: slurp
  Youdao spider: YodaoBot and OutfoxBot
  Retu spider: Adminrtspider
  Sogou spider: sogou spider
  SOSO spider: sosospider
  360 Search spider: 360spider
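Per the note above, extra names are appended to the pattern separated by "|". A small Python sketch (the bot names chosen here are illustrative) of assembling such a fragment and verifying it before pasting it into .htaccess or web.config:

```python
import re

# Names to block, joined with "|" exactly as the note above describes.
blocked_bots = ["ClaudeBot", "SemrushBot", "AhrefsBot", "MJ12bot"]
fragment = "|".join(re.escape(name) for name in blocked_bots)
print(fragment)  # ClaudeBot|SemrushBot|AhrefsBot|MJ12bot

# Verify the assembled fragment behaves as expected.
pattern = re.compile(fragment, re.IGNORECASE)
print(bool(pattern.search("Mozilla/5.0; ClaudeBot/1.0")))         # True
print(bool(pattern.search("Mozilla/5.0 (compatible; bingbot)")))  # False
```

Using re.escape guards against names containing regex metacharacters; plain alphanumeric names pass through unchanged.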
