Website access is very slow, with very high CPU usage and server load
Preface: recently my website has been loading very slowly, CPU usage has been very high, and the overall server load has been very high as well. Looking through the logs, I found a large number of unknown spiders constantly crawling the site. From experience this was clearly the cause, so I wrote blocking rules for my situation, and after blocking them the load came back down. Below is a summary of how to block unknown spider UAs under IIS, nginx, and Apache.
Note: adjust the list to your own needs by deleting or adding UA strings. The rules I provide include uncommon spider UAs that you will almost never need; if your site is a special case and needs to be crawled by one of these spiders, review the rules carefully and simply remove that UA from the list.
The effect after blocking:
1. My site runs on nginx. Below is my blocking rule; add it to the server block of the configuration file. When these spiders come to crawl, nginx returns 444 (it closes the connection without sending a response);
if ($http_user_agent ~ "MegaIndex|MegaIndex.ru|BLEXBot|Qwantify|qwantify|semrush|Semrush|serpstatbot|hubspot|python|Bytespider|Go-http-client|Java|PhantomJS|SemrushBot|Scrapy|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|perl|Python|Wget|Xenu|ZmEu|^$")
{
return 444;
}
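Once the rule is in place and nginx has been reloaded, a quick check with curl confirms it is working (example.com below is just a placeholder for your own domain). Because 444 closes the connection without sending anything back, curl reports an empty reply for a blocked UA:
# Blocked UA: nginx drops the connection, curl reports "Empty reply from server"
curl -I -A "SemrushBot" https://example.com/
# Normal browser UA: the page is still served as usual
curl -I -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64)" https://example.com/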
2. For IIS7/IIS8/IIS10 and later, create a web.config file in the site root directory and put the rewrite rule into it, as sketched below;
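A minimal web.config sketch, assuming the IIS URL Rewrite module is installed; the rule name, the 403 response, and the robots.txt exception are illustrative and mirror the Apache rules in step 4, while the UA pattern is the same list used above:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Block spider: reject matching User-Agents for everything except robots.txt -->
        <rule name="Block spider" stopProcessing="true">
          <match url="^robots\.txt$" negate="true" />
          <conditions>
            <add input="{HTTP_USER_AGENT}" pattern="MegaIndex|MegaIndex.ru|BLEXBot|Qwantify|qwantify|semrush|Semrush|serpstatbot|hubspot|python|Bytespider|Go-http-client|Java|PhantomJS|SemrushBot|Scrapy|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|perl|Python|Wget|Xenu|ZmEu|^$" ignoreCase="true" />
          </conditions>
          <action type="CustomResponse" statusCode="403" statusReason="Forbidden" statusDescription="Blocked spider UA" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>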
3. For IIS6, add the following rule in the ISAPI rewrite component:
#Block spider
RewriteCond %{HTTP_USER_AGENT} (MegaIndex|MegaIndex.ru|BLEXBot|Qwantify|qwantify|semrush|Semrush|serpstatbot|hubspot|python|Bytespider|Go-http-client|Java|PhantomJS|SemrushBot|Scrapy|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|perl|Python|Wget|Xenu|ZmEu|^$) [NC]
RewriteRule !(^/robots.txt$) - [F]
4. For Apache, add the following rules to the .htaccess file:
RewriteEngine On
#Block spider
RewriteCond %{HTTP_USER_AGENT} "MegaIndex|MegaIndex.ru|BLEXBot|Qwantify|qwantify|semrush|Semrush|serpstatbot|hubspot|python|Bytespider|Go-http-client|Java|PhantomJS|SemrushBot|Scrapy|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|perl|Python|Wget|Xenu|ZmEu|^$" [NC]
RewriteRule !(^robots\.txt$) - [F]
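As with nginx, a quick curl check confirms the Apache rules are active (again, example.com is a placeholder for your own domain): a matched UA should now receive 403 Forbidden, while robots.txt remains reachable because the RewriteRule excludes it:
# Blocked UA on a normal page: expect "403 Forbidden"
curl -I -A "MJ12bot" https://example.com/
# robots.txt is excluded by the RewriteRule, so it is still served
curl -I -A "MJ12bot" https://example.com/robots.txt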