2021-02-08 | Category: Website Development
1. Apache
① Modify the .htaccess file
Edit the .htaccess file in the website's root directory and add either of the following two snippets:
Option (1):
- RewriteEngine On
- RewriteCond %{HTTP_USER_AGENT} "(^$|FeedDemon|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms)" [NC]
- RewriteRule ^(.*)$ - [F]
Option (2):
- SetEnvIfNoCase ^User-Agent$ ".*(FeedDemon|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms)" BADBOT
- Order Allow,Deny
- Allow from all
- Deny from env=BADBOT
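Once the .htaccess file is saved, the effect can be checked from the command line. The sketch below assumes the relevant Apache modules are enabled (mod_rewrite for option 1, mod_setenvif for option 2) and uses example.com as a placeholder for your own domain:
Shell
- # A blacklisted UA should receive 403, a normal browser UA should not
- curl -I -A 'AhrefsBot' http://example.com/
- curl -I -A 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)' http://example.com/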
② Modify the httpd.conf configuration file
Find the corresponding section (similar to the lines below), add or modify it accordingly, then restart Apache:
Apache
- DocumentRoot /home/wwwroot/xxx
- # Tag requests whose User-Agent matches the blacklist
- SetEnvIfNoCase ^User-Agent$ ".*(FeedDemon|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms)" BADBOT
- <Directory "/home/wwwroot/xxx">
-     Order Allow,Deny
-     Allow from all
-     Deny from env=BADBOT
- </Directory>
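The exact restart command depends on how Apache was installed; a typical sequence is to check the syntax first and then reload gracefully, roughly as follows:
Shell
- # Verify the configuration before applying it
- apachectl -t
- # Graceful restart; on systemd-based systems use 'systemctl reload httpd' or 'systemctl reload apache2' instead
- apachectl -k graceful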
2. Nginx
Go to the conf directory under the Nginx installation directory and save the following code as agent_deny.conf:
- cd /usr/local/nginx/conf
- vim agent_deny.conf
- # Block crawls by tools such as Scrapy
- if ($http_user_agent ~* (Scrapy|Curl|HttpClient)) {
- return 403;
- }
- # Block the listed UAs as well as requests with an empty UA
- if ($http_user_agent ~* "FeedDemon|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms|^$") {
- return 403;
- }
- # Block request methods other than GET|HEAD|POST
- if ($request_method !~ ^(GET|HEAD|POST)$) {
- return 403;
- }
Then, in the site's server configuration, insert the following line right after location / {:
Nginx
- include agent_deny.conf;
A sample configuration:
Shell
- [marsge@Mars_Server ~]$ cat /usr/local/nginx/conf/zhangge.conf
- location / {
- try_files $uri $uri/ /index.php?$args;
- # add this one line here:
- include agent_deny.conf;
- rewrite ^/sitemap_360_sp.txt$ /sitemap_360_sp.php last;
- rewrite ^/sitemap_baidu_sp.xml$ /sitemap_baidu_sp.php last;
- rewrite ^/sitemap_m.xml$ /sitemap_m.php last;
- # ... (rest of the file omitted)
After saving, run the following command to smoothly reload Nginx:
Shell
- /usr/local/nginx/sbin/nginx -s reload
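To avoid taking the site down with a typo, the configuration (including the newly added agent_deny.conf) can also be tested first and reloaded only if the check passes, for example:
Shell
- # Test the configuration and reload only on success
- /usr/local/nginx/sbin/nginx -t && /usr/local/nginx/sbin/nginx -s reload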
3. PHP
Paste the following code into the site's entry file, index.php, right after the first <?php tag:
PHP
- // Get the UA string
- $ua = $_SERVER['HTTP_USER_AGENT'] ?? '';
- // Blacklist of malicious USER_AGENT strings
- $now_ua = array('FeedDemon','BOT/0.1 (BOT for JCE)','CrawlDaddy','Java','Feedly','UniversalFeedParser','ApacheBench','Swiftbot','ZmEu','Indy Library','oBot','jaunty','YandexBot','AhrefsBot','MJ12bot','WinHttp','EasouSpider','HttpClient','Microsoft URL Control','YYSpider','Python-urllib','lightDeckReports Bot');
- // Block empty USER_AGENT: mainstream scrapers (e.g. dedecms collectors) and some SQL injection tools send an empty UA
- if (!$ua) {
-     header("Content-type: text/html; charset=utf-8");
-     die('Do not scrape this site!');
- } else {
-     foreach ($now_ua as $value) {
-         // Case-insensitive check whether the UA contains a blacklisted string
-         if (stripos($ua, $value) !== false) {
-             header("Content-type: text/html; charset=utf-8");
-             die('Do not scrape this site!');
-         }
-     }
- }
4. Testing the results
If you are on a VPS, testing is very simple: use curl -A to simulate a crawl. For example:
Simulate a crawl by YisouSpider:
Shell
- curl -I -A 'YisouSpider' bizhi.bcoderss.com
Simulate a crawl with an empty UA:
Shell
- curl -I -A '' bizhi.bcoderss.com
Simulate a crawl by Baiduspider:
Shell
- curl -I -A 'Baiduspider' bizhi.bcoderss.com
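To compare several UAs in one go, a small loop like the sketch below can be used (the domain and the UA list are only examples, so substitute your own site and block list). UAs that are on the block list, and the empty UA, should come back with HTTP 403, while UAs not on the list should be served normally:
Shell
- # Print only the HTTP status code for each simulated UA
- for ua in 'AhrefsBot' '' 'Baiduspider' 'Mozilla/5.0'; do
-   code=$(curl -s -o /dev/null -w '%{http_code}' -A "$ua" http://bizhi.bcoderss.com/)
-   echo "UA='${ua}' => HTTP ${code}"
- done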