創(chuàng)新互聯(lián)內(nèi)容分發(fā)網(wǎng)絡(luò)(Content Delivery Network,簡(jiǎn)稱CDN)是建立并覆蓋在承載網(wǎng)之上,由分布在不同區(qū)域的邊緣節(jié)點(diǎn)服務(wù)器群組成的分布式網(wǎng)絡(luò)。創(chuàng)新互聯(lián)CDN分擔(dān)源站壓力,避免網(wǎng)絡(luò)擁塞,確保在不同區(qū)域、不同場(chǎng)景下加速網(wǎng)站內(nèi)容的分發(fā),提高資源訪問(wèn)速度。許多用戶在實(shí)際運(yùn)用CDN的過(guò)程中都會(huì)遇到一些CDN常見(jiàn)的問(wèn)題,今天小編跟大家聊一下公司使用快網(wǎng)的CDN的問(wèn)題。
成都創(chuàng)新互聯(lián)網(wǎng)站建設(shè)公司一直秉承“誠(chéng)信做人,踏實(shí)做事”的原則,不欺瞞客戶,是我們最起碼的底線! 以服務(wù)為基礎(chǔ),以質(zhì)量求生存,以技術(shù)求發(fā)展,成交一個(gè)客戶多一個(gè)朋友!專注中小微企業(yè)官網(wǎng)定制,成都做網(wǎng)站、成都網(wǎng)站制作,塑造企業(yè)網(wǎng)絡(luò)形象打造互聯(lián)網(wǎng)企業(yè)效應(yīng)。#!/bin/bash function line() { cols=`tput cols` for t in `seq 1 ${cols}` do echo -n "=" done } function global_define() { yesterday=`date -d "-1day" +%F` comp_path="/srv/cdnlog/downloads/${yesterday}/" cdnlog_list=`ls ${comp_path} | sed 's#.gz##g'` uncomp_path="/srv/cdnlog/uncomp/${yesterday}/" user=`whoami` backup_path="/srv/cdnlog/downloads/${yesterday}/" } function global_check() { if [ $user != root ];then echo "Please use root ." exit 1 fi } function cycle() { for n in $cdnlog_list do check define uncomp segmentation source_address method http_code http_size referer retime platform tag back cdn_node response_size delete done } function check() { if [ -d ${uncomp_path}${n}/ ];then rm -rf ${uncomp_path}${n}/ fi } function define() { gz_file="${n}.gz" log_file="${n}.log" log_path="/srv/cdnlog/uncomp/${yesterday}/${n}/" } function uncomp() { mkdir -p ${log_path} gunzip -c ${comp_path}${gz_file} > ${log_path}${log_file} } function segmentation() { log_file_count=`cat ${log_path}${log_file} | wc -l` awk -vsour_addr=${log_path}sour_addr.tmp -vmethod=${log_path}method.tmp -vhttp_code=${log_path}http_code.tmp -vhttp_size=${log_path}http_size.tmp -vreferer=${log_path}referer.tmp -vplatform=${log_path}platform.tmp -vcdn_node=${log_path}cdn_node.tmp -vresponse_size=${log_path}response_size.tmp '{print $1 > sour_addr ;print $6,$7 > method ;print $9 > http_code ;print $10 > http_size ;print $11 > referer ;print $(NF) > cdn_node ;print $(NF-1) > response_size}' ${log_path}${log_file} cat ${log_path}${log_file} | egrep "FCACHE_HIT_DISK|FCACHE_HIT|FCACHE_MISS" | awk -F 'FCACHE_HIT_DISK|FCACHE_HIT|FCACHE_MISS' '{print $2}' > ${log_path}more.tmp awk -vretime1=${log_path}retime1.tmp -vretime2=${log_path}retime2.tmp 
-vretime3=${log_path}retime3.tmp -vretime4=${log_path}retime4.tmp -vretime5=${log_path}retime5.tmp -vretime6=${log_path}retime6.tmp -vretime7=${log_path}retime7.tmp -vretime8=${log_path}retime8.tmp -vretime9=${log_path}retime9.tmp -vtag=${log_path}tag.tmp -vback=${log_path}back.tmp '{print $1 > retime1 ;print $2 > retime2 ;print $3 > retime3 ;print $4 > retime4 ;print $5 > retime5 ;print $6 > retime6 ;print $7 > retime7 ;print $8 > retime8 ;print $9 > retime9 ;print $10 > tag ;print $12 > back }' ${log_path}more.tmp cat ${log_path}${log_file} | awk -F '[(|)]' '{print $2}' > ${log_path}platform.tmp } function source_address() { line >> ${log_path}sour_addr.log echo "Source Address Top 30 :" >> ${log_path}sour_addr.log echo "" >> ${log_path}sour_addr.log cat ${log_path}sour_addr.tmp | sort -r | uniq -c | sort -nr | head -30 >> ${log_path}sour_addr.log sour_addr_total=`cat ${log_path}sour_addr.tmp | sort -r | uniq -c | wc -l ` echo "" >> ${log_path}sour_addr.log echo "Source Address Total : $sour_addr_total" >> ${log_path}sour_addr.log line >> ${log_path}sour_addr.log } function method() { line >> ${log_path}method.log echo "Method & URL Top 30 :" >> ${log_path}method.log echo "" >> ${log_path}method.log cat ${log_path}method.tmp | sed 's#"##g' | sort -r | uniq -c | sort -nr | head -30 >> ${log_path}method.log line >> ${log_path}method.log } function http_code() { line >> ${log_path}http_code.log echo "Http Code Top :" >> ${log_path}http_code.log echo "" >> ${log_path}http_code.log http_code_list=`cat ${log_path}http_code.tmp | sort -r | uniq -c | sort -nr | awk '{print $2}'` cat ${log_path}http_code.tmp | sort -r | uniq -c | sort -nr >> ${log_path}http_code.log echo "" >> ${log_path}http_code.log for y in $http_code_list do count=`cat ${log_path}http_code.log | awk -vncode=$y '$2==ncode{print $1}'` rate=`echo "$count $log_file_count" | awk '{printf ("%.2f\n",$1*100/$2)}'` echo -e "HTTP $y Count: \t$count" >> ${log_path}http_code.log echo -e "HTTP $y Rate: 
\t\t${rate}%" >> ${log_path}http_code.log echo "" >> ${log_path}http_code.log done line >> ${log_path}http_code.log } function http_size() { size_max=`cat ${log_path}http_size.tmp | sort -nr | head -1` size_min=`cat ${log_path}http_size.tmp | sort -nr | tail -1` size_total_tmp=`cat ${log_path}http_size.tmp | awk '{sum+=$1}END{print sum}'` size_total=`echo $size_total_tmp | awk '{printf ("%.0f\n",$1)}'` size_avg=`echo "$size_total $log_file_count" | awk '{printf ("%.2f\n",$1/$2)}'` line >> ${log_path}http_size.log echo "" >> ${log_path}http_size.log echo -e "Http Size Max : \t$size_max" >> ${log_path}http_size.log echo -e "Http Size Min : \t$size_min" >> ${log_path}http_size.log echo -e "Http Size Total : \t$size_total" >> ${log_path}http_size.log echo -e "Http Size Avg : \t$size_avg" >> ${log_path}http_size.log line >> ${log_path}http_size.log } function referer() { line >> ${log_path}referer.log echo "Referer Top 30 :" >> ${log_path}referer.log echo "" >> ${log_path}referer.log cat ${log_path}referer.tmp | sed 's#"##g' | sort -r | uniq -c | sort -nr | head -30 >> ${log_path}referer.log line >> ${log_path}referer.log } function retime() { sed -ie '/-/d' ${log_path}retime{1..9}.tmp for i in `seq 1 9` do eval retime${i}=`cat ${log_path}retime${i}.tmp | wc -l` eval sum${i}=`cat ${log_path}retime${i}.tmp | awk '{sum+=$1}END{print sum}' | awk '{printf ("%.2f\n",$1)}'` eval avg${i}=`eval echo "\\${retime${i}} \\${sum$i}" | awk '{printf ("%.2f\n",$2/$1)}'` eval max${i}=`cat ${log_path}retime${i}.tmp | sort -nr | head -1` eval min${i}=`cat ${log_path}retime${i}.tmp | sort -nr | tail -1` done for k in `seq 1 9` do line >> ${log_path}retime${k}.log echo "Retime${k} :" >> ${log_path}retime${k}.log echo "" >> ${log_path}retime${k}.log echo -en "Max \t: \t" >> ${log_path}retime${k}.log eval echo "\${max$k}" >> ${log_path}retime${k}.log echo -en "Min \t: \t" >> ${log_path}retime${k}.log eval echo "\${min$k}" >> ${log_path}retime${k}.log echo -en "Avg \t: \t" >> 
${log_path}retime${k}.log eval echo "\${avg$k}" >> ${log_path}retime${k}.log echo -en "Total : \t" >> ${log_path}retime${k}.log eval echo "\${sum$k}" >> ${log_path}retime${k}.log line >> ${log_path}retime${k}.log done sed -i 's#-nan#0.00#g' ${log_path}retime{1..9}.log } function platform() { iphone_count=`cat ${log_path}platform.tmp | egrep "iPhone" | wc -l` ipad_count=`cat ${log_path}platform.tmp | egrep "iPad" | wc -l` mac_count=`cat ${log_path}platform.tmp | egrep "Macintosh" | wc -l` android_count=`cat ${log_path}platform.tmp | egrep "Android|Adr" | wc -l` windows_count=`cat ${log_path}platform.tmp | egrep "Windows NT|Windows 98|Windows XP" | wc -l` windows_mobile_count=`cat ${log_path}platform.tmp | egrep "Windows Phone" | wc -l` java_count=`cat ${log_path}platform.tmp | egrep "java" | wc -l` linux_count=`cat ${log_path}platform.tmp | egrep "X11; Linux|linux-gnu" | wc -l` iphone_rate=`echo "$iphone_count $log_file_count" | awk '{printf ("%.2f\n",$1/$2*100)}'` ipad_rate=`echo "$ipad_count $log_file_count" | awk '{printf ("%.2f\n",$1/$2*100)}'` mac_rate=`echo "$mac_count $log_file_count" | awk '{printf ("%.2f\n",$1/$2*100)}'` android_rate=`echo "$android_count $log_file_count" | awk '{printf ("%.2f\n",$1/$2*100)}'` windows_rate=`echo "$windows_count $log_file_count" | awk '{printf ("%.2f\n",$1/$2*100)}'` windows_mobile_rate=`echo "$windows_mobile_count $log_file_count" | awk '{printf ("%.2f\n",$1/$2*100)}'` java_rate=`echo "$java_count $log_file_count" | awk '{printf ("%.2f\n",$1/$2*100)}'` linux_rate=`echo "$linux_count $log_file_count" | awk '{printf ("%.2f\n",$1/$2*100)}'` line >> ${log_path}platform.log echo "Platform Info :" >> ${log_path}platform.log echo "" >> ${log_path}platform.log echo -e "iPhone count \t\t: $iphone_count" >> ${log_path}platform.log echo -e "iPhone rate \t\t: ${iphone_rate}%" >> ${log_path}platform.log echo "" >> ${log_path}platform.log echo -e "iPad count \t\t: $ipad_count" >> ${log_path}platform.log echo -e "iPad rate \t\t: 
${ipad_rate}%" >> ${log_path}platform.log echo "" >> ${log_path}platform.log echo -e "Mac count \t\t: $mac_count" >> ${log_path}platform.log echo -e "Mac rate \t\t: ${mac_rate}%" >> ${log_path}platform.log echo "" >> ${log_path}platform.log echo -e "Android count \t\t: $android_count" >> ${log_path}platform.log echo -e "Android rate \t\t: ${android_rate}%" >> ${log_path}platform.log echo "" >> ${log_path}platform.log echo -e "Windows count \t\t: $windows_count" >> ${log_path}platform.log echo -e "Windows rate \t\t: ${windows_rate}%" >> ${log_path}platform.log echo "" >> ${log_path}platform.log echo -e "Windows mobile count \t: $windows_mobile_count" >> ${log_path}platform.log echo -e "Windows mobile rate \t: ${windows_mobile_rate}%" >> ${log_path}platform.log echo "" >> ${log_path}platform.log echo -e "Java count \t\t: $java_count" >> ${log_path}platform.log echo -e "Java rate \t\t: ${java_rate}%" >> ${log_path}platform.log echo "" >> ${log_path}platform.log echo -e "Linux count \t\t: $linux_count" >> ${log_path}platform.log echo -e "Linux rate \t\t: ${linux_rate}%" >> ${log_path}platform.log line >> ${log_path}platform.log } function tag() { tag_total=`cat ${log_path}more.tmp | wc -l` tag_suc=`cat ${log_path}more.tmp | awk '{print $10}' | grep "1" | wc -l` tag_fail=`cat ${log_path}more.tmp | awk '{print $10}' | grep "0" | wc -l` tag_suc_rate=`echo "$tag_suc $tag_total" | awk '{printf ("%.2f\n",$1/$2*100)}'` tag_fal_rate=`echo "$tag_fail $tag_total" | awk '{printf ("%.2f\n",$1/$2*100)}'` line >> ${log_path}tag.log echo "Tag of Download Info :" >> ${log_path}tag.log echo "" >> ${log_path}tag.log echo -e "Download Success Count \t\t: $tag_suc" >> ${log_path}tag.log echo -e "Download Success Rate \t\t: ${tag_suc_rate}%" >> ${log_path}tag.log echo -e "Download Fail Count \t\t: $tag_fail" >> ${log_path}tag.log echo -e "Download Fail Rate \t\t: ${tag_fal_rate}%" >> ${log_path}tag.log line >> ${log_path}tag.log } function back() { back_total=`cat ${log_path}more.tmp | awk 
'{print $12}' | egrep -v "-"| wc -l` back_rate=`echo "$back_total $log_file_count" | awk '{printf ("%.2f\n",$1/$2*100)}'` line >> ${log_path}back.log echo "Back Source Info :" >> ${log_path}back.log echo "" >> ${log_path}back.log echo -e "Back Source Count \t: $back_total" >> ${log_path}back.log echo -e "Back Source Rate \t: ${back_rate}%" >> ${log_path}back.log line >> ${log_path}back.log } function cdn_node() { line >> ${log_path}cdn_node.log echo "CDN Node Top 30 :" >> ${log_path}cdn_node.log echo "" >> ${log_path}cdn_node.log cat ${log_path}cdn_node.tmp | sort -n | uniq -c | sort -nr | head -30 > ${log_path}cdn_node.log line >> ${log_path}cdn_node.log } function response_size() { response_size_total=`cat ${log_path}response_size.tmp | awk '{sum+=$1}END{print sum}' | awk '{printf ("%.0f\n",$1)}'` response_size_avg=`echo "$response_size_total $log_file_count" | awk '{printf ("%.2f\n",$1/$2*100)}'` line >> ${log_path}response.log echo "Response Size Info :" >> ${log_path}response.log echo "" >> ${log_path}response.log echo -e "Response Size Total \t: $response_size_total" >> ${log_path}response.log echo -e "Response Size Avg \t: $response_size_avg" >> ${log_path}response.log line >> ${log_path}response.log } function delete() { rm -f ${log_path}*.tmp rm -f ${log_path}*.tmpe rm -f ${log_path}${log_file} } function backup() { tar zcf ${backup_path}statistics_${yesterday}.tar.gz ${log_path}*.log } function main() { global_define global_check cycle backup } main
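The two techniques the script leans on most are the one-pass awk split in segmentation() (each field redirected to its own temp file) and the awk printf rate calculation used for every *_rate variable. A minimal, self-contained sketch of both, using made-up log lines and field positions rather than the real Fastweb log format:

```shell
#!/bin/sh
# Sketch only: sample data and column positions are illustrative.
tmpdir=$(mktemp -d)
printf '1.2.3.4 GET /a 200\n5.6.7.8 GET /b 404\n9.9.9.9 GET /c 200\n' > "${tmpdir}/access.log"

# One awk pass writes each column to its own .tmp file, as segmentation() does.
awk -v sour_addr="${tmpdir}/sour_addr.tmp" -v http_code="${tmpdir}/http_code.tmp" \
    '{print $1 > sour_addr ; print $4 > http_code}' "${tmpdir}/access.log"

# Percentage of 200 responses, formatted like the script's *_rate variables.
total=$(wc -l < "${tmpdir}/access.log")
ok=$(grep -c '^200$' "${tmpdir}/http_code.tmp")
rate=$(echo "$ok $total" | awk '{printf ("%.2f\n",$1/$2*100)}')
echo "HTTP 200 Rate: ${rate}%"   # HTTP 200 Rate: 66.67%
rm -rf "${tmpdir}"
```

Redirecting to a filename held in an awk variable keeps the file open across records, so the whole split costs a single pass over the log instead of one grep/cut per metric.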
If anything here is still unclear, you can reach the CDN engineers at 創新互聯 through the official site (www.yisu.com); with more than a decade of industry experience, they can answer in more detail than this post does.
新聞標(biāo)題:快網(wǎng)CDN怎么樣統(tǒng)計(jì)腳本-創(chuàng)新互聯(lián)
網(wǎng)頁(yè)網(wǎng)址:http://jinyejixie.com/article2/djegic.html
成都網(wǎng)站建設(shè)公司_創(chuàng)新互聯(lián),為您提供品牌網(wǎng)站設(shè)計(jì)、App設(shè)計(jì)、網(wǎng)站制作、網(wǎng)站設(shè)計(jì)、用戶體驗(yàn)、小程序開(kāi)發(fā)
聲明:本網(wǎng)站發(fā)布的內(nèi)容(圖片、視頻和文字)以用戶投稿、用戶轉(zhuǎn)載內(nèi)容為主,如果涉及侵權(quán)請(qǐng)盡快告知,我們將會(huì)在第一時(shí)間刪除。文章觀點(diǎn)不代表本網(wǎng)站立場(chǎng),如需處理請(qǐng)聯(lián)系客服。電話:028-86922220;郵箱:631063699@qq.com。內(nèi)容未經(jīng)允許不得轉(zhuǎn)載,或轉(zhuǎn)載時(shí)需注明來(lái)源: 創(chuàng)新互聯(lián)
猜你還喜歡下面的內(nèi)容