How to ban malicious IPs and avoid a huge server bandwidth bill

From 清冽之泉

One morning I woke up to two text messages: one said my server's traffic quota was exhausted and the account was now in arrears; the other said my server had been shut down as a protective measure. Luckily my server halts automatically when the traffic runs out; if it auto-renewed instead, I could have been bled for ten thousand yuan without ever noticing.

After observing for a day, I found that malicious IPs had consumed 30 GB of my traffic in a single day, while my monthly quota is a mere 600 GB; at their blood-sucking pace, I would be another 300 GB short of satisfying them. I earn nothing from my website. I pay for the server and domain out of pocket and spend my own time and energy writing articles, yet I was not only working for these malicious IPs for free but also footing their egress-traffic bill. I am no fool, so I decided to ban them.

So how do you catch the malicious IPs and rescue your traffic bill?

Tools used

  • iftop: view instantaneous traffic
  • vnstat: view traffic consumed per day or per hour
  • robots.txt: the crawler exclusion protocol
  • iptables: firewall tool, used here to refuse connections from malicious IPs
  • awk: extract fields
  • tail: read the tail of a log file
  • sort: sort log contents
  • access.log: hunt for clues in the visitor log
  • curl: test the ban; check your own IP with: curl -s https://ifconfig.me

Catching the culprits

Most requests

sudo awk '{print $1}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head -n 30

# sample output
    1342 ::1
    416 51.161.86.195
    170 65.108.2.171
    164 37.27.51.140
    151 57.141.16.75
    150 195.201.199.99
    101 194.247.173.99
     82 147.135.214.103
     79 135.181.180.59
     73 122.97.68.77
     71 216.244.66.238
     61 57.141.16.83
     60 57.141.16.71
     60 57.141.16.63
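Once an IP stands out in the counts, it is worth checking what it actually requests before banning it. A sketch of such a follow-up (top_paths is a hypothetical helper name; the IP comes from the sample output above, and $1/$7 are the client-IP and request-path fields of Apache's combined log format):

```shell
# Show the ten most-requested paths for one client IP in a combined-format log.
# top_paths is an illustrative name, not from the original article.
top_paths() {
  ip="$1"; log="$2"
  awk -v ip="$ip" '$1 == ip {print $7}' "$log" \
  | sort | uniq -c | sort -rn | head -n 10
}

# Usage: top_paths 51.161.86.195 /var/log/apache2/access.log
```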

Most bandwidth consumed

sudo awk '$10 != "-" {print $1, $10}' /var/log/apache2/access.log \
| awk '{bw[$1]+=$2} END {for (ip in bw) printf "%.2f MB %s\n", bw[ip]/(1024*1024), ip}' \
| sort -rn \
| head -n 30

# sample output
15.69 MB 3.224.215.150
15.01 MB 3.212.205.90
14.21 MB 18.232.11.247
13.98 MB 34.206.212.24
13.57 MB 3.221.156.96
13.20 MB 3.227.180.70
13.11 MB 34.206.249.188
13.00 MB 34.226.89.140
12.48 MB 3.218.103.254
12.23 MB 18.232.36.1
11.97 MB 34.231.45.47
11.96 MB 54.152.163.42
11.95 MB 52.70.209.13
11.93 MB 3.213.85.234
11.72 MB 3.213.106.226
11.65 MB 52.44.174.136
11.56 MB 54.84.93.8

Pages consuming the most bandwidth

sudo awk '$10 != "-" {print $7, $10}' /var/log/apache2/access.log \
| awk '{bw[$1]+=$2} END {for (u in bw) print bw[u], u}' \
| sort -rn \
| head -n 40 \
| while read size url; do
    decoded_url=$(printf '%b' "${url//%/\\x}")
    echo "$size $decoded_url"
  done
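The decoding step above works by rewriting each % as \x, so %7C becomes \x7C, which printf's %b format then expands into the literal byte (a bash feature). Isolated into a reusable sketch (urldecode is my name for it, not from the original article):

```shell
# Decode percent-escapes in a URL: turn each "%" into "\x", then let
# bash printf's %b expand the resulting \xHH escapes into literal bytes.
urldecode() {
  printf '%b' "${1//%/\\x}"
}
```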

# sample output
65174752 /load.php?lang=en&modules=ext.SimpleMathJax,SimpleTooltip|jquery,oojs,oojs-ui-core|jquery.client,lengthLimit,textSelection|mediawiki.String,Title,api,base,cldr,cookie,htmlform,jqueryMsg,language,storage,user,util|mediawiki.editfont.styles|mediawiki.libs.pluralruleparser|mediawiki.page.ready|mediawiki.widgets.visibleLengthLimit|oojs-ui-core.icons,styles|oojs-ui.styles.indicators|skins.vector.legacy.js&skin=vector&version=kzx8c
24853627 /load.php?lang=en&modules=codex-search-styles,jquery,oojs,oojs-ui,oojs-ui-core,oojs-ui-toolbars,oojs-ui-widgets,oojs-ui-windows,site|ext.SimpleMathJax,SimpleTooltip|jquery.client,textSelection|mediawiki.String,Title,Uri,api,base,cldr,cookie,diff,experiments,jqueryMsg,language,router,storage,template,user,util|mediawiki.libs.pluralruleparser|mediawiki.page.ready|mediawiki.page.watch.ajax|mediawiki.template.mustache|mobile.init,startup|mobile.pagelist.styles|mobile.pagesummary.styles|oojs-ui-toolbars.icons|oojs-ui-widgets.icons|oojs-ui-windows.icons|skins.minerva.scripts&skin=minerva&version=1b0yp
19195748 /load.php?lang=en&modules=ext.SimpleMathJax,SimpleTooltip|jquery,oojs,oojs-ui,oojs-ui-core,oojs-ui-toolbars,oojs-ui-widgets,oojs-ui-windows,site|jquery.client,textSelection|mediawiki.String,Title,api,base,cldr,cookie,diff,jqueryMsg,language,storage,user,util|mediawiki.editfont.styles|mediawiki.libs.pluralruleparser|mediawiki.page.ready|oojs-ui-toolbars.icons|oojs-ui-widgets.icons|oojs-ui-windows.icons|skins.vector.legacy.js&skin=vector&version=5fbmc
11748862 /load.php?lang=en&modules=ext.visualEditor.core.utils.parsing|ext.visualEditor.desktopArticleTarget.init|ext.visualEditor.progressBarWidget,supportCheck,targetLoader,tempWikitextEditorWidget,track,ve&skin=vector&version=1dqsr

Most frequent User-Agents

sudo awk -F'"' '{print $6}' /var/log/apache2/access.log | sed '/^$/d' | sort | uniq -c | sort -rn | head -n 30

# sample output
   8257 Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Amazonbot/0.1; +https://developer.amazon.com/support/amazonbot) Chrome/119.0.6045.214 Safari/537.36
   6454 Sogou web spider/4.0(+http://www.sogou.com/docs/help/webmasters.htm#07)
   4369 meta-externalagent/1.1 (+https://developers.facebook.com/docs/sharing/webmasters/crawler)
   1342 Apache/2.4.62 (Debian) OpenSSL/3.0.15 (internal dummy connection)
   1188 Mozilla/5.0 (compatible; MJ12bot/v1.4.8; http://mj12bot.com/)
    654 Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.3
    324 Mozilla/5.0 (Linux; Android 7.0;) AppleWebKit/537.36 (KHTML, like Gecko) Mobile Safari/537.36 (compatible; PetalBot;+https://webmaster.petalsearch.com/site/petalbot)
    279 Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36
    251 Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36
    243 Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:121.0) Gecko/20100101 Firefox/121.0
    234 Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:120.0) Gecko/20100101 Firefox/120.0
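It can also help to map a suspicious User-Agent back to the IPs that present it. A sketch (ips_for_ua is a hypothetical helper; splitting on double quotes makes field 6 the User-Agent string, and the client IP is the first word of field 1):

```shell
# Count which client IPs present a User-Agent matching the given pattern.
# ips_for_ua is an illustrative name, not from the original article.
ips_for_ua() {
  ua="$1"; log="$2"
  awk -F'"' -v ua="$ua" '$6 ~ ua {split($1, a, " "); print a[1]}' "$log" \
  | sort | uniq -c | sort -rn
}

# Usage: ips_for_ua Amazonbot /var/log/apache2/access.log
```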

Explicit bans

Use the banning methods below in combination.

Ban via the crawler protocol

# Put the following into sth.com/robots.txt
# robots.txt must sit at the site root
# Better than nothing: well-behaved crawlers honor it, bad crawlers ignore it even when listed
User-agent: Amazonbot
Disallow: /

User-agent: SogouSpider
Disallow: /

User-agent: meta-externalagent
Disallow: /

User-agent: *
Disallow:

Ban with a script
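For instance, the script could read a list of offending IPs and emit the corresponding iptables rules. A sketch under that assumption (ban_ips and banned_ips.txt are illustrative names; the printed commands need root to apply, so they are echoed rather than run directly):

```shell
# Read one IP per line from a banlist and print the iptables commands
# that drop their traffic. ban_ips and the file name are illustrative.
ban_ips() {
  # skip blank lines and comment lines
  grep -Ev '^[[:space:]]*(#|$)' "$1" | while read -r ip; do
    echo "iptables -I INPUT -s $ip -j DROP"
  done
}

# Apply for real (and persist afterwards, e.g. with iptables-save):
# ban_ips banned_ips.txt | sudo sh
```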

Ban via server configuration
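Apache 2.4 can reject requests by User-Agent or client IP directly in the site configuration. A sketch (the bot names and IPs are taken from the sample outputs above; the bad_bot variable name and the file path are illustrative):

```apache
# In the relevant vhost, e.g. /etc/apache2/sites-enabled/your-site.conf.
# BrowserMatchNoCase tags matching User-Agents; "Require not" then rejects them.
BrowserMatchNoCase "Amazonbot|MJ12bot|PetalBot" bad_bot

<Location "/">
    <RequireAll>
        Require all granted
        Require not env bad_bot
        # Individual IPs or whole ranges can be refused as well:
        Require not ip 51.161.86.195
        Require not ip 57.141.16.0/24
    </RequireAll>
</Location>
```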

Then run sudo systemctl reload apache2

Other notes

%2C = , (comma)
%2F = / (slash)
%7C = | (pipe)
%3A = : (colon)
%3F = ? (question mark)
# allow ssh before enabling ufw, or you'll lock yourself out
sudo ufw allow ssh

sudo ufw allow 80/tcp
sudo ufw allow 443/tcp