Rate Limiting in Nginx with OpenResty

To keep burst traffic from overwhelming the backend, you can cap the QPS with OpenResty's resty.limit.count module:

    location ~ /(bidding\.do)$ {
        access_by_lua_file conf/limit.lua;
        proxy_pass http://backend;
    }

Here http://backend refers to the upstream block that load-balances the backend servers.
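
That upstream block is not shown in the article; a minimal sketch, with hypothetical server addresses, might look like this:

    upstream backend {
        # hypothetical backend addresses; replace with your real servers
        server 10.0.0.1:8080;
        server 10.0.0.2:8080;
    }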

Add the following to the http block:

    init_by_lua_block {
        require "resty.core"
    }

    lua_shared_dict my_limit_store 100m;

The limit.lua file that implements the rate limiting is as follows:

    local limit_count = require "resty.limit.count"

    -- allow at most 500 requests per 1-second window
    local lim, err = limit_count.new("my_limit_store", 500, 1)
    if not lim then
        ngx.log(ngx.ERR, "failed to instantiate a resty.limit.count object: ", err)
        return ngx.exit(204)
    end

    -- use the current time (to the second) as the key, so all requests
    -- arriving in the same second share one global counter
    local key = os.date("%c")
    local delay, err = lim:incoming(key, true)

    if not delay then
        if err == "rejected" then
            -- over the limit: answer with 204 instead of an error page
            ngx.header["X-RateLimit-Limit"] = "500"
            ngx.header["X-RateLimit-Remaining"] = 0
            return ngx.exit(204)
        end
        ngx.log(ngx.ERR, "failed to limit count: ", err)
        return ngx.exit(204)
    end

    -- on success the second return value is the remaining quota in this window
    local remaining = err

    ngx.header["X-RateLimit-Limit"] = "500"
    ngx.header["X-RateLimit-Remaining"] = remaining

You can also use the following approach to pass through only a proportion of the traffic:

    location @fail {
        proxy_pass http://fail;
    }

    location @bidder {
        proxy_pass http://backend;
    }

    location ~ /(bidding\.do)$ {
        # pass through 50% of the traffic
        content_by_lua '
            local abtest_num = 50
            local num = math.random(100)
            if num <= abtest_num then
                ngx.exec("@bidder")
            else
                ngx.exec("@fail")
            end
        ';
    }
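
As with http://backend, the http://fail name assumes a matching upstream block defined elsewhere in the configuration; a minimal sketch, with a hypothetical address for the server that handles the rejected share of traffic:

    upstream fail {
        # hypothetical address of the fallback server
        server 10.0.0.3:8080;
    }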
