How to limit the connection speed per IP with nginx?


Comments

  • kbeezie Member
    edited December 2012

    @connercg Yeah, that second one is mine. The directive naming I used should be fine for 0.9.* and newer. I mainly apply the limitations to dynamic requests, i.e. the actual passes back to php-fpm or python/uwsgi, since nginx by itself tends to handle whatever you throw at it (roughly along the lines of the sketch below).
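
    A minimal sketch of that kind of setup, assuming the stock limit_conn/limit_rate directives; the zone name, socket path, and numbers are illustrative, not from the thread (limit_conn_zone needs nginx 1.1.8+, older builds used the limit_zone syntax):

    ```
    http {
        # shared-memory zone keyed by client IP (name and size are example values)
        limit_conn_zone $binary_remote_addr zone=perip:10m;

        server {
            listen 80;

            # static files are left unthrottled; only dynamic requests are limited
            location ~ \.php$ {
                limit_conn perip 5;      # at most 5 concurrent connections per IP
                limit_rate 200k;         # each connection throttled to ~200 KB/s
                include fastcgi_params;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                fastcgi_pass unix:/var/run/php5-fpm.sock;   # placeholder socket path
            }
        }
    }
    ```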

  • kbeezie Member
    edited December 2012

    Seems a little complicated and not nginx-related, let alone web-specific (i.e. that is for user shells).

  • Corey Member
    edited December 2012

    @Zen said: Pretty ignorant thing to say, "10mbit per visitor is not a lot" - lol.

    It isn't, especially since not every user is going to be consuming 10mbit at the same time.

    'Imagine 10 users online at same time using 10mbit'.

    Does every user that connects to your site constantly download stuff? I would think not. They download something for half a second, and then they are gone.

    Unless you are running a site that offers a big file for download and 100 people are grabbing it at once, this shouldn't be a big deal. Even then, limiting just means each download takes longer, so you end up with more concurrent downloads and the limiting has done no good.

  • Mun Member
    edited December 2012

    My question is why limit them at all; at that point you are just creating extended wait times for your users. If only one person is on, they will still get just 2mbit of connection. Not only does this create a bottleneck and increase the load on your server (connections stay open longer), your site will also get a lower ranking due to the slower speed. If anything, I would let the server serve as fast as it can, or move the images to a CDN service such as CloudFlare so the load is handled by the CDN rather than your servers.

  • I'll probably confuse things here, but if you are serving images en masse, you want to run front-end caching on a separate server:

    Like this ---- [VPS] ---> [REAL SERVER]

    The VPS can leave speed wide open, full wire speed, except for the cacheable elements, where you throttle (a rough sketch of this follows below).

    Big image-hosting sites love Varnish and nginx, often together, for exactly this type of deployment.

    The CDN recommendation should be considered, but you can accomplish the same offloading with one or several VPSes instead; think under $20 a month.
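
    A rough nginx-only sketch of that front-end layout, using proxy_cache rather than Varnish (hostnames, cache path, sizes, and the throttle value are placeholders, not from the thread):

    ```
    # on the front-end VPS, inside the http {} block
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=imgcache:50m
                     max_size=5g inactive=7d;

    server {
        listen 80;
        server_name img.example.com;               # placeholder hostname

        location / {
            proxy_pass http://origin.example.com;  # the "real server" behind the VPS
            proxy_cache imgcache;
            proxy_cache_valid 200 301 302 7d;      # keep good responses for a week
            proxy_set_header Host $host;
            limit_rate 300k;                       # optional per-connection throttle
        }
    }
    ```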

  • Just a thought: try CloudFlare, and if you don't like it, move to pub's idea. The nice thing about CloudFlare is that it is perfect for sites like yours.

  • tommy Member
    edited December 2012

    -removed

  • Heck, I was being over-generous on price above. There are bunches of ever-cheaper VPSes out there today. You could pull this off even cheaper, but it depends on your disk requirements more than anything.

    If you are rolling your own VPS-CDN-style solution, you should look at where your viewers are coming from and find a provider tightly meshed with the prominent companies serving consumers there (think cable and telco companies).

    If your customers are, say, all in the United States, then we typically break that into two sites: far east coast and far west coast.

    In some metros, tying in via peering exchanges is another method, but you want a provider with an upstream at that peering exchange. This works in bigger cities like New York and LA.

  • One way is to have a PHP page process the request and then hand the file back to the webserver, in this case nginx, using an X-Sendfile-style header (in nginx it is actually X-Accel-Redirect):

    http://wiki.nginx.org/XSendfile

    There are options to limit the rate at which each file is sent; this is more for serving larger files, though (see the sketch after this comment).

    pubcrawler's and ramnet's suggestions should be tried first, although I'd probably increase limit_rate to 200k-300k and reduce the number of connections to 4 or 5.
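
    A minimal sketch of the nginx side of that X-Accel-Redirect approach; the /protected/ prefix, file path, and rates are made-up examples. After its checks, the PHP script would send a header such as X-Accel-Redirect: /protected/file.zip and nginx then serves the file:

    ```
    # internal location: only reachable via an X-Accel-Redirect header
    # returned by the backend, never directly by the client
    location /protected/ {
        internal;
        alias /var/www/files/;   # real files live here (placeholder path)
        limit_rate_after 1m;     # first 1 MB at full speed
        limit_rate 300k;         # then throttle each download to ~300 KB/s
    }
    ```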

  • Interesting feature there, @FluX :) New to me; I might give that a spin when the dust settles here.

  • @Amitz
    Here is what you need.
    https://github.com/bigplum/Nginx-limit-traffic-rate-module
    This module can key the rate limit not only on the client IP but also on other conditions, like uri, request_uri, host, etc. For more on this, see the link above (a rough usage sketch follows).
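
    For reference, the module's README shows usage roughly like the following (directive names and values are reproduced from memory, so verify them against the repository; unlike the stock limit_rate, the rate here is shared across all connections matching the key):

    ```
    http {
        # zone keyed by client IP, 32 MB of shared memory (example values)
        limit_traffic_rate_zone rate $remote_addr 32m;

        server {
            location /download/ {
                # total bandwidth for this key (per IP) is capped at ~20 KB/s,
                # split among however many connections that IP has open
                limit_traffic_rate rate 20k;
            }
        }
    }
    ```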

  • Starry Member, Host Rep

    @LazyZhu
    Hello, small grey wolf-_-#

  • Thank you all!
    Very interesting options, especially the Nginx-limit-traffic-rate-module mentioned by LazyZhu. I will have to do some man-page digging over the next few days, I guess! :-)
