
Too many connections to MySQL

emdad Member
edited September 2016 in Help

Hello All,

I am having a delicate situation. One of my apps is getting a lot of hits (I mean really a lot). While I'm happy about the traffic, MySQL on my server is getting overloaded and I'm receiving "can't connect" messages. It's a Leaseweb VPS (the 2GB one) with a basic apache, mysql, php setup. No fine tuning, nothing special.

What do you suggest I do to prevent this & improve performance? I have a higher budget to move my DB, if that's the only solution. Would love to hear from you pros in this field.

Thanks in advance.

Comments

  • first: optimize the db if possible; look at whether your app is making proper use of indexes.

    do some research into whether you can lower the number of mysql requests your app makes at all, by using proper caching algorithms for php: object caching, redis, memcache, etc.

    also you may simply raise the connection limit of mysql, lower some timeouts, raise buffers/caches, and look into possible memory consumption. have mysql create temp tables in ram and not on disk to minimize io-wait, and optimize the buffering/flushing methods of innodb if used.
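    for illustration, a rough sketch of those knobs as live settings (the values are placeholders, not tuned recommendations; persist anything that helps in my.cnf so it survives a restart):

        # raise the connection limit (the default is 151)
        mysql -e "SET GLOBAL max_connections = 500;"
        # let larger temp tables stay in ram before spilling to disk
        mysql -e "SET GLOBAL tmp_table_size = 67108864; SET GLOBAL max_heap_table_size = 67108864;"
        # relax innodb flushing: sync the log about once a second instead of on every commit
        mysql -e "SET GLOBAL innodb_flush_log_at_trx_commit = 2;"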

    Thanked by 1 emdad
  • racksx Member
    edited September 2016

    Hello,
    Are you using standard MySQL? Or MariaDB? Percona?

    have you tried to check your actual connections?

        echo "show variables like 'max_connections';" | mysql

    Thanked by 1 emdad
  • A few other thoughts

    • If you're using TCP rather than a UNIX socket, ensure only your desired IPs can access it. Preferably use iptables to drop anything outside those IP ranges... that'll prevent connections being made at all (see the sketch after this list)

    • Increase max_connections

    • Optimise if possible

    • See if you have any long-running/often-accessed processes where the DB connection can be held open for a while, e.g. before and after a call over a network.
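    A minimal iptables sketch of the first point above (203.0.113.10 stands in for your app server's IP):

        # allow MySQL TCP connections only from the app server,
        # drop everything else before it reaches mysqld
        iptables -A INPUT -p tcp --dport 3306 -s 203.0.113.10 -j ACCEPT
        iptables -A INPUT -p tcp --dport 3306 -j DROP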

    Thanked by 1 emdad
  • Thanks for the ideas, guys. Trying to implement what you suggested, starting with max_connections; I'll come back to this post on Saturday, when the same or more traffic is expected.

  • @emdad if you are already paying 10e then another euro will get you an avoton 8 core supermicro dedi with 8GB RAM and a 120GB (or if you are lucky 160) SSD and GigE (~250mbit)
    https://www.lowendtalk.com/discussion/92591/dedibox-limited-available#latest

  • MikePT Moderator, Patron Provider, Veteran

    Hire me and I'll tweak your setup

  • A good place to start is looking at the queries your app is making and seeing if you can reduce the number of calls.
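    For example (the posts table and ids are hypothetical), turning N round trips into one batched query:

        # instead of one round trip per row...
        mysql -e "SELECT title FROM posts WHERE id = 1;"
        mysql -e "SELECT title FROM posts WHERE id = 2;"
        # ...fetch all the rows in a single query
        mysql -e "SELECT id, title FROM posts WHERE id IN (1, 2, 3);"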

  • @mycosys said:
    @emdad if you are already paying 10e then another euro will get you an avoton 8 core supermicro dedi with 8GB RAM and a 120GB (or if you are lucky 160) SSD and GigE (~250mbit)
    https://www.lowendtalk.com/discussion/92591/dedibox-limited-available#latest

    Thanks for pointing that out. I'm not so sure I want to get caught up in their bandwidth politics; I have lots of OVH & Kimsufi boxes if I wanna go that route though :)

    @MrGeneral said:
    Hire me and I'll tweak your setup

    ha ha, you're first on my hiring list. But you know, there's that partner I have :)

    @tragic said:
    A good place to start is the queries your app is making and see if you can reduce the number of calls.

    I intend to serve a large chunk of the data from text files rather than the DB, if that's what you mean...

  • raindog308 Administrator, Veteran

    MrGeneral said: Hire me and I'll tweak your setup

    Hire me and I'll twerk your setup.

    Thanked by 2 xaoc, mycosys
  • sman Member
    edited September 2016

    Just run mysqltuner. It's not super intelligent but it will give you a lot of helpful breadcrumbs at a minimum.

    http://mysqltuner.com/

    Tuning MySQL is not rocket science. Just takes a bit of patience and reading.
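    typical usage is something like this (the wget target follows the project's quick-start; the script only reads server status and changes nothing):

        wget http://mysqltuner.pl/ -O mysqltuner.pl
        perl mysqltuner.pl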

    Thanked by 1 emdad
  • **Update:** served data from files instead of the DB for some major APIs of the app, but it didn't help. Changed max_connections to 10k, still not solved, although I received almost 1.5x the hits today compared to yesterday. In addition, IO is a problem now too. Thinking about using redis for the whole setup, everything in memory instead of on HDD.
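    (To sketch the redis idea: a minimal cache-aside pattern in shell, where the key name, table, and 60-second TTL are all made-up examples:)

        # try redis first; on a miss, fall back to mysql and cache the row
        val=$(redis-cli GET "post:42")
        if [ -z "$val" ]; then
            val=$(mysql -N -s -e "SELECT title FROM posts WHERE id = 42;")
            redis-cli SETEX "post:42" 60 "$val"
        fi
        echo "$val"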

  • vfuse Member, Host Rep

    I highly doubt 2GB of ram is enough to handle 10k mysql connections, would probably need at least 32GB.
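    Rough arithmetic behind that estimate (the per-connection figure is an assumption and varies with config): each connection gets a thread stack plus its own sort/join/read buffers, easily ~3 MB, so 10,000 connections × ~3 MB ≈ 30 GB.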

    Thanked by 1 WHT
  • @vfuse said:
    I highly doubt 2GB of ram is enough to handle 10k mysql connections, would probably need at least 32GB.

    but the server never goes down, only MySQL does.

  • vfuse Member, Host Rep

    @emdad said:

    @vfuse said:
    I highly doubt 2GB of ram is enough to handle 10k mysql connections, would probably need at least 32GB.

    but the server never goes down, only MySQL does.

    Probably cause it's running out of memory?

    Thanked by 1 emdad
  • look into some stats to see if you run into the connection limit at all (phpmyadmin shows max used connections, for instance)

    or use the aged tuning-primer.sh, which will also calculate some possible memory consumption to work with.

    if you run into memory issues it might help not to raise but to cut some caches.

    serving data from files may make things worse; as you describe it, it will hit IO much harder and probably delay mysql operations even further, so existing, simply-waiting connections will block new ones.
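    a quick way to check that headroom from the shell (the same numbers phpmyadmin shows):

        # compare the high-water mark against the configured limit
        mysql -e "SHOW STATUS LIKE 'Max_used_connections'; SHOW VARIABLES LIKE 'max_connections';"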

    Thanked by 1 emdad
  • @vfuse said:

    @emdad said:

    @vfuse said:
    I highly doubt 2GB of ram is enough to handle 10k mysql connections, would probably need at least 32GB.

    but the server never goes down, only MySQL does.

    Probably cause it's running out of memory?

    I checked, and maybe that's the reason, because I had almost 0 free memory. So I'm moving to an OVH dedi (a special dedi promo that was out a few days back).

    @Falzo said:
    look into some stats to see if you run into the connection limit at all (phpmyadmin shows max used connections, for instance)

    or use the aged tuning-primer.sh, which will also calculate some possible memory consumption to work with.

    if you run into memory issues it might help not to raise but to cut some caches.

    serving data from files may make things worse; as you describe it, it will hit IO much harder and probably delay mysql operations even further, so existing, simply-waiting connections will block new ones.

    Trying to use a redis cache to avoid the IO issue. Will it work? Is there any way to stress test (simulate thousands of GET requests)? Meanwhile I'll work with the tuning-primer.sh script.

  • rincewind Member
    edited October 2016

    emdad said: Is there any way to stress test (simulate thousands of GET requests)?

    ApacheBench (ab). Use the -n and -c options to set the total number of requests and the concurrency. It can output to HTML/CSV/GNUplot formats.

    BTW, if you are benchmarking, do it over localhost (e.g. ab -n 1000000 -c 10000 127.0.0.1:80/html/path) to avoid flooding the network.
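    For the GNUplot output mentioned above, ab can also dump per-request timings to a TSV file (results.tsv is a placeholder name):

        # smaller localhost run that also writes gnuplot/TSV data for graphing
        ab -n 100000 -c 1000 -g results.tsv http://127.0.0.1/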
