11
PaaPsik
6y

Working with actual BigData. Will be ''promoted'' to a new team where I'll work on a system written in PHP + MySQL with literally millions of requests and database rows. We're currently seeing server crashes around once a week at peak usage. Stack is a VPS with 64 GB RAM + I don't know how many CPUs/cores. Apache, PHP, MySQL. Best ways to optimise and adapt in this case? Kafka? RabbitMQ? Nginx? More hardware?

Comments
  • 1
    How many requests do you have at peak? Do the MySQL queries you serve return individual responses per user, or more like content pages (aka the same result for everyone)? Do you know if it's a RAM or a CPU problem? Which PHP version?
  • 1
    You could throw more resources at it (RAM, SSD, etc.) or work on query optimisation and request minimisation.

    If it really is BigData, even moving off MySQL would probably be a good idea.
  • 0
    @kolaente not sure how many yet. Results are based on conditionals and A/B testing, so caching is not an option, I think. The guy that works on it said the DB locks up because of too many requests
  • 2
    Is it just reads or do you do writes too? Do you use views?

    Index creation is a good thread to start pulling...
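A quick way to see what an index buys you. This is a sketch using Python's built-in SQLite as a stand-in for MySQL (its `EXPLAIN QUERY PLAN` shows the same scan-vs-index distinction as MySQL's `EXPLAIN`); the `requests` table and `user_id` column are made-up examples, not from the rant:

```python
import sqlite3

# SQLite stands in for the real MySQL instance here: its planner output
# shows the same idea as MySQL's EXPLAIN -- whether a query scans the
# whole table or seeks through an index.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE requests (id INTEGER PRIMARY KEY, user_id INTEGER, payload TEXT)")

def plan(sql):
    # Return the planner's description of how it would run `sql`.
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()[0][3]

query = "SELECT * FROM requests WHERE user_id = 42"
p1 = plan(query)   # before the index: a full table scan
conn.execute("CREATE INDEX idx_requests_user ON requests (user_id)")
p2 = plan(query)   # after: a search using idx_requests_user
print(p1)
print(p2)
```

On MySQL the workflow is the same: run `EXPLAIN` on a slow query, and if it shows a full scan on a large table, add an index on the filtered column.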
  • 1
    Any cache, Varnish or other shit?
  • 0
    @vane I don't think so at the moment, but I implemented memcached on my project recently, so I could probably cache something and get something out of it
  • 0
    @rsync I'll check it out, thanks
  • 0
    @mmcorreia I don't know yet about these, but we use views and indexes in other projects, so it would be smart to assume the same for this project
  • 1
    @pionell cool, add caching to the most expensive queries; you can start small with one query and see the results fast
  • 1
    App should be stateless as much as it can be.
  • 0
    @vane what do you mean by that?
  • 2
    You might want to try partitioning (MySQL 8.0) and stored procedures. Those are essential when working with large databases. Yours is kinda medium, but it should still benefit from them.
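In MySQL partitioning is DDL (`PARTITION BY RANGE ...`), and the payoff is partition pruning: a query filtering on the partition key only touches the matching slice. A toy Python model of that idea, with made-up rows bucketed by year:

```python
from datetime import date

# Toy model of partition pruning: rows are bucketed by the partition
# key (year here), and a query that filters on that key only scans
# the matching bucket instead of every row in the table.
partitions = {}   # year -> list of rows

def insert(row):
    partitions.setdefault(row["created"].year, []).append(row)

def query_year(year):
    # Prunes down to a single partition.
    return partitions.get(year, [])

insert({"id": 1, "created": date(2023, 5, 1)})
insert({"id": 2, "created": date(2024, 2, 9)})
print(query_year(2024))   # only the 2024 bucket is scanned
```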
  • 2
    Get an ops guy to actually diagnose bottlenecks
  • 2
    Check how many requests you get and how frequent they are. Nginx could help with that. Is it on PHP 7.2? If it is still on 5.x, upgrade it; you'll see some speed gains.

    What MySQL server is being used? You could give Percona a spin as a replacement. Going with NoSQL will probably require some refactoring, but it might help you out as well.

    You could also try Varnish, of course, to take load off the server. Also look at your load and maybe move away from a monolithic stack and put it on multiple servers with a load balancer in front of it.
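The load balancer in front would typically be nginx with an `upstream` block, whose default policy is round robin. A minimal Python sketch of that policy (the backend hostnames are made up):

```python
from itertools import cycle

# Round-robin backend picker -- the same default policy an nginx
# upstream block applies when several app servers sit behind it.
backends = cycle(["app1:9000", "app2:9000", "app3:9000"])

def pick_backend():
    return next(backends)

assigned = [pick_backend() for _ in range(6)]
print(assigned)   # each backend gets every third request
```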
  • 3
    App server: upgrade PHP, use OPcache/APC to make code run faster.
    Cache server (I personally would put the cache on a separate server): have memcache to cache expensive queries, have Varnish to cache static resources (like assets), have Elasticsearch to simplify searches.
    DB server (I personally would have the DB on a separate server): optimise the DB using indexes. Log all the queries and then look at each one using `explain`; add indexes and/or optimise them.
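The "log all the queries, then look at each one" step is easier if you normalise literals out of the logged statements first, so identical query shapes group together and you know which one to `explain` first. A Python sketch with made-up log lines:

```python
import re
from collections import Counter

# Made-up sample of logged statements (in practice, the slow query log).
log = [
    "SELECT * FROM users WHERE id = 17",
    "SELECT * FROM users WHERE id = 42",
    "SELECT * FROM orders WHERE user_id = 42 AND status = 'open'",
    "SELECT * FROM users WHERE id = 7",
]

def normalise(sql):
    # Replace literals with placeholders so identical shapes match.
    sql = re.sub(r"'[^']*'", "?", sql)   # string literals -> ?
    sql = re.sub(r"\b\d+\b", "?", sql)   # numeric literals -> ?
    return sql

top = Counter(normalise(q) for q in log).most_common()
print(top[0])   # the query shape worth running EXPLAIN on first
```

This is roughly what tools like pt-query-digest do at scale.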
  • 1
    @pionell don’t pass big user state between requests, e.g. in sessions
  • 2
    @vane sessions can be moved to the DB or, better, memcache...
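The stateless idea in one sketch: the client only ever holds a session id, and the state itself lives in a shared store, so any app server behind the load balancer can handle the next request. A dict stands in for memcached/the DB here, and the session contents are made up:

```python
import json
import uuid

# Shared session store sketch: a dict stands in for memcached or a
# sessions table. The app server keeps nothing locally -- only the
# session id travels with the request (e.g. in a cookie).
store = {}

def save_session(data):
    sid = str(uuid.uuid4())
    store[sid] = json.dumps(data)   # serialised once, server-side
    return sid                      # only this goes back to the client

def load_session(sid):
    raw = store.get(sid)
    return json.loads(raw) if raw else {}

sid = save_session({"user_id": 42, "cart": [1, 2, 3]})
print(load_session(sid))
```

In PHP the equivalent is pointing the session handler at memcached instead of local files, so sessions survive a server being swapped out.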
  • 0
    @Wack yeah cache the sessions
  • 0
    @fml89 running PHP 7.1; it's not a web solution, this is more of a big API
  • 0
    @rsync thanks, I'll look into it first thing when I get added to the team
  • 0
    📌