Well, for starters there was a cron to restart the webserver every morning.

The product was 10+ years old and written in PHP 5.3 at the time.

Another cron ran every 15 minutes to "correct" data in the DB. Just regular data, not from an import or anything.
Gotta have one of those self-healing systems, I guess.

Yet another cron (there were lots) ran every day from 02:00 until 4-ish to generate the newest XLSX report. It almost took out the entire thing every time. MySQL at 100%. CPU? Yes. RAM? You bet.
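For flavor, the cron setup described above might have looked roughly like this in the crontab. The schedules are from the story; the commands, paths, and the exact restart time are my guesses:

```shell
# reconstructed crontab (hypothetical paths and commands)

# restart the webserver every morning (time is a guess)
0 6 * * *    /usr/sbin/service apache2 restart

# "correct" regular data in the DB every 15 minutes
*/15 * * * * /usr/bin/php /var/www/app/cron/fix_data.php

# nightly XLSX report, kicks off at 02:00 and grinds until ~04:00
0 2 * * *    /usr/bin/php /var/www/app/cron/generate_report.php
```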

Luckily I wasn't too involved at the time. But man, that thing was the definition of legacy.

Fun fact: every request was performed twice! The first request gave the already logged-in client a unique access token. The second request then processed the actual request with the (just issued) access token, which was then discarded. Security, I guess.
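A toy reconstruction of that double-request flow, just to make the pattern concrete. The class and method names are invented (the real system was PHP over HTTP); only the shape matches: round trip 1 issues a token, round trip 2 consumes it, and the token is single-use.

```python
import secrets

class SingleUseTokenGate:
    """Sketch of the legacy flow: request 1 issues a token,
    request 2 must present it, and the token is then discarded."""

    def __init__(self):
        self._valid = set()

    def issue_token(self):
        # round trip 1: the already logged-in client asks for a token
        token = secrets.token_hex(16)
        self._valid.add(token)
        return token

    def handle(self, token, action):
        # round trip 2: the actual work, gated on the just-issued token
        if token not in self._valid:
            raise PermissionError("unknown or already-used token")
        self._valid.discard(token)  # single use: discard immediately
        return f"did: {action}"

gate = SingleUseTokenGate()
t = gate.issue_token()                     # request 1
result = gate.handle(t, "load dashboard")  # request 2
```

So every logical action cost two HTTP round trips, and a replayed token would simply be rejected.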

I don't know why it was built this way. It just was. I didn't ask. I didn't want to know. Some things are better left undisturbed. Just don't anger the machine. I became superstitious for a while. I think, in the end, it helped a bit: it felt like communicating with an alien monster when all you have is a trumpet and chewing gum. Gently does it.

Oh, and "Sencha ExtJS 3" almost gave me PTSD lol (it's an ancient JS framework). Followed by SOAP's WSDL cache. And a million other things.

  • 2
    Without taking anything away from your travails, I just want to point out that narratives like this fuel the unjustified hate that PHP gets. Not only does it become the poster boy for these challenges (unworkable legacy), it also makes people go out of their way to resist any attempt to show younger developers how much the language and ecosystem have improved.
  • 2
    Oh lol, sounds almost exactly like a service I maintained before, but we used Java/Spring on the backend and Angular on the frontend. Otherwise, same thing!
  • 2
    @Nmeri17 You are right. If a programmer does not care about the craft, they can make a mess in any language. Writing good, readable bash scripts, for example, is certainly doable, but it takes a while.

    This is more about the overall bad decisions that led to this system becoming a legacy burden. Not so much about PHP 5.3.

    On the bright side, PHP 8.2 will be released in a couple of days (readonly classes ♥)!
  • 0
    Unrelated, but I still have trouble creating a strategy for exporting data. You mentioned that the MySQL export process consumed 100% CPU while creating the Excel file. I face a similar scenario: my client needs a daily XML feed containing all the website data. Maybe he feeds it to some aggregation service, but querying everything and putting it into a single XML file feels so inefficient every time we do it. Any better approaches for exporting data without affecting the whole server?
  • 1
    @themissingbrace I'm not sure... Broadly speaking, the aggregated data for an export could be fetched once and then updated on change. This could be done with SQL stream readers or DB lifecycle events (aka event handlers/observers). Or you could use "view tables" instead of event handlers. Either way, this would result in a lot of SQL queries (and migrations).

    The "export view" DB table could then be dumped as the XML feed. This might have a negative performance impact (single-threaded app, transactions, slow filtering in event handlers, ...).

    Another idea would be to create a SQL cluster on a separate machine and sync to the new replica DB, then load the feed data from the replica, using a read-only SQL user. With this approach, the application must be able to handle multiple DB connection URLs. This results in a more complex and costly deployment (monitoring, updates, migrations, ...). But maybe you can pitch it as a new and fancy "report microservice" 😄

    Good luck!
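    To illustrate the "export view" idea from the comment above, here is a minimal sketch. It uses SQLite instead of MySQL purely so it is self-contained, and the table, view, and column names are made up; the point is just that the feed query lives in a DB view, and dumping it to XML becomes a dumb row-by-row serialization:

```python
import sqlite3
import xml.etree.ElementTree as ET

def dump_view_as_xml(conn, view_name):
    """Serialize every row of a view (or table) into a flat XML feed."""
    cur = conn.execute(f"SELECT * FROM {view_name}")
    columns = [d[0] for d in cur.description]
    root = ET.Element("feed")
    for row in cur:
        item = ET.SubElement(root, "item")
        for col, val in zip(columns, row):
            ET.SubElement(item, col).text = "" if val is None else str(val)
    return ET.tostring(root, encoding="unicode")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER, name TEXT, price REAL)")
conn.executemany("INSERT INTO products VALUES (?, ?, ?)",
                 [(1, "Widget", 9.99), (2, "Gadget", 19.5)])
# the "export view" keeps the feed query out of application code;
# on MySQL this could be a materialized table kept fresh by triggers
conn.execute("CREATE VIEW export_feed AS SELECT id, name, price FROM products")
xml_feed = dump_view_as_xml(conn, "export_feed")
```

    The nightly cron then only reads the view and writes the file, instead of re-running the expensive aggregation against the live tables.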