Nmeri17 · 64d: Without taking anything away from your travails, I just want to point out that narratives like this fuel the unjustified hate that PHP gets. It's not just used as a poster boy for challenges like this (unworkable legacy code); it'll make people go out of their way to resist any attempt by younger developers to learn how much the language and ecosystem have improved by now.
Hazarth · 64d: Oh lol, sounds almost exactly like a service I maintained before, but we used Java/Spring on the backend and Angular on the frontend. Otherwise, same thing!
@Nmeri17 You are right. If a programmer does not care about the craft, they can make a mess in any language. Writing a good, readable bash script is certainly doable, for example, but it takes a while.
This is more about the overall bad decisions that led to this system becoming a legacy burden. Not so much about PHP 5.3.
On the bright side, PHP 8.2 will be released in a couple of days (readonly classes ♥)!
Unrelated, but I still struggle with creating a strategy for exporting data. You mentioned that the MySQL export process consumed 100% CPU while creating the Excel file. I face a similar scenario: my client needs a daily XML feed containing all the website data. Maybe he feeds it to some aggregation service, but querying everything and putting it into a single XML file feels so inefficient every time we do it. Any better approaches for exporting data without affecting the whole server?
@themissingbrace I'm not sure... Broadly speaking, the aggregated data for an export could be fetched once and then updated on change. This could be done with SQL stream readers or DB lifecycle events (i.e. event handlers/observers). Or you could use "view tables" instead of event handlers, though that would result in a lot of SQL queries (and migrations).
The "export view" DB table could then be dumped as the XML feed. This might have a negative performance impact (single threaded app/transactions/slow filtering in event handlers/...).
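A minimal sketch of the "export view" idea, using an in-memory SQLite database as a stand-in for MySQL (the `orders`/`export_view` tables, the trigger name, and the feed layout are all made up for illustration): a trigger keeps a pre-aggregated export table in sync on every write, so dumping the XML feed is a single flat `SELECT` with no heavy aggregation at export time.

```python
import sqlite3
import xml.etree.ElementTree as ET

# In-memory SQLite stands in for MySQL; all names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL);

    -- Pre-aggregated "export view" table, kept in sync by a trigger
    -- instead of re-querying everything when the feed is generated.
    CREATE TABLE export_view (order_id INTEGER, customer TEXT, total REAL);

    CREATE TRIGGER sync_export AFTER INSERT ON orders
    BEGIN
        INSERT INTO export_view VALUES (NEW.id, NEW.customer, NEW.total);
    END;
""")

conn.execute("INSERT INTO orders (customer, total) VALUES ('alice', 9.99)")
conn.execute("INSERT INTO orders (customer, total) VALUES ('bob', 4.50)")
conn.commit()

def dump_feed(conn) -> str:
    """Dump the export table as the XML feed -- one flat scan, no joins."""
    root = ET.Element("feed")
    for order_id, customer, total in conn.execute(
            "SELECT order_id, customer, total FROM export_view"):
        item = ET.SubElement(root, "item", id=str(order_id))
        ET.SubElement(item, "customer").text = customer
        ET.SubElement(item, "total").text = f"{total:.2f}"
    return ET.tostring(root, encoding="unicode")

print(dump_feed(conn))
```

The trade-off mentioned above still applies: the trigger adds a small cost to every write in exchange for a cheap export.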
Another idea would be to create a SQL cluster on a separate machine and sync to the new replica DB. Then load the feed data from the replica, using a read-only SQL user. With this approach, the application must be able to handle multiple DB connection URLs. This results in a more complex and costly deployment (monitoring, updates, migrations, ...). But maybe you can pitch this as a new and fancy "report microservice" 😄
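The "multiple DB connection URLs" part of the replica approach can be sketched as a tiny router that picks a connection string per query purpose (the class, purposes, and URLs are invented for illustration, not from any specific framework):

```python
from dataclasses import dataclass

@dataclass
class DbRouter:
    """Picks a connection URL per query purpose (hypothetical sketch)."""
    primary_url: str   # read/write, serves the normal application traffic
    replica_url: str   # read-only replica, serves heavy exports/reports

    def url_for(self, purpose: str) -> str:
        # Heavy feed/report queries go to the replica, so a full export
        # cannot saturate the primary's CPU.
        return self.replica_url if purpose == "export" else self.primary_url

router = DbRouter(
    primary_url="mysql://app@db-primary/shop",
    replica_url="mysql://readonly@db-replica/shop",
)
print(router.url_for("export"))  # replica handles the XML feed dump
```

Replication lag is the catch: the feed may be seconds or minutes behind the primary, which is usually fine for a daily export.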
JsonBoa: We were still using python 2.7 waaay into 2020 - It had been heralding the impending doom since 2018 and final...
ZaLiTHkA: So.. the software team I work with still maintain Java 6 apps.. meanwhile management keep asking when we're "m...
nebula: IMHO technical dept is kind of like smoking cigarettes for some decades. You were told that shit will hit the...