16 · exerceo · 277d

A developer might think, "now that computers have more RAM and plenty of CPU power, I am free to create resource-hungry, inefficient software!"

This sets a dangerous precedent.

Computers only get faster in practice if software stays efficient while processors speed up and RAM grows.

If computers get more powerful but software also gets more bloated and less efficient, the hardware gains are cancelled out.

Software also has to be efficient to extend battery life on portable devices.

Jody Bruchon video: https://youtube.com/watch/...

Comments
  • 15
    Yes, there is an adage dating back to 1995 called Wirth's law: "software gets slower faster than hardware gets faster".
    https://en.m.wikipedia.org/wiki/...
  • 1
    This Jody guy is a complete moron.
  • 8
    What I've noticed is that programmers don't deliberately write resource-hungry software.

    Many write software that "seems" to work fine and fast…
    and that is because they're developing on machines with a lot of RAM and a fast CPU.
    But when that software is then run on a slower machine, performance issues arise and the engineers realize they've been duped by their machines (and their lack of competence at writing optimized software).
  • 2
    There is also a pervasive addiction to black-box algorithms.
    Not every problem needs a GenAlg or a neural network or a branch-and-bound solver, but those are soooo easy to use that you will find them even where performance should be a bigger concern.
    Then there is plain old laziness, hurriedness and ignorance, where people write a brute-force algorithm just because it is simple to understand.
    I once saw a kid compute a warehouse layout plan using 6 nested "for each" loops; at least half of all iterations were unnecessary, probably many more.
    But then, he only had about 1 h to write it, and I took longer than that just to explain to him that "algorithm" is not just another word for "code".
    (We later used a nice greedy+dynamic method that sped things up 3700%; a toy sketch of this kind of nested-loops-versus-dynamic contrast appears after the comment thread.)

    Thus I would blame the industry, not devs, for the rush toward quick-to-develop-but-inefficient-in-production software.
    When energy becomes expensive again, this trend will revert.
  • 7
    Developers compete to finish as much work as possible per unit of time. The outcome of this race affects each dev's career progression individually, whereas performance is always a collective burden. To a lesser extent this is also the case with code quality. Any decent dev is perfectly capable of optimizing their code; they just never see any benefit to doing so.
  • 4
    I'll take easy-to-change, readable code any day over performant code 🪖😖
  • 1
    @cafecortado I left Wirth's Law out of the post to see if someone would bring this up. Thank you for bringing it up!
  • 0
    @SidTheITGuy This Jody guy is a complete genius!
  • 0
    If I have an algo that looks like it will be resource-intensive for a large number of items, I will test it with a large number of items. I like to see the upper bound. I will only do this if I could conceivably see a large number of items, though. What is the point if you only ever process 10 items max?

    When I do test, though, I see how many orders of magnitude I can go. I have a server that is used for process control. It could conceivably serve 200 items. I found the upper bound to be 8500 on the hardware I was testing. The server is built on a 3rd-party protocol library, and that library forced a disconnect at 8500 items per second being read. I am confident we will never see more than 200, but if we do need more we can go way higher. I will tell people the upper bound is 5000 though.

    Seeing how big you can go is kinda fun sometimes.
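
A rough sketch of the "find the upper bound" ramp-up described in the comment above, assuming a hypothetical read_items(n) call that stands in for whatever the real process-control server or 3rd-party protocol library exposes. Nothing below is the commenter's actual code; the 8500 figure is only reused inside the fake test double.

```python
# Hypothetical sketch: double the load until the system under test fails,
# then binary-search for the largest load that still works.

def find_upper_bound(read_items, start=10, limit=1_000_000):
    """Return the largest n for which read_items(n) succeeds (up to limit)."""
    last_ok, n = 0, start
    while n <= limit:                  # exponential ramp-up
        try:
            read_items(n)              # e.g. issue n reads per second against the server
        except Exception:
            break                      # the library forced a disconnect / the test failed
        last_ok, n = n, n * 2
    else:
        return last_ok                 # never failed within the tested range
    lo, hi = last_ok, n                # lo is known to work, hi is known to fail
    while lo + 1 < hi:                 # binary-search the exact breaking point
        mid = (lo + hi) // 2
        try:
            read_items(mid)
            lo = mid
        except Exception:
            hi = mid
    return lo

if __name__ == "__main__":
    # Fake system under test that falls over past 8500 items/s, mirroring
    # the number quoted above; a real run would talk to the real server.
    def fake_read(n):
        if n > 8500:
            raise RuntimeError("forced disconnect")
    print("upper bound found:", find_upper_bound(fake_read))  # -> 8500
```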
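
The warehouse anecdote a few comments up does not include any actual code, so here is a deliberately unrelated toy illustration of the nested-loops-versus-dynamic contrast it describes: the classic maximum-subarray problem solved once with brute-force nested loops and once with a single dynamic-programming pass (Kadane's algorithm). The data and function names are purely illustrative, not the commenter's greedy+dynamic method.

```python
# Toy comparison: the nested-loop version does O(n^2) work, the dynamic-
# programming pass visits each element once; both return the same answer.

def max_subarray_brute(values):
    """O(n^2): try every (start, end) pair of indices."""
    best = values[0]
    for i in range(len(values)):
        total = 0
        for j in range(i, len(values)):
            total += values[j]
            best = max(best, total)
    return best

def max_subarray_dp(values):
    """O(n): Kadane's algorithm, one pass over the data."""
    best = current = values[0]
    for v in values[1:]:
        current = max(v, current + v)  # either extend the current run or start a new one
        best = max(best, current)
    return best

if __name__ == "__main__":
    data = [3, -7, 5, 1, -2, 4, -1]
    assert max_subarray_brute(data) == max_subarray_dp(data) == 8
    print("both methods agree:", max_subarray_dp(data))
```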