12

Is your code green?

I've been thinking a lot about this for the past year. There was recently an article about it on Slashdot.

I like optimising things to a reasonable degree and avoiding bloat. What are some signs of code that isn't green?

* Use of technology that claims to be fast without real expert review and measurement. Lots of tech out there claims to be fast but actually isn't, or gets its speed by saturating resources while being inefficient.
* It uses caching. Many might find that counterintuitive, but in technology it is surprisingly common to see people scale or cache rather than directly fix the thing that's actually expensive, a problem compounded when the cache has weak coverage.
* It uses scaling. Scaling was originally a last resort, for a simple reason: it introduces excessive complexity. Today it's common to see people scale things out rather than make them efficient. You end up needing ten instances when a bit of skill could bring you down to one, which could still scale if needed but likely won't need to.
* It uses a non-trivial framework. Frameworks are rarely fast; most fall in the range of ten to a thousand times slower in terms of CPU usage. Memory bloat may also force the need for more instances. Frameworks written in already slow high-level languages may be especially bad.
* It lacks optimisations for obvious bottlenecks.
* It runs slowly.
* It lacks even basic resource usage measurement.

Unfortunately, smells are not enough on their own, but they are a start. Real measurement and expert review remain the only way to get an idea of whether your code is reasonably green.

I find it not uncommon to see things require tens, hundreds, or thousands of times more resources than needed, if not more.

In terms of cycles that can be the difference between needing a single core and a thousand cores.

This is common in the industry, but not because people didn't write everything in assembly. The cause usually leans toward the opposite extreme.

Optimisations are often easy and don't require writing code in binary. In fact, the resulting code is often simpler: excess complexity and inefficient code tend to go hand in hand. Sometimes a code cleaning service is all you need to enhance your green.

I once rewrote a data-parsing library, a performance hotspot that had to parse a hundred MB, from an interpreted language into C. I measured it and the results were good. The interpreted version had been optimised as much as possible, yet the C version was still at least 50 times faster.

I recently stumbled upon someone's attempt to do the same, and I was able to optimise the interpreted version in five minutes to be twice as fast as the C++ version.

I see opportunities to optimise everywhere in software. A billion kg of CO2 could easily be saved if a few green-code shops popped up. It's also often a net win: faster software, lower costs, lower management burden... I'm thinking of starting a consultancy.

The problem is that after witnessing the likes of Greta Thunberg, if that's what the next generation has in store, then as far as I'm concerned the world can fucking burn, and her generation along with it.

Comments
  • 13
    As people write and run increasingly high-level code, optimization fades into obscurity. It simply isn't possible to optimize code that relies upon something that isn't optimized, or relies upon ridiculous numbers of layers of abstraction. And as for the former, as frameworks and libraries become the norm, going without is seen as bad practice. It becomes impossible to write optimized "green code."

    I firmly believe everyone interested in development should learn C first (and Assembly is a nice second because of what it can teach). Also, any abstraction beyond what's required should be scorned as over-engineering.

    But in an age of NPM and PyTorch, I just don't see that happening. Now people see CPU and memory resources as abundant and wasted if they're not used. That 5 GHz 32-core Threadripper seemingly performs about as fast as a 500 MHz Pentium III. Thanks, bloatware.

    "Resources are cheap; optimizing is expensive."
    Ugh.
  • 5
    That might apply if you write fairly optimal code upfront and if you're aiming for something closer to the literal definition of optimal.

    People don't just use high-level languages and additional layers of abstraction that make things ten times slower at each layer, though unnecessary layers are one common cause of wasted resources. The reality is that a huge amount of optimisation still relies on skill, and if I can get some scripting language to be a hundred times less resource-demanding, that's still a major win.

    Virtually every direction will have some diminishing returns, and where the immediate sweet spot lies varies.

    Your point basically becomes "optimisation is cheap, resources are unaffordable" when, for example, a developer does something in O(n^3) that could be O(log n) through sheer incompetence, and that happens. There's no limit to how slow and inefficient you can make something; O(inf) isn't that uncommon a mistake.
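    As a toy illustration of how much a complexity class matters (my own example, not the exact O(n^3) case): the same membership check done by brute-force scanning versus binary search over sorted data gives identical answers with wildly different work per lookup.

    ```c
    #include <stdio.h>
    #include <stdlib.h>

    /* Naive: scan every element -- O(n) per lookup. */
    static int linear_contains(const int *a, size_t n, int key) {
        for (size_t i = 0; i < n; i++)
            if (a[i] == key)
                return 1;
        return 0;
    }

    static int cmp_int(const void *x, const void *y) {
        int a = *(const int *)x, b = *(const int *)y;
        return (a > b) - (a < b);
    }

    /* Sorted data: binary search -- O(log n) per lookup. */
    static int binary_contains(const int *sorted, size_t n, int key) {
        return bsearch(&key, sorted, n, sizeof *sorted, cmp_int) != NULL;
    }

    int main(void) {
        enum { N = 1000000 };
        int *a = malloc(N * sizeof *a);
        for (int i = 0; i < N; i++)
            a[i] = i * 2;   /* even numbers, already sorted */

        /* Same answer, vastly different work per call. */
        printf("%d %d\n", linear_contains(a, N, 1999998),
                          binary_contains(a, N, 1999998));  /* prints "1 1" */
        free(a);
        return 0;
    }
    ```

    At a million elements that's roughly a million comparisons per lookup versus about twenty, and a cubic version of the same mistake scales far worse.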

    Measurably, I routinely see huge scope for significant savings across the industry.
  • 4
    @Root One common trap with abstractions is that people don't understand what's being abstracted, and that can easily lead to accidental O(n^2) behaviour that isn't even obvious. In C, a simple example is repeated strcat() in a loop.
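    To spell that trap out, here's a small sketch (function names are mine): every strcat() call rescans dst from the start to find its terminator, so appending n pieces costs O(n^2), while simply remembering the end pointer makes it linear.

    ```c
    #include <stdio.h>
    #include <string.h>

    /* Accidentally quadratic: strcat() walks the whole destination
     * on every call just to find where it ends. */
    static char *join_slow(char *dst, const char **parts, size_t n) {
        dst[0] = '\0';
        for (size_t i = 0; i < n; i++)
            strcat(dst, parts[i]);
        return dst;
    }

    /* Linear: remember where the string ends instead of re-finding it. */
    static char *join_fast(char *dst, const char **parts, size_t n) {
        char *end = dst;
        for (size_t i = 0; i < n; i++) {
            size_t len = strlen(parts[i]);
            memcpy(end, parts[i], len);
            end += len;
        }
        *end = '\0';
        return dst;
    }

    int main(void) {
        const char *parts[] = { "green", "-", "code" };
        char a[32], b[32];
        printf("%s %s\n", join_slow(a, parts, 3),
                          join_fast(b, parts, 3));  /* prints "green-code green-code" */
        return 0;
    }
    ```

    Both produce the same string; only the growth rate differs, which is exactly why the bug hides until the input gets big.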

    @RANTSMCPANTS It's even worse with web dev. Bloated client-side frameworks combined with bloated server-side CMS. What could be done in under 50 kB is now 3 MB. What could be done statically (the fastest code is no code) now causes hundreds of useless database requests server-side for a single page view.

    Small algorithmic micro-optimisations won't cut it if the big picture is already fucked up because the chosen tech stack is a pile of shit.
  • 3
    Here I was waiting for the "use my green service" link at the bottom, but you're dead right.

    I don't have low-level languages like C/C++ etc. under my belt, but even in high-level languages I find a lot of underachieving code that would have taken 10 minutes to refactor if it had been done at the time. Instead, it's now going to take me a couple of weeks to decrypt the mess, understand what it's trying to achieve, and rewrite it while making sure it still achieves what it already does.

    Business won't give two hoots about it, though, as they see no difference from the application side. It still does the same thing.
  • 1
    yo imagine optimizing code in 2020

    no one can anymore
  • 0
    This is exactly why I use custom frameworks and often my own version of modules.

    You can't attest to the speed or optimization of other people's code, and it would take just as long (if not longer) to test it all versus just doing it yourself.

    The open source community is great, but lies and misinformation are the rule, not the exception, especially with frameworks.