Getting really tired of newer devs in the OSS world re-creating something that has been around for decades, slapping a flashy logo on it, and saying they invented a "blazing fast", "under 200 LOC" way to do something.

"Under X lines of code!!1" is not impressive. It just means you don't understand how abstraction works.
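The abstraction point can be sketched with a deliberately silly, hypothetical example (not from any real project): a "CSV parser in under 5 LOC" that is tiny only because Python's stdlib `csv` module carries all the real complexity — quoting, escaping, dialects.

```python
import csv
import io

def parse(text: str) -> list[list[str]]:
    """A hypothetical "blazing fast, under 5 LOC" CSV parser.
    The line count is low only because the quoting/escaping logic
    lives in the stdlib csv module, not in these lines."""
    return list(csv.reader(io.StringIO(text)))
```

The "library" is four lines, but the code actually doing the work is orders of magnitude larger — it's just not counted.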

  • 15
    Although I agree with you, less code is generally a good thing. Less code means fewer moving parts and fewer potential bugs. And possibly better performance.

    Recent example: WireGuard
  • 4
    what @netikras said.

    My other thought is it's a game of degrees. I have a number of questions I have to ask myself to arrive at a meaningful answer:

    Is it more efficient with an equivalent and compatible feature set?

    Did the old library have too many concerns?

    Was the previous library algorithmically sound?

    Was the previous library lacking an essential form of composability with up-to-date libraries? (Re: modularization)

    Is there anything essential in the new library left on the cutting room floor?
  • 10
    That's the tool cycle. A tool starts out lean and mean, and easy to use. Then it gets more functionality, weird edge cases, and ends up with 200 CLI options. This isn't testable anymore, and it's overwhelming.

    So people say, fuck it, I need JUST this stuff here, and I'm faster writing it myself than trying to wade through a huge swamp of outdated and/or misleading documentation. That re-starts the cycle.

    It's also why there are e.g. so many static website generators.
  • 4
    Wireguard is better because it simply nuked everything.

    VPN services are a guacamole of protocols, upward/downward compatibility, extensions and so on.

    Wireguard throws it all away and says: Fuck it. If a new version comes out, all endpoints must be updated.

    That's why it's small code-wise - it's a bare, specialized solution fitting the "do one thing and do it well" principle.

    To the original post....

    Many people have an utter misunderstanding of software architecture.

    Saying you wrote a small library which does X instead of a framework Y that does A-Z including X, and getting hyped because of the LOC count... It's dumb.

    The two aren't comparable.

    I'm still not fully convinced of pico libraries consisting of e.g. only one function.

    You'll need a ton of them... A framework usually means that certain guidelines were applied and the framework has a certain flow.

    Tons of mini libraries... Tons of different flows. Tons of versions.

    I guess there is no "perfect" approach, but pico libraries don't mean less hassle at all.

    LOC is, in my opinion, a pretty dumb metric for code quality - too many ifs and buts if you want to get it right.
  • 1
    @Fast-Nop Sounds interesting. Maybe this needs some revolutionary changes.
  • 0
    I would add another pretty important (for me) question:
    - Is it written in an inherently safer language than the beast it replaces? Do I get the benefit that it is less likely to contain exploitable bugs, or that exploits of such bugs are likely to be of lesser severity (like a DoS instead of an RCE)?
  • 0
    and yet there are still gaps in FOSS where there's no *good* solution but 30 bad ones. i've hit several, but i don't know any fast language so i can't even try filling them.