36
Condor
6y

This rant is particularly directed at web designers and front-end developers. If that describes you, please do take a few minutes to read it, and then read it once again.

Web 2.0. It's something that I hate, particularly because the prevailing attitude among web designers seems to be "the client has plenty of resources anyway, and if they don't, they'll just buy more". I'd like to debunk that with an analogy I've been thinking about for a while.

I've got one server in my home, with 8GB of RAM, 4 cores and ~4TB of storage. On it I'm running Proxmox, which is currently using about 4GB of RAM for about a dozen VMs and LXC containers. The VMs take the most RAM by far, while the LXCs are just glorified chroots (which I nonetheless find very intriguing due to their ability to run unprivileged). The average LXC takes just 60MB of RAM: enough for an init, a shell and the service(s) running in it. Just like a chroot, but better.

On that host I expect to be able to run about 20-30 guests at this rate, on 4 cores and 8GB of RAM. Migrating more guests to LXC will improve that number over time. However, I'd like to go further. I once managed to build a Linux system that was just a kernel and BusyBox, backed by the musl C library. Running as a VM, the whole thing consumed only 13MB of RAM, nearly all of it taken by the kernel itself. I could probably have optimized it further by modularizing the kernel, but at the time I didn't, due to its experimental nature. In a chroot, the host's kernel is used, so the same setup in a chroot would consume only kilobytes of RAM. The BusyBox shell would be its biggest RAM consumer, and that is negligible.

I don't want to settle for 20-30 VMs. I want to run hundreds or even thousands of LXCs on 8GB of RAM, because I've seen first-hand with my own builds that it's possible. That mindset matters in web design too. Browsers aren't all that different. More often than not, your website will share its resources with about 50-100 other tabs, because users forget to close their old tabs, are power users, are looking things up on Stack Overflow, or whatever. So that 8GB of RAM shrinks to roughly 80MB for your page. And then modern browsers spawn a separate process for each tab (it seems to be capped at around 20-30 processes, but still), so all the per-process memory needed to render your page is duplicated and comes out of your designated 80MB. Let's say that 10MB is available for the website at most. That's a very generous budget, so let's stick with it, although in reality it would probably be less.

10MB: that's the RAM available to the website you're trying to show. Of course, the user's total RAM is comparatively huge, but your own chunk is much smaller than that. Optimization is key. Does your website really need that much? In third-world countries where internet bandwidth is still in the order of kB/s, 10MB is *very* generous. Back in 2014, when I got into technology and web design, the rule of thumb was that visitors click away after about 7 seconds. Translate that into, say, 10kB/s for those connections: 7 seconds gives you roughly 70kB of network budget for the entire page.
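To put those numbers together, here's a quick back-of-the-envelope sketch in TypeScript. The figures are the same assumptions from the paragraphs above (tab count, per-tab cap, bandwidth), not measurements:

  // budget.ts - rough page-budget arithmetic using the assumptions above
  const totalRamMB = 8 * 1024;   // the user's machine: 8GB of RAM
  const openTabs = 100;          // your page shares it with ~50-100 other tabs
  const perTabShareMB = totalRamMB / openTabs;   // ~80MB per tab
  const usablePageBudgetMB = 10; // generous slice left after per-process browser overhead

  const bandwidthKBps = 10;      // slow connection, ~10kB/s
  const attentionSeconds = 7;    // rule of thumb before visitors click away
  const transferBudgetKB = bandwidthKBps * attentionSeconds;  // 70kB for the whole page

  console.log(`RAM: ~${perTabShareMB}MB share per tab, ~${usablePageBudgetMB}MB realistically usable`);
  console.log(`Transfer: ~${transferBudgetKB}kB before the visitor gives up`);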

Web 2.0, taking 30+ seconds to load a web page, even on a broadband connection? Totally ridiculous. Make your website as fast as it can be; after all, you're playing along with 50-100 other tabs. The faster, the better. The more lightweight, the better. If at all possible, please pursue this goal and make the Web a better place. Efficiency matters.

Comments
  • 11
    It's also about web wankers being unable to understand KISS and to use only the minimum tech stack that is actually required. Part of the reason is that they don't really understand the foundations and compensate for that with bloated shit that can do everything. It's like buying a 40t truck to get to the nearest kiosk instead of walking.

    I remember the CMS hype around 2010 or so when every little brochure site was converted into WordShit, claiming that the customer could update the stuff himself. Well no, he can't, and he doesn't know how to do updates, and then he gets hacked or the page breaks randomly during updates. And it's slower than before.
  • 8
    I don't want to be a buzzkill, as I agree with everything you said, but as long as clients demand fast and cheap shit, that's what they'll get: cheap, near-instant, bloated WordPress template "development", and their customers will suffer.

    The change starts with the customer's users: if they complain that the site is too slow or breaking all over, then clients will listen. After all, developers (and anyone in any profession) will always give you what you pay for.

    I can focus on fast download speeds, cross-platform compatibility and lightweight minimalist design on my own time, to serve the users I have as best I can.
  • 6
    @JKyll fast, cheap, quality. Pick 2, right?

    And another thing: nobody is correcting them, nor will anybody. Instead, certain people will even encourage them to forget about optimization on their side. Why? Because those certain people profit when newer hardware is released and sold.

    After all, it's in our DNA. Humans don't give a rat's ass about optimization. They will consume every resource they can see and access until everything runs out.
  • 2
    @cursee that's right pick 2
  • 1
    I agree with everything, hurrah!

    Another thing that irks me is responsive design (badly implemented, at least). Yes, it has its advantages, but have you ever tried to use it on a low-res screen? Fucking janky 90% of the time.

    Sidebars taking priority over the readability of the main article, headers and footers that overlay 90% of the page.

    Fuck all this razzle. Give me readable, structured CONTENT!
  • 3
    @dufferz well, the alternative before was the "desktop only" layout, and that sucked on mobile. Then we had the "mobile version", usually on that "m." subdomain, and it sucked too, because a) content was reduced and b) it doubled the maintenance cost.

    With proper HTML5 and responsive CSS, this can be solved neatly and without much bloat (a rough sketch follows this comment). Of course, navigation on mobile is difficult, and good tools in a fool's hands will still fuck things up.

    The issue right now is incompetent designers who cram in too much useless shit, or their managers who force them to. But that's not a mobile problem; it's just that it sucks even more on mobile than on desktop.
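
    One way to read "solved neatly": keep a single HTML document and adapt its presentation to the viewport, instead of maintaining a separate "m." site. A minimal TypeScript sketch; the 600px breakpoint and the compact-nav class are made-up examples, not anything from this thread:

      // One document, one set of content; only presentation changes per viewport.
      const mobileQuery = window.matchMedia("(max-width: 600px)");

      function applyLayout(q: MediaQueryList | MediaQueryListEvent): void {
        // Toggle a (hypothetical) class for compact navigation; the content stays identical.
        document.body.classList.toggle("compact-nav", q.matches);
      }

      applyLayout(mobileQuery);                            // apply once on load
      mobileQuery.addEventListener("change", applyLayout); // re-apply on resize/rotation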
  • 3
    Higher tech != Better. That's why my shitter doesn't analyze my feces and suggest a diet plan. Because that's interesting but I don't fucking want that. Just like I don't want your bloated JavaScript framework.
  • 1
    I agree, and don’t even get me started with Electron:
    Telegram, which is based on C++ and Qt, weighs a few MBs; meanwhile Teams is a fucking Electron + Angular Frankenstein which takes hundreds of MBs of RAM and is plagued by bugs.
    I can't accept an Electron app from a small start-up or an OSS project, but a corporation worth billions should be able to optimize one of its main products for the major desktop platforms without using Electron.
  • 0
    @AlgoRythm Same on the back end. I get mad when I'm forced to load 500MB of stuff from npm just to get started, and to learn how to use anotherFuckingOrm for a couple of queries that I could implement manually through the DB driver (something like the sketch below) without wasting hours troubleshooting cryptic error messages.
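
    Not this commenter's actual code, just a sketch of what "manually through the DB driver" can look like in TypeScript with node-postgres (the pg package); the users table and the two queries are made up for illustration:

      import { Pool } from "pg";

      // One shared connection pool for the whole app.
      const pool = new Pool({ connectionString: process.env.DATABASE_URL });

      // Parameterised query straight through the driver -- no ORM layer needed.
      export async function findUserByEmail(email: string) {
        const { rows } = await pool.query(
          "SELECT id, name, email FROM users WHERE email = $1",
          [email]
        );
        return rows[0] ?? null;
      }

      export async function countUsers(): Promise<number> {
        const { rows } = await pool.query("SELECT count(*) AS n FROM users");
        return Number(rows[0].n);
      }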