22
exerceo
1y

I miss the good times when the web was lightweight and efficient.

I miss the times when essential website content was delivered as HTML in the response to the very first HTTP request.

I miss the times when I could open a Twitter URL and have the tweet text appear on screen in two seconds, rather than a useless splash screen followed by some loading spinners.

I miss the times when I could open a YouTube watch page and see the title and description on screen in two seconds rather than in ten.

I miss the times when YouTube comments loaded right away rather than only starting to load once I scroll down.

JavaScript was lightweight and used for its intended purpose: enhancing the experience, for example by loading more content at the bottom of the page or by letting you post a comment without reloading the entire page.
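
To make that concrete, here is a minimal sketch of that progressive-enhancement style (the /comments endpoint and the markup are invented for illustration): the plain HTML form works on its own with a full page load, and a few lines of script merely upgrade it to post in place.

<form id="comment-form" action="/comments" method="post">
  <textarea name="text" required></textarea>
  <button>Post comment</button>
</form>
<script>
  // Hypothetical enhancement layer: intercept the submit and send the comment
  // with fetch() so the page does not reload. Without JS, the form above still
  // works through a normal POST and a full page load.
  document.getElementById("comment-form").addEventListener("submit", async (event) => {
    event.preventDefault();
    const form = event.target;
    await fetch(form.action, { method: "POST", body: new FormData(form) });
    form.reset();
  });
</script>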

Now pretty much all popular websites are bloated with heavy JavaScript. Your browser needs to walk through millions of bytes of JavaScript code just to show a tweet worth 200 bytes of text.

The YouTube watch page (known as "polymer", in use since 2017) loaded more than eight megabytes of JavaScript the last time I checked. In 2012, it was one to two hundred kilobytes of HTML and at most a few hundred kilobytes of JavaScript, mostly for the HTML5 player.

And if one little error dares to occur on a JavaScript-based page, you get a blank page of nothingness.

Sure, computers are more powerful than they used to be. But that does not mean we should deliberately make our new software and website slower and more bloated.

"Wirth's law is an adage on computer performance which states that software is getting slower more rapidly than hardware is becoming faster."

Source: https://en.wikipedia.org/wiki/...

A presentation by Jake Archibald from 2015, but more valid than ever: https://youtube.com/watch/...

Comments
  • 7
    I do too. But tech always recycles itself. I wonder if it's time to take advantage of that and start making websites the old-fashioned way.
  • 15
    I miss downvotes on YouTube.
  • 2
    @Demolishun Almost everyone does.
  • 3
    @iceb That's how you get a perfect 100/100 score on Google Pagespeed. BTDT.
  • 3
    Just give me a motherfucking website..

    https://github.com/lyoshenka/...
  • 6
    The main reasons for the bloat are customers wanting more features (not all, but most) and companies wanting a return on their investment.

    Many of the early websites you talk about were run at a steady loss.

    That worked while the web was young and people with money were willing to risk it in the hope of making more.

    But as that profit became more and more elusive, you got more advertising, then ways to defeat adblockers, plus the need to reach more groups of customers and legal requirements to support people with disabilities (something most of the early web was horribly bad at).

    And then, once you started to make money, the copycats came, and you had to add more superficial features to compete, since the mass of customers did not care about quality, only about "shiny".

    The sad truth is that the old-style web was never sustainable and could only ever be a footnote in history :/
  • 0
    @saucyatom Brilliant stuff.
  • 4
    @Voxera One would expect that lightweight websites cost less. Also, these heavy JavaScript websites don't get much more work done.

    YouTube has less functionality than it had in 2012, yet it needs several megabytes of JavaScript. This means it is far less efficient.
  • 1
    @Demolishun @exerceo
    There are browser plugins which bring it back.
    It works because the API is still there and Google just removed it from the UI.
  • 1
    @Lensflare Legacy YouTube used no API for the basic page layout. It immediately served the page text through HTML.
  • 0
    @saucyatom lol. I love the salt there, and it's all valid criticism.
  • 2
    When you post stuff on YouTube it doesn't even show up. They pre-approve comments lol

    Dead echo chamber
  • 1
    The thing I miss the most is the flaming-skull GIF websites for video game guides, tips, BGM downloads, and other fan work.
    Maybe I should build one
  • 0
    @ars1 Oh, I often find these when I'm looking for tips for old games.
  • 1
    @ars1 We could bring 90s-style shit sites back for nostalgia. We need the blinking GIFs to cause seizures too.
  • 2
    > yes sire, I want to create this very tight coupling between frontend and backend; so much so, in fact, that my backend returns the actual fucking frontend rather than the two of them being entirely separate projects, as any sane person would treat two programs running in different environments, in different languages, on different machines, with different requirements and different limitations. I want my API calls to return GUI elements rather than actual data, and if I need to make a mobile application I have to build all of these endpoints from scratch again (this time to return the actual data, because maybe this new application can't render HTML, seeing as it's not a fucking browser). I want to be stuck with form-url-encoded stuff rather than having the freedom to send back and forth all the data I need. I want all of that bloat, because the nostalgic feel is so good :)

    Said by nobody ever.
  • 1
    YouTube hides comments and is a censorship machine.
  • 4
    I miss the times when games actually had to be played, instead of being interactive movies.
  • 2
    you've clowned yourself. you can still write stuff as lightweight as you want, no one is stopping you
  • 5
    @fullstackcircus You can write lightweight, but when using the web, you're reading, and you can't read lightweight if shit requires 100k of markup, 500k of CSS, and 2M of JS just to display a measly 1k of relevant text.

    For stuff that isn't web apps, but things like blogs or informational websites, we could have blazingly fast stuff, fully responsive and shit, but we don't.
  • 0
    @Fast-Nop Shall I refer you to this library?

    https://astro.build/

    Not my cup of tea because *gasp* I actually LIKE using JavaScript in my webpages.

    I mean, you can enable JS with Astro, but you have to opt in to it explicitly.
  • 3
    @fullstackcircus Apart from the fact that they already fail at HTML (and hence shouldn't even be doing a website), that they also can't write CSS, and that there is no reason to even use JS for such a website at all, it's surprisingly lightweight.

    Still, there's no reason to descend into NPM hell when stuff like Hugo already exists. Though I rather rolled my own solution that clocks in at under 30k for the landing page - including images, responsiveness, a CSS design with site-wide stylesheet, and accessibility.

    Though that's pretty extreme, just because I can. It took me a while to figure out how to have a collapsible mobile navigation that is still accessible, but without JS (one way to do it is sketched at the end of this comment).

    There are some pages where I do need JS for some interactive widget stuff, originally a jQuery plugin, but I rewrote it to throw out jQuery because that shaved off 30k of page load.
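
    For illustration, one common way to get a collapsible navigation that stays accessible without any JS is the native details/summary disclosure element. This is just a sketch, not necessarily the approach described above, and the links are placeholders:

    <nav aria-label="Main">
      <details>
        <!-- The summary is keyboard-operable and announced by assistive tech
             without any script; the link list stays hidden until it is opened. -->
        <summary>Menu</summary>
        <ul>
          <li><a href="/">Home</a></li>
          <li><a href="/articles">Articles</a></li>
          <li><a href="/contact">Contact</a></li>
        </ul>
      </details>
    </nav>

    The element has been supported in all major browsers for years, so there is nothing to download, parse, or break beyond a few lines of markup.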
  • 3
    Progress = Verschlimmbesserung (German for an "improvement" that makes things worse)

    some things get better, others get worse

    funny that the same Google that promotes PageSpeed ranking is the same that did AMP and the same that made YouTube worse
  • 2
    @Fast-Nop again, if you want to write pure HTML all day, be my guest, no one is stopping you

    There is something to be said, however, for a sort of tree-shaking concept: breaking down the requirements of a website until literally 100% and not a character more of source code is used to meet the site's functionality... could be cool
  • 2
    @PurgeXenos This I agree with wholeheartedly :)
  • 1
    @usr--2ndry AMP did start out as a way to reduce download sizes for mobile devices, back when 3G was some elitist dream.

    Modern 4G and 5G make it less needed.
  • 4
    @fullstackcircus Not pure HTML - quite a lot of CSS as well so that it looks nice on all devices. A document kind of website (as opposed to a web app) doesn't need JS except for interactive widgets.

    What's actually hard is slicing up a whole lot of information in a way that also works well with mobile navigation, but that's more information architecture than coding. When it's done well, it looks so easy, nearly trivial, but it's hard to get to that point.
  • 4
    @Voxera The only point AMP ever had going for it, and the only reason it was fast, was that it restricted the superfluous shit websites would do in the first place. Don't do that shit, and you're faster than AMP anyway.
  • 1
    @aviophille Ofc it is, and since that also generates direct and indirect CO2 emissions, we can safely say that FE is killing the planet.

    We should offset that by forcing wasteful FE devs to stop breathing for long enough to compensate for the CO2 emissions of their trainwrecks.
  • 2
    @Fast-Nop I agree, same as how Emscripten restricted the JS code its compiler generated to operations that the JIT could make very, very fast. So fast that Mozilla even added a special optimization mode for that Emscripten-generated subset (asm.js), which then became the idea WebAssembly grew out of.

    So restricting features can be a good way to speed things up :)
  • 1
    @Voxera Loading speed only increases if Internet speed increases but the site does not get heavier.
  • 3
    @exerceo And even then, the issue isn't necessarily bandwidth but latency, especially when doing hundreds of requests (90% of them superfluous) and having nested dependencies, because a link rel=preload in the head is apparently too difficult.
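
    As an illustration (file names are made up), a couple of preload hints in the head let the browser fetch resources it would otherwise only discover after downloading and parsing the stylesheet, turning a chain of round trips into parallel requests:

    <head>
      <!-- Hypothetical resources referenced from inside site.css. Without the
           hints, the font and the hero image are only requested after the CSS
           has arrived and been parsed: a nested dependency, i.e. an extra round trip. -->
      <link rel="preload" href="/fonts/body.woff2" as="font" type="font/woff2" crossorigin>
      <link rel="preload" href="/img/hero.webp" as="image">
      <link rel="stylesheet" href="/css/site.css">
    </head>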
  • 0
    @Alexanderr Not sure which comment you replied to.
  • 1
    @Alexanderr If sites are heavier, they load slower. And YouTube is an 8 MB chunk of JavaScript to get the same work done that 100 KB of HTML did in 2012.

    If the browser needs to fight through 8 MB of JavaScript, it adds delay.