4
Sumafu
20d

A question to all web devs here: Do you think that the <noscript> tag is necessary nowadays?

Comments
  • 5
    Only for "You need JavaScript to use this site" or "Click here for non JS version" sorta things
  • 2
    @12bitfloat This shouldn't happen regardless IMO.

    But then, that's why I have NoScript installed.
  • 5
    Yes. Everything should degrade gracefully, and that's not limited to just websites.

    Also: I have javascript disabled by default because I seriously distrust the javascript ecosystem. It's far too easy to accidentally include malicious code in your js project and not even realize it. There's also the whole spying and tracking norm I'd rather avoid.
  • 5
    Yes, it is, because people use the NoScript browser extension to intentionally disable JS.

    If your JS is doing something non-essential, <noscript> is not needed, just live without the script. Otherwise, place some nicely styled <noscript> message right where a functionality cannot work. That lets the page work as far as possible without JS.

    Oh, and don't even think of relying on JS for your main navigation!
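A minimal sketch of that advice, with the fallback message sitting exactly where the feature would be (class names and paths are made up):

```html
<!-- The JS-driven feature lives here; scripts fill it in when they run. -->
<div id="live-chart"></div>
<noscript>
  <p class="js-fallback">
    The interactive chart needs JavaScript. The raw numbers are in the
    <a href="/report.csv">CSV download</a>.
  </p>
</noscript>
```

The rest of the page renders normally either way; only the widget degrades.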
  • 1
    I actually don't care about no script users. Not a significant user base to justify the extra development time. More so if you use a single page app framework, which usually heavily relies on js. I lump them in with the lynx users.

    Same reason why I don't bother with accessibility. I will add it as soon as it represents a significant market segment I want to touch.
  • 1
    @TheCommoner282 I can agree with not bothering to support users of an extension that is designed to break websites, but accessibility is pretty important. The number of people with various disabilities is not that low.
  • 0
    @RobbieGM if they are a significant part of the market, then I'll support it. But let's be honest, in the expected ROI, adding accessibility doesn't change a lot.

    Well, except if nobody did it, then I'd tackle that as a niche market.
  • 1
    @TheCommoner282 That's what the devs at Domino's thought, too. And then Domino's was dragged into court over their inaccessible piece of trash website, and lost. That's why.

    Also, NoScript doesn't break websites. Websites are accessible by default. YOU break them, you as dev.
  • 0
    @Fast-Nop I agree that websites are accessible by default, but NoScript does break websites. It's no secret that JavaScript is required for most websites to work. But if you want to go back to the stone age of doing everything through forms (rather awful UX, due to constant page reloading and no client side rendering) you go right ahead.
  • 1
    @RobbieGM Most websites require JS for no good reason at all except the usual gross incompetence of frontend devs, and that's a reason, but not a good one.

    Of course, interactive widgets won't work without JS, that's OK, but what I mean is "nothing works at all", which is plain bad.

    I've seen shit like an onclick handler in JS that changes the document URL instead of using an anchor tag.
  • 0
    Interesting that you think writing web apps as single page client rendered applications has no good use case and is only a result of "gross incompetence."
  • 1
    @RobbieGM A web app counts as big interactive widget, I added that later to clarify.

    But you even see news articles where the whole page stays blank without JS, and that's gross incompetence.
  • 1
    @Fast-Nop
    You have no idea about modern single page apps. The whole idea is that information is fetched via ajax calls rather than having to render HTML on the server. Of course the website stays blank without JavaScript; it can neither receive data nor create HTML without it.

    And about the Domino's thing. If there is a legal requirement, then you have to do it. Luckily it is not everywhere. But even if there is, if the result is just a fine, it can go into your cost/benefit calculation, and that might tell you that to maximise your outcome you have to postpone adding accessibility.
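The flow being described, fetch JSON and build markup from it on the client, boils down to something like this (endpoint and field names are invented for illustration):

```javascript
// Render an article object ({title, body}) into markup on the client.
// In a browser you'd fetch() the JSON first; the render step is the same.
function renderArticle(article) {
  // Escape the data so it can't inject markup of its own.
  const esc = s => String(s)
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
  return `<article><h1>${esc(article.title)}</h1><p>${esc(article.body)}</p></article>`;
}

// Browser usage (sketch):
// fetch("/api/articles/latest")
//   .then(r => r.json())
//   .then(a => { document.querySelector("#app").innerHTML = renderArticle(a); });
```

Without JS, none of this runs, which is exactly why such a page stays blank.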
  • 2
    @TheCommoner282 Apps are widgets by definition because they're supposed to be dynamic, and that's the use case for JS. Something like devRant itself would be an example.

    But! just abusing JS for displaying static content like news articles is moronic. A news article isn't an app.

    Also, if you're adding accessibility as an afterthought, it becomes expensive because that's the wrong way of designing it. Legal requirements are there to get companies to do it the right way. Not complying will get the company into court AGAIN, and it becomes more expensive - until the company gets it.

    A frontend dev these days has to know about WCAG 2.1 and design his stuff accordingly. That's also basics of the craft.

    But given e.g. the shit trend design with light grey fonts on white background, even sighted users are struggling.
  • 1
    @Fast-Nop Your view on legal requirements is simplistic. It's not black or white. Sometimes paying a fine is the cheaper alternative.

    But far more important, single page applications. The response you get from your backend for asking for the newest article will look something like this: {title: "Newest Article", body: "This is the new thing that happened in the world"}.

    It does not look like: <html><head></head><body><h1>Newest Article</h1><p>This is the new thing that happened in the world</p></body></html>

    That allows us to use the same backend for the website as well as for the desktop app or the android app and publicise it as public API for third party apps.

    That's just good design. So how the fuck is the frontend supposed to display it without js?

    You are speaking about server-side rendering. But that has many disadvantages. Try to keep up with the times.
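For what it's worth, serving both representations of the same data from one backend isn't much code; a framework-free content-negotiation sketch (function and field names invented):

```javascript
// Pick a representation of the same article based on the Accept header.
// The JSON branch serves the SPA / mobile app / public API; the HTML
// branch serves clients without JS.
function representArticle(article, acceptHeader) {
  if (acceptHeader && acceptHeader.includes("text/html")) {
    return `<h1>${article.title}</h1><p>${article.body}</p>`;
  }
  return JSON.stringify(article); // default: the API response
}
```

One data source, two outputs, so "JSON-only" is a choice rather than a constraint.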
  • 1
    Well you could use a framework like next.js to handle getting the same content rendered client side and server side, but I know not everyone does that. Also in the US you can get sued under the ADA if you have >=15 employees, just FYI.
  • 1
    @TheCommoner282 Legal requirements are black and white. If you don't get this, talk to the legal department of your company. With your misguided stance, you are increasingly becoming a liability for your company.

    "Going with the time" means introducing tons of useless JS for you. And then wondering why that stupid shit loads and parses slow as hell. Yeah, because you are misusing JS to replicate HTML, and probably also CSS, that's why.
  • 0
    @Fast-Nop

    Wow, you're full of crap. Your argument can be boiled down to 'that's what it's supposed to be.' Well, that's an ideology and not an argument. If I responded to that, I'd just repeat my earlier arguments.
  • 1
    @TheCommoner282 You're the one full of shit, namely the typical grossly incompetent frontend dev. You risk legal problems for your company, you bloat websites needlessly with JS.

    From what you said here, you are an example for frontend shitheads. The only reason why you don't understand this is because you're on the low end of the Dunning-Kruger effect.

    And yeah, that your shitty creations don't display anything without JS won't astonish anyone.
  • 0
    @Fast-Nop

    Hey moron, if you took the time to read anything about Dunning-Kruger, then you'd know that there is no such psychological phenomenon. It was suggested in one paper. No follow-ups.

    Further, I don't risk anything for my company. Those decisions aren't made by me. Those are business decisions that involve marketing, legal and programming alike.

    And finally, I am no front-end developer.

    But I am always happy to see someone like you left behind by the world, accusing everyone of incompetence because of his own incapability to see the bigger picture.
  • 0
    @TheCommoner282 so you are not a front end developer, but you presume to talk about good and bad behavior in front end development?
  • 1
    @TheCommoner282 The bigger picture are ever more bloated websites which bleed money because users just bounce away, companies dragged into court and losing over their trash sites, and you being too stupid to even understand the concept behind Dunning-Kruger.
  • 0
    @Sumafu why wouldn't I? Does who I am change my argument? Ever heard of the argument-from-authority fallacy?
  • 0
    @Fast-Nop The stupider you are, the less capable you are of recognising that you are stupid. Damn hard concept, dimwit. It describes a tendency, not an absolute.

    But is there really such a tendency? Aren't those just narcissists? And narcissists often have higher than average intelligence.

    Does reality actually matter for you or do you just like a concept and declare it true in your personal world view?

    But dealing with you makes me want to believe in Dunning-Kruger, even though there isn't any compelling psychometric evidence to back it up.
  • 1
    @TheCommoner282 It should - because I have solid arguments while you have just nonsense handwaving. Come on, entertain me a little more. I'm curious how you will secure your "idiot of the month" title.

    How about "arguing" that functional stuff on the web is hopelessly outdated and that the modern web is meant to be unusable shit?
  • 0
    @Fast-Nop

    Quid pro quo. Here are answers for you, then you answer finally some of my questions:

    - Modern web apps finally have some state and don't have to encode data in the DOM. This reduces ugly hacks in which you just add hidden form inputs to pass some data to your code.
    - A modern environment allows for a more functional immutable approach. It is trivial to destroy and recreate a Vue-Component.
    - Less bandwidth. Even though you have to download the whole page on your first visit, on subsequent visits it is cached and only data is exchanged.
    - PWA: Those sites can act as local apps. Drastically shortening the time to market and widening the return on investment.
    - etc

    Now, you explain why you believe that a modern Vue-Js application is unusable?
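The "state in the DOM" point can be made concrete; here is the old hidden-input hack next to a plain state object with an immutable update (all names are illustrative):

```javascript
// Old-school: data smuggled through the markup.
// <input type="hidden" id="cart-count" value="3">
// const n = parseInt(document.getElementById("cart-count").value, 10);

// SPA-style: state lives in JS; the DOM is just a projection of it.
const state = { cartCount: 3 };

// Immutable update: return a new state object instead of mutating,
// which is what makes destroying/recreating components cheap to reason about.
function addToCart(state) {
  return { ...state, cartCount: state.cartCount + 1 };
}
```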
  • 1
    @TheCommoner282 The DOM argument is true - for complex applications like Facebook. Otherwise, the reason for spaghetti is that many controls impact many other controls, which makes a confusing UI anyway.
    - More bandwidth. You argued for even ajaxing in static content, which renders the browser cache useless.
    - Slower load time because more pieces are loaded and pieced together.
    - More battery drain because of useless JS being downloaded and run.
    - PWAs suck for anything even slightly complex. Just compare an Electron editor with Notepad++.
    - Single page design breaks bookmarking. Ever tried to browse a Reddit group down? Takes an eternity, AND you can't even bookmark it like with pagination.
    - Often used to "lazy load", which means instead of eliminating the bloat, they try to lessen the effects. And when you scroll down, you still have the shit to load, only now with more delay.
  • 0
    - Ajaxing in static content does not render the browser cache useless. Properly built RESTful APIs will allow the browser to use its built-in cache, and GraphQL API results can be cached by the client using something like Apollo (which is not so lightweight) or URQL (which is, but has inferior caching).

    - Client-side rendering != larger page load time. Bloated scripting is not caused by how you render your pages. There are many lightweight and well-built PWAs out there (like dev.to) as well as crappy static sites (dailymail.co.uk). Not everything is "pieced together" in a waterfall fashion.

    - Bloat != CSR. Yes we should try to avoid bloating pages with unnecessary JS, but that doesn't mean that all JS is bad or all frontend frameworks are bad and we need to go back to the days of CGI.

    - What? I can name one Electron editor that is quite fast and far more fully featured than Notepad++. Yes, it's VSCode.
  • 0
    (continued)

    - If you do it improperly, yeah. But good SPAs will store the right data in the URL and use it to load the right content when loading the page. Just look at vue-router, react-router, reach-router, etc.

    - Would you rather lazy load and get the most important stuff in 2 seconds but have the other stuff take an additional 2 seconds so a couple widgets may not be loaded, or take the full 4 seconds to load the page? This one sounds like a no-brainer to me.
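In bundlers that support it, that split is usually just a dynamic `import()`; here is a simulated sketch so the idea is runnable anywhere (the widget module is made up):

```javascript
// Lazy loading boils down to: don't evaluate a module until it's first needed.
// Real code would use dynamic `import("./share-widget.js")`; the loader here
// is simulated with a plain function instead.
const loaders = {
  shareWidget: () => ({ mount: () => "share widget mounted" }),
};

const cache = {};
function lazyLoad(name) {
  if (!(name in cache)) cache[name] = loaders[name](); // load exactly once
  return cache[name];
}
```

The main content renders immediately; the widget's cost is only paid if and when it's requested.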
  • 0
    @Fast-Nop

    - True about the static content, even if news isn't exactly static. But I'd do the same on a blog, and that is static.
    - Load time is only initial load time. Afterwards it's actually improved, and can be masked better.
    - Battery drain is true. But also more functionality.
    - PWAs are great for time to market and hitting many goals with less development effort. If it really gets successful, you can still redevelop it. Everything else would be premature optimisation.
    - Bookmarking can be easily solved. Many SPAs have internal routers. Those can still be bookmarked. It's just a rewrite rule in the reverse proxy.

    The big difference: what you want is not an application, it's closer to a document. A single page app is a full application. And I think WebAssembly will be a game changer for those.
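The rewrite rule mentioned here is typically a one-liner; an nginx sketch (paths are examples):

```nginx
# Serve the SPA shell for any path that isn't a real file on disk,
# so deep links like /articles/42 are bookmarkable and survive reloads.
location / {
    try_files $uri $uri/ /index.html;
}
```

The SPA's router then reads the URL on load and renders the matching view.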
  • 1
    @RobbieGM

    - Basically, even more bloat just to replicate what the browser could do without effort: inner platform effect looming.
    - Using JS to fetch the pieces and render does take longer than using static markup for static content. Not least because the DOM has to be modified, and that's never fast.
    - VSCode is dog slow compared to Notepad++. No wonder, because Notepad++ is done in C++.
    - That would require devs to manually emulate the native URL behaviour, which most don't, and it's again additional effort just to replicate native browser behaviour.
    - I would rather eliminate the bloat. 3MB for 500 bytes of text is downright crazy.
    - Don't even get me started on the rampant abuse of JS to not only do HTML's job, but even CSS'.
  • 1
    @TheCommoner282
    Someone removes leftpad.

    A dependency of a dependency of a dependency of Vue injects a keylogger that flows unnoticed into upstream projects.

    You introduce a subtle flaw into your navigation js, rendering the site unusable for 5% of your users (like my router!). It takes a month of debugging to find and fix. If you do not fix it, you increase your legal liability surface, but if you do it's expensive and it doesn't justify 5% more sales.

    Also why the hell are you only serving JSON from your backend? Converting is trivial; storing both even more so.

    To be clear, I'm not against JS in general, but I am absolutely against using it for everything. It's a magic hammer that makes everything look like a nail. Also, integrating the graceful degradation approach is exceedingly difficult to do in an established (large) project; it's something you should have in mind when you start. It's a design pattern. I also don't expect any of this to change your mind; you seem pretty set in your ways.

    "Less bandwidth" Your 2mb js bundle would take a lot of markup savings to realize this point. Also it's better to focus on responsively serving smaller (compressed) images for larger bandwidth (and hosting cost) savings.

    "A modern environment allows for a more functional immutable approach." That's great! Html is also completely immutable 😉. But seriously, this is circular logic. You don't need immutability and trivially recreating vue objects if you don't use vue.

    There's also the argument of wanting to know what your computer is doing, and with the web, that's not really possible anymore. We live in times where not even the developer knows what their code is doing anymore, let alone what it will be doing next month.

    Also, even Gmail has an HTML-only version. Apparently the profit-minded company decided it was worthwhile to maintain; and I'm glad, because Google decided to permanently break the JS version on my computer, even when I allow all JS.
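The bandwidth point above can be put into rough numbers with a break-even sketch (all figures are invented for illustration):

```javascript
// Suppose the SPA saves `markupSavedPerView` bytes per page view by
// shipping JSON instead of rendered HTML, but costs a one-time bundle
// download. How many views until the bundle pays for itself?
function breakEvenViews(bundleBytes, markupSavedPerView) {
  return Math.ceil(bundleBytes / markupSavedPerView);
}

// e.g. a 2 MB bundle vs. ~20 kB of markup saved per view:
// breakEvenViews(2_000_000, 20_000) → 100 page views before it pays off
```

For a site most visitors hit once or twice, that break-even may never arrive.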
  • 0
    @Fast-Nop a lot of what you're saying is true for small applications or simple static sites. But when it comes to larger ones, the benefits of this extra JS start to become clear:

    - GraphQL caching can be more effective than HTTP caching since it is more granular. For example, you could fetch relations on an object in one query and then run a completely different query fetching the same relations and the relations could be cached from the first query while other stuff is loaded from the server.

    - If you have one app shell and load the content inside separately, you can cache the app shell to make subsequent page loads really fast. And AFAIK DOM modification is rarely slow, but feel free to find me an example.

    - Dog slow? Never experienced that. After startup, which only takes a couple seconds, everything is pretty much instant.
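The granularity argument can be sketched: a normalized cache keyed by object id lets two different queries share entities. This is a toy version of what clients like Apollo or URQL do; the data shapes are invented:

```javascript
// Store every entity exactly once, keyed by id; queries read through
// the cache, so a later query that touches the same id is a cache hit.
const entityCache = new Map();

function writeEntities(entities) {
  for (const e of entities) {
    // Merge new fields into whatever was cached before.
    entityCache.set(e.id, { ...entityCache.get(e.id), ...e });
  }
}

function readEntity(id) {
  return entityCache.get(id);
}
```

Plain HTTP caching works per-URL, so it can't share a user object between two unrelated query responses the way this can.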
  • 0
    (continued)

    - Client-side routing is not hard. You're just a history.pushState() away from updating the URL instantly. This is not "replicating native browser behavior"--this is native browser behavior.

    - The discussion was about lazy loading or not lazy loading, and your answer is "I would rather just eliminate the bloat." Of course I would like to do that too, but what if you want to add some widgets to your site without slowing down the rendering of the main content? Like a share widget? Or the part of your navigation bar that doesn't appear until the user clicks on it? Or a notification reader component? If those features are necessary, wouldn't you rather have them lazy loaded?
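The pushState flow is roughly: update the URL, then render whatever route matches it. A minimal route matcher (patterns invented; the `history.pushState` call itself only runs in a browser):

```javascript
// Map a pathname like "/articles/42" onto a pattern like "/articles/:id",
// returning the extracted params, or null if it doesn't match.
function matchRoute(pattern, pathname) {
  const p = pattern.split("/");
  const u = pathname.split("/");
  if (p.length !== u.length) return null;
  const params = {};
  for (let i = 0; i < p.length; i++) {
    if (p[i].startsWith(":")) params[p[i].slice(1)] = u[i];
    else if (p[i] !== u[i]) return null;
  }
  return params;
}

// Browser side (sketch):
// history.pushState({}, "", "/articles/42");
// then render the component for the matched route.
```

Libraries like vue-router wrap exactly this pattern-to-params step plus the history wiring.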
  • 1
    @RobbieGM Last time I needed an interactive, JS driven widget with animation and SVG, the whole pageload was 18 kB. It's not like you have to pull in megs of JS to do things.

    Also, the browser cache is uncompressed data (or has that changed?), and mobiles are pretty aggressive in evicting items. Unfortunately, mobile connections are the worst.

    While bandwidth improves, latency does not, especially on mobile. That's another reason why unnecessarily piecing together many small items isn't the best idea.

    I mean, e.g. implementing a blog with ajaxing stuff together has to be one of the most absurd ideas ever. I'd go with an SSG instead, eliminating the JS bloat and the backend bloat at once. Fast, much less attack surface, easy win.
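The SSG idea in miniature: turn post data into static HTML once, at build time, and serve the files with no JS and no backend (the post shape and output path are invented):

```javascript
// Build step: run once per post, write a plain .html file, deploy the
// output directory to any static host.
function buildPage(post) {
  return [
    "<!doctype html>",
    `<title>${post.title}</title>`,
    `<h1>${post.title}</h1>`,
    post.paragraphs.map(p => `<p>${p}</p>`).join("\n"),
  ].join("\n");
}

// In a real build script:
// fs.writeFileSync(`dist/${post.slug}.html`, buildPage(post));
```

Visitors get pre-rendered markup; the "backend" at request time is just a file server.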
  • 1
    @Fast-Nop Good. I never said that more JS = better website. I'm just saying that when you need to have more JS, lazy loading is useful.

    According to https://stackoverflow.com/questions... it seems that most browsers store the data in compressed format. I would be surprised if they didn't. iOS Safari allows 50MB of cache per website before starting to clear stuff out. 50MB should be plenty.

    Why do you think that piecing together small scripts into one (like with webpack) means they are all loaded in a waterfall, one after the other? They aren't. They can all be bundled together, or they can be loaded separately but concurrently.

    Why not have the benefits of both? Load statically, but with scripts that make subsequent loads faster. This is what Next.js does, for example.
  • 1
    @RobbieGM Interesting with the browser cache, though still half a meg of compressed useless JS is a lot for 50 MB total cache.

    And no matter how you do it, you have to request the pieces, which means requests to your backend, which means a lot more requests, and a lot more fudging around with the DOM. If you pull this in as JSON, converting it into markup on the client also doesn't happen in zero time.

    There's no way that this would be faster than not using JS, not having dozens of superfluous DB requests, and not fudging around with the DOM.

    KISS. Code you don't run will always be faster than code that you do run. Also, code that you don't have has no bugs, no security problems, no dependency risks.