5
nickojs
2y

It's been a while since I first noticed that web development is becoming way too complicated. I'm not sure why people decided to overcomplicate everything. Is it to look smarter?

I just spent a few hours trying to understand why a unit test was failing. I ended up stepping through every statement of that test until I realized what was going on.

This project uses a library called Immutable.js. I was reading the "length" property of what I assumed was a regular array, like a regular human being would do, but it came back undefined, because the value was actually an Immutable.js collection and the property there is called "size."
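
For reference, here is a minimal sketch of the gotcha, assuming the value was an Immutable.js List rather than a plain array (the real code in the project surely differs):

    // Hypothetical reconstruction, not the actual test code.
    import { List } from "immutable";

    const plain: number[] = [1, 2, 3];
    const list: List<number> = List([1, 2, 3]);

    console.log(plain.length);         // 3 (plain arrays expose "length")
    console.log(list.size);            // 3 (Immutable.js collections expose "size")
    console.log((list as any).length); // undefined at runtime, hence the failing test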

Good Monday, everyone.

Comments
  • 4
    Use TypeScript. The complexity in webdev is mostly necessary to deal with JavaScript and the browser APIs, and the solution isn't to reject progress but to pick your libraries wisely and manage your complexity with tools like TypeScript.
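
    A rough sketch of what that buys you, assuming the project pulls in Immutable.js with its bundled type definitions (names are illustrative):

        import { List } from "immutable";

        const items: List<number> = List([1, 2, 3]);

        // The OP's bug becomes a compile-time error instead of an undefined at runtime:
        // const n = items.length; // error TS2339: Property 'length' does not exist on type 'List<number>'
        const n = items.size;      // OK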
  • 0
    When webdev was simple, it was normal to have the user redownload the whole page every time you wanted to change something on it. It was also normal to have the server query the database every time a new user requested the same page, even though nothing could have changed. It was an inefficient, sloppy mess that only passed because we had nothing better and because most webpages were built by amateurs.
  • 1
    Today webpages are built like actual programs, with executables downloaded from a static file server and variable data obtained through an API that's preferably stateless by design and has a clever stateful cache. Client-side caches are programmable, and invalidations can be delivered through any of three widely supported server-client message protocols. It's not to look smart, it's engineering. Although 5G gets a lot of media coverage, most mobile traffic still flows through slow and expensive connections to devices with tiny batteries, and the only reason our ever-increasing traffic doesn't break everything is the insanely sophisticated infrastructure we designed to make things as efficient as humanly possible.
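
    To make the invalidation part concrete, here is a minimal sketch of one such pattern, assuming server-sent events carry the invalidation messages and the Cache Storage API holds the client-side cache (the endpoint and cache name are illustrative):

        // Listen for invalidation messages pushed by the server. SSE is one
        // widely supported option; WebSockets or push messages work the same way.
        const events = new EventSource("/api/invalidations"); // illustrative endpoint

        events.onmessage = async (msg: MessageEvent<string>) => {
          const staleUrl = msg.data; // e.g. "/api/products/42"
          const cache = await caches.open("api-cache"); // illustrative cache name
          await cache.delete(staleUrl); // the next read falls through to the stateless API
        };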
  • 0
    Within 3 years the efficiency of our web infrastructure will plateau. I don't know what will happen; network hardware manufacturers will probably have to take up the torch of progress so that managers and users don't notice the insane growth in demand, but what we have right now is near the best we can have in terms of network usage.
  • 1
    I was telling this to everyone I talked to about webdev almost 10 years ago.

    The web is no longer "the web". We now run all institutions and their applications on the internet. Web development is now web application development.

    Websites are going the way of blogs. Companies want to track more, and they can't do that with a website.

    Why do you think Amazon and Facebook have apps?

    If you got into development for web only, go into UI/UX. Take heed though, even that will come to an end.

    The year is 2056. UI is generated by AI using online trends from competitors' designs (which were also designed by AI). Backend services are pre-built service functions that run in the cloud. These are maintained by the big three corporations.

    Average developer pay sits around $50k. Anyone who is specialized is quickly snatched up by a government job that pays slightly above average.

    Options are slim, jobs are hard to find, and with the recent legislation in place it's illegal to develop code without a license.
  • 2
    @sariel "and with the recent legislation in place it's illegal to develop code without a license."?

    Searching, I don't see anything on this. Just regular license stuff. Is this an EU thing?

    Edit:

    Or is this part of your dystopian future?
  • 1
    @Demolishun most definitely part of the dystopian future, but it makes sense.

    Today, those who control the data control the people. Who controls the data? The developers.

    It's only a matter of time before Facebook or some conglomerate attempts to limit access to development resources. It's the natural progression as more of our society and the success of world economies are tied directly to the internet.

    The argument that will be made to push lawmakers into restricting access to technological resources will be "to save American jobs".

    2056 was a year I pulled out of my ass; in reality, something like this could start to happen in the next 5 years.

    Look at how the state government handled the developer in Florida who kept providing the public with data on covid deaths and infections. She was one of us; she could have been any one of us.

    Mark my words, in our lifetimes we will see mandatory licensing and registration for development much like you see with physical buildings.
  • 0
    @sariel I just searched for that person. I don't know who you are talking about.
  • 1
    @Demolishun her name is Rebekah Jones.

    https://npr.org/2020/12/...

    She refused to cook their covid books, and after she was fired I remember hearing she continued to provide information to the public (cannot find the source any more).
  • 1
    @sariel Huh, MSN isn't treating her very well: https://msn.com/en-us/news/...

    Thanks for sharing. Hadn't heard about her before.
  • 1
    @Demolishun not just MSN, that article was published by the National Review, a notoriously hard-nosed conservative soapbox.

    Of course they won't paint her in a flattering light; she's an affront to everything they promote.

    The author is notable as well; he's a huge neo-con pundit.

    Thanks for sharing though.
  • 0
    @Ubbe If you use tree-shaking and code splitting (both of which are supported by all modern bundlers), and if those megabytes of libraries are well structured to support these technologies, then you aren't going to download megabytes of libraries.
  • 0
    @Ubbe I genuinely do not understand what you're on about. With code splitting and tree shaking, the client downloads only what is needed on the new page and hasn't already been downloaded by another page. All this in the minimal number of requests, because the established chunks are each bundled into a single file.
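
    A minimal sketch of what that looks like in practice, assuming a bundler such as webpack, Rollup, or Vite (the module and function names are illustrative):

        // Dynamic import(): the bundler emits "./report" as a separate chunk
        // that is only fetched the first time this function runs, and tree
        // shaking drops any exports of that module the page never references.
        async function openReport(): Promise<void> {
          const { renderReport } = await import("./report"); // illustrative module
          renderReport(document.getElementById("main")!);
        }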
  • 0
    @Ubbe Well, if you do make use of all that fancy tech, then you probably download exactly as much code as necessary to make any browser do what you want. You download that code in fragments as it becomes relevant, and you don't download much XML at all, as it's a very inefficient encoding. If you feel your code is still too big, you can gzip it, which is generally a great idea but particularly useful if you use a framework, because gzip back-references things like "R.createElement" into two-byte addresses pointing at a single instance of the string. You also get to choose between statically linking the subset of your dependencies that you actually use or referencing the whole library from a CDN. If you care this much, the choice can be made based on the perceived popularity of the library and whether it's likely to already be cached on the client.
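
    A quick way to see the back-referencing effect, assuming Node's built-in zlib (the numbers are only illustrative):

        import { gzipSync } from "node:zlib";

        // Repeated substrings compress down to short back-references after the
        // first occurrence, which is why framework-heavy bundles gzip so well.
        const bundle = "React.createElement(".repeat(1000);
        console.log(bundle.length);           // 20000 bytes uncompressed
        console.log(gzipSync(bundle).length); // a tiny fraction of that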
  • 0
    @Ubbe Many webpages are slow and network-hungry because the clients don't know any better and the devs don't care, but we have the technology to make PWAs run on 30 kbps mobile networks.
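
    For the curious, a minimal sketch of the service-worker side of that, assuming a cache-first strategy (the cache name and file list are illustrative):

        // sw.ts: after the first visit the app shell is served from the local
        // cache, so a slow mobile link only has to carry API data.
        // Events are loosely typed for brevity; real code would use the
        // "webworker" lib's ExtendableEvent and FetchEvent types.
        const CACHE = "app-shell-v1"; // illustrative cache name

        self.addEventListener("install", (event: any) => {
          event.waitUntil(
            caches.open(CACHE).then((cache) => cache.addAll(["/", "/app.js", "/app.css"]))
          );
        });

        self.addEventListener("fetch", (event: any) => {
          event.respondWith(
            caches.match(event.request).then((hit) => hit ?? fetch(event.request))
          );
        });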