Search - "trend"
-
Yes, Linus Torvalds is an asshole, and the world is better because of it.
In short, Linus's acid takes on code quality over developer fee-fees might be one of the things that has made the Linux kernel and the GNU/Linux project such a long-lasting open source success, and in my opinion the risk of him falling for all this "let's be nice and non-offensive" BS trend may negatively impact code quality.
Being an asshole has its downsides and it's not always the best response, I'll give you that, but personally I think most of us who are viewed as assholes are seen that way because we put quality over convenience, facts before feelings and dedication over mediocrity; it's not because we hate you, it's because we measure ourselves with the same stick.
It depends on one's character, but when you've been toughened up because of bullying (I don't doubt many devs have been, since being a nerd has never been hip) or life in general, you learn to stop whining and pick yourself up, and you expect everyone to be as competitive and competent as you are, so it gets frustrating to manage people who don't fulfill your expectations.
Pros: You get shit done and you do it well.
Cons: People won't like you and you don't tolerate failure (much less mediocrity).
Yes, Linus is an asshole, my coach was an asshole, some of my best teachers have been assholes, I had friends who were assholes, heck, I'm an asshole!
But I thank them because they made me better than I was, just as people have thanked me for being the right amount of asshole.
A warm thank you and fuck you Linus, keep being the asshole we need.
-
Let's get something straight, people: the trend of changing terms in programming languages for PC-approved ones is NOT about "making the workplace a better place".
If you are one of those who say "oh it's just terms, if it makes them feel better why not?", "I don't care and neither should everybody else", or "the outrage proves we need to change the terms!", then listen up.
No sir, first of all, since when has programming been about ditching standards to make people "feeel" better? Since when has engineering been about that?! We are engineers, we don't change shit and waste effort trying to fix things that are working.
Second, this word cleansing does NOT come from a well-intentioned place; it's not about making the workplace a better place, it's not about minorities, it's about sanitizing language from an ideological and political standpoint to please an agenda-pushing minority who doesn't give a shit about any real social issues.
They have done it to movies, videogames, news, political speech, magazines, books and now programming. It doesn't stop and they will never be satisfied, it's not about changing the terms, no one gives a shit about the terms, it's about pandering to ideological crybabies who want to control what you say because it "offends" them or some supposedly oppressed group from which we just hear anecdotal evidence.
Personally I wouldn't give a shit if it was for technical reasons, but it's not and I've seen what this shit does to communities I love and I won't stand it happening to the dev community just because some weak ass, no balls coders decided to pander to the retards on the far left to score virtue points instead of standing their ground.
Are you worried about oppressed groups? Donate money to third-world children, speak out about women in Syria, travel to actual shitty third-world countries so you realize that changing words in a GitHub repo on your expensive-ass MacBook, sipping your soy-based coffee in an office with air conditioning, is not making the world a better place, you delusional prick.
You want to ignore the facts? Be my guest, be willfully ignorant, but I will not police myself and my ideas for your ideological beliefs, not in gaming, not here. Fuck off.
-
The trend of referring to staff you just laid off as “alumni” is fucking stupid.
So is emailing and asking said recently laid off staff to join an “alumni committee” that involves, among other things: going to off sites and community events that the company will almost CERTAINLY be using for recruiting photos and “Best place to work” blog posts.
Just send me my final pay and fuck off already.
-
The next big trend will be in the area of project management:
The Waterfall™
Agile has been abused to the point where The Waterfall™ is way more agile! Think about it: It's straight down. No loops, no unnecessary hourly, daily, weekly meetings. No micromanaging. Just one flow. It starts at the top and it's all downhill from there.
Pure efficiency!
Edit: Wake up developers! The management doesn't want you to know this simple efficiency trick!
-
Idiots. Idiots everywhere. The next big trend in software engineering is to take a whole bunch of idiots, give them the basic knowledge to write code, and then dedicate a whole lot of competent developers' time to either fixing errors made by those idiots, or attempting to make "safer" tools so those idiots don't screw up as easily.
-
Whoever prefixes their table names like this should be sentenced to death.
tbltable_name or tbltablename
Is that some kind of sadistic fetish or something?
My eye can’t fucking pinpoint the first letter, thus I can’t easily find the table. I have seen it so many times!!! This trend has to STOOOP.
Prefix it tbl_table_name for all I care, ffs.
-
Windows 10 had one groundbreaking UI innovation, but no one adopted it and even Windows 11 discarded this revolutionary idea:
BUTTONS NEXT TO EACH OTHER AND AT THE EDGE OF A BOX DON'T NEED AN ADDITIONAL MARGIN
Windows 10 was the first and last OS where I never accidentally clicked right next to the X on a window, in a passive area that had no other purpose so it might as well have belonged to that motherfucking button.
I passionately hate this trend, adopted nowadays by every OS, that everything needs to be rounded, separated from the things around it, and "allowed to breathe". They don't breathe. They're not alive. They're fucking UI elements and the space between them is unused, lost space.
The only interaction a button has with its surroundings is that it pushes other content away to make room for itself and responds to the cursor. It doesn't wiggle, it doesn't grow and shrink, and it ESPECIALLY doesn't fucking breathe. Please, just let me click the motherfucking button.
Relatedly, do you know of a good, preferably bluish dark GTK theme that provides window decorations that stretch the full height of the titlebar and are laid out next to each other at the very end of the bar without gaps?
-
I do not like the direction laptop vendors are taking.
New laptops tend to feature fewer ports, making the user more dependent on adapters. Similarly to smartphones, this is a detrimental trend initiated by Apple and replicated by the rest of the pack.
As of 2022, many mid-range laptops feature just one USB-A port and one USB-C port, resembling Apple's toxic minimalism. In 2010, mid-class laptops commonly had three or four USB ports. I have even seen an MSi gaming laptop with six USB ports. Now, much of the edge space is wasted "clean" space.
Sure, there are USB hubs, but those only work well with low-power devices. When attaching two external hard drives to transfer data between them, they might not be able to spin up due to insufficient power from the USB port or undervoltage caused by the impedance (resistance) of the USB cable between the laptop's USB port and hub. There are USB hubs which can be externally powered, but that means yet another wall adapter one has to carry.
Non-replaceable batteries (the shortest-lived component) mean difficult repairs and no more spare batteries, as well as no extra-sized battery packs. When the battery expires, one might have to waste four hours at a repair shop for a replacement that would have taken a minute on a 2010 laptop.
The SD card slot is being replaced with inferior MicroSD or removed entirely. This is especially bad for photographers and videographers who would frequently plug memory cards into their laptop. SD cards are far more comfortable than MicroSD cards, and no, bulky external adapters that occupy the device's only USB port and protrude from it cannot replace an integrated SD card slot.
Most mid-range laptops in the early 2010s also had a LAN port for immediate interference-free connection. That is now reserved for gaming-class / desknote laptops.
Obviously, components like RAM and storage are far more difficult to upgrade in more modern laptops, or not possible at all if soldered in.
Touch pads increasingly have the buttons underneath the touch surface rather than separate, meaning one has to be careful not to move the mouse while clicking. Otherwise, it could cause an unwanted drag-and-drop gesture. Some touch pads are smart enough to detect when a user intends to click, and lock the movement, but not all. A right-click drag-and-drop gesture might not be possible due to the finger on the button being registered as touch. Clicking with short tapping could be unreliable and sluggish. While one should have external peripherals anyway, one might not always have them along. The fallback input device is now even less comfortable.
Some laptop vendors include a sponge sheet that they want users to put between the keyboard and the screen before folding it, "to avoid damaging the screen", even though making it two millimetres thicker could do the same without relying on a sponge sheet. So they want me to carry that bulky thing everywhere around? How about no?
That's the irony. They wanted to make laptops lighter and slimmer, but that made them adapter- and sponge sheet-dependent, defeating the portability purpose.
Sure, the CPU performance has improved. Vendors proudly show off in their advertisements which generation of Intel Core they have this time. As if that is something users especially care about. Hoo-ray, generation 14 is now yet another 5% faster than the previous generation! But what is the benefit of that if I have to rely on annoying adapters to get the same work done that I could formerly do without those adapters?
Microsoft has also copied Apple in demanding internet connection before Windows 11 will set up. The setup screen says "You will need an Internet connection…" - no, technically I would not. What does technically stand in the way of Windows 11 setting up offline? After all, previous Windows versions like Windows 95 could do so 25 years earlier. But also far more recent versions. Thankfully, Linux distributions do not do that.
If "new" and "modern" mean more locked-in and less practical and difficult to repair, I would rather have "old" than "new".13 -
Starting to notice a trend that people who don’t write docs and say “the code documents itself” tend to write the worst fucking code imaginable.
-
Static HTML pages are better than "web apps".
Static HTML pages are more lightweight and destroy "web apps" in performance, and also have superior compatibility. I see pretty much no benefit in a "web app" over a static HTML page. "Web apps" appear like an overhyped trend that is empty inside.
During my web browsing experience, static HTML pages have consistently loaded faster and more reliably, since the browser is immediately served with content useful for consumption, whereas on JavaScript-based web "apps", the useful content comes in **last**, after the browser has worked its way through a pile of script.
For example, an average-sized Wikipedia article (30 KB wikitext) appears on screen in roughly two seconds, since MediaWiki uses static HTML. Everipedia, in comparison, is a ReactJS app. Guess how long that one needs. Upwards of three times as long!
Making a page JavaScript-based also makes it fragile. If an exception occurs in the JavaScript, the user might end up with a blank page or an endless splash screen, whereas static HTML-based pages still show useful content.
The legacy (2014-2020) HTML-based Twitter.com loaded a user profile in under four seconds. The new React-based web app not only takes twice as long, but sometimes fails to load at all, showing the error "Oops, something went wrong! But don't fret – it's not your fault." This could not happen on a static HTML page.
The new JavaScript-based "polymer" YouTube front end that is default since August 2017 also loads slower. While the earlier HTML-based one was already playing the video, the new one has just reached its oh-so-fancy skeleton screen.
It would once have been unthinkable to have a website that does not work at all without JavaScript, but now, pretty much all popular social media sites are JavaScript-dependent. The last time one could view Twitter without JavaScript and tweet from devices with non-sophisticated browsers like Nintendo 3DS was December 2020, when they got rid of the lightweight "M2" mobile website.
Sometimes, web developers break a site in older browser versions by using a JavaScript feature that they do not support, or using a dependency (like Plyr.js) that breaks the site. Static HTML is immune against this failure.
Static HTML pages also let users maximize speed and battery life by deactivating JavaScript. This obviously will disable more sophisticated site features, but the core part, the text, is ready for consumption.
Not to mention, single-page sites and fancy animations can be implemented with JavaScript on top of static HTML, as GitHub.com and the 2018 Reddit redesign do, and Twitter's 2014-2020 desktop front end did.
From the beginning, JavaScript was intended as a tool to complement, not to replace HTML and CSS. It appears to me that the sole "benefit" of having a "web app" is that it appears slightly more "modern" and distinguished from classic web sites due to use of splash screens and lack of the browser's loading animation when navigating, while having oh-so-fancy loading animations and skeleton screens inside the website. Sorry, I prefer seeing content quickly over the app-like appearance of fancy loading screens.
Arguably, another supposed benefit of "web apps" is that there is no blank page when navigating between pages, but in pretty much all major browsers of the last five years, the last page observably remains on screen until the next navigated page is rendered sufficiently for viewing. This is also known as "paint holding".
On any site, whenever I am greeted with content, I feel pleased. Whenever I am greeted with a loading animation, splash screen, or skeleton screen, be it ever so fancy (e.g. fading in an out, moving gradient waves), I think "do they really believe they make me like their site more due to their fancy loading screens?! I am not here for the loading screens!".
To make a page dependent on JavaScript and sacrifice lots of performance for a slight visual benefit does not seem worth it.
Quote:
> "Yeah, but I'm building a webapp, not a website" - I hear this a lot and it isn't an excuse. I challenge you to define the difference between a webapp and a website that isn't just a vague list of best practices that "apps" are for some reason allowed to disregard. Jeremy Keith makes this point brilliantly.
>
> For example, is Wikipedia an app? What about when I edit an article? What about when I search for an article?
>
> Whether you label your web page as a "site", "app", "microsite", whatever, it doesn't make it exempt from accessibility, performance, browser support and so on.
>
> If you need to excuse yourself from progressive enhancement, you need a better excuse.
– Jake Archibald, 2013
-
I hate the growing trend in IT of "diversity" or "embracing diversity." It's Orwellian because embracing diversity is a form of uniformity. None of these companies want diverse ideas; they want you to believe in a specific ideology.
I wrote a full article on this the other week:
https://battlepenguin.com/politics/...
-
The more I hear about algorithms creating political bubbles, the more I start to wonder whether I'm in one. It's crazy how as soon as you watch certain types of content you get a lot of political stuff. E.g. watch fishing and outdoor stuff and soon you will find a lot of conservative politics in your feed.
I feel like the science and engineering side has been mostly untouched, but on this topic people are more clever at hiding a political agenda. There's a lot of content that shows whether we can do something and almost none on whether we should do it. So we have a lot of unaware people pushing tech without understanding the deeper consequences of their agenda. I get the feeling of a trend that a lot of people, sometimes myself included, don't do much thinking about the things they know and simply let others do the processing. Any new information then gets stored and never processed.
TLDR: Fuck you, take the time to read it or get lost!
-
This new trend of platforms spamming you with content discovery fucking sucks. Nobody wants to follow multiple profiles with the exact same fucking content, especially when most of them are just people jumping on the bandwagon with more generic content and nothing to make it distinguishable. Also, if 10 million people saw something on your platform, then it has pretty surely already been posted and shared on every single platform out there, so why the fuck would you still keep recommending it weeks or even months later?
I know spamming users with random (statistically more engaging) content leads to improved customer engagement, as people sooner or later click these things out of curiosity or boredom, but eventually they get tired of it altogether and leave for good. What happened to Netflix will also happen to YouTube, Instagram, and all other platforms unless they significantly improve the balance between content discovery and content continuity (i.e. the content each user follows and is coming back for).
-
I know I'm out of the loop since I barely use these sites, but...
What is it with this seemingly ridiculous new trend on LinkedIn of replacing your meaningless job title with, somehow, an even more meaningless fake description of what you're doing? I'm seeing it all over.
Back in the day it'd just be "Python developer". Then the trend seemed to be "Senior / lead / principal software engineer / Python specialist" (who cares if you're actually a senior, eh, this is LinkedIn). And if that wasn't ridiculous enough, now it's "Helping to transform the globe towards a greener future by implementing beautiful, robust code in Python 3.10" or similar. Who the hell wants to see this crap?!
-
None. As soon as I hear about a new trend I fear it. I fear the huge amount of Medium articles I would encounter...
-
I'd like to ask: What's trending at the moment instead....
Either I'm old and senile and missing something, or there is not really anything new.
Okay, JS might be crapping out new frameworks in its usual "not invented here" diarrhea....
But otherwise? What's really new?
I don't really know. I'm not only thinking about languages and stuff, but even in hardware there ain't really a big thing going on in my opinion.
Hab ich wat verpennt?
(Have I overslept?)
We had an interesting and frightening discussion regarding NGINX, as it is Russian software today, and that a true, actively developed alternative web server is severely lacking... Apache looks semi-dead, and so do most other niche web servers.
That's all I've seen as a "trend" discussion lately.
-
What historical trends look more ridiculous now than short-form video will look in a few decades?
-
This is gonna be a long post, and inevitably DR will mutilate my line breaks, so bear with me.
Also, I cut out a bunch because the length was over the limit, so I'll post the second half later.
I'm annoyed because it appears the current stablediffusion trend has thrown the baby out with the bath water. I'll explain that in a moment.
As you all know I like to make extraordinary claims with little proof, sometimes for shits and giggles, and sometimes because I'm just delusional apparently.
One of my legit 'claims to fame' is, on the theoretical level, I predicted most of the developments in AI over the last 10+ years, down to key insights.
I've never had the math background for it, but I understood the ideas I was working with at a conceptual level. Part of this flowed from powering through literal (god I hate that word) hundreds of research papers a year, because I'm an obsessive like that. And I had to power through them, because a lot of the technical low-level details were beyond my reach, but architecturally I started to see a lot of patterns, and began to grasp the general thrust of where research and development *needed* to go.
In any case, I'm looking at stablediffusion and what occurs to me is that we've almost entirely thrown out GANs. As some or most of you may know, a GAN is where networks compete, one to generate outputs that look real, another to discern which is real, and by the process of competition, improve the ability to generate a convincing fake, and to discern one. Imagine a self-sharpening knife and you get the idea.
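For illustration, here is a minimal sketch of that competition on a toy 1-D distribution (PyTorch; the tiny architectures, noise size and hyperparameters are arbitrary assumptions of mine, just to show the two opposing losses):

import torch
import torch.nn as nn

def real_batch(n):
    # toy "real" data: samples from N(4, 1.25)
    return 4 + 1.25 * torch.randn(n, 1)

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))  # generator: noise -> fake sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))  # discriminator: sample -> realness logit

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    # discriminator turn: learn to call real samples 1 and fakes 0
    real, fake = real_batch(64), G(torch.randn(64, 8)).detach()
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # generator turn: learn to make the discriminator call its fakes 1
    fake = G(torch.randn(64, 8))
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# each side only improves by beating the other -- the self-sharpening knife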
Well, when we went to the diffusion method, upscaling noise (essentially a form of controlled pareidolia using autoencoders over seq2seq models), we threw out GANs.
We also threw out online learning. The models only grow on the backend.
This doesn't help anyone but those corporations that have massive funding to create and train models. They get to decide how the models 'think', what their biases are, and what topics or subjects they cover. This is no good in the long run, but that's more of an ideological argument. That's not the real problem.
The problem is they've once again gimped the research, chosen a suboptimal trap for the direction of development.
What interested me early on in the lottery ticket theory was the implications. The lottery ticket theory says that part of the reason *some* RANDOM initializations of a network train/predict better than others is essentially down to a small pool of subgraphs that happened, by pure luck, to chance on an initialization that just so happened to be the right 'lottery numbers', as it were, for training quickly.
The first implication of this is that the bigger the network, the greater the chance of these lucky subgraphs occurring. Whether their density grows faster than the density of the 'unlucky' or average subgraphs is another matter.
From this though, they realized what they could do was search out these subgraphs, and prune many of the worst or average performing neighbor graphs, without meaningful loss in model performance. Essentially they could *shrink down* things like chatGPT and BERT.
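A rough sketch of that pruning step, using PyTorch's built-in pruning utilities (the toy model and the 80% ratio are my own arbitrary assumptions, and the full lottery-ticket procedure additionally rewinds the surviving weights to their initial values and retrains):

import torch.nn as nn
import torch.nn.utils.prune as prune

# stand-in for a trained network
model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))

# global magnitude pruning: zero out the 80% of weights with the smallest
# absolute value across both layers, keeping the (hopefully lucky) subnetwork
layers = [(model[0], "weight"), (model[2], "weight")]
prune.global_unstructured(layers, pruning_method=prune.L1Unstructured, amount=0.8)

for module, _ in layers:
    kept = module.weight.ne(0).float().mean().item()
    print(f"{module}: {kept:.0%} of weights kept")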
The second implication was more subtle and overlooked, and still is.
The existence of lucky subnetworks might suggest nothing additional -- in which case the implication is that *any* subnet could *technically*, by transfer learning, be 'lucky' and train fast or be particularly good for some unknown task.
INSTEAD however, what has happened is we haven't really seen that. What this means is actually pretty startling. It has two possible implications, either of which will have significant outcomes on the research sooner or later:
1. There is an 'island' of network size, beyond what we've currently achieved, where networks that are currently state of the art at some things rapidly converge to state-of-the-art *generalists* in nearly *all* tasks, regardless of input. What this would look like at first is a gradual drop-off in gains of the current approach, characterized as a potential new "AI winter" or a "limit to the current approach", which wouldn't actually be the limit, but a saddle point in its utility across domains and its intelligence (for some measure and definition of 'intelligence').
-
What is that DISGUSTING trend of describing celebrity couples by combining their names? Kim + Kanye = fucking KIMYE? Elon Musk + Grimes = fucking GRUSK? WHAT IS THAT HOOOW AND WHYYY THIS IS SOOO annoying!
Lemme try. Depp + Heard = Hepp? Deard? Jomber? Amnny? What are those?
This should be illegal. Ffs, I'm out
-
Continuing the trend of devRant stressball entering places! Previous attempt: https://devrant.com/rants/6483141
Within FSF conference, LibrePlanet 2023, there is a temporary Minetest server. More info and art: https://techhub.social/@vintprox/...
-
Play Store's $25 registration fee - for getting a PWA listed in their shitty catalogue? Who in their right mind would even jump into this clusterfuck of a store to find a *web* app? For all you know, Google, there is such a thing as QR codes - and customers can just scan the code (or type in that sweet address). Voila! Boom!!! Ching-ching!
Hello-hello, monopolistic cashgrabage! I came to inform you that your TWA bullshit is unneeded in ETHICAL space. The only ones who would benefit from this thing are permission-hungry publishers. And I'm already sick of this culture where people are put into store bubbles. You can't hide the fact that the data and features you provide with the "native" layer may be misused in a jiffy - and by big players, no less. Of course, being the vile dumpster that you are, you don't mind it.
Don't even bring up the battery consumption that comes with a PWA and the browser. That doesn't matter if you use an app for some 2 minutes to tick your mental checkboxes! I'm just sick of app stores and native apps that collect data without a proper warning, and dare to take more than 1 second to fucking load cached data. Take a lesson or two from PWAs that collect (probably useful) cache, instead of my specs, and load almost instantly.
-
At first I was told to go to college BY PEOPLE WITH NO COLLEGE because I wouldn't be able to find a job without a degree.
Like a sucker I fell for it and believed in those LIES, so I sacrificed my life for school.
Then later I found out from PEOPLE WHO FINISHED COLLEGE that I just need knowledge in order to be hired, and it turns out a degree is unimportant.
Like a sucker I fell for it and believed in those LIES, so I studied and worked on practical projects and gained knowledge.
Now when I try to get hired, they admitted that I am able to complete complex projects and I know how to solve problems even if I see them for the first time. But they rejected me because "I'm not sure why the car leaks oil".
I have to understand and know what the whole framework is doing under the hood, how everything works, how dependency injection works under the hood, SOLID principles under the hood, how decorators work under the hood, etc.
So now it turns out
- sacrificing life for school is not enough
- sacrificing life for degree is not enough
- sacrificing life for learning and gaining knowledge is not enough
- now the new trend is I have to know not only how to drive a car like a professional Formula 1 driver, I also have to be a mechanic and know how to fix the car if it breaks.
MATRIX IS A BIG FAT BULLSHIT AND A LIE.
I feel like they're looking for senior developer knowledge while paying a junior developer salary.
WTF IS THIS BULLSHIT?
I sacrificed 10 days of my life for their bullshit to build this project from scratch as a technical interview. They never said congrats on all the parts that were built right, but only complained about the small portion of bugs I didn't have time to fix.
ALL OF THIS FOR A SALARY OF $1500/MONTH THAT I ASKED FOR. THAT'S LESS THAN $20,000 A YEAR. THEY GAVE ME THE OPTION TO EITHER WORK FOR WAY LESS ($500-600/month) OR CALL THEM BACK IN A FEW MONTHS.
I JUST FINISHED COLLEGE AND THEY EXPECT ME TO HAVE 20 YEARS OF SENIOR DEVELOPER EXPERIENCE.
WTF IS THIS SLAVERY BULLSHIT?
HAVING $500/MONTH AS AN ENGINEERING SALARY WITH A DEGREE IS BELITTLING TO THIS JOB.
NO, I DON'T LIVE IN INDIA, I LIVE IN SERBIA. MY DOG IS SICK AND IT COSTS $100 A DAY JUST FOR HIS TREATMENT. HOW AM I SUPPOSED TO SURVIVE ON A SLAVE SALARY IN THIS ECONOMIC CRISIS?
I DON'T UNDERSTAND.
-
Given this trend, we should plan for the log4j 2.18 release 'round Wednesday-Thursday.
By the EOY we're likely to reach 2.22. That's a nice version number to meet 2022 with :)
v2.022
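Back-of-the-envelope, assuming 2.17 landed around 2021-12-18 and the cadence keeps up at roughly one release every three days (both the date and the pace are my rough guesses, not official figures):

from datetime import date, timedelta

release = date(2021, 12, 18)   # assumed 2.17 release date
cadence = timedelta(days=3)    # assumed "one patch every ~3 days" pace

for minor in range(18, 23):
    release += cadence
    print(f"2.{minor} ~ {release}")
# 2.18 projected for Dec 21, and 2.22 lands right around New Year

-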
I had a discussion about SAAS and microtransactions with another dev. They are a little bit younger than me. The trend toward this in games and Android apps was discussed.
We cannot be the only 2 people who avoid products employing these common business models. So I wonder what demographic pays for these services and products? I am to the point that if my kid asks to buy something in a game, I tell them that we will get rid of the game if they keep asking.
The only time I have paid for SAAS is when there is extraordinary perceived value. Quickbooks for small business is one such product (way cheaper than an accountant). Another is the Xbox game pass. So apparently for the game pass I am in the demographic.
Do we not like it because it is new? Or is it a kind of sleazy business tactic? I dunno. I would rather pay up front for most things. I feel like SAAS will be employed in software with proprietary file formats which require a subscription to even get to your data. Vendor lock-in.
-
So apparently there's a trend of non-educational games teaching kids how to handle real-life crises.
Last week I witnessed a 14 yo girl handling an anxiety attack of a grown ass person and she learned that from Ancestors: The Humankind Odyssey - https://devrant.com/rants/6229469/...
Now I hear 12 years ago there was a boy who saved his sister from a moose attack by... taunting - a skill he learned from World of Warcraft: https://nextnature.net/story/2010/...
Anyone have more stories like this?
-
Drupal 8 fractured the community, dead-ended projects that had years of buildup and support behind them, started a downward trend in the overall number of websites using Drupal while it was still increasing market share, and homogenized Drupal with other, less successful frameworks that had already attempted the same and failed: using Composer to replace drush, Twig to replace PHPTemplate, and Symfony to butcher Drupal and hang parts of it on.
The mission statement was to "bring Drupal to the modern era" and "be more enterprise friendly". All I've seen them do is make it worse. I have stopped using Drupal now; I still maintain some Drupal 7 sites, but now that they killed the Drupal 7 community it's basically dead. Some small attempt was made to salvage it with Backdrop, but that will likely never be as big as Drupal was and is mostly dead itself; for one thing, it's not directly compatible with the huge library of modules either.
Another thing I loved, killed by those without vision, giving in to "industry standards" that make one question the intellect of everyone who considers them a good idea. But hey, that evil procedural programming that worked so long for so many was finally defeated. It's surely better now, right... right?
At least this movement was supported by people who can't even tell the difference in real-project use cases between Drupal and WordPress. Software development is in such a good place and has no hypocrisy. One would never suggest it has lost sight of its original purpose of solving real-world problems with computing and become self-absorbed with its own navel-gazing.
If still in doubt, check the attached image; it tells a very clear story about how to ruin the life of a CMS. It honestly feels like a hit job attempting to sabotage it rather than an earnest attempt to improve something that had been doing well since 2001.
-
I wonder how many GitHub issues have been closed by asking the author to implement the feature they requested. In the past, I was confident my issue would be resolved by opening a new one when there was no answer in earlier questions. I can't tell whether the nature of my questions has advanced or whether it's a new trend, but I've opened maybe 4 or 5 issues in recent memory, and each time the collaborators suggest the feature is one I should contribute to their project by implementing it myself. Isn't this their job as maintainers? I'm already working on something that barely gives me breathing space. I encountered a challenge using your library, and your idea of helping is that I deviate from my own trajectory, get acquainted with your project and how to implement what I want, wait for it to get merged, etc., before continuing with what I originally intended. Do they think that's worth it?
Is it just me, or is this a common occurrence lately?
-
Iconless menus – another effect of the toxic minimalism trend. Icons in menus, such as those of the mobile web browser and device settings, help find items faster!
In 2014, Samsung removed icons from the upper right menu of their mobile Internet browser. At some point, roughly 2017, they realized it was a bad idea and brought them back. Could've told ya that earlier. 😁
-
‘Groundhogging refers to the idea that people are going for the same type of person over and over again, while expecting different results,’ they explain. ‘People pick out someone who fits their ideal type, date them, but end up feeling underwhelmed.'
From: https://metro.co.uk/2022/02/...
Awfully resembles a pure function makeLoveHappenForMe with a single arg typeOfPerson:
const typeOfPerson = Jerk
// this is a pure function
makeLoveHappenForMe(new typeOfPerson())
// will always fail
// but does it really have no side effects?
-
I feel like the pendulum on JS frameworks may be trending towards simplicity. I see lots of devs complaining about complicated frameworks. Maybe it will trend toward less JS-heavy solutions and maybe a return to simpler pages.
I dunno, one can hope right?
I don't do web development, but I see a lot of people that do, and they all sound like chain smokers and alcoholics. Something has to give.
-
What is it with this trend of having a SIM card in every modern car? Fuck's sake, can't even find one that can be sold without one -_-
-
The trend of mobile browser URL bars only showing the domain name and hiding the rest of the URL needs to stop.
This trend appears to have been introduced by, guess who, Apple with iOS 7, and Samsung has copied it to their browser to look oh-so-"minimalistic", even though it has no benefits at all.
Even desktop browser Opera had this bad design at some point.
-
From current and past events, I have seen how people do things because it's hip. Not doing any real thing to help or alleviate the suffering of the victims, but going the easier way to just flow with the trend. Yuck!
-
https://milkyeggs.com/?p=303
"I claim that the trend which AI/ML continues for lawyers is one that it starts for programmers. Just like how a partner at Cravath likely sketches an outline of how they want to approach a particular case and swarms of largely replaceable lawyers fill in the details, we are perhaps converging to a future where a FAANG L7 can just sketch out architectural details and the programmer equivalent of paralegals will simply query the latest LLM and clean up the output. Note that querying LLMs and making the outputted code conform to specifications is probably a lot easier than writing the code yourself -- and other LLMs can also help you fix up the code and integrate the different modules together!"
-
Which software/research/project have you created in your free time as a hobby recently?
I personally created a small widget app that would allow users to create widgets of PDF files on their mobiles.
I have noticed a personal trend: I tend to spend my free time on languages/tools/frameworks that are somewhat different from my daily job. I am a software engineer building SDKs in my job that provide a very commercial set of features for Android apps, but I ended up creating a hobby app that utilises Android's other cool APIs (storage, file, permission, etc.).
Before this project, too, I was exploring backend and web development, creating small React websites in my spare time.
Do you also spend time exploring outside the frameworks/tools used in your work life? Or if not, how do you keep yourself motivated? The lateral part is important for me, as I am soon going to a job where I might be exploring Android APIs in daily work life, and thereby making Android apps will become boring for me.
I remember before joining an SDK-making company, I was trying to come up with an SDK myself lol, which at that time was opposite to the work I was doing during the day.
-
Whatever the current trend on LinkedIn may be, at the end of the day the product development life cycle remains much the same.
Still, as developers which general domains in software do you think would flourish in the near future?
My picks (not in order) -
>> Cyber security : automation, both offensive and defensive
>> Block chain : trustable data platforms
>> Applied AI : a few key models, applied to all niches, bettering existing UX
>> IOT : wearables, embeddables, smart appliances
>> AR : Navigation prompts, real time info about real life objects
>> VR : Immerse entertainment. (Metaverse 🤮)
>> Quantum computing : first gen costly commercial releases, new algos
What would you add or subtract from this?
-
So if the current trend in software engineering is over-engineering, then the next can only be under- or appropriate engineering? =/
Definitely hoping it will be less proprietary, with fewer custom DSLs, and more grassroots-driven.
-
A system to build note-taking systems. tatatap dot com.
It’s the most successful for a few reasons: it got launched, people find it useful, but most importantly it’s been fun and continues to be fun to work on.
I think the fun-to-make factor is massively underestimated as a success indicator. Working on the right product (whatever that means) that is unenjoyable is like using an amazing computer with a broken keyboard. It’s never going to work.
Sure, with any project there’s annoying stuff, but it’s the trend overall. Is the core functionality fun to work on?
In the case of Tap, the core component is a notation parser, open-sourced as sowhat: github dot com/tatatap-com/sowhat
That was super fun to make and learn about lexing and parsing. It’s pretty far along but there’s still a lot I’m planning to add. -
What is your view on the trend of major companies towards virtual technologies such as AR, VR and MR? Is it possible in the future to replace the computer with these technologies?