Search - "abstraction"
-
Long time no see Friend:- "hey dude what do you do for a living"
Me:- "I suffer in silence"
Ltnsf:- " but what exactly do you do"
Me:-"I stare at a rectangular space hoping to find enlightenment" -
I actually hate this job; it seems like there's not a single project with decent code abstraction. Everything is fucking spaghetti, like:
```
// we only care about e-mail fields, which are odd
isValid(index) {
    if (!(index % 2)) {
        return true;
    }
    ...
}
```
Like MOTHERFUCKER, WHAT BUSINESS RULE DOES THIS SHITCODE REFLECT?!?! WHY CAN'T YOU SHITHEADS WRITE A PROPER BUSINESS ABSTRACTION RATHER THAN JUST COLLEGE-GRADUATE QUALITY SHITCODE.
FUCKING KILL ME ALREADY I SHOULD HAVE INSTEAD BECAME A PSYCHIC CAUSE I'M SURELY GOOD AT GUESSING WHAT THE FUCKING FUCK THIS FUCKING FUCKCODE INTENDS TO ACHIEVE.
AND YOU CALL YOURSELF A TOP-NOTCH DEV CAUSE THIS IS JAVASCRIPT... YOU KNOW WHAT, SHITHEADS LIKE YOU, WHO DON'T KNOW SHIT OTHER THAN GLOBALLING EVERY FUCKING NPM LOCAL PACKAGE, ARE WHY GOOD ENGINEERS LIKE US GET SHIT FROM PHPEPSI ZENDFRAMESHIT FUCKHEAD DEVS.
DO YOU THINK YOUR COMMENT WAS HELPFUL??? DO I LOOK LIKE A BUSINESS GRADUATE FUCKTARD WHO DOESN'T KNOW WHAT THE FUCK THE MODULO OPERATOR IS??? I WANT TO KNOW WHY YOU WROTE THAT SHITFUCK INSTEAD OF WHAT IT DOES; THE REASON I'M READING YOUR POORLY WRITTEN MODULO OPERATOR SOAP-OPERA IN THE FIRST PLACE IS CAUSE I KNOW WHAT IT'S DOING: IT'S BREAKING SHIT.
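For contrast, a sketch of what an intent-revealing version could look like (all names here are mine, invented for illustration, not the actual business rule):

```
// The rule the original comment hints at but never states:
// e-mail inputs land on the odd indices.
function isEmailField(index) {
    return index % 2 === 1;
}

function isValid(index) {
    if (!isEmailField(index)) {
        return true; // non-email fields have no validation rules
    }
    // ... actual e-mail validation goes here
}
```

Same behavior, but the *why* lives in a name instead of a comment about parity.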
OH AND ONE MORE THING, FUCK YOU FUCK FUCK FUCKSHIT SHITFUCK FUCK -
relationship with dev perks (just happened):
GF : *bad mood* i'm hungry
Me : Let's go get some food ! *trying to cheer her up*
GF : No.
Me : Ok, whatever you say.
GF : Do you really wa--
Me : Whaat? you said "No"?
GF : Don't you see abstraction in my face?
Me : So what? You want me to Implement it?
GF : NO. PUT IT IN YOUR GODDAMN MAIN FUNCTION.
Me : ok let's go *still don't understand what she meant*
GF : Good Job. -
FML. An overreaching supergenius "architect" and a database team:
A: "We have decided that apps should use mysql. Install a MySQL so we match cloud"
DBA: "we don't have an image or experience with MySQL. We have mssql and Oracle "
A: "ok, use mssql in data center and mysql in production cloud"
DBA: "that's... not going to work well"
A: "just do it!"
...
Me, reading this shit, sends email: "ignoring the fact that we have more than 500 queries in this application which will need to be checked and most likely rewritten, how are we supposed to test the mysql queries without production access?"
A: "just use mssql local and MySQL in cloud"
M: "... Just to make sure I understand, you want us to write queries for mssql, test them locally, and then write separate queries, with a separate SQL connection abstraction that deploys to production? Again, how are we going to test this?"
A: "no, use same queries, should be fine"
M: "they really won't, they're different dialects"
A: "do the needful, make work!"
If karma were a thing, this person would have long since exploded into a cloud of atomized blood.
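For the record, the dialect gap is real, not cosmetic. Two stock examples of the same intent (illustrative queries, not from our actual 500):

```
// T-SQL (SQL Server) vs MySQL: mutually incompatible syntax for the same thing.
const mssqlQuery = "SELECT TOP 25 id, ISNULL(note, '') AS note FROM orders ORDER BY created_at DESC";
const mysqlQuery = "SELECT id, IFNULL(note, '') AS note FROM orders ORDER BY created_at DESC LIMIT 25";
```

"Use same queries, should be fine," indeed.

-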
"Hey nephew, why doesn't the FB app work. It shows blank white boxes?"
- It can't connect or something? (I haven't used the FB app since 2013.)
"What is this safe mode that appeared on my phone?!"
- I don't know. I don't hack my smartphone that much. Well, I actually do have a customised ROM. But stop! I'm pecking my keyboard most of the time.
"Which of my files should I delete?"
- Am I supposed to know?
"Where did my Microsoft Word Doc1.docx go?"
- It lets you choose the location before you hit save.
"What is 1MB?"
- Search these concepts on Google. (some of us did not have access to the Internet when we learned to do basic computer operations as curious kids.)
"What should I search?"
- ...
"My computer doesn't work.. My phone has a virus. Do you think this PC they are selling me has a good spec? Is this Video Card and RAM good?"
- I'm a programmer. I write code. I think algorithmically and solve programming problems efficiently. I analyse concepts such as abstraction, algorithms, data structures, encapsulation, resource management, security, software engineering, and web development. No, I will not fix your PC. -
I’m surrounded by idiots.
I’m continually reminded of that fact, but today I found something that really drives that point home.
Gather ‘round, everybody, it’s story time!
While working on a slow query ticket, I perused the code, finding several causes, and decided to run git blame on the files to see what dummy authored the mental diarrhea currently befouling my screen. As it turns out, the entire feature was written by mister legendary Apple golden boy “Finder’s Keeper” dev himself.
To give you the full scope of this mess, let me start at the frontend and work my way backward.
He wrote a javascript method that tracks whatever row was/is under the mouse in a table and dynamically removes/adds a “.row_selected” class on it. At least the js uses events (jQuery…) instead of a `setTimeout()` so it could be worse. But still, has he never heard of :hover? The function literally does nothing else, and the `selectedRow` var he stores the element reference in isn’t used elsewhere.
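A guess at the shape of it, plus the one-liner that makes it pointless (selectors here are illustrative, not from the actual codebase):

```
// What the function apparently does, minus the jQuery ceremony:
let selectedRow = null; // stored, then never read anywhere else
document.querySelectorAll("tr").forEach((row) => {
    row.addEventListener("mouseenter", () => {
        selectedRow = row;
        row.classList.add("row_selected");
    });
    row.addEventListener("mouseleave", () => {
        row.classList.remove("row_selected");
    });
});

// The CSS that replaces every line above:
//   tr:hover { background-color: #eef; }
```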
This function allows the user to better see the rows in the API Calls table, for which there is also a search feature — the very thing I'm tasked with fixing.
It’s worth noting that above the search feature are two inputs for a date range, with some helpful links like “last week” and “last month” … and “All”. It’s also worth noting that this table is for displaying search results of all the API requests and their responses for a given merchant… this table is enormous.
This search field for this table queries the backend on every character the user types. There’s no debouncing, no submit event, etc., so it triggers on every keystroke. The actual request runs through a layer of abstraction to parse out and log the user-entered date range, figure out where the request came from, and to map out some column names or add additional ones. It also does some hard to follow (and amazingly not injectable) orm condition building. It’s a mess of functional ugly.
The important columns in the table this query ultimately searches are not indexed, despite it only looking for “create_order” records — the largest of twenty-some types in the table. It also uses partial text matching (again: on. every. single. keystroke.) across two varchar(255)s that only ever hold <16 chars — and of which users only ever care about one at a time. After all of this, it filters the results based on some uncommented regexes, and worst of all: instead of fetching only one page’s worth of results like you’d expect, it fetches all of them at once and then discards what isn’t included by the paginator. So not only is this a guaranteed full table scan with partial text matching for every query (over millions to hundreds of millions of records), it’s that same full table scan for every single keystroke while the user types, and all but 25 records (user-selectable) get discarded — and then requeried when the user looks at the next page of results.
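For reference, the bare-minimum fix for the keystroke spam is a debounce, sketched below; `searchInput` and `runSearch` are placeholder names, not the real ones:

```
// Fire the search only after the user pauses typing for 300 ms.
function debounce(fn, delayMs) {
    let timer = null;
    return (...args) => {
        clearTimeout(timer);
        timer = setTimeout(() => fn(...args), delayMs);
    };
}

searchInput.addEventListener("input", debounce((event) => {
    runSearch(event.target.value);
}, 300));
```

That still leaves the full table scans, the missing indexes, and the fetch-everything pagination, but at least the server stops getting queried per keystroke.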
What the bloody fucking hell? I’d swear this idiot is an intern, but his code does (amazingly) actually work.
No wonder this search field nearly crashed one of the servers when someone actually tried using it.
Asdfajsdfk. -
Mountain climbing. Increases social skills, teamwork and trust.
Building a house. Increases spatial visualization and planning skills.
Electronics. Increases mathematical and problem solving skills.
Chemistry. Increases precision and analytical reasoning skills.
Psychedelic drugs. Increases imagination, inspiration and abstraction skills. -
When you create a bunch of objects in Java and it crashes because you're used to the memory usage of C's structs.
-
I don't get it
My brain does not have the capacity to understand it
How the fuck does my colleague manage to write 12 classes/interfaces for something so stupidly simple??
Two classes, a handful of functions, done.
Why do you need this level of abstraction?
To mock the interfaces in unit tests? The unit tests you didn't write because "they're not necessary"?
No one will be able to understand this clusterfuck of a module even though its entire purpose is "read number and write number elsewhere"... -
C'mon people! Spread the word! "The cloud" is not "just someone else's computer", it's a completely different way to compute!
I'm so tired of the oversimplifications used to explain the concept. The massive amount of work, sweat and tears put into the orchestration, automation and abstraction layers to deliver truly elastic, scalable and self-healing infrastructure, applications and services deserves a fuckload more respect than "just someone else's computer"!
Hosting and time-sharing have been with us almost as long as we have had computers (mainframes etc), but dismissing the effort of thousands upon thousands of devs and ops people to make systems robust and automated enough to literally being able to throw a wrench in the engine any time during production and not have the systems suffer is fucking insane!
The whole reason the term "cloud" is so fitting is not just because it was coined from the cloud shape used in technical and non-technical drawings and illustrations symbolising the internet, but also because of the illusion of magic it gives the end-user, who never gets to see "what's inside the music box". -
A decade ago 800x600 was pretty much the standard resolution for devices and 5 sec response time was considered fast. Animations were minimal and websites were easier to read. Programmers debated around topics like which loop runs faster, i++ or ++i, while vs doWhile and so on. In general, we were closer to understanding what happens behind the browser curtain and how code needs to be organized to make it more maintainable.
Today the level of abstraction is much higher. I don't think devs contemplate the finer aspects of programming efficiency; they'd rather rely on a code library to do all the grunt work. With the explosion of devices and platforms, the focus has shifted from programming to assembling. Programmers need to know their tools first, then write code. The tool is expected to work well with a millisecond response time, not the programmer's code.
Moving forward, I think programming would be more about building higher abstraction utilities/libraries that are integrated by other tools, which is already happening. Marketing an App would become more important than the actual skill needed to develop it.
A bit far-fetched, but I think the future programmer would be a lot like a stock market analyst who has a bunch of windows in front, just observing data or algorithm patterns created by an AI engine and cherry-picking a specific combination of modules that might make the next big sensational app. -
Well, it all started off with hardware-level programming involving jumpers and stuff like that... Then came Assembly, which was good... then B, then C compilers. Finally came the interpreted languages, and that's where in my opinion the abstraction should've ended. But no, we needed more frameworks, more libraries, even more abstraction! Where does it end? As it seems to be going, I guess that users will have kid toys - no, iToys! - for electronics and we'll be programming with bloated Scratch GUIs. Nothing against Scratch, but that shit ain't proper programming anymore. God, I can't wait for the future.
ABSTRACT ALL THE THINGS!!!
Oh and not to mention that all software will be governed in political correctness by some Alex SJW AI shit that became sentient. Not a single programming term will be non-offensive anymore, no matter how hard you try to not offend anyone, or God forbid - don't care about it because you just want to make something that's readable, usable and working!! Terms, UI names for buttons, heck even icons! REMOVE IT BECAUSE IT OFFENDS SOMEONE THAT I DON'T EVEN KNOW JACK SHIT ABOUT!!! -
So I have a teacher whose "C++" is basically C with a .cpp file extension and the -O0 compiler flag.
The last assignment was to implement some arbitrary lengthy calculation with a tight requirement of max 1 second runtime, to force us to basically handroll C code without using std or any form of abstraction. But because the language didn't freeze in 1998, there is a little keyword named "constexpr" that folded all my classes, arrays, iterators, virtual methods, std::algorithms etc. into a single return statement, thus making my code the fastest submitted.
Lesson of the story: use the language to the fullest, and always turn on the damn optimizer.
Ok now I’m done 😚 -
Assembly...
Do I really need to say more? Okay, it's low level so there is no abstraction. All you deal with is cold naked mathematical truth and physical limitations.
It's still ❤️ -
Bad dev practices:
1. Forgetting to version control some fun project i've been doing for a long time, then committing everything at once. And forgetting about it again...
2. I probably have too much love for abstraction. So i abstract stuff just for the fuck of it to the point my friends dont even understand what the program is for.
3. I have no patience and due to that i lose motivation when i think of some idea that is big.
4. I can't keep my ideas small enough, and i dream too big until problem 3 kicks in, and then i drop the entire idea. -
There are cybercrimes. That means you can be put in jail for performing certain actions with your computer. I'm talking about serious crimes like hacking crucial governmental servers, not about insulting people online. I'm talking about something that'll make the government chase you.
Every action at the computer could be done with keyboard only.
My face when there is a finite sequence of keys that you press one by one and then become a criminal. And go to jail.
My face when, if you put that sequence into a script file, there is a file that you double-click and instantly become a criminal.
Press here to go to jail. A whole new level of abstraction.
Really makes me think. -
I’m adding some fucking commas.
It should be trivial, right?
They’re fucking commas. Displayed on a fucking webpage. So fucking hard.
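For the record, here is the entire feature in a sane world, assuming the amount is still an actual number by the time it reaches the display layer:

```
// Digit separators (and the currency symbol) in one standard call:
const amount = 100000;
const display = amount.toLocaleString("en-US", { style: "currency", currency: "USD" });
// display === "$100,000.00"
```

One line. Keep the currency a number everywhere, and format only at the edge.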
What the fuck is this even? Specifically, what fucking looney morons can write something so fucking complicated it requires following the code path through ten fucking files to see where something gets fucking defined!?
There are seriously so fucking many layers of abstraction that I can’t even tell where the bloody fucking amount transforms from a currency into a string. I’m digging so deep in the codebase now that any change here will break countless other areas. There’s no excuse for this shit.
I have two options:
A) I convert the resulting magically conjured string into a currency again (and of course lose the actual currency, e.g. usd, peso, etc.), or
B) Refactor the code to actually pass around the currency like it’s fucking intended to be, and convert to a string only when displaying. Like it’s fucking intended to be.
Impossible decision here.
If I pick (A) I get yelled at because it’s bloody wrong. “it’s already for display” they’ll say. Except it isn’t. And on top of that, the “legendary” devs who wrote this monstrosity just assumed the currency will always be in USD. If I’m the last person to touch this, I take the blame. Doesn’t matter that “legendary Mr. Apple dev” wrote it this way. (How do I know? It’s not the first time this shit has happened.) So invariably it’ll be up to me to fix anyway.
But if I pick (B) and fix it now, I’ll get yelled at for refactoring their wonderful code, for making this into too big of a problem (again), and for taking on something that’s “just too much for me.” Assholes. My après Taco Bell bathroom experiences look and smell better than this codebase. But seriously, only those two “legendary” devs get to do any real refactoring or make any architecture decisions — despite many of them being horribly flawed. No one else is even close to qualified… and “qualified” apparently means circle jerking it in Silicon Valley with the other better-than-everyone snobs, bragging about themselves and about one another. MojoJojo. “It was terrible, but it fucking worked! It fucking worked!” And “I can’t believe <blah> wanted to fix that thing. No way, this is a piece of history!” Go fuck yourselves.
So sorry I don’t fit in your stupid club.
Oh, and as a pointed, close-at-hand example of their wonderful code? This API call I’m adding commas to (it’s only used by the frontend) uses a JSON instance variable to store the total, errors, displayed versions of fees/charges (yes, they differ, because of course they do), etc. … except that variable isn’t even defined anywhere in the class. It’s defined three. fucking. abstraction. layers. in. THREE! AND. That wonderful piece of smelly garbage they’re so proud of can situationally modify all of the other related instance variables, like the various charges and fees, so I can’t just keep the original currency around, or even expect the types to remain the same. It’s global variable hell all over again.
Such fucking wonderful code.
I fucking hate this codebase and I hate this fucking company. And I fucking. hate. them. -
The more depressed you get over the current state of software, the more you know you've made it. When you start forming your own opinions and saying "wow, these people are full of shit".
Primary example: the web development overblown bullshit. Fuck me dude, you really don't need that full-featured React, Vue or Angular framework to make sense of shit. You are going over the top for fucking ajax functionality and state management that you could do by yourself without needing to learn a full framework. By the time you finish learning React, you'd probably have been better served by standard vanilla-af JS and server-side rendering.
Our world is full of fads and many talented people that perpetuate them. It's fine, it is the nature of the beast. But a lot... A LOT of software is very POORLY written. And adding levels of abstraction over a very broken paradigm (web in this case) does not and will not make it better.
Basically I fucking hate being a web developer and want to go back to a time in which we cared about how much memory our applications consumed, and didn't worry about the fucking frontend having the ability to implement machine learning.
I want to run sublime.exe and be sure that it is a native application on my system, not a contained web browser posing as my fucking text editor. With 20 MB of RAM at most, instead of 500 MB. WTF.
I knew I made it when I could read comments on Hacker News and Reddit and say "this idiot is full of shit". I knew I made it when I would sigh heavily at the idea of having another project rather than having a fangirl attitude towards it.
I knew I made it when people writing about software development meant shit to me, rather than me wondering in awe what the fuck they were talking about.
I knew I made it when getting laid was more important to me than fucking around with code.
pussy > code
Fuck you. -
Help.
I'm a hardware guy. If I do software, it's bare-metal (almost always). I need to fully understand my build system and tweak it exactly to my needs. I'm the sorta guy that needs memory alignment and bitwise operations on a daily basis. I'm always cautious about processor cycles, memory allocation, and power consumption. I think twice if I really need to use a float there and I consider exactly what cost the abstraction layers I build come at.
I had done some web design and development, but that was back in the day when you knew all the workarounds for IE 5-7 by heart, and when people were disappointed there wasn't going to be an XHTML 2.0. I didn't build anything large until recently.
Since that time, a lot has happened. Web development has evolved in a way I didn't really fancy, to say the least. Client-side rendering for everything the server could easily do? Of course. Wasting precious energy on mobile devices because it works well enough? Naturally. Solving the simplest problems with a gigantic mess of dependencies you don't even bother to inspect? Well, how else are you going to handle all your sensitive data?
I was going to compare this to the Arduino culture of using modules you don't understand in code you don't understand. But then again, you don't see consumer products or customer-specific electronics powered by an Arduino (at least not that I'm aware of).
I'm just not fit for that shooting-drills-at-walls methodology for getting holes. I'm not against neither easy nor pretty-to-look-at solutions, but it just comes across as wasteful for me nowadays.
So, after my hiatus from web development, I've now been in a sort of internet platform project for a few months. I'm now directly confronted with all that you guys love and hate, frontend frameworks and Node for the backend and whatever. I deliberately didn't voice my opinion when the stack was chosen, because I didn't want to interfere with the modern ways and instead get some experience out of it (and I am).
And now, I'm slowly starting to feel like it was OKAY to work like this. -
Refactoring
When I wrote some abstraction,
commented out the old stuff,
replaced it with the new calls,
and after it passed all tests,
got to remove all the old crud.
( ☼ ◡ ☼ )
I feel clean again. -
Hi everyone, long time no see.
Today I want to tell you a story about Linux, and its acceptance on the desktop.
Long ago I found myself a girlfriend, a wonderful woman who is an engineer too but who couldn't be further from CS. For those in the know, she absolutely despises architects. She doesn't know the size units of computers, i.e. the multiples of the byte. Breaks cables on the regular, and so on. For all intents and purposes, she's a user. She has written some code for a college project before, but she is by no means a developer.
She has seen me using Linux quite passionately for the last year or so, and a few weeks ago she got so fed up with how Windows refused to work on both her computers (on one of them literally failing to run exe's, go figure), that she allowed me to reinstall both systems, with one of them being dualbooted Windows 10 + Linux.
The computer that runs Linux is not one she uses very often, but for gaming (The Sims) it's her go-to platform. On it I installed Debian KDE, for the following reasons:
- It had to be stable as I didn't want another box to maintain.
- It had to be pretty OOTB, as first impressions are crucial.
- It had to be easy to use, given her skill level.
- It had to have a GUI abstraction over apt; the KDE team built Discover, which looks gorgeous.
She had the following things to say about Linux, when she went to download The Sims from a torrent (I installed qBittorrent for her iirc).
"Linux is better, there's no need to download anything"
"Still figuring things out, but I'm liking it"
"I'm scared of using Windows again, it's so laggy"
"Linux works fine, I'm becoming a Linux user"
Which you can imagine, it filled me with pride. We've done it boys. We've built a superior system that even regular users can use, if the system is set up to be user-friendly.
There are a few gripes I still have, and pitfalls I want to address. There are still too many options; users can drown in the sheer number of distros to choose from. For us that's extremely important, but they need to have a guide there. However, don't do remote administration for them! That's even worse than Microsoft's tracking! Whenever you install Linux on someone else's computer, don't be all about efficiency: they are coming from Windows and just want it to be easy to use. I use MATE myself, but it is not the thing I would recommend to others. In other words, put your own preferences aside in favor of objective usability. You're trying to sell people on a product, not to impose your own point of view. Dualboot with Windows is fine; gaming still sucks on Linux for the most part. Lots of people don't have their games on Steam. CAD software and such is still nonexistent (OpenSCAD is very interesting, but don't tell me it's user-friendly). People are familiar with Windows. If you were swimming for the first time in deep water, would you go without aids? I don't think so.
So, Linux can be shown to and actually used by regular people. Just pitch it in the right way. -
TL;DR: Emojis in commit messages + bad commit messages made by Microsoft™ employees.
Yes, I'm looking at you, Microsoft. It would be helpful if I could, you know, understand your commit messages, instead of trying to guess wtf _that_ emoji means. That is, if it's even the same emoji on my machine. We didn't figure that one out yet. And no, "Some 💄 changes ✨" is not a good commit message, even if you interpret it correctly (which depends on your emoji icon set).
idk about you, but that shitty 💄 emoji tends to look like the one attached (see image), and I happen to associate that with an XLR audio cable. I had to ask someone else to understand a commit message; a message that is supposed to be explicit—stating what you changed and optionally why you changed it (you can off-load that part to an issue tracker).
Furthermore, that "Some 💄 changes ✨" commit did none of that. "I made cosmetic changes somewhere for some reason without linking to an issue." If you didn't catch that little detail yet: "COSMETIC CHANGES" is vague as fuck. What is a cosmetic change?
* Does a cosmetic change mean adjusting indentation?
* Does it mean deleting unnecessary abstraction to make the code more readable?
* Does it mean refactoring code to add that beauty factor?
* Does it mean all of the above? Or perhaps a specific combination of these?
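For contrast, a message that would have answered all of those questions, sketched with a made-up scope and issue number:

```
style(checkout): align buttons to the 8px spacing grid

Purely visual; no behavior changes. Refs #1234.
```

What changed, why it's safe, and where to read more. No lipstick required.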
Human communication is shit enough, don't make it worse than it already is. -
For almost twenty years I have sheltered in the protective, safe, warm bosom of Debian. For a long time, it had the largest body of available software of all the distros, and by far when Ubuntu rose to prominence. So I used Ubuntu for years for the depth of package availability, and because if something esoteric was released, it would almost certainly come out first on Ubuntu, and sometimes only on Ubuntu. I was happy. Things were good.
But over time, Ubuntu and even Debian started to lean harder and harder on gnome, which I've always hated, along with all desktop environments, as they obscure the system from the user, and introduce graphical layers of abstraction, so the actual job of getting things done becomes a black art, hidden behind gnome-specific tools. This is my preference, and it's been disheartening in recent years to see the direction the desktop appears to be taking.
Then I joined devrant in 2017, and until then, I had heard peripherally about Arch, but never more than that. I had not heard of Manjaro at all. People started posting success stories and happy screenshots, and I was intrigued.
In 2018 I built a windows machine to use for parsec streaming games that wouldn't run on my linux rig. For not a great deal of money, I built a solid machine that's unequivocally better than any machine I've ever used, and installed windows on it. For a while, I was pleased. I had the best of both worlds: a windows box to stream some games from, and a linux desktop for everything else.
But after a couple months, as proton matured, I found fewer and fewer reasons to use my windows machine. My use of it declined to where I was last week: it had been months since I'd even powered it on. It was the most powerful machine I've ever used, and it was just collecting dust behind the TV in the living room. The full realization came to me while I was fighting a battle in the Gnome Takeover War, and I realized: I don't have to do this.
I pulled the newer machine out from behind the TV and installed Manjaro architect edition on it. The flexibility in the install was staggering. I am using nilfs2 for my /boot and / partitions: an option that Ubuntu has never offered. Normally they just default you into the garbage ext4 filesystem, and if you can dig deep enough, you can install with something else, though you have to really want it, in my opinion.
But Manjaro has been a dream-come-true. Pacman is easily the best package manager I have ever used, and pamac's intuitive and easy commands are a great view into AUR. Booting into the virtual console instead of a display manager has been wonderful too. On Ubuntu, I had to disable systemd's version of runlevel 5 to even get it working. But I just popped my xrandr script into my .xinitrc, and X opens with startx in less than a second. On Ubuntu, it takes about 5-10 seconds.
This has nothing to do with Manjaro, but I also switched to Radeon for this install, and I couldn't be happier about that. No more "installing" nvidia's drivers.
No more gnome. No more PPAs. No more settling. I am a Manjaro user now. Full stop. Thank you, devrant, for bringing it to my attention. -
Spend 14 hours a week studying more with my free time.
Things to be studied:
-discrete math
-data structures
-algorithms
-coding challenges
-problem defining
-abstraction
-other relevant maths
Other things I want to improve:
-confidence at work
-reaching out to teams with questions
-social skills
-time management
-enjoying the little things
-patience
-consistency (with everything above)
Last big thing would be being more conscious of what type of data/platforms I digest every day. Just like a good diet, I want to get in the habit of consuming "good" useful content that's thought-provoking or educational, rather than fast-food social media carbs.
Wish everyone a productive New Year! -
Manager: Could you create the UI for the new feature? The client wants to test it. We need it in 3 days.
*1 week later*
Client: IT DOESNT WORK
Me: This is just a visual demo... but everything will work when we release the feature.
Client: okay but can I see what it will do?
Of course you can! Just wait until we release it!
*2 weeks later*
Manager: What are you doing?
Me: Working on the UI for the new feature.
Manager: Wait, hadn't you already done it for the demo?
Me: That UI didn't really work. It was basically a bunch of HTML, without reactivity or abstraction or any functionality.
Manager: Okay, how much were you able to re-use?
Me: almost nothing.
Manager: So... you wasted those 3 days?
Oh so I'm the one who wasted 3 days.
Me: Kinda, yeah
Manager: Why couldn't you have done this when I asked you to do the UI?
You can't expect good quality code in 3 days. Pls stop wasting it on demos. -
OK heavy rant on 'modern' software development coming! --> don't take it too seriously though :-)
Electron... why does that shit exist? It is like stacking all the worst technologies available to mankind into an enormous pile of crap and polishing that turd to look like something wonderful. It is big, slow and overall AWFUL!
An example? ... Microsoft Teams :-( it burns your PC like fire and makes it squeal for mercy.
When a library/framework becomes the ultimate evolution of abstraction layer upon abstraction layer, it simply should stop existing, and a reset button needs to be pressed.
I would love to see some research on the real world environmental impact that all those shitty slow and bloated web technologies have.
Solution:
Software energy label!
C, C++ and Rust etc., and all accompanying efficient UI libraries, should be the only languages/implementations allowed to get an A, B or C label.
Python (without C libraries like NumPy), JavaScript and all that other slow interpreted scripting/Web API nonsense should get a D, E or F label by default.
Have fun! -
Getting really tired of newer devs in the OSS world re-creating something that has been around for decades, slapping a flashy logo on it, and saying they invented a "blazing fast", "under 200 LOC" way to do something.
"Under X lines of code!!1" is not impressive. It just means you don't understand how abstraction works.7 -
Since fucking when did "bare metal" mean just running on an OS?? At a conference and literally everyone is like "we got kubernetes running on bare metal", got super excited for a bit because just the idea of that sounds amazing but they're using it as slang for "basically not in a container or vm."
Nothing exciting at all. Now we're patting ourselves on the back for getting software working without it being preconfigured as a container or a VM image. No one knows how to do anything any more. MUCH too much abstraction going on.
I guess it keeps me more employable, but the state of the world from a developer standpoint is just sad.
(For reference, this is what the first sentence of "Bare Metal" looks like on Wikipedia: "In computer science, bare machine (or bare metal) refers to a computer executing instructions directly on logic hardware without an intervening operating system.") -
Don't you hate it when your co-worker does dumb things, but thinks it's the "clean code" way?
The following is a conversation between me and a co-worker, who thinks he's superior to everyone because he thinks he's the only one who read the Clean Code series. Let's call him Bill.
Me: I think the feature we need is quite simple, our application needs to call this third party API, parse the response and pass it to the next step. Why do you need to bury everything under an abstraction of 4 layers?
Bill: bEcAuSe It'S dEcOuPlInG, aNd MaKe ThE cOdE tEsTaBlE
Me: I don't know man, you only need to abstract the third-party API client, and then mock it if you want. Some interfaces you define make no sense at all. For example, this interface only has 1 concrete implementation, and I don't think it will ever have another. Besides, the concrete implementation only gets the input from the upper layer and passes it down to the lower layer. Why the extra step? I feel like you're using interfaces just for the sake of interfaces.
Bill: PrOgRaMmInG tO iNtErFaCe, NoT cOnCrEtE iMpLeMeNtAtIoN!!!
Me: You keep saying those words, I don't think they mean what you think they mean. But they certainly do not mean that every method argument must be an interface
Bill: BuT uNcLe BoB blah blah blah...
Me: *gives up all hope*
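What we were actually arguing about, sketched; every name below is invented, and `fetchData`/`parse` stand in for the real calls:

```
// His way, compressed: interfaces four layers deep, one implementation each.
// IClientProvider -> ClientProviderImpl -> IClientFactory -> ClientFactoryImpl -> IApiClient -> ApiClientImpl

// The one seam worth abstracting: the third-party client itself.
const parse = (response) => response.items; // placeholder for the real parsing

async function getResult(client, input) {
    const response = await client.fetchData(input); // mock `client` in tests
    return parse(response); // pure, testable without a single extra interface
}

// In a unit test, the "abstraction" is just a stub object:
const fakeClient = { fetchData: async () => ({ items: [1, 2, 3] }) };
getResult(fakeClient, "anything").then(console.log); // [1, 2, 3]
```

One seam, one stub. Everything else is ceremony.

-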
brain: ABSTRACTION ABSTRACTION ABSTRACTION too much ABSTRACTION!
me: jeez calm down a lil, i just deployed a boilerplate ember web app with cli tools, with next to none of 'my' own code in it.
b: YES U SUCKER THAT'S WHAT WENT WRONG U DON'T KNOW SHIT ABOUT THE LIL STUFF THAT HAPPENS BEHIND THE SCENES THE FUCK MAN U CALL YOURSELF A CS STUDENT YOU CAN'T EVEN WRITE A COMPILER YET
m: sooo remember when we were studying logic gates and binary conversions and you sigkilled all my threads cuz it was 'boring'?
b: why yes why do you ask
m: WELL that's where we'll end up again if you don't stop nagging me about going down. Trust me, I KNOW how to starve you and you'll beg me to use Python again. You start making advanced data structures in C and the next thing you know you're writing assembly code 'just for fun'.
I have a hackathon coming right up and I have to use a framework or my team loses the advantage. Are we good?
b: well if you put it that way...BUT AFTER THAT YOU'RE TAKING ME TO AN ALGORITHM SESSION
m: *eerily stares at the dusty book in the corner*
you... have a deal -
Working in the embedded systems industry for most of my life, I can tell you methodical testing by the software engineers is significantly lacking. Compared to higher-level language development with unit tests and the like; something I think the higher-level, abstracted side of the industry actually hit out of the park.
The culture around unit testing and testing in general is far superior in java and the rest.
Down here in embedded, all too often I hear "well it worked on my setup... it worked at my desk"... or "Oh, I forgot to test that part"... or "I didn't think that particular value could get passed in"... etc. I've heard it all. Then I've also heard "you can't do TDD or unit tests like high-level on embedded"... HORSESHIT!
You most definitely can! This book is a great book to prove the point, or to use as confirmation that you are doing things correctly. My history with this book: I was already doing my own technique of unit testing, based on my experience at the high level. Was it perfect? No, but I caught much more than if I hadn't done the testing. THEN I found this book and was like, ohh cool, I'm glad I'm on the right thought process, because essentially what they were doing in the book is what I was doing, just slightly less structured and missing a few things.
I’ve seen coworkers immediately think it’s impossible to utilize host testing .. wrong.
Come to find out, most of the problems are actually related to lack of abstraction or forethought put into software system design by many lone-wolf embedded developers... either from being alone, or from not having to think about the repercussions of writing direct register writes in application code, or creating 1500-line "main functions" because their perception is "main = application". (Not everyone is like this.) But it seems to be related to the EEs writing code (they don't know what the CS knows) and the CS folks writing over-abstractions that won't fit on embedded... then you have CEs that either get both sides or don't... the ones who understand the low-level needs but also get high-level concepts and paradigms, and can adapt them to low-level requirements: BOOM, those are the special folks.
ANYway... the book is great because it's a great beginner book for those embedded folks who don't understand what TDD or unit testing is and think they can't do it because they are embedded. So all they do is ad-hoc testing on the fly: no recorded results, no conclusions drawn from data, a very quick spot check and done...
If your embedded software engineers say they can't unit test, or can't do TDD, or anything other than ad-hoc testing... throw the book at them and say you want the unit test results report by next week Friday, and walk away.
Lol -
Yknow, I want to make an android app that has been in my mind for about half a year now, and I've already tried twice, both with Kotlin and with Java, but every time I try it's just pain and suffering and frustration...
No it's not because of the language, I like Java and I like Kotlin too and I'd say I'm at least decent at Kotlin and really good in Java...
No no.. the issue is the fucking Android SDK and the mix-and-match documentation available online!!!
Every fucking time I want to implement some sort of UI element, user action or background service, and I start googling how to do it, it comes up with at least 3 different Stack Overflow solutions, all of them saying "that way of doing it is deprecated, instead you should X". And looking up the OFFICIAL FUCKING DOCS just makes me roll up in the corner and cry because of how fucking inconsistent it is and the retarded domain language it uses... fucking transactions for fucking fragments inside fucking activities... because I guess a word like "screen"/"view"/"template" or something similarly natural was just too mainstream for the all-knowing alphabet soup that google is...
And then you start looking up what the fucking difference even is and how to code it up only to find out there's at least 12 other opinions on how fragments should be used and what should be an activity and what should be a damn fragment...
But that's not all, that's just the base... I get a headache even thinking about how the fucking inflating of templates and the entire R. notation works. You want to open a fucking tiny corner menu with the settings options? WELL THEN YOU FUCKING BETTER REMEMBER TO IMPLEMENT IT THROUGH SOME SORT OF EVENT AND INFLATE THE MENU YOURSELF EVEN THOUGH ITS THE SAME FUCKING THING WITH STATIC STRINGS...
AND WHY THE FUCK DO I NEED LIKE 4 NEW FILES TO IMPLEMENT A FUCKING LISTVIEW...
also talking about ListViews... what was wrong with "ListView"... Why do we need a "RecyclerView"... oh right... because the fucks fucked the fuck up and all the legacy components were designed by a monkey and are next to useless! SO WE NEEDED A NEW NAME FOR THE FIXED VERSION, CANT NAME IT LISTVIEW AGAIN... FUCK YOU...
honestly... if I got a dollar for every "what the fuck android" I said while trying to understand that mess, I'd be richer by a few hundred...
oh oh oh, but you know what? You don't like the android SDK? that's fine, you can use fucking React or Flutter or something... yeah.. because instead of torturing myself with the android SDK I want to torture myself with an abstraction of the same SDK and JavaScript as the fucking cherry on top... HAVE YOU FUCKING SEEN THE CODE FLUTTER SHOWS ON THEIR WEBSITE AS THE "Introduction" ?!!!
Look at this piece of shit:
[code in attached image, we could really use a proper Markdown support at least for rants]
THAT'S NOT EVEN THE ENTIRE THING, THAT'S JUST THE *REALLY* UGLY PART...
The fucking nesting... What is it with JS and all the fucking nesting every time?! It looks like shit.... It reads like shit as well...
WHY, in the name OF FUCK, ARE THERE MORE THAN 5 ANDROID FRAMEWORKS, and why do ALL of them... use this FUCKING NOVEL idea of programming using A FUCKING BRACKET WALL
It always looks like:
(code(code[code{code(code{code()})}]));
If I wanted to make a fucking app or a website using fucking Haskell I'd do that.... at this point reading assembly code feels like heaven compared to this retardation... Why is this so popular?! WHAT DO YOU PEOPLE SEE IN IT?! Clearly it's not the aesthetics... it looks like a fucking frog vomit running down an emus leg, fuck that.... I don't even hate classic JavaScript, it's a good enough language and it does what I tell it to... but these ugly fucking frameworks like react, angular and whatever else uses this fucking format can go fuck right off. This is not the way JS is gonna get a better name for itself...
So:
Fuck Google
Fuck the marionette that designed the Android SDK
Fuck the Hellspawn that came up with the "functional-like" way of using JavaScript
Fuck everyone that thinks "JavaScript everywhere" is a good thing
And deeply future-fuck everyone that makes a new framework following any of these standards, sticks a .js at the end of the name, and releases his hairball.js of an invention into the fucking world....
It's a mess... fuck everything android related... -
I’m trying to add digit separators to a few amount fields. There’s actually three tickets to do this in various places, and I’m working on the last of them.
I had a nightmare debugging session earlier where literally everything would 404 unless I navigated through the site in a very roundabout way. I never did figure out the cause, but I found a viable workaround. Basically: the house doesn’t exist if you use the front door, but it’s fine if you go through the garden gate, around the back, and crawl in through the side window. After hours of debugging I eventually discovered that if I unlocked the front door with a different key, everything was fine… but nobody else has this problem?
Whatever.
Onto the problem at hand!
I’m trying to add digit separators to some values. I found a way to navigate to the page in question (more difficult than it sounds), and … I don’t know what view is rendering the page. Or what controller. Or how it generates its text.
The URL is encrypted, so I get no clues there. (Which was the lead dev's solution to having scrapeable IDs, instead of just, you know, fixing them.) The encryption also happens in middleware, so it's a nightmare to work through. And it's by the lead dev, so the code is fucking atrocious.
The view… could be one of many, and I don’t even know where they are. Or what layout. Or what partials go into building it.
All of the text on the page comes from "resources" — think named translations that support nested macros. I don't know their names, and the bits of text I can search for are used fucking everywhere. "Confirmation number" (the most unique of them) turns up 79 matches. "Fee" showed up in 8310 places before my editor gave up looking. Really.
The table displaying the data, which is what I actually care about, isn’t built in JS or markup, but is likely a resource that goes through heavy processing. It gets generated in a controller somewhere (I don’t know the resource name so I can’t find it), and passed through several layers of “dynamic form” abstraction, eventually turned into markup, and rendered as a partial template. At least, that’s how it worked in the previous ticket. I found a resource that looks right, and there’s only the one. I found the nested macros it uses for the amount and total, and added the separators there… only to find that it doesn’t work.
Fucking dead end.
And i have absolutely nothing else to go on.
Page title? “Show”
URL? /~LiolV8N8KrIgaozEgLv93s…
Text? All from macros with unknown names. Can’t really search for it without considerable effort.
Table? Doesn’t work.
Text in the table? doesn’t turn up anything new.
Legal agreement? There are multiple, used in many places, generated dynamically via (of course) resources, and even looking through the method usages doesn't narrow it down very much.
Just.
What the fuck?
Why does this need to be so fucking complicated?
And what genius decided “$100000.00” doesn’t need separators? Right, the lot of them because separators aren’t used ANYWHERE but in code I authored. Like, really? This is fintech. You’d think they would be ubiquitous.
And the sheer amount of abstraction?
Stupid stupid stupid stupid stupid. -
Never realized, in an industry that changes by the second, how relevant and timeless a single book (set) can remain. A 52-year-old book.
The work that Knuth put into this collection, keeping it timeless and language-agnostic, sticking to theory rather than syntactical details, is amazing.
Sure there are other timeless classics out there.. the algorithm book, K&R C, the dragon books, the wizard book.
But I think this single book outweighs them all in the abstraction point of view... AND it's abstraction in the "opposite direction"... abstraction to a machine language architecture that is purely theoretical... brilliant. -
What's the point of abstraction layers if you're riddling all the code with
```
if (AbstractionLayer.someType instanceof ImplementationLayer.SomeTypeImpl1) {
    // do stuff
} else if (AbstractionLayer.someType instanceof ImplementationLayer.SomeTypeImpl2) {
    // do stuff
} else if (AbstractionLayer.someType instanceof ImplementationLayer.SomeTypeImpl3) {
    // do stuff
} else if (AbstractionLayer.someType instanceof ImplementationLayer.SomeTypeImpl4) {
    // do stuff
}
```
???
Seriously, guys. Am I missing some point here?
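The usual answer, for what it's worth: put the varying behavior behind the abstraction so the call site never touches instanceof. A sketch (all names invented):

```
class SomeType {
    doStuff() { throw new Error("not implemented"); }
}
class SomeTypeImpl1 extends SomeType {
    doStuff() { /* impl-1-specific stuff */ }
}
class SomeTypeImpl2 extends SomeType {
    doStuff() { /* impl-2-specific stuff */ }
}

// The abstraction layer never needs to know which implementation it holds:
function handle(someType) {
    someType.doStuff(); // dynamic dispatch replaces the instanceof ladder
}
```

If the abstraction layer still has to ask "which implementation is this?", the abstraction isn't abstracting anything.

-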
Hell World
So to follow up on the enterprise-grade goodness, I made a little prototype~
https://github.com/EnterpriseSoftwa...
Not very enterprise-like yet, but a fun first 'extension' to writing a proper hello world program.
Ideas
--------
*Things that might make it more business-like*
- Lots and lots of abstraction
- Tests (not very business-like, but more stuff = better)
- FFI | Shared library, because why not
- Threading / workers
Hardcore:
Design a dedicated language for writing hello world programs that is compiled / interpreted on a simulated custom hello-world CPU and displays its content on a simulated screen.
Note
--------
I want to keep the documentation & code normal / actually helpful as a contrast to the concept itself and of course to keep my sanity. -
I’m pissed.
I had previously ranted about being assigned to a very messy project. I spent 3-4 months alone adding features and CLEANING things up.
Recently, there had been talks about a new major development phase on this project. But things lingered and the day before I’m to go on vacation, I get the news that this new phase starts in 2 days. Since I’m going to be on break they’re putting other guys on the project who don’t know anything about it.
Fast forward two weeks later.
I’m back from vacation.
I find out one of the guys has strong opinions about doing things certain ways… but unfortunately they are "ways" of unnecessary complexity, abstraction and verbosity.
After just a couple of weeks I’m already lost in the complexity of his code, which supports features of VERY LOW complexity. Fuck, has he ever heard of KISS? Has anybody heard of it where I work?
Now I have to spend my mental energy trying to make sense of this pile of crap rather than actually spending it getting things done. -
1) Built an entire SoC around a MIPS CPU. Fixed bugs in the CPU. Created hardware, busses, firmware, wrote Linux drivers, ported Linux.
2) Still working on a C++ abstraction framework for heterogeneous computations for 4 years. About to solve / create a prototype for GPGPU and maybe even HDL code generation. Utilizes dynamic dispatch for scalar, SSE, AVX and other targets. I started this only because I did not like the performance of procedural noise algorithms utilized in a game prototype I started in 2015.
3) Created a game in 5 months to drag myself out of depression. Feeling success while your job sucks is soooo goooodd... -
“You have a Jira?”
“You need a Jira, first”
“Open a Jira”
“You need a Jira for that too”
“That Jira is on the wrong board, plus you need to email _this_ form first”
Jira, Jira, Jira-fucking-jira, AAAHHHHHHHHHHH, STOP!
My job appears to be nothing more than an abstraction layer around Jira. A leaky abstraction. -
From NAND to Tetris..
This book is IMO the best book for those who want to venture to the lower level programming.
This book retrains your thinking: it teaches you from the bottom up! Not the typical top-down approach.
You begin with the idea of Boolean algebra, and then move on to logic gates... from there you build, in the book's HDL, everything you will use later.
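The spirit of those first chapters, sketched in JS instead of the book's HDL: everything derives from a single NAND primitive.

```
// NAND is the only primitive; every other gate falls out of it.
const nand = (a, b) => !(a && b);
const not  = (a)    => nand(a, a);
const and  = (a, b) => not(nand(a, b));
const or   = (a, b) => nand(not(a), not(b));
const xor  = (a, b) => and(or(a, b), nand(a, b));
```

From there the book stacks adders, an ALU, registers and RAM out of the same pieces: same trick, bigger chips.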
Essentially you're building your own "virtual machine": you design the instruction set, then write assembly against that instruction set to drive the gates you built in HDL.
THEN you continue up the abstraction layers, learn how a compiler works, and begin writing high-level code that is compiled down to the assembly of your instruction set, to be linked and run on the virtual machine you built.
All the compiler and other tools are available on the book's website. The book is not a book where you copy and paste, run, and are done... you kinda have to take the concepts and apply them yourself.
Then once you master this book, take it the extra step and learn more about compilers and write your own compiler with the dragon book or something.
Fantastic book, great philosophy on teaching software... ground up rather than top down. Love it! It's a unique book. -
I AM TIRED
warning: this rant is going to be full of negativity , CAPS, and cursing.
People always think and they always write that programming is an analytical profession. IF YOU CANNOT THINK IN AN ANALYTICAL WAY THIS JOB IS NOT FOR YOU! But the reality could not be farther from the truth.
A LOT of people in this field, whether they're technical people or otherwise, just lack any kind of reasoning or "ANALYTICAL" thinking skills. If anything, a lot of them are delusional and/or they just care about looking COOL. "Because programming is like getting paid to solve puzzles" *insert stupid retarded laugh here*.
A lot of devs out there just read a book or two and read a Medium article by another wannabe, now think they're hot shit. They know what they're doing. They're the gods of "clean" and "modular" design and all companies should be in AWE of their skills paralleled only by those of deities!
Everyone out there and their Neanderthal ancestor, from start-up founders to developers, think they're the next Google/Amazon/Facebook/*insert fancy shitty tech company*.
Founder? THEY WANT TO MOVE FAST AND GET TO MARKET FAST WITH STUPID DEADLINES! Even if it's not necessary. Why? BECAUSE YOU, INFERIOR DEVELOPER, HAVE NOT READ THE STUPID HOT PILE OF GARBAGE I READ ONLINE BY THE PEOPLE I BLINDLY COPY! "IF YOU'RE NOT EMBARRASSED BY THE FIRST VERSION OF YOUR APP, YOU DID SOMETHING WRONG" - someone at Amazon.
Well you delusional brainless piece of stupidity, YOU ARE NOT AMAZON. THE FIRST VERSION THAT THIS AMAZON FOUNDER IS EMBARRASSED ABOUT IS WHAT YOU JERK OFF TO AT NIGHT! IT IS WHAT YOU DREAM ABOUT HAVING!
And oh let's not forget the tech stacks that make absolutely no fucking sense and are just a pile of glue and abstraction levels on top of abstraction levels that are being used everywhere. Why? BECAUSE GOOGLE DOES IT THAT WAY DUH!! And when Google (or any other fancy shit company) changes it, the old shitty tech stack that by some miracle you got to work and everyone is writing in, is now all of a sudden OBSOLETE! IT IS OLD. NO ONE IS WRITING SHIT IN THAT ANYMORE!
And oh my god do I get PTSD every time I hear a stupid fucker saying shit like "clean architecture", "clean shit", "best practice". Because I have yet to see someone, whose sentences HAVE TO HAVE one of these words in them, actually write anything decent. They say this shit because of some garbage article they read online, and in reality when you look at their code it is a hot heap of horseshit after eating something rancid. NOTHING IS CLEAN ABOUT IT. NOTHING IS DONE RIGHT. AND OH GOD IF THAT PERSON IS YOUR TECH MANAGER AND YOU HAVE TO LISTEN TO THEM RUNNING THEIR SHITHOLE ABOUT HOW YOUR SIMPLE CODE IS "NOT CLEAN". And when you think there might be a valid reason why they're doing things that way, you get the answer of someone in an interview who's been asked about something they don't know, trying to BS their way to sounding smart and knowledgeable. 0 logic, 0 reason, 0 brain.
Let me give you a couple of examples from my unfortunate encounters in the land of the delusional.
I was working at this startup which is fairly successful, and there was this guy responsible for developing the front-end of their website using ReactJS, and they're using Redux (WHOSE SOLE PURPOSE IS TO ELIMINATE PASSING ATTRIBUTES FOR THE PURPOSE OF PASSING THEM DOWN THE COMPONENT HIERARCHY AGAIN). This guy kept ranting about their quality and their shit every single time we had a conversation about the code while I was getting to know everything. Also keep in mind he was the one who decided to use Redux. Lo and behold, there was this component which has THIRTY MOTHERFUCKING SEVEN PROPERTIES WHOSE SOLE PURPOSE IS TO BE PASSED DOWN AGAIN LIKE 3 TO 4 TIMES!
This stupid shit kept telling me to write code in a "functional" style. AND ALL HE KNOWS ABOUT FUNCTIONAL PROGRAMMING IS USING MAP, FILTER, REDUCE! And says shit like "WE DONT NEED UNIT TESTS BECAUSE FUNCTIONAL PROGRAMMING HAS NO ERRORS!" Later on I found that he read a book about functional programming in JS and now he fucking thinks he knows what functional programming is! Oh I forgot to mention that the body of his "maps" is like 70 fucking lines of code!
Another fin-tech company I worked at had a quote from Machiavelli's The Prince on EACH FUCKING DESK:
"There is nothing more difficult to take in hand, more perilous to conduct, or more uncertain in its success, than to take the lead in the introduction of a new order of things."
MOTHERFUCKER! NEW ORDER OF THINGS? THERE ARE 10 OTHER COMPANIES DOING THE SAME SHIT ALREADY!
And the one that got on my nerves as a space lover is a quote from Kennedy's speech about going to the moon in the 60s: "We choose to go to the moon and do the hard things ..."
YOU FUCKING DELUSIONAL CUNT! YOU THINK BUILDING YOUR SHITTY COPY-PASTED START-UP IS COMPARABLE TO GOING TO THE MOON IN THE 60S?
I am just tired of all those fuckers. -
Not ONLY does the new code a coworker wrote straight up not work (and they somehow managed to merge it to master), but it also broke an entirely unrelated endpoint due to an abstraction they tried to make. Very clear they didn't even run their code at all.
-
So today I saw another 'OOP should die' article.
And I decided I should google around a bit to find out why.
Reasons I found:
- Things get too complicated
- Things get too abstract (same as the above really)
But when I search for alternatives, only functional programming and different ways to use OOP get mentioned.
I still don't get why OOP is supposedly bad though.
Maybe my 20-30k LOC projects aren't big enough to see it?
For me the abstraction works very well. The abstraction is used to keep the complexity low(er).
And the different ways of using OOP are a plus-point for me. (Like the Entity-Component system)
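A minimal Entity-Component sketch for anyone who hasn't met the pattern; all names here are invented for illustration:

```
// Entities are just ids; components are plain data keyed by entity id.
let nextId = 0;
const positions  = new Map();
const velocities = new Map();

function createEntity() { return nextId++; }

// A "system" is a function over every entity that has the right components.
function movementSystem(dt) {
    for (const [id, vel] of velocities) {
        const pos = positions.get(id);
        if (!pos) continue;
        pos.x += vel.x * dt;
        pos.y += vel.y * dt;
    }
}

const player = createEntity();
positions.set(player, { x: 0, y: 0 });
velocities.set(player, { x: 1, y: 0 });
movementSystem(1 / 60); // advance one frame
```

Composition taken literally: behavior comes from which components an entity has, not from its class hierarchy.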
I don't know enough about functional programming to be able to say it's better or worse, but the ideas behind it are perfectly usable in languages like C#.
So if any of you have a good concrete reason to not use OOP, please feel welcome to tell me in the comments :)
-
Honestly I see more and more abstraction layers added, and from year to year fewer people understanding how a computer and basic algorithms work. In the end it will be like the Mechanicum (or whatever it was called) from the W40K universe, where the specialists haven't a damn clue how their creations (in our case, software) work and how to optimize them anymore
-
Learning Go. How did I not learn it before? It has the efficiency of a low level language with the beautiful syntax and abstraction of a high level one.
-
Linus Torvalds on C++
“C++ leads to really really bad design choices. You invariably start using
the nice library features of the language like STL and Boost and other total and utter crap, that may help you program, but causes:
- infinite amounts of pain when they don't work (and anybody who tells me
that STL and especially Boost are stable and portable is just so full of BS that it's not even funny)
- inefficient abstracted programming models where two years down the road you notice that some abstraction wasn't very efficient, but now all your code depends on all the nice object models around it, and you cannot fix it without rewriting your app.”
http://harmful.cat-v.org/software/...
-
Me: "How many MB of files do you want to send me via mail? Maybe it would be better to upload everything..."
Client: "Oh don't worry. It won't be too much. It's just five folders."
-
dates are just an index of time
practicing is just offsetting your initial, natural ability in a positive direction
do you guys ever just think of things in an abstract sense?
what are other examples?
-
i honestly hate the ap computer science principles curriculum. we're taking an ap test soon, so for the past few weeks, we've been constantly taking practice tests.
it pisses me off so much. the questions, the criteria, it's all bs.
we have questions like "what will reduce the digital divide?" with choices like "education for low income families on computers." like, I DONT FUCKING KNOW.
frankly, I DONT FUCKING CARE. giving electronics to people who cant afford it is great and all, BUT IT DOESNT INVOLVE ANYTHING ABOUT COMPUTING.
HEY, COLLEGE BOARD, KNOWING IF AN ALGORITHM IS TECHNICALLY AN "ABSTRACTION" DOESNT FUCKING MATTER. WHAT MATTERS IS THAT I CAN IDENTIFY WHATS MORE EFFICIENT, WHERE A BUG IS, CONCEPTS INVOLVED IN PROGRAMS, THINGS LIKE THAT.
NOT IF DNS IS SIMILAR IN STRUCTURE TO THE US POSTAL SYSTEM.
god i hope whoever wrote this gets hit in the head by a github server that was dropped from the 2^8th floor.
-
'Hey I found a bug in your code, it's probably a typo, see here.'
Me: Oh right, yeah. How stupid of me. Thanks, I'll push it.
'It's okay. You can push it or I can do it too after you push the changes we just discussed. I actually simplified one of your methods.'
Me: You, what... ?
(You crammed multiple lines in a single line with your stupid as fuck, rigid constructs, removing my error handling, loosely coupled service, in the name of simplification?)
'Yeah, it's just four lines in a single function now, no need to call the function again and again.'
Me: (No... Just no. This totally undos whatever little I could do to avoid supporting your idiotic object in the first place.)
Oh... okay, we'll see. I'll let you know.
What life.
Life in a company full of ignorant, inflated egos is no joke.
Details:
I created a service that reads a configuration file and returns the configuration. This person needs five entries for his app logic. He collected them in an object. Quite alright, except that the class prototype is shitty. I, like a normal person, made my service return a value based on input. I was asked to incorporate this awful object so that I can return the five entries together, which is awful because the service is not supposed to know how the entries are grouped. It should most certainly not know about the data members of the object!
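A sketch of the boundary I'm arguing for (hypothetical names): the service answers lookups by key; grouping five entries into his object is the caller's business:

```
const entries = new Map<string, string>([["host", "localhost"], ["port", "8080"]]);

// The service answers lookups by key and knows nothing else.
function getConfig(key: string): string | undefined {
  return entries.get(key);
}

// Grouping values into an app-specific object is the caller's concern.
const appSettings = {
  host: getConfig("host"),
  port: getConfig("port"),
};

console.log(appSettings);
```
-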
Oh, $work.
Ticket: Support <shiny new feature> in <seriously dated code> to allow better “searching” (actually: generating reports, not searching)
UI: “Filter on” inputs above a dynamic JS table don’t update said table; they trigger generating a new report.
Seriously dated code: 12 years old. Rails v3-isms. Blocks access without appropriate role; role name buried in secrets configuration files. Code passes data round-trip between server/client/server/model that isn’t ever used. Has two identical reports with slightly different names, used interchangeably. Uh, I guess I’ll update both?
Reports: Heavily, heavily abstracted; zero visibility.
Shiny new feature: Some new magical abstraction layer with no documentation nor comments. Nobody in my team knows how it works. The author… won’t explain, but sent me her .ppt presentation on it (the .ppt, not a recording).
Useless specs for seriously dated code: Tests exclusively factory-generated data; not the controller, filters/lookups, UI, table data, etc.
Seriously dated code and useless spec author: the CISO.
The worst part: I'm not even surprised at any of this.
-
Central team: No, your team must be doing something wrong. Our pipeline is super-configurable and works for any situation! You just have to read the docs!
Me: Where are the docs?
Central team: Uhh, well, umm... we'll hook you up with a CI/CD coach!
Me: Okay, cool. In the mean time, can you point me at the repo where all the base scripts are?
Central team: Sure, it's here.
Me, some weeks later: Yeah, uhh, the coach can't seem to figure out how to make our Prod deployment work either.
Central team: That's impossible! It's so easy and completely configurable!
Me: Well, okay... but, here's the thing: your pipeline IS pretty "configurable", in the sense that you look for A LOT of variables...
Central team: See! We told you!
Me: ...none of which are actually documented, so they're just about useless to me...
Central team: But, but the coach...
Me: ...couldn't make heads or tails of it either, despite him literally being ON YOUR TEAM...
Central team: Then your project must just be architected wrong!
Me: Well, we're not perfect, so could be...
Central team: Right!
Me: ...but I think it's far more likely that the scripts... you know, the ACTUAL Python scripts the pipeline executes... while it took me DAYS to get through all your levels of abstraction and indirection and, well, BULLSHIT... it turns out they are incredibly NOT flexible. They do one thing, all the time, basically disregarding any flexibility in the pipeline. So, yeah, I'm thinking this is probably one of those "it's you, not me" deals.
Central team: Waaaaahhhhhhhh!!!!!!
-
Oh gosh... This week a "friend" of mine has a job interview at the company I'm working for. This guy really just can't code. He has no understanding of clean code, abstraction, etc. He just knows the basics. But he loves to brag about how good he is and about his bachelor's degree. Damn, I hate this guy and I hope HR won't hire him.
-
When I was in college OOP was emerging. A lot of the professors were against teaching it as the core. Some younger professors were adamant about it, and also Java fanatics. So after the bell rang, they'd sometimes teach people that wanted to learn it. I stayed after and the professor said that object oriented programming treated things like reality.
My first thought was: hold up, modeling reality is hard and complicated; why would you want to add that to your programming? That's utter madness.
Then he started with a ball example and how some balls in reality are blue, and they can have a bounce action we can express with a method.
My first thought was that this seemed a very niche example. It had very little to do with any problem I had solved so far, and I felt thinking about it this way would complicate my programs rather than make them simpler.
I looked around at the remnants of my classmates and saw several sitting forward, their eyes lit up, and I felt like I was in a cult meeting where the leader is trying to make everyone enamored of their personality. Except he wasn't selling himself, he was selling an idea.
I patiently waited it out, wanting there to be something of value in the after the bell lesson. Something I could use to better my own programming ability. It never came.
This same professor would tell us all to buy and read Gang of Four; it would change our lives. It was an expensive hardcover book with a ribbon attached for a bookmark. It was made to look important. I didn't have much money in college but I gave it a shot and bought the book. I remember wrinkling my nose often while reading it, feeling like I was still being sold something. But where was the proof? It was all an argument from authority and I didn't think the argument was very good.
I left college thinking the whole thing was silly and would surely go away with time. And then it grew, and grew. It started to be impossible to avoid it. So I'd just use it when I had to and that became more and more often.
I began to doubt myself. Perhaps I was wrong; surely all these people using and loving this paradigm could not be wrong. Later in my career I took on a 3 year project to dive deep into OOP. I was already intimately aware of OOP, having done so much of it, but I caught up on all the latest ideas and practiced them for the first year. I thought if OOP is so good, I should be able to be more productive in years 2 and 3.
It was the most miserable I had ever been as a programmer. Everything took forever to do. There was boilerplate code everywhere. You didn't so much solve problems as stuff abstract ideas that had nothing to do with the problem everywhere, and THEN write the actual part of the code that does the task. Even though I was working with an interpreted language, they had added a need to compile, for dependency injection. What's next, taking the benefit of dynamic typing and forcing typing into it? Oh, I see they managed to do that too. At this point why not just use C or C++? It's going to do everything you wanted, if you add compiling and typing, and do it way faster at run time.
I talked to the client extensively about everything. We both agreed the project was untenable. We moved everything over another 3 years. His business is doing better than ever before now by several metrics. And I can be productive again. My self doubt was over. OOP is a complicated mess that drags down the software industry, little better than snake oil and full of empty promises. Unfortunately it is all some people know.
Now there is a functional movement and a data-oriented movement, and things are looking a little brighter. However, no one seems to care for procedural. Functional and procedural are not that different; functional just tries to put more constraints on the developer. Data-oriented is also a lot more sensible, and again pretty close to procedural a lot of the time. It's just odd to me, this need to separate from procedural at all. Procedural was very honest. If you're a bad programmer you make bad code. If you're a good programmer you make good code. It seems a lot of this was meant to force bad programmers to make good code. I'll tell you what I think though: that has never worked. It's just hidden the bad code away in some abstraction and made identifying it harder. Much like the code methodologies themselves do to the code.
Now I'm left with a choice: keep my own business going to work on what I love, shift gears and do what I hate for more money, or pivot careers entirely. After all this I decided to go into data science, because what you all are doing to the software industry sickens me. And that's my story. It's one that makes a lot of people defensive or even passive aggressive. To those people I say: try more things. At least then you can be less defensive about your opinion.
-
Talking to a second year student about what they've learnt so far, and what they should learn next:
"Cool, so what general topics would you say you know really thoroughly at the moment?"
"Oh, I've now learnt Java, C#, C, C++, Rust, Javascript, node.js, HTML, CSS, Angular, Vue, Erlang and probably a bunch of other stuff I've forgotten. What do you think I should concentrate on next?"
"Hmm. Probably best to take just one of those and learn it really thoroughly."
"...but I already know them all really thoroughly."
"Ok. Can you explain what an abstract class is in say Java, C# or C++?"
"Sure, I can create a new class called abstract and then use it for abstraction. I do that loads."
...🤷♂️🤦♂️
First lesson: Stop BS'ing. It might work for flexing on non-devs, but that's about it.
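(For the record, the answer I was fishing for, sketched in TypeScript since abstract classes look much the same in Java and C#:)

```
// An abstract class can't be instantiated; it defines a contract plus shared code.
abstract class Shape {
  abstract area(): number;          // subclasses must implement this
  describe(): string {              // shared behaviour lives here
    return `area = ${this.area()}`;
  }
}

class Circle extends Shape {
  constructor(private r: number) { super(); }
  area(): number { return Math.PI * this.r * this.r; }
}

console.log(new Circle(2).describe());
```
-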
So it's been awhile since I switched from PHP to Golang.
At first I missed PHP a lot, but golang really has some advantages, so it's fine.
But over time, as the project grows in features, I start to miss at least basic classes / inheritance and abstraction-friendly stuff from php more and more.
I'm finally reaching the point where I truly miss php; after writing a feature, I can't stop myself from thinking how much less work it would have been in php.
Call me crazy, but damn, it's real.
-
This will definitely trigger many, but the truth, regardless of how you feel, is that the greatest programmers are those who understand both the hardware level and the software.. only then are you more than a dev or programmer.. you are an engineer...
I challenge the devs who disbelieve to go out and learn to build circuits, write optimized, efficient bare metal code: no sdk.. no api... no drivers.. remove the unneeded abstraction layers that have blinded you... build it yourself, expand your potential and understanding..
Not only will you become more valuable overall, but you will write better code as you become more conscious of performance, space and the physics of the physical layer.
I'm not talking about Arduino or raspie
Those who stand strong that high level abstraction languages and third party apis are a sufficient, sustainable platform of development are blind to reality.. the more people who only know those levels, the fewer people pushing the low level side of the industry.., which is the foundation of everything.. without that low level software, the high level abstractions and systems cannot run
Why did we have huge technology advancements from the 70s to the early 2000s.... because more people in our industry understood the hardware layer and wrote the software at the less abstracted layers..
Yeah it takes longer to do things at that low level of abstraction.. but good robust products that change the world and the industry don't take a few weeks or months to build.....
Take this with what you will... I'm just trying to open the eyes of the blind developers to the true nature and reality of our industry
-
Everyone's saying "oh my, I'm so ashamed of my code I wrote 4 days ago, it's so horrible"
Well... At least you can relate to someone. When I look at my project's code I wrote half a year ago (or sometime before that) I'm genuinely surprised to see I'm not browsing some library's codebase - the abstraction layers, the generics, the structure... it's brilliant! It's as SOLID as it gets.
-
"What are the four pillars of OOP?"
Me: (I'm not an OOP guy; I'm focused on design patterns)
1. Encapsulation
2. Abstraction
3. Polymorphism
4. ??(was it inheritance or composition).
Fuck. Because of the phrase "composition over inheritance", I've been mixing up composition and inheritance this whole time.
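(So for next time: the classic fourth pillar is inheritance; composition is the design-pattern advice. A quick sketch of the difference, with made-up types:)

```
// Inheritance: Car *is a* Vehicle.
class Vehicle { move() { console.log("moving"); } }
class Car extends Vehicle {}

// Composition: Car *has an* Engine; this is the "composition over inheritance" advice.
class Engine { start() { console.log("vroom"); } }
class ComposedCar {
  constructor(private engine: Engine = new Engine()) {}
  drive() { this.engine.start(); }
}

new Car().move();
new ComposedCar().drive();
```
-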
Sprint 0: This design is the appropriate amount of engineering abstraction.
Sprint 2: This is over-engineered, too much work
Sprint 5: This is under-engineered, too many edge cases
Sprint 10: This is over-engineered, component Foo could be replaced by a bash script
Sprint 42: Foo is now the cornerstone of half our business logic
-
When each layer of abstraction is peeled off from a program and I understand it down to some level.
It always gives me goosebumps thinking about how much each generation of humanity is contributing to our advancement.
-
So we had this legacy Objective-C codebase for a mobile app that was actually pretty good: I'd inherited the codebase and spent the past several years gradually improving it and I was actually quite proud of the work I put into it. So of course management decides to scrap it (with NO consultation from the engineers) and outsource a complete rewrite of the app in C# for Windows Universal.
Let me tell you. That code was without a doubt and without exaggeration the *worst* code I've seen in my close to 30 years of experience as a developer. I mean they broke every rule in the book, I'm talking rookie mistakes. Copypasta everywhere, no consistent separation of concerns, and yet way too many layers. Unnecessary layers. Layers for the sake of layers. There was an entire abstraction layer complete with a replicated version of every single data class *just* to map properties in pascal case to the same property in camel case. Adding a new field to a payload in the API amounted to hours of work and about eight different files that needed to be modified. It was a complete nightmare. This was supposed to be a thin client, yet it had a complete client-side Sqlite database with its own custom schema (oh and of course a layer for that!) completely unrelated to the serverside schema, just for kicks. The project was broken up into about eight or nine different subprojects, each having their own specific dependencies on various other subprojects in such a tightly-knit way that it made gradual refactoring almost impossible. This architecture was so impressively bad, it was actually self-preserving!
Suffice it to say it was a complete nightmare, and was one of the main reasons I ended up leaving that company. So just sayin', legacy code isn't always bad. :) -
I gotta say, I actually admire the work that content creators must go thru to make quality content.
So as I stated before, I'm working on a YouTube channel under the name "TheSoftwareSage"... to create tutorials and to teach software the way I believe it should be taught, not the way the mainstream methods of today do.
Bottom up approach rather than top down
(Must start with a firm understanding of the foundation.. and build upon the knowledge as we go thru the layers of abstraction but the key concepts must be understood first)
Anyway, I’m working on this in my spare time and I was not aware of how much effort I would actually need todo this right haha. At first I figured I’d just screencast a monitor and have a ppt or text editor or terminal open and that stuff and just do it.
In person with my interns I never have "planned" lessons or content; it's all impromptu, based on the need at the time, and I just go with it, with their computers and a whiteboard lol.
I was wrong about video recording lol... maybe it's OCD... or perfectionism. I'll make a video, review it like 5 times, and then be like shit, I forgot to mention this or that, or I didn't like how I explained this or that,
OR
I keep worrying too much about colors, and sound levels and quality and transitions and video angles and all this other shit.
And then post editing fuck.... I’m about ready to say fuck it and “do it live .. one shot” and just upload the end result.
I guess this would be in the content world similar to our "paralysis analysis" notion.
-
aaaaaghh fucking Handlers man. Android is so fucking full of shit, i wonder why i'm still doing it. love is pain.
Why can't there be one mother fucking solution to all lazy ass asynchronous programming? handlers, threadpools, asynctask, executors, Broadcasts, intentService, coroutines, rxjava,.... i don't know what new stuff people are snorting these days.
Ok, leave everything. A handler is a class... no sorry, Handler, alongside some fucking Looper class (and maybe some more classes i don't know about), is a way of handling inter-thread communication. Handlers can:
-send data to ui thread
-recieve data from ui thread
-send "messages" to ui thread
-recieve "messages" from ui thread.
- can be attached to ui thread
- can be attached to any child thread
- can be accessed anonymously via any view
- can be present in multiple places, working together
- can kill night king with a dagger
- can do porn better than johnny sins
- can run for president of the whole fucking world
- do some more shits that i have yet to discover
And where do i find this? buried deep insides some medium articles or in some guy's horrible accent video.
Is background processing really this much of a tough nut to crack?
earlier i was all about using asynctask or foreground/background services, because these are the easiest to understand abstractions of a fairly difficult topic.
But as i see more projects, i see underlying apis like handlers, threadpools, executors being used directly.
Why can't there be one fucking single abstraction that could be "lightly tweaked" to handle every ugly case?
-
Recently, I had to make a minor modification to some Node.js code a coworker wrote a year ago, which buffers stringified JSON into Kinesis. I just needed to add a new key to the input object. It took minutes to make the change, but hours to make sense of the absolute trash spaghetti code this guy wrote. After spending half a day trying to make his code readable, I just got so pissed off that I replaced his 15 files / 1,500+ lines of uncommented code, filled with classes, factory functions, poorly named functions and vars, and so, so many spelling mistakes.
We now have a single, well commented, 300 line file that does the same thing.
Get that shit code out of here. -
React is an overengineered pile of shit designed to let pretentious developers show off their golden arse holes with useless implementations of worthless business cases where everything and anything is an abstraction of some silly theory.
-
Eventually you reach a point at which you see that even java's baked-in libs lack abstraction and more layers.
-
SO MAD. Hands are shaking after dealing with this awful API for too long. I just sent this to a contact at JP Morgan Chase.
-------------------
Hello [X],
1. I'm having absolutely no luck logging in to this account to check the Order Abstraction service settings. I was able to log in once earlier this morning, but ever since I've received this frustratingly vague "We are currently unable to complete your request" error message (attached). I even switched IP's via a VPN, and was able to get as far as entering the below Identification Code until I got the same message. Has this account been blocked? Password incorrect? What's the issue?
2. I've been researching the Order Abstraction API for hours as well, attempting to defuddle this gem of an API call response:
error=1&message=Authentication+failure....processing+stopped
NOWHERE in the documentation (last updated 14 months ago) is there any reference to this^^ error or any sort of standardized error-handling description whatsoever - unless you count the detailed error codes outlined for the Hosted Payment responses, which this Order Abstraction service completely ignores. Finally, the HTTP response status code from the Abstraction API is "200 OK", signaling that everything is fine and dandy, which is incorrect. The error message indicates there should be a 400-level status code response, such as 401 Unauthorized, 403 Forbidden or at least 400 Bad Request.
Frankly, I am extremely frustrated and tired of working with poorly documented, poorly designed and poorly maintained developer services which fail to follow basic methodology standardized decades ago. Error messages should be clear and descriptive, including HTTP status codes and a parseable response - preferably JSON or XML.
-----
This whole API is a piece of garbage. If you're big enough to own a bank, you're big enough to provide useful error messages to the developers kind enough to attempt to work with you.
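For reference, what I'm asking for is not exotic; a hedged sketch of the kind of handling a sane API makes possible (the endpoint and fields here are hypothetical):

```
// With a real status code and a parseable body, error handling is trivial.
async function createOrder(payload: object) {
  const res = await fetch("https://api.example.com/orders", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  if (!res.ok) {
    // 400/401/403 plus a machine-readable reason, not "200 OK" + "error=1"
    const err = await res.json();
    throw new Error(`${res.status}: ${err.message}`);
  }
  return res.json();
}
```
-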
Debugging TLS failures.
In Java.
With the funny certstore cause "we need to do this by ourselves".
Fucking shitty broken pile of cunt code.
At least the debugging output is good.
As much as I love TLS, debugging it is a nightmare, and when a programming language like Java decides to wrap it, it becomes Cthulhu.
OS
- TLS Library
-- TLS Certificate Chain
- JDK
-- JDK SSL Handler
--- JDK Certstore
---- Java Library Abstraction, eg. WS SSL
Joyful fingering of a tentacle arsehole.
-
So, I've had a personal project going for a couple of years now. It's one of those "I think this could be the billion-dollar idea" things. But I suffer from the typical "it's not PERFECT, so let's start again!" mentality, and the "hmm, I'm not sure I like that technology choice, so let's start again!" mentality.
Or, at least, I DID until 3-4 months ago.
I made the decision that I was going to charge ahead with it even if I started having second thoughts along the way. But, at the same time, I made the decision that I was going to rely on as little external technology as possible. Simplicity was going to be the key guiding light and if I couldn't truly justify bringing a given technology into the mix, it'd stay out.
That means that when I built the front end, I would go with plain HTML/CSS/JS... you know, just like I did 20+ years ago... and when I built the back end, I'd minimize the libraries I used as much as possible (though I allowed myself a bit more flexibility on the back end because that seems to be where there are fewer issues generally). Simplicity was going to be the key guiding light and if I couldn't truly justify bringing a given technology into the mix, it'd stay out.
So, given this is a webapp with a Node back-end, I had some decisions to make.
On the back end, I decided to go with Express. Previously, I had written all the server code myself from "first principles", so I effectively built my own version of Express in other words. And you know what? It worked fine! It wasn't particularly hard, the code wasn't especially bad, and it worked. So, I considered re-using that code from the previous iteration, but I ultimately decided that Express brings enough value - more specifically all the middleware available for it - to justify going with it. I also stuck with NeDB for my data storage needs since that was aces all along (though I did switch to nedb-promises instead of writing my own async/await wrapper around it as I had previously done).
What I DIDN'T do though is go with TypeScript. In previous versions, I had. And, hey, it worked fine. TS of course brings some value, but having to have a compile step in it goes against my "as little additional tooling as possible" mantra, and the value it brings I find to be dubious when there's just one developer. As it stands, my "tooling" amounts to a few very simple JS scripts run with NPM. It's very simple, and that was my big goal: simplicity.
On the front end, I of course had to choose a framework first. React is fine, Angular is horrid, Vue, Svelte, others are okay. But I didn't want to bother with any of that because I dislike the level of abstraction they bring. But I also didn't want to be building my own widget library. I've done that before and it takes a lot of time and effort to do it well. So, after looking at many different options, I settled on Webix. I'm a fan of that library because it has a JS-centric approach. There's no JSX-like intermediate format, no build step involved, it's just straight, simple JS, and it's powerful and looks pretty good. Perfect for my needs. For one specific capability I did allow myself to bring in AnimeJS and ThreeJS. That's it though, no other dependencies (well, at first, I was using Axios because it was comfortable, but I've since migrated to plain old fetch). And no Webpack, no bundling at all, in fact. I dynamically load resources, which effectively is code-splitting, and I have some NPM scripts to do minification for a production build, but otherwise the code that runs in the browser is what I actually wrote, unlike using a framework.
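The dynamic loading bit, as a minimal sketch (the names are made up; native dynamic import is the whole mechanism, no bundler involved):

```
// Load a view module only when it's first needed; nothing else is fetched.
async function showView(name: string) {
  const view = await import(`./views/${name}.js`); // native dynamic import
  view.render(document.getElementById("app")!);
}

showView("dashboard");
```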
So, what's the point of this whole rant?
The point is that I've made more progress in these last few months than I did the previous several years, and the experience has been SO much better!
All the tools and dependencies we tend to use these days, by and large, I think get in the way. Oh, to be sure, they have their own benefits, I'm not denying that... but I'm not at all convinced those benefits outweigh the time lost configuring this tool or that, fixing breakages caused by dependency updates, dealing with obtuse errors spit out by code I didn't write, going from the code in the browser to the actual source code to get anywhere when debugging, parsing crappy documentation, and just generally having the project be so much more complex and difficult to reason about. It's cognitive overload.
I've been doing this professionally for a LONG time, I've seen so many fads come and go. The one thing I think we've lost along the way is the idea that simplicity leads to the best outcomes, and simplicity doesn't automatically mean you write less code, doesn't mean you cede responsibility for various things to third parties. Those things aren't automatically bad, but they CAN be, and I think more than we realize. We get wrapped up in "what everyone else is doing", we don't stop to question the "best practices", we just blindly follow.
I'm done with that, and my project is better for it! -
Hey Lemonade is looking for 10x engineers! Please apply only if you write code that has at least 7 layers of abstraction. Thanks!
🤣
https://makers.lemonade.com/recipe/...
-
# source_code.py
crawler.do_abstracted_operation_on_a_count_variable_in_crawler_for_when_new_page_is_added_and_ready_to_be_parsed_i_love_abstraction()
# crawler.py
def do_abstracted_operation_on_a_count_variable_in_crawler_for_when_new_page_is_added_and_ready_to_be_parsed_i_love_abstraction(self):
    self.count += 1
-
Since I'm back to working for myself and haven't been able to find a reliable hire, I'm alone. In this bubble, no one cares about / sees / appreciates my backend code, and I just realized that's why I've been slacking so badly on this ETL process. No one gives a shit about it but me. If I build an interface, I get kudos and everyone celebrates; but when I work on a three-server process with layers of abstraction, auto-scaling, etc., people just wonder if I'm jerking off all day.
Sometimes it sucks to be a lone ranger.
-
Wondering how many people use git cmd and how many use different git clients.
I regularly use the git command line. I made the transition from clients a while back because I wanted to learn more about how it actually works, and it works just fine for me, except when I have to google something I don't remember (like how to undo a local commit: git reset --soft HEAD~1).
Git clients will for sure do abstraction which can be both good and bad, but I'm wondering if there are any definitive pros for clients.
-
For all the hate that Java gets, this *not rant* is to appreciate Spring Boot/Cloud & Netty, for without them I would not be half as productive as I am at my job.
Just to highlight a few of these life savers:
- Spring security: many features but I will just mention robust authorization out of the box
- Netflix Feign & Hystrix: easy circuit breaking & fallback pattern.
- Spring Data: consistent data access patterns & out of the box functionality regardless of the data source: eg relational & document dbs, redis etc with managed offerings integrations as well. The abstraction here is something to marvel at.
- Spring Boot Actuator: Out of the box health checks that check all integrations: Db, Redis, Mail,Disk, RabbitMQ etc which are crucial for Kubernetes readiness/liveness health checks.
- Spring Cloud Stream: Another abstraction for the messaging layer that decouples application logic from the binder ie could be kafka, rabbitmq etc
- SpringFox Swagger - Fantastic swagger documentation integration that allows always up to date API docs via annotations that can be converted to a swagger.yml if need be.
- Last but not least - Netty: Implementing secure non-blocking network applications is not trivial. This framework has made it easier for us to implement a protocol server on top of UDP using Java & all the support that comes with Spring.
For these & many more am grateful for Java & the big big community of devs that love & support it. -
I wanna go back to the age where a C program was considered secure and isolated based on its system interface rather than its speed. I want a future where safety does not imply inefficiency. I hate Spectre, and I hate that an abstraction as simple and robust as assembly is so leaky that just by exposing it you've pretty much forfeited all your secrets.
And I especially hate that we chose to solve this by locking down everything rather than inventing an abstraction that's a similarly good compile target but better represents CPUs and therefore does not leak.
-
Non tech hobbies that helped me with developement:
Lego Technic / Meccano / K'nex were a great way to learn about abstraction: you build modules that you can reuse somewhere else.
Cooking is similar; you notice useful patterns that you can reproduce. E.g. roux, which is butter and flour, is the base for a lot of sauces; add milk and you get béchamel, which is again the base for a lot of sauces.
Coffee brewing helps because I can't focus if I don't get coffee.
-
My biggest influence on coding style is working with other people's code. I know the temptation to write "clever" code, and I've been (and probably still occasionally am) guilty of it myself, but it's not until you have to debug someone's one-liner iterator with !(i-j) as the stop condition that you start to appreciate dumb, boring, obvious code.
If having a series of if checks in a long list makes it readable, keep it that way. If it makes it more readable to rewrite it into a nested switch-case with a couple of ternary bits, go ahead. Just don't spend half a day wrapping it up into two layers of abstraction that will require an onboarding process for the rest of the team.
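A hedged illustration of the kind of thing I mean (not the actual code):

```
const j = 5;

// "Clever": the loop runs while i - j is truthy; you have to stop and decode it.
for (let i = 0; i - j; i++) console.log(i);

// Dumb, boring, obvious, and it says exactly what it means.
for (let i = 0; i !== j; i++) console.log(i);
```
-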
I discovered Eloquent ORM last week.
A php database abstraction layer.
What a way to make things quickly! Great tool!
-
A quick rant about dependency injection.
I see far too often in projects, a huge over-reliance on dependency injection / IOC frameworks which permeate throughout the entire codebase.
I cringe every time I see a constructor annotated with @Inject and 10 params.
The benefit of these frameworks is how easy they make it to manage many dependencies. What I dislike about them, is exactly that. I feel that they make it TOO easy to manage many dependencies.
How trivial is it to simply add another constructor param? Exactly. And people then wonder why their dependency tree looks insane.
I am a strong believer in injecting dependencies the traditional way, via the constructor with no fancy framework. The reason being that it forces you to think more about the dependencies you are adding to your classes, and consider if they are really all needed.
The other problem I have with it is that it basically encourages you to inject everything, because it's so easy. The purpose of dependency injection is inversion of control, allowing classes to depend on abstractions rather than concrete implementations. All that goes out the window when you @Inject 6 different concrete classes.
Use dependency injection for its intended purpose, not as an excuse to be lazy and avoid thinking about dependencies.
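A minimal sketch of what I mean by the traditional way (hypothetical names; the point is that every dependency is explicit and hurts a little to add):

```
interface Mailer { send(to: string, body: string): void; }

class SmtpMailer implements Mailer {
  send(to: string, body: string) { console.log(`mail to ${to}: ${body}`); }
}

// Depends on the abstraction, wired by hand: no container, no @Inject.
class SignupService {
  constructor(private mailer: Mailer) {}
  register(email: string) { this.mailer.send(email, "welcome"); }
}

new SignupService(new SmtpMailer()).register("dev@example.com");
```
-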
Had to explain for 30 minutes to a consultant why:
1. registering the container in the container.
2. writing abstraction for the container to use the abstraction of the container internally.
3. doing service locator thus hiding dependences.
4. having your business logic code know about DI.
5. writing log4net in a massively over complicated manner just because you didn't read the fucking manual.
6. copying code from github into our source.
are just wrong. -
Teach things properly. Most teachers are confused themselves, so they start throwing keywords at even more confused students who then have no clue what they are doing, and who then ask me to do their work for them, showing me their unindented code (well... kinda; they all seem to fight with the IDE, which is trying to properly indent their mess, for some reason). Teachers think that Turbo Pascal is the way of life and that it is used everywhere (one teacher tried to tell me that Pascal is used in the stock market and in modern operating systems - U wot m8?! how high are you right now), and they don't teach user input sanitization and type checking. They stare at you like you are fucking Satan when you dare to use objects, collections and abstraction, because they are scared to death of that stuff... and then they think 60 minutes is enough to teach HTML, CSS, JS and PHP in one go (which they don't even know properly - the teacher who made and maintains the school's website is probably stuck in 1998, judging by the design and functionality of the website and his clothes), and they then send absolutely clueless students to compete in a web design competition (and then they get angry at the judges for giving the students 0 points)
-
!rant
Just learned a new thing today. This was on the Gophers Slack Channel
Every problem in software development can be solved with another layer of abstraction
Corollary: for every layer of abstraction, you create n! more problems.
Reference: https://gophers.slack.com/archives/... -
What do you think about class-based OOP vs. Prototype-based OOP?
I think that class-based OOP is an unnecessary abstraction of structs, functions and global variables most of the time.
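(For concreteness, a hedged sketch of the prototype-based flavour, in TypeScript/JavaScript where it's native; the names are made up:)

```
// Prototype-based: no class declaration, just objects delegating to objects.
const animal = {
  name: "animal",
  speak() { console.log(`${this.name} makes a sound`); },
};

// dog has no class at all; it delegates to animal through its prototype chain.
const dog = Object.create(animal);
dog.name = "dog";
dog.speak(); // "dog makes a sound"
```
-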
This was initially a reply to a rant about politics ruining the industry. Most of it is subjective, but this is how I see the situation.
It's not gonna ruin the industry. It's gonna corrupt it completely and fatally, and it will continue developing as a toxic sticky goo of selfishness and a mandatory lack of security until it chokes itself.
Because if something can get corrupted, it will get corrupted. The only way for us as a species to make IT into a worthy industry is to screw it up countless times over the course of a hundred years until it's as stable and reliable as it can possibly be and there are as many paradigms and individually reasonable standards as there can possibly be.
Look around, see the ridiculous amount of stupid javascript frameworks, most of which are just shitcode upon vulnerabilities upon untested dependencies. Does this look to you like an uncorrupted industry?
The entire tech stack is rotting, from the hundreds of thousands of lines of proprietary firmware and drivers, through the overgrown startup scene, to fucking Node.js; even technologies created just a few decades ago are unacceptable from a security standpoint. Check your drivers and firmware if you can; I bet you can't even see the build dates of most firmware you run. You can't even know if it was built after any vulnerability disclosure regarding that specific microcontroller or whatever.
Would something like this work in chemical engineering? Hell no! This is how fucking garage meth labs work, not factories or research labs. You don't fucking sell people things without mandatory independent testing. That's how a proper industry works. Not today's IT.
Of course it's gonna go down in flames. Greed has corrupted the industry, and there's nothing to be done about it now but working as much as we can, because the faster we move the sooner we'll get stuck and the sooner we can start over on a more reasonable foundation.
Or rely on layers of abstraction and expect our code to be compilable on anything the future holds for us.
-
This started as an update to my cover story for my Linked In profile, but as I got into a groove writing it, it turned into something more, but I’m not really sure what exactly. It maybe gets a little preachy towards the end so I’m not sure if I want to use it on LI but I figure it might be appreciated here:
In my IT career of nearly 20 years, I have worked on a very wide range of projects. I have worked on everything from mobile apps (both Adroid and iOS) to eCommerce to document management to CMS. I have such a broad technical background that if I am unfamiliar with any technology, there is a very good chance I can pick it up and run with it in a very short timespan.
If you think of the value that team members add to the team as a whole in mathematical terms, you have adders and you have subtractors. I am neither. I am a multiplier. I enjoy coaching, leading and architecture, but I don’t ever want to get out of the code entirely.
For the last 9 years, I have functioned as a technical team lead on a variety of highly successful and highly productive teams. As far as team leads go, I tend to be a bit more hands on. Generally, I manage to actively develop code about 25% of the time to keep my skills sharp and have a clear understanding of my team’s codebase.
Beyond that I also like to review as much of the code coming into the codebase as practical. I do this for 3 reasons. I do this because as a team lead, I am ultimately the one responsible for the quality and stability of the codebase. This also allows me to keep a finger on the pulse of the team, so that I have a better idea of who is struggling and who is outperforming. Finally, I recognize that my way may not necessarily be the best way to do something and I am perfectly willing to admit the same. I have learned just as much if not more by reviewing the work of others than having someone else review my own.
It has been said that if you find a job you love, you’ll never work a day in your life. This describes my relationship with software development perfectly. I have known that I would be writing software in some capacity for a living since I wrote my first “hello world” program in BASIC in the third grade.
I don’t like the term programmer because it has a sense of impersonality to it. I tolerate the title Software Developer, because it’s the industry standard. Personally, I prefer Software Craftsman to any other current vernacular for those that sling code for a living.
All too often, our work is compiled into binary form, both literally and figuratively. Our users take for granted the fact that an app "just works", without thinking about the proper use of layers of abstraction and separation of concerns, Gang of Four design patterns, or why an abstract class was used instead of an interface. Take a look at any mediocre app's review distribution in the App Store. You will inevitably see an inverse bell curve: lots of 4's and 5's and lots of (but hopefully not as many) 1's, and not much in the middle. This leads one to believe that even given the subjective nature of a 5 star scale, users still look at things in terms of either "this app works for me" or "this one doesn't". It's all still 1's and 0's.
Even as a contributor to many open source projects myself, I’ll be the first to admit that have never sat down and cracked open the Spring Framework to truly appreciate the work that has been poured into it. Yet, when I’m in backend mode, I’m working with Spring nearly every single day.
The moniker Software Craftsman helps to convey the fact that I put my heart and soul into every line of code that I or a member of my team write. An API contract isn’t just well designed or not. Some are better designed than others. Some are better documented than others. Despite the fact that the end result of our work is literally just a bunch of 1’s and 0’s, computer science is not an exact science at all. Anyone who has ever taken 200 lines of Java code and reduced it to less than 50 lines of reactive Kotlin, anyone who has ever hit that Utopia of 100% unit test coverage in a class, or anyone who can actually read that 2-line Perl implementation of the RSA algorithm understands this simple truth. Software development is an art form. I am a Software Craftsman.
#wk171 -
Taking over a deserter's work, the level of over abstraction and over generalization is off the charts.
WTF-per-minute (WPM): 33 -
It probably will be an unanswered question, but let's try.
Does anyone know of a large project using onion / hexagonal/ ddd or similar architecture with free access to the source code...
Or an example of said architectures that goes beyond "trivial dumb example".
The new recruits need... A lot of brushing up (I'd be for electro shock treatment and other stuff, but somehow HR thinks I'm joking).
As said, most examples I found are too basic. On the other hand, if I write a good example now, I'd need to do it either in my free time (nope, just nope) or jiggle it in somewhere on company time (aka it will never be finished nor be in a useful state).
The preferred programming language would be Java, but as I'm fluent in most languages except the forbidden ones (JavaScript and its friends)...
Anything would be helpful.
Most welcome would be an example with a focus on Adapter / Ports, e.g. abstraction of HTTP client usage / ORM etc.
Thanks.
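(To be clear about what I mean by Adapter / Ports, a thumbnail sketch; hypothetical names, and TypeScript here only because it's compact:)

```
// Port: what the domain needs, stated in domain terms.
interface UserGateway { fetchUser(id: string): Promise<{ name: string }>; }

// Adapter: one concrete way to satisfy the port (here, HTTP).
class HttpUserGateway implements UserGateway {
  async fetchUser(id: string) {
    const res = await fetch(`https://api.example.com/users/${id}`);
    return res.json();
  }
}

// Domain code sees only the port, never the fetch / ORM / driver details.
async function greet(gateway: UserGateway, id: string) {
  const user = await gateway.fetchUser(id);
  return `Hello, ${user.name}`;
}
```
-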
Sharing a first look at a prototype Web Components library I am working on for "fun"
TL;DR left side is pivot (grouped) table, right side is declarative code for it (Everything except the custom formatting is done declaratively, but has the option to be imperative as well).
====
TL;DR (Too long, did read):
I'm challenging myself to be creative with the cool new things that browsers offer us. Lani so far has a focus on extreme extensibility, abstraction from dependencies, and optional declarative style.
It's also going to be a micro CSS framework, but that's taking the back-seat.
I wanted to highlight my design here with this table, and the code that is written to produce this result.
First, you can see that the <lani-table> element is reading template, data, and layout information from its child elements. Besides the custom highlighting code (Yellow background in the "Tags" column, and green gradient in the "Score" column), everything can be done without opening even a single script tag.
The <lani-data-source> element is rather special. It's an abstraction over any data source, and you, as a developer, can add custom data sources and hook up the handlers at your whim (the element itself uses the "type" attribute to choose a handler; in this case, the handler is "download", which simply sends a fetch request to the server once and downloads the result to memory).
Templates are stored in an html file, not string literals (Which I think really fucks the code) and loaded async, then cached into an object (so that the network tab doesn't get crowded, even if we can count on the HTTP cache). This also has the benefit of allowing me to parse the HTML templates once and then caching the parsed result in memory, so templates are never re-parsed from string no matter how many custom elements are created.
Everything is "compiled" into a single, minified .js file that you include on your page.
I know it's nothing extraordinary, but for something that doesn't need to be compiled, transpiled, packaged, shipped, and kissed goodnight, I think it's a really nice design and I hope to continue work on it and improve it over time
-
1. Learn to be meticulous.
1. Learn to anticipate and prepare a functionality with up to 90% accuracy and code it in one shot.
1. Become advanced in SQL.
1. Increase my modularity abstraction awareness.
1. Learn to TDD properly.
1. Don't get angry with my kids, but explain to them why papa is always right, in a calm voice.
1. Do the same for partner.
1. Train my speed running in case partner wants to bash me.
1. Become advanced in Java.
1. Learn to write a bot.
1. Learn more about servers and hack at least one thing, even if it's a wifi.
1. Install kali linux.
1. Make myself a custom pc.
1. Ask god (or buddha if god is too busy) to make days longer.
1. Buy a vaporiser so I can smoke my weed without mixing it with tobacco.
1. Get my license.
1. Start investing.
1......... -
I feel fucking proud of myself.
I spent the better part of three days trying to figure out how to compile source code on Linux using ./configure and friends, and the best place to put the compiled output after running make, and it all works now. It's such a small thing, but seeing as I've been tarnished by Windows, this is a great accomplishment to me.
Also because I wasted days figuring this out, jumping to multiple topics, progressing deeper and deeper into different topics to figure it out... abstraction would've been nice... -
More high-level stuff, sweet libs, awesome frameworks and the next generation of devs complaining about bad performance and too much abstraction.
-
FML having to take on and support a python test framework that looks like it was written by a junior C embedded dev without a mentor.
- Imports everywhere in the code
- No abstraction or OOP
- sys.path.append fest (broken imports of course)
- Global variables fest
- No docstrings
- No readme
- Somehow mixed with a jUnit test framework as well
- Uses Windows environment variables profusely
- Pycharm has a stroke when I open files from this project
-
To create an abstraction simple enough to make complex business logic challenges solvable for non programmers.
The issue I see today is programmers solving problems they don't understand as well as the user. I think two ways might be taken:
- Programmers specialize in other fields and solve problems there
- Other professionals create their own software
Both will happen in the future (IMHO) and I want to help the second one happen.
Note: Excel does this really well, but I think we can do quite a bit better today.
-
VR/AR - inferior interaction with the computer, just like touchscreens are inferior to physical buttons
Microservices - having a stew of different coding standards will in the end introduce performance issues due to layers needed to reduce code entropy
Agile - programming was and always will be creation on top of existing technologies. Every library writer who follows Agile is making a hellhole for anyone above the layer of abstraction
Web freedom and Anonymity - At this point it should be obvious already, this fad took 30 years too long
-
You know that experience when you update a modern desktop app that uses a zillion abstraction layers and the first time you do anything it freezes for a little bit while the heavily deferred metaprogramming and asset transformations are executed and cached?
I always imagine polystyrene balls flying all over the place.
-
Alright, so now I have to extend a brand new application, released to LIVE just weeks ago by devs at our client's company. This application is advertised as a very well structured, easy-to-work-on, µservices-based masterpiece.
Well, either I lack a loooot of xp to understand the "µservices", "easy to work on" and "well structured" parts of this app, or I'm really underpaid to deal with all of this...
- part of business logic is implemented in controllers. Good luck reusing it w/o bringing up all the mappings...
- magic numbers every-fucking-where... I tried adding some constants to make it at least a tiny bit more configurable... I was yelled at by the lead dev of the app for this later.
- crud-only subservices (wrapped by facade-like services, but still.. CRUD (sub)services? Then what's a repository for...?). As a result devs didn't have a place where they could write business logic. So business logic is now in: controllers (also responsible for mapping), helpers (also application layer; used by controllers; using services).
- no transactions wrapping several actions, like removing item from CURRENT table first and then recreating it in HISTORY table. No rollback/recovery mechanism in service layers if things go South.
- no clean-code. One can easily find lines (streams) 400+ cols long.
- no encapsulation. Object fields are accessed directly
- Controllers, once they get a result from Services (i.e. the Facade), must have a tree of: if (result instanceof SomeService.SomeSubservice1.Item1) {...} else if (result instanceof SomeService.SomeSubservice2.Item4) {...} etc. to build a proper DTO. IMO this is not a way to make an abstraction - the application should NOT know the services' internals.
- µservices use different tables (hats off for this one!) but their records must have the same IDs. E.g. if I order a burger and coke - there are 2 order items in my order #442. When I make a payment I create an invoice which must have an id #442. And I'm talking about data layer, not service or application (dto)! Shouldn't µservices be loosely coupled and be able to serve independently...? What happens if I reuse InvoiceµService in some other app?
What are your thoughts?
-
Hey, would you mind trying this new powerful JS library?
It has really powerful abstraction over powerful features that compose powerful components using powerful patterns based on a powerful new asynchronous paradigm
-
The most honest GraphQL explanation:
1. Still uses your xhr/fetch/axios on the FE
2. Just sends all the requests to a single endpoint
3. On the BE, uses its own resolver schema to call the proper controller to handle the request, rather than relying on the router for that
That's all!
Just another useless layer of abstraction with its own learning curve, tricks and bugs, just as ORMs are
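(The mechanics in miniature, a hedged sketch with a made-up schema; real servers use a GraphQL library, but a single endpoint plus a resolver map is the whole trick:)

```
// Every query goes to the one endpoint...
async function gql(query: string) {
  const res = await fetch("/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  return res.json();
}

// ...and on the server, a resolver map decides which "controller" runs.
const resolvers: Record<string, () => unknown> = {
  user: () => ({ id: 1, name: "dev" }),
  posts: () => [{ title: "rant" }],
};

gql("{ user { name } }");
```
-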
My reasoning is stupid, I just think it's cute in a pimp my ride kind of way. I heard you like getting colossally pounded in the fucking ass, so we put a virtual machine inside your compiler so you can use your binaries while you compile your binaries.
But there is a practical angle to it, too. It's state, structures and execution within the code itself -- that is, in a sense, generators "embedded" within the source, but without any kind of special syntax.
Rather, the code is all the same, and I'd have the option to make calls at compile time: the output of these calls could, in turn, be part of the resulting binary or processed by further calls.
It'd greenlight the wildest fuckery in the jungle, because *that* is the true and ultimate abstraction: programs that write other programs with minimal human intervention. But is my (still) theoretical, cheap ass two-dollar prototype approach held together with clown jizz and prayers better than the endless cumloads worth of corporate investment that's dumped and pumped into generative AI on a daily basis?
Well... **lights cigarette**
That's what we're about to find out, mother fuckers.
-
To be honest with you, I’ve never had a bad experience with PHP.
Yes, it’s “dirty” compared to something like Haskell, but it’s not a bad thing. Dirty things usually bring simplicity and allow implementing the intended case super quickly, at the cost of breaking apart at scale. There are no bad tools, there are wrong tools for the job.
Premature optimization is the root of all evil. The more I launch new projects for me/other companies, the more I come to the realization that the vast majority of the projects out there will never see scale. They will be proven non-viable/impractical and deemed obsolete way before they outgrow the $20 VPS they were hosted on.
Sometimes (all the time, really) launching quickly like there is no tomorrow is the most viable business strategy. If (yes, "if", not "when") your project outgrows PHP and gets to the point where PHP's abstraction model is the bottleneck, you'll have the money to rewrite the project in any language out there, trust me.
As someone said on a biking subreddit to a person who asked how to buy the newest super-aero helmet, "if the aerodynamics of the old helmet is what holds you back, someone will be sending you the new one for free".
-
TL;DR - I came up with an ingenious version of a solution to a problem and still got 0 marks.
In my bachelor's degree we learned about abstraction, as usual for CS degree students.
In a later exam, a coding question asked us to swap two variables' values without using a third variable and print the before and after on the screen.
You can read the question above again, because wait for it....
So this is what I wrote basically (JS equivalent solution),
class Solution {
constructor(obj) {
this.var1 = obj.var1;
this.var2 = obj.var2;
}
swap() {
return {"var1": this.var2, "var2": this.var1};
}
}
let input = {"var1":5, "var2": 7}
let object = new Solution(input);
console.log('Before');
console.log(input);
let solution = object.swap();
console.log('After');
console.log(solution);
Now look, before your boomer asses jump in and say "aCkChUaLlY tHiS iS iNcORrEcT"
I did include all kinds of comments explaining that this is abstracted. The swap function is hidden away and the object variable doesn't need to know what it's doing.
In the context of this question, this is absolutely acceptable as a solution since the end-goal is to print the results on the screen and the user wouldn't see the source code.
I still got 0 on that question and I still get pissed about it sometimes, when I remember it, like just now.
-
I always thought wordpress was ok, not great not terrible, from a coding perspective. Now every new framework I have worked on makes me see why Wordpress is on 40% of the internet.
Now I love wordpress not because of what it did do, but because of all the really stupid things it managed to avoid doing including: over abstraction, trend chasing, using "new transformative technology" that disappears in 2 years, breaking plugin economy with updates and making devs start over, making everything OOP for the sake of making everything OOP, making adding on a bit of code take multiple files of multiple formats and boiler plate code, boiler plate code, compiling dependencies, composer, twig, laravel, one page applications, react, angular, vue, javascript only stacks (MEAN), not letting you control sql queries, protected/private scopes and design that doesn't let you fix or alter bad code others did, and the list goes on and on.
Wordpress did a lot right, and devs should try learning from it instead of making more problems to solve. Sure it's not elegant, but you know what it does do? Focus on solving a problem. Then it does. Without inventing new ideas or concepts to inject into the code and create new problems.
And you know what else? Hooks are actually very well implemented in Wordpress. I've seen it done much worse.
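The pattern is simple enough to sketch in a few lines of Python (the real thing is PHP, but the shape is the same: callbacks registered per hook, dispatched in priority order):
```
from collections import defaultdict

_actions = defaultdict(list)

def add_action(hook, callback, priority=10):
    _actions[hook].append((priority, callback))

def do_action(hook, *args):
    # fire callbacks for this hook, lowest priority number first
    for _, cb in sorted(_actions[hook], key=lambda pair: pair[0]):
        cb(*args)

add_action("save_post", lambda post_id: print(f"indexing post {post_id}"))
do_action("save_post", 42)   # -> indexing post 42
```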
Honestly my main gripe with the entire platform is the slow move to OOP for no reason, and the database design should separate post types into different tables; the current design makes it less scalable for large data sets for multiple reasons, so I'd fix that.5 -
I was trying to make a circular buffer in C++. I was also trying to expose iterators for using the buffer with STL algorithms. I kept trying to think about how to add the functions needed to manipulate the existing internal iterators to not exceed the bounds of the buffer. Then I realized I was "too close" to the problem. There was no way I could properly control the internal iterators of the storage vector I was using. Not without giving too much power to the user of my library. So I abstracted the iterators up one level. Hid all the details of the internal iterator and made a new iterator.
The solution of abstracting the iterator was not the epiphany. The epiphany was if you are struggling with how to solve a particular problem. You keep running into problems with how to represent something, there is too much power available at a particular representation, or the object you are trying to make work just don't fit. This is when you should consider abstracting a level up. Take a higher look at the problem and simplify the interface.
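The shape of that fix, sketched in Python rather than the original C++ (a toy, but it shows the "one level up" iterator):
```
class RingBuffer:
    def __init__(self, capacity):
        self._data = [None] * capacity
        self._start = 0
        self._size = 0

    def push(self, item):
        end = (self._start + self._size) % len(self._data)
        self._data[end] = item
        if self._size < len(self._data):
            self._size += 1
        else:  # full: overwrite the oldest, advance the start
            self._start = (self._start + 1) % len(self._data)

    def __iter__(self):  # the abstracted iterator: a fresh, bounds-safe view
        for i in range(self._size):
            yield self._data[(self._start + i) % len(self._data)]

buf = RingBuffer(3)
for n in range(5):
    buf.push(n)
print(list(buf))  # [2, 3, 4] -- works with anything that consumes iterators
```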
Abstraction could be a number of things. Divide and conquer, hiding details, specializing an object, etc. Whatever tool is needed to make the problem more consumable to your brain. -
Low code platforms (e.g. OutSystems) will gradually win over developers who will look back at all the time they've wasted and think "why the fuck didn't you tell me about this sooner!!!"
Disclaimer: I work for OutSystems and can honestly say that I can create a production-ready app while you're still picking the best JS framework to get started with. Been there...
Actual rant: Time to move up the abstraction layer, get stuff done on time and on budget and go home (also on time) to enjoy life!12 -
css frameworks are a sign your ui/ux team is an empty bag of chips.
vuetify examples look like toys in their docs and work that way in prod. if you put any two vuetify components together on a page you basically don't have a website anymore. mx/px spacing helpers are indicators that your styling abstraction is so bad that adding 8 resize shims to every single node on the dom is the correct solution to your visual spacing dilemma.
css offers so many powerful tools out of the box now, and it takes like a week to actually learn them. instead, we cloak all the functionality and expressiveness of modern css in black-box m a t e r i a l d e s i g n and pretend like obtuse blobs are a viable substitute for coherent, accessible, user-friendly ux.5 -
One thing that is really difficult is when you are writing let's say C code for months, and then you switch to C# or Python, you immediately use C-style logic and forgo the easier, shorter Python syntax!
I did a python kata on the Codewars website. After submitting it, I realized my solution was like 10x longer than every other solution. Talk about facepalm.
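The classic symptom looks something like this (a made-up kata, not the actual one):
```
# made-up kata: sum the squares of the even numbers
def sum_even_squares_c_style(nums):
    total = 0
    i = 0
    while i < len(nums):  # index-juggling straight out of C
        if nums[i] % 2 == 0:
            total = total + nums[i] * nums[i]
        i = i + 1
    return total

def sum_even_squares(nums):  # the Python-brain version
    return sum(n * n for n in nums if n % 2 == 0)

assert sum_even_squares_c_style(range(10)) == sum_even_squares(range(10)) == 120
```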
Stuck to basics and forgot about Python's amazing shortcuts. What are you going to do!1 -
I really hate the term fullstack developer. Just call it what it really is. Javascript react developer who dabbles in node occasionally.
If you don't have some knowledge of tuning a database, tuning your runtime, handling issues with networks and latency in your code, dealing with issues with message queues, writing abstraction layers for the database, etc. you aren't a backend developer, sorry to say.
Being able to reason about a MEAN stack running on Digital Ocean does not make you proficient in the backend.3 -
So apparently friends have access to privates.
Just some coding thoughts. Yes I'm talking about C++, not...
Indeed, programmers are tiny Gods.1 -
Scala is as horrible as Java. Been using Java at uni and once having discovered the simplicity and beauty of other languages (Python, golang), never went back.
Currently trying some apache projects (kafka streams, apache flink) where Scala is native. Same crap as Java. Needs 10x the lines to write the same thing, abstraction over abstraction, and intuitive = 0. Why tf did it even get invented?6 -
I feel so lost all the time Everytime I think about the future. How are you all going forward?
- What should i be doing ? I used to like computer science when it was taught with lots of simplification and abstraction (in the school level). Now i know there are a 100+ research areas/work areas/branches in it, and i am an average in all of them.
I like most of them more or less, and won't mind giving away my years of life working/learning them. But for what and why?
-- Money? Every profile turns into a decent salary after a certain time. This means i can ride any boat i want.
-- Passion/interest? Now what exactly is this? as i said everything feels doable, given enough time to get a hang of it.
-- Fame? It's rare that developers, testers or other individuals in computer science ever get solo credit. Most of the time it's either the ceos, the researchers or the company itself. So i guess getting fame is equal to burning your neighbors by flaunting your cash for most ppl
-- Happy life? Meh, this point is affected by a lot of other factors. Would come back to this point later
- every day in my feed, there are people showing 6, 7 sometimes even 8 figure salaries. Other people would get inspired by those, but i feel very weird about these.
I never see myself earning those, idk why. Why would someone give me those huge amounts?
How do you find yourself deserving of that big ass money? At what point do you hit that realisation? Here is a small story:
I did an Android dev course around 2.5 years ago. There was a guy there a year older than me. He was very bad at this, i tell you. Most of the time, i was explaining the concepts to him after class. So last year he graduated, and took a job. We both used to expect a decent salary amount, say x (with me having a little ego that i expect certainly more than him, say x+20%), but he took a job for half that number, say x/2.
After 1 increment and 1 job shift in 1.5 years, he has now successfully achieved a package greater than x. I on the other hand, being still at college and with a lot of bad internship experiences, now feel that i won't be getting even x/3 at my start no matter what.
- There is also this thing about people going into more of a management and other non tech roles once they start growing in this field. Why? What did they realize? I am sure not everyone of them would have hit this realization that tech is not what they want to do (which i can't understand why). Maybe its the money and/or happy life expectations?
i have started to feel dumb for not being able to think of innovative new ideas and being an average mind :/
And about the happy life, so far its not much happiness for me, and am confused.
I am grateful about the usual things i have (healthy middle class parents, working body, roof , food,etc) , unhappy about the things i don't and see with others (more money, materialistic assets, confidence, siblings, social life, love life, etc) and that's it.
From what i understood of 21 years on this earth is that everyone is running to achieve that list of their desires and wants to move them from todo to done, like a trello task. If you can't, then keep fighting to achieve it or grudgingly accept the fact that you couldn't and be happy about it.
So is that it? That's your happy life goals?2 -
There is an idea of a perfect woman. Some kind of abstraction. But there is no real me. Only an entity. Something illusory. And though I can hide my cold gaze, and you can shake my hand and feel flesh gripping yours, and maybe you can even sense our lifestyles are probably comparable, I simply am not there.2
-
1/2 dev and a fair warning: do not go into the comments.
You're going anyway? Good.
I began trying to figure out how to use stable diffusion out of boredom. Couldn't do shit at first, but after messing around for a few days I'm starting to get the hang of it.
Writing long prompts gets tiresome, though. Think I can build myself a tool to help with this. Nothing fancy. A local database to hold trees of tokens, associate each tree to an ID, like say <class 'path'> or some such. Essentially, you use this to save a description of any size.
The rest is textual substitution, which is trivial in devil-speak. Off the top of my head:
my $RE=qr{\< (?<class> [^\s]+) \s+ ' (?<path> [^']+) ' \>}x;
And then? match |> fetch(validate) |> replace, recurse. Say:
while ($in =~ $RE) {
my $tree=db->fetch($+{class}, $+{path}); # look up the saved token tree
$in=~ s[$RE][$tree]; # splice it in, then rescan for nested references
};
Is that it? As far as the substitution goes, then yeah, more or less. We have to check that a tree's definition does not recurse for this to work though, but I would do that __before__ dumping the tree to disk, not after.
There is most likely an upper limit to how much abstraction can be achieved this way, one can only get so specific before the algorithm starts tripping balls I reckon, the point here is just reaching that limit sooner.
So pasting lists of tokens, in a nutshell. Not a novel idea. I'd just be making it easier for myself. I'd rather reference things by name, and I'd rather not define what a name means more than once. So if I've already detailed what a Nazgul is, for instance, then I'd like to reuse it. Copy, paste, good times.
Do promise to slay me in combat should you ever catch me using the term "prompt engineering" unironically, what a stupid fucking joke.
Anyway, the other half, so !dev and I repeat the warning, just out of courtesy. I don't think it needs to be here, as this is all fairly mild imagery, but just in case.
I felt disappointed that a cursed image would scare me when I've seen far worse shit. So I began experimenting, seeing if I could replicate the result. No luck yet, but I think we're getting somewhere.
Our mission is clearly the browning of pants, that much is clear. But how do we come to understand fear? I don't know. "Scaring" seems fairly subjective.
But I fear what I know to be real,
And I believe my own two eyes.11 -
Python list / dict comprehension, one of the numerous reasons I truly love python. Simple and powerful abstraction2
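For the uninitiated, a quick taste:
```
users = [{"name": "ada", "id": 1}, {"name": "linus", "id": 2}]

squares = [n * n for n in range(10) if n % 2 == 0]   # filter + map, one line
by_name = {u["name"]: u for u in users}              # index a list by key

print(squares)          # [0, 4, 16, 36, 64]
print(by_name["ada"])   # {'name': 'ada', 'id': 1}
```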
-
I want to know the name of the evil mastermind who once conceived the "literal" function in Sequelize.
- You design a method to insert pieces of raw SQL exactly the way they are written, no further processing
- You release this method, you call it LITERAL to make sure people know its intended purpose: it is used to insert LITERALLY everything you write, nothing more and nothing less
- Then make sure this "literal" method changes the fucking case of column names. Because that's what "literal" means in the head of this rabid animal: you arbitrarily change the code written by the developer
WHY
WHY ARE ALL AR ORM DESIGNED BY FUCKING ANIMALS
ELOQUENT IS TRASH, SEQUELIZE IS TRASH, TENS OF DEVELOPERS AT WORK TO ALCHEMICALLY CREATE THE MOST ROTTEN CODE THEY POSSIBLY CAN, BECAUSE YOU MUST NOT BE ALLOWED TO WRITE ANY QUERY MORE ADVANCED THAN "SELECT * FROM users WHERE id =1", NOT A FUCKING SHRED OF DOCUMENTATION AND 16 MILLION LAYERS OF ABSTRACTION TO MAKE SURE EVERY BUG FUCKING STAYS THERE, DON'T YOU DARE TO USE A JOIN, DON'T YOU DARE TO TREAT A DMBS LIKE AN ACTUAL FUCKING DBMS INSTEAD OF A HOT STEAMING PILE OF METHODS IMPLEMENTED BY MONKEYS.6 -
Making software is science. I'm not talking about overengineering, just doing things right (with a minimum of automated testing, abstraction, architecture easy to modify, stuff like that). If you don't want to invest money in science, but only in business, get external providers for parts of your product, if not all of it. Stop making custom stuff that already exists, unless you can make something better (because it most probably won't be cheaper, regardless of the quality level).2
-
So I've been forced to do a new project with Spryker (PHP ecommerce framework) ... some of you guys may know that normally I do SAP hybris stuff (since a few years). I tend to rant about it (not just here but everywhere :-D)
But now as I'm trying to extend Spryker's domain model I realize how good hybris actually is ...
Well actually I always knew that the persistence layer is awesome (that's why I tried to implement a quite similar approach with https://core-next.io). But the rest is ... shit.
But Spryker's persistence layer - based on Propel - is ... just ... WTF. Designing tables and relations in XML on such a low-level-with-basically-no-abstraction-even-not-allowing-proper-inheritance WTF.
Fuck you Spryker, fuck you Propel, fuck you PHP. I want my beloved java-hybris-fuck-the-world-persistence-layer back ... -
It just hit me that despite being possibly the most object-heavy language out there, JavaScript actually wasn't even properly object oriented for the longest time. No language-level support for Encapsulation, Inheritance, and without a strict class system, it can never really have polymorphism or abstraction.
Since literally everything is an object, it's impossible to make it object oriented 🤯6 -
Don’t fall for the “language A is better than language B” -it’s all just syntax (within the same paradigm). Learn higher concepts such as: data structures, abstraction, loops, variables, conditionals, functions.
Only then can you decide that your favourite language is better than anyone else’s...5 -
It'd be cool if we could get something like Schenzen.Io going, but you build the chips from the gates up
Maybe package them into modular units, and connect those at a higher level of abstraction, ad infinitum.
Then add access to virtual LCD output, and other peripherals, or even map output to real hardware, essentially letting you build near bare-metal virtual machines.
Don't know about that last part, but the closest I've seen to the rest is circuit simulator and again, schenzen.
On the machine learning front I figured out I need about ten times as many training samples as validation samples, or vice versa. I'll have to check my notes. Explains why I could get training loss below 2.11
Also, I'm looking at grouping digits, and trying different representations. I'm looking at the hidden variables for primorials to see what that reveals. And I realized that because of the amount of configurations and training I want to do, even a personally built cloud isn't going to be sufficient. I'm gonna have to rent someone else's hardware and run it "in duh cloud."
Any good providers that are ridiculously upfront for beginners to get started with? Namely something cheapish.3 -
Why do we still speak in direct DNS?
I don't know about you, but I have observed so many DNS mishaps in my day, and also have observed that developers and non-devs consistently fail to have a succinct mental model of how to set DNS properly for a website.
There are lots of services that make setting DNS easier than ever, but I'm kind of surprised so many people still have to think directly in terms of CNAMES, APEX DOMAINS, and all the direct domain knowledge of DNS.
Can't we have a higher level abstraction that compiles to DNS with more safety guards? Sure, let me dip into DNS when I need to, but why are DNS settings tables still such a normal thing?
I write Ruby code so I don't have to write C code. I'm sure there are attempts in DNS abstraction, but the fact that I haven't come across them means they are probably still too leaky or just not mainstream.
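Something like this is what I mean; a Python toy where every name is hypothetical, you declare intent and the tool compiles it down to records and applies the guard rails:
```
records = []

def site(domain, target):
    # naive apex check, toy only: a CNAME at the zone apex is illegal,
    # so fall back to ALIAS (provider-specific, which is exactly the mess)
    if domain.count(".") == 1:
        records.append((domain, "ALIAS", target))
    else:
        records.append((domain, "CNAME", target))

site("example.com", "app.host.net")       # apex: becomes ALIAS
site("www.example.com", "app.host.net")   # subdomain: becomes CNAME
print(records)
```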
Thoughts on the matter?4 -
According to my university lecture you have clean and good code if every tiny little functionality is split into 5+ files. Gotta have an interface, factory, low level implementation, high level implementation, and at this point I don't even know what purpose the other abstraction levels have. Just end me already...
Sometimes I think of how much great and useful stuff you could learn at an university if they used time efficiently. But instead you spend years mostly just studying theoretical or very abstract topics. Whereas 80%+ of useful knowledge and skills you learn on your own.3 -
"Abstract all the things!"
Theres so much fucking abstraction and so many different ways to abstract shit that I don't fucking know how anything works anymore.
Fuck this5 -
I'm thinking about what language to dive into next.
I already have a pretty good knowledge of Go and mediocre knowledge of C and Java.
So far I thought about...
1. CPP, as I need it for school and it runs on literally anything.
2. Rust, as it seems to spread and the combination of low-level, memory-safety and abstraction seems pretty appealing to me.
3. Kotlin, specifically kotlin-native, is it combines java-like high-level programming with native speed.
4. Nim, as it combines high-level techniques with c-like freedom.
What do you people recommend, or something completely else?6 -
Urgh... No exceptions in Rust annoys me. Now you only have the choice between "this didn't work please handle this error, thank you ^-^" and "you fool, prepare for annihilation". So basically if anything remotely serious happens your program's dead and there's nothing you can do about it. I don't get why people have this hate for exceptions. Every time a new language gets made it's always either "ew it has exceptions" or "it's so nice it doesn't even have exceptions". NOOO! They can deal with serious situations in the best possible way and they can be statically checked (so no "but they're so complex and unpredictable" stuff please). If you can expect an exception they shouldn't be used in the first place (even though they are absolutely no less good than Option return types or whatever, just different) but in cases when it's impossible to predict an error they really shine. And not having them makes your language worse. If a device driver accesses illegal memory it should throw an exception, so instead of the computer shitting the bed, first the offending function has a chance to resolve the problem at its root, then a few functions up the call stack, the general control functions of the device drivers can handle it and restart the operation if applicable, and even if the driver fails to handle it, the OS can jump in and restart the driver, log an error and do whatever. It's absolutely beautiful: this hierarchical ramp from near the accident site to more high level operations code ensures the error can be caught at the right level of abstraction without introducing a lot of boilerplate. If everything fails and nobody can handle it *then* the program or kernel or whatever can panic.4
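That ramp is easier to show than describe; a toy Python version (since Python kept its exceptions), each layer handling the failure at its own level of abstraction:
```
class DriverFault(Exception):
    pass

def read_sector(n):                 # lowest level: the accident site
    if n < 0:
        raise DriverFault(f"illegal access at sector {n}")
    return b"\x00" * 512

def driver_read(n):                 # driver control code: try to repair
    try:
        return read_sector(n)
    except DriverFault:
        return read_sector(abs(n))  # recover with a corrected request

def os_read(n):                     # OS level: log, restart driver, move on
    try:
        return driver_read(n)
    except DriverFault as e:
        print("restarting driver after:", e)
        return b""

print(len(os_read(-7)))             # 512: handled mid-ramp, no panic
```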
-
Can someone please explain why LISP and LISP inspired langs breed the most insufferable twats?
I mean, just look at this, I'm trying to learn Clojure and happened across this site/slash book: braveclojure.com
Some highlights:
>Chapter 7 - Clojure Alchemy: Reading, Evaluation, and Macros:
>The philosopher’s stone, along with the elixir of life and Viagra, is one of the most well-known specimens of alchemical lore, pursued for its ability to transmute lead into gold. Clojure, however, offers a tool that makes the philosopher’s stone look like a mere trinket: the macro.
> The -> also lets us omit parentheses, which means there’s less visual noise to contend with. This is a syntactic abstraction because it lets you write code in a syntax that’s different from Clojure’s built-in syntax but is preferable for human consumption. Better than lead into gold!!!
>Chapter 10 - Clojure Metaphysics: Atoms, Refs, Vars, and Cuddle Zombies:
>The Three Concurrency Goblins are all spawned from the same pit of evil: shared access to mutable state.
>In fact, Clojure embodies a very clear conception of state that makes it inherently safer for concurrency than most popular programming languages. It’s safe all the way down to its meta-freakin-physics.
And look at this: https://quora.com/Why-are-Lisp-prog...
It reminds me of Python before the data-science craze and its adherents thought IT was God's programming language.1 -
I have come to an interesting realization. I am nothing more than an abstraction layer around Jira.1
-
(I'm not completely sure of what I'm saying here, so don't take this too seriously)
Settling on a language to write the api for ranterix is hard.
I'm finding a lot of things about elixir to be insanely good for a stable api.
But I'm having a lot of gripes with the most important elixir web framework, phoenix.
Take a look at this piece of code from the phoenix docs:
```
defmodule Hello.Repo.Migrations.CreateUsers do
  use Ecto.Migration

  def change do
    create table(:users) do
      add :name, :string
      add :email, :string
      add :bio, :string
      add :number_of_pets, :integer

      timestamps()
    end
  end
end
```
Jesus christ, I hate this shit.
Wtf are create, add and timestamps. Add is somehow valid inside the create, how the fuck is that considered good code? What happens if you call timestamps twice? It's all obscure "trust me, it works" code.
It appears to be written by a child.
js may have a million problems. But one thing I like about CJS (require) or ESM (import) is that there's nothing unexplained. You know where the fuck most things come from.
You default export an eatShit() function on one file and import it from another, and what do you get?
The goddamn actual eatShit function.
require is a function the same way toString is a function and it returns whatever the fuck you had exported in the target file.
Meanwhile some dynamic langs are like "oh, I'll just export only some lang construct that i expect you to specify and put that shit in fucking global of the importing file".
Js is about the fucking freedom. It won't decide for you what things will files export, you can export whatever the fuck you want, strings, functions, classes, objects or even nothing at all, thanks to module.exports object or export statement.
And in js, you can spy on anything external, for example with (...args) => { debugger; return fnToSpyOn(...args); }
You can spoof console.log this way to see what the fuck is calling it (note: monkey patching for debugging = GOOD, for actual programming = DOGSHIT)
To be fair though, that is possible because of being a dynamic lang and elixir is kind of a hybrid typed lang, fair enough.
But here's where i drop the shit.
Phoenix takes it one step further by following the braindead ruby style of code and pretty DSLs.
I fucking hate DSLs, I fucking hate abstraction addiction.
Get this, we're not writing fucking poetry here. We're writing programs for machines for them to execute.
Machines are not humans with emotions or creativity, nor feel.
We need some level of abstraction to save time understanding source code, sure.
But there has to be a balance. Languages can be ergonomic for humans, but they also need to be ergonomic for algorithms and machines.
Some of the people that write "beautiful" "zen" code are the folks that think that everyone who doesn't push the pretty code agenda is a code elitist that doesn't want "normal" people to get into programming.
Programming is hard, man, there's no fucking way around it.
Sometimes operating system or even hardware details bleed into code.
DSLs are one easy way to make code really really easy to understand, but also make it really fucking hard to debug or to lose "programming meaning".7 -
Css positioning is harder to understand than the full OOP concept.
So i wanted to create a very simple page with a single css file. I spent 2 hours to position the buttons in the fixed header and center some things.
I got the whole OOP concept with abstraction, polymorphism and inheritance in an hour and could use it right after without problems.
Holy shtcake i never want to do frontend.8 -
Sydochen has posted a rant where he is not really sure why people hate Java, and I decided to publicly post my explanation of this phenomenon, from my point of view.
So there is this quite large domain, on which one or two academical studies are built, such as business informatics and applied system engineering which I find extremely interesting and fun, that is called, ironically, SAD. And then there are videos on youtube, by programmers who just can't settle the fuck down. Those videos I am talking about are rants about OOP in general, which, as we all know, is a huge part of studies in the aforementioned domain. What these people are even talking about?
Absolutely obvious, there is no sense in making software in a linear pattern. Since Bikelsoft has conveniently patched consumers up with GUI based software, the core concept of which is EDP (event driven programming or alternatively, at least OS events queue-ing), the completely functional, linear approach in such an environment does not make much sense in terms of the maintainability of the software. Uhm, raise your hand if you ever tried to linearly build a complex GUI system in a single function call on GTK, which does allow you to disregard any responsibility separation pattern of SAD, such as long loved MVC...
Additionally, OOP is mandatory in business because it does allow us to mount abstraction levels and encapsulate actual dataflow behind them, which, of course, lowers the costs of the development.
What happy programmers are talking about usually is the complexity of the task of doing the OOP right in the sense of an overflow of straight composition classes (that do nothing but forward data from lower to upper abstraction levels and vice versa) and the situation of responsibility chain break (this is when a class from lower level directly!! notifies a class of a higher level about something ignoring the fact that there is a chain of other classes between them). And that's it. These guys also do vouch for functional programming, and it's a completely different argument, and there is no reason not to do it in algorithmical, implementational part of the project, of course, but yeah...
So where does Java kick in you think?
Well, guess what language popularized programming in general and OOP in particular. Java is doing a lot of things in a modern way. Of course, if it's 1995 outside *lenny face*. Yeah, fuck AOT, fuck memory management responsibility, all to the maximum towards solving the real applicative tasks.
Have you ever tried to learn to apply Text Watchers in Android with Java? Then you know about inline overloading and inline abstract class implementation. This is not right. This reduces readability and reusability.
Have you ever used Volley on Android? Newbies to Android programming surely should have. Quite verbose boilerplate in google docs, huh?
Have you seen intents? The Android API is, to put it mildly, messy with all the support libs and Context class ancestors. Remember how many times the language has helped you to properly orient in all of this hierarchy, when overloading a method declaration requires you to use 2 lines instead of 1. Too verbose, too hesitant, distracting - that's what the lang and the api is. Fucking toString() is hilarious. Reference comparison is unintuitive. Obviously poor practices are not banned. Ancient tools. Import hell. Slow evolution.
C# has ripped Java off like an utter cunt, yet it's a piece of cake to maintain a solid patternization and structure, and keep your code clean and readable. Yet, Cs6 already was okay featuring optionally nullable fields and safe optional dereferencing, while we only finally got lambda expressions in J8, in 20-fucking-14.
Java did good back then, but when we joke about dumb indian developers, they are coding it in Java. So yeah.
To sum up, it's easy to make code unreadable with Java, and Java is a tool with which developers usually disregard the patterns of SAD. -
Figure I can simplify the code if I have the compiler handle *some* of the register allocation.
Eh? What do you mean "NP-hard"? Dafuq's an ENN-PEE?
**frantically reads wiki**
I can proudly say that I understood absolutely nothing; CS stands for cocksucker or rather abysmal failure at the most basic forms of communication, I don't just sit here all day expecting you to flawlessly prove my point with every swallow of breath you draw, yet here we are.
Perhaps one factor involved in producing the generalized cluelessness of my colleagues, I mean their "imposter s*ndrome", has a bit to do with how fucking thick you've formulated this glorified bollocks you call theory. Were it not for your incompetence, arcane crackheads like me would simply __not__ be capable of rising to the top of this field entirely via determination and a big salami, therefore I owe you both a debt of gratitude as well as every last word and sign of total disrespect.
As interesting as the study of computational complexity can be, if done correctly that is, you idiots are stuck in a mathematician's abstract mindset in a field entirely devoted to application of ideas rather than *just* the ideas themselves.
To answer my own question, it means there's no known efficient solution. That's it. The part about nondeterministic polynomial convolution of an irreductible rectosigmoid junction can apparently be skipped altogether. Anyway, I solved the problem with the computational equivalent of pizza sticks while you were out in the field mentally jacking off to λ.
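For the record, the pizza-sticks tier of this problem is greedy linear scan; a Python sketch with toy live intervals (not my actual approach, just the textbook-minus-the-λ version):
```
# intervals: {name: (start, end)}; k registers; returns name -> "rN" or "spill"
def linear_scan(intervals, k):
    assignment, active, free = {}, [], [f"r{i}" for i in range(k)]
    for name, (start, _end) in sorted(intervals.items(), key=lambda kv: kv[1][0]):
        for old in [a for a in active if intervals[a][1] <= start]:
            active.remove(old)              # interval expired: free its register
            free.append(assignment[old])
        if free:
            assignment[name] = free.pop()
            active.append(name)
        else:
            assignment[name] = "spill"      # no register left: to the stack
    return assignment

print(linear_scan({"a": (0, 4), "b": (1, 3), "c": (2, 5), "d": (4, 6)}, 2))
# {'a': 'r1', 'b': 'r0', 'c': 'spill', 'd': 'r0'}
```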
Lecture is over, now go clean up the ethereal masturbatory residue if you will, I have mystical el Khwarizmi type-shit to solve via further clubbing of abstraction through liverwurst bologna of immense proportions. ^D3 -
Me: Keep the abstractions in check. Go too far and you'll end up with something called 'thing' or 'object'!
C#: Hold my beer 🍺🍺🍺7 -
I'm working with an xml schema that effectively emulates xml... inside xml...
It looks like this
<child>
  <tag>ABC</tag>
  <value>DEF</value>
</child>
I don't even want to get started on how it handles child elements.
This is some next level abstraction hell.
And it's not like it can't use normal XML tags. In other parts of the project it uses a slightly more sane schema.5 -
Sometimes people want to be too smart. If you want to consume a handful of different RESTful APIs, it might make sense to abstract away some common functionality in your client implementation — yet to assume they follow the same convention in how their URI is built is borderline insane.
All I wanted to do was to change one API to a newer version, and now the implementation breaks for at least two others because it was done in an Abstract class and now I have to untangle that mess.
In some cases code duplication wouldn't be that bad. Even if an otherwise unrelated API seemingly share the same contract, still assume it has its own contract. You never know how those API evolve and I proclaim they will evolve towards breaking your assumptions.1 -
This aesthetics in physics that is hard to explain to someone outside the field. How beautiful for example that you can express the whole classical mechanics in just 1 equation and electro magnetism in 4, that sense of symmetry and clarity they express. Essentially in what Einstein and others believed: that the universe is a orderly place not chaos and that its rules can be understood by our crippled minds.
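(For reference, the equations presumably being alluded to: the principle of least action for classical mechanics, and Maxwell's four for electromagnetism.)
```
% classical mechanics in one equation: the principle of least action
\delta S = \delta \int L(q, \dot{q}, t) \, dt = 0

% electromagnetism in four: Maxwell's equations, differential form
\nabla \cdot \mathbf{E} = \rho / \varepsilon_0
\nabla \cdot \mathbf{B} = 0
\nabla \times \mathbf{E} = - \partial \mathbf{B} / \partial t
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \, \partial \mathbf{E} / \partial t
```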
And I think there is a similar notion in code. As physicists are driven to more general and powerful theories that shall some time explain all interactions of matter that we know of, programmers, I believe, strive for similar ideals: brevity, conciseness, generality, abstraction, powerfulness of your symbolic system - one line of code to end it all. -
I go to add a method call in a business logic class that's used exclusively in a particular service, and get blocked in the PR by some other guy-
Other Guy: refactor this into the shared framework referenced by all our microservices
Me: it is only and would only be used in this service
OG: what about the other business logic class in this service?
Me: it's not used there, and if it does end up used there then we can refactor it into a class that they both reference then
OG: I need to know when the abstraction of this function will be done. is it going to be delivered next sprint?
Me: YAGNI - better to avoid doing extra work when we don't know if we'll even need it
OG: tbh you can still abstract it with some generics and lambda magic, but im not gonna enforce that
Me: premature abstraction is the root of all evil (tongueout)
OG: not really, its the root of not having a million miles of tech debt in 2 years
I just can't win for losing with the anti-YAGNI yogi.1 -
TL;DR: Brainfuck & Abstraction is so cool!
One of my dreams is to make a Mandelbrot Fractal with Brainfuck, like the one on Rosetta Code.
I'm too lazy... So i'm writing a Compiler for Brainfuck.
Right now i have 900 lines of Python code and the operations VAR, SET, ADD, SUB, MUL with nested operation compatibility, IF, ELSE, ENDIF...
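A toy version of the emitter idea, nowhere near the real thing (cell layout and everything here made up for illustration):
```
# compile SET/ADD on numbered cells down to raw Brainfuck
def move(frm, to):
    return (">" * (to - frm)) if to >= frm else ("<" * (frm - to))

def SET(cell, value, pos=0):
    # zero the cell with [-], then bump it up to `value`
    return move(pos, cell) + "[-]" + "+" * value + move(cell, pos)

def ADD(src, dst, pos=0):
    # destructive add: while src != 0, decrement src and increment dst
    code = move(pos, src) + "[-" + move(src, dst) + "+" + move(dst, src) + "]"
    return code + move(src, pos)

print(SET(0, 5) + SET(1, 3) + ADD(1, 0))  # cell 0 ends up holding 8
```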
Probably i would be faster doing it directly in BF, but damn, abstraction is so cool!!4 -
Those semi-ugly contraptions you hide away underneath an "abstraction layer" in the hope that no-one stumbles upon them...2
-
Learning how to break a result into the steps necessary to produce it, along with the broader concept of abstraction in computer science has allowed me to apply this thinking to my personal experience. I've traced personality traits and behaviors to specific events from my childhood, and can finally relax knowing that understanding computers has given me all the linguistic tools I need to talk to myself, which traditionally has been impossible. I no longer feel trapped in a terrifyingly imaginative mind.
-
fuck it, tell me straight.
Can i live in this tech world with poor math skills and no interest in web dev and designing?
my experience as native mobile dev was enjoyable and still is, but i fear that this is not a very broad career choice.
You see there is blockchain, dapps, hybrid apps, webapps, server designing, tensorflow models and Ai models (though they can be integrated with native apps too i guess), and many more tech and therefore jobs that rely on knowing about webdev. and all i know is how to make a decent native java app.
and why the fuck should i join this web dev cult? its such a fucking mess. 8 different types of text sizes, <b> and <strong> being the same thing, do you know about a thing called abstraction? My android studio would give me fucking murder warnings if i even dared to introduce hard coded texts along with code. and here, an html page is basically text + attributes? fucking kill me.2 -
Books.
Do you guys know a good book for professional PHP 7 programming, especially OOP, concepts, design patterns, abstraction, algorithms, security and data structures?
Please not that beginner stuff, I want to dive deeper into PHP 7 😁
Maybe in German or English 😋3 -
feeling stuck :/ for the last 3 days I'm struggling with the same problem -- how to refactor that core thingie into a sufficiently reusable abstraction...6
-
Repeat yourself, but don’t repeat other people’s hard work. Repeat yourself: duplicate to find the right abstraction first, then deduplicate to implement it. -- tef
-
I work daily on a project in which, rather than buy in a decent message bus, a bunch of half interested, unqualified developers were tasked with hammering together an in-house solution. This monstrosity has around six layers of abstraction, separate objects per project and dynamically loaded converters between the components. It's largely not unit testable, certainly not integration testable and has already wasted more money in developer time and Bugfixes than a half decent external solution would have cost.
Every time I have to change an object in one part, start the associated web/win service and do an "update service references" I die a little inside.
There are so many better ways but we'll never be able to change because "there's no time for that"
And all for some up front savings -
I guess the moment I wanted to become a dev was when I was playing Skyrim and just got curious about what the underlying mechanics of the game looked like (and obviously how they worked). That led to me embracing math (CS is derived from math and they both exercise logic flow and abstraction), and I realized how good it felt solving problems. I get the same euphoric feeling from solving problems in mathematics as I do when I solve problems through code. I can say that I will be happy and have meaning developing software for the rest of my life, but I wouldn't lie and say that'll be my only focus. Along the way I'll definitely pursue other interests, but from my standing and mindset now I'll definitely be
developing things as more than just a hobby in the near future. -
My thoughts on how progression goes from top to bottom:
I'm going to use the terms all wrong because I don't know correct terminology but this is just how I make sense of a good workflow in programming.
From top to bottom:
Hard coding
Variablizing (is this a word? I use it to myself)
Functionizing
Abstracting the function
Adding an interface to the abstracted function (another layer of abstraction saves so much effort later)
Testing each step if possible.
Then when I feel a bit of code is good, giving it some more time and more testing then finding bugs I didn't see before and improving things.
If I get tripped up and spend too much time on some issue, I'll just let it sit for a little bit and take a walk or think of something else. The problem is still being worked on subconsciously and when I return after a rest usually is more apparent.
Testing, testing, testing and more testing!1 -
Doing the Full Stack Nanodegree from Udacity
Using Google's oAuth Sign in in my Flask App, I realized that no matter what browser I use, I was unable to logout, Google always threw an error my way. I figured something must be wrong with my code..
Searched on Google, couldn't find anything relevant, gave up on first 4 results(not pages, yeah I'm that lazy!)
Spent 3 hours Debugging at different points, removing all the abstraction I've put in using various libraries (Bad move)
Finally it dawned on to me to check Udacity forum as well. It's a frickin cache/cookie thing. Tried the app in an incognito window, worked like a charm. Reverted code back with all the libraries, worked like a charm again!
FUCK YOU GOOGLE! In your attempts to track users, you're even making our work difficult!
(in hindsight, I should probably be better at asking/looking for help)1 -
I decided to rewrite the cross-window comms lib from the ground up. After all it isn't too big (some 500 lines for the first level, 300 for a little abstraction) and the original is more of an artwork than good code. It somehow works but there are as many explanations as to why as viewers and nobody is allowed to touch it because it would probably break.1
-
The devs delved too greedily and too deep. You know what they awoke in the darkness of abstraction hell... shadow and flame1
-
Rubber ducking your ass in a way, I figure things out as I rant and have to explain my reasoning or lack thereof every other sentence.
So lettuce harvest some more: I did not finish the linker as I initially planned, because I found a dumber way to solve the problem. I'm storing programs as bytecode chunks broken up into segment trees, and this is how we get namespaces, as each segment and value is labeled -- you can very well think of it as a file structure.
Each file proper, that is, every path you pass to the compiler, has it's own segment tree that results from breaking down the code within. We call this a clan, because it's a family of data, structures and procedures. It's a bit stupid not to call it "class", but that would imply each file can have only one class, which is generally good style but still technically not the case, hence the deliberate use of another word.
Anyway, because every clan is already represented as a tree, we can easily have two or more coexist by just parenting them as-is to a common root, enabling the fetching of symbols from one clan to another. We then perform a canonical walk of the unified tree, push instructions to an assembly queue, and flatten the segmented memory into a single pool onto which we write the assembler's output.
I didn't think this would work, but it does. So how?
The assembly queue uses a highly sophisticated crackhead abstraction of the CVYC clan, or said plainly, clairvoyant code of the "fucked if I thought this would be simple" family. Fundamentally, every element in the queue is -- recursively -- either a fixed value or a function pointer plus arguments. So every instruction takes the form (ins (arg[0],arg[N])) where the instruction and the arguments may themselves be either fixed or indirect fetches that must be solved but in the ~ F U T U R E ~
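De-crackheaded into a Python toy, a queue entry is either a fixed value or a (thunk, args) pair that gets solved at assembly time:
```
symbols = {}                            # filled in as the tree walk proceeds

def fetch(name):                        # indirect: solved in the ~FUTURE~
    return symbols[name]

queue = [("jmp", (fetch, ("main",))),   # target unknown at queue time
         ("nop", 0)]

def assemble(queue):
    out = []
    for ins, arg in queue:
        while callable(arg[0] if isinstance(arg, tuple) else None):
            fn, args = arg
            arg = fn(*args)             # clairvoyance happens here
        out.append((ins, arg))
    return out

symbols["main"] = 0x40                  # the future arrives
print(assemble(queue))                  # [('jmp', 64), ('nop', 0)]
```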
Thusly, the assembler must be made aware of the fact that it's wearing sunglasses indoors and high on cocaine, so that these pointers -- and the accompanying arguments -- can be solved. However, your hemorrhoids are great, and sitting may be painful for long, hard times to come, because to even try and do this kind of John Connor solving pinky promises that loop on themselves is slowly reducing my sanity.
But minor time travel paradoxes aside, this allows for all existing symbols to be fetched at the time of assembly no matter where exactly in memory they reside; even if the namespace is mutated, and so the symbol duplicated, we can still modify the original symbol at the time of duplication to re-route fetchers to its new location. And so the madness begins.
Effectively, our code can see the future, and it is not pleased with your test results. But enough about you being a disappointment to an equally misconstructed institution -- we are vermin of science, now stand still while I smack you with this Bible.
But seriously now, what I'm trying to say is that linking is not required as a separate step as a result of all this unintelligible fuckery; all the information required to access a file is the segment tree itself, so linking is appending trees to a new root, and a tree written to disk is essentially a linkable object file.
Mission accomplished... ? Perhaps.
This very much closes the chapter on *virtual* programs, that is, anything running on the VM. We're still lacking translation to native code, and that's an entirely different topic. Luckily, the language is pretty fucking close to assembler, so the translation may actually not be all that complicated.
But that is a story for another day, kids.
And now, a word from our sponsor:
<ad> Whoa, hold on there, crystal ball. It's clear to any tzaddiq that only prophets can prophecise, but if you are but a lowly goblinoid emperor of rectal pleasure, the simple truths can become very hard to grasp. How can one manage non-intertwining affairs in their professional and private lives while ALSO compulsively juggling nuts?
Enter: Testament, the gapp that will take your gonad-swallowing virtue to the next level. Ever felt like sucking on a hairy ballsack during office hours? We got you covered. With our state of the art cognitive implants, tracking devices and macumbeiras, you will be able to RIP your way into ultimate scrotolingual pleasure in no time!
Utilizing a highly elaborated process that combines illegal substances with the most forbidden schools of blood magic, we are able to [EXTREMELY CENSORED HERETICAL CONTENT] inside of your MATER with pinpoint accuracy! You shall be reformed in a parallel plane of existence, void of all that was your very being, just to suck on nads!
Just insert the ritual blade into your own testicles and let the spectral dance begin. Try Testament TODAY and use my promo code FIRSTBORNSFIRSTNUT for 20% OFF in your purchase of eternal damnation. Big ups to Testament for sponsoring DEEZ rant.3 -
So they develop this app. That uses our front end component library. That queries a GraphQL layer developed as NPM package. That uses a data service abstraction NPM package. That uses another NPM package mapper library. That queries an old REST API returning XML.
It takes days to make a newly added XML node in the bottom-most layer available in the app, requiring changes to 4 repositories and 3 NPM releases.
Refactoring is dead, because 1 change will affect all layers. And the worst part is: there's only 1 app using these packages, so no case for re-use. Overzealous separation of concerns I guess?2 -
Fucking jQuery in Polymer 0.5.
When polymer 0.5 was released, things seemed incredibly easy at first, but when you need to do some complex things, the abstraction layer provided by web components is not of much help. Babel wasn't there too, so I ended up using scope hacks to access event listeners (var self = this). Worse, I had to use jQuery because many things were downright tedious or fucked up back then, including myself.
Now, React is here; No jQuery, no hacks, no web component polyfills, no unsolvable perf bugs, no scope hacks, no 10sec loading time, no regrets.1 -
I love it when you take such a long time abstracting and your boss complains that you're not doing anything but once the flexible and beautiful abstraction is completed you finish the rest of the job in a mere couple of hours implementing the abstraction and your boss goes like "whoa how did you do that so fast" (more like "you finally finished fooling around!") and your mind is relaxed as you implement every option of that perfect abstraction2
-
Let's do a story mapping session! Ok cool. PO asks the team: so guys what do you think? *silence*... *more silence*.... PO: come on guys, please respond. *silence*.... Then someone finally responds.
I'm starting to hate this big time. It's almost always like that, no matter the type of session (story mapping, refinement). And there's someone in the team that thinks he always knows best, so if ever someone speaks up, it will always be challenged and lead to useless discussions. He always wants the perfect solution. A good solution is good enough, it doesn't have to be perfect. PO is happy with a good solution (good = maintainable, scoring at least x on our code quality tooling), so why the fuck would you want to go for the 'perfect' solution, which may score just slightly higher in regard to quality, cost much more to develop and people have a hard time maintaining it due to the high level of abstraction? He's always refactoring stuff because it's not future proof. Well, why completely reimplement parts that have been working properly for 2 years and have a very very small chance of needing a change, which then still only needs to be done in just 1 place?
And you know what? All these fancy structures, patterns etc are in there, but will their flexibility ever really be used? In my 20 years of experience I haven't seen such flexibility being really used. Some exceptions of course.
Once it's built, it will keep running, yes, changes will need to be made, but in most cases they never touch all these expensive fancy structured components. Just because most changes are in content or small changes in functionality.1 -
1.) good exercise for brain
2.) like the abstraction
3.) like building things from mental constructs -
Why do you lil' shits keep making LAYERS and LAYERS of unnecessary abstraction and then call it goddamn progress???
Dude what the fuck is this UEFI shit?!
Why the hell do I NEED to import a frigging library and read tons of boring and overly complicated documentation just so I can paint a pixel on the screen now uh??
Alright alright yeah so the BIOS is a little basic but daaaamit son if you want something a bit more complicated you make it yourself or install an OS that provides it! Like we've been doing it for years!!!
Dude, you don't get to know what a file system is until I tell you!
The PC be like:
"You wanna dereference the 0x0 pointer? There you go: it's 0xE9DF41, anything else?
You wanna write to the screen? Ok I have a perfectly convinient interrupt setup for that.
Wanna paint a pixel yellow? Ok, just call this other interruption. Theere we go.
And it only took four bytes and a nanosecond to do it."
That shit works, and if you want something more complex, but not too much, that still runs efficiently install DOS.
Don't mess around with the hardware pleeease.
We can still understand what's going on down there. Once UEFI steps in, it'll be like sealing a door forever. Long live BIOS damn it all!1 -
Do you guys think someday programming languages will have reached their absolute limit? Where any more abstraction or additions would be more of a detriment than a plus?2
-
I shouldn't be modifying 7 fucking 200+ line files in your UI to make a single get request. Can we please use some of our common sense to see that doing shit like this completely defeats the principles of our jobs - abstraction and simplification?!
-
My new task: Improve performance of an archiving algorithm.
"We don't know what the cause of the performance issues is." I can tell you: Because it's overengineered bullshit! This is spaghetti inside spaghetti on top of spaghetti!
And it doesn't help that they don't want to know about different archiving algorithms, because I offered multiple alternatives, one 80% smaller than the other, but the other is 80% faster.
At least I'm instructed to throw it all away and rewrite it and not add even more to the garbage pile. I'm very happy about that. -
!rant
Think you've mastered abstraction/separation of what should be Android classes versus what should be Android resources?
Decompile the Gmail app and check all the xml tags you might have missed.
Remember: UI and most application properties are loaded from the XML first (during inflation), then in activities or fragments last (especially if the views are only instantiated).
I just realized that I need to learn a lot of tags today. -
I want to make my own Android app. I have completed Java basics. In Java basics I have completed encapsulation, abstraction, inheritance, polymorphism etc. I have basic knowledge of all of these. I cannot switch to Android Studio yet because I don't have a good laptop, but after one month I will buy one. So in this one month what should I learn that will help me in my Android app development?3
-
Been thinking about game design. Making things testable and modular. I think I have a lot to learn in this area. One idea popped up in my head. How can I design the game so that it can be networked later? One idea was an interface in the logic that gives authority to another set of code (module?). Basically include the abstraction as if networking was there. So if/when I decide the next version of the game or existing version should have online play. I think learning to create client/server would be instructional. Maybe networking could be a dlc or something.5
-
interesting conversation with the department head at my college (I'm a graduating senior): for incoming freshmen with no programming experience, do you think we should be teaching them to use a nice IDE, or start them with just a decent text editor and command line?
I was trying to convince the department head that we shouldn't show them an IDE until they've got the basics of command line down, and understand that a lot of what an IDE does is just abstraction of the command line.8 -
Has anybody experience with Scrum in small web development agencies? Especially estimating stories with story points instead of hours/days?
We have a new junior project manager, without any practical experience working agile, who wants to establish scrum because what he read about it sounded so good... I already worked agile with kanban before and I loved it, but I only have little experience with scrum.
I think scrum, or agile in general, won't work with the clients we have. Most of the time, our clients have a fixed deadline, a fixed budget (either money or time) and they know their requirements, so there is not much room for being agile.
Regarding story points, it's just adding an unnecessary layer of abstraction, because the customer wants to know how long a specific feature takes. Sure, story points are just another, more dynamic unit for time, but then why not estimate in a static time unit in the first place? Another fear I have is that some devs may be more ignorant regarding deadlines and expectations on the customer's side. "yeah I'm working for 10 days on this story, but it's 8 points!" instead of informing the project manager "Currently I spend 2 days on this feature, we estimated 3 days, but it seems I need 3 days more".
Maybe I shouldn't be worried, but it would be great if you could share your experience and learnings. Thanks in advance!14 -
When you join a React JS SPA project months after the start to discover that 2 CSS grid systems with totally different breakpoints have been in use in parallel for 5 months, hidden by layer upon layer of abstraction, and that no dev bothered to fix it, let alone notice.
#FML -
so am switching jobs as an Android dev from a company which made android libs (using almost 0 external dependencies and mostly java) to a company which makes android apps( and is probably using either rx/guava/ribs/hilt etc or the more fancy hilt/compose/coroutines/clean-arc etc. its either one of them depending upon the maturity of product)
B2C folks use tons of libraries in favor of delivering fast but learning about those libraries while taking new tasks and fixing bugs CAUSED by those libraries ( or their inappropriate usage) is a big PAIN IN THE FUCKING ASS.
I remember i had once become such a weird dev coz of my prev company (before the current libraries one, which was also a B2C).
on weekends i would come up with a nice app idea, start a new android studio project, and before writing a single line of useful code, i would add a bunch of libraries, gradle scripts and extensions .
that ocd will only settle once all the steps are done and i can see a working app, after which i would write the actual code for feature implementation.
granted that these libs are good for creating robust scalable code, but most of the time those infinite layers of separation, inheritance and abstraction are not really needed for a simple, working product.
:/
i have also started reading about rxjava, and although i am repulsed by this library due to its complicated black box like structure, i find its vast number of operators and built in solutions very cool.
at the end of the day, all i want is to write code that is good enough for monkeys, get it shipped without any objections and go back home.
and when you work on a codebase that has these complicated libs, you bet your ass that there will be those leetcode bros and library lover señor devs waiting to delay the "go back home" part 😪2 -
Every work call you have is a workaround. On call, if you explain something related to code or toolchain, it’s your failure at either documentation or choosing abstraction level. If you explain processes and task priorities, it’s your failure at management. If you discuss deadlines, it’s your failure at estimation.
If you’re an IT manager and do your job right, you should barely have calls.3 -
I'm in a big fat fucking stinking rut, as in progress on this project has absolutely stagnated.
Gonna rubber face your duck now **UNZIPS** except I don't have zippers, as joggers are the one true way; fake Adidas til I fucking drop.
Brain damage aside, I understand both how I've laid out the data and what I'm supposed to do with it. We have a virtual machine, an array of instructions and arguments for a given process within it, and we need to walk this array and map values to registers.
We also need to spill values inside registers to stack, IF they are required at a further point within that block. This also isn't terribly complex. We simply look forward in the array and see if the value is an argument to any instruction that *needs* this value to be loaded (ie, within a register).
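The lookahead itself is dead simple; a Python sketch of the idea (toy block shape, made up):
```
def needs_spill(block, index, name):
    # block: list of (instruction, args) tuples; is `name` read again
    # anywhere after position `index`? if so, it must survive a clobber
    return any(name in args for _, args in block[index + 1:])

block = [("load", ("x",)), ("add", ("x", "y")), ("mul", ("y", "z"))]
print(needs_spill(block, 0, "x"))   # True  -> spill before clobbering
print(needs_spill(block, 1, "x"))   # False -> register can be reused
```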
So this implies multiple iterations; we need to better understand how one particular value is used throughout an F before we can make a final decision on how many registers and stack space are actually needed for the whole block.
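(In calculator-speak, the naked idea looks something like this; a sketch where the instruction/value shapes and the eviction heuristic are invented, and reloads after a spill are left out:)
```
type Value = string;
interface Instr { op: string; args: Value[]; out?: Value }

// forward scan #1: the last index at which each value is used as an argument
function lastUses(block: Instr[]): Map<Value, number> {
  const last = new Map<Value, number>();
  block.forEach((ins, i) => ins.args.forEach(v => last.set(v, i)));
  return last;
}

// walk #2: free registers holding dead values, spill when we run out
function planSpills(block: Instr[], nRegs: number): Set<Value> {
  const last = lastUses(block);
  const inRegs = new Set<Value>();
  const spilled = new Set<Value>();

  block.forEach((ins, i) => {
    // a value whose last use is behind us no longer needs its register
    for (const v of [...inRegs]) if ((last.get(v) ?? -1) <= i) inRegs.delete(v);

    if (ins.out === undefined) return;
    if (inRegs.size >= nRegs) {
      // crude Belady: evict the live value we won't need for the longest time
      const victim = [...inRegs].reduce((a, b) =>
        (last.get(a) ?? 0) >= (last.get(b) ?? 0) ? a : b);
      inRegs.delete(victim);
      spilled.add(victim);
    }
    inRegs.add(ins.out);
  });
  return spilled;
}
```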
Here's where it gets tricky. If there's a call, we need to be certain that the symbol being invoked has already been fully processed. Besides the obvious fact that recursion fucks me up, there's another matter: say a private method gets invoked by another private method. We can take advantage of this, by which I mean, sacrilege incoming so put on this toga.
Looking at the output for C compilers, it would seem this is not done in practice, I would assume because it's a pain in the ass. But when you have the guarantee that F will only be called internally, as that's what "private" means, there's two ways it can go:
0. It's well below the 13-20 cycle threshold, so you inline the fucker. No suprises there.
1. It's a more involved affair, and invoked in more than one place, so you don't inline it. Code size matters.
Recursion and [1] are the big deal things holding me back. Not because it's too hard, like I said this is kindergarten level abstraction. I'm just slow and fanatical, which is how I prefer to spell "constant obsessive paranoid delusions". I can see the potential optimization I can pull here, so I'm stuck trying to figure it out.
Idea would be, handling the register allocation and stack spill for an internal-internal (or deep internal; what we like to call a "guts" method) in synchronization with the *calling* processes. This is, fundamentally, violating all conventions -- but so under the hood no one will notice.
Let me give you an example. If we were to pass some value to a function, expecting to mutate it and get a different value back, in a lot of cases it'd be stupid to make an implicit copy by using two registers, one for input and another for the output. Dude, it's one cycle. Multiply it by a million, say sixty times per second, for every time you __needlessly__ make a copy of a value that we've already stated is mutable.
Clearly unacceptable. This is, in the strictest sense, everywhere in every single codebase. Premature micro optimization is the root of all goodness, God is great and praiseworthy. So how do we go about it?
Answer is I know and I don't know. By which I mean to say, this very thing I've done by hand. Assembly is fun. Now the issue is teaching a calculator how to do it. Not so fun.
There is a dependency chain between processes, as I believe I've kind of alluded to. I'm trying to make decisions on the side of the caller depending on the details of the callee, which is why recursion is rawdogging my soul. This is the same situation, it's inverting the direction of one or more links in the dependency chain, which makes no fucking sense.
And yet it does.
Brain, explain yourself.
How do *you* handle this without crashing?
Brain?
<<ME STEWPED; BEEP-BOOP>>
Alright then, that was a useless attempt at fuckery. Let's have a nap then, maybe it'll come to me in the morning. That's what I've been saying to myself for almost a month now.
Perhaps it is a hardcoded fuk.
I was tasked with reviving this mobile app purchased off the shelf. Initially, I was impressed with what I was seeing while perusing the codebase. I'm used to editing laravel projects written by handpicked amateurs, so this felt like a breath of fresh air. Coupled with the fact that I'd recently enquired on this very platform whether anyone has chanced upon an impressive codebase. All is going well, until
I start finding the many layers of abstraction and indirection cryptic and obfuscatory; and that is coming from an idealist like me who advocates for "clean" patterns such as event emission. I wonder whether it would have helped if the emitted events were typed for easy listener tracking, instead of a black hole like vm.notifyListeners() (DOESN'T EVEN HAVE AN EVENT NAME!)
With time, I become disgusted by the tons of custom elements with so many parents
My take on production-level use of the view model pattern: amazing in theory
One of the architectural decisions made on this project had me foaming at the mouth, pulling my hair and cursing out the author's generations, past, present and future: can you believe these guys are APPENDING IMAGE DOMAINS TO THE RESOURCE? I.e. the domain names are tightly coupled to the images and dictated by the api, instead of the client.
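(For contrast, what I'd expect: the api ships a relative path and the client owns the domain. A sketch; the config name and URLs are invented:)
```
// api payload: { avatar: "/img/u/42.png" } -- no domain baked in
const IMAGE_BASE = process.env.IMAGE_CDN_BASE ?? "https://cdn.example.com";

function imageUrl(relativePath: string): string {
  return new URL(relativePath, IMAGE_BASE).toString();
}

console.log(imageUrl("/img/u/42.png")); // https://cdn.example.com/img/u/42.png
```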
If this isn't bad enough, the field names of the returned entities/models don't exist in the database, of course, because the stupid laravel framework abets this sort of madness by combining eloquent "scopes, attributes, and appends". A trifecta of horrors.
I eventually clawed my way through the horrors, but not without losing my admiration for the team behind it. The app has returned to the shelves, because my company lost patience with my resuscitating it. They have the regular api authentication in place, but that's not good enough: they just had to integrate firebase as well, just because. Meanwhile, this isn't documented anywhere; I stumbled into it during my scuffle with the app setup, gradle and all. Eventually got banned by firebase for "sending unusual requests". My company's last straw -
Newt OS is a fucking abomination
1. stack apis are fucked up
2. Abstraction architecture is horrible
3. Have to configure 50 different yml files with 1000 different vars to make it work for a target it already says it supports
4. Was up for the last 32 hours trying to make a simple application work. Such poorly made documentation!!!!
Looks like some web developer made the whole thing
Ewww..... horrible
Also, I don’t think they follow the spec at all!!
I don't get it. Please give me one good reason to use mongoose with MongoDB.
Once upon a time it might have made sense to use a schema layer for the db. Today the native driver supports schemas and can check them on insert. Never mind that one should validate the data before it hits the db anyway. I listened to a 1hr podcast last week where one of the maintainers tried to give reasons why it might be a good idea to use mongoose, and he failed miserably.
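(For the record, roughly what that looks like with the bare Node driver; a sketch, with db/collection/field names made up, where the server itself rejects bad documents:)
```
import { MongoClient } from "mongodb";

async function main() {
  const client = await MongoClient.connect("mongodb://localhost:27017");
  const db = client.db("app");

  // let the server enforce the schema on every insert/update
  await db.createCollection("users", {
    validator: {
      $jsonSchema: {
        bsonType: "object",
        required: ["email", "createdAt"],
        properties: {
          email: { bsonType: "string", pattern: "^.+@.+$" },
          createdAt: { bsonType: "date" },
        },
      },
    },
  });

  // rejected by the server itself: email is missing
  await db
    .collection("users")
    .insertOne({ createdAt: new Date() })
    .catch(err => console.error(err.message)); // "Document failed validation"

  await client.close();
}

main();
```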
It introduces dependencies that are useless, it doesn't really abstract anything useful from the native driver, its TS support is shit, and I don't like the API.
Every time I see someone use it, they either fail with it or can't explain at all why they use it. It's so redundant it makes me angry. We have enough abstraction already. We really don't need more code that doesn't provide value. Please just use mongo the way the people at MongoDB intended it to be used. -
I'm watching the rails intro by the creator and, being honest, I'm not sure if all the abstraction is a really good thing or a really bad thing.
-
Do APIs, libraries, or any kind of abstraction make developers less aware of what's going on inside? I think it makes freshers ignorant of the inner workings.
I think that everyone who's trying to learn should start from the core stuff only. But that also leads to them drifting away from coding coz it's too difficult.
What's your view? -
I would have wanted to bring up SICP again, with the great big warning about the evil assignment operator and state and all the troubles that ensue (just think: concurrency).
But in a way, nothing has really come up from this, or from my attempts to dig deeper into "everything is a file/object" (Unix, Smalltalk), nor from formal languages or the Curry-Howard correspondence. - Maybe there's just nothing, no firm bottom ground to discover. Like the physicists going for their world formula: instead of a grand, beautiful symmetry that explains everything, we face a shattered world of (incompatible) theories that grows ever more complex and chaotic through the very theories we apply to it. There may not be a Platonic ideal world of ideas, but rather partial constructs explaining some particular perceptions.
Similarly, the one perfect programming language to rule them all, the perfect abstraction or pattern, is probably just another prepubertal fantasy to be sunk.
So maybe instead of seeking the perfect epiphany, we should go for something quite different: the nagging, brooding uneasiness that something is wrong there, that there's something to be fixed... even that negative feeling would propel us to search further, not to settle in whatever is touted as the real thing.
Such irritations I found in Pieter Hintjens' writings, for example when he actively engaged in conspiracy theories. And I'm still not sure if he just went off the cliff, or if he's even right in alluding that these theories are an act of sanity, a self-defence against the hidden evil powers. I just don't know. Anything. -
Had a lecturer who taught a module on OOP where the entire module was spent teaching how to code in Java, while the concept of OOP was just skimmed through at the end. Okay, fine, it's just supposed to introduce OOP; maybe the continuation will go into detail.
The next semester we had the continuation module, titled OOP with Java. The entire module was about JavaFX. So two semesters later, everyone in the class barely understood things such as polymorphism or abstraction. -
Any advice/suggestions to intensively brush up on modern C++ and multithreading for an interview that will likely be technical and cover bases like algorithms, data structures, etc.?
I haven't done C++ in a while; just a few courses in college. I did parallel programming and GPGPU on the side, but nothing on a professional level.
I've mostly been doing front-end web dev and C# since I got out of school, so I've been more on the design/higher level of abstraction side of dev, and if I'm asked about pointers, memory allocation, etc. I would probably draw a blank, but I'm motivated to no-life it hard for the next week to catch up again.
Is it worth it to learn low-code platforms such as OutSystems? Are they flexible enough to create custom code or custom UI/UX? Is that much abstraction worth it for large systems?
-
Portrait of Me, Writing Documentation -- a short French film:
The processes applied to any section of memory utilized for a given purpose should be strictly limited to those declared by the associated type that encapsulates the purpose in question until release or mutation.
That is to say, improperly encoding the intended usage of such a block by utilizing an identical type or alias thereof for a multitude of incompatible situations, giving rise to guesswork, constitutes the prostitution of an abstraction.
Such heinous acts of symbolical pimping have received strong condemnation from multiple digital rights organizations, as well as our own, prestigious office. Let it be made Crystal, Alizé and Hennessy clear, that we will not stand for this kind of degenerate practice, and that any heretical sects and cabals built around worship of the strange creatures that arise every eleventh night from the depths of the Black Mausoleum will be prosecuted with the full force of the law.
As a young, courageous man once said at the peak of his career: "it is only through the self-inflicted, hyperbolic discharge of smouldered, comminute perennial anadenanthera colubrina spermatic fluid that the canonical transfiguration of our collective rectosigmoid junction can be brought to fruition". He was immediately violated with might and ire far beyond our wildest, most profligately depraved fantasies, yet his message lives on.
I leave you now to be ritually and figuratively blown by a posssessed mortician that is to become concubine to our dark master; the long journey to the old graveyard will be perilous, and my destination most assuredly fatal, as I depart to give my firstborn to our Lord Berzchjanzad -- a blood sacrifice meant to appease him from peeling off my skin and refashioning it into a bloodied scarf to be worn around his thumping, grandemonic cock.
And in this moment, as I stare blankly at this teleprompter, the president wishes to reassure you of his sacred vows of stalwart and promethean gayhood, and may __these__ nuts bounce on chins forevermore. Here's to *not* bleeding to death in retribution for this unending litany of sins...
Yet all predictions come to pass.
««««««««««« finẽ »»»»»»»»»»» -
Another day, another struggle with time zones.
How many fucking helper methods do I need to create for dates and time zones? How many components, pipes and services do I need to wrap just so two datetimes line up? Apparently another one today. At this point I'm ready to accept flat earth theory if it means no more time zones. I'm fucking sold on it if so.
It's not even the time zone that's the issue. It's business needing it formatted, but also offset properly, based on your browser locale, but with points that cross into DST-observing time zones of a different locale simultaneously. Sometimes those times are the same, sometimes they're different, sometimes they're different but only in winter. And despite a plethora of libraries to help with these calculations, nothing ever seems to just work out of the box. So here's to another layer of abstraction, because time zones (and DST) are bullshit.
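(The one thing that has held up for me is letting Intl carry the DST logic. A sketch: zones picked arbitrarily, the date chosen to straddle a US DST cutover, and the expected output approximate:)
```
// one instant, two zones; the date straddles the US DST switch (Nov 3, 2024)
const instant = new Date("2024-11-03T06:30:00Z");

function fmt(zone: string): string {
  return new Intl.DateTimeFormat("en-US", {
    timeZone: zone,
    dateStyle: "medium",
    timeStyle: "long", // includes the zone abbreviation / offset
  }).format(instant);
}

console.log(fmt("America/New_York")); // roughly: Nov 3, 2024, 1:30:00 AM EST
console.log(fmt("Europe/Berlin"));    // roughly: Nov 3, 2024, 7:30:00 AM GMT+1
```
-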
i always get sucked into this "cute code" hell whenever i am working with a b2c codebase, and especially with kotlin code.
here's a scenario:
task: build debounce logic for an input view where each user input currently triggers an api call.
my steps
1. read what debouncing is.
2. see if any code is available on the internet
=> found a code piece on the internet with some level of abstraction (basically a simple final class that implements the input event callback and encapsulates the debounce logic)
3) copy it, run it, it works (roughly the sketch below)
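(for reference, the kind of thing i mean, sketched in typescript instead of kotlin since the logic reads the same; a sketch with invented names:)
```
// swallow the per-keystroke events; only fire after the input goes quiet
class Debouncer {
  private timer?: ReturnType<typeof setTimeout>;

  constructor(
    private delayMs: number,
    private action: (text: string) => void,
  ) {}

  onInputChanged(text: string): void {
    if (this.timer !== undefined) clearTimeout(this.timer);
    this.timer = setTimeout(() => this.action(text), this.delayMs);
  }
}

// usage: one api call ~300ms after the user stops typing
const search = new Debouncer(300, q => console.log("GET /search?q=" + q));
search.onInputChanged("a");
search.onInputChanged("ab");
search.onInputChanged("abc"); // only this last value triggers the api call
```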
------
for any sane coder, these steps are hardly 10-30 mins and they can move on with life. but it's yours truly who made this task into 6 hours of research. my curiosity led me to stupid places:
1) why is this class final? what if someone else wants to use it, but with different behaviour? let's try an open (non-final) class.
2) why even use a class? it implements an interface; let's try to wrap the logic in the interface itself (kotlin supports default implementations in interfaces)
3) umm, the interface works, but it looks ugly with all its global overridden variables. what about making it an extension?
4) yeah, the extension approach is also not very good; let's go back to the open class.
5) but the extension is super nice to look at! let's keep the extension and the open class too
6) can we optimise the implementation? why does it use an additional handler? what if we provided everything in the constructor? how about a builder pattern?
FUCK MY BRAIN! there are so many fucking options that i forgot i had spent 4 hours on this small thing
the simplest approach would have been to just shove all the listeners and everything into the activity and forget about it :/
senior devs on this platform, how do you stop yourselves from adding every concept that you know into the smallest possible task?
There is an idea of Me Being Online; some kind of abstraction. But there is no real me online, only an entity, something illusory, and though I can hide my naps and you can see my green status and maybe you can even sense our availability is probably comparable: I simply am not there.
-
Time to complain about "Just do X". Often nerds ask broad, open-ended questions, and I see people giving lazy, low-effort solutions. As a lazy, open guy, I sympathize. I just wish people gave better "just" answers. How about a few levels of abstraction above "doing" something next time! Tell someone to "just read this" or "just learn about this" instead! Please pass this on to your neighbor.
-
Typically every computer science major begins with C, C#, C++, Java or Python, creating so much abstraction from the hardware that your mind just fills with questions that remain unanswered. Whenever I program something, I always think about how the underlying stuff works. They never explain how and where software meets the hardware. Why are they keeping students away from the hardware? I think a cs graduate who doesn't know the underpinnings of a computer should not be considered a cs graduate: as opposed to a software engineer, a computer science major should relate to everything that is a computer, and that includes the theoretical stuff and a little know-how of computer hardware. Instead of teaching this stuff, and assembly as a language, in the first semester they teach you java or C++. I couldn't speculate on why this is so.
-
Do you hand craft each IP packet in binary before you send it out through the network? No. Who cares. We're computer scientists and the entire field is built on layers upon layers of abstraction.
-
Hey guys, again I have a question for you, please help me 🙏
1) How do I use an abstract method in a Python program?
2) How is the super keyword used, and how do I modify the code for inheritance in the kilometres program?
Sooo. This is starting to get a bit annoying:
I'm working on a large refactoring with a pretty good inheritance / generic system. And some code generators.
Right now I'm writing a script which generates code files, which will generate code-gen templates, which will generate the final files.
It's funny, and it's a one-shot generation, but still. So much abstraction.
(End result is good tho. Everything in small files of less than 15 lines of code. Everything structured.)
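(A toy version of the pipeline, a sketch with invented names: stage 1 emits a template which, when run, emits the final file:)
```
import { writeFileSync } from "fs";

const models = ["User", "Invoice"];

// stage 1: emit a code-gen template per model
for (const m of models) {
  const template = [
    `// generated template for ${m}; running it emits the final file`,
    `import { writeFileSync } from "fs";`,
    `writeFileSync("${m}.ts", "export interface ${m} { id: string }\\n");`,
  ].join("\n");
  writeFileSync(`gen-${m}.ts`, template + "\n");
}
// stage 2: run each gen-*.ts to produce User.ts / Invoice.ts
```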