Search - "abstract code"
-
As a developer, sometimes you hammer away on some useless solo side project for a few weeks. Maybe a small game, a web interface for your home-built storage server, or an app to turn your living room lights on and off.
I often see these posts and graphs here about motivation, about a desire to conceive perfection. You want to create a self-hosted Spotify clone "but better", or you set out to make the best todo app for iOS ever written.
These rants and memes often highlight how you start with this incredible drive, how your code is perfectly clean when you begin. Then it all oscillates between states of panic and surprise, sweat, tears and euphoria, and ends in a disillusioned stare at the tangled mess you created, left to gather dust forever in some private repository.
Writing a physics engine from scratch was harder than you expected. You needed a lot of ugly code to get your admin panel working in Safari. Some other shiny idea came along, and you decided to bite, even though you feel a burning guilt about the ever growing pile of unfinished failures.
All I want to say is:
No time was lost.
This is how senior developers are born. You strengthen your brain, the calluses on your mind provide you with perseverance to solve problems. Even if (no, *especially* if) you gave up on your project.
Eventually, giving up is good, it's a sign of wisdom and flexibility to focus on the broader domain again.
One of the things I love about failures is how varied they tend to be, how they force you to start seeing overarching patterns.
You don't notice the things you take back from your failures, they slip back sticking to you, undetected.
You get intuitions for strengths and weaknesses in patterns. Whenever you're matching two sparse ordered indexed lists, there's this corner of your brain lighting up on how to do it efficiently. You realize it's not the ORMs which suck, it's the fundamental object-relational impedance mismatch existing in all languages which causes problems, and you feel your fingers tingling whenever you encounter its effects in the future, ready to dive in ever so slightly deeper.
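As a concrete illustration of that first intuition (a rough sketch with hypothetical names, assuming "matching" means intersecting two sorted index lists): a two-pointer walk does it in O(n + m) instead of nested loops.

import java.util.ArrayList;
import java.util.List;

static List<Integer> matchSorted(List<Integer> a, List<Integer> b) {
    List<Integer> out = new ArrayList<>();
    int i = 0, j = 0;
    while (i < a.size() && j < b.size()) {
        int cmp = Integer.compare(a.get(i), b.get(j));
        if (cmp == 0) { out.add(a.get(i)); i++; j++; }   // present in both lists
        else if (cmp < 0) i++;                           // advance the smaller side
        else j++;
    }
    return out;
}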
You notice you can suddenly solve completely abstract data problems using the pathfinding logic from your failed game. You realize you can use vector calculations from your physics engine to compare similarities in psychological behavior. You never understood trigonometry in high school, but while building a deficient robotic Arduino abomination it suddenly started making sense.
You're building intuitions, continuously. These intuitions are grooves which become deeper each time you encounter fundamental patterns. The more variation in environments and topics you expose yourself to, the more permanent these associations become.
Failure is inconsequential, failure even deserves respect, failure builds intuition about patterns. Every single epiphany about similarity in patterns is an incredible victory.
Please, for the love of code...
Start and fail as many projects as you can.
-
Good Morning! It's time for practiseSafeHex's most incompetent co-worker!
Today's contestant is a very special one.
*sitcom audience: WHY?*
Glad you asked. You see, if you were to look at his LinkedIn profile, you would see a job title unlike any you've seen before.
*sitcom audience oooooooohhhhhh*
We're not talking software developer, engineer, tech lead, designer, CTO, CEO or anything like that. No, no, our new entrant "G" surpasses all of those with the title ..... "Software extraordinaire".
*sitcom audience laughs hysterically*
I KNOW! Wtf does that even mean? As a previous dev-ranter pointed out, does this mean he IS quality code? I'd say he's more like a trash can ... where his code belongs
*ba dum tsssss*
Ok ok, let's get on with the show, here are some reasons why "G" is on the show:
One of G's tasks was to build an analytics gathering library for iOS, similar to google analytics where you track pages and events (we couldn't use google's). G was SO good at this job he implemented 2 features we didn't even ask for:
- If the library was unable to load its config file (for any reason) it would throw an uncatchable system integrity error, crashing the app.
- If anything was passed into any of the functions that wasn't expected (null, empty array etc.) it would crash the app as it was "more efficient" to not do any sanity checks inside the library.
This caused a lot of issues as some of the data needed to come from the clients server. The day we launched the app, within the first 3 hours we had over 40k crash logs and a VERY angry client.
Now, what makes this story important is not the bugs themselves, come on how many times have we all done something stupid? No the issue here was G defended all of this as the right thing to do!
.. and no he wasn't stoned or drunk!
G claimed if he couldn't get the right settings / params he wouldn't be able to track the event and then our CEO wouldn't have our usage data. To which I replied:
"So your solution was to not give the client an app instead? ... which also doesn't give the CEO his data".
He got very angry and asked me "what would you do then?". I offered a solution, something like: why not have a default tag for "error" or "unknown" where, if there's an issue, we send up whatever we have, plus the file name, and store it somewhere else. I was told I was being ridiculous as it wasn't built to track anything like that and that would never work ... his solution? ... pull the library out of the app and forget it.
... once again giving everyone no data.
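For what it's worth, the fallback I had in mind is roughly the sketch below. It's purely illustrative (the real library was an iOS one; this is written in Java only for brevity, and all names, including send(), are invented), but the point is that bad input degrades to an "unknown"/"error" event instead of killing the host app.

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

class SafeTracker {
    void track(String event, Map<String, String> params) {
        try {
            String name = (event == null || event.isEmpty()) ? "unknown" : event;   // default tag instead of crashing
            Map<String, String> safe = (params == null) ? Collections.<String, String>emptyMap() : params;
            send(name, safe);
        } catch (Exception e) {
            Map<String, String> context = new HashMap<>();
            context.put("reason", String.valueOf(e.getMessage()));
            send("error", context);   // worst case: report the failure itself, with whatever context we have
        }
    }

    void send(String name, Map<String, String> payload) {
        // post to the analytics backend here; stubbed out in this sketch
        System.out.println("track " + name + " " + payload);
    }
}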
G later moved on to another cross-platform style project. The backend team were particularly unhappy as they got no spec of what needed to be done. All they knew was it was a single endpoint dealing with a very complex model. There were no Java classes, super classes, abstract classes or even interfaces, just this huge chunk of mocked data. So myself and the lead sat down with him, and asked where the interfaces for the backend were, or designs / architecture for them etc.
His response, to this day frightens me ... not makes me angry, not bewilders me ... scares the living shit out of me that people like this exist in the world and have successful careers.
G: "hhhmmm, I know how to build an interface, but i've never understood them ... Like lets say I have an interface, what now? how does that help me in any way? I can't physically use it, does it not just use up time building it for no reason?"
us: "... ... how are the backend team suppose to understand the model, its types, integrate it into the other systems?"
G: "Can I not just tell them and they can write it down?"
**
I'll just pause here for a moment, as you'll likely need to read that again out of sheer disbelief
**
I've never seen someone die inside the way the lead did. He started a syllable and his face just dropped, eyes glazed over and he instantly lost all the will to live. He replied:
" wel ............... it doesn't matter ... its not important ... I have to go, good luck with the project"
*killed the screen share and left the room*
Now, I know you are all dying in suspense to know what happened to that project. I can drop the shocking bombshell that it was in fact cancelled. Thankfully only ~350 man hours were spent on it
... yep, not a typo.
G's crowning achievement however will go down in history. VERY long story short, backend got deployed to the server and EVERYTHING broke. Lead investigated, found mistakes and config issues on every second line, load balancer wasn't even starting up. When asked had this been tested before it was deployed:
G: "Yeah I tested it on my machine, it worked fine"
lead: "... and on the server?"
G: "no, my machine will do the same thing"
lead: "do you have a load balancer and multiple VM's?"
G: "no, but Java is Java"
... and with that it's time to end today's episode. Will G be our most incompetent? ... maybe.
Tune in later for more practiseSafeHex's most incompetent co-worker!!!
-
Me: Oh I see we're using a non-standard architecture on this app. I like this bit but what is this doing? Never seen it before.
Him: Ah we use that to abstract the navigation layer.
Me: oh ok, interesting idea, but that means we need an extra file per screen + 1 per module. We also can't use this inbuilt control, which I really like, and we have to write a tonne of code to avoid that.
Him: Yeah we wanted to take a new approach to fix X, this is what we came up with. We're not 100% happy with it. Do you have any ideas?
**
Cue really long, multi-day architecture discussion. Lots of interesting points, neither side being precious or childish in any way. Was honestly fantastic.
**
Me: So after researching your last email a bit, I think I found a happy middle ground. If we turn X into a singleton, we can store the state it's generating inside itself. We can go back to using the in-built navigation control and have the data being fetched like Y. If you want to keep your dependency injection stuff, we can copy the Angular services approach and inject the singletons instead of all of these things. That means we can delete the entire layer Z.
Even with the app only having 25% of the screens, we could delete like 30+ files, and still have the architecture, at a high level, identical and textbook MVVM.
Him: singleton? no I don't like those, best off keeping it the way it is.
... are you fucking kidding me? You've reinvented probably 3 wheels, doubled the code in the app and forced us to take ownership of something the system handles ... but a singleton is a bad idea? ... based off no concrete evidence or facts, but a personal opinion.
... your face is a bad idea
-
This is basically my daily work. I have to write Java code in Excel files which are then converted into a DSL and then converted into Java code again. On top of that many wrappers were built which abstract all these things away..
We have about 30 such excel files which contain about 50000 business rules.
There is no version control for these tables and 5 different teams are working on the same tables in parallel.
The name of this framework is Drools or as I call it: HELL 😡
-
I am fed up working with unskilled software developers. Or to be more specific, working with people who have no idea of software architecture.
Most people I've worked with have simply no idea what they are doing in the broad picture, they can only follow patterns they see and implement their feature in the same way. They can't think about the abstract concepts which should be the foundation of the project.
They fail to write unit tests which are maintainable. They write one fucking test per method which is testing 50 things at the same time, making it often impossible to understand what is being tested.
They think putting stuff in private methods makes their class better and is some kind of separation of concerns.
They write classes and afterwards create interfaces for these classes named {Class}Interface, shoving all the methods into that interface. They think it's good design to do so.
They are unable to think about the reasons why things are done the way they are done and that you don't do stuff for the sake of doing stuff, but to achieve certain goals like interchangeability.
They don't understand how to separate business logic from the application code.
They have no sense for naming things beautifully. They don't see how naming things is a major part of good software architecture.
They get layer concepts wrong and then create godlike {EntityName}Service classes, which do everything related to a particular entity.
They fail to shape the boundaries within a software project, entangling stuff which should live in individual modules.
All I want is to work in a team with professionals.
-
If you need to learn/teach object orientation, these are my approaches (I hate that classic "car" example):
1) Keep in mind games like Warcraft, Starcraft, Civilization, Age of Empires (yes, I am old school). They are a good example of having classes to use, instantiating objects (creatures) and putting them to work together. As in a real system.
2) Think of your program as an office that has a job to do, or a factory that has something to deliver. Classes are the roles/jobs and objects are the workers/employees. They don't need to be complex, but their purpose must be really (really, really) well defined. Just like in a real office / factory.
3) Even better (or crazier), see your classes and objects as real beings, digital creatures in an abstract world, and yourself as a kind of god, who creates species (define classes) with wisdom. Give life when it is the time for them to come into the world (instantiate object) and kill them when they are done with their mission (dispose an object). Give them behavior, logic, conditions to work with, situations where they take action, and when they don't. Make them kinda "smart". Build them able to make decisions and take actions based on conditions. Give them life. Think of your program as an ecosystem. There must be balance, connection, species must be well defined and creatures must work together to achieve a common objective. Don't just throw code and pray for it to run. Plan it.
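To make approaches 1 and 3 a bit more tangible, here is a tiny, purely hypothetical sketch (names invented, in the spirit of those strategy games): one "creature" with a single well-defined job, which is given life, works under conditions, and is disposed of when its mission is done.

class Peasant {
    private int carriedGold = 0;
    private boolean alive = true;            // "life" given at instantiation

    void gather(int amount) {                // behavior: what this creature does
        if (!alive) return;                  // condition: a dead worker refuses to act
        carriedGold += amount;
    }

    int deliver() {                          // hands its result to the rest of the "world"
        int gold = carriedGold;
        carriedGold = 0;
        return gold;
    }

    void dispose() { alive = false; }        // "death" when the mission is done
}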
-----
When I talk about my classes like they are real beings, and programs as mini-worlds, some people say I am crazy, some others say that's passion.
It is both! @__@
-
I'm currently rewriting perfectly clean and functioning Scala code in Java (because "Enterprise", yay). The amount of unnecessary boilerplate I have to add is insane. I'm not even talking big complicated code but two liners or the lack of simple things like a range from 5 to 10.
Why do I have to write
List<Position> occupiedPositions = placedEntities.stream()
    .flatMap((pe) -> pe.occupiedPositions().stream())
    .collect(Collectors.toList());
instead of simply
val occupiedPositions = placedEntities.flatMap(_.occupiedPositions)
Why on earth does `occupiedPositions.distinct` suddenly become a monstrosity like `occupiedPositions.stream().distinct().collect(Collectors.toList())` where the majority of code is pure boilerplate? And this is supposed to be the new and better Java8 api which people use as evidence that Java is now suddenly "functional" (yeah no, just no).
Why do APIs that annotate parameters with @Nullable throw NullPointerExceptions when I pass a null? Why does the compiler not help prevent such stupidity? Why do we use static typing PLUS those annotations and it still crashes at runtime like every damn dynamic, interpreted language out there? That's not unfortunate, it's a complete waste of time.
Why is a simple idea like a range from x to 10 (in scala literally `x to 10`) not by default included in Java? There's Guava's version of Range which does not have a helper for integer ranges (even though they are the most used ones). Then there's apache.commons version which _has_ a helper for integers, but is strangely not iterable (wtf I don't even...).
Speaking of Iterable: How difficult could it be to convert an abstract Iterable<T> into a concrete List<T>? In scala it's surprisingly `someIterable.toList`. I found nothing like that so I took to stackoverflow where I found a thread in which people suggested everything from writing your own ListUtils helper class, using Guava (which is a huge dependency!) to using the new Java8 features inline (which is still about three lines long). I didn't know this was such a hard problem in computer science, TIL.
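For the record, both of those can be done with just the JDK, though it rather proves the verbosity point. A small sketch (Java 8+, imports from java.util and java.util.stream assumed, and x / someIterable standing in for whatever values you already have):

List<Integer> range = IntStream.rangeClosed(x, 10)          // x to 10, inclusive
        .boxed()
        .collect(Collectors.toList());

List<String> items = StreamSupport.stream(someIterable.spliterator(), false)
        .collect(Collectors.toList());                       // Iterable<String> -> List<String>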
How anyone can be productive in this abomination of a language is beyond me now, even though I've used it for many years while learning to code (back then I didn't know there were much better ways to do things). The only good part is that I have to endure this nonsense for only about 3 days longer, then I'm free again!
-
I learnt programming by making cheats for games and reverse engineering them. It was a fun experience as it wasn't always easy to start with C++ and assembly, but it was definitely worth it. Though when you come from a low level language such as C++, looking at highly abstract languages such as Javascript makes everything feel wrong, especially when it comes to types and how you can just switch types in the middle of the code :D. But it also gives you an understanding of how Javascript could be implemented, what the engine is doing in the background when you create an object etc.
-
Ok, so when I inherit a Wordpress site I've really stopped expecting anything sane. Examples: evidence that the Wordpress "developer" (that term is used in the loosest sense possible) has thought about his/her code or even evidence that they're not complete idiots who wish to make my life hell going forwards.
Have a look at the screen shot below - this is from the theme footer, so loaded on every page. The screenshot only shows a small part of the file. IT LITERALLY HAS 3696 lines.
Firstly, let's excuse the frankly eye watering if statement to check for the post ID. That made me facepalm immediately.
The insanity comes from the thousands of lines of jQuery code, duplicated to hell and back, that change the color of various dividers scattered throughout the site.
To make things thousands of times worse, they are ALL HAND CODED.
Even if JavaScript was the only way I could format these particular elements, I certainly wouldn't duplicate the same code for every element. After copy and pasting that jQuery a couple of times, a normal developer would think one word, pretty quickly - repetition.
When a good developer notices repetition, ways to abstract the crap away are the first thing that comes to mind.
Hell, when I was first learning to code god knows how long ago I always used functions to avoid repetition.
In this case, with a few seconds' thought, this "developer" could have created a single jQuery handler and used data attributes within the HTML. Hell, as bad as that is, it's better than the monstrosity I'm looking at now.
I'm aware Wordpress is associated with bad developers due to its low barrier to entry, but this site is something else.
The scary thing is that I know the agency that produced this. They are very large, use Wordpress exclusively and have some stupidly huge clients that would be known nationally.
Wordpress truly does attract some of the most awful "developers" and deserves its reputation.
If you're a good developer and use Wordpress I feel sorry for you, as you're in small numbers from my experience.
Rant over, have vented a bit and feel better. Thanks Devrant.
-
Frustrated, tired and a bit lost.
I'm a "Senior PHP Backend Dev", which includes not the greatest tech stack nor the best job title, but it pays fine, and the company is awesome to work for.
I suck at writing features, but I'm great at bitching, and I easily put complex abstract concepts into usable models. So I'm also QA, tester, tech lead, database architect, whatever.
That makes writing PHP less annoying, because I create the rules, and whip devs around when they forget a return type definition or forget to handle an edge case. But I don't write a lot of code anymore, I mostly read (bad) code.
Lately I REALLY feel like doing something else... problem is that I know JS/ES6, but really dislike React/Vue and the whole crappy modern frontend toolchainchootrain of babelifyingwebpackingyarnballs. I know Python/Tensorflow/etc, but don't feel like I want to go into data science or AI. And then I'm awesome at the shit no one uses, like Haskell, Go and Rust (and worse).
I got a job offer which combines a very interesting PHP codebase with a Java infrastructure, where I could learn a lot... and I'm kind of tempted.
Problem is, everyone always shits on Java. I always made a bit of fun of Java myself. Don't even know exactly why, probably some really cruel instinct which causes kids to bully the least popular kid.
I know the basics, I've written the hello world, and a small backend app for a personal project. I know how strict and verbose it can be. I love the strictness in Haskell and Rust.... but those are both also quite terse.
Should I become a Java dev? I'm not talking about Android SDK, but an insane enterprise codebase at a life sciences corporation.
To the pro Java devs: What are the best and worst things about your job, about the weekly processes, about the toolchains? Have you ever considered other languages? Do you unconditionally love and believe in Java, or do you believe Swift, Kotlin, Scala or whatever will eventually make it completely obsolete?
Will Java hasten my decline into the cynical neckbeard I was always destined to be?
There are a lot more fun languages, but looking at realistic demand and career value...
-
Fuck (some of) you backend developers who think regurgitating JSON makes for a good API.
"It's all in JSON. iOS can read JSON, right?"
A well-trained simian can read JSON, still doesn't mean it can do something with it. Your shitty API could be spitting out fucking ancient Egyptian for all I care, just make it be the same ancient Egyptian everywhere!
Don't create one endpoint that spits out the URL for the next endpoint (completely different domain, completely different path structure). Are you fucking kidding me?
As if that wasn't enough, endpoints receive data structured in one way, but return results in another!! "It's all JSON", but it's still dong.
How do I abstract that, you piece of shit? Now I have to write ever so slightly different code in multiple places instead of writing it only once.
How the fuck do I even model that in a database?
Have a crash course on implementing APIs on the client side and only come back when you're done.
Morons.
-
Don't you hate it when your co-worker does dumb things, but thinks it's the "clean code" way?
The following is a conversation between me and a co-worker, who thinks he's superior to everyone because he thinks he's the only one who read the Clean Code series. Let's call him Bill.
Me: I think the feature we need is quite simple, our application needs to call this third party API, parse the response and pass it to the next step. Why do you need to bury everything under an abstraction of 4 layers?
Bill: bEcAuSe It'S dEcOuPlInG, aNd MaKe ThE cOdE tEsTaBlE
Me: I don't know man, you only need to abstract the third party api client, and then mock it if you want. Some interfaces you define make no sense at all. For example, this interface only has 1 concrete implementation, and I don't think it will ever have another. Besides, the concrete implementation only gets the input from the upper layer and passes it down to the lower layer. Why the extra step? I feel like you're using interface just for the sake of interface.
Bill: PrOgRaMmInG tO iNtErFaCe, NoT cOnCrEtE iPlEmEnTaTiOn!!!
Me: You keep saying those words, I don't think they mean what you think they mean. But they certainly do not mean that every method argument must be an interface
Bill: BuT uNcLe BoB blah blah blah...
Me: *gives up all hope*
-
Haha kids, you're all dead wrong. Here's my story.
There is a thing called “emergence”. This is a fundamental property of our universe. It works 100% of the time. It can't be stopped, it can't be mitigated. Everything you see around you is an emergent phenomenon.
Emergence is triggered when a lot of similar things come together and interact. One water molecule cannot be dry or wet, but if you have many, after a certain number the new property emerges — wetness. The system becomes _wet_.
Professionalism is an emergent phenomenon too, and its water molecules are abstract knowledge. Learn tech things you're interested in, complete random tutorials, code, and after a certain amount of knowledge molecules is gained, something clicks inside your head, and you become a professional.
Unfortunately, there are no shortcuts here. Uni education can make you a professional seemingly quicker, but it's not because uni knowledge is special, it's because uni is a perfect environment to absorb a lot of knowledge in a short period of time.
It happened to me too. I started coding in Pascal in fifth grade of high school, and I did it till sixth. Then, seventh to ninth were spent on my uni's after-school program. After ninth grade, I drop out of high school to get to this uni's experimental program. First grade of uni, and we're making a CPU. Second grade, and we're doing hard math, C and assembly.
And finally, in the third grade, it happens. I was sitting there in the classroom, it was late, and I was writing a recursive sudoku solver in Python. And I _felt_ the click. You cannot mistake it for anything else. It clicks, and you're a changed person. Immediately, I realized I can write everything. Needless to say, I was passing everything related to code afterwards with flying colours.
From that point, everything I did was just gaining more and more experience. Nothing changed fundamentally.
Emergence is forever. If you learn constantly, even without a concrete defined path, I can guarantee you that you _will_ become a professional. This is backed by the universe itself. You cannot avoid becoming one if you're actively accumulating emergence points.
Here's the list of projects I made in the past 11 years: https://notion.so/uyouthe/...
I'm 24.
-
This stupid crap is pissing me off.
I write a quick blob of code that performs an http request with custom headers and writes the response to a file. easy squeezy. Everything works.
I abstract it into a class and add request building and stages (enjoyable!), and have one method make the request, read its body, and write to a file. I literally copy/pasted most of my existing code into the method and indented it. The only changes were updating var names to instance vars.
But now? It's complaining something is trying to read the request body twice, and it's throwing a fit. What? How? You were just working!
asfklasjdf;l
-
I'm so fucking tired of OOP.
This bullshit never ends. Everyone treats OOP in their own, proper (of course) way. You read tons of those fashion books, like Uncle Bob and shit. And then comes a dumb asshole that starts reviewing your code and tells you you're doing it wrong. FUCK. And you can't tell anything to your TL or PM cuz they are the same dumb assholes. Because after you fix all the bullshit from the first asshole, those more responsible assholes come and tell you that you're still doing it wrong.
- uh.. bruh, why don't you make interface for everything? that' S.O.L.I.D, you know.. it just right thing.
- bruh, why don't you use enum and switch case. we need a factory.
- bruh, we don't use abstract classes, use interface
- could you rewrite your linq/stream thing into a class and a method. it's just simpler for us. foreach loop is something everyone knows.
well, then go and LEARN the tool you're dealing with, coderfucker.
FUUUUCK.
-
I got in trouble for refactoring code to be modular. They said "that's too complicated for the maintenance team".
Said coworker produced a kludge of copy pasted code so the dumb ass maintenance team could understand it.
tldr; interfaces and abstract classes are too advanced for our employees so make the codebase shitty on purpose.
-
How come it is so hard to find good developers? I have been doing interviews for a couple of weeks now (for a senior PHP developer role).
The first round is me talking about the function and company, asking questions about the candidate's experience and wishes, and we usually end in some tech conversations. Most of the resumes I got are pretty fucking good. I mean, experience with low-level languages, experience with the problems we need to solve here, contributions to open source, experience in R and MATLAB etc etc. On paper they look perfect.
For the second round I give them an assessment which they can do at home on their own machine in their own time. It's not a hard one, just some mathematical problems they need to solve. A quick google GIVES the answer (no joke!!). But that's OK, I look at their code cleanliness and proper use of commenting so I can determine if they are solo developers or fit well in a team, and if they abstract repeated functions and take their work seriously, you know the drill.
It pisses me off that I get BROKEN FUCKING CODE WHICH DOES NOT EVEN RUN and that I get code back which I look at and it makes me vomit instantly. I mean, DO YOU EVEN TAKE YOUR PROFESSION SERIOUSLY? How dare you ask for 50k a year, a lease car and extra bonuses AND YOUR FUCKING CODE SPITS OUT COMPLETELY WRONG ANSWERS OR DOES NOT EVEN RUN. WHAT THE FUCK DUDE, GO BACK TO WHICHEVER HOLE YOU CRAWLED OUT OF AND STOP WASTING OTHER PEOPLE'S TIME WITH YOUR FUCKING INCOMPETENCE...
-
Abstract anything dealing with external services where if they go out of business, change their internal policies, or you get a wild hair up your ass you won't have to change your entire code base later.
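A minimal sketch of what that kind of wrapper can look like, with purely hypothetical names (the vendor and its API here are invented placeholders): the rest of the codebase only ever sees the interface, so swapping the vendor later touches exactly one class.

interface EmailSender {
    void send(String to, String subject, String body);
}

class AcmeMailSender implements EmailSender {      // one adapter owns all vendor-specific details
    private final String apiKey;

    AcmeMailSender(String apiKey) { this.apiKey = apiKey; }

    @Override
    public void send(String to, String subject, String body) {
        // the vendor HTTP call would live here and nowhere else in the app
        System.out.printf("sending via vendor API to %s: %s%n", to, subject);
    }
}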
-
My new coworker: that "I know everything about everything and I'm better than you" kind. She works in Accounting but already has her fingers in my work, telling my boss things like "that's easy to do"...
Of course, she knows absolutely nothing about programming and I.T., but it is easier for my boss to believe an easy lie than a complex truth.
(sorry, crude language and caps follows)
Hey, listen you fucking excuse of a person, DO YOUR FUCKING JOB and stay away from my DAMN GOOD FUCKING CODE and my FUCKING SERVERS.
Not going to give you admin access in a gazillion years, even if my life depends on it.
And stop saying nonsenses about things that you WILL NEVER UNDERSTAND, because those things are too complex and abstract for your little stupid mind to understand.
Go ahead, mess with me! I will sue you to the end of your FUCKING world!
Thanks girls/guys/lasses/lads.
This is absolutely therapeutic.
-
When I was in college OOP was emerging. A lot of the professors were against teaching it as the core. Some younger professors were adamant about it, and also Java fanatics. So after the bell rang, they'd sometimes teach people that wanted to learn it. I stayed after and the professor said that object oriented programming treated things like reality.
My first thought to this was: hold up, modeling reality is hard and complicated, why would you want to add that to your programming? That's utter madness.
Then he started with a ball example and how some balls in reality are blue, and they can have a bounce action we can express with a method.
My first thought was that this seems a very niche example. It has very little to do with any problems I have yet solved and I felt thinking about it this way would complicate my programs rather than make them simpler.
I looked around at the remnants of my classmates and saw several sitting forward, their eyes lit up, and I felt like I was in a cult meeting where the head is trying to make everyone enamored of their personality. Except he wasn't selling himself, he was selling an idea.
I patiently waited it out, wanting there to be something of value in the after the bell lesson. Something I could use to better my own programming ability. It never came.
This same professor would tell us all to read and buy Gang of Four, it would change our lives. It was an expensive hardcover book with a ribbon attached for a bookmark. It was made to look important. I didn't have much money in college but I gave it a shot, I bought the book. I remember wrinkling my nose often, reading it. Feeling like I was still being sold something. But where was the proof. It was all an argument from authority and I didn't think the argument was very good.
I left college thinking the whole thing was silly and would surely go away with time. And then it grew, and grew. It started to be impossible to avoid it. So I'd just use it when I had to and that became more and more often.
I began to doubt myself. Perhaps I was wrong, surely all these people using and loving this paradigm could not be wrong. I took on a 3 year project to dive deep into OOP later in my career. I was already intimately aware of OOP having to have done so much of it. But I caught up on all the latest ideas and practiced them for a the first year. I thought if OOP is so good I should be able to be more productive in years 2 and 3.
It was the most miserable I had ever been as a programmer. Everything took forever to do. There was boilerplate code everywhere. You didn't so much solve problems as stuff abstract ideas that had nothing to do with the problem everywhere and THEN code the actual part of the code that does a task. Even though I was working with an interpreted language they had added a need to compile, for dependency injection. What's next, taking the benefit of dynamic typing and forcing typing into it? Oh I see they managed to do that too. At this point why not just use C or C++. It's going to do everything you wanted if you add compiling and typing and do it way faster at run time.
I talked to the client extensively about everything. We both agreed the project was untenable. We moved everything over another 3 years. His business is doing better than ever before now by several metrics. And I can be productive again. My self doubt was over. OOP is a complicated mess that drags down the software industry, little better than snake oil and full of empty promises. Unfortunately it is all some people know.
Now there is a functional movement, a data oriented movement, and things are looking a little brighter. However, no one seems to care for procedural. Functional and procedural are not that different. Functional just tries to put more constraints on the developer. Data oriented is also a lot more sensible, and again pretty close to procedural a lot of the time. It's just odd to me this need to separate from procedural at all. Procedural was very honest. If you're a bad programmer you make bad code. If you're a good programmer you make good code. It seems a lot of this was meant to enforce bad programmers to make good code. I'll tell you what I think though. I think that has never worked. It's just hidden it away in some abstraction and made identifying it harder. Much like the code methodologies themselves do to the code.
Now I'm left with a choice, keep my own business going to work on what I love, shift gears and do what I hate for more money, or pivot careers entirely. I decided after all this to go into data science because what you all are doing to the software industry sickens me. And that's my story. It's one that makes a lot of people defensive or even passive aggressive, to those people I say, try more things. At least then you can be less defensive about your opinion.
-
Android dev job question:
"Describe the activity lifecycle and write an application that does x,y,z in accordance with it"
Fullstack dev job question:
"Write some code that interacts with our API and does x,y,z, put the data into our database and build a web interface"
Java backend dev interview :
"BUILD AN ELEVATOR ALGORITHM WITH LESS THAN o(nlog(n)), FIND NEIGHBORS IN A BINARY TREE, WHAT IS THE DIFFERENCE BETWEEN AN INTERFACE AND ABSTRACT CLASS?"
Why?
-
Any Haskell programmers here?
I started to learn this language for fun two days ago and so far I find it absolutely amazing and really different to OOP languages. Most of the time the solutions make so much sense, but actually coding them requires really abstract thinking about the problem. How fast did you learn Haskell? How long did it take you to code comfortably? Any advice you can give me? I work mainly through a uni exercise sheet from a friend from a different uni, and the rest is hoogle and google :P
-
GOD ALMIGHTY I HATE SWIFT & XCODE...
Why the fuck does it take a horrendous amount of time to muck about with layout constraints? Why the heck does Xcode choose to add constraints to elements that already have pissing constraints! Why does dealing with something as trivial as tables have to be so god damn fucking involved when HTML and CSS let me create and style tables in fuck all lines.
And what the hell is up with how pissing long Xcode takes just to figure out that 1 extra line of code I've just added. You jump to another file and Xcode finally decides to be an IDE again and bitch at the fact that you've forgotten to add some parameter or that they've decided to rename parameter "x" since version fuck knows what.
Working with abstract classes is fun, let's use protocols (because interfaces are too old school) and then let's tack on something we call extensions and then let's make people piss about with convenience initializers.
And lord almighty, what the fuck is up with casting, what's all this ?! BS. What's wrong with just checking if the value is null in the first place, or what's wrong with giving something an initial value, oh because having to unwrap shit is more elegant right??
And good god, I need to own a fucking cinema screen just to have the storyboard open, there are fewer fucking panels on the Sistine Chapel ceiling
than there are in Xcode.
-
tl;dr Do you think we will any time soon move away from editing raw source code? Will IDEs or other interfaces allow us to change code in a graphical representation or even through voice?
---
One thing I found funny watching Westworld is how they depicted the "programming" - it is more like swiping on a smartphone, a bit maybe like Tom Cruise's investigations in Minority report. Or giving certain commands and key words by voice.
There was one quote from Uncle Bob's "Clean Code" I could never find again, where he said something along the lines of: back in the seventies or eighties they thought they would soon raise programming languages to such a high level that they would use natural language interfaces, and look at us now, still the same "if's".
So I feel uncomfortable without my shell, and having tried a graphical programming language once (LabVIEW), that particular one seemed clumsy to me at best. But maybe there are a lot of web devs here, and it seems with those frameworks you might be able to abstract away a lot of the pesky system programming... so do you feel like moving to some new shiny programming experience, or do you think it will stay the same for more decades, as the computer is that stupid machine where you have to spell it out instruction by instruction anyway?
-
Manager:
Hey this client sent us a list with all of their employees in this format... we would tell them to input it themselves but they're a pretty big client, so could you do it?
Me: Sure
*3 hours later*
... why am I taking so long...?
I look back at my code and see that I've built a whole framework to input data into our system, which not only accepts the client's format but is actually pretty abstract and extensible for any format you'd like, all with thorough documentation.
*FACEPALM*
Why can I do this with menial jobs and not for our main code?
-
HTML is the core building block of the web. Why does everyone feel the need these days to abstract and virtualize and recreate the wheel? You're only slowing down your site... plus adding layers of confusion for newcomers and adding more code to maintain.
-
Worst architecture I've seen?
The worst (working here) follow the academic pattern of trying to be perfect when the only measure of 'perfect' should be the user saying "Thank you" or one that no one knows about (the 'it just works' architectural pattern).
A senior developer with a master's degree in software engineering developed a class/object architecture for representing an Invoice in our system. Took almost 3 months to come up with ..
- Contained over 50 interfaces (IInvoice, IOrder, IProduct, etc. mostly just data bags)
- Abstract classes that implemented the interfaces
- Concrete classes that injected behavior via the abstract classes (constructors, Copy methods, converter functions, etc)
- Various data access (SQL server/WCF services) factories
During code reviews I kept saying this design was too complex and too brittle for the changes everyone knew were coming. The web team that would ultimately be using the framework had, at best, vague requirements. Because he had a master's degree, he knew best.
He was proud of nearly perfect academic design (almost 100% test code coverage, very nice class diagrams, lines and boxes, auto-generated documentation, etc), until the DBAs changed table relationships (1:1 turned into 1:M and M:M), field names, etc, and users changed business requirements (ex. concept of an invoice fee changed the total amount due calculation, which broke nearly everything).
That change caused a ripple effect that resulted in a major delay in the web site feature release.
By the time the developer fixed all the issues, the web team wrote their framework and hit the database directly (Dapper + simple DTOs) and his library was never used.
-
Yesterday, I put the final touches on a massive system using hundreds of classes, with thousands of lines of code, all easily maintained because of the way I used abstract classes, coding to an interface, stubs, etc. And all instantiated with a near-English fluent API. With detailed logging, and it even contacts me when there are problems; the result of a year's work. I felt like a genius.
Today, this fucking simple contact form that won't do what I want it to, for the past 4 hours...
-
Not sure if it's the worst code review but it's a recent one.
We don't really do code reviews where I work unfortunately, but my coworker used my framework for the first time (I built some nice composer libraries for cmdline projects) and asked if I could make them do autoloading.
He never used namespaces before so I was glad to help him out.
What I saw was a dreadful mess. His project was called "scripts" so good luck picking a namespace...
Then it was all loose functions in the executable file. All those functions are however called by a class in another file (if they were not calling each other in a cascading mess). That class was extending an abstract class from my library as instructed. However I never imagined my lib being raped like that.
The functions themselves are a horrible mess. Nothing uniform, completely different styles (our documentation states PSRs should be used).
Parameter counts higher than 5.
Variable names like Object and Dobject (in the calling function Dobject is Object, but it needs a fresh one).
If statements on parameters that basically split it in two (it should simply be two functions).
If/else statements with a return of the same variable as a single line (sane people use a ternary for that).
Note that I said functions. All of it should have been OO and methods. Would have saved at least some of the parameter hell.
I could go on and on. Do I think the programmer is bad? Yes (he does not even grasp interfaces, dep injection, foreach loops). Is this his best work? No. He said that for a one-off script like this it just has to work. Not going to be used elsewhere. I disagree, as it is a few thousand lines of code that others have to read too.
-
By making these shitty languages that basically abstract away anything difficult, Python, Javascript, whatever, we've only enabled shit code to hit production, which inevitably one day will either blow up or just add eternal technical debt. Even worse is when an MBA gets power to enable this.
-
Okay... I need to confess.
I actually like the idea of counting arrays from 1 like it is in some languages.
It makes code cleaner.
Think about it.
You would never need to subtract 1 from count/size/length or add 1 for things like the month in javascript, because the first item would be at index 1, and many many errors wouldn't be happening because we don't need to force our minds to think another way. I learned counting from 1 after I learned to walk, so it's the most natural thing to do. Just because the software/hardware below our language works that way doesn't mean we cannot abstract this behavior away. What's your opinion about this? Am I wrong?
-
Wow Angular2 you are beautiful.
I loved you early on, Angular 1.x, but by the end we drifted apart, driven by our different needs. I needed a manageable code base and more excitement; you needed to stay the way you are. I respect you for that, but we are not right for each other anymore.
I have been hurt, Angular 2. I may not fully heal, but you provide me with what I need. Developing on you is a pleasure that feels like a full object oriented experience. Most of all, developing on you is *fast*. Your separation of concerns tickles me in all the right ways. The sugar you provide with your decorators, classes, abstract classes and interfaces makes me weak at the knees.
Keep growing and improving, Angular 2. I think we shall have many projects together.
-
This started as an update to my cover story for my Linked In profile, but as I got into a groove writing it, it turned into something more, but I’m not really sure what exactly. It maybe gets a little preachy towards the end so I’m not sure if I want to use it on LI but I figure it might be appreciated here:
In my IT career of nearly 20 years, I have worked on a very wide range of projects. I have worked on everything from mobile apps (both Android and iOS) to eCommerce to document management to CMS. I have such a broad technical background that if I am unfamiliar with any technology, there is a very good chance I can pick it up and run with it in a very short timespan.
If you think of the value that team members add to the team as a whole in mathematical terms, you have adders and you have subtractors. I am neither. I am a multiplier. I enjoy coaching, leading and architecture, but I don’t ever want to get out of the code entirely.
For the last 9 years, I have functioned as a technical team lead on a variety of highly successful and highly productive teams. As far as team leads go, I tend to be a bit more hands on. Generally, I manage to actively develop code about 25% of the time to keep my skills sharp and have a clear understanding of my team’s codebase.
Beyond that I also like to review as much of the code coming into the codebase as practical. I do this for 3 reasons. I do this because as a team lead, I am ultimately the one responsible for the quality and stability of the codebase. This also allows me to keep a finger on the pulse of the team, so that I have a better idea of who is struggling and who is outperforming. Finally, I recognize that my way may not necessarily be the best way to do something and I am perfectly willing to admit the same. I have learned just as much if not more by reviewing the work of others than having someone else review my own.
It has been said that if you find a job you love, you’ll never work a day in your life. This describes my relationship with software development perfectly. I have known that I would be writing software in some capacity for a living since I wrote my first “hello world” program in BASIC in the third grade.
I don’t like the term programmer because it has a sense of impersonality to it. I tolerate the title Software Developer, because it’s the industry standard. Personally, I prefer Software Craftsman to any other current vernacular for those that sling code for a living.
All too often, our work is compiled into binary form, both literally and figuratively. Our users take for granted the fact that an app "just works", without thinking about the proper use of layers of abstraction and separation of concerns, Gang of Four design patterns or why an abstract class was used instead of an interface. Take a look at any mediocre app's review distribution in the App Store. You will inevitably see an inverse bell curve. Lots of 4's and 5's and lots of (but hopefully not as many) 1's and not much in the middle. This leads one to believe that even given the subjective nature of a 5 star scale, users still look at things in terms of either "this app works for me" or "this one doesn't". It's all still 1's and 0's.
Even as a contributor to many open source projects myself, I'll be the first to admit that I have never sat down and cracked open the Spring Framework to truly appreciate the work that has been poured into it. Yet, when I'm in backend mode, I'm working with Spring nearly every single day.
The moniker Software Craftsman helps to convey the fact that I put my heart and soul into every line of code that I or a member of my team write. An API contract isn’t just well designed or not. Some are better designed than others. Some are better documented than others. Despite the fact that the end result of our work is literally just a bunch of 1’s and 0’s, computer science is not an exact science at all. Anyone who has ever taken 200 lines of Java code and reduced it to less than 50 lines of reactive Kotlin, anyone who has ever hit that Utopia of 100% unit test coverage in a class, or anyone who can actually read that 2-line Perl implementation of the RSA algorithm understands this simple truth. Software development is an art form. I am a Software Craftsman.
#wk171
-
I need some opinions on Rx and MVVM. It's being done in iOS, but I think it's a fairly general programming question.
The small team I joined is using Rx (I've never used it before) and I'm trying to learn and catch up to them. Looking at the code, I think there are thousands of lines of over-engineered code that could be done so much simpler. From a non-Rx point of view, I think we are following some bad practises; from an Rx point of view the guys are saying this is what Rx needs to be. I'm trying to discuss this with them, but they are shooting me down saying I just don't know enough about Rx. Maybe that's true, maybe I just don't get it, but they aren't exactly explaining it, just telling me I'm wrong and they are right. I need another set of eyes on this to see if it is just me.
One of the main points is that there are many places where network errors shouldn't complete the observable (i.e. can't call onError), I understand this concept. I read a response from the RxSwift maintainers that said the way to handle this was to wrap your response type in a class with a generic type (e.g. Result<T>) that contained a property to denote a success or error and maybe an error message. This way errors (such as incorrect password) won't cause it to complete, everything goes through onNext and users can retry / go again, makes sense.
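Roughly, the wrapper they describe looks something like the sketch below (shown in plain Java purely for brevity even though the project is RxSwift; the names are illustrative). The point is that failures become ordinary values flowing through onNext, so the stream never terminates on them.

final class Result<T> {
    private final T value;          // present on success
    private final String error;     // present on failure, e.g. "incorrect password"

    private Result(T value, String error) { this.value = value; this.error = error; }

    static <T> Result<T> success(T value) { return new Result<>(value, null); }
    static <T> Result<T> failure(String error) { return new Result<>(null, error); }

    boolean isSuccess() { return error == null; }
    T value() { return value; }
    String error() { return error; }
}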
The guys are saying that this breaks Rx principals and MVVM. Instead we need separate observables for every type of response. So we have viewModels that contain:
- isSuccessObservable
- isErrorObservable
- isLoadingObservable
- isRefreshingObservable
- etc. (some have close to 10 different observables)
To me this is overkill, to have so many streams all frequently only ever delivering 1 or no messages. I would have aimed for 1 observable that returns an object holding properties for each of these things, and sends several messages. Is that not what streams are supposed to do? Then the local code can use filters as part of the subscriptions. The major benefit of having 1 is that it becomes easier to make it generic and abstract away, which brings us to point 2.
Currently, due to each viewModel having different numbers of observables and methods of different names (but effectively doing the same thing) the guys create a new custom protocol (equivalent of a java interface) for each viewModel with its N observables. The viewModel creates local variables of PublishSubject, BehavorSubject, Driver etc. Then it implements the procotol / interface and casts all the local's back as observables. e.g.
protocol CarViewModelType {
isSuccessObservable: Observable<Car>
isErrorObservable: Observable<String>
isLoadingObservable: Observable<Void>
}
class CarViewModel {
isSuccessSubject: PublishSubject<Car>
isErrorSubject: PublishSubject<String>
isLoadingSubject: PublishSubject<Void>
// other stuff
}
extension CarViewModel: CarViewModelType {
isSuccessObservable {
return isSuccessSubject.asObservable()
}
isErrorObservable {
return isErrorSubject.asObservable()
}
isLoadingObservable {
return isLoadingSubject.asObservable()
}
}
This has to be created by hand, for every viewModel, of which there is one for every screen, and there are 40+ screens. This same structure is copy / pasted into every viewModel. As mentioned above I would like to make this all generic. Have a generic protocol for all viewModels to define 1 Observable, 1 local variable of generic type and handle the cast back automatically. The method to trigger all the business logic could also have its name standardised ("load", "fetch", "processData" etc.). Maybe we could also figure out a few other bits too. This would remove a lot of code, as well as making the code more readable (less messy), and make unit testing much easier. While it could never do everything automatically, we could test the basic responses of each viewModel and have at least some testing done by default, and not have everything be very boilerplate-y and copy / paste in nature.
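For what it's worth, the generic shape I'm picturing is roughly this (an RxJava 2-flavoured sketch rather than the actual RxSwift code, all names hypothetical): one state observable per screen, one standardised trigger, and subscribers filter for the cases they care about.

import io.reactivex.Observable;
import io.reactivex.subjects.PublishSubject;

final class ViewState<T> {
    enum Kind { LOADING, SUCCESS, ERROR }
    final Kind kind;
    final T value;          // set on SUCCESS
    final String message;   // set on ERROR
    ViewState(Kind kind, T value, String message) {
        this.kind = kind; this.value = value; this.message = message;
    }
}

abstract class BaseViewModel<T> {
    protected final PublishSubject<ViewState<T>> state = PublishSubject.create();
    public Observable<ViewState<T>> state() { return state.hide(); }
    public abstract void load();   // one standardised trigger name for every screen
}

// a subscriber that only cares about successes just filters:
// carViewModel.state().filter(s -> s.kind == ViewState.Kind.SUCCESS).map(s -> s.value)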
The guys think that subscribing to isSuccess and / or isError is perfect Rx + MVVM. But for some reason subscribing to status.filter(success) or status.filter(!success) is a sin of unimaginable proportions. Also the idea of multiple buttons and events all "reacting" to the same method named e.g. "load", is bad Rx (why if they all need to do the same thing?)
My thoughts on this are:
- To me it's identical in meaning and architecture, one way is just significantly less code.
- Let's say I agree it's not textbook, is it not worth bending the rules to reduce code?
- We are already breaking the rules of MVVM to introduce coordinators (which I hate, as they are adding even more unnecessary code), so why is breaking it to reduce code such a no-no?
Any thoughts on the above? Am I way off the mark or is this classic Rx?
-
I need to stop treating an OO language as if it were a procedural language.
I have the tendency to turn my code into GOTO spaghetti even though I'm semi-aware that objects exist and that they are distinct.
I still have to get used to this paradigm.
My Java professor always swore by the Plato paradigm, i.e.:
""Platonism" and its theory of Forms (or theory of Ideas) denies the reality of the material world, considering it only an image or copy of the real world.
According to this theory of Forms there are at least two worlds: the apparent world of concrete objects, grasped by the senses, which constantly changes, and an unchanging and unseen world of Forms or abstract objects, grasped by pure reason (λογική). which ground what is apparent." (wikipedia)
Thinking in objects, abstractions and metaphysics is not something I haven't done before (I've practiced it during Sociology and Ethics with the whole Pascal, Leibniz, Newton and Descartes approach) but it's certainly not easy.
Then there was my cool Programming 201 professor who said: "Don't worry man, just read those great UML, Program Design and GOF books and it will all become easy, like a story. It'll all make sense.
I mean, I've graduated, I've passed my Software Engineering I, II and III (hard as hell) but since I haven't focused on those theories and practices anymore, I've lost my touch.
It's definitely not easy for a novice programmer to transition between paradigms...
-
BANano, a free library that transpiles B4X (a cross-platform development tool) source code to JavaScript.
It allows users who are not fluent in JavaScript to make PWAs and dynamic websites using a VB-like language and the Abstract Designer native to the B4X tool.
-
The datepicker saga
Part one
So I begin work on a page where users add their details, project is late, taking ages on this page.
Nearly done, just need a component to allow users to put in some dates of birth. Look for React components.
Avoiding that one because fuck Bootstrap.
Ah-ha, that looks good, let's give it a go.
CSS doesn't exist, oh, need to copy it over from the npm dist. Great, it applied but...
... WTF it's tiny. Thought it was a problem with my zoom. Nope, found the issue on github.com and it's something to do with using REM rather than EM or something. Okay, someone provided a solution, rather I saw a couple of solutions; after some hacking around I got it working and pasted it in the right location and yes, it's a reasonable size now.
Only it's a bit crap because it only allows scrolling 1 month at a time. No good. Hunting through the docs reveals several options to add year and month drop downs and allow them to be scrolled. Still a bit shit as it only shows certain years, figure I'd set the start date position somewhere at the average.
Wait. The up button on the scroll doesn't even show, it's just a blank 5px button. Mouse scroll doesn't work
Fucking...
... Bailing on that.
Part 2
Okay sod it I'll just make my own three drop down select boxes, day, month and year. Easy.
At this point I take full responsibility and cannot blame any third party. And kids, take this as a lesson to plan out your code fully and make no assumptions on the simplicity of the problem.
For some reason (which I much regretted) I decided to abstract things so much that I made an array of three objects, one per drop-down, each containing the information to pretty much abstract away the field it was dealing with. This sort of meta-programming really screwed with my head; I have lines like the following:
[...].map(optionGroup =>
optionGroup.options[
parseInt(
newState[optionGroup.momentId]
, 10)
]
)...
But I was in too deep and had to weave my way through this kind of abstract process like an intrepid explorer chopping through a rain forest with a butter knife.
So I am using React and Redux, decided it was overkill to use Redux to control each field. Only trouble is of course when the user clicks one of the fields, it doesn't make sense in redux to have one of the three fields selected. And I wanted to show the field title as the first option. So I went against good practice and used state to keep track of the fields before they are handed off to the parent/redux. What a nightmare that was.
Possibly the most challenging part was matching my indices with moment.js to get the UI working right, it was such a meta mess when it just shouldn't have taken so stupidly long.
But, I begin to see the light at the end of this tunnel, it's slowly coming together. And when it all clicks into place I sit back and actually quite enjoy my abysmal attempt at clean and easy to read code.
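Looking back, the version without all the meta-programming could have been roughly this (a hypothetical sketch with plain React state, not what I actually shipped):

import { useState } from "react";

function DobPicker({ onChange }: { onChange: (isoDate: string) => void }) {
  const [day, setDay] = useState(1);
  const [month, setMonth] = useState(0); // 0-based, like Date
  const [year, setYear] = useState(1990);
  const daysInMonth = new Date(year, month + 1, 0).getDate();

  const update = (d: number, m: number, y: number) => {
    setDay(d); setMonth(m); setYear(y);
    onChange(new Date(Date.UTC(y, m, d)).toISOString().slice(0, 10));
  };

  return (
    <>
      <select value={day} onChange={e => update(Number(e.target.value), month, year)}>
        {Array.from({ length: daysInMonth }, (_, i) => (
          <option key={i + 1} value={i + 1}>{i + 1}</option>
        ))}
      </select>
      {/* month and year selects are built the same way */}
    </>
  );
}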
Part 3
Ran the generated timestamp through a converter and I get the day before, oh yeah that's great
Seems like it's dependent on the timezone??!
Nope. Deploying. Bye. I no longer care if daylight saving time makes you a day younger. -
TLDR;
Side project update.
Made a simple NLP library in Python and published its first version as open source.
Now I can feed it with parsed pdf text.
See rant https://devrant.com/rants/2192388/...
Why?
Because while reading a book about NLTK I couldn't find a simple, extensible way to provide support for Polish, and I wanted to abstract stemming, word normalization, tokenization etc., so I can provide e.g. different conditions for separate text files without writing much code, which is an asset when you work solo.
It's about 12GB of publicly accessible law data in PDF form I am trying to handle (at first), which is about 35000 files from the last 90 years.
So far I have automated downloading the web pages and the PDF documents linked from them, extracting data from the web pages and saving it to a database, and extracting text from the PDF files. I have about 5-6 projects to do all of the above; maybe at the end I will put it into some workflow manager like Luigi, or just run it via cronjob.
The first thing for the website 1.0 part is to find correlations between all the documents in the law texts using the NLP library, by building custom conditions. Then just generate a directory structure and HTML files with links between documents.
Website version 2.0 is already in my mind, but it will be creepy to make and will take at least 1-2 months, and I want to publish fast.
I have some PDFs with only images instead of text, and Tesseract worked quite well with them, so maybe I will try to process those once everything goes live.
Learned a lot about PDF: now I know that a font in a PDF does not always provide Unicode characters (a stupid form of obfuscation), so when you extract text you need to build a glyph-to-text map for every font.
PDF is a full vector representation - just like SVG - which is logical if you think about it a bit and know that some printers run on PostScript.
Let's hope the next update will be about the Flutter mobile app which started all of the shit above. It's almost ready (except getting data from the API, which I'm working on, and a logo for the release version). It's the last piece of the puzzle. -
Fuck fuck fuck. I can't even read this source code, let alone abstract the core algorithm from it. Fuck C++ and fuck this extremely terse code and the plethora of syntactic sugar that makes it impossible for anyone who doesn't know the nuances of the language to read it. You could literally put me in the middle of a country where nobody speaks English and I would still have an easier time than I'm having now.
-
According to my university lectures you have clean and good code if every tiny little piece of functionality is split into 5+ files. Gotta have an interface, a factory, a low-level implementation, a high-level implementation, and at this point I don't even know what purpose the other abstraction levels serve. Just end me already...
Sometimes I think of how much great and useful stuff you could learn at a university if they used the time efficiently. But instead you spend years mostly studying theoretical or very abstract topics, whereas 80%+ of useful knowledge and skills you learn on your own. -
Am I in developer hell already? A shitty project is about to come to an end (hopefully), or should I rather say: It needs to come to an end. But I am still quite lost in how to deal with it, hence procrastinating on it - making the deadline come closer and with it the realization that I'll probably have to rewrite almost everything. I'm not sure how, but I do know that the current code is a dumpster fire.
Basically what I need to do is deal with the APIs of different payment providers/gateways (like PayPal, AmazonPay). In most cases I'll get a payment ID from the shop and need to act on it later, e.g. capture the authorized money in the case of a credit card transaction, or do refunds (without user interaction, unless there is an error). Now, at first I put something together where I try to abstract the payment information into two tables:
orders{1}<->{0..n}payments
payments{1}<->{1..n}paymentDetails
Unfortunately trying to abstract the different payment methods and squeeze them (and their different possible statuses and functions) into these tables was not very successful; it's a total mess with magic numbers, half-broken behavior and no consideration for partial payments/captures or unfinished requests (i.e. if there is an exception before the response is dealt with, there is no indication that anything has ever been sent). Also, the current amount is calculated from the history of the paymentDetails table, which basically works differently for each payment type.
How to fix this mess in a way that I'll still have a job by next week?
I'm trying to improve the db schema first, as I think my biggest problems are lying there. Through some research I've come across a recommendation for making payment type specific subtables (with a magic number/string in the main table to prevent having to look up all subtables). That way I can record what I send and receive without having to abstract it too much, so I'll have an acceptable transaction log. The paymentDetails table can be removed (necessary fields go to the payments table). The payments table gets multiple fields for the amount (differentiating between open, authorized, captured, processing and refunded values) and always reflects the current status.
Tables:
payments
paymentRequestsPaypal
paymentRequestsAmazonpay
paymentRequestsXyz
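A rough sketch of what those tables could look like as types (hypothetical names, just to make the idea concrete):

type Provider = "paypal" | "amazonpay" | "xyz";

interface Payment {
  id: string;
  orderId: string;
  provider: Provider;        // tells us which request table to look in
  amountOpen: number;
  amountAuthorized: number;
  amountCaptured: number;
  amountProcessing: number;
  amountRefunded: number;
}

// one raw request/response log per provider, free to mirror that provider's API
interface PaymentRequestPaypal {
  id: string;
  paymentId: string;
  sentAt: Date;
  request: unknown;          // what we sent
  response: unknown | null;  // null if we never got (or never handled) an answer
}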
I think I'm going in the right direction here. hm. Maybe there's some light at the end of this long, dark tunnel. Or a train. I'll have two days to find out. -
Sydochen has posted a rant where he is not really sure why people hate Java, and I decided to publicly post my explanation of this phenomenon, from my point of view.
So there is this quite large domain, on which one or two academic disciplines are built, such as business informatics and applied systems engineering, which I find extremely interesting and fun, and which is called, ironically, SAD. And then there are videos on YouTube, by programmers who just can't settle the fuck down. The videos I am talking about are rants about OOP in general, which, as we all know, is a huge part of studies in the aforementioned domain. What are these people even talking about?
It is absolutely obvious that there is no sense in building software in a linear pattern. Since Bikelsoft has conveniently patched consumers up with GUI-based software, the core concept of which is EDP (event driven programming or, alternatively, at least OS event queueing), the completely functional, linear approach in such an environment does not make much sense in terms of the maintainability of the software. Uhm, raise your hand if you ever tried to linearly build a complex GUI system in a single function call on GTK, which does allow you to disregard any responsibility separation pattern of SAD, such as the long-loved MVC...
Additionally, OOP is mandatory in business because it does allow us to mount abstraction levels and encapsulate actual dataflow behind them, which, of course, lowers the costs of the development.
What happy programmers usually talk about is the complexity of doing OOP right, in the sense of an overflow of straight composition classes (that do nothing but forward data from lower to upper abstraction levels and vice versa) and of breaks in the responsibility chain (when a class from a lower level directly(!!) notifies a class of a higher level about something, ignoring the fact that there is a chain of other classes between them). And that's it. These guys also vouch for functional programming, which is a completely different argument, and there is no reason not to use it in the algorithmic, implementational part of the project, of course, but yeah...
So where does Java kick in you think?
Well, guess what language popularized programming in general and OOP in particular. Java is doing a lot of things in a modern way. Of course, if it's 1995 outside *lenny face*. Yeah, fuck AOT, fuck memory management responsibility, all to the maximum towards solving the real applicative tasks.
Have you ever tried to learn to apply Text Watchers in Android with Java? Then you know about inline overloading and inline abstract class implementation. This is not right. This reduces readability and reusability.
Have you ever used Volley on Android? Newbies to Android programming surely should have. Quite verbose boilerplate in google docs, huh?
Have you seen intents? The Android API is, to put it mildly, messy with all the support libs and Context class ancestors. Remember how many times the language has helped you to properly orient yourself in all of this hierarchy, when an overloaded method declaration requires you to use 2 lines instead of 1. Too verbose, too hesitant, distracting - that's what the language and the API are. Fucking toString() is hilarious. Reference comparison is unintuitive. Obviously poor practices are not banned. Ancient tools. Import hell. Slow evolution.
C# has ripped Java off like an utter cunt, yet it's a piece of cake to maintain solid patternization and structure, and keep your code clean and readable. C# 6 already featured optionally nullable fields and safe optional dereferencing, while we only finally get lambda expressions in Java 8, in 20-fucking-14.
Java did good back then, but when we joke about dumb indian developers, they are coding it in Java. So yeah.
To sum up, it's easy to make code unreadable with Java, and Java is a tool with which developers usually disregard the patterns of SAD. -
This actually exists in our code-base like this. I have no fucking clue what it does.
function combine(arg0: string, arg1: string): string {
return !arg0 ? arg1 : !arg1 ? arg0 : `${arg0} ${arg1}`;
}
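For what it's worth, it joins the two strings with a space, falling back to whichever one is non-empty when the other is empty (or otherwise falsy). A spelled-out equivalent (my reading of it, not code from the repo):

function combineReadable(arg0: string, arg1: string): string {
  if (!arg0) return arg1;    // nothing on the left, keep the right
  if (!arg1) return arg0;    // nothing on the right, keep the left
  return `${arg0} ${arg1}`;  // both present, join with a space
}
-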
Sometimes I'm too lazy to properly abstract and reuse code, so I'll just copy & paste and then alter it - and then later curse myself for that. I'm working on this though...
-
Figure I can simplify the code if I have the compiler handle *some* of the register allocation.
Eh? What do you mean "NP-hard"? Dafuq's an ENN-PEE?
**frantically reads wiki**
I can proudly say that I understood absolutely nothing; CS stands for cocksucker or rather abysmal failure at the most basic forms of communication, I don't just sit here all day expecting you to flawlessly prove my point with every swallow of breath you draw, yet here we are.
Perhaps one factor involved in producing the generalized cluelessness of my colleagues, I mean their "imposter s*ndrome", has a bit to do with how fucking thick you've formulated this glorified bollocks you call theory. Were it not for your incompetence, arcane crackheads like me would simply __not__ be capable of rising to the top of this field entirely via determination and a big salami, therefore I owe you both a debt of gratitude as well as every last word and sign of total disrespect.
As interesting as the study of computational complexity can be, if done correctly that is, you idiots are stuck in a mathematician's abstract mindset in a field entirely devoted to application of ideas rather than *just* the ideas themselves.
To answer my own question, it means there's no known efficient solution. That's it. The part about nondeterministic polynomial convolution of an irreductible rectosigmoid junction can apparently be skipped altogether. Anyway, I solved the problem with the computational equivalent of pizza sticks while you were out in the field mentally jacking off to λ.
Lecture is over, now go clean up the ethereal masturbatory residue if you will, I have mystical el Khwarizmi type-shit to solve via further clubbing of abstraction through liverwurst bologna of immense proportions. ^D -
Sometimes people want to be too smart. If you want to consume a handful of different RESTful APIs, it might make sense to abstract away some common functionality in your client implementation, yet to assume they all follow the same convention in how their URIs are built is borderline insane.
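The kind of assumption I mean, roughly sketched (my reconstruction, not the actual class):

// one base class quietly assumes every API builds its URIs the same way
abstract class RestClient {
  constructor(protected baseUrl: string) {}

  protected uri(resource: string, id: string, version = "v1"): string {
    return `${this.baseUrl}/${version}/${resource}/${id}`; // the baked-in convention
  }
}

// the moment one API moves the version into a header or renames a path segment,
// every client inheriting uri() shares the breakage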
All I wanted to do was change one API to a newer version, and now the implementation breaks for at least two others because it was done in an abstract class, and now I have to untangle that mess.
In some cases code duplication wouldn't be that bad. Even if an otherwise unrelated API seemingly shares the same contract, still assume it has its own contract. You never know how those APIs evolve, and I proclaim they will evolve towards breaking your assumptions. -
I used to think that programming was just straight forward coding what you need.
But now I think it's describing the problem and writing code to solve that problem.
Example: recursive function. It calls itself till it finds a solution or till no options are left. You don't know the answer, but you code something that can find it for you.
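A tiny sketch of what I mean (my own toy example, not from anything real): you don't hard-code the answer, you code the search for it.

// find a subset of nums that adds up to target, or null if none exists
function findSubset(nums: number[], target: number): number[] | null {
  if (target === 0) return [];         // solution found
  if (nums.length === 0) return null;  // no options left
  const [head, ...rest] = nums;
  const withHead = findSubset(rest, target - head);
  if (withHead !== null) return [head, ...withHead];
  return findSubset(rest, target);     // try again without the first number
}

console.log(findSubset([3, 9, 8, 4], 12)); // [3, 9] is one possible answer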
Or PHP: you don't create every single HTML page yourself; that's done by PHP dynamically.
The great thing is, it's less work and it is easier to catch error scenarios.
The bad thing is it has become a bit more abstract. -
I took on a very badly maintained project. You could see that the devs never talked to each other: there was repeated code everywhere, mixingCamelCase with_snake_case, functions that did two very different things, and two functions that did almost exactly the same thing. The frameworks being used were a couple of years old (jQuery and its crew) but we wanted to migrate to the more modern ones (Vue and its crew). Instead of nice row-based aggregates in SQL, they preferred to loop through the response and fire off N^2 SQL requests. On top of that, the company was changing its target market, so we wanted to make the code more abstract to fit different customers. To reflect this, they wanted to change the names of the core models.
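To make the aggregates point concrete, it was roughly the difference between these two (the Db type is a stand-in I made up, not their actual helper):

type Line = { order_id: number; amount: number };
type Db = { query: (sql: string, params?: unknown[]) => Promise<any[]> };

// what the code did: one round trip per order, the classic N+1 pattern
async function totalTheSlowWay(db: Db, orderIds: number[]): Promise<number> {
  let total = 0;
  for (const id of orderIds) {
    const lines: Line[] = await db.query("SELECT * FROM order_lines WHERE order_id = ?", [id]);
    for (const line of lines) total += line.amount;
  }
  return total;
}

// what a single aggregate query could have done instead
async function totalsWithOneAggregate(db: Db): Promise<any[]> {
  return db.query("SELECT order_id, SUM(amount) AS total FROM order_lines GROUP BY order_id");
}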
Oh and did I say that I was the only competent dev in charge of this? The rest were interns. -
Damnnn, my team lead is hinting that I should write a test for the feature I paired on with a team member.
But the large Django code base is riddled with abstract classes, more classes, inheritance etc... it's going to be a long night -
abstract class Ich {
    abstract int support(CoWorker coWorker);
    abstract int programm(List<Task> tasks);
    abstract int analyseCode(String code); // mostly horrible code
    abstract void drinkCoffee(Mug mug);
    abstract void extinguishFireAndKillBugs(String moreLegacy);
    abstract void gotoHome();
    int myMood;
}
...
void work() {
    ich.drinkCoffee(mug);
    while (isWorkingTime()) {
        int rand = new Random().nextInt(100);
        if (rand < 5) {
            ich.myMood += ich.programm(tasksForMe);
        } else if (rand >= 5 && rand < 20) {
            ich.myMood -= ich.analyseCode(legacyCode);
        } else if (rand >= 20 && rand < 40) {
            ich.myMood -= ich.support(getNextGuyToMe());
        } else {
            ich.extinguishFireAndKillBugs(moreLegacyCode);
        }
        if (ich.myMood <= 0) {
            ich.gotoHome();
        }
    }
} -
Looking at @striker28's rant made me think of the time I did my MSc, and I think it needs its own separate rant, so here it goes:
So I did an MSc at one of the big league unis in London. The first clue was during week 1, when in one of the classes a mature student asked whether there would be actual coding during the course. There was an audible gasp from everyone else! Once the lecturer said that unfortunately there wouldn't be, you could hear the sigh of relief from the students...
Next up was all the lectures being placed in the freakin' basement of the university in crap, smelly rooms with annoying ticking A/Cs, whereas all the social sciences, business and other subjects had lecture halls and classrooms above ground. The contempt for CS from the university's direction was palpable.
Then there was the relegation of the theory-only (i.e. abstract with pen/paper) "tutorials" to the hands of TAs with bugger-all teaching experience. In short, most were terrible and should've found a way to excuse themselves from this obligation, which unfortunately was part of the terms of their PhD grants.
Further into the course there was the "group project". Oh boy! Out of the 5 in the group, my mature-student friend and I were the only ones committing to the repo. There was either no code and a lot of bullshit from the others, or crap code that didn't even compile despite their assurances it was all good. Some clearly had never actually coded and pressed "run" in their lives, which is fucking surprising since they'd managed to graduate with a BSc and get into an MSc somehow. None of the code "made" by the other 3 people made it into the master branch for release.
The attitude was that of "We (hahahah) wrote loads of code. We'll get a great mark!". At that stage the core wasn't even complete and the software didn't work yet.
Some of the courses were teaching things already 10 years out of date, and when lecturers were pressed on that by the few mature students who happened to be there, the answer was always "yes, we are planning to update it for next year". Complete bullshit. It didn't help that some of the code on the lecture slides wasn't even correct! I mean, these guys are touted as "experts" in their field...
None of the theory during the entire year was linked to any coding. Everything was abstract with no ties to applied software engineering. I.e. nothing like the real world.
The worst part is that none of the younger students realised they were being screwed over and getting very little value for their money. That's perhaps one reason why those evaluation forms get such high scores. If you haven't had a job and haven't lived outside academia yet, there is nothing to compare it to. It also tends to fall into confirmation bias (hey, it's a top UK university, it must be worth it after all! Look how much they ask for).
By the end of the year I couldn't wait to get the hell out. One of the other mature students summed it up quite well: "I will never send my children here."
Keep in mind that the guy had just over a decade of software engineering experience in the industry and was doing this for fun.
In the end, universities are not teaching institutions. The lecturers' primary job is research and their priorities match that. Lectures tend to be the most time-efficient teaching format for the ones giving them but, on their own, not for the consumer.
To those contemplating university for CS: Do the BSc. Get your algo/datastructure chops and learn the basic theory. It is interesting. Don't get discouraged by the subject just because it is taught badly.
Avoid the MSc unless you want to do a PhD and go for an academic career. You are better off using that year and the money to learn more on your own and get into collaborative projects (open source) on top of some personal ones. Build up your portfolio. It will be cheaper and more interesting! -
I don't get it. Please give me one good reason to use Mongoose with MongoDB.
Once upon a time it might have made sense to use a schema for the db. Today the native driver supports schemas and can check them on insert. Never mind that one should validate the data before it hits the db anyway. I listened to a 1 hr podcast last week where one of the maintainers tried to give reasons why it might be a good idea to use Mongoose, and he failed miserably.
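For reference, this is roughly what server-side validation with the plain Node driver looks like (a minimal sketch; the collection and fields are made up):

import { MongoClient } from "mongodb";

async function setup(): Promise<void> {
  const client = new MongoClient("mongodb://localhost:27017");
  await client.connect();
  const db = client.db("app");

  // the server rejects inserts/updates that don't match the schema, no ODM needed
  await db.createCollection("users", {
    validator: {
      $jsonSchema: {
        bsonType: "object",
        required: ["email", "createdAt"],
        properties: {
          email: { bsonType: "string" },
          createdAt: { bsonType: "date" },
        },
      },
    },
  });

  await db.collection("users").insertOne({ email: "a@b.c", createdAt: new Date() });
  await client.close();
}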
It introduces dependencies that are useless, it doesn't really abstract anything useful from the native driver, its TS support is shit and I don't like the API.
Every time I see someone use it, they either fail or don't explain at all why to use it. It's so redundant it makes me angry. We have enough abstraction already. We really don't need more code that doesn't provide value. Please just use Mongo the way the people of MongoDB intended it to be used. -
So how do you find motivation to finish a work project which is supposed to "go long"?
So, umm, this is weird, but I have been in this situation a few times and I am not sure if I deal with it correctly.
- the company proposes a brand new feature: a feature which never existed in the product before.
- they have high-level directions (both business logic and technical) on how it's supposed to be built.
- they set vague but comfortable timelines (20-30 days) to complete it.
- they assign me as the main dev for frontend, some guy x for backend, some guy y for the parallel frontend (iOS/web), and kinda forget about us.
- the business requirements evolve/get clarified as we go on building the product; the backend keeps providing evolving APIs which get stable over time.
- the business people signal that yeah, there is no pressure, and they won't mind extending this for release, as other systems will "obviously" be taking time.
- our feature (and the folks on the new feature) is sidelined, and we are rarely talked about until we reach those deadlines, at which point we are questioned.
I... am not a powerful performer in these situations. Adding a new feature requires solving some major problems again and again, while solving smaller problems too, so that the product finally takes shape. For example:
1. I will start fast by adding all the possible screens, their abstract code, their navigation logic, their XMLs etc.
2. Then, based on the designs, I will try to implement the designs a bit.
3. Then, once the APIs arrive, I start adding them and modify the logic to handle those.
4. Meanwhile many smaller problems come up, like when sending an image from one screen to the previous screen the thumbnail doesn't show up, so I spend 5+ hours ensuring that it works precisely. Or how I could make 3 API calls async and make the upload flow better.
5. This goes on for days, until I and other people start to realise that my project is nowhere near completion.
I keep getting distracted from the original goal of making a working POC first and then fixing the nuances. -
So I'm writing my compiler and I decide to test error handling, see if I'm catching unexpected tokens and whatnot. I try duplicating a semicolon at the end of a line; surely it'll give me an error, since that's an unexpected token, isn't it? So I run the compiler and... no errors? I start debugging for a few minutes, snoop around, everything seems ok... "Huh, that's weird", and then it dawns on me: a semicolon only marks the end of a statement. So, technically, it's not an unexpected token if you have an empty statement (which wouldn't break any rules about statements). I decide to try out my theory. I put ;;;;;;;; at the end of a random line in my Rust code, hit compile and... it compiles! So that means it is not a bug anymore! I mean, if the big guys who actually know a tad about language design, compilers and all that cool stuff allow it in their languages, why shouldn't mine? So I did it, I turned a bug into a feature and now I can go to sleep in peace and stop dreaming about fucking abstract syntax trees (don't mind my kinks >:) ).
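The same reasoning holds in most C-family grammars, by the way, e.g. in TypeScript:

let x = 1;;;;   // parses fine: every extra ';' is just an empty statement
while (false);  // even a bare ';' can act as a (useless) loop body
console.log(x);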
Yeah anyways thanks for reading, till next time! Bye! -
The number of concurrent transformations impacting more than half of the codebase in Orchid surpassed 4, so instead of walking the reference graph for each of these I'm updating the whole codebase, from lexer to runtime, in a single pass.
In this process, I also got to reread a lot of code from a year ago. This is the project I learned Rust with. It's incredible, not just how much better I've gotten at this language, but also how much better I've gotten at structuring code in general.
Interestingly though, my problem-solving ability seems to be the same. I can tell this by looking at the utilities I made to solve specific well-defined abstract problems. I may have superficial issues with how the code is spelled out in text, but the logic itself is as good as anything I could come up with today. -
Decided to learn C# after learning C; it's a goddamn nightmare. I get that C#, like C++, wants to abstract things and provide supposedly easier/shorter ways to write code, but honestly I abhor both languages.
Are there any true alternatives that don't focus too much on OOP, or aren't bloated to hell? -
Somewhere in our application backend we generate a simple bullet chart. But in the most complicated way possible.
We call a web service to retrieve it (yes, a simple bullet chart). The service requires some parameters, and the code that generates them is hidden behind a wall of interfaces and abstract methods (the best and apparently only way to get to the actual code is to debug it).
However, one of these parameters is plainly visible, and it is a string containing an (uncommented) JavaScript function that manipulates the resulting chart, adding some final touches. With hardcoded values etc.
Dear programmers, I know we should avoid reinventing the wheel, but sometimes we should stop and consider the possibility that we are using the wrong wheel, and in a completely wrong/obscure way. Thank you.
Yours WhoeverWillMaintainTheCode -
Had to refactor and abstract some code into an Angular 2 component so it can be reused by another. Well, I could have just copied and pasted the code from component 1 to 2; that would have been a lot faster than making this piece of code separate. But the latter is better: it removes code duplication and your code reads better.
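Something in the spirit of this (a hypothetical minimal version, not the actual component):

import { Component, Input } from "@angular/core";

// both parent components can now drop this in and pass their own data,
// instead of each keeping a copy-pasted template and logic
@Component({
  selector: "app-shared-widget",
  template: `<span class="widget">{{ label }}</span>`,
})
export class SharedWidgetComponent {
  @Input() label = "";
}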