Search - "pattern"
-
I've been fairly lucky with my bosses of late since I've progressed in my programming career. But my absolute worst boss was when I first started working in an office environment doing data entry. My boss at the time was terrible, and she was always against innovation or process improvement. She also always tried to make herself look good and take credit for the accomplishments of others. If she screwed up it was your fault, and she was "always buried in email" so she could never respond to you for PTO requests or escalation of issues between departments. My whole family pretty much worked in various roles in the department, and after my mother left the company she fired my brother for no reason, saying he was "sleeping", but I worked right next to him: he's tall and had to slouch just to comfortably see his computer screen, since the same manager refused to approve workstation improvements for him.
Our workflow was to receive daily spreadsheets of health care claims that we had to manually process and enter into the system. So being the lazy innovator that I am, and trying to find ways I can work efficiently, I delved into studying Visual Basic and programmed a few functions and tools in Excel to analyze, highlight, and process some of the data, since the claims on the spreadsheets always had a specific pattern. This was all before I had any formal education in computer science, so the program was very basic and clunky, but it tripled my efficiency. When I brought it up to my boss to spread it among the rest of our team so they could use it after a short 20 minute training, she struck it down, saying any training or use of it would be a waste of resources since it was too technical and complex to be used, and that if I were to keep improving it or use it I would be fired. It was literally copy and paste from one spreadsheet to the other en masse and clicking a button to sort and fill in the blanks. Eventually I showed it to the director of the department when working on a large data entry project with her, and I was later offered a job as a technical analyst where I was responsible for the codebase that generated the reports for the department, including all the reports my old boss used, which is where I would occasionally mess with her to get back at all the crap she gave me and my brother. Since all the reports were blind carbon copied to everyone, I would send out her reports on a delay while everyone else got them on time. It eventually got her in so much crap she had to step down as a manager. She still works in the same company that I started working at again earlier this year, and like the many careers she's ruined, she eventually ruined her own within the company 😂
-
!rant
Finally finished the blanket I’ve spent a month crocheting without a pattern after teaching myself to crochet at, like, the beginning of the month. It’s huge (this is it lying on a king-sized bed) and I made so many mistakes that seem super obvious now, but I’m still weirdly proud of it.
-
!rant I think I may have a way smaller head than the pattern I was using seemed to assume. 😅
(Or my stitches got messed up along the way. Either is possible.)
-
Remember to regularly defragment your drives on Linux. Use this handy command.
dd if=/dev/zero of=/dev/sdX bs=1M
Terminology:
dd: Disk defragment
if: input file (the pattern to search for, and should always be /dev/zero)
of: output file (your disk, /dev/sda for instance)
bs: block size, 1M is fine here
-
Look at that. The very fucking smart colleague spent 40 days implementing a repository pattern (WHEN WE'RE USING AN ACTIVE RECORD ORM), breaking stuff left and right. Does he use that fucking pattern at the very least?
Of course he doesn't. And along the way he's making sure to create conflicts with the stuff he broke (and I'm fixing). By the time I fix the merge conflicts of one commit, he's pushed 6 more.
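For anyone who hasn't met this particular flavour of pain, here's a tiny sketch (Python, made-up names, not the colleague's code) of what wrapping an active-record ORM in a repository usually amounts to: every repository method just forwards to something the model can already do itself.

class ActiveRecordModel:
    # stand-in for an active-record base class (think Django/Eloquent style)
    _rows = {}

    def __init__(self, id, name):
        self.id, self.name = id, name

    def save(self):
        type(self)._rows[self.id] = self      # the model persists itself

    @classmethod
    def find(cls, id):
        return cls._rows.get(id)              # ...and queries itself

class User(ActiveRecordModel):
    pass

class UserRepository:
    # the 40-day layer: pure forwarding, no added behaviour
    def find(self, id):
        return User.find(id)

    def save(self, user):
        user.save()

User(1, "alice").save()
print(User.find(1).name)               # active record alone already covers this
print(UserRepository().find(1).name)   # the repository only adds indirection
-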
Today on "How the Fuck is Python a Real Language?": Lambda functions and other dumb Python syntax.
Lambda functions are generally passed as callbacks, e.g. "myFunc(a, b, lambda c, d: c + d)". Note that the comma between c and d is somehow on a completely different level than the comma between a and b, even though they're both within the same brackets, because instead of using something like, say, universally agreed-upon grouping symbols to visually group the lambda function arguments together, Python groups them using a reserved keyword on one end, and two little dots on the other end. Like yeah, that's easy to notice among 10 other variable and argument names. But Python couldn't really do any better, because "myFunc(a, b, (c, d): c + d)" would be even less readable and prone to typos given how fucked up Python's use of brackets already is.
And while I'm on the topic of dumb Python syntax, let's look at the switch, um, match statements. For a long time, people behind Python argued that a bunch of elif statements with the same fucking conditions (e.g. x == 1, x == 2, x == 3, ...) are more readable than a standard switch statement, but then in Python 3.10 (released only 1 year ago), they finally came to their senses and added match and case keywords to implement pattern matching. Except they managed to fuck up yet again; instead of a normal "default:" statement, the default statement is denoted by "case _:". Because somehow, everywhere else in the code _ behaves as a normal variable name, but in match statement it instead means "ignore the value in this place". For example, "match myVar:" and "case [first, *rest]:" will behave exactly like "[first, *rest] = myVar" as long as myVar is a list with one or more elements, but "case [_, *rest]:" won't assign the first element from the list to anything, even though "[_, *rest] = myVar" will assign it to _. Because fuck consistency, that's why.
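To make that concrete, here's a minimal snippet (mine, not from the rant; needs Python 3.10+) showing the two behaviours side by side:

my_var = [10, 20, 30]

# inside match, _ is a wildcard: it matches anything and binds nothing
match my_var:
    case [_, *rest]:
        print(rest)          # [20, 30]
    case _:
        print("default")     # the "case _:" default arm mentioned above

# in plain unpacking, _ is just an ordinary variable name and does get bound
[_, *rest] = my_var
print(_, rest)               # 10 [20, 30]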
And why the fuck is there no fallthrough? Wouldn't it make perfect sense to write
case ('rgb', r, g, b):
case ('argb', _, r, g, b):
case ('rgba', r, g, b, _):
case ('bgr', b, g, r):
case ('abgr', _, b, g, r):
case ('bgra', b, g, r, _):
and then, you know, handle r, g, and b values in the same fucking block of code? Pretty sure that would be more readable than having to write "handleRGB(r, g, b)" 6 fucking times depending on the input format. Oh, and never mind that Python already has a "break" keyword.
Speaking of the "break" keyword, if you try to use it outside of a loop, you get an error "'break' outside loop". However, there's also the "continue" keyword, and if you try to use it outside of a loop, you get an error "'continue' not properly in loop". Why the fuck are there two completely different error messages for that? Does it mean there exists some weird improper syntax to use "continue" inside of a loop? Or is it just another inconsistent Python bullshit where until Python 3.8 you couldn't use "continue" inside the "finally:" block (but you could always use "break", even though it does essentially the same thing, just branching to a different point).19 -
"So Alecx, how did you solve the issues with the data provided to you by hr for <X> application?"
Said the VP of my institution in charge of my department.
"It was complex sir, I could not figure out much of the general ideas of the data schema since it came from a bunch of people not trained in I.T (HR) and as such I had to do some experiments in the data to find the relationships with the data, this brought about 4 different relations in the data, the program determined them for me based on the most common type of data, the model deemed it a "user", from that I just extracted the information that I needed, and generated the tables through Golang's gorm"
VP nodding and listening intently...."how did you make those relationships?" me "I started a simple pattern recognition module through supervised mach..." VP: Machine learning, that sounds like A.I
Me: "Yes sir, it was, but the problem was fairly easy for the schema to determ.." VP: A.I, at our institution, back in my day it was a dream to have such technology, you are the director of web tech, what is it to you to know of this?"
Me: "I just like to experiment with new stuff, it was the easiest rout to determine these things, I just felt that i should use it if I can"
VP: "This is amazing, I'll go by your office later"
Dude speaks wonders of me. The idea was simple, read through the CSV that was provided to me, have the parsing done in a notebook, make it determine the relationships in the data and spout out a bunch of JSON that I could use. Hook it up to a simple gorm golang script and generate the tables for that. Much simpler than the bullshit that we have in php. I used this to create a new database since the previous application had issues. The app will still have a php frontend and backend, but now I don't leave the parsing of the data to php, which quite frankly, php sucks for imho. The Python codebase will then create the json files through the predictive modeling (98% accuracy) and then the go program will populate the db for me.
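For illustration only, a stripped-down sketch of what that notebook step could look like; the file name, column heuristics and JSON layout here are made up, and the real thing used a trained model rather than these crude rules:

import json
import pandas as pd

df = pd.read_csv("hr_export.csv")        # hypothetical CSV from HR

schema = {}
for col in df.columns:
    sample = df[col].dropna()
    if pd.api.types.is_numeric_dtype(sample):
        kind = "numeric"
    elif sample.astype(str).str.contains("@").mean() > 0.5:
        kind = "email"
    else:
        kind = "text"
    schema[col] = {"kind": kind, "nullable": bool(df[col].isna().any())}

# JSON out, for the Go/gorm script to turn into tables
with open("schema.json", "w") as f:
    json.dump({"model": "user", "columns": schema}, f, indent=2)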
There are also some node scripts that help test the data since the data is json.
All in all a good day of work. The VP seems scared since he knows no one on this side of town knows about this kind of tech. Me? I am just happy I get to experiment. Y'all should have seen his face when I showed him a rather large app written in Clojure, the man just went 0.0 when he saw Lisp code.
I think I scare him.
-
I am soooooooooo fucking stupid!
I was using an np.empty instead of np.zeros to shape the Y tensor...
🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️🤦🏻♀️
No fucking wonder the poor thing kept failing to detect a pattern.
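For anyone who hasn't been bitten by this: np.empty only allocates, it doesn't initialise, so the array starts out holding whatever bytes happened to be in that memory. Quick demo (mine):

import numpy as np

y_empty = np.empty((2, 3))   # uninitialised: arbitrary leftover values
y_zeros = np.zeros((2, 3))   # what was actually intended here

print(y_empty)               # may look like garbage (or coincidentally zeros)
print(y_zeros)               # always [[0. 0. 0.] [0. 0. 0.]]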
Let's see if it works now... *play elevator music while we wait for keras*
-
I've been working on a proof of concept for my thesis for a few days and the async query calls drove me nuts for quite a while. I finally managed to deliver all query results asynchronously while still very much relying on a strong architectural design pattern. I am filled with caffeine, joy and a sense of pride and accomplishment.
-
I feel like I need to clarify the concept of toxic masculinity and toxic femininity.
The masculinity itself is not toxic! Being a masculine man is not being toxic. Being a man is not being toxic.
Toxic masculinity, in a nutshell, is:
- Teaching boys to never express their feelings. Men don't cry. You should always maintain the “tough” image. If you open up about how you feel, you're a pussy. Domestic abuse of men doesn't exist. A man can't be raped by a woman.
- You should only depend on yourself. Even if you're in trouble (say, with depression or bullying), and you ask for help, you're a pussy.
- Boys will be boys. Aggression is typical for men, and expressing it by beating other men is manly behaviour.
There is also toxic femininity:
- Men should work and provide for the women. Women shouldn't work, they should instead be housekeeping and raising kids.
- Women should be pretty and work on their looks (to attract men).
- If you don't have kids by the age of 30, there is something wrong with you.
It almost seems like traditional grotesque gender roles diminish the personality for the sake of social conformity. The pattern is always “men should”, “women should”. They tell you what to do, authoritatively so, based on your biological sex. They try to “put you in your place” where you “belong” just because of your genitals. This is toxic.
It is important to retain personality. The ultimate goal is to get rid of those stereotypes and finally throw them in the garbage bin where they belong. Because of them, we have anorexia in women (the most deadly mental disorder), and also male suicides through the roof.
Before you label me “feminist”, bear in mind that the third wave is all over the place, to the point they can't agree on what feminism is.
-
Why does MS need to be such a scumbag with Windows updates?
Every now and then, this unskippable blue setup screen appears and forces the user to make some decisions.
"Do you want to set Edge as the default browser?"
"Do you want a 360 subscription?"
The usual crap.
But it's not skippable!
You have to make a decision and the UI for "fuck off" is different for every decision.
You can't just press the Nope button every time.
It's fucking deliberate. They want you to spend time reading their shit and force it down your throat.
And let's not forget about people who don't know computer stuff very well and are confused by this. They then call us because "the computer isn't working again."
And you can't tell them to skip this slimy rotten vomit of a marketing weasel because you need them to tell you what the options are for each fucking decision screen.
😫
-
I am so sick and tired of hearing “AI” everywhere all the time. Yeah, how about we integrate some AI into your super smart toaster so that it knows when to start warming up before you put toast in it in the morning.
Not even mentioning all these idiots being like “oh yeah AI is becoming sentient. Oh yeah AI is gonna take over the world”.
Brother, the current state of AI is just machine learning; it's a stupid pattern detector and generator, it doesn't have thoughts or emotions. Please just stop it.
-
Seriously, a new guy joined our team and suddenly I'm out of my comfort zone and back to the pattern I used to follow. The thing he did: he commented on my PR, a lot of comments.
I had this thing that, hey, now I can control anything, right? New guy? Less experienced? Yes, so I don't need to be intimidated. But I realised today that I'm easily intimidated by intelligent people, because now I think I am the inferior one.
I will push myself to think about it in a better way, by looking at it positively, to learn something from it.
-
I've assembled enough computing power from the trash. Now I can start to build my own personal 'cloud'. Fuck I hate that word.
But I have a bunch of i7s, and i5s on hand, in towers. Next is just to network them, and setup some software to receive commands.
So far I've looked at Ray, and Dispy for distributed computation. If theres others that any of you are aware of, let me know. If you're familiar with any of these and know which one is the easier approach to get started with, I'd appreciate your input.
The goal is to get all these machines up and running, a cloud thats as dirt cheap as possible, and then train it on sequence prediction of the hidden variables derived from semiprimes. Right now the set is unretrievable, but theres a lot of heavily correlated known variables and so I'm hoping the network can derive better and more accurate insights than I can in a pinch.
Because any given semiprime has numerous (hundreds of known) identities which immediately yield both of its factors if say a certain constant or quotient is known (it isn't), knowing any *one* of them and the correct input, is equivalent to knowing the factors of p.
So I can set each machine to train and attempt to predict the unknown sequence for each particular identity.
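For what it's worth, the Ray route can be pretty minimal once the boxes are networked. A sketch with placeholder names (train_on_identity is hypothetical, and it assumes a cluster already started with ray start on the head and worker machines):

import ray

ray.init(address="auto")       # attach to the running cluster; drop the address to test locally

@ray.remote
def train_on_identity(identity_id, semiprimes):
    # placeholder for fitting one model per algebraic identity
    return identity_id, len(semiprimes)

# fan one task per identity out to whichever machine is free
futures = [train_on_identity.remote(i, list(range(10_000))) for i in range(8)]
print(ray.get(futures))        # gather results from all nodes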
Once the machines are set up and I've figured out which distributed library to use, the next step is to set up Keras and train the model using, say, all the semiprimes under one to ten million.
I'm also working on a new way of measuring information: autoregressive entropy. The idea is that the prevalence of small numbers when searching for patterns in sequences is largely ephemeral (theres no long term pattern) and AE allows us to put a number on the density of these patterns in a partial sequence, but its only an idea at the moment and I'm not sure what use it has.
Heres hoping the sequence prediction approach works.
-
Weekly Group Rant suggestion: What anti-pattern exists that still keeps being propagated or infecting other areas of your code base (like a virus)?
Code samples/screen-shots required.
-
I had some fun with ChatGPT today. I wondered how good its problem solving skills are. Turns out, it's no better than an entry/junior dev armed with all the docs out there - it knows what's written there, how to use the thing (language/framework/tool/etc.), but it has no "understanding" of either the problem or the tool in a holistic way. It's got the knowledge, but it has neither the skill nor the understanding of how/why to use it to solve a problem (any problem beyond plain simple complexity).
So the problem I asked it to solve was related to this one I had: https://devrant.com/rants/6312527 .
It was painful to troubleshoot this problem with ChatGPT. It kept on focusing on this particular problem and reacting to errors while trying to fix its initial solution. It took us a good while. Eventually, it reached a working solution, but it was an ugly, convoluted approach that was not feasible to cover my use case with.
FWIW I think it is interesting to follow its line of thought. Eventually, a pattern emerges of how it tries to solve the problem. And it reminds me a lot of myself on the first week in the IT field :)
-
Is the way people solve problems intrinsic to the native language they learned growing up? Can the shape of our thoughts be optimal for solving certain kinds of problems? Like sentence structure, grammar, etc.
If the pattern of thoughts a language promotes can help us solve problems, then is there a spoken language that can help promote solving computer science problems?
I know I have to work to think differently to program in different styles of programming. I wonder if we can learn from different spoken languages patterns of logic that are applicable to engineering.
Mathematics, while not a spoken language, has helped me re-frame things in programming. I think programming has also helped in other areas, like using binary search to find the end of a pipe in the ground.
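That pipe example translates to code pretty directly; a toy version (my own, with a fake detector function standing in for the real probe) looks like this:

def find_pipe_end(has_pipe, lo, hi):
    # has_pipe(x) -> True if the detector senses pipe at position x (metres)
    while hi - lo > 0.5:             # stop once we're within half a metre
        mid = (lo + hi) / 2
        if has_pipe(mid):
            lo = mid                 # pipe continues past mid
        else:
            hi = mid                 # the end is before mid
    return lo

print(find_pipe_end(lambda x: x <= 37.2, 0, 100))   # ~37
-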
First of all, merry christmas to everyone on devrant.
Second, another interesting paper--this time on pattern classification using piecewise linear functions vs classic spiking neural nets.
Supposedly it was a *six million* percent improvement in computation time, versus the spiking simulation. Thats my five minute overview of the document anyway.
Highly unusual application (hadn't seen it done before now but maybe I'm unfamiliar). Check it out:
https://link.springer.com/chapter/...
-
Do you use the well-known "pattern" for positioning fingers on the keyboard for faster typing, or did you just come up with your own "pattern" as time passed?
-
!rant
...i realized i can actually pattern-match like this (as in, sequence of elements (including "whatever") instead of just head::rest in F#...
...from watching a talk about prolog.
like "wait... prolog can do this when pattern-matching? that seems very useful. i think i tried to do that in F# but it didn't work, which seems stupid... I'm gonna go try it again".
and sure enough =D
i think i really am gonna like F# if i find the time and resolve to break through how its different mode of thinking stretches my brain in ways it hasn't been stretched for a long time =D
-
Back from the dead with more vaguely-obscure technical bullshit
Working on a chatbot for my BS-CS. Almost done with college, so the assignment is to make a bot that recommends you a CS career. Cool.
I get through making a joint personality and skill-interest quiz that gives you number grades on different spectra. So far, so good. But this project has to be done entirely in pandorabots' online editor. So no scripting. Zero scripting. 100% markup language. That means to even do math, you need to copy a standard library off GitHub.
I mean, that's fine and all, but the syntax is just atrocious, because everything in AIML is input->response. If you ask the bot "what is 5+5?" you must have it go:
- recognize pattern WHAT IS * + *
-> redirect -> XADD * XS *
-> do math -> recurse result
-> 10
uncomfy. Plus, variables can only be accessed through <get> and <set> tags. But manageable.
So here's where the story becomes a rant.
In the standard docs, there's all these math functions, and they work. There's also logic.
And then there's this fucker
XIF [ * ] XS [ * ]
Which has no documentation and just doesn't work. No idea what the brackets mean. Tried putting in TRUE, tried putting in true math statements (5 XEQ 5), tried putting in recursion tags to trick it, tried everything. It just ignores it.
There is not a single comment, stackOverflow post, or youtube video that even acknowledges the existence of this thing.
So unless I want to convert the entire logic of my program into nested SWITCH statements with the <condition> tag, I'm just fucked.
The icing on the cake is, I go to tech support on Pandorabots to ask for help with this. What do they have except a chatbot to cheerfully tell me that no humans are around to help me right now?
gonna have to build an entire fuckin Turing machine in markup tags to calculate whether x = 3
(:
-
Fun fact:
The gradual speed increase in the descent pattern of the aliens in Space Invaders was actually an accident rather than a design decision: with fewer aliens left on the screen, the hardware simply redrew the remaining ones faster, and the behaviour was kept in.
The more you kill, the faster they get.
-
Today was a bad dev day working on a shitty React project. Not that React in itself is bad, but it can be hell to work with when the code is a big pile of crap full of anti-pattern code. I spent the day refactoring to try to fix a bug, but to no avail. It would take days if not weeks to put some order in this mess and to prevent such bugs.
-
In the 90s most people had touched grass, but few touched a computer.
In the 2090s most people will have touched a computer, but not grass.
But at least we'll have fully sentient dildos armed with laser guns to mildly stimulate our mandatory attached cyber-clits, or alternatively annihilate thought criminals.
In other news my prime generator has been exhaustively checked against all primes from 5 to 1 million. I used miller-rabin with k=40 to confirm the results.
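For context, the referee in that check is the standard Miller-Rabin test: an odd composite survives a single round with probability at most 1/4, so k=40 rounds bounds the false-positive chance at about 4^-40. A plain-vanilla version (just the checker, not the generator itself) looks like:

import random

def miller_rabin(n, k=40):
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:              # write n - 1 as d * 2^r with d odd
        d //= 2
        r += 1
    for _ in range(k):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False           # found a witness: definitely composite
    return True                    # probably prime

print(miller_rabin(1_000_003))     # True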
The set the generator creates is the join of the quasi-lucas carmichael numbers, the carmichael numbers, and the primes. So after I generated a number I just had to treat those numbers as 'pollutants' and filter them out, which was dead simple.
Whats left after filtering, is strictly the primes.
I also tested it randomly on 50-55 bit primes, and it always returned true, but that range hasn't been fully tested so far because it takes 9-12 seconds per number at that point.
I was expecting maybe a few failures by my generator. So what I did was I wrote a function, genMillerTest(), and all it does is take some number n, returns the next prime after it (using my functions nextPrime() and isPrime()), and then tests it against miller-rabin. If miller returns false, then I add the result to a list. And then I check *those* results by hand (because miller can occasionally return false positives, though I'm not familiar enough with the math to know how often).
Well, imagine my surprise when I had zero false positives.
Which means either my code is generating the same exact set as miller (under some very large value of n), or the chance of miller (at k=40 tests) returning a false positive is vanishingly small.
My next steps should be to parallelize the checking process, and set up my other desktop to run those tests continuously.
Concurrently I should work on figuring out why my slowest primality tests (theres six of them, though I think I can eliminate two) are so slow and if I can better estimate or derive a pattern that allows faster results by better initialization of the variables used by these tests.
I already wrote some cases to output which tests most frequently succeeded (if any of them pass, then the number isn't prime), and therefore could cut short the primality test of a number. I rewrote the function to put those tests in order from most likely to least likely.
I'm also thinking that there may be some clues for faster computation in other bases, or perhaps in binary, or inspecting the patterns of values in the natural logs of non-primes versus primes. Or even looking into the *execution* time of numbers that successfully pass as prime versus ones that don't. Theres a bevy of possible approaches.
The entire process for the first 1_000_000 numbers, ran 1621.28 seconds, or just shy of a tenth of a second per test but I'm sure thats biased toward the head of the list.
If theres any other approach or ideas I may be overlooking, I wouldn't know where to begin.
-
I don't know what to think of Vue 3's Composition API anymore. At first I hated it because it's nothing but one big ole rip-off of React, and I hate React so much; its hook system is the most disgusting anti-pattern I've ever seen in the entire JS ecosystem. This gave me the incentive to try out Svelte instead, but after doing so, I look back at Vue 3 and notice that they're kinda similar... why are so many JS devs allergic to classes? You can have much better written code that way. Idk, I'm waiting for vue-class-component and vue-property-decorator to fully migrate. In the meantime, if I'm gonna be forced to write composition-based code, I might as well use Svelte.
-
I don't care about market cap. Stick your hype-driven business practices up your ass. Infinite growth doesn't exist. I won't read your fucking books and attend your fucking bootcamps and MBAs. You don't have a business model. Selling data is not a business model. Fuck your quick-flip venture capital schemes, and especially fuck your “ethics”.
I will be the first alt-tech CEO. I only care about revenue. The real money, not capitalization bubble vaporware. You don't need a huge fleet of engineers if you're smart about your technology, know how to do architecture, and you're not a feature creep. You don't need venture capital if you don't need a huge fleet of engineers. You don't need to sell data if you don't need venture capital. See? See the pattern here?
My experience allows me to build products on entirely my own. I am fully aware of the limitations of being alone, and they only inspire lean thinking and great architectural decisions. If you know throwing capacity at a problem is not an option, you start thinking differently. And if you don't need to hire anyone, it is very easy to turn a profit and make it sustainable.
If you don't follow the path of tech vaporware, you won't have the problems of tech vaporware, namely distrust of your user base, shitty updates that break everything, and of course “oops, they raised capital, time to leave before things go south”.
A friend of mine went the path I'm talking about, developed a product over the course of four years all alone, reached $10k MRR and sold for $0.8M. But I won't sell. I only care about revenue. If I get to $10k MRR, I will most likely stop doing new features and focus on fixing all the bugs there are and improving performance. This and security patches. Maybe an occasional facelift. That's it. Some products are valued because they don't change, like Sublime Text. The utility tool you can rely on. This is my scheme, this is what I want to do in life. A best-kept secret.
Imagine 100 million users that hate my product but use it because there are no alternatives, 100 people in data enrichment department alone, a billion dollars of evaluation (without being profitable), 10 million twitter followers, and ten VC firms telling me what to do and what data to sell.
Fuck that. I'd rather have one thousand loyal customers and $10k MRR. I'm different, some call it a mental illness, but the bottom line is, my goals are beyond their understanding. They call me crazy. I won't say it was never about the money, of course it was, but inflating your evaluation is not “money”. But the only thing they have is their terrible hustle culture lives and some VC street wisdom, meanwhile I HAVE products, it is on record on my PH. I have POTDs, I have a fucking Golden Kitty nomination on health and fitness for a product I made in one day. Fuck you.
-
i had an epiphany today, in a discussion with the software architect of our new project.
i'm having the epic job to design & implement a prototype for a C++ library in a new software project and collected some inspiration in our "old" software, where i'm maintaining the module that fulfills the same functionality (i thought). i've been maintaining this module for around a year now. i analyzed the different features and stuff to consider and created a partial model of the new library.
when i showed it to the architect today, he was like "oh my god, no no no, you don't need all this functionality, this shall not be part of the new library!"
this was the moment when i realized how deeply fucked up the code base of the old module is.
imagine it like this:
you want to automate the process of making yourself a good ol' cup of coffee.
the reasonable thing would be to have
- a smart water boiler where you set parameters water temperature and amount of water to be fetched from the water supply
- a smart coffee bean grinder where you can set type of beans, amount of beans and grinding fineness
- a component where water and ground coffee are joined to brew the coffee, where parameters like duration, pressure etc. are set
- a milk tank where amount of milk, desired temperature and duration / speed of foaming can be set
- a sugar dispenser where amount of applied sugar can be set
- optionally, additional modules with spices, syrup, ice cubes, whatever for your very personal coffee experience
on requesting a coffee, you would then configure and orchestrate all components to your wishes to make you a fine cup of coffee. you can also add routines like "makeCappucchino()", "makeEspresso()", or whatever.
our software is not like this.
it is like this:
- a smart water boiler consisting of submodules that know how to cook water for e.g. "cappucchino with sugar" or for "espresso without sugar, but with milk and ice cubes"
- 5 smart bean grinders that know how to grind beans for e.g. cappucchino, espresso, latte macchiato and for 73ml of water preheated to 82°C
- a very smart sugar dispenser that knows how to add sugar to 95, 98 and 100°C coffee and to coffee made of BOTH coffee arabica AND coffee robusta beans.
etc. etc., i think you're getting the gist.
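If it helps to see the two shapes side by side, here's a toy sketch; the names are invented and have nothing to do with the actual library:

class WaterBoiler:
    def boil(self, ml, temp_c):
        return f"{ml}ml of water at {temp_c}°C"

class Grinder:
    def grind(self, beans, grams, fineness):
        return f"{grams}g of {beans}, ground at fineness {fineness}"

def make_espresso(boiler, grinder):
    # orchestration lives in the recipe, not inside the components
    return f"brew({boiler.boil(30, 92)}, {grinder.grind('arabica', 9, 8)})"

# the rotten shape: one hard-wired "smart" component per use case
class BoilerForCappuccinoWithSugar: ...
class BoilerForEspressoWithoutSugarButWithMilkAndIce: ...
class GrinderFor73mlOfWaterPreheatedTo82C: ...

print(make_espresso(WaterBoiler(), Grinder()))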
when i realized this, it was like, right in front of my eyes, this terrible pattern emerged like a foul, corrupted kaleidoscope of chaos, through the whole code base of this module.
i've known for a while how rotten to the core this code base is, but today i actually identified a really bad pattern that i hadn't realized before. the whole architecture is so bloated that it is hard to have an overview of the whole thing. and it would require a LOT of refactoring to repair this pattern.
but i guess it would also be infinitely satisfying because i could probably reduce the code base by 30% or something...
but unfortunately, this is never going to happen, because screw refactoring.
it's a great feeling to start this new library from scratch, tho...
-
Is noop actually a user, or is she someone's "second ++"? I noticed a pattern in her behavior, that she always ++'d my rants right after {she256}5d106eb069.
-
I was reading a few interesting PostgreSQL solutions to constrain polymorphic tables (anti-pattern, I know) when something caught my eye.
Some solutions were looked down upon (or looked favorably upon) based on their portability to other rdbms like MySQL.
Is that really so important? I get it, if somehow you decide to change from one to the other for some god-forsaken reason it will make the switch easier. But really, how often will that happen?
I feel there is a tendency to just avoid using SQL beyond the basics.
-
Having just been let go from a job I couldn't keep up with, I'm now confronted with a job search where every job description includes something like, "Must be able to multi-task across dozens of different projects per day as if you were six people in one."
I'm seeing a pattern here. Apparently, the productivity of only one person is all that is needed to run an entire department's worth of work.
-
!dev (feel free to ignore my rambling)
Fuck my piece of shit landlord that doesn't want me to provide the next tenant because they already have someone (without visiting; I believe they have some shady dealings going on, since I noticed a pattern with the last two flats that changed tenants) and doesn't give a shit about the kitchen I had built in / says "maybe you can leave it [for free] or maybeee the owner will take it but don't expect as much as 500 or 600€ and whatever the owner proposes is non-negotiable [...] if you wanna take it out we'll buy a new one" (i.e. fuck you, we'd rather pay 4k for a new one than give you the 2k it's worth)
(╯° □°) ╯︵ ┻━┻
aaaaargh this certainly doesn't help my stress levels which are already 11/10 with the flat search
-
It's a shame that people don't want to use F# but praise C# for how cool it has become and continues to become. At the same time, little do they know that many of the features were simply drawn from F#.
It's just ridiculous how far this OO and C-style syntax crap has progressed. They keep copying things from functional languages, making the initial language a monstrosity like C++ is now, instead of just using languages like F#. I mean, it was right there before C#: async/task, immutability, records, indexes, lambdas, non-null by default, who the hell knows what else.
Besides, many people (in my company at least) are just blindly overengineering with patterns and shit, where a simple function would be just enough.
Watch some NDC talks about F#, in particular those of Scott Wlaschin. It's just better in so many ways: less noise (I'm looking at you, brackets, commas and semicolons), a whole LOT of type inference and less duplication (just look at the C# signatures of LINQ methods - it's difficult to read them), immutability by default, non-nullable by default, ADTs and pattern matching, some neat features like type providers (how many times have you used "paste special" or an online tool to create C# classes from a JSON/XML file, and how many times have you regenerated it because of schema changes?) and units of measure.
Of course, in some cases it's not optimal; in some cases C#'s mutable data structures are better for performance. But dude, how many performance-critical systems have you written in C#? I mean, if it comes to performance you should use Rust or C++ or C after all.
*sighs*
-
I shall delay your PR with my giant egotistical scrotum by:
- Use less words for your function name
- Use this obscure pattern you have not learnt in your outdated compsci degree
- Each loop should get their own function
- Your function has too many parameters
- You must name every instance of magic numbers / strings
- Here, have another obscure test case to write
-
Theregister.com is wrestling with GPUs that need 700 watts of juice and how to cool them. 50 years ago I was reading an excellent magazine called "Electronics", and I remember that IBM came up with a scheme to absorb enormous amounts of heat from chips. You simply score the underside of the chip in a grid pattern and pump water through it. Hundreds of watts per degree Celsius can be removed. Problem solved.
-
Up all damn night making the script work.
Wrote a non-sieve prime generator.
Thing kept outputting one or two numbers that weren't prime, related to something called carmichael numbers.
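Side note on why Carmichael numbers sneak through: a Fermat-style check (a^(n-1) ≡ 1 mod n) passes for every base coprime to n when n is a Carmichael number, even though n is composite. 561 = 3 * 11 * 17 is the classic example; this little demo is mine, not the generator's code:

from math import gcd

n = 561
fooled = all(pow(a, n - 1, n) == 1 for a in range(2, n) if gcd(a, n) == 1)
print(fooled)        # True: every coprime base says "probably prime"
print(n % 3 == 0)    # True: but 561 is plainly composite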
Any case got it to work, god damn was it a slog though.
Generates next and previous primes pretty reliably regardless of the size of the number
(haven't gone over 31 bit because I haven't had a chance to implement decimal for this).
Don't know if the sieve is the only reliable way to do it. This seems to do it without a hitch, and doesn't seem to use a lot of memory. Don't have to constantly return to a lookup table of small factors or their multiple either.
Technically it generates the primes out of the integers, and not the other way around.
It takes 0.01 to 0.02 seconds per prime up to around the 100 million mark, and then it gets into the 0.15 to 1 second range per generation.
At around primes of a couple billion, its averaging about 1 second per bit to calculate 1. whether the number is prime or not, 2. what the next or last immediate prime is. Although I'm sure theres some optimization or improvement here.
Seems reliable but obviously I don't have the resources to check it beyond the first 20k primes I confirmed.
From what I can see it didn't drop any primes, and it didn't include any errant non-primes.
Codes here:
https://pastebin.com/raw/57j3mHsN
Your gotos should be nextPrime(), lastPrime(), isPrime, genPrimes(up to but not including some N), and genNPrimes(), which generates x amount of primes for you.
Speed limit definitely seems to top out at 1 second per bit for a prime once the code is in the billions, but I don't know if thats the ceiling, again, because decimal needs implemented.
I think the core method, in calcY (terrible name, I know) could probably be optimized in some clever way if its given an adjacent prime, and what parameters were used. Theres probably some pattern I'm not seeing, but eh.
I'm also wondering if I can't use those fancy aberrations, 'carmichael numbers' or whatever the hell they are, to calculate some sort of offset, and by doing so, figure out a given primes index.
And all my brain says is "sleep"
But family wants me to hang out, and I have to go talk a manager at Home Depot into an interview, because wanting to program for a living and actually getting someone to give you the time of day are two different things.
-
Every time a new hype thing appeared, I was annoyed, like "kids these days and their tiktoks".
But at some point in time, this pattern of mine changed completely. I don't know how it happened, I don't know when it happened, but now I experience... acute nostalgia?
I miss Elon Musk and his twitter fanbase. I miss tiktok. I feel like a time traveller who went into their past, which is our present, to experience the cultural landmarks once again, because their time here is limited, and tomorrow they will have to go back.
I miss my autism problems and mental health uphill battle. I miss avengers and thanos. I miss metaverse.
Oh, and also... I miss you.
-
So apparently, the next version of C# is gonna have list pattern matching more powerful than F#...
...so... my motivation to learn F# drops back down to curiosity, since C#'s list pattern matching seems like it will have all I needed and wanted for my parser, as opposed to F#, which seems to not have it...
also fuck Russia and China, but I don't want to think about the impending apocalypse, thankyouverymuch. -
Question for devs who use Intellij IDEA.
How often do you use livetemplates?
I am a new android dev with ADHD and just discovered live templates. They make my life much easier, for example I have shortcuts for generating recyclerview adapter/viewholder/implementation boilerplate code.
In that way I am able to focus on implementation, and do my coding like building blocks, rather than memorizing every detail of implementation. Also I don't need to go to stackoverflow and copypaste basic things multiple times. Even for example during live coding interview having livetemplates seems awesome, copypasting from stackoverflow would be shameful (I think). Using my own custom shortcuts for livetemplates seems the best way for how my brain functions (I suck at memorizing tiny details, but I remember general idea/flow of a pattern and I would prefer memorizing what to use and when to use, instead of all small details of implementation).
Is getting too dependent on livetemplates a good practice to get used to? Do other developers frown upon a dev who has dozens of livetemplates and relies on them instead of writing all code from memory by hand?
-
As a dev, how can you work with a teamlead that second, third and 4th guesses your decisions?
Simple example: fixed a bug, but the teamlead was shitting bricks about some error. Did thorough research and told him that that error message had already been in the codebase for years and can be safely ignored because there is no workaround. The main thing is that our solution is working and I followed the latest standards. Basically I had to advocate for myself. Fine. Shit happens, I get it. But it seems that this is becoming a pattern.
Then I had to do another issue: fix some bugs. While testing I was not able to reproduce any bugs. Filmed a video of the app, attached all proofs to the Jira issue and informed the teamlead. He couldn't believe his eyes! One month ago he saw the bug and now it's gone! I had to retest everything 3-4 times and he still doesn't take my word for it.
I can't continue working like this. I have a few years of experience under my belt and never had to deal with such an insecure teamlead. How can I work if he second-guesses everything I do? Jesus.
-
one of the few times i follow the existing structure and pattern i get told to fuck off and hardcode it instead because we want something temporary
-
SQLAlchemy is such a bloated piece of crap. Even setting aside the fact that many consider ORMs an anti-pattern, this library is extremely janky, salty and uncomfortable to work with.
-
Apart from the fact that I arrived at a good framework at work to play in problem space rather than in solution space, this post is more about self-realisation and slight progress in my happiness levels.
Monsoons started in India. The vibe somehow had always been melancholic for me triggering SAD (aka seasonal depression).
However, this year I find it cosier than ever. Hot showers, lazing around on a holiday when it's pouring outside, watching my favourite show/movie. I feel very relaxed in the moment, even when work and life is not as expected/under control.
What I realised is that my problem can be solved. I need a bigger house. That would give me privacy, some personal space for hobbies, and put a barrier between me and my parents, easing the tension and clashes. I could then get married, and all the money I will save (from not buying a house myself) can be used to pursue hobbies like music, art, travel, etc.
Whenever I relax, my sleep pattern changes where I have longer duration of deep sleep with many dreams (perhaps processing everything). Does anyone else experience such a phenomenon?
Anyway, life doesn't get easy or hard, we just learn to put up with shit.
-
Why is it so common for people to insist on following particular patterns at PR when they have no concept of why the pattern originated or when and why it should apply?
-
1. Went to try chat.openai.com/chat
2. "Write a java program that uses a singleon pattern to calculate the sum of the first 100 prime numbers"
3. ???? HOW DOES IT DO THAT
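For reference, a rough Python equivalent of what that prompt asks for (the prompt was for Java; this is just my own minimal take, and the sum it prints is 24133):

class PrimeSummer:
    _instance = None                          # singleton: one shared instance

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def sum_first_primes(self, count=100):
        primes, n = [], 2
        while len(primes) < count:
            if all(n % p for p in primes if p * p <= n):
                primes.append(n)
            n += 1
        return sum(primes)

print(PrimeSummer() is PrimeSummer())         # True: same instance every time
print(PrimeSummer().sum_first_primes())       # 24133
-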
I am busting moves rn. I'm in the bathroom but the surge of energy is making me pump my arms like the time Leo Messi scored a clutch winner against Valencia in 2019
Remember the plugin I referred to in this rant? https://devrant.com/rants/6019851/...
Yup! I managed to subdue that fossilised codebase. Effected all changes required. To give a rough idea of how ancient the code is, its classes use constructors predating PHP 5. It throws away the last ~15 years of autoloading, view templates, routing engines, DI, ORMs (NO PDO!!), lower-cased multi-word variable names, etc. I'm looking at SCRIPTS with raw functions north of 400-600 lines. The client insisted I zip the folder across.
BUT! The good news is, we surmounted it. In fairness to them, it's commendable for one man to have pulled this off. The codebase is massive and appears to have been predominantly written by some Gideon dude. Who knows where he is now
There is one pattern I appreciate (something I wish Transphporm did): some segments of the rendered view are composed using class methods, i.e. instead of having the HTML file mixed with templating syntax, you have class methods that receive the raw data. Then you can extend this class as you wish, overriding just the method that composes the segment you intend to modify. That was elegant to work with. But it can become dreadful if the class expects a specific structure of data (an array with weird keys) that you have no access to sourcing.
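The idea generalises beyond PHP; a bare-bones sketch of it (invented names, not the actual plugin) would be:

class InvoiceView:
    def header(self, data):
        return f"<h1>{data['title']}</h1>"

    def body(self, data):
        return f"<p>{data['total']} items</p>"

    def render(self, data):
        return self.header(data) + self.body(data)

class DiscountedInvoiceView(InvoiceView):
    def body(self, data):     # override only the segment you care about
        return f"<p>{data['total']} items, discount applied</p>"

print(DiscountedInvoiceView().render({"title": "Order 42", "total": 3}))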
So, I finally get to enjoy one good evening in 2/3 weeks. I called 2 friends to express an emotion that's not gloomy, but they were unavailable. Will probably get some sleep
-
!rant
Go's tuple-style returns. ❤️
What's an exception? I don't even know what those are anymore. 😄
I've started using this pattern in JavaScript / TypeScript too.
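The same value-plus-error shape ports over easily; a tiny Python version of the habit (my own example, not from the rant) looks like:

def parse_port(raw):
    try:
        return int(raw), None      # (value, err) instead of raising
    except ValueError as e:
        return None, e

port, err = parse_port("8080")
if err is not None:
    print("falling back to 3000")
else:
    print(port)                    # 8080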
also... supabase anyone? ❤️
-
#Suphle Rant 4: Laravel closing the gap II
I had expected rant 4 to come at least some days later. Apparently, I'd miscalculated how fast things work in this wonderful world of software. In an earlier rant, I wrote about how dismayed I was to learn Laravel had implemented one Suphle feature I'm very proud of. They call it Premonition. Idk if it's officially rolled out yet, but you can do a search among accepted pull requests for what it's all about.
Well, today, I've just seen a draft from one of their maintainers showing one of the things Suphle was designed to do: https://twitter.com/enunomaduro/.... They can't integrate it with this pattern since PHP doesn't have generics, so it'll either get trashed or get plastered on as some band-aid. In the Suphle docs, I explicitly indicated the data structure/typing for that feature is a polyfill for the absence of generics.
I think I can get away with it because of where I'm using it (model authorization instead of custom exceptions/throwable operations, in general, like theirs)
I don't feel as distraught as I did on finding the Premonition thingy. Am I impressed with these things dawning on them? Ffs Laravel was invented in 2011. It's incredulous to think it gave me hell for years. Waited ~2 years for me to fix all issues in a brand new framework, only to magically gain iq points and start improving their work
It's weird and brutal. If they keep figuring stuff out, it may not be long before there are no features unique to Suphle. Then, my worst nightmares will come to life. I will argue there's one thing nobody will ever copy, not without rethinking the MVC architecture in its entirety.
-
i always get sucked into this "cute code" hell whenever i am working with a b2c codebase, and especially with kotlin code.
here's a scenario:
task: build debounce logic for an input view where each user input currently triggers an API call.
my steps
1. read what debouncing is.
2. see if any code is available on the internet
=> found a code piece on the internet with some level of abstraction ( basically a simple final class that implements the input event callback and encapsulates the debounce logic)
3. copy it, run it, it works (the general idea is sketched below)
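Not the Kotlin/Android version discussed here, just a tiny generic sketch of the debounce idea itself (Python, threading.Timer, made-up names): only the last call inside the wait window actually fires.

import threading

def debounce(wait_seconds):
    def wrap(fn):
        timer = None
        def debounced(*args, **kwargs):
            nonlocal timer
            if timer is not None:
                timer.cancel()                 # drop the previous pending call
            timer = threading.Timer(wait_seconds, fn, args, kwargs)
            timer.start()
        return debounced
    return wrap

@debounce(0.3)
def search(query):
    print("API call for", query)

for q in ("p", "py", "pyt", "python"):
    search(q)      # only "python" triggers the API call, ~0.3s after the last keystroke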
------
for any sane coder, these steps are hardly 10-30 mins and they can move on with life. but it's yours truly that turned this task into 6 hours of research, only to come up with a similar solution. my curiosity led me to stupid places
1) why is this class final? what if someone else wants to use it but with a different behaviour? lets try an open (non-final) class.
2) why even use a class? it extends an interface, lets try to wrap the logic in the interface itself (kotlin supports interfaces that don't require implementation)
3) umm, the interface works but it looks ugly, with all its global overridden variables. what if we make it an extension?
4) yeah, the extension approach is also not very good, lets go back to the open class.
5) but the extension is super nice to look at! lets keep the extension and the open class too
6) can we optimise the implementation? why does it use an additional handler? what if we provided everything in the constructor? how about a builder pattern?
FUCK MY BRAIN! there are so many fucking options that i forgot i had spent 4 hours on this small thing
the simplest approach would have been to just shove all the listeners and everything in the activity and forget about it :/
senior devs on this platform, how do you stop yourself from adding every concept that you know into the smallest possible task?
-
Anyone else feeling like this HTML, CSS, JS, Vue setup pattern - Vite, vue-router, Pinia, etc. - is a bit of a golden moment?
-
I've been sitting here staring at extension types and I wonder, what if I had a partial file with partial data?
In general, one could say that in every case where, say, a header is missing, the content is ALWAYS going to have some identifying characteristics, even given a characteristic statistically frequent pattern of data, and that only a null value appears as total chaos.
But I wonder, is there a way, beyond simply trying every goddamn possible combination of things until meaningful data is extracted, to identify a file by its content when the part of that content that is usually used for such a purpose is missing?
What kind of application or technology would be required for this? Certainly not neural networks, but obviously some kind of AI, right?
-
So I have this new role at work, still app development with some added responsibilities. Nothing major. But already I'm noticing what could be a pattern.
Zoom meetings that could have been phone calls or emails. The meeting was set up a week and a half or so in advance. Had a real meeting last week where a team member mentioned it and reminded the other team members of the upcoming one. We all confirmed that we'd be there.
I get a notification that the meeting is in 15 minutes. Meeting time!!! So I log on, only to see one person from the other company; two more people from said company log on, then my team member. But to my surprise he and I are the only people from my team on Zoom.
My team member then goes on to waste this poor man's time asking him questions that he doesn't really have the answers to and I'm here just wondering why.
Why isn't this meeting a 2 minute phone call?
Why am I in this meet?
Is my team member bored?
How does this make my company look in the eyes of these people?
Now I know why my other team member didn't log on. They smelled the rat and knew this would be a wast of time. And me being new to the team walked right into it 😐