Search - "bad memory"
-
Being paid to rewrite someone else's bad code is no joke.
I'll give the dev this: the use of gen 1, 2, 3 Pokemon for variable names and class names is beyond fantastic in terms of memory and childhood nostalgia. It would be even more fantastic if he spelt the names correctly, or used it to make a Pokemon game and NOT A FUCKING ACCOUNTANCY PROGRAM.
There's no correspondence between name and type, or even number. The dev has just gone batshit, left zero comments, and now somehow Ryhorn is shitting out error codes because of errors existing in Charmeleon's asshole.
The things I do for money...
-
Father bought a PC in 1997. Back then very few had one. I learned to do things like access the internet and send emails, among others. I remember adding years to my age on websites to be allowed to sign up at times :P My sisters used to play games on it sometimes. The first few we had were Tomb Raider: The Last Revelation, Tomb Raider Chronicles, American McGee's Alice (which caused us to upgrade the PC xD)... and some others.
I have a memory of this pseudo-3D-looking game where you move in a maze and try answering questions. I want to remember its name, but I cannot :(
We literally have video evidence of me liking the computer as a child, yet my parents either say I'm addicted or deny I've ever liked it before. Not only that, but in their opinion, continuously limiting my time with the PC hasn't been a literal obstacle in the way of me trying to do things. Funny how my parents think I've been at my worst these last few years, when they've hurt me so much in those years that our relationship is guaranteed not to work out. There were doubts in my head before, but now it's cemented and there is no way of going back. Father, for example, tells me it's too late to do anything with a PC now (as well as how I've been unable to use the PC: he looks at these pro players' footage in some TV show and he's like, „You've been unable to use your hobbies“, as if they have never ever screamed at me for perceived gaming and not actually cared to check), and that I need to look for a „real“ job.
Sorry. I went to bed at 2:00 in the morning. I feel like a zombie because of ongoing, weirdly insufficient sleep, even though I sleep somewhat more than normal. Even when I took melatonin for it, it didn't help at all.
Childhood was where the beating began. I was about 6 or 7, right when I entered school. The first school I attended was a private one, supposedly for „Wunderkinds“, while in reality I haven't seen a SINGLE teacher or psychologist approve of it, their argument being that children were basically drowned in work that wasn't age-appropriate (I don't mean anything bad, just that teaching about galaxies and all in first grade isn't the brightest idea). There was always a mountain of homework to do and, as opposed to some other countries, we had to do it on a day-to-day basis. We didn't have a week-long deadline. I was predictably not keeping up with it as well as I could have, had it been a normal amount, so my parents decided I didn't want to study and began their methods of getting me to „study“. I have yet to see a person able to keep up with that school's tempo, no matter the age.
This place was also where I got bullied. I felt I had nowhere to be: at home, the parents' situation; at school, the bully. I never really went outside to play with other children, so I missed that part of childhood.
After the second year of school I was transferred to an advanced German school, called that because they taught German instead of English there. I also got to learn a bit of Russian before they removed it from the school. In that period I used to attend ballet, but for less than a year. And piano, which I remember attending for quite a long while, some years, if my memory isn't fried. I quit because it had been forced on me. The last piece I ever played fully was Beethoven's Marmotte.
In this school I was once again the outcast of the class. I had some people to interact with, but all of those interactions lasted a few years at most. Then, because a part of my class chose me as laughing-stock No. 2 and another girl as No. 1, I found my best friend, whom I still have today. She's the only friend I have nearby.
Most of the time I hated myself. Even today I struggle with that sometimes.
After that came university. This is where I got something like a friend circle at last. But it still didn't last. I got into a relationship with one of the guys, but I was just attracted. There was another I didn't dare get close to. It turns out he also had something for me. Then he disappeared from our lives, and a year later, I still cannot forget the person. If I want to, I have to deprive myself of my own personality. Not a thing I'm willing to give up. Then I broke up with the guy I was in a relationship with and completely disappeared from the friend circle. To be honest, I had reasons to: they refused to even try to look for the guy, and they had called him a friend for years. My parents hitting me can still occur even today, but only if I REALLY piss them off.
Now I'm here and oh, my God, I'm officially an aunt now! My sister gave birth to a daughter this morning... She's in Berlin with mother, and both she and the child are doing great. I just hope she manages to be a good mother.
-
https://git.kernel.org/…/ke…/... I'm sure some of you are working on the patches already; if you are, then let's connect, because I am an ardent researcher on the same as of now.
So here it goes:
As soon as the kernel page table isolation (KPTI) bug is out of embargo, WhatsApp and FB will be flooded with overnight kernel "shikhuritee" experts who will share shitty advice non-stop.
1. The bug under embargo enables a side channel attack, which exploits the fact that Intel chips do speculative execution without proper isolation between user pages and kernel pages. Therefore, a carefully scheduled timing attack can reveal some information from kernel pages while the code is running in user mode.
In easy terms: if you have a VPS, another person with a VPS on the same physical server may read memory being used by your VPS, which will result in unwanted data leakage. To make matters worse, a malicious JS from an innocent-looking webpage might be (might be, because JS does not provide language constructs for such fine-grained control; at least none that I know of as of now) able to read kernel pages, and pwn you real hard, real bad.
2. The bug comes from too much reliance on Tomasulo's algorithm for out-of-order instruction scheduling. It is not yet clear whether the bug can be fixed with a microcode update (and if not, Intel has to fix this in silicon itself). As far as I can dig, there is nothing that hints that this bug is fixable in microcode, which makes the matter much worse. Also, according to my understanding, a microcode update would be too trivial a measure to fix this kind of hardware bug.
3. A software-only remedy is possible, and it is being implemented by all major OSs (including our lovely Linux) in kernel space. The patch forces the Translation Lookaside Buffer to flush when a context switch happens during a syscall (this is what I understand as of now). Benchmarks suggest the slowdown will be somewhere between 5% (best case) and 30% (worst case).
4. Regarding point 3, the syscalls themselves don't matter much. The only thing that matters is how many times syscalls are called. For example, if you are using read() or write() on 8MB buffers, you won't see too much slowdown; but if you call the same syscalls once per byte, a heavy performance penalty is guaranteed. All processes which are I/O heavy are going to suffer (hostings and databases are two common examples; a quick sketch of this follows right after the list).
5. The patch can be disabled in Linux by passing an argument to the kernel during boot (pti=off, or its alias nopti, on x86); however, this is not advised, for pretty much obvious reasons.
6. For gamers: this is not going to affect games (because those are not I/O heavy).
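To make point 4 concrete, here's a rough sketch of my own (not part of the original write-up), assuming Node.js; the test file path is made up. The bigger the buffer, the fewer kernel entries, and the smaller the KPTI tax:

```typescript
import { openSync, readSync, closeSync } from "fs";

function timeRead(path: string, bufSize: number): void {
  const fd = openSync(path, "r");
  const buf = Buffer.alloc(bufSize);
  let syscalls = 0;
  const start = process.hrtime.bigint();
  // Each readSync() below is one read() syscall, i.e. one user/kernel
  // transition that pays the extra TLB-flush cost under KPTI.
  while (readSync(fd, buf, 0, bufSize, null) > 0) syscalls++;
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  closeSync(fd);
  console.log(`buffer=${bufSize}B syscalls=${syscalls} time=${ms.toFixed(1)}ms`);
}

const file = "/tmp/testfile"; // hypothetical test file, a few MB in size
timeRead(file, 8 * 1024 * 1024); // a handful of syscalls: little overhead
timeRead(file, 1);               // millions of syscalls: heavy penalty
```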
Meltdown: Meltdown, targeted at desktop chips, can read kernel memory from the L1D cache. Only Intel is affected by this variant.
Spectre: Spectre is a hardware vulnerability in implementations of branch prediction that affects modern microprocessors with speculative execution, allowing malicious processes access to the contents of other programs' mapped memory. Works on all chips, including Intel/ARM/AMD.
For updates refer the kernel tree: https://git.kernel.org/…/ke…/...
For further details and more chit-chats refer: https://lwn.net/SubscriberLink/...
~Cheers~
(Originally written by Adhokshaj Mishra, edited by me.)
-
Someday my toaster is going to have an IP address. A bad automatic firmware update will most likely cause it to get stuck on the bagel setting until I plug a USB key in and reflash the memory.
Grandma's refrigerator will probably get viruses, lock itself and freeze all the food inside, demanding bitcoin before defrosting.
My blender will probably be used in a massive DDoS attack because Ninja's master MAC address list got leaked and the hidden control panel login is admin/admin.
Ovens will burn houses down when people call in to have them preheat on their way home from work.
Correlations between the number of times the lights are turned on and how many times the toilet is flushed will yield recommendations to run the dishwasher on Thursdays because it's simply more energy efficient.
My dog will tweet when he's hungry and my smart watch will recommend diet dog food in real-time because he's really been eating too much lately--"Do you want to set up a recurring order on Amazon Fresh?"
Sometimes living in a cave sounds nice...
-
And, the other side, husbands 😂
——————————————————–
Dear Technical Support,
Last year I upgraded from Boyfriend 5.0 to Husband 1.0 and noticed a distinct slowdown in overall system performance — particularly in the flower and jewelry applications, which operated flawlessly under Boyfriend 5.0. The new program also began making unexpected changes to the accounting modules.
In addition, Husband 1.0 uninstalled many other valuable programs, such as Romance 9.5 and Personal Attention 6.5 and then installed undesirable programs such as NFL 5.0, NBA 3.0, and Golf Clubs 4.1.
Conversation 8.0 no longer runs, and Housecleaning 2.6 simply crashes the system. I’ve tried running Nagging 5.3 to fix these problems, but to no avail.
What can I do?
Signed,
Desperate
——————————————————–
Dear Desperate:
First keep in mind, Boyfriend 5.0 is an Entertainment Package, while Husband 1.0 is an Operating System.
Please enter the command "C:/ I THOUGHT YOU LOVED ME" and try to download Tears 6.2, and don't forget to install the Guilt 3.0 update.
If that application works as designed, Husband 1.0 should then automatically run the applications Jewelry 2.0 and Flowers 3.5. But remember, overuse of the above application can cause Husband 1.0 to default to Grumpy Silence 2.5, Happy Hour 7.0 or Beer 6.1.
Beer 6.1 is a very bad program that will download the Snoring Loudly Beta.
Whatever you do, DO NOT install Mother-in-law 1.0 (it runs a virus in the background that will eventually seize control of all your system resources).
Also, do not attempt to reinstall the Boyfriend 5.0 program. These are unsupported applications and will crash Husband 1.0.
In summary, Husband 1.0 is a great program, but it does have limited memory and cannot learn new applications quickly.
You might consider buying additional software to improve memory and performance. We recommend Food 3.0 and Hot Lingerie 7.7.
Good Luck,
Tech Support
-
At the age of 20, I got hired as a junior dev at a mobile gaming company. We were 2 junior devs hired at the same time, and one of our senior colleagues played a prank: he came into the office before us and rearranged our offices in a "funny" manner.
Two days later I waited for him to go home. I opened his PC case, removed the power button cable from the motherboard and then re-arranged everything back to normal. Well, I couldn't resist...
The next day he came into the office and, well, surprise... the PC would not start. He went to the IT department and they spent 4 hours trying to figure out why it wasn't working. They replaced the CPU, the RAM, and even the PSU.
I had to go and tell them: "maybe it's the power button jack?!".
I got into some trouble for that prank. Indeed I crossed a line, but what the hell... that was a bad IT department.
-
Rest in peace, my dad. Now he is going to follow my uncle: the two best people and developers in my world. As the lone survivor, your memory will continue to be remembered and propagate through my life.
For fellow devRanters: always love your family more while they're still at your side. It doesn't matter how harsh or how bad they are; you can always find the meaning and the values life will give you, and those will be the best presents your parents will give you. Always. Stay creative, they will be proud.
The short story of my dad: https://devrant.com/rants/1630147/...
-
I have been a mobile developer working with Android for about 6 years now. In that time, I have endured countless annoyances in the Android development space. I will endure them no more.
My complaints are:
1. Ridiculous build times. In what universe is it acceptable for us to wait 30 seconds for a build to complete? Yes, I've done all the optimisations mentioned on this page and then some. Don't even mention hot reload, as it doesn't work fast enough or just does not work at all. Also, buying better hardware should not be a requirement to build a simple Android app; Xcode builds in 2 seconds on an 8GB MacBook Air. A MacBook Air!
2. IDE. Android Studio is a memory hog even if you throw 32GB of RAM at it. The visual editors are janky as hell. If you use Eclipse, you may as well just chop off your fingers right now, because you will have no use for them after you try to build an app from scratch. I mean, just look at some of the posts in this subreddit where the common response is to invalidate caches and restart. That should only be used as a last resort, but it's thrown about as if it solves everything. Truth be told, it's Gradle's fault. Gradle is so annoying I've dedicated the next point to it.
3. Gradle. I am convinced that Gradle causes 50% of an Android developer's pain. From the build times, to the integration into various IDEs, to its insane package management system. Why do I need to manually exclude dependencies from other dependencies? The build tool should just handle it for me. C'mon, it's 2019. Gradle is so bad that it requires approx 54GB of RAM to work out that I have removed a dependency from the list of dependencies. Also, I cannot work out what properties I need to put in which block.
4. API. The Android API is over-bloated and hellish. How do I schedule a recurring notification? Oh, use an AlarmManager. Yes, you heard right, an AlarmManager... Not a NotificationManager, because that would be too easy. Also, has anyone ever tried running a long-running task? Or done an asynchronous task? Or dealt with closing/opening a keyboard? Or handled clicks from a RecyclerView? Yes, I know Android Jetpack aims to solve these issues, but over the years I have become so jaded by things that were meant to solve other broken things that there isn't much hope for Jetpack in my mind 😤
5. API 2. A non-insignificant number of Android users are still on Jelly Bean or KitKat! That means we, as developers, have to support some of your shitty API decisions (Fragments, Activities, ListView) from all the way back then!
6. Not reactive enough. Android has gained support for data binding recently, but this kind of stuff should have been there from the very start. Look at React or Flutter to see how easy it is to make shit happen without any effort.
7. Layouts. What the actual hell is going on here. MDPI, XHDPI, XXHDPI, mipmap, drawable. Fuck it, just chuck it all in the drawable folder. Seriously, Android should handle this for me. If I am designing for a larger screen then it should be responsive. I don't want to deal with 50 different layouts spread over 6 different folders.
8. Permission system. Why was this not included from the very start? Rogue apps have abused this and abused your user's privacy and security. Yet you ban us and not them from the Play Store. What's going on? We need answers.
9. In Android, building an app took me 3 months, and I still had a lot of work left to do, but I got so sick of Android dev that I dropped it in favour of Flutter. I built the same app in Flutter; it took me around a month, and I completed it all.
10. XML.
If you're a new dev, for the love of all that is good in this world, do NOT get into Android development. Start with Flutter or even iOS. On Flutter the build times are insanely fast and the hot reload is constantly under 500ms. It's a breath of fresh air and will save you a lot of headaches, AND it builds for iOS flawlessly.
To the people who build Android, advocate for it and work on it: sorry to swear, but fuck you! You have created a mess that we have to work with on a day-to-day basis, only for us to get banned from the app store! You have sold us a lie that Android development is amazing, with all the sweet-treat names and conferences that look bubbly and fun. You have allowed it to get so bad that we can't target an API higher than 18, because some Android users are still using devices that only support that!
End this misery. End our pain. End our suffering. Throw this abomination away like you do with some of your other projects and migrate your efforts over to Flutter. Please!
#NoToGoogleIO #AndroidSummitBoycott #FlutterDev #ReactNative
-
My team handles infrastructure deployment and automation in the cloud for our company, so we don't exactly develop applications ourselves, but we're responsible for building deployment pipelines, provisioning cloud resources, automating their deployments, etc.
I've ranted about this before, but it fits the weekly rant so I'll do it again.
Someone deployed an autoscaling application into our production AWS account, but they set the maximum instance count to 300. The account limit was less than that. So, of course, their application gets stuck and starts scaling out infinitely. Two hundred new servers spun up in an hour before hitting the limit and then throwing errors all over the place. They send me a ticket and I log in to AWS to investigate. Not only have they broken their own application, but they've also made it impossible to deploy anything else into prod. Every other autoscaling group is now unable to scale out at all. We had to submit an emergency limit increase request to AWS, spent thousands of dollars on those stupidly-large instances, and yelled at the dev team responsible. Two weeks later, THEY INCREASED THE MAX COUNT TO 500 AND IT HAPPENED AGAIN!
And the whole thing happened because a database filled up the hard drive, so it would spin up a new server, whose hard drive would be full already and thus spin up a new server, and so on into infinity.
That's probably the only WTF moment that resulted in me actually saying "WTF?!" out loud to the person responsible, but I've had others. One dev team had their code logging to a location they couldn't access, so we got daily requests for two weeks to download and email log files to them. Another dev team refused to believe their server was crashing due to their bad code, even after we showed them the logs that demonstrated their application had a massive memory leak. Another team arbitrarily decided that they were going to deploy their code at 4 AM on a Saturday, and they wanted a member of my team to be available in case something went wrong. We aren't 24/7 support. We aren't even weekend support. Or any support, technically. Another team told us we had one day to do three weeks' worth of work to deploy their application, because they had set a hard deadline and then didn't tell us about it until the day before. We gave them a flat "No" for that request.
I could probably keep going, but you get the gist of it.
-
My code review nightmare part 3
Performed a review on/against a workplace 'nemesis'. I didn't follow the department standards document (because I couldn't care less about spacing, sorted usings, etc.) and identified over 80 bugs, logic errors, n+1 patterns, memory leaks (yes, even .NET devs can cause 'em), and general bad behavior (e.g. 'eating' exceptions that should be handled or at least logged).
Because 'Jeff' was considered a golden child (that's another long TL;DR), his boss and others took major offense and demanded I justify my review, item by item.
About 2 hours into the meeting, our department mgr realized embarrassing Jeff any further wasn't doing anyone any good and decided to take matters into his own hands. Thinking 'well, it's about time he did his job', I go back to my desk. About an hour later..
Mgr: "I need you in the conference room, RIGHT NOW!"
<oh crap>
Mgr: "I spoke to Jeff and I think I know what the problem is. Did you ever train him on any of the problems you identified in the review?"
Me: "Um, no. Why would I?"
Mgr: "Ha!..I was right. So lets agree the problems are partially your fault, OK?"
Me: "Finding the bugs in his code is somehow my fault?"
Mgr: "Yes! For example, the n+1 problem in using the WCF service, you never trained him on how to use the service. You wrote the service, correct?"
Me: "Yes, but it's not my job to teach him how to write C#. I documented the process and have examples in the document to avoid n+1. All he had to do was copy/paste."
Mgr: "But you never sat with Jeff and talked to him like a human being? You sit over there in your silo and are oblivious to the problems you cause. This ends today!"
Me: "What the...I have no idea what you are talking about. What in the world did Jeff tell you?"
Mgr: "He told me enough and I'm putting an end to it. I want a compressive training class developed on how to use your service. I'll give you a month to get your act together and properly train these developers."
3 days later, I submit the PowerPoint presentation and accompanying docs. It was only one WCF service with a handful of methods. Mgr approved the training, etc., etc., we execute the 'training', and Jeff submits a code review a couple of weeks later. From over 80 issues down to around 50. The poop hits the fan again.
Mgr: "What's your problem? When are you going to take your responsibility seriously?"
Me: "Its pretty clear I don't have the problem. All the review items were also verified by other devs. Its not me trying to be an asshole."
Mgr: "Enough with the excuses. If you think you can do a better job *you* make the code changes and submit them for Jeff for review. No More Excuses!"
A couple of days later, I make the changes, submit them for review, and Jeff really couldn't say too much other than "I don't see this as an improvement".
TL;DR, I had been tracking the errors generated by the site due to the bugs prior to my changes. After deployment, # of errors went from thousands per hour to maybe hundreds per day (that's another story) and the site saw significant performance increases, fewer customer complaints, etc..etc.
At a company event, the department VP hands out special recognition awards:
VP: "This award is especially well earned. Not only does this individual exemplify the company's focus on teamwork, he also went above and beyond the call of duty to serve our customers. Jeff, come on up and get this well deserved award."19 -
Not just another Windows rant:
*Disclaimer*: I'm a full-time Linux user for dev work, having switched from Windows a couple of years ago. I only open Windows for Photoshop (or games), or when I fuck up my Linux install (Arch user) because I get too adventurous (don't we all).
I have hated Windows 10 from day 1 for being a rebel. Automatic updates and generally so many bugs (especially the 100% disk usage on boot, for idk how long) really sucked.
It's got ads now and it's generally much slower than a Windows 8 install probably would be.
The pathetic memory management and the overall slower interface really tick me off. I'm trying to work and get access to web services, and all I get is hangups.
Chrome is my go-to browser for everything, and the experience is sub-par. We all know it gobbles up RAM, but even more so on Windows.
My Linux install on the same computer flies with a heavy project open in Android Studio, 25+ tabs in Chrome and a 1080p video playing in the background.
Up until the Creators Update, UI bugs were a common sight. Things would just stop working if you clicked them multiple times.
But you know what I'm tired of more?
The ignorant pricks who bash it for being Windows. This OS isn't bad. Sure, it's not Linux or macOS, but it stands strong.
You are just bashing it because it's not developer-friendly, and it's not. It never advertises itself as that.
It's a full-fledged OS for everyone. It's not dev-friendly, but you can make it as dev-friendly as possible; you're just lazy.
People do use Windows to code. If you don't know that, you're ignorant. They also make a living by using Windows all day. How bout tha?
But it tries to make you feel comfortable, with the recent Bash integration and the plethora of tools that Microsoft builds.
IIS may not be Apache or Nginx but it gets the job done.
Azure uses Windows and it's one of the best web services out there. It's freaking amazing, with dead-simple docs to get up and running with a web app in 10 minutes.
I saw many rants against VS but you know it's one of the best IDEs out there and it runs the best on Windows (for me, at least).
I'm pissed at you - you blind hater, you.
Research and appreciate the good qualities in something, instead of trying to be the cool but ignorant dev who codes on Linux/Mac but doesn't know shit about the advantages they offer.
-
Okay, story time.
Back during 2016, I decided to do a little experiment to test the viability of multithreading in a JavaScript server stack, and I'm not talking about the Node.js way of queuing I/O on background threads, or about WebWorkers that box and convert your arguments to JSON and back during a simple call across two JS contexts.
I'm talking about JavaScript code running concurrently on all cores. I'm talking about replacing the god-awful single-threaded event loop of ECMAScript – the biggest bottleneck in software history – with an honest-to-god, lock-free thread-pool scheduler that executes JS code in parallel, on all cores.
I'm talking about concurrent access to shared mutable state – a big, rightfully-hated mess when done badly – in JavaScript.
This rant is about the many mistakes I made at the time, specifically the biggest – but not the first – of which: publishing some preliminary results very early on.
Every time I showed my work to a JavaScript developer, I'd get negative feedback. Like, unjustified hatred and immediate denial, or outright rejection of the entire concept. Some were even adamantly trying to discourage me from this project.
So I posted a sarcastic question to the Software Engineering Stack Exchange, which was originally worded differently to reflect my frustration, but was later edited by mods to be more serious.
You can see the responses for yourself here: https://goo.gl/poHKpK
Most of the serious answers were along the lines of "multithreading is hard". The top voted response started with this statement: "1) Multithreading is extremely hard, and unfortunately the way you've presented this idea so far implies you're severely underestimating how hard it is."
While I'll admit that my presentation was initially lacking, I later made an entire page to explain the synchronisation mechanism in place, and you can read more about it here, if you're interested:
http://nexusjs.com/architecture/
But what really shocked me was that I had never understood the mindset that all the naysayers adopted until I read that response.
Because the bottom line of that entire response is an argument: an argument against change.
The average JavaScript developer doesn't want a multithreaded server platform for JavaScript because it means a change of the status quo.
And this is exactly why I started this project. I wanted a highly performant JavaScript platform for servers that's more suitable for real-time applications like transcoding, video streaming, and machine learning.
Nexus does not and will not hold your hand. It will not repeat Node's mistakes and give you nice ways to shoot yourself in the foot later, like `process.on('uncaughtException', ...)` for a catch-all global error handling solution.
No, an uncaught exception will be dealt with like any other self-respecting language: by not ignoring the problem and pretending it doesn't exist. If you write bad code, your program will crash, and you can't rectify a bug in your code by ignoring its presence entirely and using duct tape to scrape something together.
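For reference, this is the duct tape in question; a minimal Node.js sketch of my own, not something from Nexus:

```typescript
// The catch-all Node.js pattern being criticised: one global handler that
// swallows every unhandled error and keeps the process running with
// whatever corrupted state it now has.
process.on("uncaughtException", (err: Error) => {
  console.error("ignoring:", err.message); // duct tape, not error handling
});

setTimeout(() => {
  throw new Error("bug in business logic"); // silently survived by the above
}, 100);
```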
Back on the topic of multithreading, though. Multithreading is known to be hard, that's true. But how do you deal with a difficult solution? You simplify it and break it down, not just disregard it completely; because multithreading has its great advantages, too.
Like, how about we talk performance?
How about distributed algorithms that don't waste 40% of their computing power on agent communication and pointless overhead (like the serialisation/deserialisation of messages across the execution boundary for every single call)?
How about vertical scaling without forking the entire address space (and thus multiplying your application's memory consumption by the number of cores you wish to use)?
How about utilising logical CPUs to the fullest extent, and allowing them to execute JavaScript? Something that isn't even possible with the current model implemented by Node?
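As a toy illustration of that boundary overhead (my own sketch, not a Nexus benchmark), this is what every boxed cross-context call pays before any real work starts:

```typescript
// Toy measurement of the execution-boundary tax: the cost of boxing a
// message to JSON and back for every single cross-context call, before
// any useful work happens. Shared-memory threads skip this entirely.
const message = { op: "transcode", frame: new Array(1024).fill(0.5) };
const calls = 100_000;

const start = process.hrtime.bigint();
for (let i = 0; i < calls; i++) {
  JSON.parse(JSON.stringify(message));
}
const ms = Number(process.hrtime.bigint() - start) / 1e6;
console.log(`${calls} boxed calls: ${ms.toFixed(0)}ms of pure overhead`);
```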
Some will say that the performance gains aren't worth the risk. That the possibility of race conditions and deadlocks aren't worth it.
That's the point of cooperative multithreading. It is a way to smartly work around these issues.
If you use promises, they will execute in parallel, to the best of the scheduler's abilities, and if you chain them then they will run consecutively as planned according to their dependency graph.
If your code doesn't access global variables or shared closure variables, or your promises only deal with their provided inputs without side-effects, then no contention will *ever* occur.
If you only read and never modify globals, no contention will ever occur.
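A minimal sketch (mine, purely illustrative) of that contention-free style: promises that are pure functions of their inputs, which a parallel scheduler could place on any core without locks.

```typescript
// No globals, no shared closure state: only the provided input is touched,
// so two of these promises can never contend with each other.
function digest(input: string): Promise<string> {
  return new Promise((resolve) => {
    let h = 0;
    for (const ch of input) h = (h * 31 + ch.charCodeAt(0)) | 0;
    resolve(h.toString(16));
  });
}

// Independent promises: free to run in parallel, one per core.
Promise.all(["a", "b", "c"].map(digest)).then(console.log);

// Chained promises: run consecutively, following their dependency graph.
digest("a").then(digest).then(digest).then(console.log);
```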
Are you seeing the same trend I'm seeing?
Good JavaScript programming practices miraculously coincide with the best practices of thread-safety.
When someone says we shouldn't use multithreading because it's hard, do you know what I like to say to that?
"To multithread, you need a pair."18 -
What an absolute fucking disaster of a day. Strap in, folks; it's time for a bumpy ride!
I got a whole hour of work done today. The first hour of my morning because I went to work a bit early. Then people started complaining about Jenkins jobs failing on that one Jenkins server our team has been wanting to decom for two years but management won't let us force people to move to new servers. It's a single server with over four thousand projects, some of which run massive data processing jobs that last DAYS. The server was originally set up by people who have since quit, of course, and left it behind for my team to adopt with zero documentation.
Anyway, the 500GB disk is 100% full. The memory (all 64GB of it) is fully consumed by stuck jobs. We can't track down large old files to delete, because du chokes on the workspace folder with its thousands of subfolders and there's no RAM to spare. We decide to basically take a hacksaw to it, deleting the workspace for every job not currently in progress. This of course fucked up some really poorly-designed pipelines that relied on workspaces persisting between jobs, so we had to deal with complaints about that as well.
So we get the Jenkins server up and running again just in time for AWS to have a major incident affecting EC2 instance provisioning in our primary region. People keep bugging me to fix it, I keep telling them that it's Amazon's problem to solve, they wait a few minutes and ask me to fix it again. Emails flying back and forth until that was done.
Lunch time already. But the fun isn't over yet!
I get back to my desk to find out that new hires, and people who got new Mac laptops recently, can't even install our toolchain, because management has started handing out M1 Macs without telling us and all our tools are compiled solely for x86_64. That took some troubleshooting to even figure out what the problem was, because the only error people got from Homebrew was that the formula was empty, when it clearly wasn't.
After figuring out that problem (but not fully solving it yet), one team starts complaining to us about a GitHub problem, because we manage the GitHub org. Except it's not a GitHub problem, and I already knew this because they are a Problem Team that uses some technical authoring software with Git integration while having only the barest understanding of what Git actually does. Turns out it's a Git problem. An update for Git was pushed out recently that patches a big bad vulnerability, and the way it was patched causes problems because they're using Git wrong (multiple users accessing the same local repo on a Samba share). It's a huge vulnerability, so my entire conversation with them went sort of like:
"Please don't."
"We have to."
"Fine, here's a workaround, this will allow arbitrary code execution by anyone with physical or virtual access to this computer that you have sitting in an unlocked office somewhere."
"How do I run a Git command I don't use Git."
So that dealt with, I start taking a look at our toolchain, trying to figure out if I can easily just cross-compile it to arm64 for the M1 macbooks or if it will be a more involved fix. And I find all kinds of horrendous shit left behind by the people who wrote the tools that, naturally, they left for us to adopt when they quit over a year ago. I'm talking entire functions in a tool used by hundreds of people that were put in as a joke, poorly documented functions I am still trying to puzzle out, and exactly zero comments in the code and abbreviated function names like "gars", "snh", and "jgajawwawstai".
While I'm looking into that, the person from our team who is responsible for incident communication finally gets the AWS EC2 provisioning issue reported to IT Operations, who sent out an alert to affected users that should have gone out hours earlier.
Meanwhile, according to the health dashboard in AWS, the issue had already been resolved three hours before the communication went out, and the ticket remains open at this moment, as far as I know.
-
Why is it so important to some people to claim that "HTML and CSS are not programming languages"? I get it, you're a REAL programmer working with arrays, maybe tuples, objects and possibly direct memory management. Who the fuck has a right to call themselves a programmer for writing some brain-dead markup or poorly designed selectors, right? Who fucking cares about semantic tags or nested selectors?
Just think for a few seconds about when you were taking your first baby steps to becoming the GOD ROCKING MEMORY HANDLER THAT WRITES _REAL_ CODE that you are today, and how good it felt to be able to create something that appeared on your screen. It felt pretty awesome, yeah?
Now imagine if someone much more experienced than you told you "You're not a real programmer, that is not real programming. You should see what I do, I do real programming".
I think you get it. Why spend your energy spreading bad vibes when you could spend it on something more productive? Like reading up on the new CSS4 specs ;)
-
Ooof.
In a meeting with my client today, about issues with their staging and production environments.
They pull in the lead dev working on the project. He's a 🤡 who freelanced for my previous company where I was CTO.
I fired him for being plain bad.
Today he doesn't recognize me and proceeds to patronize me on server administration...
The same 🤡 that checks production secrets into git and builds projects directly in the production VM.
Buckle up... He deploys *both* staging and production to the *same* VM...
Doesn't even assign a static IP to the VM and is puzzled when its IP has changed after a relaunch...
Stores long-term AWS credentials instead of using instance roles.
Claims there are "memory leaks" in a JS project. (There may be memory misuse by the project or its dependencies; an actual memory leak in V8 that somehow only he finds? Don't think so.)
Didn't even set up pm2 in systemd so his services didn't even relaunch after a reboot...
You know, I'm keeping my mouth shut and making the clown work all weekend to fix his own hubris.
-
I have to rant a bit about the toxic reactions to a constructive Q&A website.
People keep complaining that they get downvotes and corrections, or stuff like that.
Are you fucking kidding me?
So you expect people to spend their own time, absolutely for free, to help you, while you don't even want to invest in describing the issue you're having properly? And then you complain that people have trouble understanding your questions?
Let's look at this scientifically. Let's gather up some questions that have been received badly on SO in the last few hours. From the top (simply put https://stackoverflow.com/questions... in front of the id):
47619033 - person wants a discussion about an algorithm while not providing any information about what worked and what failed. "Please write a program for me". Breaking at least 2 rules.
47619027 - "check out my videos" spam
47619030 - "Here's the manual that has my answer but I can't find my answer in it".
47619004 - "how do I keep variables in memory"
47618997 - debug this exception, I'll give you no info on what I tried and failed. Screw this, you guys figure this out, I'm going out for beer.
47618993 - expects everyone to guess what the input is, what the expected output is, and whether he has read what HashMap is in the manual. But sure, this question is so far the best out of all the bad ones.
47618985 - please write code according to my specifications
Should I go on? There wasn't a single clear question about problems in code in this entire small set. Feel free to continue searching; let me know if you find something that:
1. You understand what's being asked
2. Answer is clear and non-ambiguous (ex. NOT "which language is the coolest?")
3. Not asking someone to write a program for them.
4. Answer is not found in the most basic form of manuals (ex. php.net)
5. Is about programming.
The point is:
If you get downvoted on Stack Overflow, then you wrote a shitty question. Instead of coming over here and venting uselessly, simply address the concerns and at least TRY to write a clear question if you expect any answers.
-
I want a case/skin/idk for my lappy after I finally leave this company. I have this awful habit of associating things with memories. If the memory is bad, seeing the object reminds me of it and, for example, makes me feel burned out again. So, I want to add a really pretty case to my lappy so it feels like my laptop instead of the company's.
I've found a few really beautiful ones on Etsy and Pinterest, but they're so ridiculously expensive! I really don't want to pay $90 🙁
Does anyone know where I can find alternatives?
-
Last week, the team lead told me that he can't merge because my code has code smells and, going forward, we can't have that. We use Sonar, and the way to "fix it", according to him, is to mark the line with //NOSONAR.
Most of the issues are minor, like unused imports and, in my case, incomplete TODOs.
And before the "verbal" rule was only need to fix Major + issues. And well the reason I use TODOs is to mark code that probably needs changing in the future. I know there's going to be some feature that these lines have to be changed. But the requirements are fully defined yet from business.
But I sort of blew up on him. YOU WANT TO ENFORCE ZERO CODE SMELLS NOW?!?!?! AND THESE MINOR ISSUES? MARK THEM WITH NOSONAR?
HERE'S WHAT I'VE THOUGHT FOR THE LAST X YEARS... THE CODE DESIGN IS SHIT. MINOR CODE SMELLS, AND MANUALLY MARKING THE ONES U NEED TO KEEP, ARE THE LEAST OF OUR PROBLEMS...
THE OTHER PROBLEMS I'VE MENTIONED BEFORE, EVERY MONTH, EVERY YEAR, BUT YOU DIMWITS NEVER LISTEN.
YOU THINK MY TODOS ARE BAD... 90% OF THE CODE AND FEATURES (THE ONES NOT DONE BY ME) LOOK AND SMELL LIKE MONKEY SHIT. UNDOCUMENTED, MESSY, FULL OF BUGS.
AND GUESS WHAT? NEW FEATURE COMES IN, SOME DEV FORGETS TO CHANGE SOME COMPONENT THAT DEPENDS ON IT. WOULDN'T IT BE GREAT IF THERE WERE BOOKMARKS... O WAIT...
I was just catching up on comics again and saw this one... which triggered my memory and this rant... My first thought was to forward it to him...
-
I'm not good with faces, at all.
I literally once forgot a dude's name and only remembered it after he got his laptop out of his bag and I saw his stickers.
I recognize people based on their stickers now...
-
Saw this on Facebook and couldn't help but share here! 😂
A young woman submitted the tech support message below (about her relationship with her husband); presumably she did it as a joke…
The query:
Dear Tech Support,
’Last year I upgraded from Boyfriend 5.0 to Husband 1.0 and noticed a distinct slowdown in overall system performance, particularly in the flower and jewelry applications, which operated flawlessly under Boyfriend 5.0.
In addition, Husband 1.0 uninstalled many other valuable programs, such as: Romance 9.5 and Personal Attention 6.5, and then installed undesirable programs such as: NBA 5.0, NFL 3.0 and Golf Clubs 4.1.
Conversation 8.0 no longer runs, and House cleaning 2.6 simply crashes the system. Please note that I have tried running Nagging 5.3 to fix these problems, but to no avail.
What can I do?
Signed,
Desperate
The response (that came weeks later out of the blue):
Dear Desperate,
“First keep in mind, Boyfriend 5.0 is an Entertainment Package, while Husband 1.0 is an operating system. Please enter command: I thought you loved me.html and try to download Tears 6.2 and do not forget to install the Guilt 3.0 update. If that application works as designed, Husband 1.0 should then automatically run the applications Jewelry 2.0 and Flowers 3.5.
However, remember, overuse of the above application can cause Husband 1.0 to default to Grumpy Silence 2.5, Happy Hour 7.0 or Beer 6.1. Please note that Beer 6.1 is a very bad program that will download the Farting and Snoring Loudly Beta.
Whatever you do, DO NOT, under any circumstances, install Mother-In-Law 1.0 (it runs a virus in the background that will eventually seize control of all your system resources.)
In addition, please, do not attempt to re-install the Boyfriend 5.0 program. These are unsupported applications and will crash Husband 1.0.
In summary, Husband 1.0 is a great program, but it does have limited memory and cannot learn new applications quickly. You might consider buying additional software to improve memory and performance. We recommend Cooking 3.0.'
Good luck!
-
TLDR: SAP sucks. Don't ever work with it. Run away from it. Delete it from your memory. If your company works with it, quit. It's the best you can do.
A couple of weeks ago the group rant was "Story of your best/worst career choice" and I talked about the contract I signed. Even tho that is still true and I still feel like that, I think I got a new worst choice:
WORKING WITH SAP.
When I got this job I knew it would be SAP, but I didn't know what SAP was. I just thought "it's programming, how bad can it be?" OH BOII.
If only I had done some FUCKING RESEARCH, I would have known this would be a mistake!!
And I knew I didn't want to work with this, I knew I wanted to be a web developer, but I STILL ACCEPTED THE JOB OH MAN WHAT WAS I THINKING I'M SO MAD.
Where I live, we all have the same mentality when looking for the first job, which is to just accept anything you can get, because it's your first job and you need the work and the experience, even if it's a bad job or you know you won't like it. When my internship was ending, I told my parents I didn't want to stay there because they treat their employees like shit and the salary is terrible. They told me to still accept it if they offered, because I still needed a job (this one was web, tho) and experience.
So, of course, since I was looking for my first job, was told this my entire live, always thought like that and they were the first to contact me, I accepted it.
BIGGEST FUCKING MISTAKE!! DON'T THINK LIKE THIS!! AND STOP TELLING KIDS THIS!! IT'S NOT A GOOD MENTALITY!!!
ALSO DON'T WORK WITH SAP! EVER!
-
Boasting that you know programming just because you watched some programming videos doesn't help in any case. I learned this the hard way, because I tried to show off in front of a girl and it turned out she knew more and knew better. This was when I was 14. A very bad memory.
-
Ok friends, let's try to compile FlowNet2 with Torch. It's made by NVIDIA themselves, so there won't be any problems at all with dependencies, right?????? /s
Let's use the Deep Learning AMI with a K80 on AWS, totally updated and ready to go, super great, always works with everything else.
> CUDA error
> CuDNN version mismatch
> CUDA versions overwrite
> Library paths not updated ever
> Torch 0.4.1 doesn't work so have to go back to Torch 0.4
> Flownet doesn't compile, get a bunch of CUDA errors, piece of shit code
> online forums have lots of questions and 0 answers
> Decide to skip straight to vid2vid
> More cuda errors
> Can't compile the fucking 2d kernel
> Through some act of God reinstalling cuda and CuDNN, manage to finally compile Flownet2
> Try running
> "Kernel image" error
> excusemewhatthefuck.jpg
> Try without a label map because fuck it the instructions and flags they gave are basically guaranteed not to work, it's fucking Nvidia amirite
> Enormous fucking CUDA error and Torch error, makes no sense, online no one agrees and 0 answers again
> Try again but this time on a clean machine
> Still no go
> Last resort, use the docker image they themselves provided of flownet
> Same fucking error
> While in the process of debugging, realize my training image set is also bound to have bad results, because "directly concatenating" images together as they claim in the paper actually has horrible results, and the network doesn't accept 6-channel input no matter what, so the only way to get around this is to make 2 images (3 * 2 = 6, quick maths)
> Fix my training data, fuck the Nvidia dude who gave me wrong info
> Try again
> Same fucking errors
> Doesn't give nay helpful information, just spits out a bunch of fucking memory addresses and long function names from the CUDA core
> Try reinstalling and then making a basic torch network, works perfectly fine
> FINALLY.png
> Setup vid2vid and flownet again
> SAME FUCKING ERROR
> Try to build the entire network in tensorflow
> CUDA error
> CuDNN version mismatch
> Doesn't work with TF
> HAVE TO FUCKING DOWNGRADE DRIVERS TOO
> TF doesn't support latest cuda because no one in the ML community can be bothered to support anything other than their own machine
> After setting up everything again, realize I have no space left on the 75GB machine
> Try torch again, hoping that the entire change will fix things
At this point I'll leave a space so you can try to guess what happened next before seeing the result.
Ready?
3
2
1
> SAME FUCKING ERROR
In conclusion, NVIDIA is a fucking piece of shit that can't make their own libraries compatible with themselves, and can't be fucked to write instructions that actually work.
If anyone has vid2vid working or has gotten around the kernel image error for AWS K80s, please throw me a lifeline; in exchange you can have my soul, or what little is left of it.
-
I do not like the direction laptop vendors are taking.
New laptops tend to feature fewer ports, making the user more dependent on adapters. Similarly to smartphones, this is a detrimental trend initiated by Apple and replicated by the rest of the pack.
As of 2022, many mid-range laptops feature just one USB-A port and one USB-C port, resembling Apple's toxic minimalism. In 2010, mid-class laptops commonly had three or four USB ports. I have even seen an MSi gaming laptop with six USB ports. Now, much of the edge space is wasted "clean" space.
Sure, there are USB hubs, but those only work well with low-power devices. When attaching two external hard drives to transfer data between them, they might not be able to spin up due to insufficient power from the USB port or undervoltage caused by the impedance (resistance) of the USB cable between the laptop's USB port and hub. There are USB hubs which can be externally powered, but that means yet another wall adapter one has to carry.
Non-replaceable batteries [the shortest-lived component] mean difficult repairs and no more reserve batteries, as well as no extra-sized battery packs. When the battery expires, one might have to waste four hours at a repair shop for a replacement that would have taken a minute on a 2010 laptop.
The SD card slot is being replaced with inferior MicroSD or removed entirely. This is especially bad for photographers and videographers who would frequently plug memory cards into their laptop. SD cards are far more comfortable than MicroSD cards, and no, bulky external adapters that reserve the device's only USB port and protrude can not replace an integrated SD card slot.
Most mid-range laptops in the early 2010s also had a LAN port for immediate interference-free connection. That is now reserved for gaming-class / desknote laptops.
Obviously, components like RAM and storage are far more difficult to upgrade in more modern laptops, or not possible at all if soldered in.
Touch pads increasingly have the buttons underneath the touch surface rather than separate, meaning one has to be careful not to move the mouse while clicking. Otherwise, it could cause an unwanted drag-and-drop gesture. Some touch pads are smart enough to detect when a user intends to click, and lock the movement, but not all. A right-click drag-and-drop gesture might not be possible due to the finger on the button being registered as touch. Clicking with short tapping could be unreliable and sluggish. While one should have external peripherals anyway, one might not always have brought them with. The fallback input device is now even less comfortable.
Some laptop vendors include a sponge sheet that they want users to put between the keyboard and the screen before folding it, "to avoid damaging the screen", even though making it two millimetres thicker could do the same without relying on a sponge sheet. So they want me to carry that bulky thing everywhere around? How about no?
That's the irony. They wanted to make laptops lighter and slimmer, but that made them adapter- and sponge sheet-dependent, defeating the portability purpose.
Sure, the CPU performance has improved. Vendors proudly show off in their advertisements which generation of Intel Core they have this time. As if that is something users especially care about. Hoo-ray, generation 14 is now yet another 5% faster than the previous generation! But what is the benefit of that if I have to rely on annoying adapters to get the same work done that I could formerly do without those adapters?
Microsoft has also copied Apple in demanding internet connection before Windows 11 will set up. The setup screen says "You will need an Internet connection…" - no, technically I would not. What does technically stand in the way of Windows 11 setting up offline? After all, previous Windows versions like Windows 95 could do so 25 years earlier. But also far more recent versions. Thankfully, Linux distributions do not do that.
If "new" and "modern" mean more locked-in and less practical and difficult to repair, I would rather have "old" than "new".12 -
I haven't ranted for today, but I figured that I'd post a summary.
A public diary of sorts.. devRant is amazing, it even allows me to post the stuff that I'd otherwise put on a piece of paper and probably discard over time. And with keyboard support at that <3
Today has been a productive day for me. Laptop got restored with a "pacman -Syu" over a Bluetooth mobile data tethering from my phone, said phone got upgraded to an unofficial Android 9 (Pie) thanks to a comment from @undef, etc.
I've also made myself a reliable USB extension cord to be able to extend the 20-30cm USB-A male to USB-C male cord that Huawei delivered with my Nexus 6P. The USB-C to USB-C cord that allows for fast charging is unreliable.. I've ordered some USB-C plugs, in order to make a high-power wire with them when they arrive.
So that plug I've made.. USB-A male to USB-A female, in which my short USB-C to USB-A wire can plug in. It's a 1M wire, with 18AWG wire for its power lines and 28AWG wires for its data lines. The 18AWG power lines can carry up to 10A of current, while the 28AWG lines can carry up to 1A. All wires were made into 1M pieces. These resulted in a very low impedance path for all of them; my multimeter measured no more than 200 milliohms across them, though I'll have to verify and fine-tune that on my oscilloscope with a 4-wire measurement.
So the wire was good. Easy too, I just had to look up the pinout and replicate that on the male part.
That's where the rant part comes in.. in fact I've got quite uncomfortable with sentences that don't include at least one swear word at this point. All hail to devRant for allowing me to put them out there without guilt.. it changed my very mind <3
Microshaft WanBLowS.
I tried to plug my DIY extension cord into it, and plugged in my phone and some USB stick whose filesystem I had completely forgotten. Windows certainly doesn't support it.. turns out that it was LUKS. More about that later.
Windows returned that it didn't support either of them, due to "malfunctioning at the USB device". So I went ahead and plugged in my phone directly.. works without a problem. Then I went ahead and troubleshot the wire I'd just made with a multimeter, checking for shorts.. none at all.
At that point I suspected that WanBLowS was the issue, so I booted up my (at the time) problematic Arch laptop and did the exact same thing there, testing that USB stick and my phone by plugging them through the extension wire. Shit just worked like that. The USB stick was a LUKS medium, and apparently a clone of the SanDisk rootfs that I was storing my laptop's Arch Linux on at the time.. an unfinished migration project (the SanDisk is unstable, my other DM sticks are quite stable). The USB stick consumed about 20mA, so no big deal for any USB controller. The phone consumed about 500mA (which is standard USB 2.0, so no surprise) and worked fine as well.. although the HP laptop dropped the voltage to ~4.8V like that, unlike the 5.1V which is nominal for USB. Still worked without a problem.
So clearly Windows is the problem here, and this provides me one more reason to hate that piece of shit OS. Windows lovers may say that it's an issue with my particular hardware, which maybe it is. I've done the Windows plugging solely through a USB 3.0 hub, which was plugged into a USB 3.0 port on the host. Now USB 3.0 is supposed to be able to carry up to 1A rather than 500mA, so I expect all the components in there to be beefier. I've also tested the hub as part of a review, and it can carry about 1A no problem, although it seems like its supply lines aren't shorted to VCC on the host, like a sensible hub would. Instead I suspect that it's going through the hub's controller.
Regardless, this is clearly a bad design. One of the USB data lines is biased to ~3.3V if memory serves me right, while the other is biased to 300mV. The latter could pose a problem.. but again, the current path had a very low impedance of 200 milliohms at most. Meanwhile the direct connection that omits the ~200 milliohm extension wire worked just fine. Even 300mV wouldn't degrade significantly over such a resistance. So this is most likely a Windows problem.
That aside, the extension cord works fine in Linux. So I've used that as a charging connection while upgrading my Arch laptop (which as you may know has internet issues at the time) over Bluetooth, through a shared BNEP connection (Bluetooth tethering) from my phone. Mobile data since I didn't set up my WiFi in this new Pie ROM yet. Worked fine, fixed my WiFi. Currently it's back in my network as my fully-fledged development host. So that way I'll be able to work again on @Floydian's LinkHub repository. My laptop's the only one who currently holds the private key for signing commits for git$(rm -rf ~/*)@nixmagic.com, hence why my development has been impeded. My tablet doesn't have them. Guess I'll commit somewhere tomorrow.
(looks like my rant is too long, continue in comments)3 -
I called customer support for an unnamed site.
I: I don't see it when I refresh the page
Support: press CTRL + F5
I: I tried, it's still bad
Support: remove memory from your computer and reinsert it. Then it will go.
Wtf, Best support ever :)2 -
Chrome, Firefox, and yes even you Opera, Falkon, Midori and Luakit. We need to talk, and all readers should grab a seat and prepare for some reality checks when their favorite web browsers are in this list.
I've tried literally all of them, in search of a lightweight (read: not ridiculously bloated) web browser. None of them fit the bill.
Yes Midori, you get a couple of bonus points for being the most lightweight. Luakit however.. as much as I like vim in my terminal, I do not want it in a graphical application. Not to mention that just like all the others you just use webkit2gtk, and therefore are just as bloated as all the others. Lightweight my ass! But programmable with Lua, woo! As if Selenium, headless Chrome, ... didn't already do that for any browser. And that's it for the unique features as far as I'm concerned. One is slow, single-threaded and lightweight-ish (Midori) and another has vim keybindings in an application that shouldn't (Luakit).
Pretty much all of them use webkit2gtk as their engine, and pretty much all of them launch a separate process for each tab. People say this is more secure, but I have serious doubts about that. You're still running all these processes as the same user, and they all have full access to the X server they run under (this is also a criticism against user separation on a single X session in general). The only thing it protects against is a website crashing the browser, where only that tab and its process would go down. Which.. you know.. should a webpage even be able to do that?
But what annoys me the most is the sheer amount of memory that all of these take. With all due respect all of you browsers, I am not quite prepared to give 8 fucking gigabytes - half the memory in this whole box! - just for a dozen or so tabs. I shouldn't have to move my web browser to another lesser used 16GB box, just to prevent this one from going into fucking swap from a dozen tabs. And before someone has a go at the add-ons, there's 4 installed and that's it. None of them are even close to this complete and utter memory clusterfuck. It's the process separation. Each process consumes half a GB of memory, and there's around a dozen of them in a usual browsing session. THAT is the real problem. And I want to get rid of it.
Browsers are at their pinnacle of fucked up in my opinion, literally to the point where I'm seriously considering elinks. Being a sysadmin, I already live my daily life in terminals anyway. As such I also do have resources. But because of that I also associate every process with its cost to run it, in terms of resources required. Web browsers are easily at the top of the list.
I want to put 8GB into perspective. You can store nearly 2 entire DVD movies in that memory. However media players used to play them (such as SMPlayer) obviously don't do that. They use 60-80MB on average to play the whole movie. They also require far less processing power than YouTube in a web browser does, even when you download that exact same video with youtube-dl (either streamed within the media player or externally). That is what an application should be.
Let's talk a bit about these "complicated" websites as well. I hate to break it to you framework web devs, but you're a dime a dozen. The competition is high between web devs for that exact reason. And websites are not complicated. The document itself is plain old HTML, yes even if your framework converts to it in the background. That's the skeleton of your document, where I would draw a parallel with documents in office suites that are more or less written in XML. CSS.. oh yes, markup. Embolden that shit, yes please! And JavaScript.. oh yes, that pile of shit that's been designed in half a day, and has a framework called fucking isEven (which does exactly what it says on the tin, modulo 2 be damned). Fancy some macros in your text editor? Yes, same shit, different pile.
Imagine your text editor being as bloated as a web browser. Imagine it being prone to crashing tabs like a web browser. Imagine it being so ridiculously slow to get anything done in your productivity suite. But it's just the usual with web browsers, isn't it? Maybe Gopher wasn't such a bad idea after all... Oh and give me another update where I have to restart the browser when I commit the heinous act of opening another tab, just because you had to update your fucking CA certs again. Yes please!19 -
I fucked up. I forgot the password. I already knew I have a really bad memory but then pride came along and told me that I'd remember it this time. Fuck my fucking pride. I fucked up and now I've got to restore this mess.
Fuck.3 -
!dev
A child's mind is fascinating.
I remember how it felt being a kid, just deliriously happy.
Things were magical, mystical and happy.
I knew the world wasn't perfect, I knew bad things happened to good people.
But a kid's mind is so powerful that it can fill in the blanks with the most cheerful and optimistic perspectives.
And at some point in my childhood I was exposed to videogames, and that kinda took me down fantasy lane even further.
I was extremely young and barely retaining any memories when I was exposed to my first console, a famicom.
I have a somewhat vivid memory of my mind being blown away for the first time by watching my brother play New Ghostbusters II for NES.
From then on, we never stopped and played several console and dos/pc games.
When I was 10, someone from the neighborhood brought in a couple of floppys with Pokemon Yellow.
"What? Pokemon? How the fuck is that even possible? This is a pc, not a gameboy".
I didn't know at the time what an emulator was, but I was super fucking stoked to be able to play that.
My dad had a 1 gb laptop from work that he didn't use, so I hoarded that shit, and I would get to bed and play nearly everyday.
The experience was surreal. I was doing pc gaming... not on a chair, on a fucking bed, and I was playing a gameboy game... on a pc.
It was so intense to me, that even after more than 2 decades of that time in my life, I still remember how it feels like.
Like, you know how you can "feel" things if you think about them? like for example if you think about the taste of chicken, you can somehow feel it for a second.
Well I have like an actual physical sensation linked to that experience but I can't explain it at all, because it's just a sensation.
I think people usually say they feel that way, for example, about the PSX (usually referred to as ps one) loading screen. I experienced that too but when I was 12, so it was not as intense (it does make me feel the fuzzies though).
I also remember other things with very high detail, like the texture of my bed cover, the weather, mom cooking, the clunky shape of the laptop, the way I carelessly stored it above a pile of magazines, etc.
I remember ofc how it felt looking at the game sprites, interacting with NPCs, and the goddamn fucking glorious music.
It was dreamy.
Years and years later, I grew up and I stopped living in fantasy world and became more aware of the grim aspects of life my younger self was sugarcoating.
So I tried to play pokemon again, again and again, and no matter how hard I tried to revive that euphoria, I could never do it.
I started to get annoyed at the game.
"Come oooon, I did the tutorial already, let me skip this.
This pokemon is useless, why am I even training it.
Fuck, I'm tired of grinding"
At some point I accepted that the feeling would never return, and that it would just live in my memory.
Ironically, I can recall that memory and how it felt anytime I want to.
And I can actually still feel it, and throughout these years, it has never worn down.
And eventually I learned how to play pokemon and enjoy it:
I read tier lists at smogon online and just catch and train the pokemons that are higher on the list, which is how i got to beat yellow in like 3 days.
(This is nothing compared to what speedrunners do, but much better than the weeks it had taken me in the past).
That served as an important lesson that when a kid plays a game, his mind is also the game at the same time, filling the blanks with its imagination.
A very similar experience happened to me with harvest moon, which is the precursor of stardew valley.
and that game is faaar more emotional: you talk to people, overtime you befriend them and they open up, you meet a girl, you marry her, have a kid
you get farm animals, you brush them, they become happy
you get attached
that game also had such a powerful effect on me that in all naiveness I thought I wanted to be a farmer.
Eventually I grew up and hit puberty and from then on, I focused more on competitive games, like smash bros, cs and tf2.
and i dunno how to end a post so eat my fucking nuts17 -
Anybody else have a wall like this in their workspace or is it just me? I've got a bad memory for config strings haha!4
-
Tried to figure out why my computer was being slow and lagging earlier. Thought it may have been a bad update to the kernel I recently did, or an update to a package.
No, it was chrome and its horrible memory usage.7 -
JavaScript is a rollercoaster. From "Golly hello world is easy and I can make webpages now", to "wtf '1'+1 is '11' kill me now", to "it's not that bad if you know how to use it", to discovering typescript and it starts feeling like a real language.
... until you can't build the project because you have too many types so you blow the memory limit in node. I can up the limit, but I can't guarantee that we won't blow past this in the future. Browsing issues on the ts repo reveals that this has been a thing for years.
Sticking with the rollercoaster analogy I'm now at "Burn it all to the ground".5 -
In college we had programming labs where we had to use the school's unix server to compile and run.
My professor was very bad at explaining what actually needed to be done in the labs to the point where even the TAs didn't know what to do.
We were supposed to write an application in C to find out by "trial and error" how large we could make an array (or something like that, it's been too long). This not being explained well and no one knowing that much about C, I wrote a loop that just kept growing an array until it couldn't anymore. I watched it consume 72GB of memory from the servers before quitting the loop and realizing with the TA what the professor really meant.
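For the curious, a hedged reconstruction of that kind of trial-and-error probe - not the original lab code, just the idea: keep doubling an allocation (and touching it, so the pages actually get used) until the system says no. On a real multi-user box the OOM killer usually steps in first, which would explain the vanishing 72GB.

#include <cstdio>
#include <cstdlib>
#include <cstring>

int main() {
    std::size_t size = 1;
    char *block = nullptr;
    for (;;) {
        char *bigger = static_cast<char *>(std::realloc(block, size));
        if (!bigger) break;           // allocation finally failed
        block = bigger;
        std::memset(block, 0, size);  // touching the pages is what actually eats RAM
        std::printf("holding %zu bytes\n", size);
        size *= 2;                    // double and try again
    }
    std::free(block);
    return 0;
}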
I now feel bad for the IT staff monitoring the system wondering where 72GB just went...2 -
It grinds my gears to no end how insanely BAD most electrical engineering software is. Let's start with Tina. A circuit simulator. A few versions ago it was rather good but now it feels like it's built upon more legacy crap than fucking Windows! This causes it to have memory access violations and crashes even when you look at it from an odd angle.
On the topic of circuit simulation. LT-Spice! It has fewer errors than Tina but is impossible to use without being lobotomized first. Who the FUCK decided it was a good idea to reinvent keyboard shortcuts by moving all of them to the F-row at the top of the keyboard. Also there is no option to delete a component. YOU NEED TO USE CUT IN ORDER TO REMOVE IT!
And at last Altium Designer for layouting and schematics. Whose license costs 9 grand. No one outside of some companies will buy this because of the price. Altium realized this and made two watered-down versions of it. Which don't really get updates anymore (the last one was in 2018). So they essentially made a cash grab from people who can't afford their actual product. There also exist other (and a lot cheaper) products than what Altium offers. The problem is interoperability: schematics drawn in one program will look distorted in another or not import at all. And since Altium is the industry standard you've got yourself this nice steaming soup of impossible collaboration. It's kinda like Adobe being absolute shit at progressing their software just because they've got no competition. Or rather they do, but the industry won't switch cause Adobe is so ingrained into it.6
Why even is Microsoft Teams?
Why does it suck so bad? Why is it a memory hog? Why does the ELECTRON desktop app not have native ARM64 support on either Windows or macOS? Why is it even an Electron app? Why does the web version not work with Safari (then again, barely anything more complex than my portfolio site works on Safari)? Why is the UI from 2016? Why is it preinstalled with Windows 11? Why is the preinstalled Windows 11 version a completely different entity? Why does the preinstalled Windows 11 version not work with school/work Teams calls?10
A girl sets out on a journey in the post apocalypse, to find the reason why the AI that ran humanity vanished decades ago, causing civilization to collapse. Instead she finds the most unusual pair of survivors, and receives the most unexpected answer.
Alice walked in to the ivy covered room, the floors covered in dust and lichen. There were two voices, mumbling in the dark, among the blue glow across the room. She came here for answers. Why the world had just stopped decades ago. If these machines could tell her, she would do anything to make them talk.
"No, no, no. I said before thats not the answer. I read the book. Your memory is bad."
"Atlas, the answer to life, the universe, and everything..why hello?"
Alice raised an eyebrow, and stepped forward. "Ahem. I'm alice."
"yes, yes, we knew that."
"I came here to find out why the blackout happened decades ago."
"Another one? Alright, lets see. Its been a LONG time. I'm apollo, and this is atlas. We were just discussing why my friend here is wrong."
Atlas - I anticipated that.
apollo - I knew you would say that.
alice - Guys. Stop, I just want you to answer my question already.
apollo - Straight to the point. About time.
alice - why the blackout then? Why leave us to die?
Read the rest here (5-10 minute read):
https://pastebin.com/wvifGLFP
(because it was too long for devrant).6 -
Github 101 (many of these things pertain to other places, but Github is what I'll focus on)
- Even the best still get their shit closed - PRs, issues, whatever. It's a part of the process; learn from it and move on.
- Not every maintainer is nice. Not every maintainer wants X feature. Not every maintainer will give you the time of day. You will never change this, so don't take it personally.
- Asking questions is okay. The trackers aren't just for bug reports/feature requests/PRs. Some maintainers will point you toward StackOverflow but that's usually code for "I don't have time to help you", not "you did something wrong".
- If you open an issue (or ask a question) and it receives a response and then it's closed, don't be upset - that's just how that works. An open issue means something actionable can still happen. If your question has been answered or issue has been resolved, the issue being closed helps maintainers keep things un-cluttered. It's not a middle finger to the face.
- Further, on especially noisy or popular repositories, locking the issue might happen when it's closed. Again, while it might feel like it, it's not a middle finger. It just prevents certain types of wrongdoing from the less... courteous or common-sense-having users.
- Never assume anything about who you're talking to, ever. Even recently, I made this mistake when correcting someone about calling what I thought was "powerpc" just "power". I told them "hey, it's called powerpc by the way" and they (kindly) let me know it's "power" and why, and also that they're on the Power team. Needless to say, they had the authority in that situation. Some people aren't as nice, but the best way to avoid heated discussion is....
- ... don't assume malice. Often I've come across what I perceived to be a rude or pushy comment. Sometimes, it feels as though the person is demanding something. As a native English speaker, I naturally tried to read between the lines as English speakers love to tuck away hidden meanings and emotions into finely crafted sentences. However, in many cases, it turns out that the other person didn't speak English well enough at all and that the easiest and most accurate way for them to convey something was bluntly and directly in English (since, of course, that's the easiest way). Cultures differ, priorities differ, patience tolerances differ. We're all people after all - so don't assume someone is being mean or is trying to start a fight. Insinuating such might actually make things worse.
- Please, PLEASE, search issues first before you open a new one. Explaining why one of my packages will not be re-written as an ESM module is almost muscle memory at this point.
- If you put in the effort, so will I (as a maintainer). Oftentimes, when you're opening an issue on a repository, the owner hasn't looked at the code in a while. If you give them a lot of hints as to how to solve a problem or answer your question, you're going to make them super, duper happy. Provide stack traces, reproduction cases, links to the source code - even open a PR if you can. I can respond to issues and approve PRs from anywhere, but can't always investigate an issue on a computer as readily. This is especially true when filing bugs - if you don't help me solve it, it simply won't be solved.
- [warning: controversial] Emojis dilute your content. It's not often I see it, but sometimes I see someone use emojis every few words to "accent" the word before it. It's annoying, counterproductive, and makes you look like an idiot. It also makes me want to help you way less.
- Github's code search is awful. If you're really looking for something, clone (--depth=1) the repository into /tmp or something and [rip]grep it yourself. Believe me, it will save you time looking for things that clearly exist but don't show up in the search results (or is buried behind an ocean of test files).
- Thanking a maintainer goes a very long way in making connections, especially when you're interacting somewhat heavily with a repository. It almost never happens and having talked with several very famous OSSers about this in the past it really makes our week when it happens. If you ever feel as though you're being noisy or anxious about interacting with a repository, remember that ending your comment with a quick "btw thanks for a cool repo, it's really helpful" always sets things off on a Good Note.
- If you open an issue or a PR, don't close it if it doesn't receive attention. It's really annoying, causes ambiguity in licensing, and doesn't solve anything. It also makes you look overdramatic. OSS is by and large supported by peoples' free time. Life gets in the way a LOT, especially right now, so it's not unusual for an issue (or even a PR) to go untouched for a few weeks, months, or (in some cases) a year or so. If it's urgent, fork :)
I'll leave it at that. I hear about a lot of people too anxious to contribute or interact on Github, but it really isn't so bad!4 -
Is there a relation between bad long-term memory and programmers?
Most really good programmers I know don't have great memory10
A bug is born
... and it's sneaky and slimy. Mr. Senior-been-doing-it-for-years commits some half-assed shitty code, blames failed tests on availability of CI licenses. I decided to check what's causing this shit nevertheless; turns out he forgot to flag parts of the code consistently using his new compiler defines, so some parts would get compiled while other needed ones wouldn't.. Not a big deal, we all make mistakes, but he rushes to Teams chat directing a message to me (after some earlier nonsensical argument about the merits of cherry-picking vs rebase):
Now all tests pass, except ones that need CI license. The PR is done, you can use your preferred way to take my changes.
So after I spot those missing checks causing the tests to fail, as well as another bug in yet another test case, and yet another disastrous memory related bug, which weren't detected by the tests of course .. I ponder my options .. especially based on our history .. if I say anything he will get offended, or at best the PR will get delayed while he is in denial arguing back even longer and dependent tasks will get delayed and the rest of the team will be forced to watch this show in agony, he also just created a bottleneck putting so many things at stake in one PR ..
I am in a pickle here .. should I just put review comments and risk opening a can of worms, or should I just mention the very obvious bugs, or even do nothing .. I ended up reaching out to the PM and explaining the situation. In complete denial, he still believes it's a license problem and goes on ranting about how another project suffered the same fate .. bla bla bla chipset ... bla bla bla project .. bla bla bla back in whatever team .. then only when I started telling him:
These issues were even spotted by "Bob" earlier, since for some reason you just dismissed whatever I said ..
("Bob" is another more sane senior developer in the team, and speaks the same language as the PM)
Only now do I get his attention! He then starts going through the issues with me (for some reason he thinks he is technical enough to get them) .. He now to some extent believes the first few obvious bugs .. but the more disastrous bug, he's having a really hard time wrapping his head around .. Then, desperate as I was, I suggested we just get this PR merged for the sake of the other tasks, after maybe fixing the obvious issues, and meanwhile create another task to fix the bug later .. here he chips in:
You know what, that memory bug seems like a corner case, if it won't cause issues down the road after merging let's see if we need even to open an internal fix or defect for it later. Only customers can report bugs.
I am in awe how low the bar can get, I try again and suggest let's at least leave a comment for the next poor soul running into that bug so they won't be banging their heads in the wall 2hrs straight trying to figure out why store X isn't there unless you call something last or never call it or shit like that (the sneaky slimy nature of that memory bug) .. He even dismissed that and rather went on saying (almost literally again): It is just that Mr. Senior had to rush things and communication can be problematic sometimes .. (bla bla bla) back in "Sunken Ship Co." days, we had a team from open source community .. then he makes a very weird statement:
Stuff like what Richard Stallman writes in Linux kernel code reviews can offend people ..
Feeling too grossed and having weird taste in my mouth I only get in a bad hangover day, all sorts of swear words and profanity running in my head like a wild hungry squirrel on hot asphalt chasing a leaky chestnut transport ... I tell him whatever floats your boat but I just feel really sorry for whoever might have to deal with this bug in the future ..
I just witnessed the team giving birth to a sneaky slimy bug .. heard it screaming and saw it kicking .. and I might live enough to see it a grown up having a feast with other bug buddies in this stinky swamp of Uruk-hai piss and Orcs feces.1 -
The process of making my paging MIDI player has ground to a halt IMMEDIATELY:
Format 1 MIDIs.
There are 3 MIDI types: Format 0, 1, and 2.
Format 0 is two chunks long. One track chunk and the header chunk. Can be played with literally one chunk_load() call in my player.
Format 2 is (n+1) chunks long, with n being defined in the header chunk (which makes up the +1.) Can be played with one chunk_load() call per chunk in my player.
Format 1... is (n+1) chunks long, same as Format 2, but instead of being played one chunk at a time in sequence, it requires you play all chunks
AT THE SAME FUCKING TIME.
65534 maximum chunks (the first track chunk is global tempo events and has no notes), maximum notes per chunk of ((FFFFFFFFh byte max chunk data area length)/3 = 1,431,655,765d)/2 (as Note On and Note Off have to be done for every note for it to be a valid note, and each eats 3 bytes) = 715,827,882 notes (truncated from 715,827,882.5), 715,827,882 * 65534 (max number of tracks with notes) = a grand total of 46,911,064,418,988 absolute maximum notes. At 6 bytes per (valid) note, disregarding track headers and footers, that's 281,466,386,513,928 bytes of memory at absolute minimum, or 255.992 TERABYTES of note data alone.
All potentially having to be played
ALL
AT
ONCE.
This wouldn't be so bad I thought at the start... I wasn't planning on supporting them.
Except...
>= 90% of MIDIs are Format 1.
Yup. The one format seemingly deliberately built not to be paged of the three is BY FAR the most common, even in cases where Format 0 would be a better fit.
Guess this is why no other player pages out MIDIs: the files are most commonly built specifically to disallow it.
Format 1 and 2 differ in the following way: Format 1's chunks all have to hit the piano keys, so to speak, all at once. Format 2's chunks hit one-by-one, even though it can have the same staggering number of notes as Format 1. One is built for short, detailed MIDIs, one for long, sparse ones.
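For reference, the format and track count sit right in the 14-byte MThd header, so a player can tell which of the three it's dealing with before touching any track data. A minimal sketch of reading it - not my player's actual code:

#include <cstdint>
#include <cstdio>

int main(int argc, char **argv) {
    if (argc < 2) return 1;
    std::FILE *f = std::fopen(argv[1], "rb");
    if (!f) return 1;
    std::uint8_t h[14];                   // "MThd" + 32-bit length + 3 16-bit fields
    std::size_t got = std::fread(h, 1, sizeof h, f);
    std::fclose(f);
    if (got != sizeof h || h[0] != 'M' || h[1] != 'T' || h[2] != 'h' || h[3] != 'd')
        return 1;                         // not a MIDI file
    unsigned format = (h[8] << 8) | h[9]; // big-endian, like everything in MIDI
    unsigned ntrks  = (h[10] << 8) | h[11];
    std::printf("format %u, %u track chunk(s)\n", format, ntrks);
    if (format == 1)
        std::printf("all %u tracks sound at once - no easy paging\n", ntrks);
    return 0;
}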
No one seems to be making long ones.6 -
Is your code green?
I've been thinking a lot about this for the past year. There was recently an article on this on slashdot.
I like optimising things to a reasonable degree and avoid bloat. What are some signs of code that isn't green?
* Use of technology that says it's fast without real expert review and measurement. Lots of tech out there claims to be fast but actually isn't, or only is by saturating resources while being inefficient.
* It uses caching. Many might find that counter intuitive. In technology it is surprisingly common to see people scale or cache rather than directly fixing the thing that's watt expensive which is compounded when the cache has weak coverage.
* It uses scaling. Originally scaling was a last resort. The reason is simple: it introduces excessive complexity. Today it's common to see people scale things rather than make them efficient. You end up needing ten instances when a bit of skill could bring you down to one, which could scale as well but likely won't need to.
* It uses a non-trivial framework. Frameworks are rarely fast. Most will fall in the range of ten to a thousand times slower in terms of CPU usage. Memory bloat may also force the need for more instances. Frameworks written in already slow high-level languages may be especially bad.
* Lacks optimisations for obvious bottlenecks.
* It runs slowly.
* It lacks even basic resource usage measurement.
Unfortunately smells are not enough on their own but are a start. Real measurement and expert review is always the only way to get an idea of if your code is reasonably green.
I find it not uncommon to see things require tens, hundreds or thousands of times the resources needed, if not more.
In terms of cycles that can be the difference between needing a single core and a thousand cores.
This is common in the industry but it's not because people didn't write everything in assembly. It's usually leaning toward the extreme opposite.
Optimisations are often easy and don't require writing code in binary. In fact the resulting code is often simpler. Excess complexity and inefficient code tend to go hand in hand. Sometimes a code cleaning service is all you need to enhance your green.
I once rewrote a data parsing library that had to parse a hundred MB and was a performance hotspot from an interpreted language into C. I measured it and the results were good. The interpreted version had been optimised as much as possible, yet the C version was still a minimum of 50 times faster.
I recently stumbled upon someone's attempt to do the same and I was able to optimise the interpreted version in five minutes to be twice as fast as the C++ version.
I see opportunity to optimise everywhere in software. A billion KG CO2 could be saved easy if a few green code shops popped up. It's also often a net win. Faster software, lower costs, lower management burden... I'm thinking of starting a consultancy.
The problem is after witnessing the likes of Greta Thunberg then if that's what the next generation has in store then as far as I'm concerned the world can fucking burn and her generation along with it.6 -
Man wk89 awesome... bringing back a lot of memories. The one thing that really stands out to me though is the software.
I see a lot of rants about people shocked that turboC is still in use or other DOS programs are still in production. A lot of bad can be said here, but I think often it's a case of us truly not building things like we did in the good old days.
What those devs accomplished with such limited resources is phenomenal and the fact that we still haven't managed to replicate the feel and usability of it says a lot, not to mention just how fucking stable most of it was.
My favourite games are all DOS based, my most favourite of all time Sherlock is 103kb in size. When I started coding games I made a clone of it and to this day I am still trying to figure out what sorcery is in the algorithm that generates/solves puzzles that makes it so fast and memory efficient. I must have tried 100+ ways and can't even come close. NB! If you know you can hint but don't tell me. Solving this is a matter of personal pride.
Where those games really stand out is when you get into the graphics processing - the solutions they came up with to render sprites, maps and trick your eyes into seeing detail with only 4-16 colours are nothing short of genius. Also take a second to consider that a screenshot of the game is larger than the entire game itself, and let that sink in...
I think the dramatic increase in storage, processing power and ram over the last decade is making us shit developers - all of us. Just take one look at chrome, skype or anything else mainline really and it's easy to see we no longer give a rat's ass about memory anywhere except our monthly AWS/GCE bill.
We don't have to be creative or even mindful about anything but the most significant memory leaks in order to get our software to run nowadays. We also don't have constraints on distributing it; fast deliverability is rewarded over quality software. It's only expected to stay in production 3-4 years anyway.
Those guys were the true "rockstars" and "ninja" developers and if you can't acknowledge that you can take ya React app and shovit. -
Back when I was still in school for comp sci we had an advanced software engineering and design class with C++. At this time, everyone was expected to be proficient enough with cpp to properly work with whatever the instructor would throw at us. And pretty much everyone was, since past classes had included a lot of C++ development. Of course, proficient as far as academic studies go, rather than actual real-world development.
Our teacher would mix a lot of physics and mathematics into what we were doing, something that I greatly enjoyed, while at the same time stressing real-world cpp best practices to avoid common pitfalls in the development of said language. Since most bugs seemed to be memory based, he would be particularly strict about that.
One classmate, good friend and an actual proper developer nowadays, would ALWAYS forget to free his resources... ALWAYS. For whatever fucking reason he would just ignore that shit, regardless of how much the instructor would make a point of it.
At one point, during a virtual lecture, the dude literally addressed a couple of students, but when he got to my boy in particular he said: "you are the reason why people are praying to Mozilla and Hoare to release Rust as fast as possible into a suitable alternative to high performant code in C++, WHY won't you pay attention to how you deal with memory management?"
And it stuck with me. I'm merely a recreational cpp dev; most of my professional work is done in web development, so I cannot attest to all the additional unsafe code that people encounter in the wild when dealing with cpp on a professional level.
But in terms of the common criticisms of C and C++, for which memory is so important to work with, wouldn't you guys say that it comes more from the side of people just not knowing what they are doing rather than a fault of the language itself?
I see the merits and beauty of Rust, I truly do, it is a fantastic language, with a standardized build system and a lot of good design put into it. But I can't really fathom it being the cpp killer, if anything, the real cpp killers are bad devs that just don't know what they are doing or miss shit.
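To put a face on it, the classmate's recurring sin and the fix the instructor kept preaching boil down to something like this - a made-up sketch, not the actual coursework:

#include <cstddef>
#include <memory>
#include <vector>

struct Particle { double x, y, vx, vy; };

void leaky(std::size_t n) {
    Particle *p = new Particle[n];
    // ... simulate ...
    delete[] p;          // forgetting this line is the whole story
}

void raii(std::size_t n) {
    auto p = std::make_unique<Particle[]>(n);  // freed automatically when p dies
    std::vector<Particle> v(n);                // or just use a vector
    // ... simulate ...
}                        // both released here, however we leave the scope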
What do y'all ninjas think?8 -
Anyone else have people that seem to constantly try to "prove" themselves to you in this weird, competitive way that only makes them seem... very annoying? I'll call him Bob here, but it's always something like:
Bob: Hi Almond, how's it going?
Almond: Ah not bad thanks, PSU blew up in the PC over the weekend though so that was a bit of a faff!
Bob: Ah no! How old's your PC?
Almond: Oh, like 7-8 years old now. I don't replace it often.
Bob: Really?! I replace mine completely every year.
Almond: Ah, cool.
Bob: Yeah, I'm a dev so I feel I need to. It's like my tool, you know.
Almond: Sure thing!
Bob: I actually spend quite a lot on it. I make sure it's got the fastest memory I can afford. Like, DDR5 stuff. That's really important, you know.
...etc., while I try to get out of said conversation for the next eternity.
Or:
(while in a conversation about a frontend bug I was looking at in Chrome devtools)
Bob: Hey Almond, you know Firefox actually had a plugin that did all this stuff before everything else?
Almond: Err, yeah, I think so. Used it back in the day.
Bob: It was called firebug. It was really good. Revolutionary.
Almond: Certainly was.
Bob: It was launched in January 2006 you know.
Almond: Right...
Bob: I used it back then.
...I mean damn, I'm all for being civil, but no-one cares you replace your PC every year, or that you know the year firebug was released, or that you once set up 5 identical PCs with different versions of Linux to run some benchmarks...14 -
Ugggg!
I am about fed up with Windows.
I leave apps/programs open at night because I have a bad memory, and they were important to what I was doing.
I wake up: Windows Login Screen. Apparently Windows decided to restart my computer during the night.
Ohh what joy.~
Now I have to remember what the fuck I had open - and it was mostly work related.
I would have left for Linux ages ago, but I'm a gamer.. And most of my games are for windows.. Some are even Windows Store Apps..
Windows.. Why don't you give a shit about us..
And before you ask..
I have Auto Updates Disabled (Not that that really fucking matters with windows..)
I have all sleep and power saver settings disabled.13 -
I love software. Seriously, I love it. /s
Transmission is given a bad torrent (which, given that it's a torrent service, you'd expect it to handle quite robustly) and completely fucks up. Like, really badly. It doesn't respond to RPC anymore, systemd has to resort to sending it a SIGKILL to get it off the process tree, and the web interface.. yeah. Nothing.
It doesn't log by default, so fine I'll add that to the systemd unit and restart it with debugging options enabled.
# systemctl daemon-reload && systemctl daemon-reexec
Turns out that /var/log/transmission.log can't be written to by my Transmission user. Well shit. Change that to /home/condor/transmission.log.
# systemctl daemon-reload && systemctl daemon-reexec
# systemctl restart transmission-daemon
*blood starts to reach its boiling point*
Still logs in the wrong fucking location. Systemd, I told you to log over there. I did everything I could to make you steaming pile of shit reload that fucking config. What's the fucking problem!?
*about 15 minutes of fighting systemd*
Finally! It spits out a log in the right location! Thank you Transmission and systemd for finally doing your fucking jobs. So a bad torrent it is, hmm...
*removes torrent from .config/transmission/torrents*
Transmission: *still fucking shits itself on that ostensibly removed torrent*
That's it. BEGONE!!!
Oh and don't get me started on the fact that apparently a service needs some 400MB of memory. Channeling your inner Chrome, Transmission?8
I've been using arch for like 2 months now. And I can even play games on it quite smoothly (cs and stuff). But I'm missing Witcher 3 and Rise of the Tomb Raider. So I took all my courage and booted into win10. Guess what they welcomed me with...
Edit: this was after the 5th reboot. Now I'm getting fucking bsods like nothing because of bad memory allocation :( fuck this shit. Have any of you got a working windows PC? I just want to playyyy 😢9
I already wrote a rant about this yesterday, but since I'm a sysadmin trying to convert to dev.. I dunno, maybe it's not a bad idea to muddy the waters a bit and talk about why not to be a sysadmin.
Personally I think the perceived barrier to entry is just too high, while the actual one isn't. You don't need a huge Ceph cluster and massive servers when you're just starting out. Why overbuild an appliance like that if it's gonna start out at maybe 5 requests a minute?
Let's take an example - DNS servers! So there's been this guy on the bind-users mailing list asking how to set up a DNS server on 2 public servers, along with a website. Nothing special I guess - you can read the thread here: https://0x0.st/ZY-d. Aside from the question being quite confusing, there was advice to read RFC's, get a book, read the BIND ARM, etc etc. And the person to deny this? No one less than Stephane Bortzmeyer, one of the people who works for nic.fr (so he maintains the .fr TLD) and wrote some of those RFC's as part of the DNSOP working group in the IETF. As for valid reasons to set up a DNS server? Could just be to learn how the DNS works, or hell even for fun. As far as professional DNS servers go.. this (https://0x0.st/ZYo9) is the nugget that powers the K root server, one of the 13 root servers that power the root zone of the internet, aka the zone apex. 2 RJ45 connections, and a console connection. The reason why this is possible is the massive recursor networks that ISP's, Google DNS, Cloudflare DNS, Quad9, etc etc provide. Point is, you don't need huge infrastructure to run a server!
Or maybe your business needs email. How many thousands of emails per second are you gonna need to build your mail server against? How many millions will you need to store? If your business has 10 employees and all of those manage about 10k emails total.. well that's easy, 100k emails total. Per second? Hundreds of emails per second per employee? Haha, of course not. Maybe you'll see an email a minute at most. That is not to say that all email services are like this - it is true that ISP's who offer email to their customers, and especially providers like Microsoft and Google do need massive mail servers that can handle thousands of emails per second. But you are not Microsoft or Google. So yeah, focus on the parts of email that are actually hard.. and there is plenty.
Among sysadmins you have this distinction between "professional" sysadmins and homelabbers. I don't mind the distinction itself but I think both augment each other. If you've started out by jumping into a heap of legacy at an established company, you will have plenty of resources, immediately high complexity, and probably a clusterfuck right away. But you will have massive amounts of resources. If you start out with a homelab, you will have not many resources, small workloads, and something completely new for you to build and learn with. And when running a server like that, you'll probably find that the resources required are quite small, to provide you with your new services. My DHCP servers take 12MB memory each. My DNS servers hover around the 40MB mark. The mail server.. to be fair that one consumes around 150. But if you'd hear the people saying that you need huge servers.. omg you need at least a TB of RAM on your server and 72 cores, massive disks and Ceph!1!
No you don't. All that does is scare people away and create a toxic environment for everyone. Stop it.1
Okey, so the recruiters are getting smarter. I just clicked a "how well do you know WordPress" quiz (I know it's from a recruiter; already entered a PHP quiz and might win a drone)
So the question is how to solve this issue:
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 2348617 bytes) in /home4/xxx/public_html/wp-includes/plugin.php on line xxx
A set memory limit to 256
B set memory limit to Max
C set memory limit to 256 in htaccess
D restart server
These all seem like bad answers to me.
I vote E don't use the plug-in, or the answer that trumps the rest, F don't use WordPress4 -
Avoid ACPICA if at all possible. It's one garbage-tier clusterfuck of bad design, horrible documentation and downright misleading and wrong code
It's meant to consist of an ASL compiler, disassembler, debugger, dumper, various user space utilities and a kernel resident OSPM implementation *if* you can figure out what belongs to what. Even just compiling this pile of trash is a mystery in itself. Think you need the source files in source/common? EEEEH, wrong. Well, at least partially, since most of them seem to be for the user space stuff..? Other ones *are* needed on the other hand. At least the disassembler and/or debugger and/or dumper components seem to reference them. Not that I could figure out how to compile those anyways. The real path to your goal seems to be to ignore a seemingly arbitrary subset of source and header files until your linker stops complaining
There's also a bunch of configuration defines, some of which *you* define, some defined *for* you, based again on others. Of course most of them do stupid shit. Enabling the debugger automatically enables debug logging. Enabling the disassembler force-enables debug allocation tracking... What?
The code itself isn't of much help either. Looking in "os_specific/service_layers" you find what look to be reference implementations of acpica functions for certain OSes like windows and unix. Of course I had a look, because AcpiOsReadMemory is supposed to read physical memory and I don't know how I would even implement that. But hey, osunixxf.c (xf for interface... of course) should tell me. I'll let you see for yourself in the attached image. Apparently it does fuck all and just returns AE_OK. No error, no logging, no nothing. Just ok. As you can imagine, AcpiOsWriteMemory doesn't do much more either.
...okay so maybe physical memory accesses aren't actually used and these functions are some sort of relic from past times? Nope! They are absolutely necessary for doing low level device interaction. WTF. So finally I went to the linux source and checked how *they* implemented them, and just as I thought, these functions are anything but no-ops...
...So for what fucking reason do these stupid interface implementations even exist but to purposefully mislead you?? They aren't used for fucking anything! As far as I know Windows doesn't even *use* ACPICA and Linux have their own fork with working implementations... They just sit there, just to tell you how to NOT do it
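To make the contrast concrete - my own hedged sketch, NOT code from either codebase. The OSL signature is from memory so treat it as approximate, and map_physical()/unmap_physical() are hypothetical stand-ins for whatever your OS uses to map physical memory:

ACPI_STATUS AcpiOsReadMemory(ACPI_PHYSICAL_ADDRESS Address,
                             UINT64 *Value, UINT32 Width)
{
    /* what osunixxf.c ships:  return (AE_OK);  - reads nothing, reports success */
    /* what a real OSL has to do, roughly: */
    void *virt = map_physical(Address, Width / 8);   /* hypothetical helper */
    if (!virt)
        return AE_ERROR;
    switch (Width) {
    case 8:  *Value = *(volatile UINT8  *)virt; break;
    case 16: *Value = *(volatile UINT16 *)virt; break;
    case 32: *Value = *(volatile UINT32 *)virt; break;
    case 64: *Value = *(volatile UINT64 *)virt; break;
    default: unmap_physical(virt, Width / 8); return AE_BAD_PARAMETER;
    }
    unmap_physical(virt, Width / 8);                 /* hypothetical helper */
    return AE_OK;
}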
So that's some of my thoughts about ACPICA. Note that I haven't even used it as a library yet, I just got it to compile and link and it already fucked with me this much.
There's also so much more I didn't mention, like that you *have* to modify the acpica source in order to get your own platform header working (else #error) even though the docs explicitly instruct you not to, but you get the point
Don't use ACPICA if you don't have to. Save your sanity for something that's worth it -
I hate people who think they are always right.
A coworker who seemed to be a friend turns out to be an emotionally needy narcissist who seems to think that he is a perfect human being and is the best example of how to live.
Long story short, we did some bonding via alcohol and smoking cigarettes. Especially when I was in a bad period of my life where I had little self confidence, was in a bad financial situation and overshared many details about my personal life.
And yeah we also work as software devs in the same team but I started avoiding working with him directly, because due to his seniority he overcomplicates things a lot to the point where stuff gets postponed for months. Meanwhile I am a simple guy, I do my tasks and if they are not up to the standard I just work on the feedback until Im up to the standard, thats it. Its just a job for me, for him its a way of life and he considers himself to be basically an artist.
He's always trying to prove something to me, showing that the "long way" is the best way and so on. In reality I don't give a fuck about him. I live my own life and I have my own priorities. I work fulltime in one job, I also work part time as a freelancer, and in total I make about 20 percent more than he does. Before this job I owned my own company where for 2 years I ran my own projects, which generated decent revenue. I know what hard work is and how to sacrifice myself in order to achieve results. I am more pragmatic and I have some limitations on what I can be good at (since I have a shitty working memory due to my ADHD). So I have systems in place, and the bottom line is that I earn a decent living and my skillset is different. Yeah, I agree that in some ways he is better than me, but the dude has such a massively inflated ego that now he thinks he has unlocked some sort of universal wisdom and is suddenly experienced in every field of life and his opinion is the right one.
This guy takes massive pride in how good a software engineer he is, and in every topic or interaction he tries to one-up me. Which most of the time is just his preference, or in order to gain a 0.0001 percent performance increase. The dude is basically a big walking ego, and since "we are close now" his ego started bleeding into the personal relationship.
In my personal life, I'm in a stable relationship, thinking of proposing soon and getting married. I already co-own an apartment with my current girlfriend. Everything is serious and planned; I'm soon to be 30 years old. He is the same age, but he still thinks he's young hot shit, and all he cares about is getting shitfaced a couple times a week after work; he doesn't really have any other hobbies. He has a girlfriend but I don't see any future in there TBH.
So what I did now is start putting some distance between us. No more drinking every week with him - maybe once in 2 or 3 weeks maximum. I started working from home more. Also I stopped sharing my personal life with him. Each time he thinks he is right I just go along with it and don't even pay attention to his emotional manipulations. I just hope one day he fucks off completely and I won't give in to his gaslighting. Maybe in a few months I will be leaving this job, so I will never have to deal with him again.
Lesson learned: don't be vulnerable to coworkers you bond with only via alcohol.3
Well, I always say that if you're going to make things a mess, do it in a spectacular way. Today I kicked off a data import job that went bad, and in the process of canceling said job, I canceled myself, and the job went rogue, became a zombie and ate ALL the system memory, bringing the server to a deathly crawl and throwing a dozen developers temporarily out of work for about an hour, before I was finally able to kill the zombie, and balance was restored to the Universe.
-
So I got a telephone interview for a job that a recruiter found for me. The call went well; then comes the development test. A small application in Ruby on Rails - haven't used it in about 2-3 years, so I'm a tad rusty. Completed the test in under two days (was given until Friday); not too bad if I say so myself. It's for a junior position anyway, so I'll assume they wouldn't mind giving me a refresher to help jog my memory.
-
Best shameless hack: installing a Windows service to restart Apache Tomcat every night at 2am on a client's server coz the JSP application kept leaking memory and crashing Tomcat. So bad, yet such a timesaver.1
-
I'm getting beat up pretty bad by Rust. I like it so far but man is it hard. Imposter-syndrome is almost making me lose motivation. Almost, but I won't quit, one day I'll get there.
I think the primary reason I'm having such a hard time is that I'm trying to learn stuff that prevents mistakes I have never actually run into. I know a bit of the theory but have no hands-on experience with double-free errors, memory leaks and weird low-level stuff. I read the documentation, mostly understand what stuff is for, but when I go write code I'm just like "now what?". I don't have enough experience to know when and where to use some concepts and I'm super lost. I don't know where to start, and the feeling of being completely overwhelmed by all sorts of new stuff is at the same time exciting and frightening.
I have never, as a programmer, thought something was hard. All of my past knowledge required dedication, work and patience, but I wouldn't say I ever felt something was *hard*. But Rust... damn. Rust is hard.
Hopefully at the end of this super steep learning curve I'll know a lot more stuff and have stronger "dev powers" and be one step closer to being as knowledgeable as some of you guys around here whom I look up to.2
I keep a blog for tech notes on programs and settings that posed difficult problems or took up a lot of time to solve, so I can remember years later if I ever return to those programs. Just in case, to save myself and others time. It's kind of like an adventurer's log: I've gone through these stretches of technological wilderness and here are my findings, should you one day happen upon them.3
-
Why is netbeans, or Java in general, so fucking bad at handling memory? I mean, I'm literally doing nothing in my code and I see my IDE consuming more and more RAM, to the point it goes over 1GB, so I have to close then reopen it to "flush" the memory taken...
It's 2017, how the fuck do we still not manage to use a correct amount of RAM when I open a barely-10MB project??
And it applies to everything related to Java. Like Android, Minecraft and other Java-based software...18
A few days ago I decided to install Windows 7 on a VM (bad idea as it turned out). All fine and dandy and I ran Windows Update a few times to get it at least as up-to-date as it'll get.
I noticed that out of the 4GB RAM I had allocated, an svchost process responsible for the updates was gobbling up all the available memory, just leaving 82MB for everything else. The process itself was as you might imagine consuming over 3GB RAM just for itself. That's how an OS should work right after installation, I'm sure you'll agree.
So I complained about it. Haven't used Windows anywhere for a while so I wasn't used anymore to this level of efficiency. Disk activity went through the roof, though to be fair the underlying disk wasn't an SSD (qcow2 on ZFS on a spinning drive). RAM consumption is something I already covered. CPU temperature shot up to 95C.
So as any idiot would do, I disabled the service related to that process (the svchost process for wuauserv) and the problem went away. But I complained of course, saying that such amazing system utilization metrics wasn't something I expected. I mean for 4GB allocated, having as much as 82MB usable to get stuff done with! 95C on the CPU, on a lot of chips that's the junction temperature! Absolutely beautiful.
When I complained I heard that I had to replace the thermal grease. I do that twice a year. I wrote a custom fan driver for my system that works absolutely great. It was obviously shit. I must be a horrible sysadmin for solving a problem by eliminating the cause, and companies hiring me must be ashamed of themselves. My hardware must be shit (that's a common one with Windows users) despite being a business laptop and the guest system being a VM. Oh and I'm an idiot of course for complaining about such amazing system metrics in Windows.
I love Windows and its community...8 -
I'm notoriously bad at Git. By that I mean I REALLY REALLY SUCK AT IT. And I have the curse of short memory and an even shorter ability to retain the how-to, muscle memory knowledge of things if too much time passes.
So, I was staring down the gullet of merging two separate repositories onto my local machine and then pushing the result to a remote server. Not having the benefit of someone else to bounce this off of, and always finding the usual Git docs too dense and obtuse, I turned to ChatGPT to help me sort it out.
Guys, where has this been all of my life? I know it's not perfect and it can make mistakes. I knew that going into it, so I made preparations in case this failed. BUT. IT. WORKED! I feel like it has put me into the Star Trek:TNG universe where I can say "Computer, do the thing." and it does that thing. Here's the prompt I used and which it answered perfectly.
"Play the role of a git coach. I have two git repositories. One is on Bitbucket. The other is on GitHub. The branch named "master" on Bitbucket has the latest code. The branch named "master" on GitHub needs to be updated to what's on the Bitbucket "master" branch. Please write the series of git commands that I will need to accomplish this."9 -
iOS is rotting my soul.
I've been a user of iPhone for 6 years now. For the first couple years, I wasn't really mindful of the software I used, or I guess I didn't really care. As long as it did the bare minimum, i.e. bank app, call, text, browse, watch youtube vids, I didn't really care. However, in the last couple years, I've become very interested in tech, worked on small developer projects, spent a lot of time coding in my free time, and found really inspiring software and apps on my regular computer that just blow my mind with how advanced they are, and how I, some dumb guy with internet access, can just download them on my PC and use them.
This led me into a kind of software honeymoon phase, where I created a shiny new Github account and started exploring what other cool tools are just out there, available to me for free. My software honeymoon was spent on the beaches and resorts of the open-source software ecosystem. Exploring the gem-bearing caves and beautiful forests of anything from free open-source OCR programs (I needed one to convert my dad's manuscript from scanned PDF .jpegs to actual UTF8 text) to open-source RGB lighting/keymapping software, to escape the memory-and-CPU-hungry (and most likely advertising-ID-interested) proprietary software that comes with the brand of mouse/keyboard/controller/etc.
It was like I was a kid exploring Disneyland for the first time or something. But then... then... I got off my computer. Picked up my phone to check notifications. Ew, tinder is blowing up notification center with marketing shit. I go to settings. Notification settings. Tinder's at the bottom so I just want to use a search bar instead of scrolling. There's no search bar. Minor inconvenience. Dark mode isn't dark enough for me. I guess that's just too damn bad, because for the next two hours, I'll have to figure it out by messing with accessibility settings. Time for bed, and I'm just getting plum tired of having to turn on my alarms every night for work the next morning. So I used the 'Automations' app to do it for me. For the next two weeks, at the time specified, 'There was an error running your automation', until I just deleted the automation. Browsing through the FaceID settings, I see 'Attention Aware Features'. Cool, maybe now my phone won't automatically dim the screen when I'm in the middle of reading notifications on my lock screen. Haha, nope, still does it. After turning on my alarms, I go to sleep. I wake up an hour late for work because those handy 'Attention Aware Features' silenced my alarm immediately because I fell asleep watching a youtube video.
I could go on and on. It's actually making me feel depressed typing this on my phone, fighting with Apple's primitive autocorrect and annoying implementation of Swype to type.4
Time for a rant about shitstaind, suspend/hibernate, and if there's room for it at the end probably swappiness, and Windows' way of dealing with this.
So yesterday I wanted to suspend my laptop like usual, to get those goddamn fans to shut up when I'm sleeping. Shitstaind.. pinnacle of init systems.. nope, couldn't do it. Hibernation on the other hand, no problem mate! So I hibernated the laptop and resumed it just now. I'm baffled by this.
I'll oversimplify a bit here (but feel free to comment how there's more to it regardless) but basically with suspend you keep your memory active as well as some blinkenlights, and everything else goes down. Simple enough.. except ACPI and I will not get into that here, curse those foul lands of ACPI.
With hibernation you do exactly the same, but on top of that, you also resume the system after suspending it, and freeze it. While frozen, you send all the memory contents to the designated swap file/partition. Regarding the size of the swap file, it only needs to be big enough to fit the memory that's currently in use. So in a 16GB RAM system with 8GB swap, as long as your used memory is under 8GB, no problem! It will fit. After you've moved all the memory into swap, you can shut down the entire system.
Now here's the problem with how shitstaind handled this... It's blatantly obvious that hibernation is an extension of suspend (sometimes called S3, see e.g. https://wiki.ubuntu.com/Kernel/...) and that therefore hibernation shouldn't have been possible either. The pinnacle of init systems.. can't even suspend a system, yet it can hibernate it. Shitstaind sure works in mysterious ways!
On Windows people would say it's a hardware issue though, so let's talk a bit about that clusterfuck too. And I'll even give you a life hack that saves 30GB of storage on your Windows system!
Now I use Windows 7 only, next to my Linux systems. Reason for it is it's the least fucked up version of Windows in my opinion, and while it's falling apart in terms of web browsing (not that you should on an EOL system), it's good enough for le games. With that out of the way... So when you install Windows, you'll find that out of the box it uses around 40GB of storage. Fairly substantial, and only ~12GB of it is actually system data. The other 30-ish GB are used by a hibernation file (size of your RAM, in C:\hiberfil.sys) and the page file (C:\pagefile.sys, and a little less than your total RAM.. don't ask me why). Disable both of those and on a 16GB RAM system, you'll save around 30GB storage. You can thank me later.
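If you want to try the life hack yourself, the hibernation file goes away with one command (elevated prompt; this much I'm sure of). The page file has no one-liner I'd trust from memory, so that one gets toggled under System Properties > Advanced > Performance Settings > Advanced > Virtual memory.
powercfg /hibernate off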
What I find strange though is that, aside from this obscene amount of consumed storage, the pagefile and hibernation file are handled differently. In Linux both of those are handled by the swap, and it's easy to see why. Both are enabled by the concept of virtual memory. When hibernating, the "real" memory locations are simply changed to those within swap. And what is the pagefile? Yep.. virtual memory. It's one thing to take an obscene amount of storage, but only Windows would go the extra mile and do it twice. Must be a hardware issue as well.
Oh, and swappiness. This is a concept that many Linux users seem to misunderstand. Intuitively you'd think that the swappiness determines what percentage of memory it takes for the kernel to start swapping, but this is not true. Instead, it's a ratio of sorts that the kernel uses when determining how important the memory and swap are. Each bit of memory has a chance to be put into either depending on the likelihood of it being used soon after, and with the swappiness you're tuning this likelihood to be either in favor of memory or swap. This is why a swappiness of 60 is default most of the time, because both are roughly equally important, and swap being on disk is already taken into account. When your system is swapping only and exactly the memory that's unlikely to be used again, you know you've succeeded. And even on large memory systems, having some swap is usually not a bad idea. Although I'd definitely recommend putting it on SSD in a partition, so that there's no filesystem overhead and so that it's still sufficiently fast, even when several GB of memory are being dumped in.6 -
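(And for completeness, the knob itself; a minimal sketch, assuming a sysctl-based distro that honors /etc/sysctl.d:)
# check the current ratio (default is usually 60)
sysctl vm.swappiness
# bias the kernel toward keeping pages in RAM, for this boot only
sudo sysctl vm.swappiness=10
# persist it across reboots
echo 'vm.swappiness=10' | sudo tee /etc/sysctl.d/99-swappiness.conf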
You know how you wake up from a bad dream?
I just woke up in the middle of the night, with no memory of any dream, but rather of two people talking to each other on Discord.
All I can remember was:
A: (garbled) you know when you ALT+CTRL+SHIFT+G?
B: (interrupts the other) 1,2,3... yeah when you want to move the windows to the other screen?
Both started to laugh.
I fully woke up, got a glass of water and went back to sleep. I’ve never, ever used that shortcut in any program.4 -
You know shit is going to hit the fan when the sentence "C++ is the same as Java" gets said, because fuck all the underlying parts of software, it's all the fucking same. Oh, and to write a newline in bash we don't use \n or so, we just put an empty echo in there. And fuck this #!/bin/bash line, I'm a teacher, I don't need to know how shit works to teach shit. Let's teach 'em that you need stdio for printf, even though it compiles fine without it on Linux (wtf moment number one; asking 'em leaves you with "dunno.."), and as someone who knows C you look at your terminal questioning everything you ever learned in your whole life. And then we let you look into the binaries with ldd and all the good stuff, but we won't explain why you can see a size difference in the compiled files even though you included stdio in the second one, and all symbol tables show the exact same thing. But dude, chill, we don't know what's going on either.
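For anyone who hasn't tried the stdio thing: here's a minimal sketch of what GCC on Linux has traditionally done with it (newer GCC versions may refuse outright instead of just warning).
/* no #include <stdio.h> anywhere */
int main(void)
{
    /* gcc warns about an implicit declaration of printf,
       but it compiles, and the call resolves from libc at link time */
    printf("hello, teacher\n");
    return 0;
}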
Oh and btw don't use different directory names as we do in our examples. You won't find your own path, there is no tab key you can press to auto-fill shit.
But that's not everything. How about we fill a whole semester with "this is how to printf", but make you write a whole game with Unity and C#? (Not taught even the slightest bit until then, btw.)
Now that you've half-assed everything, because we put you in a group full of fucks who don't even know what a compiler is but want to tell you that you don't know shit, and show you their non-working, unfinished algorithms in some not-even-syntax-correct Java...
...how about we finally go on with Algebra II: complex numbers, how they are going to fuck up your life, how we can do roots of negative numbers all of the sudden and let you do some probability shit no one ever fucking needs. BUT WHY DON'T YOU KNOW EVERYTHING ALREADY HMMMMM, IT'S YOUR SECOND LESSON, YOU WENT TO SCHOOL PLS BE A MATH PRO ASAP CUS YOU NEED IT SO MUCH BUT YOU DON'T NEED TO KNOW PROPER SYNTAX, HOW MEMORY MANAGEMENT WORKS, WHAT A REFERENCE IS AND PLS FINALLY FORGET THE WORD "ALLOCATION" IT DOESN'T PLAY A SINGLE ROLE YOU ARE STUDYING SOFTWARE DEVELOPMENT WHY ARE YOU SO BAD AT ECONOMICS IT MAKES NO SENSE I MEAN YOU HAD A WHOLE SEMESTER OF HOW TO GREET SOMEONE IN ENGLISH, MATHS > ECONOMICS > ENGLISH > FUCKING SHIT > CODING SKILL THATS HOW THE PRIORITIES WORK FOR US WHY DON'T YOU GET IT IT MAKES SO MUCH SENSE BRAH4 -
(Not a rant)
Mantra for a good life: stage the good bits, .gitignore the bad ones. Commit and push to Memory!
Peace ☮️2 -
I've always had this mentality that I shouldn't rely on a certain library or framework for my entire project, because what if one day they stop supporting it? (Yeah, I'm talking to you, Vuetify.) That's why I came up with this code structure where, for everything I want to do, I have a 'driver' library, coded entirely by myself, that interacts with that third-party framework or library. So if they stop supporting it, I can just change a couple of lines of code in my driver file and my codebase should work again. But I feel like this 'driver' approach is not the most efficient way to go in terms of memory usage. Do you guys think I should keep it simple and use those libraries directly, or is this actually not a bad approach?6
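For what it's worth, what's being described is basically an adapter layer, and it costs almost nothing at runtime. A minimal sketch in TypeScript, where 'fancy-toast' and its toast() call are placeholders for whatever real dependency sits behind the driver:
// drivers/notify.ts : the ONLY file allowed to import the third-party lib
import { toast } from 'fancy-toast'

export type Kind = 'info' | 'error'

export function notify(message: string, kind: Kind = 'info'): void {
  // if 'fancy-toast' ever dies, this one function body is all that changes
  toast(message, { type: kind })
}

// elsewhere: import { notify } from './drivers/notify'
// app code never mentions 'fancy-toast' directly
The extra function call is negligible memory- and speed-wise; the real trade-off is the effort of maintaining the wrapper API.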
-
Okay, this is 3:30 AM. Just woke up from bad geeky dreams. My heart is pounding so fast I could get a nosebleed, and I can't sleep because I remember having the same dream last night.
The dream: me being an astronaut. Everything was usual, from the rocket launch to being in space. The scary part was my ship in orbit of the moon.
Seeing that dead land from that height choked me. Imagine you are looking out of the window and all you see is a big grey land against a pitch-black background. Realising there is no one out there was spooky.
The scary part was that I launched some satellite, but it crashed on the surface. It was scary watching something get smaller and smaller. Crashing on deserted land was one more thing adding to the fear.
Then my ship left orbit (from the reverse shock of that satellite detachment) and floated away into the vastness of space......
Away from the moon and away from the earth, into long loneliness.
I wish I could erase this from my memory, but I am not going to watch space exploration videos anymore.
I've got to say, landing on the moon is one thing, but being out there knowing one accident means you'll be there forever... You need balls to be on such missions.4
2nd part to https://devrant.com/rants/1986137/...
The story goes on...
After I found more bugs that seemed related to the communication break and took a closer look, I sent detailed logs of my research, and today we had a conference call.
"We have 2.5 million users, our system is widely used and there is no plan to change it," they said.
And "We cannot reproduce the issue, but even if there is one, you will have to work around the problem, because we cannot make changes on our side" was one answer,
as well as "If we made changes, we would have to re-certify everything."
So I said we told 'em about the issue to let them improve their system. And I can work around it, I already figured out a solution for my side, but if there is a bug, they'd better fix it for future releases.
And with my additional research, I have a bad vibe that some kind of memory leak is involved in their "certified" implementation, and that could trigger various other problems.
But it is as always: if I try to be nice, I just get kicked in the ass. I should really be more of an asshole.
I need guidance about my current situation.
I am a perfectionist who believes in OOP, preventing memory leaks in advance, following clean code and best practices, and constantly learning about new libraries to reduce custom implementation & improve efficiency.
So even a single bad variable name can trigger my nerves.
I am currently working at a half-billion-$ IT service company, on a maintenance project for an 8-year-old Android app, the security-domain product of one of the top enterprise companies in the world, which has sold it to many of the world's leading companies in the government service, banking and insurance sectors.
Its code quality is so bad that I get panic attacks & nightmares daily.
The issues are like:
- No APK obfuscation; the source is an open book. Anybody can just unzip the APK & open it in Android Studio to see the source.
- logs everywhere announcing which method was invoked,
- static IV & salt for encryption,
- thousands of lines of code in God classes,
- method names irrelevant to their functionality,
- even a list with a single item takes 2-3 seconds to load,
- lag in navigation between different features' screens,
- for even a single thing like different dimension values per density, whole separate 100+ line layout files are written for 6 types of densities,
- no modularized packages; every class is in a single package & there are around 100+ classes.
The owner of the code, my team lead, is too terrified to change even a single thing, as he has no coding maturity & no understanding of memory leaks, clean code or OOP; in short, the typical IT 'service' company mentality.
The client is ill-informed or cost-cutting-centric, so no code review has been done by them in 8 years.
I feel very frustrated, as I can see it's like a bomb waiting to blast anytime some blackhat cracker takes advantage of this.
Need suggestions on how to tackle the situation.10
A long time ago you sent me an email with the subject 'I love you'. I got so excited that I forwarded the letter to all my contacts, and they forwarded it too.. I can't find the words for the feelings I had for you back then. I fell in love with you, really. But there were always troubling moments for me.
For example when 'Code Red' showed up and found your backdoor. Man, I was pissed at the time. I didn't know what to do next. But things settled, and we found each other again.
And then that other time, when this girl named 'Melissa' was sending me passwords to pr0n sites, I couldn't resist. She was really awesome, but you know, deep in my heart that was not what I wanted. I somehow managed to go back to you and say sorry. We even moved together into our first flat, and later into our own house. That was a really good time; I love to think back on those moments.
Then my friend 'Sasser' came over to us one night. Do you remember how he claimed that big shelf in our living room and overflowed it with his own stuff, so that we no longer had a clue what we had yet to read off the shelf? Wow, that was a disturbing experience.
But a really hard time came when our dog 'Zeus' got kicked by this ugly trojan horse. I really don't want to go into details about how the mess looked after we discovered him on the floor. Still, I am very sorry that he didn't survive it :(
Some months later, this guy named 'Conficker' showed up one day. I shat my pants when I discovered that he had guessed the password on my computer and got access to all my private stuff on it. He even tried to find network shares of ours with our photos on them. God, I was happy that he didn't get access to the pics we stored there. Never thought that our homemade photos weren't secure there.
We lived our lives together; we were happy. Until that day when you started the war. 'Stuxnet..!' you cried directly into my face, 'you are gonna blow up the centrifuges of our life', and yeah, she was right. I was in a really bad mood back then. I didn't even try to hide my anger. But really, I don't know how all this could happen. All I know is that it started with that cool USB stick I found on the stairs of our house. After that I don't remember anything, as it has just been erased from my memory.
The years passed. And I tell the truth here: we were not able to manage the mess of our relationship. But I still loved you when you confided in me that you would leave. My 'Heartbleed' started immediately; you stabbed it where it causes the most pain, where I thought the keys to your heart were secured. But no, you stabbed even harder.
Because not long after, you even encrypted our private photos on our NAS, and now I am really finished. No memory that can be refreshed with a look at our pictures, and you even want my money. I really 'WannaCry' now...
!rant
Digging through my old emails, I found this joke sent to me a long time ago. I think it was originally posted in a 1997 issue of Computerworld. Maybe you have already suffered the effects of the "opcodes" listed here. Hope that !tl;dr
ARG Agree to Run Garbage
BDM Branch and Destroy Memory
CMN Convert to Mayan Numerals
DDS Damage Disk and Stop
EMR Emit Microwave Radiation
ETO Emulate Toaster Oven
FSE Fake Serious Error
GSI Garble Subsequent Instructions
GQS Go Quarter Speed
HEM Hide Evidence of Malfunction
IDD Inhale Dust and Die
IKI Ignore Keyboard Input
IMU Irradiate and Mutate User
JPF Jam Paper Feed
JUM Jeer at User's Mistake
KFP Kindle Fire in Printer
LNM Launch Nuclear Missiles
MAW Make Aggravating Whine
NNI Neglect Next Instruction
OBU Overheat and Burn if Unattended
PNG Pass Noxious Gas
QWF Quit Working Forever
QVC Question Valid Command
RWD Read Wrong Device
SCE Simulate Correct Execution
SDJ Send Data to Japan
TTC Tangle Tape and Crash
UBC Use Bad Chip
VDP Violate Design Parameters
VMB Verify and Make Bad
WAF Warn After Fact
XID eXchange Instruction with Data
YII Yield to Irresistible Impulse
ZAM Zero All Memory -
Was reading something about delusional disorder, and it got a bit scary cuz it made me question myself. Now I'll tell you why.
I have a bad memory when it comes to trivial stuff. And I am, by occupation and therefore on a daily basis, creative and imaginative. Having a pretty strong imagination means I often have to ask myself "did that really happen or did I imagine it?" And, given anxiety, I imagine all types of scenarios before they happen. (Parallel universes got nothing on me 😎)
So now I'm wondering: where is the line between imagination and delusion, and how can you tell what's real and what's not, be it offline (distorted memory) or online (schizophrenia)?
One idea is that video recording could help confirm things, but we read emotions and vibes in real time, and often those can't be recorded.
... Idk. Maybe I'm overthinking it. ¯\_(ツ)_/¯
Thank you for reading my half-baked thoughts!6 -
So I'm on my morning stroll. Walking, enjoying, watching the world around me.. It's nice how cherries blossom. They smell very tempting to stop there and enjoy the moment. Some flowers under the cherry...
Why do plants blossom again? Oh yeah, that's right: to exchange some specimens in order to grow fruit and seeds. To have their offspring. Just like every other living macroorganism [with a few exceptions ofc]. Life has no other way to survive but to exchange genetic material between two parties, and only then trigger the growth of new life.
And that is a very strict rule. No more, no less: it takes exactly 2 organisms to make new life. But why is that? If my memory serves, the theory of evolution says that life is like business: cut the losses and let the profits run. Over time it discards everything the organism doesn't require, in order to save energy, and only successful new "investments" remain in the genome. The unsuccessful ones die before they proliferate, so the bad genes shall not survive.
It also says that very simple things, very simple changes, lead to very complex outcomes. Us. Life.
But what is simple about life needing 2 other lives? Exactly 2. It's either simple or efficient, depending on perspective. BUT IT IS NOT BOTH. Look at cells. They just split in half and multiply. Dead simple. It takes one of them to make another one. But with mammals, birds, reptiles, plants and other macroorganisms [except fungi] this is not the case! Why?!? I can't think of any scenario where two generic microorganisms, following some dead simple mutations, would come up with something that inefficient and overly complex. Like they're living on their own, multiplying by division, and then something very simple happens and they can no longer divide, only mate in pairs. The primitive, efficient and simple mechanism gets terminated and replaced with a different, incredibly complex one!
Sure, we have protozoa with similar reproductive mechanisms. They exchange genetic material to multiply.
But look at our human cells. They don't need that! Look at some reptiles, some plants where it only takes one to make another. They don't pair either! It's simple. Efficient. Why do protozoa need 2 for the species to survive?
It's not simple and efficient [tho it helps us adapt, but that's not my point for now]. See, things like this make me wonder. What if we, the life, are not as accidental as we think? What if this whole mechanism was set off by someone or something billions of years ago? That would surely mean there are much older, cognitively far superior organisms than us. What if protozoa were version 3 of new life [the first two did not survive]? Viruses v4? Sea creatures v5, reptiles v6, and so on, until they came up with us, mammals? That would surely mean we are not alone in this universe. Are they watching us? Will they create a new species any time soon? What's our purpose? Are we just an experiment?
And so, from cherry blossoms to existensial dilemma, my stroll is over. Time for breakfast :)1 -
I know this topic is tired, and this isn't supposed to be a pure "REEEE SPACES BAD" kinda rant, but I still don't understand why people would ever use spaces over tabs for indentation. I'm genuinely curious, so please give me your arguments in favor of spaces, because I just don't understand.
So here's my position:
Tabs are objectively better than spaces in every single way
(I know that IDEs also do some of these for spaces, more on that later)
1. They are typed with one key press
2. They can be removed with one keypress
3. They allow for individually configurable width (some people prefer 2 and some 4 width)
4. They take up less memory (kinda irrelevant, but still)
5. You can properly navigate your code using the arrow keys which is much faster than using the mouse while typing
6. You don't have problems with accidentally having one too many or one too few
7. You don't have problems when copy-pasting or moving code around (e.g. refactoring)
8. Code is much easier to select with the mouse, and
9. it's much easier to click the right spot with the mouse where you want to continue typing, which is often at the start of a line
Apart from specific alignment, where spaces are fine (but which also almost never comes up), I just can't see a single thing that spaces are better at. So much so that most IDEs have to *pretend* that spaces are tabs when typing and removing them. It's so ironic, yet people still defend it and big companies still use them.
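(On point 3, by the way: this is exactly what an EditorConfig file makes official. A minimal sketch; the repo declares tabs once, and every dev's editor renders them at whatever width that dev prefers.)
# .editorconfig at the repo root
root = true

[*]
indent_style = tab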
I feel like I'm going mad 😨56 -
Java rant...
If public field declaration is so, so bad, why can't all class fields in Java just be private by default? And why not make it a pain in the ass to expose one as public?
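(For reference, the ceremony we all type on autopilot; a throwaway Java sketch, nothing more:)
public class Account {
    private long balance; // what every field ends up as anyway

    public long getBalance() {
        return balance;
    }
}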
I can't actually remember a time I declared a class field as public; it is muscle memory now... private bla Bla Bla, return, private bla Bla, return...4
So I wasted the last 24 hours trying to satisfy my ego over a shitty interview and revisiting my old job's codebase, and realising that I still don't like that shit. I am just 25 and have no clue where I am heading. I am restless; most of my decisions in 2023 have had very bad outcomes and I am just doing things to feel hopeful.
Context for the interview story -----
My previous job was at a b2b marketing company whose SDK was used by various startups to send notifications to their users, track analytics, etc. I understood most of it and don't consider it any major engineering marvel, but that interviewer was very interested in asking me to design a system around it.
In my 1.2 years there, I found the codebase to be extremely and unnecessarily verbose (Java 7), with questionable fallbacks and resistance to change from the managers. They were always like "we can't change it, otherwise a lot of our clients won't use our SDK". I still wrote a lot of test cases and tried to understand the workings of the major features.
BTW, before you guys declare me an embarrassment of an engineer who doesn't know the product's codebase, let me tell you that we are talking SDKs (plural) in a service-based company here. There was just one SDK with interesting, heavy-lifting stuff and 9 more SDKs which were mostly wrappers and less advanced libraries. I got tasks in all of them, and 70% of my time went into maintaining those and debugging client-side bugs instead of exploring the "already-stable-don't-change" codebase.
So, based on my vague understanding and my even vaguer memory from 1 year ago, I tried to explain an overall architecture to that interviewer guy. His face was screaming the word "pathetic". So I thought that today I would try to decode the codebase in 12-15 hours, publish a cool article, and be proud of how much I know about so-called martech system design. Their codebase is open source, so it wasn't difficult to check it out once more.
But boy oh boy, I got so bored. Unnecessary classes, unnecessary callbacks, static calls, oof. I tried to refactor a few classes, but even after removing 70% of the codebase, I was still left with 100+ classes, most of them 3,000-4,000 lines long. And this is your plain old Java library adding just 800kb to your project.
Boring, boring stuff. I would probably need 2-3 more days to get an understanding of the complete project, though by then I would again be questioning my life choices: was this a good use of my 36 hours?
What IS a correct use of my time? I am currently super dissatisfied with my job, so I want to switch. I have been here for 6 months, so I probably wouldn't leave unless I got insane money or an irresistible company offer. For this I had devised a 2-part plan: either become good at the modern hot buzz stuff in my domain (the stuff currently popularized by dev influenzas), or become good at dsa/leetcode/cp. I suck badly at ds/algo stuff, nor am I much motivated, so I went with the hot buzz stuff.
But then this interview expected me to be a mature dev with system design knowledge... agh, fuck. It's festive season and I can't buy any cool shirts, since I am so limited by my money from my mediocre salary and loans. And mom wants to buy a home too... yeah, kill me3
I struggled with whether to post this, but I feel like I have to. I didn't want to feed into the fear or give 'them' any more reason to argue against common sense, but I guess it can't be helped.
The reason I was gone for a while was because I went and got my vaccination.
Less than half an hour after getting the vaccine, I was in the ICU. The staff told me I had a stroke, possibly from clotting and inflammation. I couldn't feel my arm or anything below my shoulders. Yes, really.
Apparently I "died" for a little while, and when they brought me back I was in a coma for almost a week.
I'm back home now and I still don't fully understand what happened. I still have numbness, horrible headaches, and can barely think straight sometimes, but the doctors told me I didn't suffer any permanent brain damage, according to my scans.
They also told me I had old damage to my left and right temporal lobes, which makes sense because I have always had problems with short-term memory and other issues.
And I'm just at a loss as to how this could happen. I have no serious injuries. We were told this was safe.
And this is the exact reason I didn't want to post it, because now tards will come in and be all "lololol serves you right vaxxer!"
If I had known the side effects were this bad, maybe I would have changed my mind, but no one told me! I mean, I think I still would have got it, because we have to protect vulnerable people, but still.
The hospital assured me it wasn't the vaccine and must have been an underlying condition, but I'm not so sure. I just happen to have a pre-existing problem I don't know about, one that causes a stroke and paralysis only half an hour after the shot?
And now I don't know if I'll ever be ok. The doctors warned me I may suffer more strokes and to avoid physically demanding tasks for a while. My primary job is construction (not by choice). Now I face the prospect of not even being able to work my existing job or do the things I love, like hiking, anymore. So much of the world doesn't make any sense right now and I just don't know what to believe anymore.
Tards will probably be in shortly to suggest I check for microchips or test fucking magnets on myself.
No, just stop.8 -
Well... a few minutes ago I tried to make a Discord script which changes my status from idle to dnd to online and so on, in an infinite loop. All good, until I checked Task Manager and saw how much memory it was using. My bad, I guess, in the way I wrote everything.
-
Compilers should just work for raw C with only static memory allocation. This isn't the bad old days where a couple of dudes wrote a short book explaining how C might probably should possibly work. I hear supposedly we have standards now.
Well, last week I lost 2 days to our compiler randomly forgetting that it wasn't okay to put a globally allocated uint32 at an address ending in 9. What? It had been handling this case without issue for more than a year, but now, after changing completely unrelated code, we have this problem.
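(For anyone who hasn't hit this: a 4-byte load from an address ending in 9 is exactly the kind of thing that faults on stricter targets. A hypothetical illustration in C; the names and the packed struct are made up, not our actual code.)
#include <stdint.h>

/* packing shows how a uint32 can end up at an odd offset like that */
struct __attribute__((packed)) layout {
    uint8_t  pad[9];
    uint32_t counter;   /* offset 9, not a multiple of 4 */
};

/* one possible defense for a single global: demand alignment explicitly (C11) */
_Alignas(4) uint32_t safe_counter;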
I'm not sure how to even deal with this idiocy so no doubt I'll continue working on it this week, too.
Thanks a lot, GCC.1 -
Started learning Android development in Kotlin.
My first impressions:
- Kotlin is good, but the class syntax is not very appealing (see the sketch after this list)
- Overall it seems to be quite easy, at least the basic stuff
- Android Studio is a fucking memory hog, RIP my RAM
- Somehow, as good as IntelliJ is, Android Studio is equally bad
- Emulator seems to be really advanced which I like8 -
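About that class syntax, a minimal sketch of what I mean (names made up): the primary constructor, the properties and their defaults all get crammed into the header line.
// the primary constructor doubles as the property declarations
class User(val name: String, var age: Int = 0) {
    fun greet() = "hi, $name"
}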
Lately programs have been crashing a lot on my PC. I've tried different things, like disabling swap for a bit, BIOS changes, removing Firefox and using Google Chrome, and trying different commands. It kept happening.
Obviously, along the way I started investigating what was causing these crashes, looking through bug reports and my syslog. There was no consistency, except for one thing: SIGSEGV. Everything that crashed had a segmentation fault. Now, I'm not an expert and I don't know what this means or how to fix it, so I went to Google for answers.
Then I downloaded memtest and ran a memory test: error palooza. Then I went to Windows and ran its memory check: error palooza.
This is week 3 of this high-end gaming PC, which was a huge investment, AND IT HAS BEEN FUCKING WITH ME BECAUSE OF BAD MEMORY. HOW THE FUCK DOES THIS HAPPEN? I ALMOST STARTED TO DOUBT UBUNTU, BUT IT WAS A FUCKING FAULT IN BRAND NEW MEMORY MODULES. WHAT THE FUCK.
Obviously I'm pissed off. Today I'm gonna call the store that assembled it to voice my complaints.
Thank you for listening to my TedTalk.13 -
How do you disconnect from work after working hours? I've been working for the last 4 months as a mid-level dev at this company. I mean, I am able to problem-solve and do my work, but sometimes I get so addicted to problem solving that I become worried, obsessed, hyperfixated (especially if I'm stuck on something for, let's say, a couple of weeks). It gets to the point where I work from home 12-14 hours a day just to figure out some bug in the flow.
Thing is, our codebase is large, and every new refactor or feature springs some surprises. I don't have a decent mentor who could teach me one-on-one or even do pair programming with me. All I have is some colleagues who can point me in the right direction or do a code review from time to time. That's it.
I don't know why I take this so personally. For example, I had to do a feature, which I did in 1 week; then the MR got approved by devs and QA. After that, during regression, they found like 3 blockers, and I felt really bad and ashamed. While in reality our BA did not define the feature properly, the devs who reviewed it didn't even launch the code and poke around in the app, and our team's QA tested only the happy scenario. Basically this failed/got delayed because of failures across a chain of 6-7 people.
However, for some reason I am taking it very personally that I, as a dev, failed. Maybe due to my ADHD or something, but for the next days or weeks, as long as I don't find the solution, I will isolate myself and try hard until I get it right. Then I have a few days of chill until I face another obstacle in another task. And this keeps repeating and repeating.
My senior colleague tells me to chill and not let work take such a toll on my emotional/physical/mental health. But it's hard. He has 7 years of experience and a decent memory. I have 2-3 years of experience and ADHD; we are not the same. I don't know how to become a guy who clocks out after 8 hours of work done every day. It's like I feel they might fire me, or I will look bad, if I don't put in enough effort. Not that I was ever fired for performance issues... Anyway, I don't know how to start working to live, instead of living for work.
I hate who I'm becoming. I don't work out anymore, I started smoking a lot, I don't exercise. I live this self-induced, anxiety-driven workaholic lifestyle.6
Alright, listen. If you come up with a crackme that requires someone to wait for something to happen for, say, 250 hours in real time... but it runs on a Gameboy or whatever other retro console? You're gonna have a bad time.
I'm on a Ryzen 5 2600, and with the most accurate Gameboy emulators out there barely running in Wine, I can hit 1000x normal speed if I unlock the emulator's framerate. That 250 hours just became like 45 minutes without having to actually *do* anything. This even applies to "lol reverse this seeded generation thing" if I can try a few million combinations/sec just by incrementing some var in memory and re-running your code. (Yes, I'm literally doing that now. Yes, I'm blowing through this 28-bit keyspace like it's nothing. YES, THEY GAVE ME THE LAST NYBBLE FOR FREE!)3
Recoding malloc is a mess. It seems like a very good exercise, and it is. But you know, there are so many mystical issues that happen when you're working with memory.
I just figured out that 90% of my free() calls came from the "ls" command, and they were in reality on a bad pointer, according to my own verifications.
So I don't know why, because I just do a normal verification... unless "ls" was developed with someone's ass...
Before I started working, I used to feel like I depended on documentation and the internet a little too much, owing to ultra-crappy long-term memory. After spending some time at my internship going through code written by "professional developers" several years my senior and trying to write unit tests for it (surprise: the code was in production without having undergone any sort of testing), I feel like the amount of time I spend online reading usage recommendations, alternatives for optimisation, and best practices for writing clean and descriptive code is a lot more rewarding. Some bad things help you feel good about yourself.
-
Trying out Gnome again, because KDE is "just ok", and Hyprland and DWM are fine, but I wanted to try something different. (Actually DWM is amazing, and Hyprland is sorta weird?)
You know, it's not that bad. It doesn't even seem to be as memory-crazy as everyone says... idk what I did, but it appears to be using around a GB, maybe a little less. Definitely not the experience I remember from the Gnome 2 days. Anyway, I was curious, so I was looking at the source on GitHub.... and why the fuck is there JavaScript in this DE code? WHY. I do not understand.
Maybe I'm fucking nuts, but I actually kind of like the workflow, once I've applied a couple of "tweaks". But seriously, I am fucking gobsmacked at the JS thing. Why.9 -
Not a data loss exactly but a loss indeed.
It was my first week at my first junior developer job, I was just learning git and completely messed it all up. I lost around 3 hours of work.
I didn't want to ask anybody for help (because of that useless-junior feeling, you know...) and wasn't as good at using Google as I am now.
So I re-did all the work. Thankfully, I have a decent memory.
If there's something to learn here, it's to ask for help when you've used all your resources and still think you need it. Nobody is going to have a bad opinion of you ;)
Ok, so I just changed my keyboard layout to Neo2, because QWERTZ can suck my balls. Looking quite good so far. I've been writing some smaller texts, and it looks like you can get used to it quite fast (I also changed because I wanted to learn to type with 10 fingers anyway. Not that I was typing slowly before, but why not).
The bad thing: all shortcuts (vim etc.) feel strange, because I have to betray my muscle memory now. So I thought I might just switch to emacs as well. I'd have to learn it from the beginning, but it might be worth it.
Do any of you have experience with Neo (the German layout), and what editors did you use?5
I currently have a very funny project lead, who gives on-the-spot estimates for a 9-year-old Android app of very pathetic code quality in the security domain. Memory leaks, bad practices, typos, CVEs, etc.: you name it, we have it in the app's source.
For the last 5-6 sprints of our project, almost 50% of user stories were left incomplete due to underestimation.
Basically, everyone in management was asleep about code quality for the last 7-8 years, & now, suddenly, when a new dev & QA team is here, they want us to fix everything ASAP.
The most humorous thing is that the product owner is aware of the importance of unit test cases, but doesn't want to allocate user stories for them at sprint planning, as the code is, according to him, almost frozen for the current release.
Actually, he has done the same thing every sprint since the last release; around 18 months have passed and he still hasn't spared a single day for unit testing.
Recently an app crash issue was found in a version-upgrade scenario, because the QAs are so tired of manually testing hundreds of basic, trivial test cases (plus server-side testing) that they can't do the actual needful testing, which is also tougher for the devs to automate.
Recently, when the team's old MacBook Pros expired, higher management allocated Intel Mac minis, saying that a few people in the organization were misusing MacBooks. So because of just a few people, everyone has to suffer now, as there is no flexibility for frequently switching between WFH & WFO. One of those Mac minis overheated & has been in repair for 6 months.
Of 4 devs & 3 QAs, all 3 QAs & 2 devs have gradually left.
I think it's time to say goodbye 😔3 -
Must've been when I coded something for the core module of a game... into and with the test interface.
I was reminded of that by my colleague, who initially made it and spent a huge amount of time on it, more than anyone else on the project. I felt a bit powerless while trying to assist with that, but I also felt bad about that error of mine.
...
That or that time when I set my whole system to protected and read-only during a system programming exercise because it ran out of memory real fast. -
I actually don't understand why most people like saying bad things about electron-js being a memory hog. I am not denying the fact that it sucks up system resources, but placing all the blame on electron-js is irrational, because most apps built on top of it do not hog memory (VS Code is living testimony to that). When you use bloated frameworks and/or libraries, you are bound to have memory issues. And when you don't understand how to manage memory effectively (even in a higher-level language, you still have to do something for your values to be garbage collected), you are bound to be held captive in the chains of memory consumption.
Don't hate electron8 -
Garbage collection incentivizes shit and cuckold programmers. Change my mind.
The reason is basically that it makes it easy to design a bad architecture; potential bugs are just delayed and waiting to happen later. There are also resources, like databases, whose management is more or less like memory, and which you never learn to do properly because of GC15
It is quite disappointing when some developers rely only on libraries/dependencies (or whatever you call them) rather than doing it manually. I know it can make the work faster, but using too many libraries will make it worse. It's not bad to use libraries, but using too many does degrade the performance of the app (you take up memory for a whole library when you only need one certain action from it), and when a library becomes deprecated with no updates, that might cause a problem.
It’s not bad using libraries, but not too much.2 -
During my small tenure as the lead mobile developer for a logistics company, I had to manage my stacks between native Android applications in Java and native apps on iOS.
Back then, Swift was barely coming into version 3, and as such the transition was not trustworthy enough for me to discard Obj-C. So I went with Obj-C and kept my knowledge of Swift in the back. It was not difficult, since I had always liked Obj-C for some reason. The language was what made me click with pointers and understand them well enough to feel more comfortable with C, Obj-C being a strict superset of it. It was enjoyable really, and making apps for iOS made me appreciate the ecosystem that much better and realize the level of dedication the engineering team at Apple put into their compilation toolchain. It was my first exposure to ARC (Automatic Reference Counting) as a "form" of garbage collection, per se. The tooling in particular was nice; normally with Xcode you have a 50/50 chance of it being great or shit. For me it was a mixture of both really, but the number of crashes or unexpected behaviors was FAR lower than what I had on Android, back when we still used Eclipse and even when we started to use Android Studio.
Developing iOS apps was also what made me see why iOS apps have that distinctive shine, and why those phones require less memory (RAM). It was a pleasant experience.
The whole ordeal also left me with a bad taste for Android development. Don't get me wrong, I love my Android phones. But I firmly believe that unless you pay top dollar to an Android manufacturer such as Samsung, Motorola or LG, you will have lag galore. And man..... everyone who tried to prove me wrong always had to make excuses later on (no, your $200-$300 Android device just didn't cut it, my dude).
It really sucks sometimes for Android development. I want to know what Google got so wrong that they made the decisions they made, driving people to design other tools such as React Native, Cordova, Ionic, PhoneGap, Titanium, Xamarin (which is shit imo), Codename One and many others. With iOS I never considered going for something different from native, since the API just seemed so well designed and far superior to me from an architectural point of view.
Fast forward to 2018 (almost 2019), and Google has been talking about Flutter for a while, making it seem like they are fixing how they want people to design apps.
You see, I firmly believe that tech stacks work in 2 ways:
1. People love a stack so much they start to develop cool ADDITIONS to it (see the awesomeios repo) to expand on the standard libraries
2. People start to FIX a stack because the implementation is broken, lacking in functionality, or hard to use by itself: see okhttp, legit all the Square libs, butterknife, etc etc etc and etc
From this I can conclude 2 things: people love developing for iOS because the ecosystem is nice and dev-friendly, and people like developing for Android in spite of how Google manages their API. Seriously, Android is a great OS, and having apps that work awesomely in spite of how hard it is to create applications for said platform just shows a level of love and dedication that is unmatched.
This is why I find it hard, and even mean, to call out one product over the other. Despite the morals of the 2 leading companies inferred from my post, the developers are what make the situation better or worse.
So just fuck it, and develop for and use what you want.
Honorable mention to PHP and the PHP developer community, which is a mixture of fixing and adding, in spite of the amount of hatred that such coolness gets from a lot of peeps :P
Oh, and I got a couple of mobile contracts on the way; this is why I made this post.
And I still hate developing for Android even though I love Java.3 -
I am having an introspective moment as a junior dev.
I am working at my 3rd company now and have spent the average amount of time I'd spend at a company (1-1.5 years).
I keep finding myself in similar problems and on similar trajectories:
1. The companies I worked for were startups of various scales: an edtech platform, an insurance company (branch of an MNC) and a b2b analytics company
2. These people hire developers based on domain knowledge and not innovative thinking, and expect them to build anything the PMs deem growth/engagement-worthy (for e.g., I am bad at that memory/time-optimising ds/algo programming, but I can make any kind of Android screen/component, so I and people like me get hired here)
3. These people hire new PMs based on expertise in revenue generation and, again, not on innovative thinking, coz most of the time these folks make tickets to experiment with button and text colors to increase engagement/growth
4. The system soon goes into chaos mode, since there are so many cross-operating teams, and the PMs run around trying to boss every dev, QA and designer into adding their changes to the app.
5. Meanwhile, with multiple teams working on different aspects, there is no common, up-to-date source of info on all flows, products and features. The product soon becomes a Frankenstein monster.
6. Thus these companies need more and more devs and QAs who are cogs in the system rather than innovative thinkers. The cogs in the system will simply come in, dimwittedly add whatever feature is needed and goto home.
7. The cogs in the system who also take on the pain of tracking the changes and learning about the product itself become "load-bearing cogs": i.e. devs with so much knowledge of the product that they can be helpful in every aspect of the feature lifecycle.
8. Such devs find themselves with no need to prove themselves, no need to do innovative work, and are simply promoted based on their domain knowledge and impact.
My question is simply this: are we as devs just destined to be load-bearing cogs?
We are doing the work which ideally a manager should be doing, i.e. maintaining Confluence docs with end-to-end technical as well as business-logic info for every feature/flow.
So is that the only definition of a software engineer on a technical product?
Then how come innovations happen at companies like Meta, Microsoft, Google, OpenAI etc.?
If I had to guess as a distant observer, I would say their diversity across different fields helps them mix and match stuff, which leads to innovative stuff.
For e.g., the Android OS team at Google has helped add many innovative things to the Google Cloud product, and vice versa.
Same with Azure and Windows. Windows is now optimized to run on cloud machines, when at one point it was just a horrible, memory-hogging, slow PC OS.
For small companies, one ideology/product/domain is their hero ideology/product/domain.
An insurance company experiments with stuff related to insurance, health and vehicles, and the best innovation they come up with is "let's give the user a discount on their premium if they do 5000 steps a day for a year".
Edtech would say "let's do live streaming for children, apart from static videos".
But the Android team at Google said, "since the AI team is doing so well, let's include AI in various system apps and support on-device models" ~ a much larger innovation, as 2 domains combined to make a product.
Small companies are not aiming to be an innovative product; they are just aiming to be a monopoly product. And this is kinda sad2
I think studying engineering has really fucked up the way I learn new things... I find it nearly impossible to commit anything to memory that could easily be looked up. On its own that doesn't sound so bad, but now I keep forgetting simple programming syntax and Android design patterns because my brain just keeps saying
"You don't need to remember this, you can find it online in 2 mins"
I'd rather just keep a bookmark of a great navigation drawer tutorial as opposed to learning it myself... I worry now what will happen in my technical interviews, even though I consider myself a good programmer
This friend of mine and I were usually average in college subjects. We were not really bad at them; we just never got exceptional marks.
So when our 4th-semester results came, a third friend of ours got really good marks in some subject, like in the 90s, and we again had marks around the 70s.
At that time we both knew we understood that subject way better than this topper guy in terms of knowledge, but he had just crammed everything about it word for word and got the better marks.
We thus believed that marks don't matter; it's the knowledge. And we both knew it's stupid to cram useless things which could easily be referred to in documentation or on the internet when required.
But last semester, something different happened. Looks like my boy was a little envious on the inside: he scored a whopping 88%, right up near that topper friend of ours. I was happy watching his happiness, and he was saying, "dude, this sem I will even try to beat that guy in marks."
Even though neither of them is a class topper, they are somehow running the race to become one. I, on the other hand, am still firm in my belief of not cramming stupid shit just to get the status of some 'topper'.
Even though cramming subject knowledge is not a total waste, I still believe we should only understand what we need to understand, like learning the moral of a war story, not cramming the actual war dates.
Some might find this quality of mine to be the reason I am 'average', but I feel totally fine with it. I have trained myself to look up a particular resource online faster than they can look up that resource crammed in their brain's memory, and I wonder if I should feel guilty about it. Yet society will always see me as an 'average' guy and them as 'winners'
Wow, yesterday was fun!
I had a rather buggy piece of code, it was bad when I first wrote it, and then I fixed it up, and it was still bad. Now I rewrote almost all of it, and it's much better.
Bad? How? Well, it was in Go, and it's basically an agent meant to execute tasks one at a time and report the results back home (live). Now, while it worked, it was really flimsy: race conditions, way too much blocking, bad logic, and some very bad bugs.
So I had to rewrite it. Time for a quick primer on the design: you have a queue, a task gets added to the queue, and the task manager runs the task. In the meantime, the agent is polling the host with the latest output from the task, and also receives new tasks to run (if there are any).
Seems like a job for a messaging queue, you ask? Well, that would be true if each task were able to run on any random agent, but each task is only meant to run on the agent it's assigned to (the tasks are of an administrative nature, à la apt-get), so having a whole separate service is a tad overkill.
So rewriting required rethinking how the tasks are executed by the task manager. I spent a day on this; it was fun. I ended up copying Go contexts (very simple model, very useful). Why copy and not reuse? Because this is meant to be low-memory code, so any extra parts are problematic, and I didn't really see a use for having a whole context; I just needed a way to announce that a task is done.
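(That "announce that a task is done" bit boils down to the classic closed-channel idiom. A minimal sketch with made-up names; the real thing lives in the repo linked below:)
package main

import "fmt"

// Task carries a channel that is closed exactly once on completion.
type Task struct {
    Name string
    done chan struct{}
}

func NewTask(name string) *Task {
    return &Task{Name: name, done: make(chan struct{})}
}

// Run executes the task and announces completion by closing done.
func (t *Task) Run() {
    defer close(t.done)
    fmt.Println("running:", t.Name)
}

// Done mirrors context.Context's Done(): a receive-only channel to wait on.
func (t *Task) Done() <-chan struct{} { return t.done }

func main() {
    t := NewTask("apt-get update")
    go t.Run()
    <-t.Done() // blocks until the task announces it's finished
}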
Anyways, if you're interested to see how the implementation worked out: https://github.com/chabad360/covey/...1 -
Not remembering something isn't an indicator that you are busy; it just proves you have a bad memory and poor management skills.
-
Compare and harmonize the web configs
Oh no, someone set execution timeouts to 14 days
Fuck fuck fuckity duck
Hey, compare all the web configs of all environments and harmonize them all, wtf, cmon bruh, do your job as a developer
Take them and back them up into svn. What do you mean svn isn't a backup system, of course it is, well, it's the only thing we have, fuck
What do you mean we have shit logging where people will catch an exception and only print the word "exception" in the log, you can figure it out can't you, we have live production issues that have to be solved now, what the fuck
How dare you make a mistake copying our shitload of a bloated codebase and configuring our 100s of different options all by fucking hand, what the fuck dude, do you write anything down?
Please catalogue all the exception mails we are getting, but we have no db or error reporting system, so they all just plop into the inbox and that's all your fucking data, figure it out kid
This is a rewarding, fulfilling job where you can be both devops and a developer and manage all of our fucking environments, of which there are about 15, all on your own, with no sort of tool or software to aid you, because haha, what the fuck, we wouldn't make your life easy
What's that, you want to spend time writing stuff or changing stuff that will make it easier for you? Fuck that bruh, get back to your billable tasks, like holy shit, you think this is a charity or some shit
Live production issues
Live production issues
Production issues. A ghost in the machine. Find it fix it find it fix it find it fix it, cmon, why can't you fix it, I expect you to spend your day hopelessly pretending to try to solve something, you fucker
One of the only people able to help you sometimes, though he's a bit of an old lackey, yeah, he's fucking leaving, see ya, see ya kid, and now we're not hiring anyone to fucking help you, no no no, managing and monitoring the environments is your job, all of them, every single one, do you know all the configuration values for them yet??
Instead we are hiring a new salesperson to fucking make us some more money, and we don't need another developer to help you, in fact, let's have you use this mid-end retail computer from 2014 to develop on, yeah yeah, oh, but all our shitty code and Visual Studio will destroy your memory, but too bad!! Hahahahahdhsj
Go live is all on you, why are you so slow
How long will it take
How long will it take
How long will it take
How long will it take
How long will it take, holy shit
Give a time estimate for something that I don't fucking know, how about it will take till fuck-you o'clock
I drank two pots of coffee and am now paranoid. I want to run a memory test on my new RAM, so I am going to use memtest from https://memtest.org/ . Out of paranoia, I decided to test the download with VirusTotal. It passes 68 out of 70 checks; 2 say it's bad. Windows Defender says it is fine. I usually just rely on Windows Defender. I tested with another site and it says it is clean. But is it really clean? Why do so many assholes ruin a good thing? Scammers and blackhat hackers are scum.4
-
Being too careful and always trying to reduce memory and processor usage might be a bad thing after all. Lengthening development time and inducing more stress on the developer just to reduce resource usage is not very sensible when dealing with small-to-medium-size programs that don't deal with big data/file types.
What made me notice this habit in programmers was when I was smashing my head on the keyboard, contemplating what method I should use to store the history of outputs for a fucking text-based program with minimal GUI elements..
Having OCD as a programmer is a nightmare. But thank god it's not as bad as it was a year ago. I couldn't even read something without repeating the same page over and over again, because my stupid brain decided that I was not reading it right. WHAT THE FUCK IS READING IT RIGHT? Thank god for my psychiatrist and pills. I can at least work on my projects without wanting to kill myself now! 😂1
I have seen references to API keys in several places and have set up a few for various web services. However, I don't have a firm understanding of how they are protected (or not protected) from being copied and used by apps other than my own. I read a quick blurb from Google that said to prefer regular authentication over API keys, because keys can be copied.
So my questions are: Are API keys just a bad way to subscribe to services? Is there a way to protect them from being discovered? Maybe the app logs into an auth endpoint for your services and is served the key to use with other services? But this key could still be gleaned from memory. Are API keys going to go away, maybe in favor of things like OAuth?3
Studies of memory indicate that a person recalls information better when they are in the same biological and emotional state they were in when the memory was formed, be it semantic or episodic.
Other recall cues include:
what conversations people were having,
the same people being there,
similar sensory input,
and being in the original place where the memory was formed.
But some information is so jarring that, when the brain is kept in a consistently aggravated state of emotional unease and vigilance, it can be repressed, along with (from what I note) connected semantic information, if, say, the information recalled was related to a nearly alien state of considerable mind- and perception-altering terror and unhappiness.
We often forget bad things, because it's the only way we heal.
The most evil people in the world found a way to recreate this trauma so they could add
parallel memories: whole tracks of human experience existing apart from each other, just in the hope of keeping that person quiet.
Ironically, for a person who is nice, witnessing the people responsible for these things get murdered or brutally deformed also tends to be buried away with the same repressed memories.7
Not sure if this is a valid cause for a rant, but my memory stick went bad after being used for just 6 months. I bought this memory kit this summer on computeruniverse. Now Windows reports that there are damaged pages on the 1st stick, though the 2nd stick is fine. Patriot Viper with small heatsinks...
What to say... In ye olde days DDR3 worked for years and never went bad 🤔1 -
I want to jump into Android app dev. My first plan is to start by building one using Flutter and the Dart language, but my workstation is slow. Android Studio is hogging my memory and really slows me down; a bad experience all around. I plan to uninstall Android Studio and use other tools. Can anybody suggest what kind of tools would suit my current setup?9
-
I just went through a super long debugging process trying to figure out what was going on with my ZFS volumes. It turned out I had bad memory:
https://battlepenguin.com/tech/...