Search - "wk258"
-
The network starts slowing down, transactions start to fail across the 450+ stores, the website starts to spit 500 errors. What is going on?
Cue frantic running around the office trying to work out what was going wrong... Calls from all 3 data centres; nothing is going in or out of the network.
Then I notice the network admin come back to his desk. His eyebrows raise and he looks left and right before unplugging his laptop's ethernet from one of the server access points.
The network rushes back to life, everything is fine.
That particular network mapping tool is now banned for use on production. -
There was a time I made an update to the sign-up page of one of our client's e-commerce websites. The update caused a bug that allowed new users to create an account without actually creating an account.
The code block meant to save user credentials (i.e. email address and password) to the database was commented out, for some reason I still can't remember to this day. After registration new users had their session created just as normal, but in reality they had no recorded account on the platform. This shit went on for a whole week, affecting over 350 new customers, before the devil sent me a DM.
I got a call from my boss that weekend saying that some users who had made purchases recently couldn't access their accounts from a different device and also couldn't update their passwords. Nobody likes duty calls on a weekend, so I grudgingly and sluggishly opened up my PC to create a quick fix, but when I saw what the problem was I shut down my PC immediately. I ran into the shower like I was being chased by a ghost, screaming "what tha fuck! what tha fuck!!" cus I knew hell was about to break loose.
At that moment everything seemed off, as if I could feel everything: the water dripping down my spine, the tiniest of sounds. I thought about the 350 new customers the client had just lost, I imagined the raving anger on the face of my boss, I thought about how dumb my colleagues would think I was for such a stupid long-running bug.
I ran through all possible solutions that could save me from this embarrassment.
-- "If this shitty client would have just allowed us verify users email before usage things wouldn't have gotten to this extent"
-- "Should I call the customers to get their email address using their provided telephone?... No they'd think I'm a scammer"
-- "Should I tell my boss the database was hacked? Pffft hack my a**",
-- "Should I create a page for the affected users to re-verify their email address and password? No, some sessions may have expired"
-- "Or maybe this the best time to quit this f*ckn job!"
... Different thoughts from all four corners of the bathroom made it a really long bath. Finally, I decided it was best I told my boss what had happened. So I fixed the code, called my boss the next day and explained the situation to him, and yes, he was furious. "What a silly mistake..!" he raged and raged. "See me in my office on Monday."
That night felt longer than usual; I couldn't sleep properly. I felt pity for the client and I blamed it all on myself... yeah, the "silly mistake". I could have been more careful.
Monday came and the boss wasn't at the office. Tuesday, Wednesday, Thursday, Friday: not available. The next week he was around, and when we finally met the discussion was about a different project. I tried briefing him about the previous week's incident, but he seemed not to recall it and insisted we focus on the current project.
However, over three hundred and fifty customers were swept under the carpet, courtesy of me. I still feel the guilt of that f*ck up to this day. -
DELETE FROM Invoices;
I am on test, right? OH SH!T
Had a recent backup tho, and no data was lost.
Happened once and never again. I was still pretty much a junior in a senior position, to be fair 😅
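The habit that keeps it at "never again": destructive statements only ever inside an explicit transaction. A minimal sketch of what that looks like (the WHERE clause and created_at column here are hypothetical, purely to illustrate):
BEGIN;
-- The delete I actually meant to run, limited by a (hypothetical) filter.
DELETE FROM Invoices WHERE created_at < '2015-01-01';
-- Sanity-check what's left before it becomes permanent.
SELECT COUNT(*) FROM Invoices;
-- Count looks right? COMMIT. On the wrong database again? ROLLBACK.
ROLLBACK;
-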
This happened when I got my first IT support job. Naturally as a 1st line support you get to do the fun and not at all tedious thing of resetting passwords.
So I take a ticket from one of our HR people where they say that 3 new employees can't access a certain system.
Without going into too much detail, I reset the passwords according to our procedures and was done with it.
But at the end of the day it turned out that one of those 3 new employees was the new CEO, and he was known to be not the most pleasant of people to work with.
So ofc there was a chain of emails with the words "How can someone not know who I am" in there somewhere.
Had a nice stressful weekend wondering if I'd still have a job after Monday, and a whole new password reset procedure was created because of that. -
I was working on a team with people with various employment statuses. Contractors, employees of the client, and me as a regular full time employee of the company that “owned” the contract. My HR manager gave us a presentation about our reporting structure. I had at least seven managers for different reasons across various projects.
I got a new position, so I needed to resign, but I had no idea which managers I should notify. I looked at the org chart the HR lady had shown us and sent my resignation to the five managers who would be affected by my leaving. Unknown to me, my project manager was actually a contracting manager hired by the client. He let his employer, the client, know that the lead dev had quit.
Apparently it destabilized the contract for my employer. If I hadn't just handed in my resignation, they would have fired me for telling a customer about a significant internal staffing change. They didn't fire me because the optics would have been worse for them. -
Have a couple I want to air today.
First was at my first gig as a dev, 4-5 months out of school. I was the only dev at a startup where the owner was a computer-illiterate psychopath with serious temper tantrums. We're talking slamming doors, shouting at you while you are on the phone with customers, the works...
Anyways, what happened was that we needed to do an update in our database to correct some data on a few order lines regarding a specific product. Guess who forgot the fucking WHERE clause... Did I mention this boss was a cheap-ass, penny-wise, dollar-stupid asshole who refused to have anything but the cheapest hosting? No backups, no test/dev/staging environment, no local copies... Yeah, live devving in prod, fucking all customers with a missing semicolon (or WHERE clause).
Amazingly, his sheer incompetence saved my ass, because even if I explained it, he didn't get it, and just wanted it fixed as best we could.
The second time was at a different company where we were delivering managed network services for a few municipalities. I was working netops at that time, mostly Cisco branded stuff, from Voice-over-IP and wifi to switches and some routing.
One day I was rolling out a new wireless network and had to add the VLAN to the core switch on the correct port. VLANs, for those who don't know, are virtual networks you can use to run several separate networks over the same cable.
To add a VLAN to the allowed list on a Cisco trunk port, one uses the command:
switchport trunk allowed vlan add XYZ
My mistake was omitting the 'add', i.e. typing just 'switchport trunk allowed vlan XYZ', which Cisco switches happily accept without warning. That command, however, can be quite disruptive, as it replaces the entire list of existing VLANs on the port with just the new one.
Not a big deal on a distribution switch supplying an office floor or something, but on a fucking core switch in the datacenter this meant 20K users had no internet, no access to the applications in the DC, no access to Active Directory, etc. Oh, and my remote access to that switch also went down the drain...
Luckily a colleague of mine was on site with a console cable and access to config backups. Shit was over within 15 minutes. My boss at that time was thankfully a pragmatic guy who just responded "Well, at least you won't make that mistake again" when we debriefed him after the dust settled. -
I updated a date field on a table with about 4 million records in it.
Only meant to update about 1000 of them.
This field is used to generate QBRs. So the executive level would notice first.
But then.... They didn't. Nobody noticed. Not for almost a year.
Then someone asked about it and I told them what happened, and they never brought it up again. -
Not myself but a friend of mine. Early 2000s, working at a large university. Top-notch office PCs for the time, best internet connection in the country.
He discovers this "BitTorrent" program. Meh, just another file-sharing thing... but who cares, it's 2003-ish, so everyone downloads shit from the internet.
Installs it on his office PC, because it's university so no one cares.
Friday afternoon, he starts the download of his favourite music album (some hard-to-get live version or something), then goes off into the weekend; the computer is left running as always.
Download is finished after an hour or so, then his BitTorrent client starts seeding. Lots of people want this album. BitTorrent adapts to bandwidth, and when your connection is good you become a preferred peer in the network and everyone connects to you.
Monday comes, my friend arrives back at his desk, a bit late because he slept in, and it's university so no one cares.
Suddenly realises many missed calls on his desk phone. Calls back, it's from the IT department.
Friend: "You have called me? What can I do for you?"
IT Guy (screaming): "WHAT THE HELL ARE YOU DOING??? YOUR PC IS CAUSING 50% OF THE UNIVERSITY'S INTERNET TRAFFIC!!!!"
Friend: "Whops."
IT Guy (hysterical): "WHATEVER YOU ARE RUNNING STOP IT NOW!!!!"
Friend: *stops BitTorrent client, enjoys his favourite album*
Lucky him, it's a university, so in the end no one cared. -
Running a wild UPDATE statement against an inventory database, which was chaotic and didn't have backups... yea I know 😅 The backup service had died god knows when and no one noticed. I was only new at the time, so #notMyFault.
UPDATE stock_on_hand = 0; WHERE id IN(1,2,3,4,5,88,972,7388);
# rows affected 1,234,567,890
> I think I almost died inside.
Oh the fun that mess was to clean up.
The positive outcome of this was that we had backups working again not long after, and the inventory counts were accurate after that stock take.
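Note to past me: dry-run the WHERE clause as a SELECT before letting the UPDATE loose. A minimal sketch (the inventory table name is hypothetical; the column and ids are the ones from the statement above):
-- Dry-run the filter first and eyeball the count.
SELECT COUNT(*) FROM inventory WHERE id IN (1, 2, 3, 4, 5, 88, 972, 7388);
-- Expecting 8 rows. If it comes back looking like 1,234,567,890, step away from the keyboard.
UPDATE inventory SET stock_on_hand = 0 WHERE id IN (1, 2, 3, 4, 5, 88, 972, 7388);
-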
Forgot to do server side verification.
As the service (an injectable game) was expanding, and the old system relied on server-side calculation without anything being returned from the user, the expansion was done a little too fast.
The result could have been anyone passing wrong data and receiving the grand prize, like a holiday worth $10k. Quick fix... -
A couple of years ago I was working on a fairly large system with a complex (by necessity) access control architecture.
As is usually the case with those projects, it's awkward for developers to repro bugs that have to do with a user's accesses in production when we are not allowed to replicate production data in test, let alone locally.
We had a bug where I ended up making myself a new row in the production database, for a thing I could have access to, so I could repro it safely without affecting real data. I identified the bug so I could repro it in dev/test, removed the row, and made sure everything still worked normally. Whew, scary.
Have you ever walked into the office one day, and everyone is hunched over in a semicircle around one person's workstation, before one turns around to look at you and says - after a pause - "... ltlian?.."
Turns out I had basically "poisoned the well" with my dummy entity, in a way where production now threw 500s for everyone BUT me, the one person with transitive access to this post-non-entity. Due to the scope of the system, it had taken about a day for this to gradually propagate through caching and eventual consistency; new entities coming in were expected, but entities disappearing were not.
Luckily I had a decent enough track record for this to be treated as a one-off. I sometimes think about how I would explain testing in prod and making it faceplant before going home for the day, other than "I assumed it would be fine". I would fire me. -
Accidentally dropped a column of a table without realising that the table was in production... luckily I had a backup.
-
This was more than 15 years ago. We migrated a bunch of data (home folders) to a new server and repurposed the old one, the same night. This was not the first task of that all-weekender, so it was around 3am on Sunday, with very little sleep, when I had to copy the data. I did that by logging in as admin and copying with Total Commander. Obviously, even admin did not have permissions to some folders, so a lot of financial data was lost, as the users found out on Monday morning. We had no backup. The old server was not only reformatted, but the disks were used to build a different RAID set. Luckily, one of the users who had access to this data kept a backup on a flash drive. (If you're wondering, I should've used robocopy in backup mode.)
-
Not a job but an internship: the guy I didn't like gave me the electrical circuit task, with the acid bath and shit. I had to create 2 copies from the original template. I realized one copy had one of its connecting plates removed by mistake. I duplicated my mistake on the second copy as well.