Search - "best dba"
In a user-interface design meeting over a regulatory compliance implementation:
User: “We’ll need to input a city.”
Dev: “Should we validate that city against the state, zip code, and country?”
User: “You are going to make me enter all that data? Ugh…then make it a drop-down. I select the city and the state, zip code auto-fill. I don’t want to make a mistake typing any of that data in.”
Me: “I don’t think a drop-down of every city in the US is feasible.”
Manager: “Why? There cannot be that many. Drop-down is fine. What about the button? We have a few icons to choose from…”
Me: “Uh…yeah…there are thousands of cities in the US. Way too much data for anyone to realistically scroll through.”
Dev: “They won’t have to scroll, I’ll filter the list when they start typing.”
Me: “That’s not really the issue and if they are typing the city anyway, just let them type it in.”
User: “What if I mistype Ch1cago? We could inadvertently be out of compliance. The system should never open the company up for federal lawsuits”
Me: “If we’re hiring individuals responsible for legal compliance who can’t spell Chicago, we should be sued by the federal government. We should validate the data the best we can, but it is ultimately your department’s responsibility for data accuracy.”
Manager: “Now now…it’s all our responsibility. What is wrong with a few thousand item drop-down?”
Me: “Um, memory, network bandwidth, database storage, who maintains this list of cities? A lot of time and resources could be saved by simply paying attention.”
Manager: “Memory? Well, memory is cheap. If the workstation needs more memory, we’ll add more”
Dev: “Creating a drop-down is easy and selecting thousands of rows from the database should be fast enough. If the selection is slow, I’ll put it in a thread.”
DBA: “Table won’t be that big and won’t take up much disk space. We’ll need to set up stored procedures, and data import jobs from somewhere to maintain the data. New cities, name changes, etc.”
Manager: “And if the network starts becoming too slow, we’ll have the Networking dept. open up the valves.”
Me: “Am I the only one seeing all the moving parts we’re introducing just to keep someone from misspelling ‘Chicago’? I’ll admit I’m wrong or maybe I’m not looking at the problem correctly. The point of redesigning the compliance system is to make it simpler, not more complex.”
Manager: “I’m missing the point to why we’re still talking about this. Decision has been made. Drop-down of all cities in the US. Moving on to the button’s icon ..”
Me: “Where is the list of cities going to come from?”
<few seconds of silence>
Dev: “Post office I guess.”
Me: “You guess?…OK…Who is going to manage this list of cities? The manager responsible for regulations?”
User: “Thousands of cities? Oh no…no one in our area has time for that. The system should do it.”
Me: “OK, the system. That falls on the DBA. Are you going to be responsible for keeping the data accurate? What is going to audit the cities to make sure the names are spelled correctly and associated with the correct state?”
DBA: “Uh..I don’t know…um…I can set up a job to run every night”
Me: “A job to do what? Validate the data against what?”
Manager: “Do you have a point? No one said it would be easy and all of those details can be answered later.”
Me: “Almost done, and this should be easy. How many cities do we currently have to maintain compliance?”
User: “Maybe 4 or 5. Not many. Regulations are mostly on a state level.”
Me: “When was the last time we created a new city compliance?”
User: “Maybe, 8 years ago. It was before I started.”
Me: “So we’re creating all this complexity for data that, realistically, probably won’t ever change?”
User: “Oh crap, you’re right. What the hell was I thinking…Scratch the drop-down idea. I doubt we’ll have a new city regulation anytime soon, and how hard is it to type in a city?”
Manager: “OK, are we done wasting everyone’s time on this? No drop-down of cities...next …Let’s get back to the button’s icon …”
Simplicity 1, complexity 0.
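For the curious, the “simple” path really is that small. A rough, hypothetical sketch (city names, states and the fuzzy-match cutoff are all invented for illustration) of free-text input that only nudges the user when the text looks like a mangled compliance city:

```python
import difflib

# Rough, hypothetical sketch: the handful of cities that actually carry
# city-level compliance rules (invented values; the user said "maybe 4 or 5").
COMPLIANCE_CITIES = {"Chicago": "IL", "New York": "NY", "Los Angeles": "CA", "Houston": "TX"}

def check_city(city: str) -> str:
    """Accept free-text input; only nudge when it looks like a mangled compliance city."""
    name = city.strip().lower()
    known = {c.lower(): c for c in COMPLIANCE_CITIES}
    if name in known:
        proper = known[name]
        return f"'{city}' has city-level rules (expected state: {COMPLIANCE_CITIES[proper]})."
    close = difflib.get_close_matches(name, list(known), n=1, cutoff=0.75)
    if close:
        return f"'{city}': did you mean '{known[close[0]]}'? That city has compliance rules."
    return f"'{city}': no city-level regulation; state-level rules apply."

print(check_city("Ch1cago"))  # flagged as a likely misspelling of Chicago
print(check_city("Boise"))    # accepted as typed; only state rules apply
```

No nationwide table, no nightly import jobs, no stored procedures to babysit, and a typo like “Ch1cago” still gets caught.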
--- GitHub 24-hour outage post mortem ---
As many of you will remember, Github fell over earlier this month and cracked its head on the counter top on the way down. For more or less a full 24 hours the repo-wrangling behemoth had inconsistent data being presented to users, slow response times and failing requests during common user actions such as reporting issues and questioning your career choice in code reviews.
It's been revealed in a post-mortem of the incident (link at the end of the article) that DB replication was the root cause of the chaos after a failing 100G network link was being replaced during routine maintenance. I don't pretend to be a rockstar-ninja-wizard DBA, but after speaking with colleagues who went a shade whiter when the term "replication" was used - it's hard to predict where a design decision will bite back and leave you untangling the web of lies and misinformation reported by the databases for weeks if not months after everything's gone a tad sideways.
When the link was yanked out of the east coast DC undergoing maintenance - Github's "Orchestrator" software did exactly what it was meant to do; it hit the "ohshi" button and failed over to another DC that wasn't reporting any issues. The hitch in the master plan was that when connectivity came back up at the east coast DC, Orchestrator was unable to (un)fail-over back to it, due to each cluster containing data the other didn't have.
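To picture why fail-back was a non-starter, here's a rough, made-up illustration (not Orchestrator's actual logic; in MySQL land this comparison happens over GTID sets) of the split-brain situation:

```python
# Hypothetical illustration only (not Orchestrator's real logic; MySQL tracks
# this with GTID sets): why neither cluster could simply rejoin the other.
east_coast = {"txn-1001", "txn-1002", "txn-1003"}                 # committed just before the link died
west_coast = {"txn-1001", "txn-1002", "txn-2001", "txn-2002"}     # accepted after the fail-over

only_east = east_coast - west_coast
only_west = west_coast - east_coast

if only_east and only_west:
    print("Split brain: both clusters hold unique writes, automatic fail-back refused.")
    print("East-only:", only_east, "| West-only:", only_west)
elif only_east or only_west:
    print("One cluster is strictly behind; replication can catch it up safely.")
else:
    print("Clusters are identical; failing over in either direction is safe.")
```

Once both sides hold writes the other has never seen, there's no direction to replicate in without throwing data away - hence the long, manual untangling.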
At this point it's reasonable to assume that pants were turning funny colours - monitoring systems across the board started squealing, firing off messages to engineers demanding they rouse from the land of nod and snap back to a reality that was a bit more "on-fire" than usual. A quick call to Orchestrator's API returned a result set that only contained database servers from the west coast - none of the east coast servers had responded.
Come 11pm UTC (about 10 minutes after the initial pant re-colouring) engineers realised they were well and truly backed into a corner; the site was flipped into "Yellow" status and internal mechanisms for deployments were locked out. 5 minutes later an Incident Co-ordinator was dragged from their lair by the status change and almost immediately flipped the site into "Red" status, a move I can only hope was accompanied by all the lights going red and klaxons sounding.
Even more engineers were roused from their slumber to help with the recovery effort. By this point hair was turning grey in real time - the fail-over DB cluster had been processing user data for nearly 40 minutes, and every second that passed made the inevitable untangling process exponentially more difficult. Not long after this Github made the call to pause webhooks and Github Pages builds in an attempt to prevent further data loss, causing disruption to those of us using Github as a way of kicking off our deployment processes (myself included, I had to SSH in and run a git pull myself like some kind of savage).
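For anyone wondering what that "savage" manual pull normally replaces: a deploy hook is often nothing fancier than a tiny listener that runs git pull when GitHub pings it. A minimal, hypothetical sketch (the endpoint, port and repo path are invented, and a real one should verify the X-Hub-Signature-256 header before touching anything):

```python
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

REPO_DIR = "/srv/my-app"  # hypothetical checkout that the app runs from

class DeployHook(BaseHTTPRequestHandler):
    def do_POST(self):
        # Consume GitHub's JSON payload; a real hook must verify X-Hub-Signature-256 first.
        self.rfile.read(int(self.headers.get("Content-Length", 0)))
        if self.path == "/deploy":
            # Pull the latest code whenever GitHub reports a push.
            subprocess.run(["git", "-C", REPO_DIR, "pull", "--ff-only"], check=False)
            self.send_response(204)
        else:
            self.send_response(404)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), DeployHook).serve_forever()
```

Point a repository push webhook at it and the pull happens on its own - until webhooks are paused, at which point you're back to SSHing in like a savage.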
Glossing over several more "And then things were still broken" sections of the post mortem, clever engineers with their heads screwed on the right way successfully executed what I can only imagine was a large, complex and risky plan to untangle the mess and restore functionality. Github was picked up off the kitchen floor and promptly placed in a comfy chair with a sweet tea to recover. The enormous backlog of webhooks and Pages builds was caught up with and everything was more or less back to normal.
It goes to show that even the best laid plan rarely survives first contact with the enemy - in this case a failing 100G network link somewhere inside an east coast data center.
Link to the post mortem: https://blog.github.com/2018-10-30-...
!rant
We just did a massive update to our prod DB environment that would implicate damn near all systems on our servers....on a Friday.
Luckily for us, our DBA is a badass rockstar mfking hero who was planning this shit for a little over a year with the assistance of yours truly as backup following the man's lead...and even then I didn't do SHIT
My boy did great, tested everything and the switch was effortless, fast (considering that it went on during working hours) and painless.
I salute my mfking dude, if i make my own company I am stealing this mfker. Homie speaks in SQL, homie was prolly there when SQL was invented and was already speaking in sql before shit was even set in spec, homie can take a glance at a huge db and already cast his opinion before looking at the design and architecture, homie was Data Science before data science was a thing.
Homie is my man crush in the number one spot, putting mfking Henry Cavill in second place.
Homie wakes up and pisses greatness.
Homie is the man. Hope yall have the same mfking homie as I do
Yesterday I spent some time on the meta site for dba.stackexchange.com and found this one guy with 1 rep raging about how his questions aren't getting answered and how his answers are the best etc...
"I have 17 years of experience as a dba, blah, blah, blah, my answers are correct, blah, blah"
He got pretty destroyed by the mods and other users about how shit his answers were and how they weren't factually correct etc...
This just continues to show that no matter how much experience you have you won't always be right.
Same goes for my senior at work, he has 10 years more experience than me (I have 2) and he still asks for my point of view and help without being a dick about it.
I hope we'll all keep being nice people unlike that Stackexchange guy...
A couple of months back we were discussing sh with a third party vendor for a very large ass fuck system that another department uses. I had been called into the meeting because the entire I.T department counts on me to at least act as an assessor to the many issues that other departments might have.
The department I was working with manages the databases that our institution uses, and in this particular case the DBA (my best friend, mind you) was part of the meeting.
Mind you, the issues that the third party vendor was having were all fixed by our DBA, and he had documented and mentioned these items to me as I provided assistance to him through the 3 weeks prior to this meeting. One such case was that we needed a transitional as well as intermediary system for some processes to happen from one DB to the other, and a lot of other technical babble. Well, the DBA used to be an excellent (fuck you) VB developer who recently re-learned the language into .NET. He had shown me many of his old programs and even by the limitations of the language they were elegant and fascinating. They really are, and ya'll devrant fam know that I ain't one to hate on tech at all.
When the DBA explained how he went around some of the issues by generating programs that could assist him, he mentioned the tech stack; I had coached him into knowing that being descriptive about the tools he used would be beneficial to everyone else. When he mentioned VB.NET the vendor snickered and my boy got quiet.
Then I broke the silence, fuck you: "What was that?" and the dude said "nothing, sorry"
So I said "no no, I want to know. I am not going past this point until you, the dude getting paid over $100 an hour for something YOU couldn't fix, explain to me the little hehe moment you had"
The mfker went silent, then explained how he was aware that people were moving past VB.NET and shit like that. Me: "imagine that, someone used a tech stack that your ignorance thought obsolete to fix something you could not solve, even though we are paying you for it. Were it me, or in my hands (and mind you I have direct access to the VP, so this foolishness might change), I would have cut you and your little sect loose months ago. I have no patience or appreciation for leeches like you or the rest of the "professionals" that work for your company or other similar entities, much less, as you can see, my patience runs even thinner when you people snicker at the solutions that our staff has to take when you all slack"
The entire meeting was uncomfortable as high heaven.
Fuck you, if someone I know manages to run shit on fucking Liberty BASIC then so fucking be it. I will slap you 10 fucking times over, and then fuck your girl, if you try to put someone else down for the tech stack they use.
I hate neckbeards, BUT I hate fake ass neckbeards even more
*Colin Farrell in True Detective mode: FUCK....YOU
!rant
MASSIVE UPGRADE ROUND 2:
We took it in steps: the DBA did his portion and I did mine. We had waited for the entire thing to be finalized today on Sunday, since our users are probably jerking off to their waifus (as they should), and today was my part. MA BOI the DBA was with me the entire time, and the whole process took us about 4 hours of both of us getting multiple heart attacks here and there and praying to the elder gods of Asgard for their divine protection as we ventured into the calamity of fire and jotun ass mfkers that are our fucking servers for this particular process.
Man I really hope for the pandemic to be over so I can take my dude out for a nice beer, some wings and some relaxation time.
Best DB/Dev team I have ever been with.
Sysadmin's nemesis: a DBA. Especially an Oracle DBA. There's no other kind of tech worker I've seen who's more opposed to best practices.
How about for devs?
FUCK YES
The feeling when you and the DBA, through your own teamwork, completely fix an issue that has been fucking up your users and that the third party vendors themselves couldn't fix is so..... fucking... addicting.
Wrote an email to the HoD to let us off a bit late tomorrow morning, least I can do for this fucking server admin, SQL class A mastermind, Oracle fucking super pro.
I really pray for all of you mfkers to get the same type of coworker. This dude has taught me a lot and I really jump at the first opportunity I get to work with him. His accomplishments for the institution are many really, it's just one of those happy bromances man.
I raise my beer mug to the best fucking DBA I have ever worked with.
For my next trick, I am going to make sure the dude gets the position of manager of his department as soon as the current dude retires (should be soon). A great man himself, but short on giving his DBA the praise he deserves.
The previous manager of my department told me "pay attention to <DBA NAME>, he is your secret weapon and you will be his" and by heavens, sweet momma was right.