Search - "pretty metrics"
Today was a day at work where I felt like I made a significant contribution. It was not a lot of code. Actually, it was a difference of three characters.
I am developing an industrial server so that my employer can give enterprise industrial systems access to their machines. You know, the big boys' toys. Probably in fucking java...
Anyway, I am putting this server on an embedded system. So naturally you want to see how much serving a server can serve. In this case the device is more processor-starved than memory-starved. So I bumped the sampling interval from 1000 ms down to 100 ms per sample. This caused the processor usage to jump from 8% of one core (as read from top) to 70%. Okay, 10x the sampling, roughly 10x the CPU usage. That is good. Now I have some basic metrics for a given amount of data at a couple of different sampling rates.
Now, I realized this really was not that much activity for this processor. I mean, it didn't seem to me that it should "take much" to see a large increase in processor usage. So I started wondering about another process on the system that was eating 60 to 70% all the time. I knew it updated a screen that showed some rarely needed data, in addition to controlling things. Most of the time the device will sit in a cabinet, hidden from the world. I started looking at this code and figured out where the display code was being called.
This is where it gets interesting. I didn't write this code. Another really good programmer I work with wrote it, and it seemed to be a pretty standard approach: a timer fired an event every 50 ms. That is 20 times per second, so 20 fps if you will. I thought: what would happen if I changed this to 250 ms? So I did. It dropped the processor usage to 15%! WTF?! I showed another programmer: WTF?! I showed the guy who wrote it: WTF?! I asked what it does. He said all it does is update the display. He said: let's take it to 1000 ms! I was hesitant, but okay. It dropped to 5%!
What is funny is that several people had all said: this thing is running kinda hot. It really shouldn't be this hot.
Don't assume. If you have a hunch, play with it, if it's safe to do so. You might just shave 55 to 60% CPU usage off your system.
So the code I ended up changing: "50" to "1000".
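For the curious, the shape of the offending code was roughly this. A minimal sketch in Python (the real code wasn't Python, and redraw_display is a hypothetical stand-in for the actual screen update):

```python
import threading

REDRAW_INTERVAL_S = 1.0  # was 0.05 (50 ms) before the change

def redraw_display() -> None:
    # stand-in for the real screen update, which cost far more CPU
    # per frame than anyone assumed
    pass

def on_timer() -> None:
    redraw_display()
    # re-arm the timer, mimicking the original "fire an event every N ms" setup
    threading.Timer(REDRAW_INTERVAL_S, on_timer).start()

on_timer()
```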
The GitHub GraphQL API is pretty neat, mostly because it's a great example of a product where GraphQL has real advantages over REST. As a code reviewer for repos with hundreds of simultaneous PRs, I use it to filter through branches for the stuff that needs my attention most.
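My filter script is roughly along these lines. A sketch, not my actual tooling - the username, token placeholder, and field selection are illustrative, but the search query syntax is the same one GitHub's own search box uses:

```python
import requests

# "my-username" is a hypothetical reviewer name; replace the token
# placeholder with a real personal access token.
QUERY = """
query {
  search(query: "is:pr is:open review-requested:my-username", type: ISSUE, first: 50) {
    nodes {
      ... on PullRequest {
        title
        url
        updatedAt
        baseRefName
      }
    }
  }
}
"""

resp = requests.post(
    "https://api.github.com/graphql",
    json={"query": QUERY},
    headers={"Authorization": "Bearer <personal-access-token>"},
    timeout=30,
)
resp.raise_for_status()

# Oldest-updated first, so stale PRs bubble up to the top of the list.
prs = resp.json()["data"]["search"]["nodes"]
for pr in sorted(prs, key=lambda p: p["updatedAt"]):
    print(pr["updatedAt"], pr["baseRefName"], pr["title"], pr["url"])
```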
NewRelic's NRQL API is also quite nice, as it provides an unusual but very direct interface into the underlying application metrics.
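Pulling a metric out of it can be as small as this sketch. I'm assuming the classic Insights query endpoint with a query key here - the account ID and the NRQL itself are made up, and newer accounts would go through NerdGraph instead:

```python
import requests

ACCOUNT_ID = "1234567"  # hypothetical
NRQL = "SELECT average(duration) FROM Transaction FACET name SINCE 1 hour ago"

resp = requests.get(
    f"https://insights-api.newrelic.com/v1/accounts/{ACCOUNT_ID}/query",
    params={"nrql": NRQL},
    headers={"X-Query-Key": "<query-key>"},
    timeout=30,
)
resp.raise_for_status()

# The response mirrors the query: one average(duration) per transaction name.
print(resp.json())
```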
I'm also a big fan of launchlibrary, purely because I love spaceflight, and their API is an extremely rich and actively maintained resource. That makes it a great data source for playing around with plotting & statistics libraries - when I'm learning new languages or tools, I prefer to make something "real" rather than following a tutorial, and I often use launchlibrary as a fun and useful data backend.
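As an example of the kind of toy I build with it - a sketch against the Launch Library 2 endpoint (URL and field names as I remember them from the public docs), counting recent launches per provider:

```python
from collections import Counter

import requests

# Launch Library 2, the current incarnation of launchlibrary.
resp = requests.get(
    "https://ll.thespacedevs.com/2.2.0/launch/previous/",
    params={"limit": 100},
    timeout=30,
)
resp.raise_for_status()
launches = resp.json()["results"]

# Tally the last 100 launches by launch service provider.
per_provider = Counter(
    launch["launch_service_provider"]["name"] for launch in launches
)
for provider, count in per_provider.most_common():
    print(f"{provider:35} {count}")
```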
When I was in college, OOP was emerging. A lot of the professors were against teaching it as the core. Some younger professors were adamant about it, and also Java fanatics. So after the bell rang, they'd sometimes teach the people who wanted to learn it. I stayed after, and the professor said that object-oriented programming treats things the way reality does.
My first thought was: hold up, modeling reality is hard and complicated. Why would you want to add that to your programming? That's utter madness.
Then he started with a ball example: how some balls in reality are blue, and how they can have a bounce action we can express with a method.
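It was the classic example. From memory it boiled down to something like this (in Python here, rather than the Java he was pushing):

```python
class Ball:
    def __init__(self, color: str, radius: float) -> None:
        self.color = color    # "some balls in reality are blue"
        self.radius = radius

    def bounce(self) -> None:
        # the "bounce action" expressed as a method
        print(f"The {self.color} ball bounces.")

Ball("blue", 0.12).bounce()
```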
My first thought was that this seemed like a very niche example. It had very little to do with any problem I had yet solved, and I felt that thinking about things this way would complicate my programs rather than make them simpler.
I looked around at the remnants of my classmates and saw several sitting forward, their eyes lit up, and I felt like I was in a cult meeting where the leader is trying to make everyone enamored of his personality. Except he wasn't selling himself, he was selling an idea.
I patiently waited it out, wanting there to be something of value in the after-the-bell lesson. Something I could use to better my own programming ability. It never came.
This same professor told us all to buy and read the Gang of Four book, saying it would change our lives. It was an expensive hardcover with a ribbon attached as a bookmark. It was made to look important. I didn't have much money in college, but I gave it a shot and bought the book. I remember wrinkling my nose often while reading it, feeling like I was still being sold something. But where was the proof? It was all an argument from authority, and I didn't think the argument was very good.
I left college thinking the whole thing was silly and would surely go away with time. And then it grew, and grew. It became impossible to avoid, so I'd use it when I had to, and that became more and more often.
I began to doubt myself. Perhaps I was wrong; surely all these people using and loving this paradigm could not be wrong. Later in my career I took on a three-year project to dive deep into OOP. I was already intimately familiar with OOP, having done so much of it, but I caught up on all the latest ideas and practiced them for the first year. I thought: if OOP is so good, I should be more productive in years two and three.
It was the most miserable I had ever been as a programmer. Everything took forever to do. There was boilerplate code everywhere. You didn't so much solve problems as stuff abstract ideas that had nothing to do with the problem everywhere, and THEN code the part that actually does the task. Even though I was working with an interpreted language, they had added the need to compile, for dependency injection. What's next, taking the benefit of dynamic typing and forcing typing onto it? Oh, I see they managed to do that too. At this point, why not just use C or C++? If you're adding compiling and typing anyway, they will do everything you wanted, way faster at run time.
I talked to the client extensively about everything. We both agreed the project was untenable. We spent another three years moving everything over. His business is now doing better than ever before by several metrics. And I can be productive again. My self-doubt was over. OOP is a complicated mess that drags down the software industry, little better than snake oil and full of empty promises. Unfortunately, it is all some people know.
Now there is a functional movement and a data-oriented movement, and things are looking a little brighter. However, no one seems to care for procedural. Functional and procedural are not that different; functional just tries to put more constraints on the developer. Data-oriented is also a lot more sensible, and again pretty close to procedural a lot of the time. It's just odd to me, this need to separate from procedural at all. Procedural was very honest: if you're a bad programmer, you make bad code; if you're a good programmer, you make good code. A lot of this seems meant to force bad programmers to make good code. I'll tell you what I think, though. That has never worked. It just hides the bad code away in some abstraction and makes identifying it harder. Much like the code methodologies themselves do to the code.
Now I'm left with a choice: keep my own business going to work on what I love, shift gears and do what I hate for more money, or pivot careers entirely. After all this, I decided to go into data science, because what you all are doing to the software industry sickens me. And that's my story. It's one that makes a lot of people defensive or even passive-aggressive. To those people I say: try more things. At least then you can be less defensive about your opinion.
I recently tried to apply the same data analytics rationale that I use at work to my personal life. This is not a rant; it is more like data storytelling about an actual use case I would like some input on.
I set a goal - gotta thin up a bit and calm down my ticker - and got an (almost unreasonably expensive) field-expert consultant to yell at me about it for a couple of hours.
I unravel the metrics - there are like a million weight-related KPIs, and most say nothing at all. I have never seen a non-infrastructure measurable subject that could not be boiled down to 2-5 performance metrics. I went with overall weight, how well my nine-year-old business suit fits me, heart rate, and day-after relative muscle pain (it will make sense soon).
Then it's data-pipeline time. I bought a cheap weight scale and a smartwatch, and every morning I input the data into an app. Yes, I try to put on the suit every morning. It still does not fit.
After establishing a baseline, I tried to fit different approaches: doing equipment-free exercises, going to the gym, dieting. None was actually feasible in the long run, but trying different approaches does highlight the impact and the handling profile of each method.
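The pipeline really is that simple. Conceptually it's a daily append and a rolling baseline, something like this toy sketch (file name and columns are made up; the real thing is an app on my phone):

```python
import csv
from datetime import date
from statistics import mean

LOG = "metrics.csv"  # columns: date, weight_kg, resting_hr, soreness_1_to_5, suit_fits

def log_today(weight_kg: float, resting_hr: int, soreness: int, suit_fits: bool) -> None:
    # one row per morning, straight after the weigh-in and the suit ritual
    with open(LOG, "a", newline="") as f:
        csv.writer(f).writerow(
            [date.today().isoformat(), weight_kg, resting_hr, soreness, int(suit_fits)]
        )

def baseline_weight(days: int = 28) -> float:
    # rolling average of the last N days; each new approach gets compared
    # against this instead of against noisy day-to-day values
    with open(LOG, newline="") as f:
        weights = [float(row[1]) for row in csv.reader(f)]
    return mean(weights[-days:])
```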
Looking at the now-gathered data, one thing was obvious - dieting is out, because it is not doable to keep one shopping list and one set of meals for me and another for the family.
The gym is also off the table - too much overhead. I spend more time on the trip there and back than actually there.
And home exercise equipment is either super crappy or very expensive. But it is also the most reasonable approach.
So it's solutions time. I got a nice exercise bicycle (not a Peloton), a yoga mat (the wife already had that one), and an exercise program that uses only those two resources. Not as efficient without dieting, not as measurable and broad as the gym, but it fits my workflow. Deploy to production!
A few months pass and the dataset grows. The signal is subtle but has support - it works! The handling, however, needs improvement, since I cannot get through the exercise program often enough. Some mornings just come after hard days.
I start thinking about what else I can improve in the program, but it is already pretty lean and full of compromises.
So I pull an engineer and start thinking about the support systems, drafting a profile: what else could be draining my willpower and morning time?
Chores. Getting the kids ready for school, firing up the moka pot, setting up the off-brand Roomba, folding the overnight-dried clothes, cooking breakfast, doing the dishes, cleaning the toilets. All part of my morning routine. It might benefit from some automation.
Last month I got that machine our elders call "wasteful" and "useless crap lazy entitled Americans invented because they feel oh-so-insulted for simply doing something by hand like everyone always did" - a "dish-washer".
Heh, I remember how hard it was to convince my mother-in-law that a remote-controlled electric garage door would not make her look like a spoiled brat.
Still too early to call, but I think the dishwasher just saved me about 25 minutes every morning. It might be enough to spare the willpower I need to do more exercise.
This is all so reflective of what data analytics cases really are like out in the wild - the analytics phase seems so small compared to all the gathering and the practical problem-solving around it. And yet d.a. is what tells you that you have been doing the wrong thing all along. Or what you should work on next.
At what point do you say a junior dev is no longer a junior? What metrics do you use? Scope of knowledge, impact on team/code decisions, years of experience, management skills, etc.?
I feel I'm qualified as a mid-level developer now despite only being a junior for a little over a year. I had tons of internships in college and was kind of placed in a role where growing fast was required.
I broke a sweat for most of that ~1 year I worked as a junior, and my contributions to my project aren't insignificant.
I don't say that to toot my own horn here; I really do want to ground myself in reality. But I don't know if my standards are too low or my organization's standards are too high. FWIW, other devs on my team have commented privately/informally that the junior title isn't super fitting.
I'm still pretty dependent on my boss, but that's more for the final say on things. He'll often have some input on my work, but I'll also be involved in design discussions and take up a large chunk of work without question. On light sprints I'm knocking out 20+ task-hours of work, going closer to 30-40 when things pick up. Not uncommon to kill 10 user stories in a sprint.
I don't know, what do you guys think?
Compromise.
I think that sums up development pretty much.
Take for example coding patterns: Most of them *could* be applied on a global scale (all products)… But that doesn't mean you *should* apply them. :-)
Find a matching **compromise** that makes specific sense for the product you develop.
Small example: SOLID and DRY are good practices. But breaking these principles by, for example, introducing redundant code can be a very wise design decision - for instance, when you know full well ahead of time that the redundancy will be needed for changes further down the road. Going full DRY only to add the redundancy back later is time better spent elsewhere.
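A contrived sketch of what I mean (hypothetical validators - today they are identical, but imagine you know the invoice rules will diverge next quarter):

```python
def validate_order_number(value: str) -> bool:
    # identical to validate_invoice_number today; kept separate on purpose
    return value.isalnum() and len(value) == 10

def validate_invoice_number(value: str) -> bool:
    # will soon grow its own prefix and checksum rules - merging the
    # two now just means un-merging them later
    return value.isalnum() and len(value) == 10
```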
The principle of compromise applies to other things, too.
Take for example architecture design.
Instead of trying to enforce your whole vision of a product, focus on key areas that you really think must be done.
Don't waste your breath on small stuff - cause then you probably lack the strength for focusing on the important things.
Compromise - choose what is *truly* important and make sure that gets integrated, versus trying to "get your will done".
Small example: it doesn't really matter whether a function is called myDingDong or myDingDongWithBells - one is longer, the other shorter. Refactoring tools make renaming a function an easy task. What matters is what the function does, and that it does it efficiently and precisely. Instead of discussing the *name* of the function, focus on what the function *does*.
If you've read this far and think this example is dumb: nope... I've seen PR reports where people struggled for hours with little shit while the elephant in the room - like an N+1 problem in a database query or some other fundamental issue - completely drowned in the small-shit discussion noise.
So far we had code design and architecture... The same goes for people, debugging, and everything else.
Just because you don't like what weird person A does doesn't mean it's shit.
Compromise. You don't have to like them. Just tolerate them. Listen. Then try to process their feedback unbiased. Simple as that. Don't make discussions personal - and don't isolate yourself by only working with specific persons. Living in such a bubble means you miss out on a lot of knowledge and insight… or in short: you suck because of your own choices. :-)
Debugging... Again, compromise: instead of wasting hours debugging a problem, ASK for help. A simple "Has anyone debugged this before, or does anyone have input on how to debug this problem efficiently?" can sometimes work wonders. Don't start debugging without looking into alternatives like telemetry, metrics, known problems, etc.
It could be a viable, better long-term solution to add metrics to a product than to debug for hours... Compromise. Find a fitting approach to analyzing a problem instead of just brute-forcing it.
....
Et cetera et cetera.
Just discovered wizzy ... Wow, freaking sweet!
https://github.com/utkarshcmu/wizzy
I like it for many reasons; I just started playing with it, so the #1 reason so far is saving dashboards and keeping them in git version control. Yay!!!
Also, if you're not familiar with Grafana, let me blow your mind: http://grafana.org
One of the things that gives me a lot of satisfaction is doing integrations. One customer has a good architecture, and they chose to put their metrics database, with billions of records, in Apache Cassandra. Which is pretty cool! A business consultancy that is helping them grow a lot has implemented PowerBI, and the dashboards are really good, but the most important reports will be based on this gigantic Cassandra database. We just delivered a quick integration using Apache Spark. Small project, fast delivery, and everyone happy. It is so good. Mission-accomplished feeling.
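The core of the integration is barely a screenful. Roughly this shape - a sketch assuming the DataStax spark-cassandra-connector is on the classpath, with made-up host, keyspace, table, and column names:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Assumes the connector is supplied at submit time, e.g.
#   --packages com.datastax.spark:spark-cassandra-connector_2.12:3.x
spark = (
    SparkSession.builder
    .appName("cassandra-to-powerbi")
    .config("spark.cassandra.connection.host", "cassandra.internal")  # hypothetical
    .getOrCreate()
)

metrics = (
    spark.read.format("org.apache.spark.sql.cassandra")
    .options(keyspace="telemetry", table="device_metrics")  # made-up names
    .load()
)

# Pre-aggregate billions of raw rows into something PowerBI can chew on.
daily = (
    metrics
    .groupBy(F.to_date("event_time").alias("day"), "device_id")
    .agg(F.avg("value").alias("avg_value"), F.count("*").alias("samples"))
)

daily.write.mode("overwrite").parquet("/exports/daily_metrics")
```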