Search - "wk237"
-
Still trying to get good.
The requirements are forever shifting, and so do the applied paradigms.
I think the first layer is learning about each paradigm.
You learn 5-10 languages/technologies and get a feeling for procedural/functional/OOP programming. You mess around with some electronics engineering, write a bit of assembly. You write an ugly GTK program, an Android todo app, check how OpenGL works. You learn about relational models, graph databases, time-series storage and key-value caches. You learn about networking and protocols. You void the warranty of every device in your house at some point. You develop preferences for languages and systems. For certain periods of time, you even become an insufferable fanboy who claims that all databases should be replaced by MongoDB, or that all applications should be written in C# -- no exceptions are possible in your mind, because you've found the Perfect Thing. Temporarily.
Eventually, you get to the second layer: Instead of being a champion for a single cause, you start to see patterns of applicability.
You might have grown to prefer serverless microservice architectures driven by pub/sub event buses, but realize that some MVC framework is probably more suitable for a 5-employee company. You realize that development is not just about picking the best language and best architecture -- it's about pros and cons in every situation. You start to value consistency over hard rules. You realize that even respected books about computer science can sometimes contain lies -- or present solutions which are only applicable to "spherical cows in a vacuum".
Then you get to the third layer, which is about orchestrating migrations between paradigms without creating a bigger mess.
Your company started with a tiny MVC webshop written in PHP. There are now 300 employees and a few million lines of code, the framework gets in the way more often than it helps, and the database is terribly strained. Big rewrite? Gradual refactor? Introduce new languages within the company, or stick with what people know? Educate people about paradigms which might be more suitable but will feel unfamiliar? What leads to a better product: someone who is experienced with PHP, or someone just learning TypeScript?
All that theoretical knowledge about superior paradigms won't help you now -- no clean slates! You have to build a skyscraper city to replace a swamp village while keeping the economy running, together with builders who have no clue what concrete even looks like. You might think "I'll throw my superior engineering at this, no harm done if it doesn't stick", but 9 out of 10 times that will just end in a mix of concrete rubble, corpses and mud.
I think I'm somewhere between 2 and 3.
I think I have most of the important knowledge about a wide array of languages, technologies and architectures.
I think I know how to come to a conclusion about what to use in which scenario -- most of the time.
But dealing with a giant legacy mess, transforming things into something better, without creating an ugly amalgamation of old and new systems blended together into an even bigger abomination? Nah, I don't think I'm fully there yet. -
When I learnt programming, sugar was still made out of salt and hence not used in coffee.
Also, we didn't have source-level debuggers, only the "print" method. However, compiling was slow as well, so it was faster and more convenient to go through the program and execute the statements in one's head. That helped with understanding what code does just by reading it. It also kept people from trial-and-error programming -- something some people fall into when they resort to single-step debugging to understand what their own code is even doing.
Compiling was slow because computers in general were slow, like single-digit MHz. That enforced writing efficient code. It's also why we learnt about big-O notation already at school. Starting with manual resource management helped to get a feeling for what's going on under the hood.
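For illustration only (not part of the original rant; the function names and sizes are made up), a rough sketch of the kind of big-O difference that mattered on such machines -- two ways to check a list for duplicates:
```python
# Two hypothetical ways to check a list for duplicates -- same result,
# very different cost once inputs grow.

def has_duplicates_quadratic(items):
    """O(n^2): compares every pair; fine for tiny inputs, painful for large ones."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    """O(n): trades a bit of memory (a set) for a single pass."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

if __name__ == "__main__":
    data = list(range(5_000)) + [4_999]       # one duplicate at the very end
    print(has_duplicates_quadratic(data))     # True, but noticeably slower
    print(has_duplicates_linear(data))        # True, near-instant
```
-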
Started writing an operating system several years ago. Taught me just about everything I need to know about computers.
Oh, and make NES games. That teaches you a lot about how we arrived at where we are today. -
I progressively became more right over the course of 30 years. At the point where I was contextually right more often than not, I determined that to be "good." Then I kept getting better, just in case.
-
I didn't... Some of y'all might argue I'm not even a proper dev.. and I'd agree..
I'm fixing bugz & implementing a thing or two.. but all within how the project was already done.. give me a blank project and I'd probably spend days reading up on how to do it properly and still couldn't decide what sounds good to me. -
Easy.
I just worked a shitty manual labor job from 5am to 4pm, Monday to Friday, while going to night school. I told myself that if I didn't succeed in programming I would be stuck at that dead-end job, which would eventually lead to my own suicide. I kind of put myself in a position where getting good at coding was my only way out of a shitty/brutal lifestyle. It worked, as I now work from home and make twice as much money. It's a funny thing to think about sometimes: two years ago I had to have knee surgery due to the physical strain of my former job, and nowadays I sometimes get a neck cramp from not sitting up straight.
Moral of the story: sometimes growth can only happen when we put ourselves in uncomfortable situations. -
Honestly? I was always good at maths and creativity. And so, programming was natural to me. I was always good at it with minimum effort. ¯\_(ツ)_/¯
... Algorithms were a whole other story tho. I'm still not confident 'bout any algos I program from scratch. But hey, if it works, it works. (that became my motto about algos, kinda)
Forgot one thing tho: looking at code relevant to whatever I'm doing, be it in a tutorial or on Stack Overflow. I don't need the text or tutorial or explanation, I need to see code examples. -
Patience. It's the one virtue that, I think, if anyone has it, they can make it in this field. And you can't fake patience; it comes from within.
You need patience because coding can be extremely frustrating. And without patience, you might not be able to sit in one place for 12 hours a day trying to find a solution for a single task. -
At age 6 I was deemed an idiot savant. Coding is boring for me now. Age 7-10: I worked for an underground agency that was focused on harvesting people's organ data from MRI machines to predict the economic future. 10-14: I experimented with smoking crack to increase finger efficiency. Since then I've quit, and I've been living in Miami trying to create a lofi industrial folk album using nothing but a TI-84, some wire, and an old fender amp.
-
I didn’t. I suck at it. That’s why I ended up being a manager. Not that I didn’t try, but according to a scientific test I took in college to figure out why I kept failing my math classes, I’m screwed in the math and logic department. I sure know how to read and write, though, so I guess I have that going for me.
-
YEARS of practice. I had my ups and downs. I taught myself, dropped it early on, came back to it half a year later, and have kept at it since. Figured out that web development is not the hell I wanted, and quickly fell in love with iOS development in Swift. Been riding on the wind ever since, learning something new every single day.
Today I built, in less than an hour, something that some time ago took me about 3 weeks. If that’s not an improvement, I don’t know what is.
Practice makes perfect, don’t forget that. Although it sounds ridiculously cheesy and shit, this is how it goes.
I’m getting drafted tomorrow. Well, it’s not exactly a full-on draft where I join the IDF (Israeli Defense Force) right NAO; it’s what we call a rough draft: I’m having a psychotechnical examination so the military can figure out how much I need to go to a cybersecurity unit instead of going to Gaza LMAO. -
Don't know.
All I know is that I suck at calculus but I write good code.
Weird.
If I had to decide: early on, I wrote random stuff 'til it worked.
Then I attempted to comprehend what I just did and started reading books.
That's probably when I first understood what I was doing. -
I became good enough to be hired as a developer by reinventing lots of wheels and making mistakes. A lot of mistakes.
-
Phonies: "By doing X for Y number of years and reading things such as blah blah blah"
Kings/Queens, silverback devs, rockstar engineers: "I never got good." -
Personal projects, I think, are 50% of the battle, and projects you are required to complete are the other 50%.
Personal projects encourage you to try new and hard things without too much fear of failure.
Required projects make you learn something and complete it.
Both are absolutely essential to craft a well-rounded dev. -
Programming is a passion I’ve had since I was a kid and saw my brother’s books on BASIC and Pascal. YouTube didn’t exist back then... Stack Overflow didn’t exist, and Yahoo was my search engine after having to listen to the dial-up sounds. Once I found the right tools to learn on my own, after my first hello world program, I didn’t stop.
The fact that I’m still making time to write even a few lines of code every day, go through courses and dive into documentation makes me hope that one day I’ll be good enough! -
I picked what I’m good at -- solving puzzles -- so after that it’s simple.
The 10-years-of-experience rule, or 10,000 hours of writing code. -
I haven't. Yet.
I started taking programming seriously when I got to 9th grade. 3 years isn't enough time. Probably enough to be able to put out okay-ish code in a scripting language, but not good code. -
First: I have to give credit to my high school CS teacher. She gave us a good grounding in computer theory: pointers, memory organization, and algorithms.
Second: I just read the fucking manual. Then I programmed a LOT more than people who didn't get good. Hundreds of hours during college, thousands since then. I got style information from reading other people's code and also learned how not to code by reading other people's code. Ever buy a book that proclaims to teach you X, but actually teaches you a proprietary wrapper they wrote for X that has a shitty license? Fuck those people. Anyway, when internet sharing became more of a thing I started watching videos by experts and reading articles. And now I learn from people here as well. Never stop learning and always RTFM. -
Practice? Also, "good" is relative; I don't consider myself "good", but it depends on what/whom you compare it to.
Weird question overall. -
[wk237 - how you know you got good at programming]
idk, i dont think im good, ive got to a point where i can just eyeball those stupid interview questions, which makes me happy, but thats just basic logic -
Good code is a lie imho.
When you see a project as code, there are 3 variables in most cases:
- time
- people / human resources
- rules
Every variable plays a certain role in how the code (project) evolves.
Time comes in two different forms: when certain parts of the code are changed either at a very high frequency or at a very low frequency, it's a bad omen.
Too high: somehow this area never seems to settle. Be it features, regressions or bugs -- in larger codebases it usually takes 3-4 weeks until all code paths have been triggered.
Too low: it can be a good sign, but it should stay on the radar imho. Code that never changes should be reviewed in an audit -- at most yearly, depending on the size of the codebase. Git / VCS is very helpful here.
Why? Mostly because the chances are very high that the code was once written for a completely different requirement set. Hence the audit: check whether this code is still doing the right job, or whether you have a ticking time bomb that needs to be defused.
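As a hedged sketch of how such an audit could find its candidates (not part of the original rant; the helper name and the one-year cutoff are made up), one could ask Git for tracked files whose last commit is older than the cutoff:
```python
# Hypothetical helper for the "stale code" audit described above: list tracked
# files whose last commit is older than a cutoff. Assumes it runs from the
# root of a Git working copy.

import subprocess
import time

def stale_files(max_age_days=365):
    cutoff = time.time() - max_age_days * 86400
    tracked = subprocess.run(
        ["git", "ls-files"], capture_output=True, text=True, check=True
    ).stdout.splitlines()
    for path in tracked:
        last_commit = subprocess.run(
            ["git", "log", "-1", "--format=%ct", "--", path],
            capture_output=True, text=True, check=True
        ).stdout.strip()
        if last_commit and int(last_commit) < cutoff:
            yield path  # audit candidate: is it still doing the right job?

if __name__ == "__main__":
    for path in stale_files():
        print(path)
```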
People
If a project has only one person working on it, it most certainly isn't verified by another person. Meaning that only one person worked on it -- I'd say that's pretty bad, as no discussion / review / verification was done. The author did the best he / she could, but maybe another person would have had a better idea?
Too many people working on one thing is only bad when there are no rules ;)
Rules. There are two different kinds of rules.
Styling / organisation / documentation -- everything that doesn't have much to do with coding itself. These should be enforced at a certain point, otherwise the code will become a hot-glued mess no one wants to work on.
Coding itself. This is a very critical thing.
Do: Forbid things that are known to be problematic in the programming language itself, e.g. usage of variables in variables, reflection, deprecated features.
Do: Define a feature set for each language. A feature set doesn't mean every feature you want to use! Rather, a fixed minimum version every developer must use and -- in the case of library / module / plugin support -- which additional extras are supported.
Every extra costs. Most developers don't want to realize this... And a codebase that evolves over time should have minimal dependencies. Every new version of an extra can bring bugs, breakages, incompatibilities and so on.
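As one hypothetical way to make the "fixed minimum version" rule concrete (assuming a Python codebase; the 3.10 floor is an example value, not something from the original rant), a guard like this fails fast instead of breaking subtly later:
```python
# Hypothetical version gate enforcing the team's agreed minimum interpreter
# version at import time. The 3.10 floor is a made-up example.

import sys

MINIMUM = (3, 10)

if sys.version_info < MINIMUM:
    raise RuntimeError(
        "This codebase targets Python %d.%d+, found %s"
        % (MINIMUM[0], MINIMUM[1], sys.version.split()[0])
    )
```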
Don't: Don't specify a way of coding. Most coding guidelines are horrific copy-pastes from books written by some smart people who have no fucking clue what you're doing and why.
If you don't know how to operate on people, standing in an OR and doing what a book told you to do would pretty surely end with a dead person. Same for code.
Learn from mistakes and experience, respect knowledge from other people, but always reflect on whether this makes sense in this specific area of code.
There are very few things which are applicable to a large codebase on a global level. Even DRY / SOLID and whatever else you can come up with can, at a certain point, be completely wrong.
Good code is a lie -- because it can only exist at a certain point in time.
A codebase should be a living thing - when certain parts rot, other parts will be affected too.
The reason for the length of this comment was to give some hints on the principles I follow so that code stays in an "okayish" state -- but "good" is a very rare state. -
define "good".
If it's "knowing one's way around" - then yes, I guess I'm good in the context of some languages. How did I get there?
1. a good night's sleep (yes, #1; I've learnt from my mistakes during my studies)
2. accepting/making up challenging tasks
3. toying around with the tools and abusing them heavily (like creating video games in bash or doing some metaprogramming)
4. when you find it hard to find any material about the tool/language that would be new to you - consider yourself good at it. -
Shaddock proverb: by continuously trying, we ended up succeeding. So the more we fail, the more the success rate improves.
-
This week's prompt implies most or all of us are "good" at programming, which... how do we define the threshold for this? Actually, before that, how do we QUANTIFY it?
also i'm shit at programming -
Being at a shitty job with me as the only person on the project. I check almost everything thrice so that I have as few problems as possible in the future.
-
I don't know if I'm 'good'.
I've only been doing this for a few years.
I do think I'm 'responsible'. I'll admit my mistakes, I'll fix them, and I'm happy to get out of my comfort zone. I don't mind working with various folks to get the job done (even if that produces a rant or two... that's healthy). If someone has a different idea I'm happy to try it, and I communicate with those I work with about what is up and such. -
The question is: How do you tell if you are good?
But it's the same as with most skills:
Getting better is achieved by actually doing it. A lot. -
I can't categorically or systematically answer your question right now because my IDE is staring right at me.
-
I won't say I'm very good, but relatively speaking I've gotten better over time through sheer practice, nothing else. -
-
I tried, then I tried again, then I tried again, but harder, and after eight years of constant learning every darn day after school something clicked inside my head and I realized everything.
I've never really learned that hard since, because I don't need to. -
This begs the question: how do you define being good at programming? How can you tell if you are actually good or just think you are?
Having asked that, I think I’m getting there... by reading other people’s code, by listening to feedback from better devs than I am, by asking questions and discussing matters I may not fully comprehend, by reading books and articles, by trial and error, and by constantly seeking new concepts, languages and other relevant matters to learn. That’s how one becomes better -- when one is good is another story altogether. -
Dunno if I would say I've become a good coder.
I regularly see stuff written by others and think, holy shit I couldn't manage that.
But somehow I've built a career out of people telling me what they need and me producing solutions for them. So they pay me.
I'm fairly personable, ask questions, listen to answers, when I fuck up (which is still just as often as 15 years ago) I take responsibility.
On the tech side, just keep doing it. Tackle one problem after another. Ask others and of course:
https://9gag.com/gag/aj9xAmG -
Not quite, maybe almost good?
But I am still trying to get good.
I still read the documentation and guides when I write a program, especially when trying to use library code within my program. -
I don't consider myself good at all, but I improved a lot through coding competitions -- not in programming itself, but definitely in problem solving.
Sometimes the best way to improve is to get out of the comfort zone and try something you don't know how to do at the very moment. You'll learn a lot, and learning what you need at the exact time that you need it is way more effective than studying random things from a book for an exam.