Search - "runtime bugs"
-
I could bitch about XSLT again, as that was certainly painful, but that’s less about learning a skill and more about understanding someone else’s mental diarrhea, so let me pick something else.
My most painful learning experience was probably pointers, but not pointers in the usual sense of `char *ptr` in C and how they’re totally confusing at first. I mean, it was that too, but in addition it was how I had absolutely none of the background needed to understand them, not having any learning material (or guidance), nor even a typical compiler to tell me what I was doing wrong — and on top of all of that, only being able to run code on a device that would crash/halt/freak out whenever I made a mistake. It was an absolute nightmare.
Here’s the story:
Someone gave me the game RACE for my TI-83 calculator, but it turned out to be an unlocked version, which meant I could edit it and see the code. I discovered this later on by accident while trying to play it during class, and when I looked at it, all I saw was incomprehensible garbage. I closed it, and the game no longer worked. Looking back, I must have changed something, but at the time I thought it was just magic. It took me a long time to get curious enough to look at it again.
But in the meantime, I ended up playing with these “programs” a little, and made some really simple ones, and later some somewhat complex ones. So the next time I opened RACE, I kind of understood what it was doing.
Moving on, I spent a year learning TI-Basic, and eventually reached the limit of what it could do. Along the way, I learned that all of the really amazing games/utilities that were incredibly fast, had greyscale graphics, lowercase text, no runtime indicator, etc. were written in “Assembly,” so naturally I wanted to use that, too.
I had no idea what it was, but it was the obvious next step for me, so I started teaching myself. It was z80 Assembly, and there were practically no documents, no resources, nothing helpful online.
I found the specs, and a few terrible docs and other sources, but with only one year of programming experience, I didn’t really understand what they were telling me. This was before stackoverflow, etc., too, so what little help I found was mostly from forum posts, IRC (mostly got ignored or made fun of), and reading other people’s source when I could find it. And usually that was less than clear.
And here’s where we dive into the specifics. Starting with so little experience, and in TI-Basic of all things, meant I had zero understanding of pointers, memory and addresses, the stack, heap, data structures, interrupts, clocks, etc. I had mastered everything TI-Basic offered, which astoundingly included arrays and matrices (six of each), but it hid everything else except basic logic and flow control. (No, there weren’t even functions; it had labels and goto.) It has 27 numeric variables (A-Z and theta, which can store either float or complex numbers), 8 lists (numeric arrays), 6 matrices (2D numeric arrays), 10 strings, and a few other things like “equations” and literal bitmap pictures.
Soo… I went from knowing only that to learning pointers. And pointer math. And data structures. And pointers to pointers, and the stack, and function calls, and all that goodness. And remember, I was learning and writing all of this in plain Assembly, in notepad (or on paper at school), not in C or C++ with a teacher, a textbook, SO, and an intelligent compiler with its incredibly helpful type checking and warnings. Just raw trial and error. I learned what I could from whatever cryptic sources I could find (and understand) online, and applied it.
But actually using what I learned? If a pointer was wrong, it resulted in unexpected behavior, memory corruption, freezes, etc. I didn’t have a debugger, an emulator, etc. I had notepad, the barebones compiler, and my calculator.
Also, iterating meant changing my code, recompiling, factory resetting my calculator (removing the battery for 30+ sec) because bugs usually froze it or corrupted something, then transferring the new program over, and finally running it. It was soo slowwwww. But I made steady progress.
Painful learning experience? Check.
Pointer hell? Absolutely.
-
Dear Santa, please bring me a compiler that generates compile errors instead of runtime bugs. Thanks.
-
I've been using the Microsoft dev stack for as long as I can remember. Since I picked up C#/.NET in 2002 I haven't looked back. I got spoiled by things like type safety, generics, LINQ and its functional twist on C#, await/async, and Visual Studio, the best IDE one could ask for.
Over the past few years though, I've seen the rise of many competing open source stacks that get many things right, e.g. command line tooling, package management, CI, CD, containerization, and Linux friendliness. In general, many of those frameworks are friendlier to Mac than to Windows. Microsoft started sobering up to this fact and began open sourcing its frameworks and tools, and generally being more Mac/Linux friendly, but I think that, first, it's a bit too late, and second, it's not mature yet; not even comparable to what you get on VS + Windows.
More recently I switched jobs and I'm mainly using Mac, Python, and some Java. I've also used Node in a couple of small projects. My feeling: even though I may be resisting change, I genuinely feel that C# is a better designed language than Java, and I feel that statically typed languages are far superior to dynamic ones, especially on large projects with a large number of developers. I get that dynamic languages give you a productivity boost and make you feel liberated, but most of the time I feel that this productivity is lost when you have to compensate for the missing type safety with more unit tests that would not be necessary in a statically typed language; you also tend to get subtle bugs that only manifest at runtime.
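To illustrate the kind of thing I mean, here's a toy sketch in TypeScript (purely illustrative, made-up names like Invoice and sendReceipt) of a typo that a static type checker rejects at compile time, while a dynamic language happily defers the blow-up to runtime:
-------
interface Invoice {
  total: number;
  customerEmail: string;
}

function sendReceipt(invoice: Invoice): string {
  // Typo: 'customerEmai' is not a property of Invoice.
  // The compiler rejects a line like this before the code ever ships:
  // return "Receipt for " + invoice.total + " sent to " + invoice.customerEmai;

  // In a dynamic language, nothing complains until this exact code path
  // actually executes in front of a user.
  return "Receipt for " + invoice.total + " sent to " + invoice.customerEmail;
}

console.log(sendReceipt({ total: 42, customerEmail: "dev@example.com" }));
-------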
So I'm really torn: enjoy a world-class development platform and language but sacrifice a large ecosystem of open source tools and practices that get the devops culture, or be content with less polished frameworks/languages but a much larger community that gets how apps should be built, deployed, monitored, etc.
Damn you Microsoft for coming late to the open source party.
-
Worst collaboration experience story?
I was not directly involved, it was a Delphi -> C# conversion of our customer returns application.
The dev manager was out to prove that waterfall was the only development methodology that could convert the monolithic app into a lean, multi-tier, enterprise-worthy application.
Starting out with a team of 7 (3 devs, 2 DBAs, the team mgr, and the dev department mgr), they spent around 3 months on design, meetings, and more meetings. Armed with a 50+ page specification Word document (not counting the countless Visio workflow diagrams and Microsoft Project timeline/Gantt charts), the team was ready to start coding.
The database design, workflow, and UI design (using Visio) were well done and thought out, but problems started on day one.
- The team mgr and dev mgr split up the 3 devs: 1 dev wrote the database access library tier, 1 wrote the service tier, and the other wrote the UI (I'll add this was that dev's first experience with WPF).
- Per the specification, all the layers wouldn't be integrated until all of them met the standards (unit tested, free from errors from VS's code analyzer, etc)
- By the time the devs where ready to code, the DBAs were already tasked with other projects, so the Returns app was prioritized to "when we get around to it"
Fast forward 6 months: all the devs were 'done' coding, having had very little/no communication with one another, and then came the integration. The service and database layers assumed different design patterns and different database relationships, and the UI layer required functionality neither layer anticipated (ex. multi-user support and the service maintaining some sort of state between them).
Those issues took about a month to work out, then the app began beta testing with real end users. The app didn't make it 10 minutes before users gave up. Numerous UI logic errors, runtime errors, and overall app instability. Because the UI was so bad, the dev mgr brought in one of the web developers (she was pretty good at UI design). You might guess how useful someone is when dropped into a complex project months after the fact and told "Fix it!".
Couple of months of UI re-design and many other changes, the app was ready for beta testing.
In the meantime, the company hired a new customer service manager. When he saw the application, he rejected it because he had redesigned the entire returns process to be more efficient. The application UI was written to the exact step-by-step old returns process with little/no deviation.
With a tremendous amount of push-back (TL;DR), the dev mgr promised to change the app, but only after it was deployed into production (using "we can fix it later" excuse).
Still plagued with numerous bugs, the app was finally deployed. In an attempt to save face, there was a company-wide party to celebrate the 'death' of the "old Delphi returns app" and the birth of the new. Cake, drinks, certificates of achievement for the devs, etc.
By the end of the project, the devs hated each other. Finger pointing, petty squabbles, out-right "FU!"s across the cube walls, etc. All the team members were re-assigned to other teams to separate them, leaving a single new hire to fix all the issues.
-
Did a bunch more cowboy coding today as I call it (coding in vi on production). Gather 'round kiddies, uncle Logan's got a story fer ya…
First things first, disclaimer: I'm no sysadmin. I respect sysadmins and the work they do, but I'm the first to admit my strengths definitely lie more in writing programs rather than running servers.
Anyhow, I recently inherited someone else's codebase (the story of my professional career, but I digress) and let me tell you, this thing has amateur hour written all over it. It's written in PHP and JavaScript by a self-taught programmer who apparently discovered procedural programming and decided there was nothing left to learn and stopped there (no disrespect to self-taught programmers).
I could rant for days about the various problems this codebase has, but today I have a very specific story to tell. A story about errors and logs.
And it all started when I noticed the disk space on our server was gradually decreasing.
So today I logged onto our API server (Ubuntu running Apache/PHP) and did a df -h to check the disk space, and was surprised to see that it had noticeably decreased since the last time I'd checked when everything was running smoothly. But seeing as this server does not store any persistent customer data (we have a separate db server) and purely hosts the stateless API, it should NOT be consuming disk space over time at all.
The only thing I could think of was the logs, but the logs were very quiet, just the odd benign message that was fully expected. Just to be sure I did an ls -Sh to check the size of the logs, and while some of them were a little big, nothing over a few megs. Nothing to account for gigabytes of disk space gradually disappearing.
What could it be? I wondered.
cd ../..
du . | sort --sort=numeric
What's this? 2671132 K in some log folder buried in the api source code? I cd into it and it turns out there are separate PHP log files in there, split up by customer, so that each customer of ours (we have 120) has their own respective error log! (Why??)
Armed with this newfound piece of (still rather unbelievable) evidence I perform a mad scramble to search the codebase for where this extra logging is happening and sure enough I find a custom PHP error handler that is capturing (most) errors and redirecting them to these individualized log files.
Conveniently enough, not ALL errors were being absorbed though, so I still knew the main error_log was working (and any time I explicitly error_logged it would go there, so I was none the wiser that this other error-catching was even happening).
Needless to say I removed the code as quickly as I found it, tail -f'd the error_log and to my dismay it was being absolutely flooded with syntax errors, runtime PHP exceptions, warnings galore, and all sorts of other things.
My jaw almost hit the floor. I've been with this company for 6 months and had no idea these errors were even happening!
The sad thing was how easy to fix all the errors ended up being. Most of them were "undefined index" errors that could have been completely avoided with a simple isset() check, but instead ended up throwing an exception, nullifying any code that came after it.
Anyway kids, the moral of the story is don't split up your log files. It makes absolutely no sense and can end up obscuring easily fixable bugs for half a year or more!
Happy coding.
-
Today my fellow @EaZyCode found out that a local Hosting Provider has a massive security breach.
He wrote a plugin for Minecraft with its own file explorer and the ability to execute runtime commands through it.
We discovered that this specific hosting provider stores the ftp passwords one level above the FTP-Root. In FUCKING PLAIN TEXT! AND THE MYSQL PASSWORD TOO! And even more shit is stored there ready to be viewed by intelligent people...
It's one of the fucking biggest hosting providers in Germany!
But, because EaZyCode has such a great mind and always finds such bugs, I give him the title "Providers Endboss" today, he has earned it.
Loving you ❤️
Edit: we used SendMail with runtime commands and sent too many empty spam mails (regret nothing)
-
I want to explain to people like ostream (aka aviophille) why JS is a crap language. Because they apparently don't know (lol).
First I want to say that JS is fine for small things like gluing some parts together. Like, you know, the exact thing it was intended for when it was invented: scripting.
So why is it bad as a programming language for whole apps or projects?
No type checks (dynamic typing). This is typical for scripting languages and not necessarily bad for such a language, but it's certainly bad for a programming language.
"truthy" everything. It's bad for readability and it's dangerous because you can accidentaly make unwanted behavior.
The existence of == and ===. The rule for many real life JS projects is to always use === to be more safe.
In general: The correct thing should be the default thing. JS violates that.
Automatic semicolon insertion can cause funny surprises.
If semicolons aren't truly optional, then they should not be allowed to be omitted.
No enums. Do I need to say more?
No generics (of course, lol).
Fucked up implicit type conversions that violate the principle of least surprise (you know those from all the memes).
No integer data types (only floating point). BigInt obviously doesn't count.
No value types and no real concept of immutability. "Const" doesn't count because it only makes the reference immutable (see lack of value types). "Freeze" doesn't count since it's a runtime enforcement and therefore pretty useless.
No algebraic types. That one can be forgiven though, because it's only common in the most modern languages.
The need for null AND undefined.
No concept of non-nullability (values that can not be null).
JS embraces the "fail silently" approach, which means that many bugs remain unnoticed and will be a PITA to find and debug.
Some of the problems can and have been addressed with TypeScript, but most of them are unfixable because it would break backward compatibility.
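For example, here's a minimal TypeScript sketch (made-up names like User and greet, purely illustrative) of two items from the list above — nullability and silent coercion — being turned into compile-time errors:
-------
interface User {
  name: string;
  nickname?: string; // explicitly "may be missing" -- the compiler tracks it
}

function greet(user: User): string {
  // With strictNullChecks, a line like the commented-out one below is rejected:
  // 'user.nickname' is possibly 'undefined'. Plain JS only fails when it runs.
  // return "Hi " + user.nickname.toUpperCase();

  // The compiler forces the missing case to be handled explicitly:
  return "Hi " + (user.nickname ?? user.name).toUpperCase();
}

const answer: number = 42;
// TypeScript also flags comparisons between types with no overlap, where
// plain JS '==' would silently coerce:
// if (answer == "42") { ... }

console.log(greet({ name: "ada" }), answer);
-------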
So JS is truly rotten at the core and can not be fixed in principle.
That doesn't mean that I also hate JS devs. I pity your poor souls for having to deal with this abomination of a language.
It's likely that I forgot to mention many other problems with JS, so feel free to extend the list in the comments :)
Merry Christmas!
-
Node: The most passive aggressive language I've had the displeasure of programming in.
Reference an undefined variable in a module? Prepare to waste your time hunting for it, because the runtime won't tell you about it until you reference a property or method on the quietly undefined module object.
Think you know how promises work? As a hiring manager, I've found that less than 5% of otherwise well-experienced devs are out of the Dunning Kruger danger zone.
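The classic case — a toy sketch with hypothetical functions loadUser and updateProfile — looks harmless and silently swallows a failure:
-------
async function loadUser(): Promise<string> {
  return "ada";
}

async function updateProfile(user: string): Promise<void> {
  throw new Error("profile service is down for " + user);
}

loadUser()
  .then((user) => {
    updateProfile(user); // missing 'return' -> this rejection never joins the chain
  })
  .catch((err) => {
    // Never sees the updateProfile failure; Node instead reports an
    // unhandled promise rejection later, far away from this code.
    console.error("handled:", err);
  });
-------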
Async causes edge cases and extra dev effort that add to the effort required to make a quality product.
Got a bug in one of your modules? Prepare yourself for some downtime because a single misplaced parentheses can take out the entire Node process, killing unrelated pages and even static file hosting.
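A sketch of that failure mode (made-up route and config names): one bad property access in a single handler, and without a process-level guard the whole server goes down with it.
-------
import * as http from "http";

const server = http.createServer((req, res) => {
  if (req.url === "/broken") {
    const config: any = undefined;  // e.g. a lookup that quietly came back empty
    res.end(config.title);          // TypeError: without the handler below,
                                    // this one line kills the entire process
  } else {
    res.end("this route was fine until someone hit /broken\n");
  }
});

// Damage control, not a fix: the broken request just hangs, but at least
// unrelated routes keep being served instead of the process dying outright.
process.on("uncaughtException", (err) => {
  console.error("uncaught:", err);
});

server.listen(3000);
-------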
All this makes for a programming experience that demands much higher cognitive load, creates more categories of bugs, and leads to code bloat/smell much more quickly than other commonly substituted languages.
From a business perspective, the money you save on scaling (assuming your app is more compute efficient under Node) is wasted on salaries and opportunity costs stemming from longer dev time, more QA, and more frequent outages.
IMO, Node is an awesome experiment, a fun language, a great tool for specific use cases, and a terrible fucking choice for an entire website.
-
Haskell: Turning runtime bugs into compile-time errors.
Python: Turning compile-time errors into runtime bugs.
Got a call that production was about to fail. They thought it was the application server.
In the end it was bogus file modes which had been scrambled by the backup tool.
Why didn't we find out earlier? Because the Java application was coded like this:
-------
String content = null;
try {
    File bla = new File(path);  // 'path' stands in for whatever file was being read
    content = new String(Files.readAllBytes(bla.toPath()));
} catch (IOException | RuntimeException ex) {  // SecurityException is a RuntimeException, so it gets swallowed here too
    // nothing we can do here
}
doWork(content);
---------
Why the fuck do we have code reviews? Why not just log it or throw a RuntimeException? Argh... I thought it would be better in enterprise applications. Perhaps I should tell them not to just use PMD, but also SpotBugs and SonarQube. But the department for the build tools does not have enough employees. Dang.
Anyway. Earned some money for that.
Now it's 2018 and I still get money for the same kind of bugs as in 2008.