About
I'm never getting my engineering degree. I also like shouting my opinions like they are the ultimate Truth :)
Skills
C, C++, C#, JS, HTML
Joined devRant on 9/20/2019
Not working, not looking at the phone, not listening to music, just sitting there and doing nothing until you get bored enough to get back to it.
My monitor stands on top of a tower of comic books that lifts it level with my head. Most are Tintin adventures.
Damn, lots of you knew this shit before coming of age.
I didn't code a single line until I went to college.
I tried to, but it was just too fucking complicated and I didn't understand a thing. I tried to grasp tools like Unity, an Adventure Maker of sorts, and something called Flix for Flash games. Didn't understand shit.
I decided to study systems engineering based on a career aptitude test I took, hoping that somehow I could learn something that way.
First thing I was taught was bash.
When I realised I already knew enough to code a whole text adventure from scratch with such a simple language I felt really hyped.
Always loved text and graphic adventures.
Afterwards I was taught the Z80 assembly language and how CPU registers worked and it blew my fucking mind.
That was the first half-year.
Then I was taught C. And boy was it hard. Didn't get how memory was being handled until the very end.
I happened to be one of the few who passed a stupidly complicated semifinal test on triple-indirection pointers.
That felt goood.
Learning other languages afterwards was a piece of cake. C#, Java, X86 assembly, C++...
It was a hard door to open. Fucking heavy. But now nothing seems black magic anymore and boy isn't that something to be proud of! :D
"Jump ship!" "GEt oUtTa thEre!" "LoOk fer anOtheR jOb!"
Every fucking rant about bad work environment has those mudafucking answers LIKE THE PERSON DID NOT THINK ABOUT THAT.
Come on man, at least acknowledge the obviousness of your comment by adding a "I know you probably thought about this but...".
It'd be better if you could stop contaminating the comments with such fucking obviousness altogether though.
I feel the stress in my head and a fire within cause my PRs are rejecteeed ♫
I have an error to solve and frustration to spaaaaare!
What a beautiful wind blowing through~
I wish that it blew my shaaame
Just fire me alreaaaadyyyyy ♫
- a song by Bugged the series on Disney Channel
Not finding what I want via google so I'll ask here: What's the deal with opengles android shaders freezing my phone's screen?
Is it normal unavoidable behaviour for a shader with an infinite loop to fuck up the visual output irreversibly (until phone restart)?
I was merging using Visual Studio Code when I realised I wanted to merge neither the current nor the incoming code. I wanted it both gone. And that option does not exist!
Only accept one, the other, or both.
'Twas a mildly interesting experience.
For windows: Sublime
For linux tty: Vim
VSCode is a bloated piece of shit which can't open a folder without reloading the whole window. It's only nice next to the rest of the bloated shit out there.
So here I am reinventing the wheel making an HTTP server in C.
Finished implementing HTTP/1.1 and WebSockets support and now I find out the current thingy is HTTP/2.
Well that's fine, I'll add support for that later. In fact I kinda dig it since it uses binary conventions instead of plain text ones.
I dig a little bit and find out there already is an HTTP/3 going around which uses UDP.
How do you handle error checking? I always feel sad after I add error checking to a code that was beautifully simple and legible before.
It still remains so but instead of each line meaning something it becomes if( call() == -1 ) return -1; or handleError() or whatever.
Same with try catch if the language supports it.
It's awful to look at.
So awful I end up evading it forever.
"Malloc can't fail right? I mean it's theoretically possible but like nah", "File open? I'm not gonna try catch that! It's a tmp file only my program uses come oooon", all these seemingly reasonable arguments cross my head and make it hard to check the frigging errors. But then I go to sleep and I KNOW my program is not complete. It's intentionally vulnerable. Fuck.
How do you do it? Is there a magic technique or one has to reach dev nirvana to realise certain ugliness and cluttering is necessary for the greater good sometimes and no design pattern or paradigm can make it clean and complete?
Right off the bat
No beating around the bush like a pansy
Actually knowing the CPU operations (which in essence are very limited and simple) will help him/her understand EVERYTHING: its limits and its capabilities.
Once usaians stop calling themselves americans they can talk and complain about proper non-misleading identifiers 😘
As a dev, the present one. First year working as a dev, since March. I learned a bajillion things and am being paid.
As a dev this was easily the most productive year of my life.
Today I found this jewel in a PR from a respected dev at my workplace:
Half-Life 2 runs smoothly on a 12-year-old PC with an Nvidia 8500, 1 GB RAM, and a dual core.
An FPS with wavy water reflections, body physics, and huge designed maps, all updated every fucking frame.
Today I can't smoothly run an IDE on 8 GB of RAM and 4 cores.
A program which only reacts to events stutters if I write at more than 3 letters per sec.
I wanna go back. Can we go back? Let's keep the new hardware and go back with the software pleeeease.
GLFW is the cleanest, best-documented, most convenient API for creating and handling windows on Linux and Windows I've ever used.
The only thing that bugs me is that valgrind detects memory leaks in it.
I asked at an interview if they documented their code with class diagrams.
One of the interviewers told me: "Good code doesn't need a class diagram"
Why do you lil' shits keep making LAYERS and LAYERS of unnecessary abstraction and then call it goddamn progress???
Dude what the fuck is this UEFI shit?!
Why the hell do I NEED to import a frigging library and read tons of boring and overly complicated documentation just so I can paint a pixel on the screen now uh??
Alright alright yeah so the BIOS is a little basic but daaaamit son if you want something a bit more complicated you make it yourself or install an OS that provides it! Like we've been doing it for years!!!
Dude, you don't get to know what a file system is until I tell you!
The PC be like:
"You wanna dereference the 0x0 pointer? There you go: it's 0xE9DF41, anything else?
You wanna write to the screen? Ok, I have a perfectly convenient interrupt set up for that.
Wanna paint a pixel yellow? Ok, just call this other interrupt. Theere we go.
And it only took four bytes and a nanosecond to do it."
That shit works, and if you want something more complex, but not too much, that still runs efficiently install DOS.
Don't mess around with the hardware pleeease.
We can still understand what's going on down there. Once UEFI steps in, it'll be like sealing a door forever. Long live BIOS damn it all!