Interviewer: Welcome, Mr X. Thanks for dropping by. We like to keep our interviews informal. And even though I have all the power here, and you are nothing but a cretin, let’s pretend we are going to have fun here.
Mr X: Sure, man, whatever.
I: Let’s start with the technical stuff, shall we? Do you know what a linked list is?
X: (Tells what it is).
I: Great. Can you tell me where linked lists are used?
X: Sure. In interview questions.
I: What?
X: The only time linked lists come up is in interview questions.
I: That’s not true. They have lots of real world applications. Like, like…. (fumbles)
X: Like to implement memory allocation in operating systems. But you don’t sell operating systems, do you?
I: Well… moving on. Do you know what the Big O notation is?
X: Sure. It’s another thing used only in interviews.
I: What?! Not true at all. What if you want to sort a billion records a minute, like Google has to?
X: But you are not Google, are you? You are hiring me to work with 5 year old PHP code, and most of the tasks will be hacking HTML/CSS. Why don’t you ask me something I will actually be doing?
I: (Getting a bit frustrated) Fine. How would you do FooBar in version X of PHP?
X: I would, er, Google that.
I: And how do you call library ABC in PHP?
X: Google?
I: (shocked) OMG. You mean you don’t remember all the 97 million PHP functions, and have to actually Google stuff? What if the Internet goes down?
X: Does it? We’re in the 1st world, aren’t we?
I: Tut, tut. Kids these days. Anyway, looking at your resume, we need at least 7 years of ReactJS. You don’t have that.
X: That’s great, because React came out last year.
I: Excuses, excuses. Let’s ask some lateral thinking questions. How would you go about finding how many piano tuners there are in San Francisco?
X: 37.
I: What?!
X: 37. I googled before coming here. Also Googled other puzzle questions. You can fit 7,895,345 balls in a Boeing 747. Manhole covers are round because that is the shape that won’t fall in. You ask the guard what the other guard would say. You then take the fox across the bridge first, and eat the chicken. As for how to move Mount Fuji, you tell it a sad story.
I: Ooooooooookkkkkaaaayyyyyyy. Right, tell me a bit about yourself.
X: Everything is there in the resume.
I: I mean other than that. What sort of a person are you? What are your hobbies?
X: Japanese culture.
I: Interesting. What specifically?
X: Hentai.
I: What’s hentai?
X: It’s a televised art form.
I: Ok. Now, can you give me an example of a time when you were really challenged?
X: Well, just the other day, a few pennies from my pocket fell behind the sofa. Took me an hour to take them out. Boy was it challenging.
I: I meant technical challenge.
X: I once spent 10 hours installing Windows 10 on a Mac.
I: Why did you do that?
X: I had nothing better to do.
I: Why did you decide to apply to us?
X: The voices in my head told me.
I: What?
X: You advertised a job, so I applied.
I: And why do you want to change your job?
X: Money, baby!
I: (shocked)
X: I mean, I am looking for more lateral changes in a fast moving cloud connected social media agile web 2.0 company.
I: Great. That’s the answer we were looking for. What do you feel about constant overtime?
X: I don’t know. What do you feel about overtime pay?
I: What is your biggest weakness?
X: Kryptonite. Also, ice cream.
I: What are your salary expectations?
X: A million dollars a year, three months paid vacation on the beach, stock options, the lot. Failing that, whatever you have.
I: Great. Any questions for me?
X: No.
I: No? You are supposed to ask me a question, to impress me with your knowledge. I’ll ask you one. Where do you see yourself in 5 years?
X: Doing your job, minus the stupid questions.
I: Get out. Don’t call us, we’ll call you.
All Credit to:
http://pythonforengineers.com/the-p...
-
Interview with a candidate. He calls himself "C++ expert" on his resume. I think: "oh, great, I love C++ too, we will have an interesting conversation!"
Me: let's start with an easy one, what is 'nullptr'?
Him: (...some undecipherable sequence of words that didn't make any sense...)
In my mind: hm, maybe I didn't understand him right. Let's try again with something simple and more generic.
Me: can you tell me about memory management in C++?
Him: you create objects on the stack with the 'new' keyword and they get automatically released when no other object references them
In my mind: wtf is this guy talking about? Is he confusing C++ with Java? Does he really know C++? Let's make him write some code, just to be sure
Me: can you write a program that prints numbers from 1 to 10?
Ten minutes and twenty mistakes later...
Me: okay, so what is this <int> here in angle brackets? What is a template?
Him: no idea
Me: you wrote 'cout', why sometimes do I see 'std::cout' instead? What is 'std'?
Answer: no idea, never heard of 'std'
I think: on his resume he also said he is a Java expert. Let's see if he knows the difference between the two. He *must* have noticed that one is byte-compiled and the other one is compiled to native code! Otherwise, how does he run his code? He must answer this question correctly:
Me: what is the difference between Java and C++? One has a Virtual Machine, what about the other?
Him: Java has the Java Virtual Machine
Me: yes, and C++?
Him: I guess C++ has a virtual machine too. The C++ Virtual Machine
Me (exhausted): okay, I don't have any other questions, we will let you know
And this is the story of how I got scared of interviews.
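For the record, a minimal sketch of what correct answers to the warm-ups could have looked like (the code and comments are mine, not from the interview):

#include <iostream>  // std::cout lives here

int main() {
    // 'std' is the C++ standard library namespace; 'std::cout' is just
    // 'cout' with its namespace spelled out instead of 'using namespace std;'.
    for (int i = 1; i <= 10; ++i) {
        std::cout << i << '\n';
    }

    // And the first question: 'nullptr' is the typed null pointer
    // constant introduced in C++11, replacing the old 0/NULL.
    const char* p = nullptr;
    return p == nullptr ? 0 : 1;
}
-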
Toilets and race conditions!
A co-worker asked me what issues multi-threading and shared memory can have. So I explained the thing with the lock to him. He wasn't quite sure whether he got it.
Me: imagine you go to the toilet. You check whether there's enough toilet paper in the stall, and it is. BUT now someone else comes in, does business and uses up all paper. CPUs can do shit very fast, can't they? Yeah and now you're sitting on the bowl, and BAMM out of paper. This wouldn't have happened if you had locked the stall, right?
Him: yeah. And with a single thread?
Me: well if you're alone at home in your appartment, there's no reason to lock the door because there's nobody to interfere.
Him: ah, I see. And if I have two threads, but no shared memory, then it is as if my wife and I are at home, each with a toilet of our own, and then we don't need to lock either.
Me: exactly!
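If you want the analogy in actual code, here is a rough C++ sketch of the stall (names and numbers invented for illustration):

#include <iostream>
#include <mutex>
#include <thread>

int paper_sheets = 8;    // shared memory: one roll for everybody
std::mutex stall_door;   // the lock on the stall

void use_toilet(const char* who, int sheets_needed) {
    // The check and the use must happen under the same lock; otherwise
    // another thread can use up the paper between our check and our use.
    std::lock_guard<std::mutex> lock(stall_door);
    if (paper_sheets >= sheets_needed) {
        paper_sheets -= sheets_needed;
        std::cout << who << " is fine\n";
    } else {
        std::cout << who << " is out of paper!\n";
    }
}

int main() {
    std::thread me(use_toilet, "me", 6);
    std::thread other(use_toilet, "someone else", 6);
    me.join();
    other.join();
}
-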
We had a Commodore64. My dad used to be an electrical engineer and had programs on it for calculations, but sometimes I was allowed to play games on it.
When my mother passed away (late 80s, I was 7), I closed up completely. I didn't speak, locked myself into my room, skipped school to read in the library. My dad was a lovely caring man, but he was suffering from a mental disease, so he couldn't really handle the situation either.
A few weeks after the funeral, on my birthday, the C64 was set up in my bedroom, with the "Programmer's Reference Guide" on my desk. I stayed up late every night to read it and try the examples, and thought about those programs while in school. I memorized the addresses of the sound and sprite buffers, and learnt how programs were managed in memory and stored on the cassette.
I worked on my own games, got lost in the stories I was writing, mostly scifi/fantasy RPGs. I bought 2764 EPROMs and soldered custom cartridges so I could store my finished work safely.
When I was 12 my dad disappeared, was found, and hospitalized with lost memory. I slipped through the cracks of child protection, felt responsible to take care of the house and pay the bills. After a year I got picked up and placed in foster care in a strict Christian family who disallowed the use of computers.
I ran away when I was 13, rented a student apartment using my orphanage checks (about €800/m), got a bunch of new and recycled computers on which I installed Debian, and learnt many new programming languages (C/C++, Haskell, JS, PHP, etc). My apartment mates joked about the 12 CRT monitors in my room, but I loved playing around with experimental networking setups. I tried to keep a low profile and attended high school, often faking my dad's signatures.
After a little over a year I was picked up by child protection again. My dad was living on his own again, partly recovered, and in front of a judge he agreed to be provisional legal guardian, despite his condition. I was ruled to be legally an adult at the age of 15, and got to keep living in the student flat (a nation-wide foster parent shortage played a role).
OK, so this sounds like a sob story. It isn't. I fondly remember my mom, my dad is doing pretty well, enjoying his old age together with a nice woman in some communal landhouse place.
I had a bit of a downturn from age 18-22 or so, lots of drugs and partying. Maybe I just needed to do that. I never finished any school (not even high school), but managed to build a relatively good career. My mom was a biochemist and left me a lot of books, and I started out as lab analyst for a pharma company, later went into phytogenetics, then aerospace (QA/NDT), and later back to pure programming again.
Computers helped me through a tough childhood.
They awakened a passion for creative writing, for math, for science as a whole. I'm a bit messed up, a bit of a survivalist, but currently quite happy and content with my life.
I try to keep reminding people around me, especially those who have just become parents, that you might feel like your kids need a perfect childhood, worrying about social development, dragging them to soccer matches and expensive schools...
But the most important part is to just love them, even if (or especially when) life is harsh and imperfect. Show them you love them with small gestures, and give their dreams the chance to flourish using any of the little resources you have available.
-
Father bought a PC in 1997. Back then very few had one. I learned to do things like accessing the internet and sending emails, among others. I remember adding a few years to my age on websites to be allowed to sign up at times :P My sisters used to play games on it sometimes. The first few ones we had were Tomb Raider: The Last Revelation, Tomb Raider Chronicles, American McGee's Alice (which caused us to upgrade the PC xD)... and some others.
I have a memory of this pseudo-3D-looking game where you move in a maze and try answering questions. I want to remember its name, but I cannot :(
We literally have video evidence of me liking the computer as a child, yet my parents either say I'm addicted or deny I've ever liked it before. Not only that, but in their opinion, continuously limiting my time with the PC hasn't been an obstacle in my way whenever I tried to do things. Funny how my parents think I've been at my worst in the last few years, when they've hurt me so much in those years that our relationship is guaranteed not to work out. There were doubts in my head before, but now it's cemented and there is no way of going back. Father, for example, tells me it's too late to do anything with a PC now (as well as how I've been unable to use the PC: he looks at these pro players' footage in some TV show and goes, „You've been unable to use your hobbies“, as if they have never ever screamed at me for perceived gaming and never actually cared to check), and that I need to look for a „real“ job.
Sorry. I went to bed at 2:00 in the morning. I feel like a zombie because of ongoing, weirdly insufficient sleep, even though I sleep kinda more than normal. Even when I took melatonin for that, it didn't help at all.
Childhood was where the beating began. I was about 6/7, right when I entered school. The first school that I attended was a private one, supposedly for „Wunderkinds“, while in reality I haven't seen a SINGLE teacher or psychologist approve of it, their argument being that children were basically drowned in work that wasn't age-appropriate (I don't mean anything bad, just that teaching about galaxies and all in first grade isn't the brightest idea). There was always a mountain of homework to do, and as opposed to some other countries, we had to do it on a day-to-day basis; we didn't have a week-long deadline. Predictably, I wasn't keeping up with it the way I could have had it been a normal amount, so my parents decided I didn't want to study and began their methods of getting me to „study“. I have yet to see a person able to keep up with that school's tempo, no matter the age.
This place was also where I got bullied. I felt I had nowhere to be: At home, the parents' situation, at school, the bully. I never really went outside to play with other children, so I missed that part of childhood.
After the second year of school I was transferred to an advanced German school, called like that because they taught German and not English there. I also got to learn a bit of Russian before they removed it from school. In that period I used to attend ballet. But for less than a year. And piano, which I remember having attended for quite a long while, some years, if my memory isn't fried. I quit it because of it having been forced on me. Last piece I ever played fully was Beethoven's Marmotte.
In this school I was once again the outcast of the class. I had some people to interact with. All of those interactions lasted a few years at most. Then, because of a part of my class choosing me as a laughing-stock N2 and another girl as the N1, I found my best friend, who I still have today. She's the only friend I have nearby.
Most of the time I hated myself. Even today I struggle with that sometimes.
After that came university. This is where I got something like a friend circle at last. But it still didn't last. I got in a relationship with one of the guys, but I was just attracted. There was another I couldn't dare get close to. Turns out he also had something for me. Then he disappeared from our lives, and a year after, I still cannot forget the person. If I want to, I have to deprive myself of my own personality. Not a thing I'm willing to give up. Then I broke up with the guy I was in a relationship with and completely disappeared from the friendship circle. To be honest, I had reasons to: they refused to even try to look for the guy, and they had called him a friend for years. My parents hitting me can still occur even today, but only if I REALLY piss them off.
Now I'm here and oh, my God, I'm officially an aunt now! My sister gave birth to a daughter this morning... She's in Berlin with mother and both she and the child are doing great. I just hope she manages to be a good mother.
-
So this was a couple years ago now. Aside from doing software development, I also do nearly all the other IT related stuff for the company, as well as specialize in the installation and implementation of electrical data acquisition systems - primarily amperage and voltage meters. I also wrote the software that communicates with this equipment and monitors the incoming and outgoing voltage and current and alerts various people if there's a problem.
Anyway, all of this equipment is installed into a trailer that goes onto a semi-truck as it's a portable power distribution system.
One time, the computer in one of these systems (we'll call it system 5) had gotten fried and needed to be replaced. It was a very busy week for me, so I had pulled the fried computer out without immediately replacing it with a working system. A few days later, system 5 left to go work on one of our biggest shows of the year - the Academy Awards. We make well over a million dollars from just this one show.
Come the morning of show day, the CEO of the company was in system 5 (it was a Sunday, my day off) and went to set up the data acquisition software to get the system ready to go, and found there was no computer. I promptly got a phone call with lots of swearing and threats to my job. Let me tell you, I was sweating bullets.
After the phone call, I decided I needed to try and save my job. The CEO hadn't told me to do anything, but I went to work, grabbed an old Windows XP laptop that was gathering dust and installed my software on it. I then had to build the configuration file that is specific to system 5 from memory. Each meter speaks the ModBus over TCP/IP protocol, and thus each meter has a different bus id. Fortunately, I'm pretty anal about this and tend to follow a specific method of id numbering.
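To give an idea of what such a file boils down to (every name and address below is invented for illustration, not the real system 5 data), the convention is simply that unit ids run sequentially by each meter's position:

// Hypothetical sketch of a per-system meter map.
struct MeterConfig {
    const char* name;    // label used by the monitoring software
    const char* host;    // meter's TCP/IP address
    int         unit_id; // ModBus unit (slave) id, sequential by position
};

const MeterConfig kSystem5[] = {
    {"main_in_voltage", "192.168.5.10", 1},
    {"main_in_current", "192.168.5.10", 2},
    {"dist_a_current",  "192.168.5.11", 3},
    {"dist_b_current",  "192.168.5.11", 4},
};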
Once I got the configuration file done and tested the software to see if it would even run properly on Windows XP (it did!), I called the CEO back and told him I had a laptop ready to go for system 5. I drove out to Hollywood and the CFO (who was there with the CEO) had to walk about a mile out of the security zone to meet me and pick up the laptop.
I told her I put a fresh install of the data acquisition software on the laptop and it's already configured for system 5 - it *should* just work once you plug it in.
I didn't get any phone calls after dropping off the laptop, so I called the CFO once I got home and asked her if everything was working okay. She told me it worked flawlessly - it was Plug 'n Play so to speak. She even said she was impressed, she thought she'd have to call me to iron out one or two configuration issues to get it talking to the meters.
All in all, crisis averted! At work on Monday, my supervisor told me that my name was Mud that day (by the CEO), but I still work here!
Here's a picture of the inside of system 8 (similar to system 5 - same hardware).
-
In a user-interface design meeting over a regulatory compliance implementation:
User: “We’ll need to input a city.”
Dev: “Should we validate that city against the state, zip code, and country?”
User: “You are going to make me enter all that data? Ugh…then make it a drop-down. I select the city and the state, zip code auto-fill. I don’t want to make a mistake typing any of that data in.”
Me: “I don’t think a drop-down of every city in the US is feasible.”
Manager: “Why? There cannot be that many. Drop-down is fine. What about the button? We have a few icons to choose from…”
Me: “Uh..yea…there are thousands of cities in the US. Way too much data for anyone to realistically scroll through.”
Dev: “They won’t have to scroll, I’ll filter the list when they start typing.”
Me: “That’s not really the issue and if they are typing the city anyway, just let them type it in.”
User: “What if I mistype Ch1cago? We could inadvertently be out of compliance. The system should never open the company up for federal lawsuits”
Me: “If we’re hiring individuals responsible for legal compliance who can’t spell Chicago, we should be sued by the federal government. We should validate the data the best we can, but it is ultimately your department’s responsibility for data accuracy.”
Manager: “Now now…it’s all our responsibility. What is wrong with a few thousand item drop-down?”
Me: “Um, memory, network bandwidth, database storage, who maintains this list of cities? A lot of time and resources could be saved by simply paying attention.”
Manager: “Memory? Well, memory is cheap. If the workstation needs more memory, we’ll add more”
Dev: “Creating a drop-down is easy and selecting thousands of rows from the database should be fast enough. If the selection is slow, I’ll put it in a thread.”
DBA: “Table won’t be that big and won’t take up much disk space. We’ll need to set up stored procedures, and data import jobs from somewhere to maintain the data. New cities, name changes, etc.”
Manager: “And if the network starts becoming too slow, we’ll have the Networking dept. open up the valves.”
Me: “Am I the only one seeing all the moving parts we’re introducing just to keep someone from misspelling ‘Chicago’? I’ll admit I’m wrong or maybe I’m not looking at the problem correctly. The point of redesigning the compliance system is to make it simpler, not more complex.”
Manager: “I’m missing the point to why we’re still talking about this. Decision has been made. Drop-down of all cities in the US. Moving on to the button’s icon ..”
Me: “Where is the list of cities going to come from?”
<few seconds of silence>
Dev: “Post office I guess.”
Me: “You guess?…OK…Who is going to manage this list of cities? The manager responsible for regulations?”
User: “Thousands of cities? Oh no …no one in our area has time for that. The system should do it.”
Me: “OK, the system. That falls on the DBA. Are you going to be responsible for keeping the data accurate? What is going to audit the cities to make sure the names are properly named and associated with the correct state?”
DBA: “Uh..I don’t know…um…I can set up a job to run every night”
Me: “A job to do what? Validate the data against what?”
Manager: “Do you have a point? No one said it would be easy and all of those details can be answered later.”
Me: “Almost done, and this should be easy. For how many cities do we currently have to maintain compliance?”
User: “Maybe 4 or 5. Not many. Regulations are mostly on a state level.”
Me: “When was the last time we created a new city compliance?”
User: “Maybe, 8 years ago. It was before I started.”
Me: “So we’re creating all this complexity for data that, realistically, probably won’t ever change?”
User: “Oh crap, you’re right. What the hell was I thinking…Scratch the drop-down idea. I doubt we’ll have a new city regulation anytime soon, and how hard is it to type in a city?”
Manager: “OK, are we done wasting everyone’s time on this? No drop-down of cities...next …Let’s get back to the button’s icon …”
Simplicity 1, complexity 0.
-
Lads, I will be real with you: some of you show absolute contempt for the actual academic study of the field.
In a previous rant from another ranter, the question of writing a binary search implementation got thrown around.
Asking a senior in the field of software engineering and computer science such a question should yield a simple answer, depending on the type of job application in question. Especially if you are applying as a SENIOR.
I am tired of this strange self-learner mentality that treats those that have a degree or a deep grasp of these fundamental concepts as somewhat beneath you because you learned to push out a website using the New Boston tutorials on youtube. FOR every field THAT MATTERS a license or degree is held in high regard.
"Oh I didn't go to school, shit is for suckers, but I learned how to chop people up and kinda fix it from some tutorials on youtube" <---- try that for a medical position.
"Nah it's cool, I can fix your breaks, learned how to do it by reading blogs on the internet" <--- maintenance shop
"Sure can write the controller processing code for that boing plane! Just got done with a low level tutorial on some websites! what can go wrong!"
(The same goes for military devices which in the past have actually killed mfkers in the U.S)
Just recently a series of people were sent to jail because of a bug in software. Industries NEED to make sure a mfker has aaaall of the bells and whistles needed for running and creating software.
During my master's degree, it fucking FASCINATED me how many mfkers were absolutely, completely NEW to the concept of testing code, some of them with years in the field.
And I know what you are thinking "fuck you, I am fucking awesome" <--- I AM SURE YOU BLOODY WELL ARE, but we live on a planet with billions of people, and millions of them have fallen through the cracks into software related positions as well as completed degrees; the degree at LEAST has a SPECTACULAR barrier of entry during that intro to Algos and DS that a lot of bitches fail.
NOTE: NOT knowing the ABSTRACTIONS over the tools that we use WILL eventually bite you in the ASS because you do not fucking KNOW how these are implemented internally.
Why do you think compiler designers, kernel designers and embedded developers make the BANK they make? Because they DO know memory efficient ways of deploying a product with minimal overhead, using proper data structures and algorithmic thinking. NOT EVERYTHING IS SHITTY WEB DEVELOPMENT
SO, if a mfker gets shit for being a so-called SENIOR who doesn't know the first mamase mamasa bloody simple-as-shit algorithm THROWN at you in the first 10 pages of an algo and DS book, then y'all should be offended at the mfker calling himself a SENIOR, because these SENIORS are the same mfkers that at one point in time try to teach other people.
These SENIORS are the same mfkers that left me a FUCKING HORRIBLE AND USELESS MESS OF SPAGHETTI CODE
Specially to most PHP developers (my main area) y'all would have been well motherfucking served in learning how not to forLoop the fuck out of tables consisting of over 50k interconnected records, WHAT THE FUCK
"LeaRniNG tHiS iS noT neeDed!!" yes IT fucking IS
Being able to code a binary search (in that example) from scratch lets me know fucking EXACTLY how good your thought process is when facing a hard challenge, and knowing the base motherfucking case of a LinkedList will damn well make you understand WHAT is going on with your abstractions so as to not fucking violate memory constraints. this-shit-is-important.
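And since the binary search example keeps coming up, here is the level of basics I mean, as a minimal sketch in plain C++ (nothing clever, just the halving and the base case):

#include <vector>

// Returns the index of 'target' in a sorted vector, or -1 if it is absent.
int binarySearch(const std::vector<int>& sorted, int target) {
    int lo = 0;
    int hi = static_cast<int>(sorted.size()) - 1;
    while (lo <= hi) {                 // base case: empty range means not found
        int mid = lo + (hi - lo) / 2;  // written this way to avoid overflow
        if (sorted[mid] == target) return mid;
        if (sorted[mid] < target)  lo = mid + 1;
        else                       hi = mid - 1;
    }
    return -1;
}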
So, will your royal majesties at least for the sake of completeness look into a couple of very well made youtube or book tutorials concerning the topic?
You can code an entire website, fine as shit, but you will get tested by my ass in terms of security and best practices. Run these questions now, and it had very motherfucking well better be as efficient as I think it should be (I HIRE, NOT YOU, or your fucking blog posts concerning how much MY degree was not needed; oh and btw, MY degree is what made sure I was able to make SUCH decisions).
This will make a loooooooot of mfkers salty, don't worry, I will still accept you as an interview candidate, but if you think you are good enough without a degree, or better than me (has happened, told that to my face by a candidate) then get fucking ready to receive a question concerning: BASIC FUCKING COMPUTER SCIENCE TOPICS
* gazes away into the night
-
And, the other side, husbands 😂
——————————————————–
Dear Technical Support,
Last year I upgraded from Boyfriend 5.0 to Husband 1.0 and noticed a distinct slowdown in overall system performance — particularly in the flower and jewelry applications, which operated flawlessly under Boyfriend 5.0. The new program also began making unexpected changes to the accounting modules.
In addition, Husband 1.0 uninstalled many other valuable programs, such as Romance 9.5 and Personal Attention 6.5 and then installed undesirable programs such as NFL 5.0, NBA 3.0, and Golf Clubs 4.1.
Conversation 8.0 no longer runs, and Housecleaning 2.6 simply crashes the system. I’ve tried running Nagging 5.3 to fix these problems, but to no avail.
What can I do?
Signed,
Desperate
——————————————————–
Dear Desperate:
First keep in mind, Boyfriend 5.0 is an Entertainment Package, while Husband 1.0 is an Operating System.
Please enter the command: "C:/ I THOUGHT YOU LOVED ME" and try to download Tears 6.2 and don’t forget to install the Guilt 3.0 update.
If that application works as designed, Husband 1.0 should then automatically run the applications Jewelry 2.0 and Flowers 3.5. But remember, overuse of the above application can cause Husband 1.0 to default to Grumpy Silence 2.5, Happy Hour 7.0 or Beer 6.1.
Beer 6.1 is a very bad program that will download the Snoring Loudly Beta.
Whatever you do, DO NOT install Mother-in-law 1.0 (it runs a virus in the background that will eventually seize control of all your system resources).
Also, do not attempt to reinstall the Boyfriend 5.0 program. These are unsupported applications and will crash Husband 1.0.
In summary, Husband 1.0 is a great program, but it does have limited memory and cannot learn new applications quickly.
You might consider buying additional software to improve memory and performance. We recommend Food 3.0 and Hot Lingerie 7.7.
Good Luck,
Tech Support
-
Things I wish I could tell my 18 year old self.
1) Accept you will make mistakes.
2) Truly learn the language you are using.
3) Write idiomatic code for the language you are using.
4) Be upfront about not knowing something.
5) Don't let not knowing something stop you from learning it.
6) None of us knew X until we learned it.
7) Understand your strengths and weaknesses as a developer, play to them.
8) Be willing to try new things.
9) X language isn't ALWAYS the best choice, X paradigm isn't ALWAYS the best choice. Choose wisely.
10) You won't know everything, but you might know more than others.
11) Your ideas and ego don't matter more than ensuring the product works.
12) "Perfection is the enemy of the good [enough]" - Voltaire
13) "Perfection is not achieved when there's nothing more to add, but when there's nothing more to remove." - Einstein.
14) Conflicts happen, deal with it.
15) Develop a toolset and really learn them.
16) Try new tools, they may prove better than what you were using.
17) Don't manage your own memory unless you absolutely have to; you are probably not smarter than the collective intelligence of the team that built the various garbage collection methods (see the sketch after this list).
18) People can be dicks, especially online.
19) If you are new and people are being dicks to you, did you skip past the irc message about etiquette? If you did, you're the dick in this situation.
20) It can be tough, but it is fun, so have fun!
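On point 17, a minimal C++ sketch of what that looks like in practice (C++ has no garbage collector, but the advice translates to letting ownership types do the bookkeeping instead of raw new/delete):

#include <iostream>
#include <memory>

struct Connection {
    Connection()  { std::cout << "opened\n"; }
    ~Connection() { std::cout << "closed\n"; }
};

int main() {
    auto conn = std::make_unique<Connection>(); // the pointer owns the object
    // ... use *conn ...
}   // destructor runs here automatically; no delete to forget
-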
Senior development manager in my org posted a rant in slack about how all our issues with app development are from
“Constantly moving goalposts from version to version of Xcode”
It took me a few minutes to calm myself down and not reply. So I’ll vent here to myself as a form of therapy instead.
Reality Check:
- You frequently discuss the fact that you don’t like following any of Apple’s standards or app development guidelines. Bit rich to say the goalposts are moving when you have your back to them.
- We have a custom everything (navigation stack handler, table view like control etc). There’s nothing in these that can’t be done with the native ones. All that wasted dev time is on you guys.
- Last week a guy held a session about all the memory leaks he found in these custom libraries/controls. Again, your teams don’t know the basic fundamentals of the language or programming in general really. Not sure how that’s apples fault.
- Your “great emphasis on unit testing” has gotten us 21% coverage on iOS and an Android team recently said to us “yeah looks like the tests won’t compile. Well we haven’t touched them in like a year. Just ignore them”. Stability of the app is definitely on you and the team.
- Having half the app in React Native and half in native (split between Objective-C and Swift) is making nobody’s life easier.
- The company forces us to use a custom-built CI/CD solution that regularly runs out of memory, reports false negatives and has no mobile-specific features built in. Did Apple force this on us too?
- Shut the fuck up
-
I have been a mobile developer working with Android for about 6 years now. In that time, I have endured countless annoyances in the Android development space. I will endure them no more.
My complaints are:
1. Ridiculous build times. In what universe is it acceptable for us to wait 30 seconds for a build to complete? Yes, I've done all the optimisations mentioned on this page and then some. Don't even mention hot reload, as it doesn't work fast enough or just does not work at all. Also, buying better hardware should not be a requirement to build a simple Android app; Xcode builds in 2 seconds on an 8GB MacBook Air. A MacBook Air!
2. IDE. Android Studio is a memory hog even if you throw 32GB of RAM at it. The visual editors are janky as hell. If you use Eclipse, you may as well just chop off your fingers right now because you will have no use for them after you try to build an app afresh. I mean, just look at some of the posts in this subreddit where the common response is to invalidate caches and restart. That should only be used as a last resort, but it's thrown about as if it solves everything. Truth be told, it's Gradle's fault. Gradle is so annoying I've dedicated the next point to it.
3. Gradle. I am convinced that Gradle causes 50% of an Android developer's pain. From the build times to the integration into various IDEs to its insane package management system. Why do I need to manually exclude dependencies from other dependencies? The build tool should just handle it for me. C'mon, it's 2019. Gradle is so bad that it requires approx 54GB of RAM to work out that I have removed a dependency from the list of dependencies. Also, I cannot work out what properties I need to put in what block.
4. API. Android API is over-bloated and hellish. How do I schedule a recurring notification? Oh use an AlarmManager. Yes you heard right, an AlarmManager... Not a NotificationManager because that would be too easy. Also has anyone ever tried running a long running task? Or done an asynchronous task? Or dealt with closing/opening a keyboard? Or handling clicks from a RecyclerView? Yes, I know Android Jetpack aims to solve these issues but over the years I have become so jaded by things that have meant to solve other broken things, that there isn't much hope for Jetpack in my mind 😤
5. API 2. A non-insignificant number of Android users are still on Jelly Bean or KitKat! That means we, as developers, have to support some of your shitty API decisions (Fragments, Activities, ListView) from all the way back then!
6. Not reactive enough. Android has gotten support for data binding recently, but this kind of stuff should have been introduced from the very start. Look at React or Flutter to see how easy it is to make shit happen without any effort.
7. Layouts. What the actual hell is going on here. MDPI, XHDPI, XXHDPI, mipmap, drawable. Fuck it, just chuck it all in the drawable folder. Seriously, Android should handle this for me. If I am designing for a larger screen then it should be responsive. I don't want to deal with 50 different layouts spread over 6 different folders.
8. Permission system. Why was this not included from the very start? Rogue apps have abused this and abused your users' privacy and security. Yet you ban us and not them from the Play Store. What's going on? We need answers.
9. In Android, building an app took me 3 months and I had a lot of work left to do but I got so sick of Android dev I dropped it in favour of Flutter. I built the same app in Flutter and it took me around a month and I completed it all.
10. XML.
If you're a new dev, for the love of all that is good in this world, do NOT get into Android development. Start with Flutter or even iOS. On Flutter, build times are insanely fast and the hot reload is under 500ms constantly. It's a breath of fresh air and will save you a lot of headaches AND it builds for iOS flawlessly.
To the people who build Android, advocate it and work on it: sorry to swear, but fuck you! You have created a mess that we have to work with on a day-to-day basis, only for us to get banned from the app store! You have sold us a lie that Android development is amazing, with all the sweet treat names and conferences that look bubbly and fun. You have allowed it to get so bad that we can't target an API higher than 18 because some Android users are still using devices that only support that!
End this misery. End our pain. End our suffering. Throw this abomination away like you do with some of your other projects and migrate your efforts over to Flutter. Please!
#NoToGoogleIO #AndroidSummitBoycott #FlutterDev #ReactNative
-
My team handles infrastructure deployment and automation in the cloud for our company, so we don't exactly develop applications ourselves, but we're responsible for building deployment pipelines, provisioning cloud resources, automating their deployments, etc.
I've ranted about this before, but it fits the weekly rant so I'll do it again.
Someone deployed an autoscaling application into our production AWS account, but they set the maximum instance count to 300. The account limit was less than that. So, of course, their application gets stuck and starts scaling out infinitely. Two hundred new servers spun up in an hour before hitting the limit and then throwing errors all over the place. They send me a ticket and I login to AWS to investigate. Not only have they broken their own application, but they've also made it impossible to deploy anything else into prod. Every other autoscaling group is now unable to scale out at all. We had to submit an emergency limit increase request to AWS, spent thousands of dollars on those stupidly-large instances, and yelled at the dev team responsible. Two weeks later, THEY INCREASED THE MAX COUNT TO 500 AND IT HAPPENED AGAIN!
And the whole thing happened because a database filled up the hard drive, so it would spin up a new server, whose hard drive would be full already and thus spin up a new server, and so on into infinity.
That's probably the only WTF moment that resulted in me actually saying "WTF?!" out loud to the person responsible, but I've had others. One dev team had their code logging to a location they couldn't access, so we got daily requests for two weeks to download and email log files to them. Another dev team refused to believe their server was crashing due to their bad code even after we showed them the logs that demonstrated their application had a massive memory leak. Another team arbitrarily decided that they were going to deploy their code at 4 AM on a Saturday and they wanted a member of my team to be available in case something went wrong. We aren't 24/7 support. We aren't even weekend support. Or any support, technically. Another team told us we had one day to do three weeks' worth of work to deploy their application because they had set a hard deadline and then didn't tell us about it until the day before. We gave them a flat "No" for that request.
I could probably keep going, but you get the gist of it.
-
We had the most fucking retarded client today. No, seriously, if you ever beat their level you have a serious mental issue.
They had a mail problem which needed to be checked on the side of another company, since we don't have those fucking logs.
Their statements:
- they entered an email address in the text field of mail-tester.com and were furious that they didn't get the results sent.
Note: it says right on that page that YOU JUST NEED TO SEND AN EMAIL TO THE ADDRESS WHICH IS PRE-ENTERED IN THAT TEXT FIELD.
- their company has been a reputable 'conservative' company which hasn't done anything wrong since 19xx so the fact that they'd end up on a blacklist was FUCKING OUTRAGEOUS and bullshit.
- our support wasn't willing to help and was only willing to tell them outrageous lies.
- the other IT company was only reachable at a premium number and thus expensive to call.
Emails went back and forth and finally they CC'd the other company. Their reply was fucking priceless:
"we never had a premium number. Feel free to call us on *number* any time during the week between *time* and *time*.
Then he told us that we should just go back to sleep.
It was way worse than that but due to privacy and my own memory this is all I can tell.
Just wow.
-
About two years ago I got roped into something when someone was requesting an $8000 laptop to run a "program" that they wrote in Excel to pull data from our mainframe.
In reality they are using our normal application that interacts with the mainframe and screen scraping it to populate several Excel spreadsheets.
So this guy kept saying that he needed the expensive laptop because he needed the extra RAM and processing power for his application. At the time we only supported 32-bit Windows 7, so even though I told him ten times that the OS wouldn't recognize more than 3.5 GB of RAM, he kept saying that increasing the RAM would fix his problem. I also explained that even if we installed the 64-bit OS, we didn't have approval for the 64-bit applications.
So we looked at the code and we found that rather than reusing the same workbook he was opening a new instance of a workbook during each iteration of his loop and then not closing or disposing of them. So he was running out of memory due to never disposing of anything.
Even better than all of that, he wanted a faster processor to speed up the processing, but he had about 5 seconds of thread sleeps in each loop so that the place he was screen scraping from would have time to load. So it wouldn't matter how fast the processor was; in the end there were sleeps and waits hard-coded in there to slow down the app. And the guy didn't understand that a faster processor wouldn't have made a difference.
The worst thing is a "dev" that thinks they know what they are doing but they don't have a clue.
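The anti-pattern, sketched in generic C++ terms rather than his actual Excel interop code (details invented): a fresh heavy object every iteration that is never released, plus a hard-coded sleep that no processor upgrade can shorten.

#include <chrono>
#include <thread>

struct Workbook { char cells[1024 * 1024]; }; // stand-in for a heavy object

void scrapeAll(int rows) {
    for (int i = 0; i < rows; ++i) {
        Workbook* wb = new Workbook(); // new instance every single iteration...
        wb->cells[0] = 'x';            // ...pretend we copy scraped data in
        // Hard-coded wait for the scraped screen to load:
        std::this_thread::sleep_for(std::chrono::seconds(5));
        // BUG: 'wb' is never deleted, so memory grows until it runs out.
    }
}

More RAM only delays the crash, and a faster CPU changes nothing while the loop spends its time sleeping; reusing (or deleting) the object and waiting on an actual readiness signal fixes both.
-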
I had to open the desktop app to write this because I could never write a rant this long on the app.
This will be a well-informed rebuttal to the "arrays start at 1 in Lua" complaint. If you have ever said or thought that, I guarantee you will learn a lot from this rant and probably enjoy it quite a bit as well.
Just a tiny bit of background information on me: I have a very intimate understanding of Lua and its c API. I have used this language for years and love it dearly.
[START RANT]
"arrays start at 1 in Lua" is factually incorrect because Lua does not have arrays. From their documentation, section 11.1 ("Arrays"), "We implement arrays in Lua simply by indexing tables with integers."
From chapter 2 of the Lua docs, we know there are only 8 types of data in Lua: nil, boolean, number, string, userdata, function, thread, and table
The only unfamiliar thing here might be userdata. "A userdatum offers a raw memory area with no predefined operations in Lua" (section 26.1). Essentially, it's for the API to interact with Lua scripts. The point is, this isn't a fancy term for array.
The misinformation comes from the table type. Let's first explore, at a low level, what an array is. An array, in programming, is a collection of data items all in a line in memory (The OS may not actually put them in a line, but they act as if they are). In most syntaxes, you access an array element similar to:
array[index]
Let's look at C, so we have some solid reference. "array" would be the name of the array, but what it really does is keep track of the starting location in memory of the array. Memory in computers acts like a number. In a very basic sense, the first sector of your RAM is memory location (referred to as an address) 0. "array" would be, for example, address 543745. This is where your data starts. Arrays can only be made up of one type; this is so that each element in that array is EXACTLY the same size. So, this is how indexing an array works. If you know where your array starts, and you know how large each element is, you can find the element at index 6 by starting at the start of the array and adding 6 times the size of the data in that array.
Tables are incredibly different. The elements of a table are NOT in a line in memory; they're all over the place depending on when you created them (and a lot of other things). Therefore, an array-style index is useless, because you cannot apply the above formula. In the case of a table, you need to perform a lookup: search through all of the elements in the table to find the right one. In Lua, you can do:
a = {1, 5, 9};
a["hello_world"] = "whatever";
a is a table holding 4 items (the 4th being the key "hello_world" with value "whatever"), but a[4] is nil because even though there are 4 items in the table, it looks for something "named" 4, not the 4th element of the table.
This is the difference between indexing and lookups. But you may say,
"Algo! If I do this:
a = {"first", "second", "third"};
print(a[1]);
...then "first" appears in my console!"
Yes, that's correct, in terms of computer science. Lua, because it is a nice language, makes keys in tables optional by automatically giving them an integer value key. This starts at 1. Why? Let's look at that formula for arrays again:
Given array "arr", size of data type "sz", and index "i", find the desired element ("el"):
el = arr + (sz * i)
This NEEDS to start at 0 and not 1 because otherwise, "sz" would always be added to the start address of the array and the first element would ALWAYS be skipped. But in tables, this is not the case, because tables do not have a defined data type size, and this formula is never used. This is why actual arrays are incredibly performant no matter the size, and the larger a table gets, the slower it is.
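To make the difference concrete, a small illustrative C++ sketch of both access patterns:

#include <cstdio>
#include <string>
#include <unordered_map>

int main() {
    // Array: elements sit in a line, so access is pure address arithmetic,
    // el = arr + (sz * i), which only works with 0-based indices.
    int arr[3] = {10, 20, 30};
    int* el = arr + 1;            // compiler computes: arr + sizeof(int) * 1
    std::printf("%d\n", *el);     // 20, found with no searching at all

    // Table-style structure: keys can be anything and elements live wherever,
    // so every access is a lookup, not an offset calculation.
    std::unordered_map<std::string, int> tbl;
    tbl["hello_world"] = 42;
    std::printf("%d\n", tbl.at("hello_world"));
}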
That felt good to get off my chest. Yes, Lua could start the auto-key at 0, but that might confuse people into thinking tables are arrays... well, I guess there's no avoiding that either way.
-
So, some time ago, I was working for a complete puckered anus of a cosmetics company on their ecommerce product. Won't name names, but they're shitty and known for MLM. If you're clever, go you ;)
Anyways, over the course of years they brought in a competent firm to implement their service layer. I'd even worked with them in the past and it was designed to handle a frankly ridiculous-scale load. After they got the 1.0 released, the manager was replaced with some absolutely talentless, chauvinist cuntrag from a phone company that is well known for having 99% indian devs and for not being able to hear you now. He of course brought in his number two, worked on making life miserable and running everyone on the team off; inside of a year the entire team was ex-said-phone-company.
Watching the decay of this product was a sheer joy. They cratered the database numerous times during peak-load periods, caused $20M in redis-cluster cost overrun, ended up submitting hundreds of erroneous and duplicate orders, and mailed almost $40K worth of product to a random guy in Outer Mongolia who is, we can only hope, now enjoying his new life as an Instagram influencer. They even terminally broke the automatic metadata, and hired THIRTY PEOPLE to sit there and do nothing but edit swagger. And it was still both wrong and unusable.
Over the course of two years, I ended up rewriting large portions of their infra surrounding the centralized service cancer to do things like, "implement security," as well as cut memory usage and runtimes down by quite literally 100x in the worst cases.
It was during this time I discovered a rather critical flaw. This is the story of what, how and how can you fucking even be that stupid. The issue relates to users and their reports and their ability to order.
I first found this issue looking at some erroneous data for a low value order and went, "There's no fucking way, they're fucking stupid, but this is borderline criminal." It was easy to miss, but someone in a top down reporting chain had submitted an order for someone else in a different org. Shouldn't be possible, but here was that order staring me in the face.
So I set to work seeing if we'd pwned ourselves as an org. I spend a few hours poring over logs from the log service and dynatrace trying to recreate what happened. I first tested to see if I could get a user, not something that was usually done because auth identity was pervasive. I discover the users are INCREMENTAL int values they used for ids in the database when requesting from the API, so naturally I have a full list of users and their title and relative position, as well as reports and descendants in about 10 minutes.
I try the happy path of setting values for random, known payment methods and org structures similar to the impossible order, and submitting as a normal user, no dice. Several more tries and I'm confident this isn't the vector.
Exhausting that option, I look at the protocol for a type of order in the system that allowed higher level people to impersonate people below them and use their own payment info for descendant report orders. I see that all of the data for this transaction is stored in a cookie. A few tests later, I discover the UI has no forgery checks, hashing, etc., and just fucking trusts whatever is present in that cookie.
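For anyone wondering what a "forgery check" would even be here: the standard move is to sign the cookie payload server-side and reject anything whose tag doesn't match. A rough sketch using OpenSSL's HMAC, with key handling simplified for illustration:

#include <openssl/evp.h>
#include <openssl/hmac.h>
#include <string>

// Returns a raw HMAC-SHA256 tag for 'payload'. Store the tag alongside the
// cookie value, then recompute and compare it on every request; if a user
// edits the cookie, the tag no longer matches and the request gets rejected.
std::string signCookie(const std::string& payload, const std::string& key) {
    unsigned char tag[EVP_MAX_MD_SIZE];
    unsigned int tag_len = 0;
    HMAC(EVP_sha256(),
         key.data(), static_cast<int>(key.size()),
         reinterpret_cast<const unsigned char*>(payload.data()), payload.size(),
         tag, &tag_len);
    return std::string(reinterpret_cast<char*>(tag), tag_len);
}

None of that existed; the cookie was taken at face value.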
An hour of tweaking later, I'm impersonating a director as a bottom rung employee. Score. So I fill a cart with a bunch of test items and proceed to checkout. There, in all its glory are the director's payment options. I select one and am presented with:
"please reenter card number to validate."
Bupkiss. Dead end.
OR SO YOU WOULD THINK.
One unimportant detail I noticed during my log investigations (and that the shit slinging GUI monkeys who butchered the system didn't) was that, on a failed attempt to submit a payment to the DB, the logs were filled with messages like:
"Failed to submit order for [userid] with credit card id [id], number [FULL CREDIT CARD NUMBER]"
One submit click later and the user's credit card number drops into lnav like a gacha prize. I dutifully rerun the checkout and get an email send notification in the logs for successful transfer to fulfillment. Order placed. Some continued experimentation later and the truth is evident:
With an authenticated user of any privilege, you could place any order, as anyone, using anyone's payment methods, and have it sent anywhere.
So naturally, I pack the crucifixion-worthy body of evidence up and walk it into the IT director's office. I show him the defect, and he turns sheet fucking white. He knows there's no recovering from it, and there's no way his shitstick service team can handle fixing it. Somewhere in his tiny little grinchly manager's heart he knew they'd caused it, and that he was to blame for being a shit captain to the SS Failboat. He replies quietly, "You will never speak of this to anyone; fix this discreetly." Straight up Hitler's bunker meme rage.
-
TLDR;
Wrote a slick scheduling and communication system allowing me to assign photography resources based on time and location.
I'll tell you a little secret ... I'm not actually a dev. I'm a photographer, pretending to be a dev.
Or ... perhaps it's the other way around? (I spend most of my time writing code these days, but only for me - I write the software I use to run my business).
I own a photography studio - we specialize in youth volleyball photography (mostly 12-18 year old girls with a bit of high school, college and semi-pro thrown in for good measure - it's a hugely popular sport) and travel all over the US (and sometimes Europe) photographing.
As a point of scale, this year we photographed a tournament in Denver that featured 100 volleyball courts (in one room!), playing at the same time.
I'm based in California and fly a crew of part-time staff around to these events, but my father and I drive our booth equipment wherever it needs to go. We usually set up a 30'x90' booth with local servers, download/processing/cashier computers and 45 laptops for viewing/ordering photographs. Not to mention 16' drape and banners, tons of samples, 55" TVs, etc. It's quite the production.
We photograph by paid signup only - when there are upwards of 800 teams/9,600 athletes per weekend playing, and you only have four trained photographers, you've got to manage your resources!
This of course means you have to have a system for taking those signups, assigning teams to photographers and doing so in the most efficient manner possible based on who is available when the team is playing. (You can waste an awful lot of time walking from one court to another in a large convention center - especially if you have to navigate through large crowds - not to mention exhausting yourself.)
So this year I finally added a feature I've wanted for quite some time - an interactive court map. I can take an image of the court layout from the tournament and create an HTML version in our software. As I mouse over requests in one window, the corresponding court is highlighted on the map in another browser window. Each photographer has a color associated with them. When I assign requests to a photographer, the court is color coded with the color of the photographer. This allows me to group assignments to minimize photographer walk time and keep them in a specific area. It's also very easy to look at the map and see unassigned requests and look to see what photographer is nearby.
This year I also integrated with Twilio and set up a simple set of text shortcuts that photographers can use to let our booth staff know where they are, if they have memory cards that need picking up, if they need water/coffee/snacks, etc. They can also move assignments on their schedule or send an SOS for help if it looks like they aren't going to be able to photograph a team.
Kind of a CLI via the phone. :)
The additions have turned out to be really useful and have made scheduling and managing the photographers much easier than it was in the past.
-
The application has had a suspected memory leak for years. The tech team got the developers THE EXACT CODE that caused it. A few months of testing go by, with them telling us they're resolving their memory leak problem (finally).
Today: yeah, we still need restarts because we don't know if this new deployment will fix our memory leak, we don't know what the problem is.
WHAT THE FUCK WERE YOU DOING IN THE LOWER REGIONS FOR THREE FUCKING MONTHS?!?!?! HAVING A FUCKING ORGY???????????????
My friends took the time to find your damn problem for you AND YOU'RE GOING TO TELL ME YOU DON'T KNOW WHAT THE PROBLEM IS???
It was in lower regions for 3 MONTHS and you don't know how it's impacting memory usage?!?!?! DO YOU WANT TO STILL HAVE A JOB? BECAUSE IF NOT, I CAN TAKE CARE OF THAT FOR YOU. YOU DON'T DESERVE YOUR FUCKING JOB IF YOU CAN'T FUCKING FIX THIS.
Every time your app crashes, even though I don't need to get your highest level boss on anymore for approval to restart your server, I'M GOING TO FUCKING CALL HIM AND MAKE HIM SEE THAT YOU'RE A FUCKING IDIOT. Eventually, he'll get so annoyed with me, your shit will be fixed. AND I WON'T HAVE TO DEAL WITH YOUR USELESS ASS ANYMORE.
(Rant directed at project manager more than dev. Don't know which is to blame, so blaming PM)
-
A Developer is desperate: his Java application servers are unresponsive, thousands of dead zombie threads are sucking all the CPUs, memory is leaking everywhere, the garbage collector has gone crazy, the cluster sessions are fucked....
The Developer goes to the closest bridge, ties a stone to his neck and gets ready to jump.
Suddenly a bearded old man with a fiery look runs toward him, yelling:
- stop stop!!!! Your application is not scaling and is misconfigured, your servers are melting, CPU usage is not sustainable anymore, but don't despair
The Developer, puzzled, looks at him:
-I've never seen you...how do you know...
- Hey, man, I'm the Devil. I know everything. All your problems are solved. I'll give you magic functions. They are called Lambda.
You'll never have to worry about your servers, scalability, security, configuration and shit.
The Developer seems astonished but relieved:
- Ok, sounds great! let's try it - suddenly suspicion creeps in - hmmmm but you are the Devil....so...you want something back, don't you?
(the Devil nods lightly with a diabolic smile)
- ...and...you want my soul, I guess...
- your soul??? come on!!! - the Devil burst in a laugh - we are in 2019. I don't care about your soul. I want your ass.
- What!???!!!?
- yes, I want to fuck your ass
The Developer quickly evaluates the situation.
A few moments of pain or slight discomfort (?) in exchange for magic lambda. It could be worth it. He accepts.
After a while of rough anal fucking, the devil asks
- Hey, how old are you anyway?
- 45, why?
- Oh jeeez...45!!!??? and you still believe in the devil?
-
Storytime
A story about an Android TV box which decided to become an iPad
Several years ago we bought an Android TV box.
It served me and my family well for several years.
Specs are not that important in this story, but here they are:
Android 4.4
1GB RAM
Amlogic quad-core 1.4GHz
8GB storage
This device served us well - online TV, browsing, music, file sharing and so on. But recently the cheap Chinese memory decided to take a break and damaged the ROM. Because of that, the device won't boot. The only option was to take it apart, "short circuit" certain legs on the memory chip to make it boot from an SD card, and install new firmware. After such operations the TV box worked well again.
However, the memory glitched again and again, and this algorithm was repeated for several months.
But that is not what this story is about.
One day the memory went completely crazy and there was no way to install new firmware on it. It just hung on install. (BTW, it was the official firmware for this device.)
But after countless attempts it finally worked! It installed the firmware, booted into the launcher and connected to WiFi!
But now comes the most interesting part.
It was not Android anymore.
It had decided to become an iPad.
My dad logged in to his Google account via the TV box and got an email that somebody had connected from our IP via an iPad (we don't have an iPad), using the Safari browser! The stock browser is not Safari.....
"Ok, nvm, crazy glitch." - we thought.
But the preinstalled Play Market won't launch, because it told us we're trying to connect from an iPad.
And the Google Chrome page suggested downloading Chrome for iPad.
And everything was acting like it was an iPad.
OK, downloaded iTunes, why not??? ._.
Tried to install Elixir for Android via an APK from a flash drive, but then the memory glitched one more time, everything went black, and the TV box had a damaged ROM again...
After that we decided to not torment it anymore...
That's it. Poor Android TV box that dreamed all its life of becoming an iPad. Rest in peace.
-
I can't name one specific time that was the best memory per se. My team is super close - we get lunch almost every day, have (inappropriate) inside jokes and just act dumb together. Not one person on my team doesn't fit in. Even our boss is cool. :) #LoveMyTeam
-
Yeah Mozilla fuck merit and fuck you too!
This, this is what I was talking about when the fucking CoC came out and everyone (including its author) started using it as a political weapon.
You castrated fucking virgins! Mozilla, I want to support you, I really don't like Chrome, but you always manage to disappoint everyone. I'm tired, tired of you morally superior socialists infecting my fucking workplace, entertainment and news.
This is just an excuse for lazy assholes to have their cake and eat it too, and it's damn fucking INSULTING to us "minorities". I can work to get nice things just like anyone else, bitch! Having another skin color is not a disability!
Worst of all, you seem to have straight-out millennial retards making these decisions, seeing as it's based on an article from a washed-up "gender research" professor who thinks Barbie Doctor is problematic: the most biased and dumb source you can possibly pull out of your ass.
Two classmates were murdered this morning; do you really think we care about what your diversity and inclusion dept thinks is problematic? You delusional halfwits. The only comforting thought is that your soft bigotry will perish alongside your product when it inevitably diminishes its quality for the sake of "equality".
Want to make better products? Ditch your useless diversity and inclusion department and start optimizing the memory consumption of Firefox.
Want to help minorities? Start paying your outsourced developers decently.
I hope this helps open the eyes of people who thought including politics in software development wouldn't have dire consequences; if not, oh well, I guess people will get it when Mozilla keeps going down the drain and they get fired because they outsourced their work in the name of "diversity" just to save money.
https://blog.mozilla.org/inclusion/...
-
Would the web be better off if there were zero frontend scripting? There would be HTML5 video/audio, but zero client-side JS.
Browsers wouldn't understand script tags, they wouldn't have JavaScript engines, and they wouldn't have to worry about new standards and deprecations.
Browsers would be MUCH more secure, and use way less memory and CPU resources.
What would we really be missing?
If you built less bloated pages, you would not really need AJAX calls; page reloads would be cheap. Animated menus do not add anything functionally, and could be done using CSS as well. Complicated web apps... well, maybe those should just be desktop/mobile apps.
Pages would contain fewer annoying elements, no tracking or crypto-mining scripts, no mouse tracking, no exploitative spam alerts.
Why don't we just deprecate JS in the browser, completely?
I think it would be worth it.
-
Here are the reasons why I don't like IPv6.
Now I'll be honest, I hate IPv6 with all my heart. So I'm not supporting it until inevitably it becomes the de facto standard of the internet. In home networks on the other hand.. huehue...
The main reason why I hate it is that it looks in every way overengineered. Or rather, poorly engineered. IPv4 is 32 bits wide, which translates to about 4 billion addresses. IPv6 on the other hand is 128 bits wide.. which translates to.. some obscenely huge number that I don't even want to start translating.
That's the problem. It's too big. Anyone who's worked on the internet for any amount of time knows that the number of machines on this planet's internet will likely not exceed 1 or 2 extra bits' worth (8.5B and 17.1B respectively). Now of course 33 or 34 bits in total is unwieldy; it doesn't go well with electronics. From 32 you essentially have to go up to 64 straight away. That's why 64-bit processors are.. well, 64 bits. Memory grew larger than the 4GB a 32-bit processor could address, so that's what happened.
The internet could've grown that way too. Heck, it probably could've become 64 bits in total, of which 34 are assigned to the internet and the remaining bits are for whatever purposes large IP consumers would like.
Whoever designed IPv6 however.. nope! Let's give everyone a /64 range, and give them quite literally an IP pool far, FAR larger than the entire current internet. What's the fucking point!?
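(To put numbers on just how overshot that is - a quick sketch, plain BigInteger arithmetic in Java, nothing IPv6-specific about it:)

import java.math.BigInteger;

public class AddressSpace {
    public static void main(String[] args) {
        BigInteger two = BigInteger.valueOf(2);
        System.out.println("IPv4 total:      " + two.pow(32));  // ~4.3 billion
        System.out.println("IPv6 /64 blocks: " + two.pow(64));  // one of these per customer
        System.out.println("IPv6 total:      " + two.pow(128)); // ~3.4 * 10^38
        // every single /64 handed to a home user contains 2^64 addresses,
        // i.e. more than four billion copies of the entire IPv4 address space
    }
}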
The IPv6 standard is far larger than it should've been. It should've been 64 bits instead of 128, and it should've been separated differently. What were they thinking? A bazillion colonized planets' internetworks that would join the main internet as well? Yeah, that's clearly something the internet will develop into. The internet, which is effectively just a big network that everyone leases and controls a little bit of. Just like a home network but scaled up. Imagine, or even just look at, the engineering challenges that interplanetary communications present. That is not going to be feasible for connecting multiple planets' internets. You can engineer however you want, but you can't engineer around the hard limit of light speed. Besides, are our satellites internet-connected? Well yes, but try using one. And those whizz only a few hundred km above sea level. The latency involved makes it barely usable. Imagine communicating with the ISS, the moon or Mars. That is not going to happen at internet scale. Not even close. And those are only the closest celestial objects out there.
So why was IPv6 engineered with hundreds of years of development and likely at least a stage 4 civilization in mind? No idea. Future-proofing or poor engineering? I honestly don't know. But as a stage 0 or maybe stage 1 person, I don't think that I or civilization for that matter is ready for a 128-bit internet. And we aren't even close to needing so many bits.
Going back to 64-bit processors and memory: we passed 32-bit address width about a decade ago. But even now, we're only at about twice that size on average. We're not even close to saturating 64-bit address width, and that will likely take at least a few hundred years as well. I'd say that's more than sufficient. The internet should've really become a 64-bit internet too.
-
Him: Relational databases are stupid; SQL injections, complex relationships, redundant syntax and so much more!
Me: so what should we use instead? Mongo, Redis, some other fancy new DB?
Him: no, I have this class in Java, it loads all the data into memory and handles transfers with HTTP.
Me: ...... Bye!
-
Long long ago there was a man who discovered that if he scratched certain patterns onto a rock, he could use them to remind him of things he would otherwise forget.
Over time the scratchings were refined, and this great secret of eternal memory was taught to his children, and they taught it to their children.
Soon mankind had discovered a way to preserve his thoughts and memories through the ages, and further discovered that by recording these symbols in a portable medium he could transfer information over distances.
Writing exploded; it allowed a genius in one place to communicate the information he had recorded across time and space.
Thousands of years passed, writing continued to be refined and became more and more vital. Eventually a humble man by the name of Johannes Gutenberg, seeking to make the divine word of God accessible to the people, created the printing press, allowing the written word to be copied and circulated with great ease, vastly expanding the works available to mankind and the number of people who could understand this arcane art of writing.
But mankind, never satiated in his desire to know all there is to know, demanded more information, demanded it faster, demanded it better. So the greatest minds of 200 years - Marconi, Maxwell, Bohr, von Neumann, Turing and a host of others - working with each other, standing on the shoulders of their brobdingnagian predecessors, brought forth a way to send these signals, to transfer this writing upon beams of light, by manipulating the very fabric of the cosmos. Mankind had reached the ultimate limit of the transmission of information. Man had conquered time and space themselves in preserving and transmitting information; we are as the gods!
My point is this: your insistence upon holding a meeting with 10 people to ask a question that could've been answered with a 2-sentence email is not only an affront to me for wasting my time, but also an affront to the greatest minds of the 19th and 20th centuries. It is an insult to your ancestors who first sacrificed and labored to master the art of writing; it is in fact offensive to all of humanity up to this point.
In short, by requiring a meeting to be held, not only are you ensuring the information is delayed (because we all now need to find a time when all of us are available), not only are you eliminating the ability to have a first-hand permanent record of what needed to be communicated - you are actively working against progress, you are dragging humanity collectively backwards. You join the esteemed ranks of organizations such as the oppressive Catholic church that sought to silence Galileo and Copernicus; you are among the august crowd that burned witches at Salem, the Soviet secret police that silenced "bourgeois" science. You join the side of thousands of years of daft ignorance.
If it were not for you people we would have flying cars, we would have nanobots capable of building things on a whim, we would all be programming in Lisp. But because of you and people like you, we are trapped in this world, where the greatest minds are trapped in meetings that never end, where mistruth and ignorance run rampant, a world where JavaScript is the de facto language of choice everywhere because it runs everywhere, and ruins everywhere.
So please remember, next time you want to have a meeting, ask yourself first: "Could this be an email?" "Do I enjoy burning witches?" If you do this, you might make the world a little bit less of a terrible place to be.
-
import LongRantKit
import NonRantKit
import TldrKit
I don't like stickers on my laptop because it clutters it up. But today I realized the importance of them.
A few months ago I was sitting at a coffee shop working on a paper and I noticed a guy with this cool sticker on a MacBook Pro: it had the integral symbol to the left of the Apple logo, and to the right of it a lowercase d and another Apple logo. It took me a few hours to realize what it meant, but I finally did and at that point I also guessed that not many people know what it is.
So, antisocial as I am, I finish up my work and before I leave I walk up to him and say hi. At this point I'm a senior in high school, and I learn he's a junior at the same college I plan to attend. We talked a little before I had to leave and got to know each other somewhat.
After I leave I find him on Instagram and Facebook and friend him and such.
Recently I posted a picture saying I had recently joined the Apple Developer Team, and also recently reposted a memory on Facebook from 5 years ago that was a screen capture of an iPhone 4 simulator running iOS 5 showing off one of my first apps.
Then yesterday I get a message from the guy I met at the coffee shop asking for some help with an iOS project he's working on. We decide to meet today and I spend the entire morning showing him the basics of Swift, Xcode, Interface Builder, etc. I feel like I really helped him jumpstart his app and helped him understand the basics of different concepts.
If he hadn't had that integral sticker on his laptop, I would never have had this opportunity to finally share some iOS development experience.
For this I would like to thank my high school calculus teacher, with whom I spent many classes at Starbucks because I was the only student. I'd like to thank laptop stickers, and finally I would like to thank the coffee shop.
TL;DR: Said hi to a guy with an integral sticker on his laptop; a few months later he approaches me for help understanding iOS development.
-
Not just another Windows rant:
*Disclaimer*: I'm a full-time Linux user for dev work, having switched from Windows a couple of years ago. I only open Windows for Photoshop (or games), or when I fuck up my Linux install (Arch user) because I get too adventurous (don't we all).
I have hated Windows 10 from day 1 for being a rebel. Automatic updates and generally so many bugs (especially the 100% disk usage on boot, for idk how long) really sucked.
It's got ads now and it's generally much slower than probably a Windows 8 install..
The pathetic memory management and the overall slower interface really tick me off. I'm trying to work and get access to web services, and all I get is hangups.
Chrome is my go-to browser for everything and the experience is subpar. We all know it gobbles up RAM, but even more so on Windows.
My Linux install on the same computer flies with a heavy project open in Android Studio, 25+ tabs in Chrome and a 1080p video playing in the background.
Up until the Creators Update, UI bugs were a common sight. Things would just stop working if you clicked them multiple times.
But you know what I'm tired of more?
The ignorant pricks who bash it for being Windows. This OS isn't bad. Sure, it's not Linux or macOS, but it stands strong.
You are just bashing it because it's not developer-friendly. And it's not; it never advertises itself like that.
It's a full-fledged OS for everyone. It's not dev-friendly out of the box, but you can make it as dev-friendly as possible; you're just lazy.
People do use Windows to code. If you don't know that, you're ignorant. They also make a living by using Windows all day. How bout tha?
But it tries to make you feel comfortable, with the recent Bash integration and the plethora of tools that Microsoft builds.
IIS may not be Apache or Nginx but it gets the job done.
Azure uses Windows and it's one of the best web services out there. It's freaking amazing, with dead-simple docs to get up and running with a web app in 10 minutes.
I saw many rants against VS, but you know it's one of the best IDEs out there, and it runs best on Windows (for me, at least).
I'm pissed at you - you blind hater, you.
Research and appreciate the good qualities in something instead of trying to be the cool but ignorant dev who codes on Linux/Mac but doesn't know shit about the advantages they offer.
-
Story, !rant.
This memory came up as I was commenting on another rant, and thought it was worthy of a better retelling.
So about a year or two ago, I had just gotten a Software Defined Radio, and was tinkering with it and looking around for cool stuff I could do with it. After stalking planes for a while (caught a 747 over my area 😎) I saw this program that decoded satellite images of earth, coming from the NOAA satellites. I thought this was amazing.
So I waited until one was over my area and let the software do its magic. The image was not great, since I had this set up on the first floor and there was a lot of material between me and the satellite.
So I came to the brilliant conclusion that I'd leave the program in automatic mode (it will start sampling when the satellite is near) out on my terrace, which should yield better results, right?
Perhaps. Who knows. Anyways, a couple hours pass and we are running late to a family dinner. So we book it. Family dinner was great, good food and all, and I was having fun, so I never thought about my poor laptop, sitting alone in the night.
But then, when I was walking home in the rain... It hit me. I started running. I couldn't believe what I had done. Fast forward five minutes, and I'm out of breath, but home. I run upstairs, and see the laptop just sitting there, lid open, no lights on, and of course soaked right through.
I couldn't believe it. My only piece of tech at the time, and my only avenue for programming, gone. And I was 15, so I wasn't getting another one any time soon. Took it inside and drained the water out of it, and just left it there lying on its side.
Next day it worked just fine 🤣 the battery on my laptop only lasted an hour max, so by sheer luck it had lost power before the rain came. That is the one time I have to thank that battery for being such utter trash.
-
What an absolute fucking disaster of a day. Strap in, folks; it's time for a bumpy ride!
I got a whole hour of work done today. The first hour of my morning because I went to work a bit early. Then people started complaining about Jenkins jobs failing on that one Jenkins server our team has been wanting to decom for two years but management won't let us force people to move to new servers. It's a single server with over four thousand projects, some of which run massive data processing jobs that last DAYS. The server was originally set up by people who have since quit, of course, and left it behind for my team to adopt with zero documentation.
Anyway, the 500GB disk is 100% full. The memory (all 64GB of it) is fully consumed by stuck jobs. We can't track down large old files to delete because du chokes on the workspace folder with its thousands of subfolders, with no RAM to spare. We decide to basically take a hacksaw to it, deleting the workspace for every job not currently in progress. This of course fucked up some really poorly-designed pipelines that relied on workspaces persisting between jobs, so we had to deal with complaints about that as well.
So we get the Jenkins server up and running again just in time for AWS to have a major incident affecting EC2 instance provisioning in our primary region. People keep bugging me to fix it, I keep telling them that it's Amazon's problem to solve, they wait a few minutes and ask me to fix it again. Emails flying back and forth until that was done.
Lunch time already. But the fun isn't over yet!
I get back to my desk to find out that new hires, or people who got new Mac laptops recently, can't even install our toolchain, because management has started handing out M1 Macs without telling us and all our tools are compiled solely for x86_64. That took some troubleshooting to even figure out what the problem was, because the only error people got from Homebrew was that the formula was empty, when it clearly wasn't.
After figuring out that problem (but not fully solving it yet), one team starts complaining to us about a GitHub problem, because we manage the GitHub org. Except it's not a GitHub problem, and I already knew this because they are a Problem Team that uses some technical authoring software with Git integration but has only the barest understanding of what Git actually does. Turns out it's a Git problem. An update for Git was pushed out recently that patches a big bad vulnerability, and the way it was patched causes problems because they're using Git wrong (multiple users accessing the same local repo on a Samba share). It's a huge vulnerability, so my entire conversation with them went sort of like:
"Please don't."
"We have to."
"Fine, here's a workaround, this will allow arbitrary code execution by anyone with physical or virtual access to this computer that you have sitting in an unlocked office somewhere."
"How do I run a Git command I don't use Git."
So that dealt with, I start taking a look at our toolchain, trying to figure out if I can easily just cross-compile it to arm64 for the M1 macbooks or if it will be a more involved fix. And I find all kinds of horrendous shit left behind by the people who wrote the tools that, naturally, they left for us to adopt when they quit over a year ago. I'm talking entire functions in a tool used by hundreds of people that were put in as a joke, poorly documented functions I am still trying to puzzle out, and exactly zero comments in the code and abbreviated function names like "gars", "snh", and "jgajawwawstai".
While I'm looking into that, the person from our team who is responsible for incident communication finally gets the AWS EC2 provisioning issue reported to IT Operations, who sent out an alert to affected users that should have gone out hours earlier.
Meanwhile, according to the health dashboard in AWS, the issue had already been resolved three hours before the communication went out and the ticket remains open at this moment, as far as I know.
-
The more depressed you get over the current state of software, the more you know you've made it. When you start forming your own opinions and saying "wow, these people are full of shit".
Prime example: the overblown web development bullshit. Fuck me dude, you really don't need that full-featured React, Vue or Angular framework to make sense of shit. You are going over the top for fucking AJAX functionality and state management that you could do by yourself without needing to learn a full framework; by the time you finish learning React you probably would have been better served with standard vanilla-af JS and server-side rendering.
Our world is full of fads and many talented people that perpetuate them. It's fine, it is the nature of the beast. But a lot... A LOT of software is very POORLY written. And adding levels of abstraction over a very broken paradigm (the web in this case) does not and will not make it better.
Basically I am fucking hating being a web developer and want to go back to a time in which we cared about how much memory our applications consumed, and weren't worrying about the fucking frontend having the ability to implement machine learning.
I want to run sublime.exe and be sure that it is a native application on my system, not a contained web browser implementing my fucking text editor. With 20MB of RAM at most instead of 500MB. WTF.
I knew I made it when I could read comments on Hacker News and Reddit and say "this idiot is full of shit". I knew I made it when I would sigh heavily at the idea of having another project rather than having a fangirl attitude towards it.
I knew I made it when people writing about software development meant shit to me, rather than wondering in awe what the fuck they were talking about.
I knew I made it when getting laid was more important to me than fucking around with code.
pussy > code
Fuck you.
-
I'll use this topic to segue into a related (lonely) story befitting my mood these past weeks.
This entire story is going to sound egotistical, especially this next part, but it's really not. (At least I don't think so?)
As I'm almost entirely self-taught, having another dev to give me good advice would have been nice. I've only known / worked with a few people who were better devs than I am, and rarely ever received good advice from them.
One of those better devs was my first computer science teacher. Looking back, he was pretty average, but he held us to high standards and gave good advice. The two that really stuck with me were: 1) "save every time you've done something you don't want to redo," and 2) "printf is your best debugging friend; add it everywhere there's something you want to watch." Probably the best and most helpful advice I've ever received 😊
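(For anyone who never got that second piece of advice: it just means tracing state by printing it as the program runs. A minimal sketch of the idea - in Java purely for illustration, since the original advice was about C's printf:)

public class PrintfDebugging {
    public static void main(String[] args) {
        int total = 0;
        for (int i = 1; i <= 5; i++) {
            total += i * i;
            // the advice in action: print everything you want to watch
            System.out.println("i=" + i + ", total=" + total);
        }
        System.out.println("result=" + total); // result=55
    }
}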
I've seen other people here posting advice like "never hardcode" or "modularity keeps your code clean" -- I had to discover these pretty simple concepts entirely on my own. School (and later college) were filled with terrible teachers and worse students, and so were almost entirely useless for learning anything new.
The only decent dev I knew had brilliant ideas (genetic algorithms, sandboxing, ...) before they were widely used, but could rarely implement them well because he was generally an idiot. (Idiot savant, I think? Definitely the idiot part.) I couldn't stand him. Completely bypassing a ridiculously long story, I helped him on a project to build his own OS from scratch; we made progress that's impressive even to this day. Custom bootloader, hardware interfacing, memory management, (semi) sandboxed processes, GUI, example programs ...; we were in high school. I'm still surprised and impressed with what we accomplished.
But besides him, almost every other dev I met was mediocre. Even outside of school, I went so many years without having another competent dev to work with. I went through various jobs helping other dev(s) on their projects (or rewriting them), learning new languages/frameworks almost every time: PHP, Pascal, Perl, Zend, JS, VB, Rails, Node, .... I learned new concepts occasionally (which was wonderful) but overall it was just tedious, and it never paid well because I was too young to be taken seriously (and female, which further exacerbated it). On the bright side, it didn't dwindle my love for coding, and I usually spent my evenings playing with projects of my own.
The second dev (and one of the best I've ever met) went by Novo. His approach to a game engine reminded me of General Relativity: everything was modular, had a rich inheritance tree, and could receive user input at any point along said tree. A user could attach their view/control to any object. (Computer control methods could be attached in this way as well.) UI would obviously change depending on how the user could interact and the number of objects; admins could view/monitor any of these. Almost every object / class of object could talk to almost everything else. It was beautiful. I learned so much from his designs. (Honestly, I don't remember the code at all, and that saddens me.) There were other things, too, but that one amazed me the most.
I haven't met anyone like him ever again.
Anyway, I don't know if I can really answer this week's question. I definitely received some good advice while initially learning, but past that it's all been through discovering things on my own.
It's been lonely. ☹
-
I could bitch about XSLT again, as that was certainly painful, but that’s less about learning a skill and more about understanding someone else’s mental diarrhea, so let me pick something else.
My most painful learning experience was probably pointers, but not pointers in the usual sense of `char *ptr` in C and how they're totally confusing at first. I mean, it was that too, but in addition it was how I had absolutely none of the background needed to understand them, not having any learning material (nor guidance), nor even a typical compiler to tell me what I was doing wrong, and on top of all of that, only being able to run code on a device that would crash/halt/freak out whenever I made a mistake. It was an absolute nightmare.
Here’s the story:
Someone gave me the game RACE for my TI-83 calculator, but it turned out to be an unlocked version, which means I could edit it and see the code. I discovered this later on by accident while trying to play it during class, and when I looked at it, all I saw was incomprehensible garbage. I closed it, and the game no longer worked. Looking back I must have changed something, but then I thought it was just magic. It took me a long time to get curious enough to look at it again.
But in the meantime, I ended up playing with these "programs" a little, and made some really simple ones, and later some somewhat complex ones. So the next time I opened RACE again I kind of understood what it was doing.
Moving on, I spent a year learning TI-Basic, and eventually reached the limit of what it could do. Along the way, I learned that all of the really amazing games/utilities that were incredibly fast, had greyscale graphics, lowercase text, no runtime indicator, etc. were written in “Assembly,” so naturally I wanted to use that, too.
I had no idea what it was, but it was the obvious next step for me, so I started teaching myself. It was z80 Assembly, and there were practically no documents, no resources, nothing helpful online.
I found the specs, and a few terrible docs and other sources, but with only one year of programming experience, I didn't really understand what they were telling me. This was before Stack Overflow etc., too, so what little help I found was mostly from forum posts, IRC (where I mostly got ignored or made fun of), and reading other people's source when I could find it. And usually that was less than clear.
And here’s where we dive into the specifics. Starting with so little experience, and in TI-Basic of all things, meant I had zero understanding of pointers, memory and addresses, the stack, heap, data structures, interrupts, clocks, etc. I had mastered everything TI-Basic offered, which astoundingly included arrays and matrices (six of each), but it hid everything else except basic logic and flow control. (No, there weren’t even functions; it has labels and goto.) It has 27 numeric variables (A-Z and theta, can store either float or complex numbers), 8 Lists (numeric arrays), 6 matricies (2d numeric arrays), 10 strings, and a few other things like “equations” and literal bitmap pictures.
Soo… I went from knowing only that to learning pointers. And pointer math. And data structures. And pointers to pointers, and the stack, and function calls, and all that goodness. And remember, I was learning and writing all of this in plain Assembly, in notepad (or on paper at school), not in C or C++ with a teacher, a textbook, SO, and an intelligent compiler with its incredibly helpful type checking and warnings. Just raw trial and error. I learned what I could from whatever cryptic sources I could find (and understand) online, and applied it.
But actually using what I learned? If a pointer was wrong, it resulted in unexpected behavior, memory corruption, freezes, etc. I didn’t have a debugger, an emulator, etc. I had notepad, the barebones compiler, and my calculator.
Also, iterating meant changing my code, recompiling, factory resetting my calculator (removing the battery for 30+ sec) because bugs usually froze it or corrupted something, then transferring the new program over, and finally running it. It was soo slowwwww. But I made steady progress.
Painful learning experience? Check.
Pointer hell? Absolutely.
-
I've been coding for over 8 years, and whenever a recruiter says they'll have me do these coding challenges or recite an algorithm from memory, I say, "You know, the longer you've been programming, the less you remember how to do this stuff, because you don't use it in real life." They say, "Well, we just want to see how you think and how you solve problems." B.S.
These types of algorithmic programming challenges, beyond the simpler ones, don't show how you think. A lot of stuff, like dynamic programming and other optimization problems, was solved by PhD professors after many years of research. Nobody would think up these solutions on their own.
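(If you've been spared so far, here's a tiny taste of the genre - memoized Fibonacci in Java, about the gentlest dynamic programming example there is; the interview versions are far hairier, which is exactly the point:)

import java.util.HashMap;
import java.util.Map;

public class MemoFib {
    private static final Map<Integer, Long> CACHE = new HashMap<>();

    // the whole trick: plain recursion, but store every answer you compute
    static long fib(int n) {
        if (n < 2) return n;
        Long hit = CACHE.get(n);
        if (hit != null) return hit;
        long result = fib(n - 1) + fib(n - 2);
        CACHE.put(n, result);
        return result;
    }

    public static void main(String[] args) {
        System.out.println(fib(50)); // 12586269025, instantly; naive recursion would crawl
    }
}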
These programming challenges weed out experienced developers, unless they want to take the time to re-learn this stuff. It explains why Google, Facebook or Amazon are filled with young and inexperienced developers, why it takes so many thousands of them to get anything done, and why they still have buggy products...
-
Dev: Hi guys, we've noticed on Crashlytics that one of your screens has a small crash. Can you look?
Me: Ok, we had a look, and it looks to us to be a memory leak issue on most of the other screens. Homepage, Search, Product page etc. all seem to have sizeable memory leaks. We have a few crashes on our screens saying iPhone 11s (which have 4GB of RAM) are crashing with only 1% of RAM left.
What we think is happening is that we have weak references to avoid circular dependencies. Our weak references are most likely the only things the system would be able to free up, resulting in our UI not being able to contact the controller, breaking everything. Because of the custom libraries you built that we have to use, we can't really catch this.
There's not really a lot we can do. We are following Apple's recommendations for avoiding circular dependencies and memory leaks. Instruments says our screens are behaving fine. I think you guys will have to fix the leaks. Sorry.
Dev 1: hhhmm, what if you create a circular dependency? Then the UI won't lose any of the data.
Dev 2: Have you tried looking at our analytics to understand how the user is getting to your screens?
=================================
I've been sitting here for 15 minutes trying to figure out how to respond before they come online. I am fucking horrified by those responses to "every one of your screens has memory leaks".
-
!rant but history
I found this old microcontroller: the TMS 1000 (from 1974). The specs: 100-400kHz clock speed, 4-bit architecture, 1kB ROM and 32 bytes (!) of RAM. According to the data sheet, you sent your program to TI and they gave you back a programmed controller - updates to the once-uploaded program were impossible, but an external memory chip was possible.
I'm glad we have computers with more processing power and storage (and languages other than assembler) - on the other hand, it enforced good debugging before deployment, and efficient code.
Data sheet: http://bitsavers.org/components/ti/...
-
Last week, the team lead told me that he can't merge because my code has code smells and, going forward, we can't have that. We use Sonar, and the way to "fix it" according to him is to mark the line using //NOSONAR.
Most of the issues are minor, like unused imports and, in my case, incomplete TODOs.
And before, the "verbal" rule was that we only needed to fix Major+ issues. The reason I use TODOs is to mark code that probably needs changing in the future. I know there's going to be some feature for which these lines have to be changed, but the requirements aren't fully defined yet from business.
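(For context, this is what that "fix" looks like. A made-up minimal example - OrderService and the TODO are hypothetical - but //NOSONAR is SonarQube's real line-level suppression marker:)

import java.util.List; // unused import: exactly the kind of minor smell Sonar flags

public class OrderService {
    // TODO: rework once business finally defines the requirements
    public double total(double price, int quantity) {
        return price * quantity; //NOSONAR (silences every rule flagged on this line)
    }
}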
But I sort of blew up on him. YOU WANT TO ENFORCE ZERO CODE SMELLS NOW?!?!?! AND THESE MINOR ISSUES? MARK THEM WITH NOSONAR?
HERE'S WHAT I'VE BEEN THINKING FOR THE LAST X YEARS... THE CODE DESIGN IS SHIT. MINOR CODE SMELLS, AND MANUALLY MARKING THE ONES YOU NEED TO KEEP, ARE THE LEAST OF OUR PROBLEMS...
THE OTHER PROBLEMS I'VE MENTIONED BEFORE, EVERY YEAR, BUT YOU DIMWITS NEVER LISTEN.
YOU THINK MY TODOS ARE BAD... 90% OF THE CODE AND FEATURES (THE ONES NOT DONE BY ME) LOOK AND SMELL LIKE MONKEY SHIT. UNDOCUMENTED, MESSY, FULL OF BUGS.
AND GUESS WHAT? NEW FEATURE, SOME DEV FORGETS TO CHANGE SOME COMPONENT THAT DEPENDS ON IT. WOULDN'T IT BE GREAT IF THERE WERE BOOKMARKS... O WAIT...
I was just catching up on comics again and saw this one... which triggered my memory and this rant... My first thought was to forward it to him...
-
Our team makes software in Java, and for technical reasons we require 1GB of memory for the JVM (set with the -Xmx switch).
If you don't have enough free memory, the app just exits without any sign, because the JVM couldn't bite off a big enough chunk of memory.
Many days later and you just stand there without a clue as to why the launcher does nothing.
Then you remember this constraint and start to close every memory-heavy app you can think of. (I'm looking at you, Chrome.) No matter how important those spreadsheets or Illustrator files. Congratulations, you just freed up 4GB of memory, things should work now! WRONG!
But why, you might ask. You see, we are using the 32-bit version of Java because someone in upper management decided that it should run on any machine (even if we only test it on Win 7 and High Sierra), and 32 is smaller than 64 so it must be downwards compatible! We should use it! Yes, in 2019 we use 32-bit Java because some lunatic might want to run our software on a 32-bit Windows XP OS. But why is this so much of a problem?
Well.. the 32-bit version of Java requires CONTIGUOUS FREE ADDRESS SPACE FOR ITS HEAP TO EVEN START... AND WE ARE REQUESTING ONE GIGABYTE!!
So you can shove your swap and closed applications up your ass, but I bet you won't get 1GB of contiguous space that way!
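(Strictly speaking, what the JVM needs is contiguous virtual address space to reserve for the heap, and on 32-bit Windows that space is already carved up by loaded DLLs. A minimal sketch to see what you actually got - the class is made up, the API call is standard:)

public class HeapCheck {
    public static void main(String[] args) {
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Reserved max heap: " + maxMb + " MB");
        // on a 32-bit JVM, launching with -Xmx1g often dies before main() even runs,
        // with "Could not reserve enough space for object heap"
    }
}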
Now there will be a meeting about this issue, and another related to the issues with the 32-bit JVM, tomorrow. The only problem is that this issue only occurs if you've used up most of your memory and then try to open our software. So upper management will probably deem this issue minor and won't allow us to upgrade to 64-bit... in 20fucking19.
-
Saw this on Facebook and couldn't help but share here! 😂
A young woman submitted the tech support message below (about her relationship with her husband), presumably as a joke…
The query:
Dear Tech Support,
’Last year I upgraded from Boyfriend 5.0 to Husband 1.0 and noticed a distinct slowdown in overall system performance, particularly in the flower and jewelry applications, which operated flawlessly under Boyfriend 5.0.
In addition, Husband 1.0 uninstalled many other valuable programs, such as: Romance 9.5 and Personal Attention 6.5, and then installed undesirable programs such as: NBA 5.0, NFL 3.0 and Golf Clubs 4.1.
Conversation 8.0 no longer runs, and House cleaning 2.6 simply crashes the system. Please note that I have tried running Nagging 5.3 to fix these problems, but to no avail.
What can I do?
Signed,
Desperate
The response (that came weeks later out of the blue):
Dear Desperate,
“First keep in mind, Boyfriend 5.0 is an Entertainment Package, while Husband 1.0 is an operating system. Please enter command: I thought you loved me.html and try to download Tears 6.2 and do not forget to install the Guilt 3.0 update. If that application works as designed, Husband 1.0 should then automatically run the applications Jewelry 2.0 and Flowers 3.5.
However, remember, overuse of the above application can cause Husband 1.0 to default to Grumpy Silence 2.5, Happy Hour 7.0 or Beer 6.1. Please note that Beer 6.1 is a very bad program that will download the Farting and Snoring Loudly Beta.
Whatever you do, DO NOT, under any circumstances, install Mother-In-Law 1.0 (it runs a virus in the background that will eventually seize control of all your system resources.)
In addition, please, do not attempt to re-install the Boyfriend 5.0 program. These are unsupported applications and will crash Husband 1.0.
In summary, Husband 1.0 is a great program, but it does have limited memory and cannot learn new applications quickly. You might consider buying additional software to improve memory and performance. We recommend Cooking 3.0. Good Luck!’
-
TLDR: SAP sucks. Don't ever work with it. Run away from it. Delete it from your memory. If your company works with it, quit. It's the best you can do.
A couple of weeks ago the group rant was "Story of your best/worst career choice" and I talked about the contract I signed. Even though that is still true and I still feel like that, I think I've got a new worst choice:
WORKING WITH SAP.
When I got this job I knew it would be SAP, but I didn't know what SAP was. I just thought "it's programming, how bad can it be?" OH BOII.
If only I had done some FUCKING RESEARCH I would have known this would be a mistake!!
And I knew I didn't want to work with this, I knew I wanted to be a web developer, but I STILL ACCEPTED THE JOB OH MAN WHAT WAS I THINKING I'M SO MAD.
Where I live we all have the same mentality when looking for the first job, which is to just accept anything you can get, because it's your first job, you need to work and to get experience, even if it's a bad job or if you know you won't like it. When my internship was ending, I told my parents I didn't want to stay there because they treat their employees like shit, and the salary is terrible. They told me to still accept it if they offered, because I still needed a job (this one was web though) and experience.
So, of course, since I was looking for my first job, had been told this my entire life, always thought like that, and they were the first to contact me, I accepted it.
BIGGEST FUCKING MISTAKE!! DON'T THINK LIKE THIS!! AND STOP TELLING KIDS THIS!! IT'S NOT A GOOD MENTALITY!!!
ALSO DON'T WORK WITH SAP! EVER!
-
Had nothing to do today, so I thought I'd test the migration from SVN to Git in GitLab.
Boss sent me a mail today saying that when I migrate we need to preserve the history, so I actually have to put some effort into it. *sigh*
Shout-out to the GitLab documentation at this point.
That's probably the best doc I've ever read...
Well so I tried to use svn2git. And well...
Who the fuck thought that this piece of shit software is in any way usable?
Holy crap!
If it fails, it just does so without any info why. Even in verbose mode.
And the RAM usage? What the actual fuck?
This whole thing is a complete memory leak!
32 gigs of RAM full in minutes, and the whole system starts to stall!
And then, when I thought it would finally run through:
Bam, another git checkout error...
Googling for that error, I found something: a version of svn2git made in .NET Core.
Didn't expect much, but I tried it anyway.
And would you look at that!
It ran so smoothly, and needed so little RAM, that I had some doubt it had worked correctly.
But it did!
I think I'm gonna buy a coffee or two for some guy over in China now!
-
Me: have you tried turning it off and on again?
Customer: oh come on, is that the best you can do?!
M: ok, how about we
clear all active memory,
reset the firmware parameters,
run system diagnostics, and
reinitialise the basic input output system?
C: Wow .. yeah how do we do that?
M: turn it off and on again! -
This happened today
My Manager: How is the progress so far on the search module?
Me (after implementing some crazy shit requirements): It's all set. The APIs are working well against the mock in-memory database. I need an actual database to run my unit tests against. Where do we have one?
My manager: Let's pretend that there is no database at this moment. Go ahead with the rest of your activities.
Me (IN MY MIND): F*ck you, a**hole. You don't know the first thing about software development! Which a**hole promoted you to manager?!
Me (TO HIS FACE): Ah.. okay!! As you wish!
-
Chrome, Firefox, and yes even you Opera, Falkon, Midori and Luakit. We need to talk, and all readers should grab a seat and prepare for some reality checks when their favorite web browsers are in this list.
I've tried literally all of them, in search of a lightweight (read: not ridiculously bloated) web browser. None of them fit the bill.
Yes Midori, you get a couple of bonus points for being the most lightweight. Luakit however.. as much as I like vim in my terminal, I do not want it in a graphical application. Not to mention that, just like all the others, you just use webkit2gtk, and are therefore just as bloated as all the rest. Lightweight my ass! But programmable with Lua, woo! As if Selenium, headless Chrome, ... didn't do that for any browser. And that's it for the unique features as far as I'm concerned. One is slow, single-threaded and lightweight-ish (Midori), and the other has vim keybindings in an application that shouldn't (Luakit).
Pretty much all of them use webkit2gtk as their engine, and pretty much all of them launch a separate process for each tab. People say this is more secure, but I have serious doubts about that. You're still running all these processes as the same user, and they all have full access to the X server they run under (this is also a criticism against user separation on a single X session in general). The only thing it protects against is a website crashing the browser, where only that tab and its process would go down. Which.. you know.. should a webpage even be able to do that?
But what annoys me the most is the sheer amount of memory that all of these take. With all due respect, you browsers: I am not quite prepared to give 8 fucking gigabytes - half the memory in this whole box! - just for a dozen or so tabs. I shouldn't have to move my web browser to another, lesser-used 16GB box just to keep this one from going into fucking swap over a dozen tabs. And before someone has a go at the add-ons: there are 4 installed, and that's it. None of them is even close to causing this complete and utter memory clusterfuck. It's the process separation. Each process consumes half a GB of memory, and there are around a dozen of them in a usual browsing session. THAT is the real problem. And I want to get rid of it.
Browsers are at their pinnacle of fucked up in my opinion, literally to the point where I'm seriously considering elinks. Being a sysadmin, I already live my daily life in terminals anyway. As such I also do have resources. But because of that I also associate every process with its cost to run it, in terms of resources required. Web browsers are easily at the top of the list.
I want to put 8GB into perspective. You can store nearly 2 entire DVD movies in that memory. However media players used to play them (such as SMPlayer) obviously don't do that. They use 60-80MB on average to play the whole movie. They also require far less processing power than YouTube in a web browser does, even when you download that exact same video with youtube-dl (either streamed within the media player or externally). That is what an application should be.
Let's talk a bit about these "complicated" websites as well. I hate to break it to you framework web devs, but you're a dime a dozen. The competition is high between web devs for that exact reason. And websites are not complicated. The document itself is plain old HTML, yes even if your framework converts to it in the background. That's the skeleton of your document, where I would draw a parallel with documents in office suites that are more or less written in XML. CSS.. oh yes, markup. Embolden that shit, yes please! And JavaScript.. oh yes, that pile of shit that's been designed in half a day, and has a framework called fucking isEven (which does exactly what it says on the tin, modulo 2 be damned). Fancy some macros in your text editor? Yes, same shit, different pile.
Imagine your text editor being as bloated as a web browser. Imagine it being prone to crashing tabs like a web browser. Imagine it being so ridiculously slow to get anything done in your productivity suite. But it's just the usual with web browsers, isn't it? Maybe Gopher wasn't such a bad idea after all... Oh, and give me another update where I have to restart the browser when I commit the heinous act of opening another tab, just because you had to update your fucking CA certs again. Yes please!
-
University Coding Exam for Specialization Batch:
Q. Write a program to merge two strings, each of at most 25k length.
Wrote the code in C, because fast.
Realized some edge cases don't pass: runtime errors. Proceeded to check the locked code in the stub. (We only have to write methods; the driver code is pre-written.)
Found that the memory for the char arrays is being allocated dynamically, with a fixed size of 10240.
Rant #1:
Dafuq? What's the point of dynamic memory allocation if you're gonna fix it to a certain amount anyway?
Continuing...
Called the Program In-charge, asking him to check the problem and provide a solution. He took 10 minutes to come; meanwhile I wrote the program in Java, which cleared all the test cases. <backstory>No university course on Java yet, learnt it on my own</backstory>
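(Roughly what it boils down to - a sketch, assuming "merge" simply means concatenation, since the exact rule isn't the point: let the runtime size the buffer instead of hardcoding 10240 chars and praying that two 25k-character strings fit.)

public class Merge {
    static String merge(String a, String b) {
        // the buffer grows to fit, no fixed-size char array required
        return new StringBuilder(a.length() + b.length())
                .append(a)
                .append(b)
                .toString();
    }

    public static void main(String[] args) {
        System.out.println(merge("foo", "bar")); // foobar
    }
}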
Dude comes, I explain the problem. He asks me to do it in C++ instead coz it uses the string type instead of char array.
I told him that I've already done it in Java.
Him: Do you know Java?
Rant #2:
No you jackass! I did the whole thing in Java without knowing Java. What's wrong with you?!
-
Trying to do anything on the PS2 is almost fucking impossible.
I imagine a board meeting where they were designing the hardware:
"how can we make this insanely hard to use?"
"let's make decentralized partition definitions, allow fragmenting of entire partitions, and require all partitions to be rounded to 4MB. If you delete a partition, don't wipe the partition out, just rename it to "_empty" and the system will do it for you, except it actually won't because fuck you"
"let's require 1-bit serial registers to be used for memory card access and make sure you can't take more than 8 CPU cycles to push each bit or it'll trash the memory card"
"let's make the network module run on a 3-bit serial register and when initialized it halves the available memory but only after 8 seconds of activity"
"let's require the system to load feature modules called "IOPs" and require the software to declare which of the 256 possible slots it wants to use (max of 8 IOPs) then insert stubs into those. Any other IOP you call will hang the system and probably corrupt the HDD. You also have to overwrite the stubbed IOPs with your own but only if you can have the stubs chainload the other IOPs on top of themselves"
"let's require you to write to the controller registers to update them, but you have to write the other controller's last-polled state or the controller IOP will hang"
of course this couldn't make sense, it's
s s s s
o o o o
n n n n
y y y y
-
It would have been back in the 90s 🤫
I was about 8 years old, I guess, when a friend of mine with a Commodore 64 loaded up the good old floppy, typed some things in, and the screen started doing things; my mind was instantly triggered: "how did you do that?".
Moving forward after that, I was into gaming on consoles (Sega, SNES, Atari etc.) and always wondered how the games were made; this being pre-internet, that was not easy to find info on, otherwise I think I probably would have ended up in the game dev world.
It wasn't until I was about 10-11 that I finally got a PC in the house (a good old IBM 386 with a 10MB HDD.. yes, MB not GB, for you young folk) and I was addicted from day one: MS Paint, changing settings left, right and center in Windows 3.11, and then when we upgraded to W95 and later W98 things got more and more interesting.
God the memories, and games (MAME32 was the best)😆
Shit now I want to find some old school games for a trip up memory lane 😂
When I was 15, I made my first website in front page (don’t judge), was a nice big walkthrough with photos and map locations for GTA 3, and since then I’ve never looked back. -
MARKETING IS A MENACE TO SOCIETY and a huge waste of time and resources.
Imagine that, for some very stupid reason, people were allowed to steal cell phones from unsuspecting victims and sell them on the open market legally, tax invoices and all.
One could create a business like this. Steal some lad's brand new $600 iPhone and sell it for $280, because why not?
That is marketing. A company goes and makes a phone for, let's say, $180. Add in taxes, shipping and development costs, and we get to $300. Put some real nice profit on top (let's say 40%) and we get to $420.
The last $180 of the $600 price tag is the cost of marketing to society.
Today some stupid marketing conman goosesteps into my lab and says that we must use TensorFlow and in-memory databases and multicloud redundancy and, I kid you not, "profound learning".
WE HAVE A FREAKING LOGISTICS OPS APPLICATION.
"We are putting it on the brochure, those technologies are set to sell well in our core market, and improve employer-branding" says the conmen.
A request for a feature is one thing, a request for an whole other technology because some snake-oil salesmen read the term in some clickbait rag and thinks that some starry-eyed moneyhead will pay extra because the brochure says "NOW WITH 2X MORE TECH!" is just an assault on society.5 -
Why are people complaining about debugging?
Oooh it’s so hard.
It’s so boring.
Can someone do this for me?
I honestly enjoy debugging and you should too..
if it’s not your code, you’ll get to understand the code better than the actual author. You’ll notice design improvements and that some of the code is not even needed. YOU LEARN!
If it’s your own code (I especially enjoy debugging my own code): it forces you to look at the problem from a different perspective. It makes you aware of potential other bugs your current solution might cause. Again, it makes you aware of flaws in the design. YOU LEARN!
And in either case, if it's a tricky one, you'll most likely stop debugging at some point, refactor the shit out of some 50-100 line methods and modularize them (because the original code was undebuggable (<- made up a new word there)), and continue debugging after that.
So many things I know, I know only because I spent days, sometimes even weeks, debugging a piece of code to find the fucking problem.
My main language is Java, and I wouldn't have believed anyone who told me there was a memory leak in my code. I mean, it's Java, right? We refactored the code and everything worked fine again. But I debugged the old version anyway and found bugs in Java itself (Java 6.xx, I believe?), which made me aware of the fact that languages have flaws as well.. GC has its flaws as well. So does Docker, and any other software..
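(If "a memory leak in Java" sounds impossible to you too, this is the usual shape of one - a minimal made-up sketch, no JVM bug required:)

import java.util.ArrayList;
import java.util.List;

public class LeakDemo {
    private static final List<byte[]> CACHE = new ArrayList<>(); // grows forever

    public static void main(String[] args) {
        while (true) {
            CACHE.add(new byte[1024 * 1024]); // 1 MB per loop, always reachable
            // run with -Xmx64m and watch the OutOfMemoryError:
            // the GC cannot free what the program still references
        }
    }
}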
Stop complaining, get on your ass and debug the shit out of your bugs, instead of just writing it a different way and being glad that that fixed the issue..
My opinion.
-
!dev
A child's mind is fascinating.
I remember how it felt being a kid, just deliriously happy.
Things were magical, mystical and happy.
I knew the world wasn't perfect, I knew bad things happened to good people.
But a kid's mind is so powerful that it can fill in the blanks with the most cheerful and optimistic perspectives.
And at some point in my childhood I was exposed to videogames, and that kinda took me down fantasy lane even further.
I was extremely young and barely retaining any memories when I was exposed to my first console, a Famicom.
I have a somewhat vivid memory of my mind being blown away for the first time by watching my brother play New Ghostbusters II for NES.
From then on we never stopped, and played several console and DOS/PC games.
When I was 10, someone from the neighborhood brought in a couple of floppies with Pokemon Yellow.
"What? Pokemon? How the fuck is that even possible? This is a PC, not a Game Boy."
I didn't know at the time what an emulator was, but I was super fucking stoked to be able to play that.
My dad had a 1GB laptop from work that he didn't use, so I hoarded that shit, and I would get into bed and play nearly every day.
The experience was surreal. I was doing PC gaming... not on a chair but on a fucking bed, and I was playing a Game Boy game... on a PC.
It was so intense to me that even after more than 2 decades, I still remember how it felt.
Like, you know how you can "feel" things if you think about them? Like, for example, if you think about the taste of chicken, you can somehow feel it for a second.
Well I have like an actual physical sensation linked to that experience but I can't explain it at all, because it's just a sensation.
I think people usually say they feel that way, for example, about the PSX (usually referred to as the PS one) loading screen. I experienced that too, but when I was 12, so it was not as intense (it does make me feel the fuzzies though).
I also remember other things in very high detail, like the texture of my bed cover, the weather, mom cooking, the clunky shape of the laptop, the way I carelessly stored it above a pile of magazines, etc.
I remember of course how it felt looking at the game sprites, interacting with NPCs, and hearing the goddamn fucking glorious music.
It was dreamy.
Years and years later, I grew up, stopped living in a fantasy world, and became more aware of the grim aspects of life my younger self was sugarcoating.
So I tried to play Pokemon again, again and again, and no matter how hard I tried to revive that euphoria, I could never do it.
I started to get annoyed at the game.
"Come oooon, I did the tutorial already, let me skip this.
This pokemon is useless, why am I even training it.
Fuck, I'm tired of grinding"
At some point I accepted that the feeling would never return, and that it would just live in my memory.
Ironically, I can recall that memory and how it felt anytime I want to.
And I can actually still feel it, and throughout these years it has never worn down.
And eventually I learned how to play Pokemon and enjoy it:
I read tier lists on Smogon and just catch and train the Pokemon that are higher on the list, which is how I got to beat Yellow in like 3 days.
(This is nothing compared to what speedrunners do, but much better than the weeks it had taken me in the past).
That served as an important lesson that when a kid plays a game, his mind is also playing the game at the same time, filling in the blanks with its imagination.
A very similar experience happened to me with Harvest Moon, the precursor to Stardew Valley,
and that game is faaar more emotional: you talk to people, over time you befriend them and they open up, you meet a girl, you marry her, have a kid
you get farm animals, you brush them, they become happy
you get attached
that game had such a powerful effect on me that in all naiveness I thought I wanted to be a farmer.
Eventually I grew up and hit puberty, and from then on I focused more on competitive games, like Smash Bros, CS and TF2.
and I dunno how to end a post so eat my fucking nuts17 -
Today I had a programming exam
We had to read a requirements spec and write UML, use cases, etc...
I think "it's going to be easy!"
Then I remember that for some unholy reason we use Java 7
Then I remember that the keyword to automatically add getters and setters was added in Java 10
Had to write getters and setters by hand, on a piece of paper, for 5 classes...
I hate my university. We are Information Engineering, which is the closest thing we have to Software Engineering in my city, and we still do our programming exams on paper, which doesn't test your ability to program, but your ability to learn a load of information by memory9 -
It's rant time again. I was working on a project which exports data to a zipped CSV and uploads it to S3. I asked colleagues to review it; I guess that was a mistake.
Well, two of my lesser-known colleagues reviewed it, and one of the complaints they had is that it wasn't TypeScript. Well yes, good thing you have EYES; I'm not comfortable with TypeScript yet, so I made it in Node.js (which is absolutely fine)
The other guy said that I could stream to the zip file, which I didn't know was possible, so I said "that's impossible, right?" (I didn't know some zip algorithms work on streams). He kept brushing over it and talking about why I should use streams. I have obviously used streams before, and if he had read my code he could have seen that it streamed everything to the filesystem and afterwards to S3. He continued to behave like I was a literal child who had used Node.js for 2 seconds. (I'm probably half his age, so fair enough.) He also assumed that my code would store everything in memory, which also isn't true, as he'd know if he had read my code...
Never got an answer out of him and had to google it myself and research how zlib works, while he was sending me obvious examples of how streams work. Which annoyed me, because I had asked him a very simple question.
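For the record, the streaming part he refused to explain boils down to very little code. A minimal sketch of the idea (assuming Node 15+ for stream/promises; the file names are invented, and note that zlib gives you gzip rather than a real zip archive, which is what packages like archiver add on top):

    import { createReadStream, createWriteStream } from "fs";
    import { createGzip } from "zlib";
    import { pipeline } from "stream/promises";

    // read -> gzip -> write, chunk by chunk; nothing beyond the streams'
    // internal buffers is ever held in memory.
    async function gzipCsv(src: string, dest: string): Promise<void> {
      await pipeline(createReadStream(src), createGzip(), createWriteStream(dest));
    }

    gzipCsv("export.csv", "export.csv.gz").catch(console.error);

The same idea extends to the upload: hand a read stream of the compressed file to the S3 client instead of buffering it.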
Now the worst part: we had a dev meeting, and both colleagues started talking about how they want solutions to be checked and discussed beforehand, while talking about my project as if it was a failure. But it literally wasn't, lol; I use streams for everything except the zipping part, because I didn't know that was possible.
I was super motivated for this project, but fuck this shit. I'm not sure why it annoys me so much. I wanted good feedback, not people assuming that because I'm young I can't fucking read documentation. I also hate that they brought it up specifically pointing at my project, when it could have been a general discussion. Fuck me.3 -
I'm convinced this is going to be wildly unpopular, but hey...
Please stop writing stuff in C! Aside from a few niche areas (performance-critical, embedded, legacy workloads etc.), there's really no reason to, other than some fumbled reasoning about "having full control over the hardware" and "not trusting these modern frameworks." I get it, it's what we all grew up with as the de-facto standard, but times have moved on, and the number of massive memory leaks & security holes that keep coming to light in *popular*, well-tested software is a great reason why you shouldn't think you're smart enough to avoid all those issues by taking full control yourself.
Especially if, like most C developers I've come across, you also shun things like unit tests as "something the QA department should worry about" 😬12 -
I just installed Opera Mini on my PSP. That alone isn't very exciting, although I am stoked that my website does in fact render on a device from 2009, with the helpful guidance of a laptop from 2004 doing the hotspot duties for this thing.
No, what really got me stoked is that Opera still supports these old platforms, and how small they managed to make it. The .jar file for Opera Mini 4.5 is ~800 kB. There's a .jad file as well, but it's negligible in size and seems to be a signature of sorts.
Let that sink in for a moment. This entire web browser is 800 kB. Firefox, meanwhile, consistently consumes 800 MEGABYTES... in MEMORY. So then I thought for a moment: how on earth did they manage to cram an entire functioning web browser into 800 kB? Hell, what makes up a web browser anyway?
The answer I arrived at is as follows. You need an engine to render the web page you receive. You need a UI to make the browser look nice. And finally you need a certificate store to know which TLS certificates to trust. And while probably difficult to make, I think it should be possible to do in 800k. Seriously, think about it. How would you go and *make* a web browser? Because I've already done that in the past.
Earlier I heard that you need graphics, audio, wasm, yada yada backends too... no. Give your head a shake. Graphics are the responsibility of the graphics driver; a web browser shouldn't dabble with those at all. Audio? You connect to PulseAudio (on Linux at least) and you're done. Hell, I don't even care about ALSA or OSS here. You just connect to the stuff that does that job for you. And WebAssembly... God, I could rant about that shit all day. How about making it a native application? Actual assembly is for BIOS and low-level drivers, and we already have a better language for the more portable stuff, called C.
Seriously, think about it. Opera - a reputable browser vendor - managed to do it in 800 kB on a 12-year-old device. Don't go full wank with your framework shit in the comments. And don't you fucking dare tell me that there's more to it. They did it, for crying out loud. Now take a look at your shitpile of JS code and refactor that shit already. Thank you.21 -
PM: this is our super fancy new CI/CD pipeline, it's the greatest. i expect you to learn and understand all this in no time.
devs: so i have to spend some more time on this topic because it's completely new to me and requires some learning...
PM: nooo, that's a super easy task with zero effort, my braindead hamster can do that in no time, so can i, and so can you! let's assign 1 story point for that.
~ 3 months later ~
also PM, after he has started developing as well: so i'm realizing there are many things that i have to learn, and it takes me some time. i haven't developed with C++ and <other tool stack> for a longer time. by the way, you guys don't need to check for any quality right now, we need to deliver fast. it's okay when you have memory overflows, your code is completely crappy, poor architecture, it doesn't matter.
he even has a subtask for migrating his code from a VS project to our new project structure, since he refused to learn our pipeline right from the beginning and created a VS project instead. シ why is this a subtask? this job can be done in no time, my left vanishing twin named Klaus who has dyslexia and hates vim can solve this task in 20 seconds!!!!11
(and still no PR, not even a feature branch in our repo)2 -
I am not sure which 24 hours was the craziest one, but I will pick 2.
This one happened just a few weeks after I started working for the one and only company I have ever worked for. The huge-ass multi-tenant website stopped working. There was an out-of-memory exception and nobody knew what was going on. I was still very new and knew shit about how it worked, plus my PHP knowledge was limited back then. Everyone was looking for the culprit but with no luck. Then the next day I finally managed to find a fucking infinite loop in our weather plugin.
We were working on a moderately big project for a client. There had been a lot of work lately (on different projects) and we were *very* behind schedule on this one. Deadline? You guessed it - tomorrow. What was worse is that we couldn't move it any further, because we already had once before. So I had to work for about 20 hours straight to kinda finish the work. Worst part? The client turned out to be a moron and a half-scammer, so they are not our client anymore and the project was never deployed to production. Never again.2 -
JavaScript is a rollercoaster. From "Golly, hello world is easy and I can make webpages now", to "wtf, '1'+1 is '11', kill me now", to "it's not that bad if you know how to use it", to discovering TypeScript, when it starts feeling like a real language.
... until you can't build the project because you have too many types, so you blow the memory limit in Node. I can up the limit, but I can't guarantee that we won't blow past it again in the future. Browsing issues on the TS repo reveals that this has been a thing for years.
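(For anyone hitting the same wall: the build dies on V8's default heap cap, which you can raise with something like node --max-old-space-size=8192 node_modules/typescript/bin/tsc, or via the NODE_OPTIONS environment variable. The exact tsc path here is from memory, so treat it as an assumption; and it only buys headroom, it doesn't fix anything.)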
Sticking with the rollercoaster analogy I'm now at "Burn it all to the ground".5 -
I feel so guilty.
I had to make a hotfix today. It is the ugliest piece of shit code I ever intentionally created. But there was no other way. I swear there was no other fucking way!
My boss just assigned this to me. But because she thinks this needs to be a hotfix and can't wait for the next release, we have to change only the server and not the client side of our application.
So I had to give our server a memory, so that it knows from which high-level client method the multiple low-level calls to it are coming.
It just doesn't make sense logically.
I mean, I feel like I killed someone. And just so that we get fewer writes to our DB. I mean, yes, in some edge cases it is a huge speed-up...
But nothing this fix solves is a new bug.
I'm gonna take a shower now. For like an hour3 -
Rant:
I am at work, and someone says to me that this system we are working on is multithreaded. I tell him no, it's not multithreaded, and in this context things cannot happen concurrently: it's a single-core ARM7TDMI. Arguments ensue about the difference between multithreading, multitasking and multiprocessing. I proceed to explain that this is a multitasking, interrupt-driven system, with no context switching or memory segmentation, so there's one heap for all tasks, because that's how we have it configured, and there is only one core. So there is no way the error he just described could possibly happen. Then he tells me I'm wrong, but refuses to even look at the processor manual and rejects the Wikipedia entry for multithreading. So I plan on calling off, so I can just have the next two weeks off while he tries to figure out why two things are happening at once on this system. He deserves all the frustration that is to follow.1 -
I'm getting really tired of those dumbass programmers who do not understand shit and then come to me when production breaks. (I am also a programmer, not really a DevOps engineer, but I'm the least worst at DevOps stuff, so it's my job...)
We're programming some kind of document management tool. Today we had a release, and one of the new features is downloading all of your documents as a zip file, which is generated asynchronously. When it's done, the user gets a mail with the download link to the zip file.
The feature basically works, but today it broke our production service while somebody was running a test of it.
Turns out all the documents are loaded into memory to be zipped. So if you have 2 gigs of documents, a container with memory restrictions in that area will crash.
I asked the programmer who reported this «ops problem» to me why he didn't just shit the files into a temp folder in order to zip them there.
He told me that he had wanted to do so, but did not know how to mock that for a unit test, and therefore went with the in-memory «solution», which was easier for him to mock.
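For reference, the temp-folder version he claimed he couldn't test is a couple dozen lines. A rough sketch of what I had in mind, assuming the archiver package and the AWS SDK v3 Upload helper; every name in it is invented:

    import { createReadStream, createWriteStream } from "fs";
    import { tmpdir } from "os";
    import { join } from "path";
    import archiver from "archiver";
    import { S3Client } from "@aws-sdk/client-s3";
    import { Upload } from "@aws-sdk/lib-storage";

    // Zip the documents into a temp file on disk, then stream that file
    // to S3. Peak memory stays flat no matter how many gigs of docs.
    async function zipAndUpload(docPaths: string[], bucket: string, key: string): Promise<void> {
      const zipPath = join(tmpdir(), "documents.zip");
      const output = createWriteStream(zipPath);
      const archive = archiver("zip");

      const written = new Promise<void>((resolve, reject) => {
        output.on("close", () => resolve());
        archive.on("error", reject);
      });
      archive.pipe(output);
      for (const p of docPaths) archive.file(p, { name: p });
      await archive.finalize();
      await written;

      await new Upload({
        client: new S3Client({}),
        params: { Bucket: bucket, Key: key, Body: createReadStream(zipPath) },
      }).done();
    }

And the test doesn't need to mock the zipping at all: point it at a temp dir and assert on the bytes that come out.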
For fuck's sake, unit tests and mocks are fucking tools, not ends in themselves! I don't give a fuck about your pointless mocking code when the application crashes!
When I have to deal with such dumbasses, I'd prefer to mock those motherfuckers with a leaky bucket of liquid shit, which basically accomplishes the same task from my perspective: dripping shit all over the place and making everything suck as fuck.3 -
!rant
Human memory is fascinating. It’s interesting to think about the events that helped shape you into what you are. And how sometimes those events are exchanges with people who probably never gave the moment a second thought, but fundamentally changed how you relate to others.
For example: In ninth grade, I became friends with a group of seniors. Spent every lunch period together, auditioned and landed a part in a play to hang out with them. We never hung out over the weekends, but my dad had died two years before and I didn’t hang with anyone on weekends.
Then at the end of the school year, I’d actually got my mum to buy me a school yearbook and was super excited to have my friends sign it. But when I asked them to, one of them furrowed their brow and said, “Amy, we’re not really friends...”
And I haven’t trusted friendships since.
Anyway, for anyone who needs to read this right now:4 -
There are a couple of them to list! But to sum up my main ones (biggest personal heroes):
John McCarthy, one of the founding fathers of Artificial Intelligence, credited with coining the term (sometime before 1960, if memory serves right), and a mathematical prodigy; the man based the original model of the Lisp programming language on lambda calculus. Many modern concepts that we have in programming were implemented in one way or another from his systems back in the day, and as a data analyst and ML nut... well, I am a big fan.
Herb Sutter: C++ programmer extraordinaire. I appreciate him more for his lectures and published articles than anything else. Incredibly smart and down to earth, he manages to make C++ less intimidating while still approaching it with respect.
Rich Hickey: The mastermind behind Clojure, the Lisp dialect for the JVM. Rich is really talented, and his lectures on his motivations and the reasons behind everything he does with Clojure are fascinating to watch.
Ryan Dahl: Awww shit, y'all know how it is. The man changed web development for good, both in the backend and the frontend. The concept of people writing their own servers to run their pages was not new, but the Node.js runtime environment made it more widely available by means of a simple-to-use language that was already popular with web developers. I would venture to say that Ryan's amazing contributions to JS made the language better; as it stands, the language continues to evolve, and new features that make it overall better keep being added. He is currently building Deno, a runtime environment for TypeScript, in Rust.
Anders Hejlsberg: This dude was everywhere, man... the original author of Turbo Pascal and the lead of Delphi back in the day. These RAD tools paved the way for what would be a revolution in the computing world. The dude is also the lead architect and designer of the C# programming language, as well as of TypeScript.
This fucker is everywhere and I love it.
Yukihiro "Matz" Matsumoto: Matsumoto san is the creator of the Ruby programming language. Not only am I a die hard fan of Ruby, but of the core philosophies that the man keeps as the core of his language design: Make the developer happy, principle of least surprise. Also I follow: minswan which is a term made by the Ruby community that states Mats is nice so we are nice. <---- because being cool to others is better than being a passive aggressive cunt.
Steve Wozniak: I feel as if the man does not get enough recognition...the man designed the Apple || computer which (regardless of how much most of y'all bitch and whine) paved the way for modern micro computers. Dude is also accredited with designing one of the first programmable universal remotes(which momma said was shitty) but he did none the less.
Alan Kay: Developed Smalltalk and the original OOP way of doing things. Smalltalk as a concept is really fucking interesting. If you guys ever get the chance, play with Pharo, which is a modern Smalltalk. The thing is really interesting and the overall idea of Smalltalk can be grasped in very little time. It sucks because the software scales beautifully in terms of project building, the idea of hoisting a program as its own runtime environment and ide by preserving state through images is just mind blowing to me. Makes file based programs feel....well....quaint.
Those are some of the biggest dudes for me. I know the list is large, but I wanted to give credit to the people who inspired me the most. Honorary mention goes to other language creators and engineers of course, but it would be way too large to list!9 -
A friend came to me asking whether I wanted to do a project in C++ (someone had asked him to find a C++ guy).
Needing money, I didn't refuse, even though I am a Java developer with 0 skills in C++. I wanted to give it a try.
So the project started, and it was about a plugin for Rhinoceros (a 3D graphics app).
The plugin was simple: it had some views, some services to upload a file to S3, and some API calls. Nothing too complex...
So I ended up working on the project together with my friend (a web dev).
Long story short, we had a lot of issues, but considering we both had no knowledge of C++, we were really lucky to finish the product almost on time (3 days after the deadline).
We did no memory management, even though I'd read that we have to do that ourselves and that C++ doesn't have a garbage collector.
But the plugin worked great, even without a garbage collector.
Had a lot of issues with string manipulation, which almost drove me crazy.
PS: I did a post here before taking the project, to ask whether it was a good idea to take it or not. Had some positive and some negative replies, but I deleted the post since I thought I was breaking the NDA I signed 😂😂
PS2: just finished OCAJP 8 last week with a great score😃6 -
The "stochastic parrot" explanation really grinds my gears because it seems to me just to be a lazy rephrasing of the chinese room argument.
The man in the machine doesn't need to understand chinese. His understanding or lack thereof is completely immaterial to whether the program he is *executing* understands chinese.
It's a way of intellectually laundering, or hiding, the ambiguity underlying a person's inability to distinguish the process of understanding from the mechanism that does the understanding.
There are recent arguments that some elements of relativity actually explain our inability to prove or dissect consciousness in a phenomenological context, especially with regard to outside observers (hence the reference to relativity), but I'm glossing over it horribly and probably wildly misunderstanding some aspects. I digress.
It is to say, we are not our brains. We are the *processes* running on the *wetware of our brains*.
This view is consistent with the understanding that there are two types of relations in language: words as they relate to real-world objects, and words as they relate to each other. ChatGPT et al. have a model of the world only inasmuch as words-as-they-relate-to-each-other carry some information about the world as a model.
It is to say that while we may find some correlates of the mind in the hardware of the brain (more substrate than direct mechanism), it is possible that language itself, executed on this medium, acts as a scaffold for a broader, rich internal representation.
Anyone arguing that these LLMs can't have a mind because they are one-off input-output functions doesn't stop to think through the implications of their argument: do people with dementia have agency and sentience?
Almost certainly they do, even if they forget what they were doing or thinking about five seconds ago. So agency and sentience, while enhanced by memory, are not reliant on memory as a requirement.
It turns out there is much more information about the world contained in our written text than just the surface-level relationships. There is a rich, dynamic level of entropy buried deep in it, and the training of these models is apparently allowing them to tap into this representation in order to do what many of us accurately see as forming internal simulations, even if the ultimate output is one character or token at a time, laundering the series of calculations necessary for those internal simulations across the statistical generation of just one output token or character at a time.
And just as we won't find consciousness by examining a single picture of a brain in action, even if we track it down to single neurons firing, neither will we find consciousness anywhere we look, not even in the individual weighted values of an LLM's network nodes.
I suspect this will remain true long past the day a language model, or some other model, emerges that can talk and do everything a human does intelligence-wise.31 -
I really think there should be a subject in every CS course to teach us how to handle/work under Grade-A assholes and dumbfucks. Not that it would help, but at least it would warn us about what we are getting into.
In my opinion, development is not *that* hard or frustrating, but it is made so by these shitty people. But again, what do I know.
I was scolded by my boss recently for using a for-loop to iterate through an array. Apparently for-loops are not used in real-world projects, and this iteration should be done "in-memory". My colleagues and I are still trying to understand and process that.
I was asked to add Fitbit integration to a project within 2 hours, just because I had "already done it a week ago" in *another* project. Luckily, it was then given to a "senior" developer, who took 4 days for it and essentially copy-pasted my work without many changes; of course it stopped working every now and then.
I am given unreal deadlines on my tasks, on technologies I haven't worked with before, and am then expected to churn out production-ready code with no bugs in it.
My boss literally just sends me links to the first three Google results for the problems I encounter and report, after humiliating me of course. Yes, I did google it, and yes, I went through everything I could find, from Google forums to GitHub issues. When the library/plugin author himself says that a feature is not yet available, don't expect me to develop it in 2 hours, you dumbfuck.
And for the love of God, please stop changing the data model every single day and justifying it with agile development. Think before making any changes to it. Ever heard of join queries? Foreign keys? Or any other basic database concepts?
We reached a point where each branch in the repo had a different data model. Not kidding. And we were a team of just 4 developers. At least inform us when you change models, after discussing it with your shit-for-knowledge "senior" developer, so we don't have to redo it all over again. The channels on Slack are not just for sharing random articles.
I am just waiting to complete my year here.
I should have known what I was getting myself into the day he asked me to remove the comments I had added to explain what my code does. Why, you ask? Because "we don't write comments". -
Assignment release: this is a basic assignment that is supposed to help you understand the basics of memory allocation. You are free to use any design you want; however, you can implement more advanced features that would lean towards specific designs. What will be punished is not having a dynamic memory allocator. We will run the tests for these offline after the deadline.
2 days before the deadline: we released the tests that check whether you're allocating memory dynamically, but these tests also check for this specific design. So fuck you for choosing any design other than this one. Have fun with the sleepless nights.
Fuck me, I guess, for pointlessly working on a design other than the one they wished for but never specifically asked for from the beginning. I just wasted 2 weeks of my life and feel unmotivated af to do anything regarding this anymore. Fuck this shit. Fuck them. Fuck this course.1 -
A few Challenges at my job:
- a CEO with zero tech skills and zero memory.
- a sysadmin with literal brain damage and epilepsy (but he's great, we just have had to learn how to deal with it)
- another (volunteer) sysadmin who we call @God on Slack and who usually only shows up in extreme crises.
- the budget of a tiny organization, the web traffic of a huge site.
- incoherent business logic subject to the whims of volunteers and the loudest users
- a main revenue stream that contradicts our main mission.
it's fun! woot.1 -
So, I work in a game development studio, right?
We're trying to launch the title on as many platforms as reasonable, because as a social VR app we're kinda rowing upstream.
So far, Steam and Oculus have been fairly reasonable, if oddly broken and inconsistent.
Enter store 3.
Basically no in-game transaction support (our asking prompted them to *start* developing it. No, it's not very complete). No patch-update system (You want an update? Gotta download the whole fsckin' thing!). No beta-testing functionality for most of their stuff ("Just write the code like the example, it will work, trust us!"). No tools besides the buggy SDK (Wanna upload that new build? Say hello to this page in your web browser!).
So, in other words: Fun.
We've been trying to get actively launched for two months now. Keep in mind that the build has been up on Steam and Oculus for over a year and half a year (respectively), so the actual binary functionality is, presumably, fine.
The best feedback we get tends to be "Well, when we click the Launch button it crashes, so fail."
Meanwhile we're going back and forth, dealing with other-side-of-the-world timezone lag, trying to figure out what is so different between their machines and ours. Eventually we get them to start sending logs (and no, Windows Event logs are not sufficient for GAMES, where did you even get that idea????), except the logs indicate that the program is getting killed so terribly that the engine's built-in crash handler can't even kick in to generate memory dumps, or even know it died.
All this boils down to today, where I get a screenshot of their latest attempt.
I just can't even right now.5 -
It was my first job as an embedded engineer. I was hired to write firmware for an ARM microcontroller that has a BLE radio. But the microcontroller we used didn't have flash; it had SRAM and OTP (one-time programmable) memory. In BLE you can make a proximity beacon, and when a phone passes by this beacon it will get a notification: '<device_name> nearby'.
I thought it would be funny if I set the device name to 'MILF' (the original name of the device was FLIP), so when somebody's phone was in proximity it would get a 'MILF nearby' notification. The joke didn't work, as nobody has their Bluetooth switched on by default, but I forgot to change the name back before programming the OTP memory.
I just buried that device and told everyone it was not working properly1 -
Man, wk89 is awesome... bringing back a lot of memories. The one thing that really stands out to me, though, is the software.
I see a lot of rants about people shocked that Turbo C is still in use or that other DOS programs are still in production. A lot of bad can be said here, but I think it's often a case of us truly not building things like we did in the good old days.
What those devs accomplished with such limited resources is phenomenal, and the fact that we still haven't managed to replicate the feel and usability of it says a lot, not to mention just how fucking stable most of it was.
My favourite games are all DOS-based; my most favourite of all time, Sherlock, is 103 kB in size. When I started coding games I made a clone of it, and to this day I am still trying to figure out what sorcery is in the algorithm that generates/solves puzzles and makes it so fast and memory efficient. I must have tried 100+ ways and can't even come close. NB! If you know, you can hint, but don't tell me. Solving this is a matter of personal pride.
Where those games really stand out is when you get into the graphics processing - the solutions they came up with to render sprites and maps, and to trick your eyes into seeing detail with only 4-16 colours, are nothing short of genius. Also take a second to consider that a screenshot of the game is larger than the entire game itself, and let that sink in...
I think the dramatic increase in storage, processing power and RAM over the last decade is making us shit developers - all of us. Just take one look at Chrome, Skype or anything else mainline, really, and it's easy to see we no longer give a rat's ass about memory anywhere except on our monthly AWS/GCE bill.
We don't have to be creative or even mindful about anything but the most significant memory leaks in order to get our software to run nowadays. We also don't have constraints on distributing it; fast deliverability is rewarded over quality software. It's only expected to stay in production for 3-4 years anyway.
Those guys were the true "rockstars" and "ninja" developers and if you can't acknowledge that you can take ya React app and shovit. -
This is the third part of my ongoing series "The Ballad of the Six Witchers and the Undocumented Java Tool".
In this part, we have the massive Battle of Sparks and Storms.
The first part is here: https://devrant.com/rants/5009817/...
The second part is here: https://devrant.com/rants/5054467/...
Over the last couple sprints and then some, The Witcher Who Writes and the Butchers of Jarfile had studied the decompiled guts of the Undocumented Java Beast and finally derived (most of) the process by which the data was transformed. They even built a model to replicate the results in small scale.
But when such process was presented to the Priests of Accounting at the Temple of Cash-Flow, chaos ensued.
This cannot be! - cried the priests - You must be wrong!
Wrong, the Witchers were not. In every single test case the Priests of Accounting threw at the Witchers, their model predicted perfectly what would be registered by the Undocumented Java Tool at the very end.
It was not the Witchers. The process was corrupted at its essence.
The Witchers reconvened at their fortress of Sprint. In the dark room of Standup, the leader of their order, wise beyond his years (and there were plenty of those), in a deep and solemn voice, there declared:
"Guys, we must not fuck this up." (actual quote)
For the leader of the Witchers had just returned from a war council at the capitol of the province. There, heading a table seating the Archpriest of Accounting, the Augur of Economics, the Marketing Spymaster and the Admiral of the Fleet, was the Ciefoh Seat himself.
They had heard rumors about the Order of the Witchers' battles and operations. They wanted to know more.
It was quiet that night in the flat and cloudy plains of Cluster of Sparks and Storms. The Ciefoh Seat had ordered the thunder to stay silent, so that the forces of whole cluster would be available for the Witchers.
The cluster had solid ground for Hive and Parquet turf, and extended from the Connection River to farther than the horizon.
The Witcher Who Writes, seated high atop his war-elephant, looked at the massive battle formations behind.
The frontline were all war-elephants of Hadoop, their mahouts the Witchers themselves.
For the right flank, the Red Port of Redis had sent their best connectors - currency conversions would happen by the hundreds, instantly and always updated.
The left flank had the first and second army of Coroutine Jugglers, trained by the Witchers. Their swift catapults would be able to move data to and from the JIRA cities. No data point will be left behind.
At the center were thousands of Sparks mounting their RDD warhorses. Organized in formations designed by the Witchers and the Priestesses of Accounting, those armoured and strong units were native to this cloudy landscape. This was their home, and they were ready to defend it.
For the enemy could be seen on the horizon.
There were terabytes of data crossing the Stony Event Bridge. Hundreds of millions of datapoints, eager to flood the memory of every system and devour the processing time of every node on sight.
For the Ciefoh Seat, in his fury about the wrong calculations of the processes of the past, had ruled that the Witchers would not simply reshape the data from now on.
The Witchers were to process the entire historical ledger of transactions. And be done before the end of the month.
The metrics rumbled under the weight of terabytes of data crossing the Event Bridge. With fire in their eyes, the war-elephants in the frontline advanced.
Hundreds of data points would be impaled by their tusks and trampled by their feet, pressed into the parquet and hive grounds. But hundreds more would take their place. There were too many data points for the Hadoop war-elephants alone.
But the dawn will come.
When the night seemed darkest, the Witchers heard thunder, and the skies turned red. The Sparks were on the move.
Riding into the parquet and hive turf, impaling scores of data points with their long SIMD lances and chopping data off with their Scala swords, the Sparks burned through the enemy like fire.
The second line of Sparks would pick data off to be sent by the Coroutine Jugglers to JIRA. That would provoke even more data to cross the Event Bridge, but the third line of Sparks was ready for it - that data would be pierced by the rounds provided by the Red Port of Redis, and sent back to JIRA - for good.
They fought for six days and six nights, taking turns so that the battles would not stop. And then, silence. The day was won, all the data crushed into hive and parquet.
Short-lived was the relief. The Witchers knew that the enemy in combat is but a shadow of the troubles that approach. Politics and greed and grudges are all next in line. Are the Witchers heroes or marauders? The aftermath is to come, and I will keep you posted.4 -
We often rant about people who think that because we can program we can do everything with computers.
But I have to admit that when I get asked what I do, I often only say that I program or do something with computers. I usually don't get more specific, because it's so hard to explain to someone who doesn't know anything about the subject that I would have to explain the basics every time. And I'm just too lazy for that.
It's nice when people ask me how it's going at work, but I probably won't say anything more than "ok" or "fine", because my day was fucked up by a memory alignment bug in the chainloader, and I don't have the patience to explain what both of these things are and why they fucked up my day. -
A bug is born
... and it's sneaky and slimy. Mr. Senior-been-doing-it-for-years commits some half-assed shitty code and blames the failed tests on the availability of CI licenses. I decided to check what's causing this shit nevertheless; turns out he forgot to flag parts of the code consistently using his new compiler defines, and some parts would get compiled while other needed parts wouldn't .. Not a big deal, we all make mistakes, but then he rushes to the Teams chat directing a message at me (after some earlier nonsensical argument about the merits of cherry-picking vs rebasing):
Now all tests pass, except ones that need CI license. The PR is done, you can use your preferred way to take my changes.
So after I spot those missing checks causing the tests to fail, as well as another bug in yet another test case, and yet another disastrous memory-related bug, none of which were detected by the tests of course .. I ponder my options .. especially based on our history .. if I say anything he will get offended, or at best the PR will get delayed while he is in denial arguing back even longer, dependent tasks will get delayed, and the rest of the team will be forced to watch this show in agony. He also just created a bottleneck by putting so many things at stake in one PR ..
I am in a pickle here .. should I put in review comments and risk opening a can of worms, should I just mention the very obvious bugs, or should I do nothing .. I end up reaching out to the PM and explaining the situation. In complete denial, he still believes it's a license problem and goes on ranting about how another project suffered the same fate .. bla bla bla chipset ... bla bla bla project .. bla bla bla back in whatever team .. only when I start telling him:
These issues were even spotted by "Bob" earlier, since for some reason you just dismissed whatever I said ..
("Bob" is another more sane senior developer in the team, and speaks the same language as the PM)
Only now do I get his attention! He then starts going through the issues with me (for some reason he thinks he is technical enough to get them) .. He now believes the first few obvious bugs, to some extent .. but the more disastrous bug, he is having a really hard time wrapping his head around .. Desperate as I am, I suggest we just get this PR merged for the sake of the other tasks, after maybe fixing the obvious issues, and meanwhile we create another task to fix the bug later .. here he chips in:
You know what, that memory bug seems like a corner case; if it won't cause issues down the road after merging, let's see if we even need to open an internal fix or defect for it later. Only customers can report bugs.
I am in awe of how low the bar can get. I try again and suggest we at least leave a comment for the next poor soul running into that bug, so they won't be banging their head against the wall for 2hrs straight trying to figure out why store X isn't there unless you call something last, or never call it, or shit like that (the sneaky slimy nature of that memory bug) .. He dismissed even that, and instead went on saying (almost literally again): It is just that Mr. Senior had to rush things and communication can be problematic sometimes .. (bla bla bla) back in the "Sunken Ship Co." days, we had a team from the open source community .. then he makes a very weird statement:
Stuff like what Richard Stallman writes in Linux kernel code reviews can offend people ..
Feeling grossed out, with the weird taste in my mouth I only get on a bad hangover day, all sorts of swear words and profanity running through my head like a wild hungry squirrel on hot asphalt chasing a leaky chestnut transport ... I tell him whatever floats your boat, but I just feel really sorry for whoever might have to deal with this bug in the future ..
I just witnessed the team giving birth to a sneaky slimy bug .. heard it screaming and saw it kicking .. and I might live long enough to see it grown up, having a feast with its bug buddies in this stinky swamp of Uruk-hai piss and Orc feces.1 -
We have this kettle that says "perfect memory" on the box. So does that mean it has ECC? I would hate for a bit flip to occur and make my tea too hot.2
-
Pretty much a sort of research work. The first assignment was: "look, we have this CAD viewer, but we want to eventually optimize the structure of the mesh, so here's this method for minimizing the memory footprint. Try implementing it and integrating it with our application."
PS: the method uses triangle strips, where the next triangle reuses two vertices of the previous one, theoretically reducing the memory footprint of the mesh by 2/3 if the mesh is fully optimized. In the end, due to memory and performance constraints (this had to run on the first-gen iPad) and the overall application architecture, on-the-fly striping was unfeasible and gained no benefit, because striping an arbitrary mesh is a fucking hard task.
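To make the 2/3 figure concrete: a plain triangle list spends 3 indices per triangle, while a strip reuses the previous two vertices, so N triangles cost N+2 indices instead of 3N. Decoding a strip back into triangles is the trivial half; generating good strips from an arbitrary mesh is the hard half. A tiny sketch of the decoding direction (in TypeScript just for brevity, with the usual even/odd winding flip):

    // Expand a triangle strip into independent triangles.
    // A strip of N triangles holds N + 2 indices; a plain list needs 3 * N.
    function stripToTriangles(strip: number[]): [number, number, number][] {
      const tris: [number, number, number][] = [];
      for (let i = 0; i + 2 < strip.length; i++) {
        // Every other triangle is flipped to keep a consistent winding order.
        tris.push(
          i % 2 === 0
            ? [strip[i], strip[i + 1], strip[i + 2]]
            : [strip[i + 1], strip[i], strip[i + 2]]
        );
      }
      return tris;
    }

    // 4 triangles encoded in 6 indices instead of 12:
    console.log(stripToTriangles([0, 1, 2, 3, 4, 5]));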
Another one was an implementation of smooth shading by recalculating vertex normals at runtime.5 -
We had 1 Android app to be developed for a charity org, for data collection in a groundwater-level-increase competition among villages.
The initial scope was very small and feasible: around 10 forms with 3-4 fields each, to be developed in 2 months (1 for dev, 1 for testing). There was a prod version which had similar forms with no validations etc.
We had received the prod source, which was total junk. No KT (knowledge transfer) was given.
The existing source had spelling mistakes, in the era of spell/grammar-checking tools.
There were rural names for classes and variables, written in a regional language using English letters, and that regional language is only somewhat known to some of the developers; even they don't know the meanings of those rural names. This cost us greatly in visualizing the data flow between entities. Even Google Translate wasn't reliable for this language, due to low Internet penetration in that language's region.
OOP wasn't followed, so the exact same code existed in 10 places. If an error or bug needed to be fixed, it had to be fixed in all those 10 places.
There were no foreign key relationships in the database, while actually there were logical relations among the different entities.
No created/updated timestamps in records on the app side to provide an audit trail.
A small part of the existing source was quite good, with Fragments, MVP etc., while the other part was ancient Activities with business logic inside.
We have to support Android 4.0 to 9.0 on many screen sizes and resolutions, without any target devices issued to us by the client.
Then the Corona lockdown happened, and during it the client-side professionals suddenly became over-efficient.
The client started adding requirements, like very complex validation with inter-entity dependencies. Then they started filing bugs from the prod version against us.
Let's come to the developers' expertise:
2 developers with 8+ years of experience who don't know how to resolve conflicts in a git merge - conflicts created by them alone, due to not following git best practices, like only appending new implementations to existing classes for easy auto-merging etc.
They think that handling click events is called development.
They don't want to think about OOP or well-structured code. They mostly don't want to re-use code, and when they copy-paste, they think it's called re-use.
They wanted to follow old-school Java development within the memory-scarce Android app lifecycle on an end user's phone. They don't understand memory leaks, even when they're pinpointed by memory leak detection tools (LeakCanary etc.).
Now 3.5 months are over, that competition was called off for this year due to Corona, and development is still ongoing.
We are nowhere close to completion, even for the initial internal QA round.
On top of this, nothing is billable, so it's like financial suicide.
Remember, whatever is said here is only 10% of what we faced.
- An Engineering lead in a half billion dollar company.4 -
I usually crib about how stupid people are and how I struggle to stay afloat.
Let's switch gears now. A post about some good people, products, and processes.
You know what the common theme here is?
The goodness here cannot be measured. Your first interaction with them makes you feel so comfortable that you start feeling butterflies.
These people just keep on giving. They are selfless. They are pure. They actually care.
And when you think it's done, then they give you some more.
What blows me away is, they don't expect or accept anything in return. Absolutely nothing. Not even a simple thank you.
And they are like wizards. They walk into your life when you least expect them but need them the most. And when the task is done, they'll be gone before you even know it.
No lingering, no drama, no bullshit. Just pure goodness.
Like my ex-lead at my current company. Or the very senior guy in the neighbouring team (the one they were initially going to hire me for), who also happened to interview me - he is a gem.
He takes care of me like his own younger brother. Supports me and always answers my queries no matter how occupied he is.
And it's the same with good products and processes. They feel effortless: so smooth, and they add exceptional value to your existence. They give rise to wonderful companies.
You'd never experience a single negative aspect of them. No matter how much you try, things will just keep getting better until they don't need to.
And then they'll be long gone. Never to be seen again and never to be forgotten.
You cherish them only in your memory and wish they lasted longer. But they didn't because the purpose was served.
Such people and experiences inspire me. They push me to become a better human.
No matter how the world is or how it treats me, I must always live with high values and be a better version of past self.
The other evening, I was conversing with my mother where we spoke about some family friends who are insanely wealthy but humble and kind.
Mom and I mutually agreed that they don't have such good traits because they are wealthy, but they are wealthy because they live with humility, kindness, and pure intentions.
World is surely a beautiful place because of such people and I aspire to be one. May lord guide me well :)3 -
Some customers are nice.
I've been working with a customer to enable a feature in the database. It was tough, because the escalation from support was your standard 'Customer wants apples in the T-1000, please do the needful'. After several emails back and forth, we reached an agreement about what needed to be done.
This is something I'm sure can't be done. I test it on my local install: yep, confirmed, that's normal behaviour. The customer, however, is stalwart - he's suggesting changes to the database that would potentially give him what he needs. I figure if he's going to this much effort, I'll confirm with our product specialist to see if there's a way around it.
Five minutes later I'm emailing the customer with an apology as I have unwittingly never known of, or committed to memory, the existence of a distinctly non-hidden check box that enables this exact function. I pass this box several times a week at least, and I've worked on this software for two and a half years. Never have I needed to use it, so my brain just processed it as background imagery.
The customer just responded with the kindest email, absolving me of my sins and thanking me for my humbleness and for my time.
I want him to have more problems so I can work with him more.2 -
What does GPT-3 tell us about how our brains work?
I just read an interesting article (link below) about how it does on the Turing test. I've had this inclination for a while that state-of-the-art AI is "incomplete", in the sense that we have some of the systems needed to make AGI, but not all of them. One of the comments they make is that "GPT-3 often finds it easier to write code to solve a programming problem, than to solve the problem on one example input", and that hits the nail on the head. We can codify situations, describe the rules, put them in memory and run those rules in our head. We can manipulate the input to see how it'll change; we can spot from a problem statement what the rules are, instead of focusing on what the answer is. Anyway, light bulb moment shared.
Link: https://lacker.io/ai/2020/...9 -
Fellow Deviants, I need your help in understanding the importance of C++
Okay, I need to clarify a few things:
I am not a beginner or a newbie who has just entered this community...
I have been using C++ for some time and, in fact, it was the language which introduced me to the world of programming... I have since switched to Java, as I found it much better for application development...
I already know the obvious arguments given in favour of C/C++, like how it is much faster and more memory efficient than other languages...
But at the same time, C/C++ exposes us and doesn't protect us from ourselves... I hope you understand what I mean to say...
And I guess that is a fair tradeoff for the kind of power and control that these languages (C/C++) provide us...
And I also agree with the fact that it is a language that ideally suits our needs if we wish to deal with compilers, graphics, OSes, etc., in the future...
But, what I really want to ask here is:
In this day and age, when hardware has advanced so much that, technically, memory efficiency and execution speed are no longer the topmost priorities... These were the reasons for which C/C++ were initially created...
In today's time, the human concept of time matters more, and hence syntactically less complicated languages like Java or Python are much preferred, especially for domains like application development or data science...
So, is continuing with C++ an endeavour worth sticking with for the future, or is it not required...
I am talking about this issue since I am in a dilemma about the use of C++ in the future...
I would be grateful if we could talk about this keeping AI, machine learning and algorithm optimisation in mind... since these are the fields in which I am interested...
I know that my question could have been posted in a better way... but the chaos that is present in my mind regarding this question doesn't allow me to do so...
Any kind of suggestion or thoughts would be welcome and much appreciated...
P.S: I currently use C++ only for competitive programming or challenges...28 -
So, another story about college and the stupid team assignments that I end up being responsible for.
We had an assignment in the Operating Systems 1 course about memory management, and we are a team of 3. Then came the time to discuss this assignment with the TA, and the night before I had to stay up all night finishing a project in software engineering (the course literally just hands us a description of a big project, because that's what it teaches, and I had to finish it in one all-nighter, alone, because my teammates just gave up).
When the discussion time came I was really tired. The TA asked me something really simple, I answered, and then she told me I was wrong. I wondered a bit and then said no, what I said was right! She then asked my teammate (who is supposed to be a good friend of mine) "did he say the right thing?", and his answer was a definitive "NO, he's wrong". Then he started giving the right answer, which I swear was the same thing I had said, just worded differently. So I started saying again that I was right and that I had said the same thing in a different way, and she took that as an insult and said that I was shouting at her and being disrespectful.
When we finished, I asked my friend if he had heard me say it wrong, and he said "I'm sorry, but I didn't even hear what you said, and I was afraid". WHAT THE FUCK. He just said I was wrong to please her and make her feel like she was right, and I had to be the wrong one, even though I said it right. But NOoo, her pride is more important.
All this was last semester, and the second semester just started today. I walk into Operating Systems 2 and guess what? The TA got her doctorate and is now the professor for OS 2, when she doesn't even understand anything.
Really, FUCK the academic system. It feels more like a grind than actually gaining mastery of a subject.2 -
How is it a thing that developing a desktop app nowadays requires an enormous amount of RAM? I started working on an Electron project, and the whole thing takes up 3-4 GB of RAM when running, and that doesn't even factor in my IDE or anything else.
But the packaged app does not go over 400 MB, although we have had memory leaks in the past10 -
!rant
How to earn a lot of money as a programmer?
So this question might sound a little naive and too simple, but earning a lot of money is what we all want after all, right? Collecting experiences from people in the business should be a good idea.
So this is the position I am in:
I am a German student in my 13th year of school (which means I will graduate this summer), and I am very interested in information technology. I know C++ pretty well by now, and I have built a rendering engine using OpenGL for a game I want to make, which I am very proud of.
I would love to turn this passion into my profession, and that's why I plan to attend a dual course of computer science next year (dual means that I will be employed at a company (or similar) in parallel to the studies).
But what direction should I be going in if I want to make big money later on? I am ready to spend a lot of time and work on this life project, but I don't know which directions are the most promising. I hate being a tiny gear in a huge machine that just has to keep spinning to keep the machine alive; I want to be part of a real project (like most people, probably) and possibly sell a product (because I think that is how you really make money).
Now I know there is no magic answer to this, but I bet many people here have experiences they can share, and this could help a lot of people direct their path in a more success-oriented way.
I personally am especially interested in fields which are relatively low-level and close to memory (C++), go hand in hand with physics and 3D simulation, and are somewhat creative and allow new solutions. (These are no hard lines; I just thought I should give a little direction on what I already know and what I am interested in.)
But really, I am interested in any work you are likely to earn a lot of money with.12 -
At a startup where the software was built haphazardly, because the developer thought he'd be the sole maintainer for life. The dude antagonized me at every turn and refused to help me familiarise myself with his code. He eventually left the majority of the work to me, and his dedication to the work continued to dwindle until he threw in the towel.
After his departure, we surprisingly grew fond of each other, discussing code concepts at length. He was in the habit of refusing to read any of the articles I sent him, or to answer open-ended questions, citing the claim that they required thinking and he was busy. I didn't take any of this to heart.
But it accumulated, and I deleted his number. I didn't bear him any ill wishes, but it wasn't respectful to myself for him to remain in my space. One day, I was looking for a point raised during our conversations and went rummaging through our chats. Going down memory lane opened scars I'd long forgotten. I was embarrassed to see the way I had forgotten all about it. I should never have had anything to do with someone like that.
He contacted me for a favour less than a year after I deleted his contact. I didn't even think of declining. But this evening, I randomly remembered how he once saw a defect in my code, promised me that the code would fail in production, and resisted all pleas to point out what it was. I don't know if I hate him for his dastardly acts. What I feel most deeply is sadness/bitterness that I got to experience all that2 -
fuck oracle. fuck my company.
Using Oracle VM Manager/Servers to host an Oracle phone transfer solution, without support coverage from Oracle.
Requiring Unix sysadmins to update to the latest release, and not telling us that we have no coverage from Oracle if anything goes wrong.
Guess what... We've updated to Oracle VM Manager/Server 3.4.5, which was released this year, and it uses the fucking Xen hypervisor version 4.4.4, which has been deprecated and dead since who knows when. The latest release of Xen is 4.11. But that is not an issue; whatever, enterprise, legacy software, etc.
This fucking update introduced a memory leak on the hypervisor, which has been reported in the Xen 4.4.4 history. Furthermore, we have no support from Oracle, which means I have to dig through mailing lists and the limited information on the net, since Oracle has a freakin' support wall on nearly every one of the major bugs found in that shitty software.
I have no idea whether any newer version of Xen will work with that old Oracle Linux kernel or not.
Furthermore, Oracle provided great documentation on how to roll back the fcking update: reinstall the hypervisor. Riiiight. Xen does not have an export/import feature.
eh1 -
Boss needs certain stats pulled from the database once a year for a board meeting. This time I delegated it to a junior DBA/sysadmin. He looks at my 3-year-old docs, which I had hastily jotted down and into which I'd pasted results along with my rambling notes from back then. Mostly they were just to jog my own memory, not to be a really neat, clean instruction guide. He does the queries correctly, but in the ticket for the boss he also pastes all my notes. Boss gets confused: "what is this other number, I don't get it?!" We have to have a meeting, the 3 of us, and waste an hour or so just to figure out what went wrong; finally I realize what the junior guy accidentally did. Moral of the story: to avoid baffling the non-techs, always simplify, simplify, simplify. Alternate moral of the story: before delegating a task that seems old hat to you, always review your notes/docs and make sure they're ready for someone else to use.1
-
I hate people who think they are always right.
A coworker who seemed to be a friend turns out to be an emotionally needy narcissist who thinks he is a perfect human being and the best example of how to live.
Long story short, we did some bonding over alcohol and cigarettes, especially during a bad period in my life where I had little self-confidence, was in a bad financial situation and overshared many details about my personal life.
And yeah, we also work as software devs in the same team, but I started avoiding working with him directly, because due to his seniority he overcomplicates things a lot, to the point where stuff gets postponed for months. Meanwhile I am a simple guy: I do my tasks, and if they are not up to the standard I just work on the feedback until I am. That's it. It's just a job for me; for him it's a way of life, and he considers himself to be basically an artist.
He's always trying to prove something to me, showing that the "long way" is the best way and so on. In reality I don't give a fuck about him. I live my own life and I have my own priorities. I work fulltime in one job, I also work part time as a freelancer, and in total I make about 20 percent more than he does. Before this job I owned my own company, where for 2 years I ran my own projects, which generated decent revenue. I know what hard work is and how to sacrifice in order to achieve results. I am more pragmatic, and I have some limitations on what I can be good at (I have a shitty working memory due to my ADHD), so I have systems in place; the bottom line is that I earn a decent living and my skillset is different. I agree that in some ways he is better than me, but the dude has such a massively inflated ego that he now thinks he has unlocked some sort of universal wisdom and is suddenly experienced in every field of life, and his opinion is the right one.
This guy takes massive pride in what a good software engineer he is, and in every topic or interaction he tries to one-up me, most of the time over pure preference or a 0.0001 percent performance increase. The dude is basically a big walking ego, and since "we are close now" his ego started bleeding into the personal relationship.
In my personal life, I'm in a stable relationship, thinking of proposing soon and getting married. I already co-own an apartment with my girlfriend. Everything is serious and planned; I'm soon to be 30. He is the same age, but he still thinks he's young hot shit, and all he cares about is getting shitfaced a couple times a week after work; he doesn't really have any other hobbies. He has a girlfriend but I don't see any future there, TBH.
So what I did is start putting some distance between us. No more drinking every week with him; maximum once in 2 or 3 weeks. I started working from home more. I also stopped sharing my personal life with him. Whenever he thinks he is right I just go along with it and don't pay attention to his emotional manipulations. I just hope one day he fucks off completely and I won't give in to his gaslighting. Maybe in a few months I will be leaving this job, so I will never have to deal with him again.
Lesson learned: don't be vulnerable with coworkers you bond with only via alcohol. -
¡rant|rant
Nice to do some refactoring of the whole data access layer of our core logistics software. Let me tell a story.
The project is around 80k lines of code, with a lot of integrations with an ERP system and an SQL database.
The ERP system is old, with a shitty API to match: only static methods, through a wrapper to a C++ library.
Imagine an order table.
To access an order, you would first need to open the database by calling Api.Open(...file paths) (yes, it's a fucking flat-file database).
Now the database is open. Next you would open the orders table with the method Api.Table(int tableId), and in return you would get an integer value: the pointer.
Now for the actual order. First you need to search for it by setting the search parameter to the column ID of the order number, while checking every call for some BS error code:
Api.SetInt(int pointer, int column, int queryValue)
Then call the find method.
Api.Find(int pointer)
Then, to top off this shitcake of an API: if it doesn't find your shit, it will use the "close enough" method of search.
And now, to read a single string 😑
First you look in the outdated and incorrect documentation, given to you by the devil himself, for the column ID, to find the length of the column.
Then you create a string variable of ALL FUCKING SPACES.
Now you call Api.GetStr(int pointer, int column, ref string emptyString, int length).
Now you have passed your poor string by reference into the API's demon orgy.
Then some more BS error-code checking.
Now you have read a string value 😀
Now keep in mind you repeat these steps for all 300+ columns in the order table.
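Put together, the ritual looks something like this. A hedged C# reconstruction from the call names above, where 'Api' is the vendor's wrapper class and every signature, table ID and column ID is a guess for illustration only:

```csharp
using System;

// Hypothetical reconstruction of the flat-file API dance described above.
// "Api" is the vendor's C++-wrapper class; all ids below are made up.
const int OrderTable      = 7;    // assumed table id
const int ColOrderNumber  = 0;    // assumed column ids
const int ColCustomerName = 12;
const int CustomerNameLen = 40;   // length dug out of the (wrong) docs

Api.Open("orders.dat", "orders.idx");        // flat-file "database"
int pointer = Api.Table(OrderTable);         // integer handle to the table

Api.SetInt(pointer, ColOrderNumber, 10457);  // set the search value
if (Api.Find(pointer) != 0)                  // may "close enough" you
    throw new Exception("BS error code");

// Reading one string: pre-fill with spaces, pass by reference.
string name = new string(' ', CustomerNameLen);
if (Api.GetStr(pointer, ColCustomerName, ref name, CustomerNameLen) != 0)
    throw new Exception("more BS error codes");

// ...and again for the other 300+ columns.
```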
News from the creators: SQL Server? Yes, SQL is good, so everything will be better?
Now imagine the poor developers who got tasked with converting this shitcake to use an MS SQL server... and convert it they did.
Now I can honestly say that I found the best SQL-server benchmark tool. This sucker creams out just above ~105K SQL statements per second at peak, sustaining ~15K per second over the 1.5 seconds it takes to read one order. 1.5 seconds to read less than 4 fucking kilobytes!
Right at that moment I realized that our software would grind to a fucking halt before even thinking about starting, and that me, myself and I would be tasked with fixing it.
4 months later and two weeks until functional beta, here I am. We created our own API on top of the SQL server 😀
And the outcome of all this...
It fixes bugs older than a year, forces rewriting part of the code base, forces the removal of dirty fixes, and allows proper unit and integration testing, even database testing with a snapshot feature.
The whole ERP system could be replaced with ~10 lines of code on the application side (given the same relational structure) by adding it to our own API library.
The best part is probably the performance improvement 😀: up to 4500 times faster and 60 times less memory usage, all of it managed memory. -
So we’ve taken over from a project team that disbanded... read: “cut their contracts because fuck this, I can earn more working for better people”.
Me and one other guy have been tasked with saving this heap of shit.
Obviously the project guys left saying “it’s nearly done, just this one feature”. Because cut contracts are easier to deal with if “everything is almost done”.
We jump on and find that's not the case at all... this thing is a beast, a big old stats-analysis program... so we're like "cool, let's see what's going o...OH MY GOD".
The "recalculation" function was core to this POS. The contractors had done it in C# through Entity Framework... it took 24 hours to run over a reasonably small data set that was due to double every 2-5 years.
So... here's the deal: it ran overnight... then failed. And no cunt had noticed. Entity Framework threw a "can't commit because I'm muddled up as fuck, did you really just load the whole db into EF in memory to work with it?" exception.
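From that description, the recalculation sounds like the classic "pull the whole table into the change tracker" pattern. A hedged sketch of what that tends to look like (the context and entity names here are invented, not their actual code):

```csharp
using System.Linq;

// Hypothetical sketch of the anti-pattern described above: materialise
// every row into EF's change tracker, mutate, then commit in one giant go.
using (var db = new StatsContext())          // invented DbContext
{
    var rows = db.Measurements.ToList();     // entire table into memory
    foreach (var row in rows)
        row.Score = Recalculate(row);        // tracked-entity churn
    db.SaveChanges();                        // 24 hours later: throws
}
```

Set-based SQL, or at least batched non-tracked reads, is the usual fix, which is presumably roughly what "doing the job properly" came down to.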
Cue 6 months of me and my lead doing the job properly.
Anyway, the failure: I ended up in hospital again with a Crohn's flare-up... about 5 months in.
Fuck-all to do with all this nonsense, I just wanted to tell a story. It was an interesting/fun project to fix and my lead was a legend... so happy days.
Similar story, different set of contracted devs... they'd been defining requirements with the business users using the term "Risk", which the business users understood to mean a group of risks.
The domain model had been written RiskGroup<>— -
** this means words are muted **
Friday:
I send a mail the client a Google doc with elaborate details about evaluation of an Android tablet from a Chinese manufacturer.
Monday:
The client is upset, he says "You say there is no GPS chip on the tablet while the manufacturer says otherwise"
Me- "I have clearly mentioned that it has a GPS chip"
Client- Opens the Google doc, points to a sentence. Looks at me like I did something horrible.
Me - **This guy is either word-blind or something else is wrong with him; the line reads 'GPS chip available'**
Me- "Look, it says 'GPS chip available'.
Client- **Blinks n blinks again** "Alright, but why did you share a Google document, why not PDF, docx"
Me-**Politely** "You can download the document in any format, look I will show you..."
Client- "It should have been in the mail itself ideally"
Me- **WTH** "We normally maintain a document for such things to keep everything organised, but if you want I will put everything in mail itself"
Client- "Hmm.. do both from next time"
Me- "Alright" **BS**
Client- "Why is the new feature taking so much time"
Me- "As planned earlier, we going to deliver it tomorrow"
Client- "Why not today??" **Gives a strange look.**
Me thinking - **Enough**
Me- "See, I am trying to integrate a smarten with a socket connection, reading it's data via exposed APIs that are hardly documented, we need faster performance so I need to implement caching, multi threading, offline handling, multiple processes to avoid memory fluctuations, sync adapter to sync data...."
Client- "Ok ok ok, it's fine if you give working build tomorrow"
Me- "Ok, fine"
#limit -
2nd part to https://devrant.com/rants/1986137/...
The story goes on...
After I found more bugs that seem to be related to the communication break, and took a closer look, I sent detailed logs of my research and today we had a conference call.
"We have 2,5 million user, our system is widely-used and there is no plan to change it" they said.
And "We cannot reproduce the issue, but even if there is one, you will have to work around the problem, because we cannot make changes on our side" was one answer
As well as "If we would make changes, we will have to re-certify everything"
So I said we told 'em about the issue to let them improve their system. And I can work around it, I already figured out a solution for my side, but if there is a bug, they'd better fix it for future releases.
And with my additional research I have a bad vibe that some kind of memory leak is involved in their "certified" implementation, and that it could trigger various other problems.
But it is as always, if I try to be nice, I just get kicked in the ass. I should really be more of an asshole. -
A full 4-level page table on x86_64 comes out to somewhere around 687 GB. Allocating it at a fixed size obviously won't work...
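For scale, a rough count of just the table pages, assuming 4 KB tables of 512 eight-byte entries over the full 48-bit space. Exact totals depend on what you include, so treat this as a ballpark rather than the precise figure above:

```csharp
// Back-of-the-envelope: pages needed to hold every table of a fully
// populated 4-level hierarchy (512 entries x 8 bytes = 4 KB per table).
const long Entries = 512, TableBytes = 4096;
long pts   = Entries * Entries * Entries;  // level-1 tables: 512^3
long pds   = Entries * Entries;            // level-2 tables: 512^2
long pdpts = Entries;                      // level-3 tables: 512
long total = (pts + pds + pdpts + 1) * TableBytes;
System.Console.WriteLine($"{total / (1L << 30)} GiB"); // ~513 GiB of tables
```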
So now, when allocating virtual memory pages for an address space, sooner or later I have to allocate one page to hold a new page table... which has to be referenced by the global page table before I can use it... in which there isn't any more space... which is exactly the reason we allocated a new page in the first place... so I'm fucked, basically. -
My first thought about programming was in fourth class in school. I was 10, and together with a friend we were planning to build a robot.
When we had a basic idea of how the mechanics would work (theoretically, if maybe not practically sound), we started to consider how to control it. We had heard about computers but had never seen one, yet we figured out you could not just say "go shopping"; you had to break the problem down. Doing that, we came to the conclusion we would have to start with getting it to take a single step.
We never got further as my family moved and I switched schools.
Later the same year I got to play with an actual computer, the Sinclair ZX80; 80 for the year.
A monster with 512 bytes of internal memory... yes, bytes, not kilobytes.
And then things got going. After a few courses in BASIC I finally got my own Spectravideo 128 at 14 years old, and 2 years later I was teaching BASIC in evening classes. I have been working with computers and programming ever since. -
Why is NetBeans, or Java in general, so fucking bad at handling memory? I mean, I'm literally doing nothing in my code and I watch my IDE consume more and more RAM, to the point where it goes over 1GB and I have to close and reopen it to "flush" the memory.
It's 2017; how can we still not manage to use a sane amount of RAM when I open a barely-10MB project??
And it applies to everything related to Java: Android, Minecraft and other Java-based software...
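For what it's worth, you can at least cap the IDE's heap so it stops ballooning. The usual knob is netbeans.conf, where -J passes flags straight through to the IDE's JVM (the path and sensible values vary by version; these numbers are examples, not recommendations):

```
# etc/netbeans.conf -- example values only
netbeans_default_options="-J-Xms64m -J-Xmx768m -J-XX:+UseG1GC"
```

-Xmx is a hard ceiling, so the IDE will GC harder instead of creeping past 1GB; you trade the memory bloat for some CPU.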
I don't understand why we have so many apps on our smartphones. I mean, we don't use "apps" on the desktop; we just use the browser and visit the webpage. Why can't we do the same on smartphones? The sites do have mobile versions...
(In fact I do, but many people use apps like Twitter, Reddit and Facebook instead of just visiting the webpage.) For me it's just a waste of memory. -
Was reading something about delusional disorder, and it got a bit scary because it made me question myself. Now let me tell you why.
I have a bad memory when it comes to trivial stuff. And I am, by occupation and therefore on a daily basis, creative and imaginative. Having a pretty strong imagination means I often have to ask myself "did that really happen or did I imagine it?" And given my anxiety, I imagine all kinds of scenarios before they happen. (Parallel universes got nothing on me 😎)
So now I'm wondering: where is the line between imagination and delusion, and how can you tell what's real and what's not, be it offline (distorted memory) or online (schizophrenia)?
One idea is that video recording could help confirm things, but we read emotions and vibes in real time, and often those can't be recorded.
... Idk. Maybe I'm overthinking it. ¯\_(ツ)_/¯
Thank you for reading my half-baked thoughts! -
I need to estimate how much RAM and how many CPUs my team will need next year for our apps... which have yet to be built.
We load a lot of data feeds with batch processes running on a few large machines (some can use like 30 GB RAM at times...), which should be a lot less once we get the data in real time, I hope...
But I'm wondering how to estimate well... I sort of did a worst-case analysis where I just multiply and sum #CPUs/memory x nodes x approx. apps...
Comes out to be like 600 CPUs and 800 GB RAM... so I'm wondering if that's ok...
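The worst-case math itself is just a sum of products. A hedged sketch with entirely made-up numbers (the real inventory is whatever sits in the capacity spreadsheet):

```csharp
using System;
using System.Linq;

// Hypothetical worst-case sizing: sum per-app peaks across instances.
// Every name and number below is a placeholder, not real inventory.
var apps = new (string Name, int Instances, int CpusEach, int RamGbEach)[]
{
    ("feed-loader", 20, 8, 30),
    ("normalizer",  15, 4, 10),
    ("publisher",   10, 4,  8),
};
int cpus  = apps.Sum(a => a.Instances * a.CpusEach);
int ramGb = apps.Sum(a => a.Instances * a.RamGbEach);
Console.WriteLine($"worst case: {cpus} CPUs, {ramGb} GB RAM");
```

The obvious refinement is multiplying by expected concurrency instead of assuming every app peaks at once; that's usually how a worst case shrinks to something an ops team will sign off on.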
RAM is ok, but the # of CPUs is way higher, because right now all the apps basically run on their own machines... -
1) Learning little to nothing useful in formal post-secondary and wasting tons of time and money just to have pain and suffering.
"Let's talk about hardware disc sectors divisions in the database course, rather than most of you might find useful for industry."
"Lemme grade based on regurgitating my exact definitions of things, later I'll talk about historical failed network protocols, that have little to no relevance/importance because they fucking lost and we don't use them. Practical networking information? Nah."
"Back in the day we used to put a cup of water on top of our desktops, and if it started to shake a lot that's how you'd know your operating system was working real hard and 'thrashing' "
"Is like differentiation but is like cat looking at crystal ball"
"Not all husbands beat their wives, but statistically...." (this one was confusing and awkward to the point that the memory is mostly dropped)
Streams & lambdas in java, were a few slides in a powerpoint & not really tested. Turns out industry loves 'em.
2) Landed my first student job and got shoved onto an old legacy project nobody wants to touch. I'm isolated, not taught or helped much, and I do poorly. The boss gets pissed at me and is unpleasant to work with and get help from. It gets to the point where I wonder whether he's deliberately making a show of what a nuisance I am. He meddles with a logo I'm fixing, getting fussy about individual pixels and shades, making a big deal of knowing how to use GIMP and of sitting with me, micromanaging. The monthly one-on-ones were uncomfortable and had him metaphorically jerking off about his life story, career-wise.
But I think I learned that in the code-monkey industry you gotta be capable of learning and making things happen with effectively no help at all. It's hard as fuck though.
3) Every time I meet an asshole who knows and has accomplished more than I do (that's a lot of people) with higher TC than me (also a lot of people), I despair as I realize I might sound like that without realizing it.
4) Every time I run into one of the glaring gaps in my knowledge, I'm ashamed of the fact that I have plenty of them. Cargo-cult programming.
5) I can't do leetcode hards. Sometimes I suck at whiteboard questions unlike anything I've seen before.
6) I also suck at some of the trivia questions in interviews. (Gosh I think I'd look that up in a search engine)
7) Mentorship is nigh non-existent. Gosh, I'd love to be taught stuff so I'd know how to make technical design/architecture decisions and know the tradeoffs between tech stacks. So I can go beyond being a codemonkey.
8) Gave up and took an ok job outside of America rather than continuing to grind then try to interview into a high tier American company. Doubtful I'd ever manage to break in now, and TC would be sweet but am unsure if the rest would work out.
9) Assholes and trolls on StackOverflow. It feels quite hard to ask questions sometimes; they get closed, marked as dupe, or downvoted without explanation. -
So I'm on my morning stroll. Walking, enjoying, watching the world around me.. It's nice how cherries blossom. They smell very tempting to stop there and enjoy the moment. Some flowers under the cherry...
Why do plants blossom again? Oh yeah, that's right: to exchange some specimens in order to grow fruit and seeds. To have their offspring. Just like every other living macroorganism [with a few exceptions ofc]. Life has no other way to survive but to exchange genetic material between two parties, and only then trigger the growth of new life.
And that is a very strict rule. No more, no less: it takes exactly 2 organisms to make new life. But why is that? If my memory serves, theory of evolution says that life is like business: cut the losses and let the profits run. Over time it discards everything not required for the organism in order to save energy, and only successful new "investments" remain in the genome. The unsuccessful ones die before they proliferate, so the bad genes shall not survive.
It also says that very simple things, very simple changes lead to very complex outcomes. Us. Life.
But what is simple about life having to need 2 other lives? Exactly 2. It's either simple or efficient, depends on perspective. BUT IT IS NOT BOTH. Look at cells. They just split in half and multiply. Dead simple. It takes one of them to make another one. But with mammals, birds, reptiles, plants and other macroorganisms [excpt fungi] this is not the case! Why?!? I can't think of any scenario where two generic microorganisms, following some dead simple mutations, would come up w/ something that inefficient and overly complex. Like they're living on their own, multiplying by division, and smth very simple happens and they can no longer divide, only mate in pairs. The primitive, efficient and simple mechanism gets terminated and replaced with a different one, incredibly complex one!
Sure, we have protozoa which have similar reproductive mechanisms. They exchange genetic material to multiply.
But look at our, human cells. They dont need that! Look at some reptiles, some plants that only take one to make another. They don't pair as well! It's simple. Efficient. Why do protozoa need 2 for the species to survive?
It's not simple and efficient [tho helps us adapt, but its not my point for now]. See, things like this make ne wonder. What if we, the life, are not as accidental as we think? What if this whole mechanism was set off by someone or something billions of years ago? That's mean there are much older, much more superior cognitive organisms than us. What if protozoa was version 3 of new life [the first two did not survive]? Viruses - v2? Sea creatures - v3, reptiles - v4, and so on until they came up with us, mammals? That'd surely mean we are not alone in this universe. Are they watching us? Will they create a new species any time soon? What's our purpose, are we just an experiment?
And so, from cherry blossoms to existential dilemma, my stroll is over. Time for breakfast :) -
every time i buy a new phone, i feel this sense of extreme regret :(
i bought a moto g 5g phone last year in feb. it was so good. it didn't have any out-of-this-world cameras or funky stuff, but it gave decent performance and i couldn't want any other phone.
in october my mom's phone started giving issues, so i bought her a realme phone at half my phone's price. i couldn't spend any more because otherwise she wouldn't take it. she accepted the cheaper phone, and within 4 days she was cursing it. the phone had decent specs but would lag in certain apps like zoom, and wouldn't run some call recorder apps. in the end i swapped my phone with mom's, since i didn't care about zoom or the recorder.
now this shit realme phone's memory is around 60% full of my stuff, and it's showing its limitations. it auto-relaunches insta after a few minutes of usage, probably because it runs short of runtime memory (a 4gb/128gb device getting memory shortages. nice). its video quality is shit and the camera rarely takes good pics.
the thing i hate most about smartphones today is how they over-optimise the ui. this insta issue and the auto call recorders not working are simply down to the realme skin running over stock android. i had similar issues with a xiaomi device i bought for my dad some time ago. (fortunately my dad is more medieval, so that crap has not come back to me :'/ )
so overall i am buying a 3rd phone in 17 months.
This time it's a Samsung f23 and i'm worried that it's also going to suck. i was this 🤏close to buying a pixel 6 or even an iphone, coz i can afford them.
but the regret of buying such an expensive phone that will need replacement in 2 years made me rethink.
the only android os that has suited me best is stock, and as of now only 2 companies are making it: google and moto (*it's 100% aosp with 3 extra apps, but they can't say that, so they also state that they are not stock os). oneplus is also a brand i've heard makes a good os, but recently i heard they completely scrapped their os and are using oppo's software. plus, given the amount of tickets we get for notifications not working on oneplus, i'm sure their optimization is extremely aggressive.
so everything between a moderately priced phone (that will need a replacement in 2 years) and a flagship felt unnecessary to me, so i went ahead with samsung's shit phone. the f23 has almost the same specs as the moto, but it's again a heavily customised os. i wanna waste my money trying a custom os and declaring it shitty.
most of my friends that use samsung are fans of it, but they are also not very techy, so i guess it suits them well. i am the guy who installs nova launcher on his device first thing, so let's see what it brings to the table. from a 3rd-person p.o.v., its screens and camera images felt nice whenever i used their phones, so let's see what this brings to the table :( -
I struggled with whether to post this, but I feel like I have to. I didn't want to feed into the fear or give 'them' any more reason to argue against common sense, but I guess it can't be helped.
The reason I was gone for a while was because I went and got my vaccination.
Less than half an hour after getting the vaccine, I was in the ICU. The staff told me I had a stroke, possibly from clotting and inflammation. I couldn't feel my arm or anything below my shoulders. Yes, really.
Apparently I "died" for a little while, and when they brought me back I was in a coma for almost a week.
I'm back home now and I still don't fully understand what happened. I still have numbness and horrible headaches, and can barely think straight sometimes, but the doctors told me that I didn't suffer any permanent brain damage according to my scans.
They also told me I had old damage to my left and right temporal lobes, which makes sense because I have always had problems with short-term memory and other issues.
And I'm just at a loss how this could happen. I have no serious injuries. We were told this is safe.
And this is the exact reason I didnt want to post it, because now tards will come in and be "lololol serves you right vaxxer!"
If I had known the side effects were this bad, maybe I would have changed my mind, but no one told me! I mean, I think I still would have got it, because we have to protect vulnerable people, but still.
The hospital assured me it wasn't the vaccine and must have been an underlying condition, but I'm not so sure. I just happen to have a pre-existing problem that I don't know about, which causes a stroke and paralysis only half an hour after the shot?
And now I don't know if I'll ever be ok. The doctors warned me I may suffer more strokes and should avoid physically demanding tasks for a while. My primary job is construction (not by choice). Now I face the prospect of not even being able to work my existing job or do the things I love, like hiking, anymore. So much of the world doesn't make any sense right now and I just don't know what to believe anymore.
Tards will probably be in shortly to suggest I check for microchips or test fucking magnets on myself.
No, just stop. -
Years ago, when I was a kid, I was into making things to solve problems. Earlier in my life I remember seeing a neighbor shove 120VAC into the ground to get worms to come up to the surface. I also remember taking a worm, slapping it on one of the posts, and shocking the shit out of myself. Apparently that lesson did not stick.
As a teenager I wanted a device similar to this. So I wired a 120VAC plug and cord to a 1/4" audio connector. I threw this in a box and forgot about it. Years later my sister went through my things looking for a power plug for a synthesizer we had. It had an audio 1/4" plug for headphones. She told me she plugged this cord she found of mine into the synth and it started smoking. She went to pull the plug out and shocked the shit out of her. I don't think that the synth ever worked correctly after that.
Well, today I was thinking fondly about that story. I mean, who wouldn't think fondly about shocking the shit out of your sister (she didn't die, so it's okay)? However, it was dangerous. Really, really dangerous.
The lesson I can take from this memory is this: if you know a software (or electrical) interface is not safe, then don't build it. Someone will try to use that shit years later and really fuck some stuff up. I have to wonder: what kind of software traps have I built in the past that are yet to be discovered? -
Not exactly a story, since I was too young to remember, but my parents told me that I really enjoyed playing the games my father made for the good old Commodore 64 we had.
He basically had two 5" floppy holders full of his own games and programs he used to make. Unfortunately we only have the disks now. :(
The first memory of me using the computer though, is when my father bought a computer for his office (was win 95 with the "you are now safe to switch off your computer" message) and I was sneaking in to play with paint because it was so cool back then. -
I can work with Angular, even though it's a pain in the butt.
My current Angular job is actually the first job where my manager has decent human values and ethics. I like my team, and yeah, what we're building is shit. But it's only 30% shit because of Angular; another 30% is due to SAFe, and the rest is the usual stuff.
Still enjoy my job and respect my team.
But please do not expect me to pretend Angular is on a level comparable to React. Angular hasn't brought any actual innovation in most major versions, yet it still ships those breaking major updates at least twice a year.
Ivy might be awesome, but Angular telling the world 3 years ago to also ship Ivy-compatible compile targets for their libs/packages doesn't mean everybody cared.
And ngcc, the awesome compatibility compiler, mutates node_modules in place. So no parallel stuff, no using yarn2 or pnpm.
At the same time, React brought so many innovations into the frontend world while staying basically backwards compatible.
Not sure how the Angular partial compilation and whatever else needs to go on actually works, but it seems like hardly anyone really knows, so you can't use Vite or whatever other new tool.
And sure, if you're really good, you can write Angular without producing memory leaks.
But it's really hard. Do you know what's also quite hard? Producing memory leaks with React!
And sure, Angular Universal, which it feels like no one uses, is supposedly on a level comparable to an open-source product that's used all over the world, forms the basis of an open-source company, and is improved by thousands of issues day by day.
And sure, two kinds of change detection are a great idea. And yeah, pretending Angular comes with everything included makes it worth it that the API is fucking huge and that you're better off knowing nothing (since you'll have to read things up anyway) than knowing quite a lot, because assuming APIs work in a similar way and follow similar conventions will burn you...
Whatever... I work with it. I like the team, I like the company, even my boss. But please don't expect me to lie to you that this was a good idea, or that Angular is even remotely on the same level as React. -
* Customer reports bug.
* I fix the bug.
* This highlights another issue that I haven't got enough resources to fix.
* I revert the fix.
* <insert hacky workaround here>
We have code that invokes undefined behaviour (freeing memory twice), but somehow people have managed to build around it, and now everything depends on it to work.
FML. -
We specified a very optimistic setup for a data science platform for a client....
Minimum: one machine with a 16-core CPU and 64GB of RAM to process the data...
Client's IT department: Best we can do is an 8 core 16GB server.
Literally what I have on my laptop.
The data scientist doesn't use any out-of-memory data processing framework, e.g. Dask, despite being told it's the best way to be economical with memory; ipykernel kills the computation anyway because it runs out of memory.
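The out-of-core idea behind Dask boils down to streaming chunks and keeping only aggregates in memory. An illustrative sketch (in C# for consistency with the other examples here; their stack is Python/Dask, and the file and column are invented):

```csharp
using System;
using System.IO;
using System.Linq;

// Stream a huge CSV and compute a mean without materialising the data.
double sum = 0;
long count = 0;
foreach (var line in File.ReadLines("huge.csv").Skip(1)) // lazy, line by line
{
    sum += double.Parse(line.Split(',')[2]);             // hypothetical column
    count++;
}
Console.WriteLine($"mean = {sum / count}");
```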
Data scientist has a 64GB machine himself so he says it's fine.
Purpose of the server: rendered pointless. -
Perhaps as a tip for the junior devs out there, here's what I learned about programming skills on the job:
You know those heavy classes back in college that taught you all about data structures? Some devs may argue that you just need to know how to code and don't need fancy data structures or Big O notation theory, but in the real world we use them all the time, especially on important projects.
All those principles about sets, (linked) lists, map, filter, reduce, union, intersection, symmetric difference, Big O notation... they matter and are used to solve problems. I used to think I could just coast by without being versed in them. Soon, mathematics and Big O notation came back to bite me.
Three example projects I worked on where this mattered:
- Massive data collection and processing in legacy Java (clients want their data fast, so better think about the performance implications of CRUD into Collections)
- ReactJS (oh yes, maps and filters are used a lot...)
- Massive data collection in C# where data manipulation results are crucial (union, intersection, symmetric difference,...)
Overall: speed and quality mattered (better know your Big O notation or use a cheat sheet, though I prefer the first).
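For the curious, here's roughly what those operations look like in C#/LINQ; a minimal sketch with made-up collections:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

var current  = new HashSet<int> { 1, 2, 3, 4 };
var incoming = new HashSet<int> { 3, 4, 5, 6 };

var union        = current.Union(incoming);      // {1,2,3,4,5,6}
var intersection = current.Intersect(incoming);  // {3,4}
var symDiff      = new HashSet<int>(current);
symDiff.SymmetricExceptWith(incoming);           // {1,2,5,6}

int reduced = incoming.Where(x => x % 2 == 0)         // filter
                      .Select(x => x * 10)            // map
                      .Aggregate(0, (a, x) => a + x); // reduce -> 100
Console.WriteLine($"{string.Join(",", symDiff)} | {reduced}");
```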
Yes, the approach can be optimized here, but often we're tied to client constraints, with some room if we're lucky.
I'm glad I learned this lesson. I would rather have the skills in my head and in memory than have to look things up and puzzle them out every time. -
Compilers should just work for raw C with only static memory allocation. This isn't the bad old days where a couple of dudes wrote a short book explaining how C might probably should possibly work. I hear supposedly we have standards now.
Well, last week I lost 2 days to our compiler randomly forgetting that it wasn't okay to put a globally allocated uint32 at an address ending in 9. What? It had been handling this case without issue for more than a year, but now, after changing completely unrelated code, we have this problem.
I'm not sure how to even deal with this idiocy so no doubt I'll continue working on it this week, too.
Thanks a lot, GCC. -
>Working on code
>Shit works as intended first try, nice
>Goes to play strange bootleg Gameboy Color ROM sent by a friend
>ROM immediately fucking dies
wtf.svg
>Pop emulator's debugger
we're executing from VRAM, stack's firmly embedded in ROM
>why
>Add execution breakpoint to entrypoint of game, restart emulated system (because i'm actually using the legit bios i hacked so it allows null/corrupted games to run)
>Step through everything, everything goes well until all of a sudden we call a function and shit hits the goddamn fan
well we have the culprit
>step through subroutine
if <unused_byte_in_HRAM> != 0 then stackPointer+=32;tryAgain();else return
>***y***
>Realize this is using a bootleg Memory Bank Controller with hard-backed encryption so none of the bytes executed or read as data are the right byte
>Find emulator that'll handle the jank MBC
>read code to try and figure out how it works
if checksumExtendedLogoBlob == some_number then set MBC_Bootleg1 else if checksumExtendedLogoBlob == some_other_number then set MBC_Bootleg2 else if...
>of course
>Spend 10 minutes finding the right bootleg MBC
>code shows 8 possible tables for real bit order based on some value in the cart header
>look for code that gets this value
>not in the header
>not in ANY header in this 1000+ file emulator
>not in any related cpp files???
>get desperate
>email author
>"Delivery failed: email doesn't exist"
fuck me i guess -
So i wasted the last 24 hours trying to satisfy my ego over a shitty interview and revisiting my old job's codebase, only to realise that i still don't like that shit. i am just 25 and have no clue where i am heading. i am restless; most of my decisions in 2023 have had very bad outcomes, and i am just trying things to feel hopeful.
context for the interview story -----
my previous job was at a b2b marketing company whose sdk was used by various startups to send notifications to their users, track analytics, etc. i understood most of it and don't find it to be any major engineering marvel, but that interviewer was very interested in asking me to design a system around it.
in my 1.2 years there, i found the codebase to be extremely and unnecessarily verbose (java 7), with questionable fallbacks and resistance to change from the managers. they were always like "we can't change it, otherwise a lot of our clients won't use our sdk". i still wrote a lot of testcases and tried to understand the workings of the major features.
BTW, before you guys go and declare me an embarrassment of an engineer who doesn't know his product's codebase, let me tell you that we are talking SDKs (plural) at a service-based company here. there was just one SDK with interesting, heavy-lifting stuff and 9 more SDKs which were mostly wrappers and less advanced libraries. i got tasks in all of them, and 70% of my time went into maintaining those and debugging client-side bugs instead of exploring the "already-stable-dont-change" codebase.
so based on my vague understanding and my even more vague memory from 1 year ago, i tried to explain an overall architecture to that interviewer guy. His face was screaming the word "pathetic" from his expressions, so i thought that today i will try to decode the codebase in 12-15 hours, publish a cool article and be proud of how much i know a so called martech system design. their codebase is open sourced, so it wasn't difficult to check it out once more.
but boy oh boy did i get bored. unnecessary classes, unnecessary callbacks, static calls, oof. i tried to refactor a few classes, but even after removing 70% of the codebase, i was still left with 100+ classes, most of them 3000-4000 lines long. and this is your plain old java library, adding just 800kb to your project.
boring, boring stuff. i would probably need 2-3 more days to get an understanding of the complete project, though by then i would again be questioning my life choices: was this a good use of my 36 hours?
what IS a correct use of my time? i am currently super dissatisfied with my job, so i want to switch. i have been here for 6 months, so i probably wouldn't leave unless i get insane money or an irresistible company offer. for this i had devised a 2-part plan: either become good at the modern hot buzz stuff in my domain (the stuff currently popularized by dev influenzas) or become good at dsa/leetcode/cp. i suck badly at ds/algo stuff, nor am i much motivated, so i went with the hot buzz stuff.
but then this interview expected me to be a mature dev with system design knowledge... agh, fuck. the festive season is going on and i can't buy any cool shirts, since i am so limited by the money from my mediocre salary and loans. and mom wants to buy a home too... yeah, kill me -
First rant here...
A handful of devs have to create a huge web platform that can shovel a lot of data around, in about two months, which is impossible...
The project lead left major decisions in the hands of interns, like which database we want to use, because that person can't answer any question themselves. An inexperienced intern chose a fucking nosql database for highly relational datasets... why? Because new tech...
Development began and a bunch of problems arose... the database was accessible from the internet from day one. Random crashes because of out-of-memory exceptions. Every possible feature had a description of at most 10 words... and no standards were enforced on anything.
Now that we're finaaaally switching to sql after almost a year of prototypical production, everybody keeps coding new features, so i have to port all the crap to the new database...
best part: a bunch of clients on different operating systems have to be ported as well!
Even better part: i have to do that cause everybody else has practically no experience in any field...
And now the joke: i got hired for gui/desktop application development
Am i a wizard now? -
Production goes down because there's a memory leak due to scale.
When you say it in one sentence, it sounds too easy. Being developers, we know how it actually goes. It starts with an alert ping; then one server instance goes down, then the next. First you debug your own code, then the application servers, then the web servers, and by that time you're already on the tips of your toes. Then you realize that the application and the application servers have been gradually losing memory over a long period. If the application is one that doesn't get re-deployed very often, the complexity grows faster. No anomaly or change-detection monitor will catch a gradual decrease in memory spread over months.
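One way to catch that class of failure anyway is to alert on the trend rather than on a threshold. A hedged sketch, where both the sample source and the alert bound are invented:

```csharp
using System;
using System.Linq;

// Fit a line to long-horizon free-memory samples; a steady negative
// slope is the leak signature that threshold alerts miss.
double[] freeMb = { 910, 902, 897, 884, 879, 861, 850 }; // e.g. daily samples
double n = freeMb.Length;
double xMean = (n - 1) / 2.0;
double yMean = freeMb.Average();
double slope = freeMb.Select((y, x) => (x - xMean) * (y - yMean)).Sum()
             / Enumerable.Range(0, freeMb.Length).Sum(x => Math.Pow(x - xMean, 2));

if (slope < -5.0) // made-up bound: losing more than 5 MB/day
    Console.WriteLine($"possible leak: trending {slope:F1} MB/day");
```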
Who actually started the reign of mixed-character passwords? Seriously, it sucks to have an unnecessarily complex password! Websites and apps demand that passwords contain upper/lowercase letters, numeric characters and symbols, without considering the average user with a low memory threshold (i.e., me).
Let's push the complaint aside and return to the actual reason a complex password is required.
As we already know, passwords are made complex so they can't be easily guessed by the password crackers hackers use, and the primary reason for adding symbols and numbers to a password is simply to stretch the space of possible guesses.
Now let's take a look at the logic behind a password cracker.
To hack a password:
1) The password cracker will usually look up a dictionary of passwords (this step is necessary for any possible outcome).
2) It attempts to log in multiple times with the list of passwords found (in most cases, successful entries are found for passwords under 8 chars).
3) If none succeeded by the end of the dictionary, the cracker mutates each password in the dictionary to match the popular patterns most websites enforce (i.e., first letter uppercase, a number at the end followed by a symbol. Thanks to those websites!).
4) If any password succeeded, the cracker adds it to a new dictionary called a "pattern builder list" (this gives the cracker an upper edge on that specific platform, because most websites force a specific password pattern anyway).
In comparison:
>> Mygirlfriend98##
would be cracked faster compared to
>> iloveburberryihatepeanuts
Why?
Because the former is short and follows a popular pattern.
In reality, password crackers don't specifically care about the upper-lowercase-number-symbol bullshit! They care more about the length of the password, its pattern, and previously used entries (either from keyloggers or from previously hacked password dumps).
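The arithmetic behind that bias toward length is simple. A quick sketch of the raw search-space math (brute force only; dictionary and pattern attacks cut these numbers down massively):

```csharp
using System;

// Raw search space in bits: length * log2(alphabet size).
Console.WriteLine($"16 chars, ~95 printable symbols: {Bits(95, 16):F0} bits");
Console.WriteLine($"25 chars, lowercase only:        {Bits(26, 25):F0} bits");
// ~105 vs ~118 bits: the longer all-lowercase phrase wins, before even
// counting how predictable the "popular pattern" is.

static double Bits(int alphabetSize, int length) => length * Math.Log2(alphabetSize);
```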
So demanding a humanly complex password is totally unnecessary, because what's being dealt with is a bot, not another human.
My devRant password is a short story of *how I met my first girlfriend*. Good luck to any password cracker! -
I feel like writing about, and telling people about, the time I jumped from Windows 7 Ultimate to Windows 10. (I'm not against 10, but I'm never updating again after what happened to me.)
It all starts when none of my games will play due to a possible issue with my graphics card. I look up "3D source game bug" and not many results pop up. I go on Microsoft's Qna areas and ask this question but to my surprise nothing they say would make sense. "Clean the pins of your graphics card, make sure you verify the games on Steam". I verified the games and they checked out as perfectly fine. I don't have access to my graphics card because this is a laptop, sadly not a tower.
Two months pass and my computer is already showing signs of stress, like it didn't want to live in a sense. It was three times slower than when I was on Windows 7 and it was unallocating areas of my main hard drive where I could make virtual hard drives.
Instantly I start looking up Linux distros and find Linux Mint. 17.3 was the current version at the time. I downloaded it and burned it onto a DVD-rom and rebooted my computer. I loaded into the disc and to my surprise it seemed almost like Windows 7 apart from the Linux part. I grab my external hard drive and partition it to hold the Linux distro and leave it plugged in incase Windows 10 does actually fail.
On December 19, a few months after Windows 10 had released. I start my laptop to try and continue my studies in video game development. But to my surprise, Windows 10 had finally crashed permanently. The screen flickered blue and black, and an error box saying Loginui.exe failed to start. I look at it for a solid minute as my computer had just committed suicide in a sense.
I reboot thinking it would fix the error but it didn't. I couldn't log in anymore.
I force shutdown the laptop and turn it back on putting it into safe mode.
To my surprise loginui.exe works and I sign in. I look at my desktop, the space wallpaper I always admired, the sound files, screen shots I had saved.
I go into file explorer and grab everything out of my default hard drive Windows was installed on. Nothing but 400gb got left behind and that was mainly garbage prototypes I had made and Windows itself. I formatted my external hard drive and placed everything on it. Escaping Windows 10 with around 100GB of useful data I looked at the final shutdown button I would look at.
I click it and try to boot into normal Windows 10. But it doesn't work. It flickers and the error pops up once more.
I force it to shut down, insert the Linux Mint disc I made earlier, and format the default hard drive through Linux. I was done. 10 gave me a lot of shit. Java wouldn't work; my games had a functional UI but nothing popped up on screen except a black abyss; and it wouldn't even let me try to update my graphics card, since apparently my AMD Radeon 5450 was as up to date as the Radeon 5000 series got.
I installed Linux Mint and thinking the games would actually play I open steam and Launch Half-Life 2 to check if Linux would be nicer to me than Windows 10 had been.
To my surprise the game ran. The scene from Highway 17 popped up on screen and the UI was fully functional. But it was playing at 10-15fps rather than the usual 60-70fps. I keep looking at my drivers and see my graphics card isn't in use. I do some research and it turns out I have a hybrid laptop.
Intel HD Graphics and an AMD Radeon 5450 and it was using the Intel and not the AMD. Months of testing and attempts of getting the games to work at high frame rates pass and the Damn thing still functions at a low terrible fps. Finally I give up. I ask my mom for a Windows 7 disc and she says we can't afford it. A few months pass and I finally get a Windows 7 installation disc through money I've saved up. Proudly I put it into my optical disc drive and install it to my main hard drive deleting Linux completely. I announced to all my friends my computer was back in working order and I install everything I needed, Steam, Skype, Blender, and Unity as well as all my games. I test Half-Life 2 and it's running exceptionally smoothly, I test Minecraft at max settings and it's working beautifully. The computer was functioning properly once again and my life as a developer started as I modeled things and blender, learned beginners C# and learned a lot of Batch. Today the computer still runs at a great speed and I warn others of what happened to me after I installed Windows 10 to my machine if they are thinking of switching from 7 or 8 on an older machine.
Truly the damage to my data cannot be undone. But the maintenance, the work, the tests: all of it is a memory of how Windows 10 ruined me, and every night before the one-year anniversary of Windows 10's release, I took the battery out of my laptop and unplugged it from AC power, just so Windows 10 couldn't show its DLLs, batch scripts, vbs scripts, anything, on my computer. But now, after all that has happened and I have recovered, I only have a story to tell. -
How do you disconnect from work after working hours? I've been working for the last 4 months as a mid-level dev at this company. I mean, I'm able to problem-solve and do my work, but sometimes I get so addicted to problem solving that I become worried, obsessed, hyperfixated (especially if I'm stuck on something for, let's say, a couple of weeks). It gets to the point where I work from home 12-14 hours a day just to figure out some bug in the flow.
Thing is, our codebase is large, and with every new refactor/feature some surprises happen. I don't have a decent mentor who could teach me one-on-one or even do pair programming with me. All I have is colleagues who can point me in the right direction or do a code review from time to time. That's it.
I don't know why I take this so personally. For example, I had to do a feature, which I did in 1 week; then the MR got approved by devs and QA. After that, during regression, they found like 3 blockers and I felt really bad and ashamed. While in reality our BA did not define the feature properly, the devs who reviewed it didn't even launch the code and poke around in the app, and our team's QA tested only the happy scenario. Basically this failed/got delayed because of failures across a chain of 6-7 people.
However, for some reason I'm taking it very personally: that I, as a dev, failed. Maybe due to my ADHD or something, but for the next days or weeks, as long as I don't find the solution, I will isolate myself and tryhard until I get it right. Then I have a few days of chill until I face another obstacle in another task. And this keeps repeating and repeating.
My senior colleague tells me to chill and not let work take such a toll on my emotional/physical/mental health. But it's hard. He has 7 years of experience and a decent memory. I have 2-3 years of experience and ADHD; we are not the same. I don't know how to become a guy who clocks out after 8 hours of work done every day. It's like I feel they might fire me, or that I will look bad, if I don't put in enough effort. Not that I was ever fired for performance issues... Anyway, I don't know how to start working to live instead of living for work.
I hate who I'm becoming. I don't work out anymore, I started smoking a lot, I don't exercise. I live this self-induced, anxiety-driven workaholic lifestyle. -
YGGG IM SO CLOSE I CAN ALMOST TASTE IT.
Register allocation pretty much done: you can still juggle registers manually if you want, but you don't have to -- declaring a variable and using it as operand instead of a register is implicitly telling the compiler to handle it for you.
What's more, spilling to the stack is done automatically, keeping track of whether a value is or isn't required, so it's only done when absolutely necessary. And variables are handled differently depending on whether they are input, output, or both, so we can eliminate making redundant copies in some cases.
It's a thing of beauty, defenestrating the difficult aspects of assembly while still writing pure assembly... well, for the most part. There's some C-like sugar that's just too convenient for me not to include.
(x,y)=*F arg0,argN. This piece of shit is the distillation of my very profound meditations on fuckerous thoughtlessness, so let me break it down:
- (x,y)=; fuck you in the ass I can return as many values as I want. You dont need the parens if theres only a single return.
- *F args; some may have thought I was dereferencing a pointer, but I'm calling F and passing it arguments; the asterisk indicates I want to jump to a symbol rather than read its address or the value stored at it.
To the virtual machine, this is three instructions:
- bind x,y; overwrite these values with Fs output.
- pass arg0,argN; setup the damn parameters.
- call F; you know this one, so perform the deed.
Everything else is generated; these are macro-instructions with some logic attached to them, and theres a step in the compilation dedicated to walking the stupid program for the seventh fucking time that handles the expansion and optimization.
So what's left? Ah shit, classes. Disinfect and open wide, motherfucker, we're doing OOP without a condom.
Now, obviously, we have to sanitize a lot of what OOP stands for. In general, you can consider every textbook shit, so much so that wiping your ass with their pages would defeat the point of wiping your ass.
Lets say, for simplicity, that every program is a data transform (see: computation) broken down into a multitude of classes that represent the layout and quantity of memory required at different steps, plus the operations performed on said memory.
That is most if not all of the paradigm's merit right there. Everything else that I thought to have found use for was in the end nothing but deranged ways of deriving one thing from another. Telling you I want the size of this worth of space is such an act, and is indeed useful; telling you I want to utilize this as base for that when this itself cannot be directly used is theoretically a poorly worded and overly verbose bitch slap.
Plainly, fucktoys and abstract classes are a mistake, autocorrect these fucking misspelled testicle sax.
None of the remaining deeper lore, or rather sleazy fanfiction, that forms the larger canon of object-oriented as taught by my colleagues makes sufficient sense at this level for me to even consider dumping a steaming fat shit down its execrable throat, and so I will spare you bearing witness to the inevitable forced coprophagia.
This is what we're left with: structures and procedures. Easy as gobblin pie.
Any F taking pointer-to-struc as it's first argument that is declared within the same namespace can be fetched by an instance of the structure in question. The sugar: x ->* F arg0,argN
Where ->* stands for failed abortion. No, the arrow by itself means fetch me a symbol; the asterisk wants to jump there. So fetch and do. We make it work for all symbols just to be dicks about it.
Anyway, invoking anything like this passes the caller to the callee. If you use the name of the struc rather than a pointer, you get it as a string. Because fuck you, I like Perl.
What else is there to discuss? My mind seems blank, but it is truly blank.
Allocating multitudes of structures, with same or different types, should be done in one go whenever possible. I know I want to do this, and I know whichever way we settle for has to be intuitive, else this entire project has failed.
So my version of new always takes an argument, dont you just love slurping diarrhea. If zero it means call malloc for this one, else it's an address where this instance is to be stored.
What's the big idea? Only the topmost instance in any given hierarchy will trigger an allocation. My compiler could easily perform this analysis because I am unemployed.
So where do you want it, on the stack or on the heap, or do you want to reutilize any piece of ass, where buttocks stands for some adequately sized space in memory: entirely within the realm of possibility. Furthermore, evicting shit you don't need and replacing it with something else.
Let me tell you, I will give your every object an allocator if you give the chance. I will -- nevermind. This is not for your orifices, porridges, oranges, morpheousness.
Walruses. -
I had to do a project for my A-levels.
The task was to get a client and develop and application based on their requirements. Naturally I made my friends my clients so that I could make something I was interested in.
The teacher constantly changed my requirements at the start, because he liked everyone's applications to be somewhat similar (probably easier to mark), which demotivated me.
The timescale we were set around easter time was to have a demo by the end of summer which didn't need to work properly, and then a completed version after the Christmas holidays.
I wrote about 90% of the program over my 2 weeks off for Christmas, most of it while drunk, high, or both, and managed to complete it within those two weeks.
I went back to the code a couple of months later, with no memory of writing it, to set up a demo to show my teacher, and I was actually surprised by it. It was the first project of that type I had worked on, and while there were a couple of noticeable bugs, it actually worked fairly well and was really well documented. I was expecting a pile of buggy spaghetti. -
Can anyone help me with this theory about microprocessors, CPUs and computers in general?
(I used to love programming during school days, when it was just basic searching/sorting and OOP. Even in college, when it advanced to language details, compilers and data structures, I was fine. But subjects like COA and microprocessors, which explain the working of the hardware behind the brain that is a computer, are so difficult for me to understand 😭😭😭)
How does a computer work? All I knew was that when a bulb gets connected to a battery via wires, some metal inside it starts glowing and we see light. No magic involved so far.
Then came the von Neumann architecture, which says a computer consists of 4 things: I/O devices, the system bus, memory and the CPU. I/O and memory interact with the system bus, which is controlled by the CPU. Thus the CPU controls everything, and that's how a computer works.
Wait, what?
Let's take an easy example of calc. i pressed 1+2= on keyboard, it showed me '1+2=' and then '3'. How the hell that hapenned ?
Then some video told me this : every key in your keyboard is connected to a multiplexer which gives a special "code" to the processer regarding the key press.
The "control unit" of cpu commands the ram to store every character until '=' is pressed (which is a kind of interrupt telling the cpu to start processing) . RAM is simply a bunch of storage circuits (which can store some 1s) along with another bunch of circuits which can retrieve these data.
Up till now, the control unit knows that memory has (for eg):
Value 1 stored as 0001 at some address 34A
Value + stored as 11001101 at some address 34B
Value 2 stored as 0010 at some Address 23B
On receiving the code for the '=' press, the "control unit" commands the ALU of the CPU to fetch the data from memory, understand it and calculate the result (i.e. the "fetch, decode and execute" cycle).
The ALU fetches the "codes" from memory, which translate to ADD 34A,23B, i.e. add the data stored at addresses 34A and 23B. The ALU retrieves the values present at the given addresses, passes them through its adder circuit and puts the result at some new address 21H.
The control unit then fetches this result from the new address and, via the system buses, sends the new value to the display's memory, mapped at some memory port 4044.
The display picks it up and instantly shows it.
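(To make the story concrete, here is a toy sketch of that fetch-decode-execute cycle in Python. Every address, opcode and the whole instruction format below are invented for illustration; a real 8085 uses binary opcodes and registers, not dicts.)
memory = {
    0x34A: 1,                             # value stored by the control unit
    0x23B: 2,                             # second value
    0x100: ("ADD", 0x34A, 0x23B, 0x21F),  # "add 34A and 23B, put the result at 21F"
}

def cpu_step(pc):
    op, src1, src2, dst = memory[pc]      # fetch the instruction
    if op == "ADD":                       # decode the opcode
        memory[dst] = memory[src1] + memory[src2]  # execute (the adder circuit)
    return dst

display_port = cpu_step(0x100)
print(memory[display_port])               # 3 -- what the display picks up and shows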
My problems:
1. Is this all correct? Is this really how it happens?
2. Please expand this more.
How do the system bus, ALU and CPU work together?
What are the registers, accumulators and flip-flops in memory?
What are machine cycles?
What are instruction cycles, opcodes and instruction codes?
Where does assembly language come in?
How does the CPU manipulate memory?
The data bus and the control bus, what are they?
I have come across so many weird words i don't understand: DMA, interrupts, memory-mapped I/O devices, etc. Somebody please explain.
PS: i am learning about the fucking 8085 microprocessor in class and i can't even relate it to basic computer architecture. I flunked the COA paper, and i now realise why, coz it's so confusing. :'''(14 -
Web browsers removed FTP support in 2021 arguing that it is "insecure".
The purpose of FTP is not privacy to begin with but simplicity and compatibility, given that it is widely established. Any FTP user should be aware that sharing files over FTP is not private. For non-private data, that is perfectly acceptable. FTP may be used on the local network to bypass MTP (problems with MTP: https://devrant.com/rants/6198095/... ) for file transfers between a smartphone and a Windows/Linux computer.
A more reasonable approach than eliminating FTP altogether would have been showing a notice to the user that data accessed through FTP is not private. It is not intended for private file sharing in the first place.
A comparable argument was used by YouTube in mid-2021 to memory-hole all unlisted videos of 2016 and earlier except where channel owners intervened. They implied that URLs generated before January 1st, 2017, were generated using an "unsafe" algorithm ( https://blog.youtube/news-and-event... ).
Besides the fact that, if this reason were true (hint: it almost certainly isn't), Google would have informed its users about a security issue four years late, unlisted videos were never intended for "protecting privacy" anyway, given that anyone can access them without providing credentials. Any channel owner who does not want their videos to be seen sets them to "private" or deletes them. "Unlisted" was never intended for privacy.
> "In 2017, we rolled out a security update to the system that generates new YouTube Unlisted links"
It is unlikely that they rolled out a security update exactly on New Year's Day (2017-01-01). This means some early-2017 unlisted videos would still have the "insecure" URLs. Or, likelier than not, this story was made up to sound just plausible enough that people believe it.50 -
Let me run something by all of you. Let's say you once started freelancing as a "Plan B" in case your full-time gig dropped you. Over 12 years you've managed to build a long-standing personal brand around that occasional freelancing. You have several clients who adore you and the work you do and they tell you they would be lost without your talent and have nowhere else to go and nobody else they trust. You know, because in the past you tried to send them elsewhere (for various reasons) and they just kept coming back.
You get laid off from the full-time gig and ACME Company calls and interviews you as a top candidate they're really interested in for that same type of work for a full-time job they're offering.
Here's the catch...if hired, you have two months to basically erase your personal brand and agree never to do any freelancing work as before, even on your own time on evenings and weekends. ACME wants your full focus and attention. Additionally, you find out that the person you'd be replacing is being let go because they weren't sufficiently tech-skilled for the job. And, with a little digging, you find out that person _also_ had several freelancing gigs going on the side. Probably for the same "Plan B" reason. Which is probably why ACME is demanding exclusivity.
Your client base is small. ACME says "we don't care". The work you do is 90% automated and easily achievable in just minutes a day on a weekend or evening. ACME says "doesn't matter". You already had full-time work to begin with so you weren't doing a ton on the side. ACME couldn't be less interested in this "excuse". And you're not keen on the idea of burning down your brand, especially with no guarantees of any kind in the present IT industry hiring/firing/layoffs climate. ACME says this issue is make or break for them.
If you get to the offer stage do you:
a) Flip the bird to your brand and clients you've built up for over a decade and memory-hole it?
b) Negotiate a non-compete clause with ACME, agreeing not to take on any new clients while working full time for them?
c) Flip the bird to ACME and look for something else?
Asking for a friend. ;)16 -
Ok story number 2, this one takes place back when I was at Uni.
We had a group project where we had to make a basic website. Nothing fancy, just basic HTML/CSS/JS.
We were a group of 3, one of the guys did a fairly good job with the CSS, he made a really good looking banner/footer and a kind of ‘featured’ page which looked awesome.
But the other guy... his contribution was the ‘contact us’ page, which consisted of a totally static table with dummy info and an embedded Google map showing the location of our fake car dealership.
Meanwhile I wrote all the JavaScript, complete with a fake in-memory database containing our car data for display on the home page. It even had basic filtering. I made sure to mention in the peer review that I felt he could have done more. -
Pressing Ctrl+C shouldn't overwrite an existing clipboard entry that has just been created by pressing Ctrl+C immediately before.
Who thought it was a good idea to put the copy and paste shortcut keys right next to each other? Some people's muscle memory does not cope with such fine subtlety.
How many working hours, days or even years must have been wasted by people using productivity software who accidentally lost what they were about to paste from their clipboards?
Anticipating the first comments: yes, that's another kind of first-world problem affecting people who spend too much lifetime doing stupid office work on a (German) (PC) keyboard, but here we are, procrastinating on devRant and wasting even more time.
Anticipating even more comments: why am I working on a keyboard in a German train on a sunny Sunday instead of relaxing at a lake or a swimming pool? Well, at least this train doesn't seem to have a pool. More luxury problems for me.3 -
Currently have a very funny project lead, who gives on-the-spot estimates for a 9-year-old, very pathetic-quality Android app in the security domain. Memory leaks, bad practices, typos, CVEs etc.: you name it, we have it in the app's source.
For the last 5-6 sprints of our project, almost 50% of user stories were left incomplete due to underestimation.
Basically, everyone in management was asleep about code quality for the last 7-8 years, and now, suddenly, when the new Dev & QA team is here, they want us to fix everything ASAP.
The most humorous thing is that the product owner is aware of the importance of unit tests, but doesn't want to allocate user stories for them during sprint planning, as the code is, according to him, almost frozen for the current release.
Actually, he has done the same thing every sprint since the last release; around 18 months have passed and he still hasn't spared a single day for unit testing.
Recently an app crash was found in a version-upgrade scenario, because the QAs are so worn out from manually testing hundreds of basic, trivial test cases, plus server-side testing, that they can't do the actual needed testing, which is also tougher for Devs to automate.
Recently, when the team's old MacBook Pros expired, higher management allocated Intel Mac minis, saying that a few people in the organization were misusing the MacBooks. So for just a few people everyone has to suffer now, as there is no flexibility for frequently switching between WFH & WFO. One of those Mac minis overheated and has been in repair for 6 months.
Out of 4 Devs & 3 QAs, all 3 QAs & 2 Devs have gradually left.
I think it's time to say goodbye 😔3 -
Things I say to my clients when I know that a reboot is required to fix their issue but I don't have enough evidence to prove it to them:
"... On any computing platform, we noted that the only solution to infinite loops (and similar behaviors) under cooperative preemption is to reboot the machine. While you may scoff at this hack, researchers have shown that reboot (or in general, starting over some piece of software) can be a hugely useful tool in building robust systems.
Specifically, reboot is useful because it moves software back to a known and likely more tested state. Reboots also reclaim stale or leaked resources (e.g., memory) which may otherwise be hard to handle. Finally, reboots are easy to automate. For all of these reasons, it is not uncommon in large-scale cluster Internet services for system management software to periodically reboot sets of machines in order to reset them and thus obtain the advantages listed above.
Thus, when you indeed perform a reboot, you are not just enacting some ugly hack. Rather, you are using a time-tested approach to improving the behavior of a computer system."
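(In practice, the "reboots are easy to automate" part can be as dumb as a watchdog loop. A minimal sketch, assuming a hypothetical worker.py and a made-up restart interval:)
import subprocess

RESTART_INTERVAL = 6 * 60 * 60  # invented period for returning to a known, tested state

while True:
    worker = subprocess.Popen(["python", "worker.py"])  # hypothetical service
    try:
        # if the worker exits (or crashes) on its own, the loop simply restarts it
        worker.wait(timeout=RESTART_INTERVAL)
    except subprocess.TimeoutExpired:
        worker.terminate()  # the scheduled "reboot": reclaims leaked memory and handles
        worker.wait()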
😎1 -
Sydochen has posted a rant where he is not really sure why people hate Java, and I decided to publicly post my explanation of this phenomenon, from my point of view.
So there is this quite large domain, on which one or two academic disciplines are built, such as business informatics and applied systems engineering, which I find extremely interesting and fun, that is called, ironically, SAD. And then there are videos on YouTube, by programmers who just can't settle the fuck down. The videos I am talking about are rants about OOP in general, which, as we all know, is a huge part of the studies in the aforementioned domain. What are these people even talking about?
It is absolutely obvious that there is no sense in building software in a linear pattern. Since Bikelsoft has conveniently patched consumers up with GUI-based software, the core concept of which is EDP (event-driven programming, or at least OS event queueing), the purely functional, linear approach in such an environment does not make much sense in terms of maintainability. Uhm, raise your hand if you ever tried to linearly build a complex GUI system in a single function call on GTK, which does allow you to disregard any responsibility-separation pattern of SAD, such as the long-loved MVC...
Additionally, OOP is mandatory in business because it allows us to mount abstraction levels and encapsulate the actual dataflow behind them, which, of course, lowers development costs.
What happy programmers usually talk about is the complexity of doing OOP right, in the sense of an overflow of pure composition classes (that do nothing but forward data from lower to upper abstraction levels and vice versa) and of broken responsibility chains (when a class from a lower level directly!! notifies a class of a higher level about something, ignoring the fact that there is a chain of other classes between them). And that's it. These guys also vouch for functional programming, which is a completely different argument; there is no reason not to use it in the algorithmic, implementational parts of the project, of course, but yeah...
So where does Java kick in, you think?
Well, guess what language popularized programming in general and OOP in particular. Java is doing a lot of things in a modern way. Of course, if it's 1995 outside *lenny face*. Yeah, fuck AOT, fuck memory management responsibility, all to the maximum towards solving the real applicative tasks.
Have you ever tried to learn to apply TextWatchers in Android with Java? Then you know about inline overloading and inline anonymous class implementations. This is not right. It reduces readability and reusability.
Have you ever used Volley on Android? Newbies to Android programming surely should have. Quite verbose boilerplate in google docs, huh?
Have you seen intents? The Android API is, to put it mildly, messy, with all the support libs and Context class ancestors. Remember how many times the language has helped you properly orient yourself in all of this hierarchy, when an overloaded method declaration requires you to use 2 lines instead of 1. Too verbose, too hesitant, distracting - that's what the lang and the API are. Fucking toString() is hilarious. Reference comparison is unintuitive. Obviously poor practices are not banned. Ancient tools. Import hell. Slow evolution.
C# has ripped Java off like an utter cunt, yet in it it's a piece of cake to maintain solid patternization and structure and keep your code clean and readable. C# 6 already featured nullable fields and safe optional dereferencing, while we only finally got lambda expressions in Java 8, in 20-fucking-14.
Java did good back then, but when we joke about dumb indian developers, they are coding it in Java. So yeah.
To sum up, it's easy to make code unreadable with Java, and Java is a tool with which developers usually disregard the patterns of SAD. -
So i have been thinking..
SQL is a language that runs on specific software on the server and helps create data stores (databases and tables) that can be queried & manipulated.
is there a way to run SQL-like queries on the client side, with no interaction with the backend at all?
Say i have 5 inter-related data models. in a backend world, they would form nice little tables of a db with all their joins and composite keys. from the server, i would query them like "SELECT name from x where y=z & ..."
but what if i could store them like tables in browser memory and run the same query filters via a query language... is this possible?
i know this poses a certain security risk, but we already use cookies, local storage and a lot of JSON-based shitty client-side storage. surely it must be possible to have less-optimised SQL tables on the frontend with extremely good querying capabilities?
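(The closest concrete sketch of the idea is an in-memory SQL database, shown here with Python's built-in sqlite3 just for brevity; in a browser, the same concept exists as JS libraries such as sql.js, SQLite compiled to WebAssembly, so no backend is involved at all.)
import sqlite3

db = sqlite3.connect(":memory:")  # lives entirely in local memory, no server involved
db.execute("CREATE TABLE x (name TEXT, y TEXT)")
db.executemany("INSERT INTO x VALUES (?, ?)", [("alice", "z"), ("bob", "w")])

rows = db.execute("SELECT name FROM x WHERE y = ?", ("z",)).fetchall()
print(rows)  # [('alice',)]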
or am i talking something far fetched here?8 -
I was wondering if anybody working in 3rd-level support at a larger company faces the same problems as me.
We have a lot of our own developed software and processes. Regularly, some key user fucks something up by importing something wrong, which skews the master data. It's intended to be managed by the key users.
Fast forward: my key users are so dumb that they fuck up the master data import. The process behind it then has wrong data to operate on (a numeric value needs to be set).
The end user then opens a ticket for problem XYZ, and the key user forwards us the issue.
We have already had the same issue X times, and it's always the same reply. I made an FAQ, knowledge-base entries, etc.
Nothing works; 2 weeks pass and a similar ticket comes in...
Memory capacity of the user: exceeded after 1 day. FML
Anybody facing the same shit?6 -
During my short tenure as the lead mobile developer for a logistics company I had to manage my stacks between native Android applications in Java and native apps on iOS.
Back then, Swift was barely coming into version 3, and as such the transition was not trustworthy enough for me to discard Obj C. So I went with Obj C and kept my knowledge of Swift in the back. It was not difficult, since I had always liked Obj C for some reason. The language was what made me click with pointers and understand them well enough to feel more comfortable with C, Obj C being a strict superset of it. It was enjoyable, really, and making apps for iOS made me appreciate the ecosystem that much more and realize the level of dedication the engineering team at Apple put into their compilation pipeline. It was my first exposure to ARC (Automatic Reference Counting) as a "form" of garbage collection per se. The tooling in particular was nice; normally with Xcode you have a 50/50 chance of it being great or shit. For me it was a mixture of both, really, but the number of crashes or unexpected behaviors was FAR lower than what I had on Android back when we still used Eclipse, and even when we started to use Android Studio.
Developing iOS apps was also what made me see why iOS apps have that distinctive shine and why their phones require less memory (RAM). It was a pleasant experience.
The whole ordeal also left me with a bad taste for Android development. Don't get me wrong, I love my Android phones. But I firmly believe that unless you pay top dollar for an Android manufacturer such as Samsung, Motorola or LG, you will have lag galore. And man..... everyone who tried to prove me wrong always had to make excuses later on (no, your $200-$300 Android device just didn't cut it, my dude).
It really sucks sometimes for Android development. I want to know what Google got so wrong that they made the decisions they made, such that people went and designed other tools like React Native, Cordova, Ionic, PhoneGap, Titanium, Xamarin (which is shit imo), Codename One and many others. With iOS I never considered going for something other than native, since the API just seemed so well designed and far superior to me from an architectural point of view.
Fast forward to 2018 (almost 2019), and Google has been talking about Flutter for a while, making it seem like they are fixing how they want people to design apps.
You see. I firmly believe that tech stacks work in 2 ways:
1. people love a stack so much they start to develop cool ADDITIONS to it (see the awesome-ios repo) to expand on the standard libraries
2. people start to FIX a stack because the implementation is broken, lacking in functionality, or hard to use by itself: see OkHttp, legit all the Square libs, ButterKnife, etc etc etc and etc
From this I can conclude 2 things: people love developing for iOS because the ecosystem is nice and dev-friendly, and people like to develop for Android in spite of how Google manages their API. Seriously, Android is a great OS, and having apps that work awesomely in spite of how hard it is to create applications for said platform just shows a level of love and dedication that is unmatched.
This is why I find it hard, and even mean, to call out one product over the other. Whatever the morals behind the 2 leading companies inferred from my post, the developers are what make the situation better or worse.
So just fuck it and develop and use for what you want.
Honorable mention to PHP and the PHP developer community, which is a mixture of fixing and adding, in spite of the amount of hatred such coolness gets from a lot of peeps :P
Oh, and I got a couple of mobile contracts on the way; this is why I made this post.
And I still hate developing for Android even though I love Java.3 -
[CONCEITED RANT]
I'm frustrated that I'm better than 99% of the programmers I have ever worked with.
Yes, it might sound conceited.
I work mainly with the C#/.NET ecosystem as a fullstack dev (so also SQL, backend, frontend etc), but I'm also forced to use that abhorrent horror that is JS and Angular.
I write readable code, I write easy code that works and rarely, RARELY, causes any problems. The only fancy stuff I do is use new language features that come with new C# versions, which in the latest versions were mostly syntactic sugar to make code shorter/more readable/easier.
The people I have worked with (lots of them) mostly try to overdo, overengineer and overcomplicate code, subdividing it into methods when not needed, fragmenting the code and putting in tons of variables.
People only needed me to explain my code when the codebase was huge (200K+ lines, mostly written by me), so they didn't have to spend hours understanding what's going on, or when the customer requested a new technology, to explain that technology so they didn't have to study it (which is perfectly understandable). (For example, it happened that I was forced to use the DevExpress package because they wanted to port a huge application from .NET 4.5 to .NET 8, and rewriting the whole DevExpress logic had a HUGE impact on costs, so I explained it thoroughly and supported the team during development because they didn't know DevExpress.)
I don't write genius code or clever tricks and patterns. My code works, doesn't create memory leaks or slowness, and mostly passes unit tests on the first run. Of course I also introduce bugs and everything, but that's part of the process.
The point is that other people make unreadable code, and when they pass code around you hear rising chaos, people cursing "WTF does this even mean, why did he put that here, what the heck is this even supposed to do", you get the drill. And this happens when I read everyone else's code too.
But the opposite doesn't happen. My code is often readable because I do coding triple backflips only on personal projects, where I don't have to explain anything to anyone and I can learn new things and new coding styles.
Instead, people want to impress at work, and this results in unintelligible, chaotic code, full of bugs, that people can't read. They want to mix in the coolest technologies because they feel their virtual penis growing by showing off that they are latest-bleeding-edge-technology experts and all.
They want to experiment on business code at the expense of all the other poor devils who will have to manage it.
Heck, I even worked with a few Microsoft MVPs.
Those are deadly. They're superfast code-throughput people who combine a lot of stuff.
Then they leave the problems with you once they're gone.
One such MVP guy was on a big paperwork digital acquisition project for a big company, a huge project I got called to work on, which consisted of a backend and a frontend web portal. He pushed at all costs to put another CDN web project and another Identity Server project in the middle, to do caching with the CDN "to make it faster" and SSO (single sign-on) with Identity Server.
We had to do gruesome work to deal with the browser's poor caching management, and when he left, the SSO server started to loop after authentication at random intervals, and it took me days of debugging to fix that nasty stuff he put in.
People definitely can't code, except me.
They have this "first of the class" syndrome, which goes as far as their skill allows, and they try to do coding backflips when they can't even do coding pushups, to put it in physical-exercise terms.
And most people are like this. They will deny it and won't admit it; they believe they're good at it, but in reality they aren't.
There are geniuses out there who write revolutionary code and maybe need to write horrible code to do amazing stuff, and that's ok. And there are also a few people like me, with whom you can work and produce great stuff.
I found one colleague like this, and we had an $800,000 (yes, 800k) project in .NET, which consisted of the renewal of 56 web services, 3 web portals and 2 WinForms applications for our country's main railway transport system. We worked on it as a two-man team, with a PM from the railway company.
It was estimated at 14 months of work, we took 11, and everything worked wonders. We had a ton of fun doing it, also because their PM was a cool guy; we did an awesome project and the codebase was a jewel. The only thing you couldn't grasp just by reading the code is how railway systems work, and that's the only difficult part.
Sigh, these people are making me sick of this job11 -
I am having an introspective moment as a junior dev.
I am working at my 3rd company now and have spent the average amount of time i'd spend at a company (1-1.5 years).
I find myself with similar problems and trajectories:
1. The companies i worked for were startups of various scales: an edtech platform, an insurance company (branch of an MNC) and a B2B analytics company.
2. These people hire developers based on domain knowledge and not innovative thinking, and expect them to build anything that the PMs deem growth/engagement-worthy (for example, i am bad at that memory/time-optimising programming and DS/algo stuff, but i can make any kind of Android screen/component, so me and people like me get hired here).
3. These people hire new PMs based on expertise in revenue generation and, again, not on the basis of innovative thinking, coz most of the time these folks make tickets to experiment with buttons and text colors to increase engagement/growth.
4. The system soon goes into chaos mode, since there are so many cross-operating teams, with the PMs running around trying to boss every dev, QA and designer into adding their changes to the app.
5. Meanwhile, with multiple different teams working on different aspects, there is no central knowledge base with up-to-date info on all flows, products and features. The product soon becomes a Frankenstein monster.
6. Thus these companies require more and more devs and QAs who are cogs in the system rather than innovative thinkers. The cogs in the system will simply come in, dimwittedly add whatever feature is needed, and go home.
7. The cogs in the system who also take on the pain of tracking the changes and learning about the product itself become "load-bearing cogs", i.e. the devs with so much knowledge of the product that they can be helpful in every aspect of the feature lifecycle.
8. Such devs find themselves with no need to prove themselves, no need to do innovative work, and are simply promoted based on their domain knowledge and impact.
My question is simply this: are we as devs just destined to be load-bearing cogs?
We are doing the work which ideally a manager should be doing, i.e. maintaining Confluence docs with end-to-end technical as well as business-logic info on every feature/flow.
So is that the only definition of a Software Engineer in a technical product?
then how come innovation happens in companies like Meta, Microsoft, Google, OpenAI etc?
if i have to guess as a far-off observer, i would say their diversity across different fields helps them mix and match stuff, and that leads to innovative stuff.
For example, the Android OS team at Google has helped add many innovative things to the Google Cloud product, and vice versa.
The same goes for Azure and Windows. Windows is now optimised to run on cloud machines, when at one point it was just a horrible, memory-hogging, slow PC OS.
For small companies, one ideology/product/domain is their hero ideology/product/domain.
An insurance company tries to experiment with stuff related to insurance, health and vehicles, and the best innovation they come up with is "let's give the user a discount on their premium if they do 5000 steps a day for a year".
An edtech company would say "let's do live streaming for children apart from static videos".
But the Android team at Google said "since the AI team is doing so well, let's include AI in various system apps and support device-level models" ~ a much larger innovation, as 2 domains combined to make a product.
The small companies are not aiming to be an innovative product; they are just aiming to be a monopoly product. And this is kinda sad2 -
Say you have a huge system with tight memory constraints. Almost every programmer who's created an instance of any object now has that instance in contention for resources on this system. The free memory is getting lower and lower, and other resources are also feeling the heat. The garbage collector is not very effective (as the garbage collectors we know are). There is a very large pool of objects that do not have any references and are left on their own. Now, instead of instantiating your own objects, you can simply request one off the pool and work with it however you like (regardless of what type the object was) and change its type to the type you want (coz they're all, after all, instances of the same base class).
Now the question is: would you still instantiate your own object, or just request one from the pool? And if it were a fact that once this system is down you'll never be able to develop and deploy a single program ever again, would you still not care for it?
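(For the sake of argument, a minimal sketch of such a pool. All names here are made up, and a real one would add locking and the "change its type" trick on top:)
class Pool:
    def __init__(self, size):
        # pre-allocate fixed-size blocks up front instead of instantiating later
        self._free = [bytearray(64) for _ in range(size)]

    def acquire(self):
        # hand out an existing, unreferenced object instead of creating a new one
        return self._free.pop() if self._free else bytearray(64)

    def release(self, obj):
        obj[:] = bytes(len(obj))  # evict the stale contents before the next reuse
        self._free.append(obj)

pool = Pool(4)
buf = pool.acquire()   # "request one off the pool"
buf[0:5] = b"hello"    # work with it however you like
pool.release(buf)      # back it goes, ready for the next caller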
If only my mom was a programmer, I could have easily explained why I want to adopt a kid and not make my own. :/ -
Rubber ducking your ass in a way, I figure things out as I rant and have to explain my reasoning or lack thereof every other sentence.
So lettuce harvest some more: I did not finish the linker as I initially planned, because I found a dumber way to solve the problem. I'm storing programs as bytecode chunks broken up into segment trees, and this is how we get namespaces, as each segment and value is labeled -- you can very well think of it as a file structure.
Each file proper, that is, every path you pass to the compiler, has it's own segment tree that results from breaking down the code within. We call this a clan, because it's a family of data, structures and procedures. It's a bit stupid not to call it "class", but that would imply each file can have only one class, which is generally good style but still technically not the case, hence the deliberate use of another word.
Anyway, because every clan is already represented as a tree, we can easily have two or more coexist by just parenting them as-is to a common root, enabling the fetching of symbols from one clan to another. We then perform a canonical walk of the unified tree, push instructions to an assembly queue, and flatten the segmented memory into a single pool onto which we write the assembler's output.
I didn't think this would work, but it does. So how?
The assembly queue uses a highly sophisticated crackhead abstraction of the CVYC clan, or said plainly, clairvoyant code of the "fucked if I thought this would be simple" family. Fundamentally, every element in the queue is -- recursively -- either a fixed value or a function pointer plus arguments. So every instruction takes the form (ins (arg[0],arg[N])) where the instruction and the arguments may themselves be either fixed or indirect fetches that must be solved but in the ~ F U T U R E ~
Thusly, the assembler must be made aware of the fact that it's wearing sunglasses indoors and high on cocaine, so that these pointers -- and the accompanying arguments -- can be solved. However, your hemorrhoids are great, and sitting may be painful for long, hard times to come, because to even try and do this kind of John Connor solving of pinky promises that loop on themselves is slowly reducing my sanity.
But minor time travel paradoxes aside, this allows for all existing symbols to be fetched at the time of assembly no matter where exactly in memory they reside; even if the namespace is mutated, and so the symbol duplicated, we can still modify the original symbol at the time of duplication to re-route fetchers to its new location. And so the madness begins.
Effectively, our code can see the future, and it is not pleased with your test results. But enough about you being a disappointment to an equally misconstructed institution -- we are vermin of science, now stand still while I smack you with this Bible.
But seriously now, what I'm trying to say is that linking is not required as a separate step as a result of all this unintelligible fuckery; all the information required to access a file is the segment tree itself, so linking is appending trees to a new root, and a tree written to disk is essentially a linkable object file.
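(Stripped of the sunglasses and the cocaine, the queue trick is just deferred lookups. A rough sketch, with every name invented:)
symbols = {}                      # name -> final address, filled in during layout

def fetch(name):
    return lambda: symbols[name]  # a thunk: the lookup happens in the ~ F U T U R E ~

queue = [("LOAD", fetch("clan.foo")), ("JMP", fetch("clan.bar"))]

symbols["clan.foo"] = 0x10        # the layout pass assigns addresses afterwards
symbols["clan.bar"] = 0x40

program = [(op, arg() if callable(arg) else arg) for op, arg in queue]
print(program)                    # [('LOAD', 16), ('JMP', 64)]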
Mission accomplished... ? Perhaps.
This very much closes the chapter on *virtual* programs, that is, anything running on the VM. We're still lacking translation to native code, and that's an entirely different topic. Luckily, the language is pretty fucking close to assembler, so the translation may actually not be all that complicated.
But that is a story for another day, kids.
And now, a word from our sponsor:
<ad> Whoa, hold on there, crystal ball. It's clear to any tzaddiq that only prophets can prophecise, but if you are but a lowly goblinoid emperor of rectal pleasure, the simple truths can become very hard to grasp. How can one manage non-intertwining affairs in their professional and private lives while ALSO compulsively juggling nuts?
Enter: Testament, the gapp that will take your gonad-swallowing virtue to the next level. Ever felt like sucking on a hairy ballsack during office hours? We got you covered. With our state of the art cognitive implants, tracking devices and macumbeiras, you will be able to RIP your way into ultimate scrotolingual pleasure in no time!
Utilizing a highly elaborated process that combines illegal substances with the most forbidden schools of blood magic, we are able to [EXTREMELY CENSORED HERETICAL CONTENT] inside of your MATER with pinpoint accuracy! You shall be reformed in a parallel plane of existence, void of all that was your very being, just to suck on nads!
Just insert the ritual blade into your own testicles and let the spectral dance begin. Try Testament TODAY and use my promo code FIRSTBORNSFIRSTNUT for 20% OFF in your purchase of eternal damnation. Big ups to Testament for sponsoring DEEZ rant.3 -
So Im still a college student but my worst burnout happened a week before finals this year as a sophomore at DigiPen
In the same week, I had to conduct and submit 10 playtests for my competitive Unity game by Friday, submit said game polished and completed on Sunday, and finish and submit my team game project, which I'd been working on the whole year with 10 other people, also on Sunday (where we would discover so many TCR submission issues that we ended up finally resubmitting on the Thursday of the following week).
On top of this I had to write a memory manager for my operating systems class due Thursday, a water-retention assignment involving recursive queues for my Data Structures class due Saturday, and to top it all off, that class also had a final on the Thursday the memory manager was due :'). I don't know how I managed to get OK sleep.
Everything was due that week so that all game teams could have the week before finals to work on submitting, and some CS teachers also moved their finals earlier to theoretically distribute the load (which sucks for people in my major because we're almost a double major between CS and Game Design). However, my team wanted to submit early to snatch some bonus points, but we ended up having to resubmit late anyway :(. Thanks to the week of hell, we were already burned out when trying to fix our resubmission.
I love the school and the people in it, but there's a reason our most-heard phrase is "I want to die", and no, it's not just a millennial thing, I swear.
Can we please pick a consistent pattern across the same app? I don't have enough short term memory to deal with this.
-
Oh... I don't know what to pick...
So i will pick 3 projects from the 3 stages of my dev "career":
1. Right after i discovered programming and learned how if, while and similar structures worked. The language was Object Pascal with Delphi 2007.
That was a "safe" with a stupidly complicated lock (text inputs, sliders, etc.); it opened a secret folder in the end.
2. Embedded code for an ATmega8 AVR, Atmel Studio, pure C but without memory management (i didn't even know that existed).
It was a Pip-Boy knockoff: a 16x2 display and a small keyboard connected to an Arduino-like board that i made on a protoboard.
It wasn't that much of a Pip-Boy; it was more of a showcase of the ATmega8's internal systems (ADC, timers, interrupts and such).
3. DataLab: after helping my friend with his master's thesis (we met on Discord, long story, i was in high school) i decided that MATLAB is shit and i created a visual scripting environment. Language: C#, .NET 4 (in the latest version).
I remade the whole program from scratch once, significantly improving everything (code reuse, better algorithms, data processing, code readability and edge cases). I learned good practices from everywhere. I learned how to use git.
The DataLab project looks just like LabVIEW (i hadn't noticed that it even existed...). It is frozen now because of my mental state, but i'm planning on using it in my CV when i look for jobs during the holidays. There are many things i can improve in that program but... first i have to fix myself. -
Me and this friend of mine were usually average in college subjects. We were not really bad at them, we just never got any exceptional marks in those subjects.
So when our 4th-sem results came, a third friend of ours got really good marks in some subject, like in the 90s, and we again had marks around the 70s.
At that time we both knew that we understood that subject way better than this topper guy, but he had just crammed everything about it word for word and got the better marks.
We thus believed that marks don't matter, it's the knowledge; and we both knew it's stupid to cram useless things which could easily be looked up in documentation or on the internet when required.
But last sem, something different happened. looks like mah boy was a little envious on the inside: he scored a whopping 88%, close to that topper friend of ours. i was happy watching his happiness, and he was saying "dude, this sem i will even try to beat that guy in marks."
Even though neither of them is a class topper, they are somehow running the race to be one. I, on the other hand, am still firm in my belief of not cramming stupid shit just to get the status of some 'topper'.
even though cramming subject knowledge is not a total waste, i still believe we should only understand what we need to understand, like learning the moral from a war story, not cramming the actual war dates.
Some might find this quality of mine to be the reason i am 'average', but i feel totally fine with it. I have trained myself to be able to look up a particular resource online faster than they are able to look up that resource crammed into their brain's memory, and i wonder if i should feel guilty about it. Yet society will always see me as an 'average' guy and them as 'winners' -
I need to tell you the story of my MOAB (Mother of all bugs).
I need to write some stuff in C (which i am fairly used to) and have a function that allocates memory for a matrix on the heap. The matrix has rows and columns properties and an associated data array, so it looks like this:
struct Matrix {
    uint8_t rows;
    uint8_t columns;
    uint8_t data[];  /* flexible array member, sized at allocation time */
};
I allocate rows*columns + 2 bytes of memory for it (the 2 extra bytes hold rows and columns).
I also have a function to zero it out, which does something like this:
for (int i = 0; i < rows*columns; i++) {
    data[i] = 0;
}
Let's come to the problem:
On my Mac the whole thing works and passes all tests. We tried the code on a Linux machine and suddenly it crashed in various places: sometimes realloc got an invalid pointer, sometimes free got an invalid pointer, and basically the code crashed at arbitrary points randomly.
I was confused af because did i really make THAT many errors?
I found out that all the errors occurred when testing my matrices, so i looked into it more and observed it through the debugger.
Eventually i came to the function that zeroes out my matrix, saw the loop counter go unusually high, and wondered if my matrix really was that big.
Then i saw it
The matrix wasn‘t initialised yet
It had arbitrary data that was previously in the heap.
It zeroed out a huge chunk of the heap space.
It literally wrote a zero to a shitload of addresses, which invalidated many pointers.
You can imagine my facepalm2 -
I need to write a standalone server in Java 1.7 and have it
-handle GET urls and map them to different classes/methods
-extract the query params and expose them to the method
-Can respond in JSON by serialising the POJO or a list of them
We have an existing server that I think uses JBoss, but it takes forever to start and uses a lot of memory.
I also wrote one before with just a (Grizzly?) HttpServer, so I had to manually implement the above as needed. It only needed to do one thing, so really just 1 path.
Similar situation this time, but I'd prefer not to have to implement this stuff manually, and I need it to be a bit more open to extension.7 -
TL;DR I have to bump a Redis cluster from t3.medium to m6g.large just to get enough network bandwidth even though I have no need of the extra memory.
Debugged an interesting issue today.
I am adding ElastiCache to a project to reduce strain on the single-node Postgres DB.
Deployed a Redis replication group with 2 shards, with multi-AZ replication for resilience.
Everything was going well. We aren't caching that much atm, so it was barely using 100MB of memory.
Suddenly, when our US region comes online, latency skyrockets and the logs are full of Jedis timeout errors.
Still no issue with memory or node CPU.
The cause? Arbitrary network bandwidth throttling by AWS. The app currently processes about 3,000 requests per second, so we were exceeding Amazon's random-ass allowances, which aren't documented anywhere.1 -
Compare and harmonize the web configs
Oh no someone set execution timeouts to 14 days
Fuck fuck fuckity duck
Hey compare all the web configs of all environments and harmonize them all wtf cmon bruh do your job as a developer
Take them and back them up into svn. What do you mean svn isn't a backup system of course it is well it's the only thing we have fuck
What do you mean we have shit logging where people will catch an exception and only print the word exception in the log you can figure it out can't you we have live production issues that have to be solved now what the fuck
How dare you make a mistake copying our shitload of a bloated codebase and configuring our 100s of different options all by fucking hand what the fuck dude do you write anything down?
Please catalogue all the exception mails we are getting but we have no db or error reporting system so they all just plop into the inbox and that's all your fucking data figure it out kid
This is a rewarding, fulfilling job where you can be both dev ops and a developer and manage all of our fucking environments of which there are about 15, all on your own, with no sort of tool or software to aid you because haha what the fuck we wouldn't make your life easy
What's that, you want to spend time writing stuff or changing stuff that will make it easier for you fuck that bruh get back to your billable tasks like holy shit you think this is a charity or some shit
Live production issues
Live production issues
Production issues. A ghost in the machine. Find it fix it find it fix it find it fix it cmon why can't you fix it I expect you to spend your day hopelessly pretending to try to solve something you fucker
One of the only people able to help you sometimes though he's a bit of an old lackey, yeah he's fucking leaving see ya seeya kid and now we're not hiring anyone to fucking help you no no no managing and monitoring the environments is your job all of them every single one do you know all the configuration values for them yet??
Instead we are hiring a new sales person to fucking make us some more money and we don't need another developer to help you in fact let's have you use this mid-end retail computer from 2014 to develop on yeah yeah oh but all our shitty code and Visual Studio will destroy your memory but too bad!! Hahahahahdhsj
Go-live is all on you, why are you so slow
How long will it take
How long will it take
How long will it take
How long will it take
How long will it take holy shit
Give a time estimate for something that I don't fucking know how about it will take till fuck-you o'clock4 -
Python makes me muse sometimes.
Gunicorn has a preload mode: instead of every worker loading the app separately, the master loads it first and only then forks.
So Gunicorn starts, and once it has loaded the app it forks the workers (Uvicorn / FastAPI in my case).
https://github.com/tiangolo/...
So if we add a function that creates the app, this function will be executed before forking; the memory state at app creation is then inherited by every fork (copy-on-write).
You can thus spawn 40 workers and they will all share the same ML models.
Or, in my case, a client that does some things which should only be run by a single thread (with locking).
So the client has a cache; as long as I load the cache during the create_app phase, the cache is shared between all instances instead of being created per instance.
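(A minimal sketch of the pattern; create_app and load_cache are invented names, and you'd run it with something like: gunicorn -w 40 -k uvicorn.workers.UvicornWorker --preload app:app)
# app.py
from fastapi import FastAPI

def load_cache():
    # expensive, read-mostly structure, built ONCE in the Gunicorn master
    return {"answer": 42}

def create_app():
    app = FastAPI()
    cache = load_cache()  # executed before the fork when --preload is on

    @app.get("/lookup")
    def lookup():
        # every forked worker sees the same pre-built cache (copy-on-write)
        return {"cached": cache["answer"]}

    return app

app = create_app()  # module level, so --preload builds it pre-fork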
It's ... Such a small detail. So simple.
Yet completely fucks my brain.
It's logical, yes. I understand what it does, yes.
But it still makes my brain fart. -
A philosophical question about maintenance/updating.
There is no need to repeat the reasons why we need to update our dependencies and our code. We know them, especially regarding security issues.
The real question is: does that indicate a failure of automation?
When i started thinking about code, and also when i was a kid watching all these sci-fi universes with robots etc, the obvious assumption was that you build an automation to do a job without having to work on it anymore. There is no point in automating something that needs constant work on top of it.
When you have a car, you usually do not upgrade it all the time; you do some maintenance (oil, tires), but that keeps the work on it at a reasonable amount.
A better example is the abacus, a calculating device which you know works the way it works.
A promise of functional programming is that, because you are based on algebraic principles, you do not have to worry so much about your code; you know it will do the logical thing it is supposed to do.
The Unix philosophy produced software that has been "updated" so little compared to all these modern apps.
Coding, because of its changeable nature, is the first victim of humanity's insatiable nature.
The modern software industry has so many techniques and principles (solid, liquid, patterns, tests asserting that the air is air) and still needs so many developers to work on a single project.
I know that you will blame market needs (you cannot understand the need from the start, you have to do it agile), but i think that this is also part of the problem.
Old devices evolved at a much slower pace. Radio was radio, and a radio still does its basic job the same way (the upgrades were only some memory features, like saving your beloved frequencies, and screen messages).
Although all answers are valid, i still feel that we have failed. We have failed so much. The dream of being a programmer is to build something that brings you money or satisfaction, and then, when you are bored, to build something completely new.13 -
tell me guys what would you prefer:
function a(){
..
b(..)
..
b(..)
..
}
function b(p1,p2,p3,p4,p5,p6){
...
}
or
function a(){
..
b(..)
..
b(..)
..
}
function b(
p1,
p2,
p3,
p4,
p5,
p6
){
...
}
if you read this rant before expanding, you got complete context on what function a is, that it calls b 2 times, and how function b looks.
if, instead of the first option, i had used the 2nd block, you wouldn't even know the 2nd param of function b without expanding this rant.
my point?
i prefer keeping unnecessary info on one line, and a lot of linters disagree by splitting up the code. and most importantly, my arrogant TL disagrees by saying he prefers the split code "for readability" and because "he likes code this way, old-eng1 likes this and old-eng2 likes this".
why tf does an IDE have a horizontal scrolling option available when you are too stupid to use it?
ok, i know some smartass is going to point out that i too can use vertical scrolling, but hear me out: i am optimising this!
case 1: a function with 7 params is NOT split into 7 lines. let's calculate the effort to remember it:
- since all params have similar characteristics (they will be of some type, might have defaults, might be a suspendable/async function etc), each param takes a similar amount of memory-effort points, say 5sp each.
- total memory effort = 5sp * 7 = 35sp.
- say a human has 100sp of fast memory storage; they can use the remaining 65sp for loading, say, 5 small lines above or below.
- but since the 5 lines above have already been read and are still visible on screen, they don't need to be loaded again and again, and we can just check the lines below.
- thus we are able to keep 65+35+65 = 165sp, or about 11 lines of code, in our fast memory with just 100sp of brain storage.
case 2: the function with 7 params IS split into 7 lines.
- in this case all lines are somewhat similar: 5sp for the param lines, as they are still similar, which implies the same 35sp for storing the current function and params.
- the remaining 65sp can only be used to store the next 5 lines (13sp each), as the previous code is no longer visible.
- plus, if you wanna refresh the code above, you gotta scroll, which removes the bottom code from the screen, and now your 65sp of bottom code is overwritten by 65sp of top code.
- thus, at any time, you are storing only 6 lines' worth of code info. this makes you slow.
this is some imaginary math, but i believe it works10 -
Just had a memory pop up about my uni days about 5 years ago. I was in my junior year in business school and was doing a "consulting" project involving the whole class (200 students). Groups of 4 were assigned an international company in either Europe, Asia, or South America. We'd visit them (as well as do some sightseeing) and learn about them (performance, market positions, products) during Spring Break and come up with a real proposal. We would then compete with the other teams, and the winning pitches for each company would be presented in the school auditorium in front of the entire class.
Our team didn't get that far but that's not the point. We did win the individual classroom competition. Our company was Deutsche Telekom (owner of T-Mobile).
This was in 2010, when the iPhone and smartphones were starting to become mainstream... And our team's idea was location-based advertising.
Looking back, we basically predicted the future... though we got the wrong industry...
It's also sort of funny though because I remember the main reason we came up with the idea was to be different.
All the other teams just went with some expansion plan to a neighboring country, or cutting costs.... pure MBA/business-plan stuff. But I guess I was a natural techie, so I thought of a tech idea instead.
We had a meeting with our professor after he picked us, and he told us he had a history of spotting future hits. We were like "hm... ok... let's give it a shot... we definitely got an A!", but at that point I was sort of skeptical whether this would actually work in real life (the basic idea was they would sell ads to local businesses, and if you were nearby, you would get a text message with an offer).
But I guess he was pretty right... we just needed Google or Facebook to have been our company... though Groupon or Yelp would work too... basically a tech company with larger scale rather than a mobile carrier...1 -
Teaching coworkers performance tuning: we have enough memory that you don't need to write to disk... really, the data isn't even a MB....
-
Please support old web browser versions for all eternity.
I hate it when I open a site like SoundCloud one day and am greeted with a "we no longer support your browser" notice. Now I am forced to update my browser to a new version with removed features. On Android, Chrome sometimes crashes due to an apparent memory leak, so I have to go back to Samsung Internet, which does not work with some sites. Also, the Samsung clipboard manager (which can hold up to 20 items) is only available on Samsung Internet, not Chrome or Firefox.
I also have to update the browser on my live bootable USB stick because sites stop supporting it. Any browser from 2015 onwards (ECMAScript 6) should be supported until at least 2050, so that I never have to fear that a site one day spontaneously stops working in my browser.
I would like to browse the Internet forever without ever having to worry about pages stopping to work one day. Browser vendors might also deprecate support for devices and operating systems. Old devices also have replaceable batteries and are easier to repair. I don't want to be forced to buy new devices that are difficult and expensive to repair.20 -
Portrait of Me, Writing Documentation -- a short French film:
The processes applied to any section of memory utilized for a given purpose should be strictly limited to those declared by the associated type that encapsulates the purpose in question until release or mutation.
That is to say, improperly encoding the intended usage of such a block by utilizing an identical type or alias thereof for a multitude of incompatible situations, giving place to guesswork to arise, constitutes the prostitution of an abstraction.
Such heinous acts of symbolical pimping have received strong condemnation from multiple digital rights organizations, as well as our own, prestigious office. Let it be made Crystal, Alizé and Hennessy clear, that we will not stand for this kind of degenerate practice, and that any heretical sects and cabals built around worship of the strange creatures that arise every eleventh night from the depths of the Black Mausoleum will be prosecuted with the full force of the law.
As a young, corageous man once said at the peak of his career: "it is only through the self-inflicted, hyperbolic discharge of smouldered, comminute perennial anadenanthera colubrina spermatic fluid that the cannonical transfiguration of our collective rectosigmoid junction can be brought to fruition". He was immediately violated with might and ire far beyond our wildest, most profligately depraved fantasies, yet his message lives on.
I leave you now to be ritually and figuratively blown by a possessed mortician that is to become concubine to our dark master; the long journey to the old graveyard will be perilous, and my destination most assuredly fatal, as I depart to give my firstborn to our Lord Berzchjanzad -- a blood sacrifice meant to appease him from peeling off my skin and refashioning it into a bloodied scarf to be worn around his thumping, grandemonic cock.
And in this moment, as I stare blankly at this teleprompter, the president wishes to reassure you of his sacred vows of stalwart and promethean gayhood, and may __these__ nuts bounce on chins forevermore. Here's to *not* bleeding to death in retribution for this unending litany of sins...
Yet all predictions come to pass.
««««««««««« finẽ »»»»»»»»»»» -
We’re only random people living in random places, speaking random languages, eating random food, sleeping, studying and working random hours. Traveling to random points on a sphere.
Just random range is different.
Just random stuff happening at the crossroads of two random dots, and the entropy speeds up or slows down.
Nothing special at all.
Just a finite state machine iteration.
I mean, the amount of effort we put into explaining infinity is outstanding.
What if there is no infinity at all?
What if infinity is just a misunderstanding in our interpretation of the world around us? It's just pixels, resolution, Gaussian splatting, quantum states, you name it.
Hey man, the world is flat. Just put it in 2D space. How much space do you need, from a simulation perspective where your patient's eyes can only see up to a certain amount of light particles per second through a shitty lens?
Propose world optimization techniques: slow down the subject's perception, tiredness introduced. Compress memory, sleep introduced. Limit neurons, CPU power assigned. Deploy on cloud - put it to life. Exit 0: body failure. Exit 1: suicide. kill -9: killed by tty from IP EARTH.X.Y
What can you do to make the world around this planet look alive? Make it blink.
We developers are lazy and I believe that nature is even more lazy than us.
You think you're going to the elevator right now? You're going to the preloader. Looking out the window equals playing video from playback. It never goes live, just a precomputed FSM. Cars, trains, airplanes? Preloaders everywhere. Highways to split traffic between cities and communication. The roads and city planning department is a matrix maintenance department. And don't get me started about space.
Space is empty because it's not even finished. So they put it all behind glass called the Milky Way. You know how glass looked 500 years ago? It was milky, so it's the Milky Way so we don't see shit.
If space were finished I would have started writing this text on Mars, finished it and sent it from Earth, but no, it's light years, guys; a light year is not a second for matter. A light year is a second only for the injected-thoughts exchange. Thoughts of the global computer called generative AI that they introduced on local computing devices called the cloud.
Even the preloader system is not present; they left us with the one map and an overpopulated demo. What a shit hole. I bet they're increasing the temperature right now to erase this alpha build and cash out. Obviously there are so many bugs here that this one can't be fixed anymore. Too many viruses.
Hope for 0days to start happening so we can escape using time travel or something.
I bet they cut the budget or something, moved the team to other projects. Or even worse, the solar system team got laid off, because we are just neurons that were ordered to do it. And now we're stuck in some maintenance mode: no new physics, no new thoughts to pursue, just slow degeneration. I would pay more for the next run and switch to another galaxy far far away where they at least have more modern light-speed technology.
What do you think about it, Trinity? Not even worth wasting your time on that. No white rabbit this time.
I do not recommend this game at this stage of early access.
- only one available map; despite promises of expansions over the years, not a single DLC arrived,
- missing space adventures,
- no galaxy travel mode, only teaser trailers of what you can do in other “universes”,
- developers don't respond to complaints,
- despite the diversity of species and buildings at first sight, the world looks too generic,
- instead of new features, bots with mind manipulation, A/B testing and data harvesting were introduced,
- death anti-cheat mode installed -
!comforting
TL;DR - I’ve done some thinking about operating systems and sticking to one
Mk
So I, like many of you, have seen far more than my fair share of “X operating system is perfect for it all, so don't use Y operating system because it's just awful” posts.
Over this week I've really done some thinking and experimenting with multiple devices and OSes and programs for various tasks. People coming from Windows over to Linux (like myself) tend to diss Windows (rightfully so for the most part, but still). I've also noticed that the Android vs. Apple debate can get heated among users.
Listen guys,
iOS has its shortcomings obviously, UI being kind of a big one; but no one can deny that Apple shoves some of the nicest hardware into their devices. Yes, this stuff is pricey as hell obviously, but the new Macs come with an i9 and quite a bit of memory as well. Apple devices tend to have longer-lasting batteries too - I can't count the times where I've just turned on my mobile hotspot and stuck my Android in my pocket to use my iPhone (it's a wifi-only 5s). The applications run nicely on Apple hardware.
I couldn't learn even half as much programming as I do on my Android though; Termux is a godsend, and I'm able to run and test scripts right there in the palm of my hand. Can't get that on an iPhone.
Some of my favorite game developers only develop for Windows; I'm dual booting for that sole reason (Warframe and the Epic Games launcher don't properly run through Wine).
Just boil it down inside for a second: you might have come from a more “user friendly” operating system to learn on one that is less so - whether you wanted the freedom and wiggle room for customization, or just a more developer-friendly working environment (God bless conky and its devs) - so you didn't have to be locked into one way of seeing things. Putting a previously used OS down directly violates that thought process, and at that point you're just another Windows hater, or Arch junkie, or whatever. I think we need to be open to appreciating the pros of every system, even if we almost never use some of them, and we should try not to put down other devs-to-be or csci/sec enthusiasts because of that either. -
Bluemix: Hi, here, have 1 instance of your app running, with 128M memory and 1024M disk. Btw, we are gonna take a hell of a lot of time to get your code up and running, and we love living at the 500.
-
Imagine implementing Angular Universal, which by itself is quite painful... Time investment: 8 hours.
Local testing: 12 hours by the team.
... Deploying to Google App Engine because we need a Node.js server and we don't have one yet ... Server crashing 24/7 with random errors, most of them memory related. Spent 3 days almost rewriting everything ... trying to find the memory leak.
Then, when I was about to give up, I stumbled over a GitHub issue where someone said something about tiers on App Engine.
Me going: WTF, there are tiers? Everywhere it just says automatically scaling instances ...
Googling ... setting it to the highest tier ... app works. Apparently I was on the lowest tier, which only has 156 MB of RAM; the app needs 150-250 MB. Me now crying in a corner about my wasted 4 days.
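If you hit the same wall: on App Engine standard, the memory tier is chosen by the instance_class setting in app.yaml. A minimal sketch, assuming a Node.js app on the standard environment - the runtime version and class name here are illustrative, not the actual config from this story:

```yaml
# app.yaml - minimal sketch, not the actual config from this rant.
# On App Engine standard, instance_class picks the memory/CPU tier.
# The default class has the least RAM, so an app that needs ~250 MB
# has to be bumped to a bigger class such as F4.
runtime: nodejs20      # assumed runtime version, for illustration
instance_class: F4     # a higher tier with more memory than the default F1
```
-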
One nightmarish project that was doomed from the beginning had me as the sole developer. I could hardly sleep when we began testing on a separate test system, but with (nearly) all the config stored in shared memory and copied from the production system, I dreaded, half awake, that the production server database connection was still configured in the test system and that it was shooting all its test data repeatedly to prod.
Finally drove to the company in the middle of the night at 4 o'clock. Checked everything was OK, then tried to sleep 3 hours before the start of the workday.
This system also had the most hideous memory corruption in some shared memory that was used across several processes and should have been thoroughly protected by a mutex, but somehow, sometimes this crucial map, which was used to speed up access to all the customer data, just contained garbage.
Still haunts me to this day. (Like xkcd's unresolved tension of a non-matching parenthesis - an unresolved bug. -
Hi everyone, how's it going today? Been learning a lot lately. Question: when working with lib2cpp.so files, what's the best inspector for them? And what do these files contain? (example: gamelib.so)
I know a .so file is compiled C++, so I think it has something to do with offsets and memory ranges, something like that.
But I'm trying to open one lol
We have moved to AndLua and I learned the API fully.
app: https://andnixsh.com/2020/05/...
AndLua+ is a lightweight scripting tool that lets you easily do script programming and testing on your Android phone. It's a very useful tool for those who need scripting (Android development or modding). AndLua+ is based on the open source project Lua. It uses the simple and beautiful Lua language, which simplifies cumbersome Java statements, and it supports most Android APIs, with installation-free running and debugging, making development on your mobile phone easier and faster. The permissions it requests are for the programs you write to use, so please rest assured when using it.
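To give a feel for the “simplifies cumbersome Java statements” claim, here's a minimal sketch of an AndLua-style script. It assumes the androlua/luajava Java bridge that AndLua is based on - the require "import" helper, the activity global and the Toast call follow the upstream androlua pattern and are not verified against this exact app:

```lua
-- Minimal sketch of an AndLua-style script (androlua/luajava bridge
-- assumed; not verified against this exact app).
require "import"               -- androlua helper that enables `import`
import "android.widget.Toast"  -- bind an Android API class into Lua

-- One line of Lua instead of the equivalent Java boilerplate:
Toast.makeText(activity, "Hello from Lua!", Toast.LENGTH_SHORT).show()
```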