Search - "no memory"
-
Interview with a candidate. He calls himself a "C++ expert" on his resume. I think: "oh, great, I love C++ too, we will have an interesting conversation!"
Me: let's start with an easy one, what is 'nullptr'?
Him: (...some undecipherable sequence of words that didn't make any sense...)
In my mind: hmm, I probably didn't understand him right. Let's try again with something simple and more generic
Me: can you tell me about memory management in C++?
Him: you create objects on the stack with the 'new' keyword and they get automatically released when no other object references them
In my mind: wtf is this guy talking about? Is he confusing C++ with Java? Does he really know C++? Let's make him write some code, just to be sure
Me: can you write a program that prints numbers from 1 to 10?
Ten minutes and twenty mistakes later...
Me: okay, so what is this <int> here in angle brackets? What is a template?
Him: no idea
Me: you wrote 'cout', why sometimes do I see 'std::cout' instead? What is 'std'?
Answer: no idea, never heard of 'std'
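(For the record, a minimal sketch of the kind of answers I was fishing for; one of many valid ways to write it, not the only one:)
#include <iostream>
#include <vector>

int main() {
    // nullptr: a typed null pointer constant (since C++11)
    int* p = nullptr;
    if (p == nullptr) { /* no object here, and that's fine */ }

    // 'new' allocates on the HEAP, not the stack, and nothing
    // frees it for you; there is no reference counting built in.
    int* heap_obj = new int(42);
    delete heap_obj; // manual release (or use smart pointers)

    int stack_obj = 7; // THIS lives on the stack; no 'new' involved
    std::cout << stack_obj << '\n';

    // 'std' is the standard library namespace; 'std::cout' is the full
    // name. std::vector<int> is a template: a class parameterized on
    // the type in the angle brackets.
    std::vector<int> numbers;
    for (int i = 1; i <= 10; ++i) // the famous 1-to-10 program
        numbers.push_back(i);
    for (int n : numbers)
        std::cout << n << ' ';
    std::cout << '\n';
    return 0;
}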
I think: on his resume he also said he is a Java expert. Let's see if he knows the difference between the two. He *must* have noticed that one is byte-compiled and the other one is compiled to native code! Otherwise, how does he run his code? He must answer this question correctly:
Me: what is the difference between Java and C++? One has a Virtual Machine, what about the other?
Him: Java has the Java Virtual Machine
Me: yes, and C++?
Him: I guess C++ has a virtual machine too. The C++ Virtual Machine
Me (exhausted): okay, I don't have any other questions, we will let you know
And this is the story of how I got scared of interviews29 -
Being paid to rewrite someone else's bad code is no joke.
I'll give the dev this, the use of gen 1, 2, 3 Pokemon for variable names and class names is beyond fantastic in terms of memory and childhood nostalgia. It would be even more fantastic if he'd spelt the names correctly, or used it to make a Pokemon game and NOT A FUCKING ACCOUNTANCY PROGRAM.
There's no correspondence between name and type, or even number. Dev has just gone batshit, left zero comments, and now somehow Ryhorn is shitting out error codes because of errors existing in Charmeleon's asshole.
The things I do for money...24 -
Toilets and race conditions!
A co-worker asked me what issues multi-threading and shared memory can have. So I explained the whole locking thing to him. He wasn't quite sure whether he got it.
Me: imagine you go to the toilet. You check whether there's enough toilet paper in the stall, and it is. BUT now someone else comes in, does business and uses up all paper. CPUs can do shit very fast, can't they? Yeah and now you're sitting on the bowl, and BAMM out of paper. This wouldn't have happened if you had locked the stall, right?
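(That stall lock, sketched in C++; a minimal example assuming std::mutex and std::thread, which is one common way to lock the door:)
#include <iostream>
#include <mutex>
#include <thread>

int toilet_paper = 1;  // shared memory: one roll left
std::mutex stall_door; // the lock on the stall

void use_toilet(const char* who) {
    // Lock the stall BEFORE checking for paper; without this,
    // both threads can pass the check before either takes the
    // roll. BAMM, out of paper.
    std::lock_guard<std::mutex> lock(stall_door);
    if (toilet_paper > 0) {
        toilet_paper--;
        std::cout << who << ": plenty of paper\n";
    } else {
        std::cout << who << ": saw the empty roll and left in time\n";
    }
} // lock_guard unlocks the stall when it goes out of scope

int main() {
    std::thread a(use_toilet, "thread A");
    std::thread b(use_toilet, "thread B");
    a.join();
    b.join();
    return 0;
}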
Him: yeah. And with a single thread?
Me: well if you're alone at home in your appartment, there's no reason to lock the door because there's nobody to interfere.
Him: ah, I see. And if I have two threads, but no shared memory, then it is as if my wife and I are at home, each with a toilet of our own, and then we don't need to lock either.
Me: exactly!12 -
Account guy saw me coding...
account guy: so you type a lot.. how can you remember so much??
me: ??
account guy: I mean there is NO LOGIC in what you do, so you must read these things and type them here... you need to remember a lot.. right??
me: ohh... that... well.. I have very good memory :)
p.s. last line was sarcasm12 -
Had a PR blocked yesterday. Oh god, have I introduced a memory leak? Have I not added unit tests? Is there a bug? What horrible thing have I unknowingly done?
... added comments to some code.
Yep apparently “our code needs to be readable without comments, please remove them”.
Time to move on, no signs of intelligent life here.39 -
So I cracked prime factorization. For real.
I can factor a 1024 bit product in 11 hours on an i3.
No GPU acceleration, no massive memory overhead. Probably a lot faster with parallel computation on a better cpu, or even on a gpu.
4096 bits in 97-98 hours.
Verifiable. Not shitting you. My heart's beating out of my fucking chest. Maybe it was an act of god, I don't know, but it works.
What should I do with it?241 -
In a user-interface design meeting over a regulatory compliance implementation:
User: “We’ll need to input a city.”
Dev: “Should we validate that city against the state, zip code, and country?”
User: “You are going to make me enter all that data? Ugh…then make it a drop-down. I select the city and the state, zip code auto-fill. I don’t want to make a mistake typing any of that data in.”
Me: “I don’t think a drop-down of every city in the US is feasible.”
Manager: “Why? There cannot be that many. Drop-down is fine. What about the button? We have a few icons to choose from…”
Me: “Uh..yea…there are thousands of cities in the US. Way too much data for anyone to realistically scroll through”
Dev: “They won’t have to scroll, I’ll filter the list when they start typing.”
Me: “That’s not really the issue and if they are typing the city anyway, just let them type it in.”
User: “What if I mistype Ch1cago? We could inadvertently be out of compliance. The system should never open the company up for federal lawsuits”
Me: “If we’re hiring individuals responsible for legal compliance who can’t spell Chicago, we should be sued by the federal government. We should validate the data the best we can, but it is ultimately your department’s responsibility for data accuracy.”
Manager: “Now now…it’s all our responsibility. What is wrong with a few thousand item drop-down?”
Me: “Um, memory, network bandwidth, database storage, who maintains this list of cities? A lot of time and resources could be saved by simply paying attention.”
Manager: “Memory? Well, memory is cheap. If the workstation needs more memory, we’ll add more”
Dev: “Creating a drop-down is easy and selecting thousands of rows from the database should be fast enough. If the selection is slow, I’ll put it in a thread.”
DBA: “Table won’t be that big and won’t take up much disk space. We’ll need to set up stored procedures, and data import jobs from somewhere to maintain the data. New cities, name changes, etc.”
Manager: “And if the network starts becoming too slow, we’ll have the Networking dept. open up the valves.”
Me: “Am I the only one seeing all the moving parts we’re introducing just to keep someone from misspelling ‘Chicago’? I’ll admit I’m wrong or maybe I’m not looking at the problem correctly. The point of redesigning the compliance system is to make it simpler, not more complex.”
Manager: “I’m missing the point to why we’re still talking about this. Decision has been made. Drop-down of all cities in the US. Moving on to the button’s icon ..”
Me: “Where is the list of cities going to come from?”
<few seconds of silence>
Dev: “Post office I guess.”
Me: “You guess?…OK…Who is going to manage this list of cities? The manager responsible for regulations?”
User: “Thousands of cities? Oh no …no one in our area has time for that. The system should do it”
Me: “OK, the system. That falls on the DBA. Are you going to be responsible for keeping the data accurate? What is going to audit the cities to make sure the names are properly named and associated with the correct state?”
DBA: “Uh..I don’t know…um…I can set up a job to run every night”
Me: “A job to do what? Validate the data against what?”
Manager: “Do you have a point? No one said it would be easy and all of those details can be answered later.”
Me: “Almost done, and this should be easy. How many cities do we currently have to maintain compliance for?”
User: “Maybe 4 or 5. Not many. Regulations are mostly on a state level.”
Me: “When was the last time we created a new city compliance?”
User: “Maybe, 8 years ago. It was before I started.”
Me: “So we’re creating all this complexity for data that, realistically, probably won’t ever change?”
User: “Oh crap, you’re right. What the hell was I thinking…Scratch the drop-down idea. I doubt we’ll have a new city regulation anytime soon and how hard is it to type in a city?”
Manager: “OK, are we done wasting everyone’s time on this? No drop-down of cities...next …Let’s get back to the button’s icon …”
Simplicity 1, complexity 0.16 -
These facts are killing me
"During his own Google interview, Jeff Dean was asked the implications if P=NP were true. He said, "P = 0 or N = 1." Then, before the interviewer had even finished laughing, Jeff examined Google’s public certificate and wrote the private key on the whiteboard."
"Compilers don't warn Jeff Dean. Jeff Dean warns compilers."
"gcc -O4 emails your code to Jeff Dean for a rewrite."
"When Jeff Dean sends an ethernet frame there are no collisions because the competing frames retreat back up into the buffer memory on their source nic."
"When Jeff Dean has an ergonomic evaluation, it is for the protection of his keyboard."
"When Jeff Dean designs software, he first codes the binary and then writes the source as documentation."
"When Jeff has trouble sleeping, he Mapreduces sheep."
"When Jeff Dean listens to mp3s, he just cats them to /dev/dsp and does the decoding in his head."
"Google search went down for a few hours in 2002, and Jeff Dean started handling queries by hand. Search Quality doubled."
"One day Jeff Dean grabbed his Etch-a-Sketch instead of his laptop on his way out the door. On his way back home to get his real laptop, he programmed the Etch-a-Sketch to play Tetris."
"Jeff Dean once shifted a bit so hard, it ended up on another computer. "6 -
And, the other side, husbands 😂
——————————————————–
Dear Technical Support,
Last year I upgraded from Boyfriend 5.0 to Husband 1.0 and noticed a distinct slow down in overall system performance — particularly in the flower and jewelry applications, which operated flawlessly under Boyfriend 5.0. The new program also began making unexpected changes to the accounting modules.
In addition, Husband 1.0 uninstalled many other valuable programs, such as Romance 9.5 and Personal Attention 6.5 and then installed undesirable programs such as NFL 5.0, NBA 3.0, and Golf Clubs 4.1.
Conversation 8.0 no longer runs, and Housecleaning 2.6 simply crashes the system. I’ve tried running Nagging 5.3 to fix these problems, but to no avail.
What can I do?
Signed,
Desperate
——————————————————–
Dear Desperate:
First keep in mind, Boyfriend 5.0 is an Entertainment Package, while Husband 1.0 is an Operating System.
Please enter the command: ” C:/ I THOUGHT YOU LOVED ME” and try to download Tears 6.2 and don’t forget to install the Guilt 3.0 update.
If that application works as designed, Husband 1.0 should then automatically run the applications Jewelry 2.0 and Flowers 3.5. But remember, overuse of the above application can cause Husband 1.0 to default to Grumpy Silence 2.5, Happy Hour 7.0 or Beer 6.1.
Beer 6.1 is a very bad program that will download the Snoring Loudly Beta.
Whatever you do, DO NOT install Mother-in-law 1.0 (it runs a virus in the background that will eventually seize control of all your system resources).
Also, do not attempt to reinstall the Boyfriend 5.0 program. These are unsupported applications and will crash Husband 1.0.
In summary, Husband 1.0 is a great program, but it does have limited memory and cannot learn new applications quickly.
You might consider buying additional software to improve memory and performance. We recommend Food 3.0 and Hot Lingerie 7.7.
Good Luck,
Tech Support3 -
Apple has a real problem.
Their hardware has always been overpriced, but it at least had defenders pointing out that it was capable and well made.
I know, I used to be one of them.
Past tense.
They have jumped the shark.
They now make pretentious hipster crap that is massively overpriced and doesn't have the basic features (like hardware ports) to enable you to do your job.
I mean, who needs an ESC key? What is wrong with learning to type CTRL-[ instead? Muscle memory? What's that?
They have gone from "It just works" to "It just doesn't work" in no time at all.
And it is Developers who are most pissed off. A tiny demographic who won't be visible on the financial bottom line until their newly absent software suddenly makes itself known two, three years down the line.
By which time it is too late to do anything.
But hey! Look how thin (and thermally throttled) my new laptop is!19 -
Me: you should not open that log file in Excel, it's almost 700MB
Client: its okay, my computer has 4gb ram
Me: *looking at client's computer crashing*
Client: the file is broken!
Me: no, you just need to use a more memory efficient tool, like R, SAS, python, C#, or like anything else!5 -
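(What a "more memory efficient tool" buys you, as a sketch in C++: scan the log line by line so only one line is ever in RAM. The file name and the "ERROR" filter are invented for the example:)
#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream log("huge.log"); // hypothetical 700MB log file
    std::string line;
    long matches = 0;
    // Streaming: one line in memory at a time, regardless of file size.
    while (std::getline(log, line)) {
        if (line.find("ERROR") != std::string::npos) // made-up filter
            ++matches;
    }
    std::cout << matches << " matching lines\n";
    return 0;
}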
Senior development manager in my org posted a rant in slack about how all our issues with app development are from
“Constantly moving goalposts from version to version of Xcode”
It took me a few minutes to calm myself down and not reply. So I’ll vent here to myself as a form of therapy instead.
Reality Check:
- You frequently discuss the fact that you don’t like following any of apples standards or app development guidelines. Bit rich to say the goalposts are moving when you have your back to them.
- We have a custom everything (navigation stack handler, table view like control etc). There’s nothing in these that can’t be done with the native ones. All that wasted dev time is on you guys.
- Last week a guy held a session about all the memory leaks he found in these custom libraries/controls. Again, your teams don’t know the basic fundamentals of the language or programming in general really. Not sure how that’s apples fault.
- Your “great emphasis on unit testing” has gotten us 21% coverage on iOS and an Android team recently said to us “yeah looks like the tests won’t compile. Well we haven’t touched them in like a year. Just ignore them”. Stability of the app is definitely on you and the team.
- Having half the app in react-native and half in native (split between objective-c and swift) is making nobodies life easier.
- The company forces us to use a custom built CI/CD solution that regularly runs out of memory, reports false negatives and has no specific mobile features built in. Did apple force this on us too?
- Shut the fuck up5 -
Yesterday was Friday the 13th, so here is a list of my worst dev nightmares without order of significance:
1) Dealing with multithreaded code, especially on Android
2) Javascript callback hell
3) Dependency hell, especially in Python
4) Segfaults
5) Memory Leaks
6) git conflicts
7) Crazy regexes and string manipulations
8) css. Fuck css.
9) not knowing jack shit about something but being expected by others to produce a result with it.
10) 3+ hours of debugging with no success
Post yours27 -
I'm getting ridiculously pissed off at Intel's Management Engine (etc.), yet again. I'm learning new terrifying things it does, and about more exploits. Anything this nefarious and overreaching and untouchable is evil by its very nature.
(tl;dr at the bottom.)
I also learned that -- as I suspected -- AMD has their own version of the bloody thing. Apparently theirs is a bit less scary than Intel's since you can ostensibly disable it, but i don't believe that because spy agencies exist and people are power-hungry and corrupt as hell when they get it.
For those who don't know what the IME is, it's hardware godmode. It's a black box running obfuscated code on a coprocessor that's built into Intel cpus (all Intel cpus from 2008 on). It runs code continuously, even when the system is in S3 mode or powered off. As long as the psu is supplying current, it's running. It has its own mac and IP address, transmits out-of-band (so the OS can't see its traffic), some chips can even communicate via 3g, and it can accept remote commands, too. It has complete and unfettered access to everything, completely invisible to the OS. It can turn your computer on or off, use all hardware, access and change all data in ram and storage, etc. And all of this is completely transparent: when the IME interrupts, the cpu stores its state, pauses, runs the SMM (system management mode) code, restores the state, and resumes normal operation. Its memory always returns 0xff when read by the os, and all writes fail. So everything about it is completely hidden from the OS, though the OS can trigger the IME/SMM to run various functions through interrupts, too. But this system is also required for the CPU to even function, so killing it bricks your CPU. Which, ofc, you can do via exploits. Or install ring-2 keyloggers. Or do fucking anything else you want to.
tl;dr IME is a hardware godmode, and if someone compromises this (and there have been many exploits), their code runs at ring-2 permissions (above kernel (0), above hypervisor (-1)). They can do anything and everything on/to your system, completely invisibly, and can even install persistent malware that lives inside your bloody cpu. And guess who has keys for this? Go on, guess. you're probably right. Are they completely trustworthy? No? You're probably right again.
There is absolutely no reason for this sort of thing to exist, and its existence can only makes things worse. It enables spying of literally all kinds, it enables cpu-resident malware, bricking your physical cpu, reading/modifying anything anywhere, taking control of your hardware, etc. Literal godmode. and some of it cannot be patched, meaning more than a few exploits require replacing your cpu to protect against.
And why does this exist?
Ostensibly to allow sysadmins to remote-manage fleets of computers, which it does. But it allows fucking everything else, too. and keys to it exist. and people are absolutely not trustworthy. especially those in power -- who are most likely to have access to said keys.
The only reason this exists is because fucking power-hungry doucherockets exist.26 -
Story about an obscure bug: https://twitter.com/mmalex/status/...
"We had a ‘fun’ one on LittleBigPlanet 1: 2 weeks to gold, a Japanese QA tester started reliably crashing the game by leaving it on over night. We could not repro. Like you, days of confirmation of identical environment, os, hardware, etc; each attempt took over 24h, plus time differences, and still no repro.
"Eventually we realised they had an eye toy plugged in, and set to record audio (that took 2 days of iterating) still no joy.
"Finally we noticed the crash was always around 4am. Why? What happened only in Japan at 4am? We begged to find out.
"Eventually the answer came: cleaners arrived. They were more thorough than our cleaners! One hour of vacuuming near the eye toy- white noise- caused the in game chat audio compression to leak a few bytes of memory (only with white noise). Long enough? Crash.
"Our final repro: radios tuned to noise, turned up, and we could reliably crash the game. Fix took 5 minutes after that. Oh, gamedev...."5 -
We had the most fucking retarded client today. No, seriously, if you ever beat their level you have a serious mental issue.
They had a mail problem for which they'd need to check at the side of another company since we don't have those fucking logs.
Their statements:
- they entered an email address in the text field of mail-tester.com and were furious that they didn't get the results sent.
Note: it says right on that page that YOU JUST NEED TO SEND AN EMAIL TO THE ADDRESS THAT IS PRE-ENTERED IN THAT TEXT FIELD.
- their company has been a reputable 'conservative' company which hasn't done anything wrong since 19xx so the fact that they'd end up on a blacklist was FUCKING OUTRAGEOUS and bullshit.
- our support wasn't willing to help and only willing to tell them outrageous lies.
- the other it company was only reachable at a premium number and thus expensive to call.
Emails back and forth and finally they CC'd the other company. Their reply was fucking priceless:
"we never had a premium number. Feel free to call us on *number* any time during the week between *time* and *time*.
Then he told us that we should just go back to sleep.
It was way worse than that but due to privacy and my own memory this is all I can tell.
Just wow.3 -
My team handles infrastructure deployment and automation in the cloud for our company, so we don't exactly develop applications ourselves, but we're responsible for building deployment pipelines, provisioning cloud resources, automating their deployments, etc.
I've ranted about this before, but it fits the weekly rant so I'll do it again.
Someone deployed an autoscaling application into our production AWS account, but they set the maximum instance count to 300. The account limit was less than that. So, of course, their application gets stuck and starts scaling out infinitely. Two hundred new servers spun up in an hour before hitting the limit and then throwing errors all over the place. They send me a ticket and I login to AWS to investigate. Not only have they broken their own application, but they've also made it impossible to deploy anything else into prod. Every other autoscaling group is now unable to scale out at all. We had to submit an emergency limit increase request to AWS, spent thousands of dollars on those stupidly-large instances, and yelled at the dev team responsible. Two weeks later, THEY INCREASED THE MAX COUNT TO 500 AND IT HAPPENED AGAIN!
And the whole thing happened because a database filled up the hard drive, so it would spin up a new server, whose hard drive would be full already and thus spin up a new server, and so on into infinity.
Thats probably the only WTF moment that resulted in me actually saying "WTF?!" out loud to the person responsible, but I've had others. One dev team had their code logging to a location they couldn't access, so we got daily requests for two weeks to download and email log files to them. Another dev team refused to believe their server was crashing due to their bad code even after we showed them the logs that demonstrated their application had a massive memory leak. Another team arbitrarily decided that they were going to deploy their code at 4 AM on a Saturday and they wanted a member of my team to be available in case something went wrong. We aren't 24/7 support. We aren't even weekend support. Or any support, technically. Another team told us we had one day to do three weeks' worth of work to deploy their application because they had set a hard deadline and then didn't tell us about it until the day before. We gave them a flat "No" for that request.
I could probably keep going, but you get the gist of it.4 -
Google: “Your websites must load the first byte in under 500ms and be fully loaded with no render blocking and local caching of all external site callouts to even begin to rank in Google searches.”
Me: “Ok, Google. Your wish is my command.”
*Looks at Chrome’s memory usage to load a blank page*7 -
Experience that made me feel like a dev badass?
Users requested the ability to 'send' information from one application to another. A couple of our senior devs started out saying it would be impossible (there is no way to pass objects across a machine's memory boundary), then entertained the idea of utilizing the various messaging frameworks such as Microsoft's ServiceBus and RabbitMQ, but came up with a plan to use 2 WebAPI services (one messenger, one receiver) along with a homegrown messaging API (the clients would 'poll' the services looking for messages) because ServiceBus, RabbitMQ, etc might not be able to scale to our needs. Their initial estimates were about 6 months development for the two services, hardware requirements for two servers, MSSQL server licenses, and padded an additional 6 months for client modifications. Very...very proud of their detailed planning.
I thought ...hmmm...I've done memory maps and created simple TCP/IP hosts that could send messages back and forth between other apps (non-UI), WPF couldn't be that much different.
In an afternoon, I came up with this (see attached), and showed the boss. Guess which solution we're going with.
The two devs are still kinda pissed at me. One still likes say as I walk in the room "our hero returns"....frack him.11 -
Someone on a C++ learning and help discord wanted to know why the following was causing issues.
char * get_some_data() {
char buffer[1000];
init_buffer(&buffer[0]);
return &buffer[0];
}
I told them they were returning a pointer to a stack allocated memory region. They were confused, didn't know what I was talking about.
I pointed them to two pretty decently written and succinct articles, the first about stack vs. heap, and the second describing the theory of ownership and lifetimes. I instructed to give them a read, and to try to understand them as best as possible, and to ping me with any questions. Then I promised to explain their exact issue.
Silence for maybe five minutes. They disregard the articles, post other code saying "maybe it's because of this...". I quickly pointed them back at their original code (the above) and said this is 100% an issue you're facing. "Have you read the articles?"
"Nope" they said, "I just skimmed through them, can you tell me what's wrong with my code?"
Someone else chimed in and said "you need to just use malloc()." In a C++ room, no less.
I said "@OtherGuy please don't blindly instruct people to allocate memory on the heap if they do not understand what the heap is. They need to understand the concepts and the problems before learning how C++ approaches the solution."
I was quickly PM'd by one of the server's mods and told that I was being unhelpful and that I needed to reconsider my tone.
Fuck this industry. I'm getting so sick of it.26 -
Today, I was told to investigate why the software doesn't work on "some" computers. I had no previous experience with that particular software but I just had to make some tests... easy, right? As soon as I ran the software, my computer crashed (I literally had to restart the pc). I asked my colleagues if I did something wrong but the set up seemed ok.
Later, in a random discussion about the software I found out it does "a little memory allocation". I opened the performance tab in task manager and ran the software again. In an instant, the RAM went from 1.3GB to 7.66GB (my pc has 8GB of RAM).
In an attempt to find out how such a monstrosity was created, I found out the developer who made the software had 16GB of RAM on his pc.
I have found something that eats RAM more than Chrome... brace yourselves.8 -
One of my friends at college asked me why her computer is running slow even when she is only running Chrome.
Me: how much memory does it have?
Her: 1TB.
Me (somewhat confused): no no I meant RAM.
Her: yeah yeah it's one TB. I read the specifications of the laptop.
Me: *in my mind, fucking read it again* please read it again. You must have misread it.
Her (grinning face): alright.
Guess who didn't talk to me for a week. 😂14 -
Every single one of them, and every one that will come after them.
Google, it started out as 2 people in their garage, wanting to make a search engine that was better than the others. Nothing else, nothing evil. Just make the world a little bit better. And look what it's become now. A megacorporation with little to no regards for their user base. Because who cares about users anyway?
Microsoft, it started out with Bill Gates - young high school computer nerd - who wanted to make an operating system for the world to use. Something that's better than the competition. And boy did he do so. Well "better than the competition" aside, he did make it for the world to use. And the world adopted it. And look what it's become now. A megacorporation with little to no regards for their user base. Because who cares about users anyway?
See where I'm going here?
Apple, it started out with Steve Jobs and Steve Wozniak in their garage, just like Google did, wanting to make hardware that was better than the others. Nothing else, nothing evil. Just to make the world a little bit better. And look what it's become now. Planned obsolescence has been baked into it, just like it is in every other piece of technology. Quality control and thinking through the design has become a thing of the past. User choice, yeah who cares about that.
Samsung, it started out decades ago actually, and I don't really remember the details of it.. ColdFusion has a video on it if memory serves me right. Do watch it if you're interested. Anyway, just like all the others they started out as a company which wanted to make the world a little bit better. And damn right did they do so.. initially. Look what they've become now. Forcing their stupid TouchWiz UI upon their customers (or products?), a Bixby button that can't even be reprogrammed.. and the latest thing.. Knox, advertised as a security feature, but as everyone who likes rooting their devices and mucking with it knows, it is an anti-feature that only serves for lockdown. Why shouldn't you be able to turn in a phone for RMA when a hardware error occurs, when all you've personally modified is the software? Why should changing the software blow that eFuse, so that you can be sure that you can't replace it without specialized equipment and a very steady hand?
I could go on and on forever about more of the tech giants out there, but I feel like this suffices for now. Otherwise I won't have anything else left for future rants! But one thing I know for sure. Every tech company started, starts, and will start out with a desire to make the world a better place, and once they gain a significant customer base, they will without exception turn into the same kind of Evil Megacorp., just like the ones before them. Some may say that capitalism itself is to blame for this, the greed for more when you already have a lot. Who knows? I'd rather say that the very human nature itself is to blame for it. We're by design greedy beings, and I hate it. I hate being human for that. I don't want humans to be evil towards one another, and be greedy for ever more. But I guess that that's just the way it is, and some things do actually never change...17 -
My code review nightmare part 3
Performed a review on/against a workplace 'nemesis'. I didn't follow the department standards document (cause I couldn't care less about spacing, sorted usings, etc) and identified over 80 bugs, logic errors, n+1 patterns, memory leaks (yes, even in .NET devs can cause 'em), and general bad behavior (ex. 'eating' exceptions that should be handled or at least logged)
Because 'Jeff' was considered a golden child (that's another long TL;DR), his boss and others took major offense and demanded I justify my review, item by item.
About 2 hours into the meeting, our department mgr realized embarrassing Jeff any further wasn't doing anyone any good and decided to take matters into his own hands. Thinking 'well, it's about time he did his job', I go back to my desk. About an hour later..
Mgr: "I need you in the conference room, RIGHT NOW!"
<oh crap>
Mgr: "I spoke to Jeff and I think I know what the problem is. Did you ever train him on any of the problems you identified in the review?"
Me: "Um, no. Why would I?"
Mgr: "Ha!..I was right. So lets agree the problems are partially your fault, OK?"
Me: "Finding the bugs in his code is somehow my fault?"
Mgr: "Yes! For example, the n+1 problem in using the WCF service, you never trained him on how to use the service. You wrote the service, correct?"
Me: "Yes, but it's not my job to teach him how to write C#. I documented the process and have examples in the document to avoid n+1. All he had to do was copy/paste."
Mgr: "But you never sat with Jeff and talked to him like a human being? You sit over there in your silo and are oblivious to the problems you cause. This ends today!"
Me: "What the...I have no idea what you are talking about. What in the world did Jeff tell you?"
Mgr: "He told me enough and I'm putting an end to it. I want a compressive training class developed on how to use your service. I'll give you a month to get your act together and properly train these developers."
3 days later, I submit the power-point presentation and accompanying docs. It was only one WCF service with a handful of methods. Mgr approves the training, etc..etc., we execute the 'training', and Jeff submits a code review a couple of weeks later. From over 80 issues down to around 50. The poop hits the fan again.
Mgr: "What's your problem? When are you going to take your responsibility seriously?"
Me: "Its pretty clear I don't have the problem. All the review items were also verified by other devs. Its not me trying to be an asshole."
Mgr: "Enough with the excuses. If you think you can do a better job *you* make the code changes and submit them for Jeff for review. No More Excuses!"
Couple of days later, I make the changes, submit them for review, and Jeff really couldn't say too much other than "I don't see this as an improvement"
TL;DR, I had been tracking the errors generated by the site due to the bugs prior to my changes. After deployment, # of errors went from thousands per hour to maybe hundreds per day (that's another story) and the site saw significant performance increases, fewer customer complaints, etc..etc.
At a company event, the department VP hands out special recognition awards:
VP: "This award is especially well earned. Not only does this individual exemplify the company's focus on teamwork, he also went above and beyond the call of duty to serve our customers. Jeff, come on up and get this well deserved award."19 -
I had to open the desktop app to write this because I could never write a rant this long on the app.
This will be a well-informed rebuttal to the "arrays start at 1 in Lua" complaint. If you have ever said or thought that, I guarantee you will learn a lot from this rant and probably enjoy it quite a bit as well.
Just a tiny bit of background information on me: I have a very intimate understanding of Lua and its c API. I have used this language for years and love it dearly.
[START RANT]
"arrays start at 1 in Lua" is factually incorrect because Lua does not have arrays. From their documentation, section 11.1 ("Arrays"), "We implement arrays in Lua simply by indexing tables with integers."
From chapter 2 of the Lua docs, we know there are only 8 types of data in Lua: nil, boolean, number, string, userdata, function, thread, and table
The only unfamiliar thing here might be userdata. "A userdatum offers a raw memory area with no predefined operations in Lua" (section 26.1). Essentially, it's for the API to interact with Lua scripts. The point is, this isn't a fancy term for array.
The misinformation comes from the table type. Let's first explore, at a low level, what an array is. An array, in programming, is a collection of data items all in a line in memory (The OS may not actually put them in a line, but they act as if they are). In most syntaxes, you access an array element similar to:
array[index]
Let's look at C, so we have some solid reference. "array" would be the name of the array, but what it really does is keep track of the starting location in memory of the array. Memory in computers is addressed by number. In a very basic sense, the first sector of your RAM is memory location (referred to as an address) 0. "array" would be, for example, address 543745. This is where your data starts. Arrays can only be made up of one type; this is so that each element in that array is EXACTLY the same size. So, this is how indexing an array works. If you know where your array starts, and you know how large each element is, you can find the 6th element by starting at the start of the array and adding 6 times the size of the data in that array.
Tables are incredibly different. The elements of a table are NOT in a line in memory; they're all over the place depending on when you created them (and a lot of other things). Therefore, an array-style index is useless, because you cannot apply the above formula. In the case of a table, you need to perform a lookup: search through all of the elements in the table to find the right one. In Lua, you can do:
a = {1, 5, 9};
a["hello_world"] = "whatever";
a is a table holding 4 items (the 4th is the key "hello_world" with value "whatever"), but a[4] is nil because even though there are 4 items in the table, it looks for something "named" 4, not the 4th element of the table.
This is the difference between indexing and lookups. But you may say,
"Algo! If I do this:
a = {"first", "second", "third"};
print(a[1]);
...then "first" appears in my console!"
Yes, that's correct, in terms of computer science. Lua, because it is a nice language, makes keys in tables optional by automatically giving them an integer value key. This starts at 1. Why? Let's look at that formula for arrays again:
Given array "arr", size of data type "sz", and index "i", find the desired element ("el"):
el = arr + (sz * i)
This NEEDS to start at 0 and not 1 because otherwise, "sz" would always be added to the start address of the array and the first element would ALWAYS be skipped. But in tables, this is not the case, because tables do not have a defined data type size, and this formula is never used. This is why actual arrays are incredibly performant no matter the size, and the larger a table gets, the slower it is.
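(The same distinction, sketched in C++ for contrast, since C++ happens to have both a real array type and a real hash table:)
#include <iostream>
#include <string>
#include <unordered_map>
#include <vector>

int main() {
    // A real array: contiguous memory, one type, O(1) pointer math.
    std::vector<int> arr = {10, 20, 30};
    std::cout << arr[1] << '\n'; // start address + sizeof(int) * 1 -> 20

    // A hash table: entries scattered in memory, found by key lookup.
    std::unordered_map<std::string, int> tbl;
    tbl["hello_world"] = 42;
    std::cout << tbl["hello_world"] << '\n'; // a lookup, not an index
    return 0;
}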
That felt good to get off my chest. Yes, Lua could start the auto-key at 0, but that might confuse people into thinking tables are arrays... well, I guess there's no avoiding that either way.13 -
So, some time ago, I was working for a complete puckered anus of a cosmetics company on their ecommerce product. Won't name names, but they're shitty and known for MLM. If you're clever, go you ;)
Anyways, over the course of years they brought in a competent firm to implement their service layer. I'd even worked with them in the past and it was designed to handle a frankly ridiculous-scale load. After they got the 1.0 released, the manager was replaced with some absolutely talentless, chauvinist cuntrag from a phone company that is well known for having 99% indian devs and for not being able to "hear you now". He of course brought in his number two, worked on making life miserable and running everyone on the team off; inside of a year the entire team was ex-said-phone-company.
Watching the decay of this product was a sheer joy. They cratered the database numerous times during peak-load periods, caused $20M in redis-cluster cost overrun, ended up submitting hundreds of erroneous and duplicate orders, and mailed almost $40K worth of product to a random guy in outer mongolia who is , we can only hope, now enjoying his new life as an instagram influencer. They even terminally broke the automatic metadata, and hired THIRTY PEOPLE to sit there and do nothing but edit swagger. And it was still both wrong and unusable.
Over the course of two years, I ended up rewriting large portions of their infra surrounding the centralized service cancer to do things like, "implement security," as well as cut memory usage and runtimes down by quite literally 100x in the worst cases.
It was during this time I discovered a rather critical flaw. This is the story of the what, the how, and the how-can-you-fucking-even-be-that-stupid. The issue relates to users and their reports and their ability to order.
I first found this issue looking at some erroneous data for a low value order and went, "There's no fucking way, they're fucking stupid, but this is borderline criminal." It was easy to miss, but someone in a top down reporting chain had submitted an order for someone else in a different org. Shouldn't be possible, but here was that order staring me in the face.
So I set to work seeing if we'd pwned ourselves as an org. I spend a few hours poring over logs from the log service and dynatrace trying to recreate what happened. I first tested to see if I could get a user, not something that was usually done because auth identity was pervasive. I discover the users are INCREMENTAL int values they used for ids in the database when requesting from the API, so naturally I have a full list of users and their title and relative position, as well as reports and descendants in about 10 minutes.
I try the happy path of setting values for random, known payment methods and org structures similar to the impossible order, and submitting as a normal user, no dice. Several more tries and I'm confident this isn't the vector.
Exhausting that option, I look at the protocol for a type of order in the system that allowed higher level people to impersonate people below them and use their own payment info for descendant report orders. I see that all of the data for this transaction is stored in a cookie. Few tests later, I discover the UI has no forgery checks, hashing, etc, and just fucking trusts whatever is present in that cookie.
An hour of tweaking later, I'm impersonating a director as a bottom rung employee. Score. So I fill a cart with a bunch of test items and proceed to checkout. There, in all its glory are the director's payment options. I select one and am presented with:
"please reenter card number to validate."
Bupkiss. Dead end.
OR SO YOU WOULD THINK.
One unimportant detail I noticed during my log investigations that the shit slinging GUI monkeys who butchered the system didn't was, on a failed attempt to submit payment in the DB, the logs were filled with messages like:
"Failed to submit order for [userid] with credit card id [id], number [FULL CREDIT CARD NUMBER]"
One submit click later and the user's credit card number drops into lnav like a gatcha prize. I dutifully rerun the checkout and got an email send notification in the logs for successful transfer to fulfillment. Order placed. Some continued experimentation later and the truth is evident:
With an authenticated user of any privilege, you could place any order, as anyone, using anyone's payment methods and have it sent anywhere.
So naturally, I pack the crucifixion-worthy body of evidence up and walk it into the IT director's office. I show him the defect, and he turns sheet fucking white. He knows there's no recovering from it, and there's no way his shitstick service team can handle fixing it. Somewhere in his tiny little grinchly manager's heart he knew they'd caused it, and he was to blame for being a shit captain to the SS Failboat. He replies quietly, "You will never speak of this to anyone, fix this discreetly." Straight up hitler's bunker meme rage.
Teacher: Computer settings are stored in the ROM on the motherboard.
Me: *internally* Uhm, yea, sure... and I am the pope
Me: Sorry to interrupt you but how come the BIOS settings get reset when the CMOS battery is pulled out or dies if they are stored in ROM?
Teacher: ....
Me: *internally* yea, that's what I thought, you have no clue what you are even saying - the BIOS is stored in ROM or flash memory while the settings are stored in NVRAM also called CMOS memory...10 -
I taught my 9yo sister to SSH from my Arch Linux system to an Ubuntu system, she was amazed to see terminal and Firefox launching remotely. Next I taught her to murder and eat all the memory (I love Linux, as Batman, one should also know the weaknesses). Now she can rm -rf / --no-preserve-root and the forkbomb. She's amazed at the power of one liners. Will be teaching her python as she grew fond of my Raspberry Pi zero w with blinkt and phat DAC, making rainbows and playing songs via mpg123.
I had her play with Makey Makey when it first came out but it isn't as interesting. Drop your suggestions on what could be good for her learning phase?13 -
Storytime
A story about an Android TVbox which decided to become an iPad
Several years ago we bought an android tv-box.
It served me and my family well for several years.
Specs are not that important in this story, but there they are:
Android 4.4
1GB RAM
Amlogic quadcore 1.4GHz
8GB memory.
This device served us well - online TV, browsing, music, file sharing and so on. But recently the cheap Chinese memory decided to take a break and damaged the ROM. Because of that the device won't boot. The only option was to take it apart, "short circuit" certain legs on the memory chip to make it boot from SD card, and install new firmware. After such operations the tv-box worked well again.
However, the memory glitched again and again and this algorithm was repeated for several months.
But that is not what this story is about.
One day the memory went completely crazy and there was no way to install new firmware on it. It just hung on install. (BTW, it was official firmware for this device)
But after countless attempts it finally worked! It installed the firmware and booted into launcher and connected to WiFi!
But now comes the most interesting part.
It was not android anymore.
It decided to become an iPad.
My dad logged in to his Google account via the tv-box and got a mail that somebody connected from our IP via an iPad (we don't have an iPad) using the Safari browser! The stock browser is not Safari.....
"Ok, nvm, crazy glitch." - we thought.
But the preinstalled Play Market won't launch. Because it told us that we're trying to connect from an iPad.
And the Google Chrome page suggested downloading Chrome for iPad.
And everything was acting like it was an iPad.
OK, downloaded iTunes, why not??? ._.
Tried to install elixir for android via an APK from a flash drive, but then the memory glitched one more time, everything went black and the tv-box had a damaged ROM again...
After that we decided to not torment it anymore...
That's it. Poor Android TVbox that all his life dreamed to become an iPad. Rest in peace.2 -
I just mistyped a keyboard shortcut that caused my computer to say «I AM FILO AND EVERYONE LOVES ME» at full volume.
I have no memory of leaving a script attached to some random shortcut, and I can't find the setting anywhere.
Young me was a narcissistic asshole1 -
Here are the reasons why I don't like IPv6.
Now I'll be honest, I hate IPv6 with all my heart. So I'm not supporting it until inevitably it becomes the de facto standard of the internet. In home networks on the other hand.. huehue...
The main reason why I hate it is because it looks in every way overengineered. Or rather, poorly engineered. IPv4 has 32 bits worth, which translates to about 4 billion addresses. IPv6 on the other hand has 128 bits worth of addresses.. which translates to.. some obscenely huge number that I don't even want to start translating.
That's the problem. It's too big. Anyone who's worked on the internet for any amount of time knows that the internet on this planet will likely not exceed an amount of machines equal to about 1 or 2 extra bits (8.5B and 17.1B respectively). Now of course 33 or 34 bits in total is unwieldy, it doesn't go well with electronics. From 32 you essentially have to go up to 64 straight away. That's why 64-bit processors are.. well, 64 bits. The memory grew larger than the 4GB that a 32-bit processor could support, so that's what happened.
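For scale, the back-of-envelope numbers:
2^32 ≈ 4.3 × 10^9 addresses (IPv4, roughly today's internet)
2^33 ≈ 8.6 × 10^9 and 2^34 ≈ 1.7 × 10^10 (those one or two extra bits)
2^64 ≈ 1.8 × 10^19
2^128 ≈ 3.4 × 10^38 (IPv6; enough for roughly 6.7 × 10^17 addresses per square millimeter of Earth's surface)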
The internet could've grown that way too. Heck it probably could've become 64 bits in total, of which 34 are assigned to the internet and the rest left for whatever purposes large IP consumers would like to use them for.
Whoever designed IPv6 however.. nope! Let's give everyone a /64 range, and give them quite literally an IP pool far, FAR larger than the entire current internet. What's the fucking point!?
The IPv6 standard is far larger than it should've been. It should've been 64 bits instead of 128, and it should've been separated differently. What were they thinking? A bazillion colonized planets' internetworks that would join the main internet as well? Yeah that's clearly something that the internet will develop into. The internet which is effectively just a big network that everyone leases and controls a little bit of. Just like a home network but scaled up. Imagine or even just look at the engineering challenges that interplanetary communications present. That is not going to be feasible for connecting multiple planets' internets. You can engineer however you want but you can't engineer around the hard limit of light speed. Besides, are our satellites internet-connected? Well yes but try using one. And those whizz only a few hundred km above sea level. The latency involved makes it barely usable. Imagine communicating to the ISS, the moon or Mars. That is not going to happen at an internet scale. Not even close. And those are only the closest celestial objects out there.
So why was IPv6 engineered with hundreds of years of development and likely at least a stage 4 civilization in mind? No idea. Future-proofing or poor engineering? I honestly don't know. But as a stage 0 or maybe stage 1 person, I don't think that I or civilization for that matter is ready for a 128-bit internet. And we aren't even close to needing so many bits.
Going back to 64-bit processors and memory. We've passed 32 bit address width about a decade ago. But even now, we're only at about twice that size on average. We're not even close to saturating 64-bit address width, and that will likely take at least a few hundred years as well. I'd say that's more than sufficient. The internet should've really become a 64-bit internet too.34 -
Would the web be better off, if there was zero frontend scripting? There would be HTML5 video/audio, but zero client side JS.
Browsers wouldn't understand script tags, they wouldn't have javascript engines, and they wouldn't have to worry about new standards and deprecations.
Browsers would be MUCH more secure, and use way less memory and CPU resources.
What would we really be missing?
If you build less bloated pages, you would not really need ajax calls, page reloads would be cheap. Animated menus do not add anything functionally, and could be done using css as well. Complicated webapps... well maybe those should just be desktop/mobile apps.
Pages would contain less annoying elements, no tracking or crypto mining scripts, no mouse tracking, no exploitative spam alerts.
Why don't we just deprecate JS in the browser, completely?
I think it would be worth it.22 -
A small bug is found.
Chad dev:
😎 *Exists*
> Writes a simple ad hoc solution in a few lines
> Self documenting code with constant run time
> No external dependencies needed
> Fixes the bug, easy to test and does not introduce any new issues
That guy nobody likes (AKA. regex simp coder):
🤡 'This can be "simplified" into oNE LiNe'
> Writes a long regex expression that has to line wrap the editor window several times
> Writes an essay in the comments to explain its apparent brilliance to the peasant reader
> Exponential run time (bwahahah), excessive memory requirements
> Needs to import additional frameworks, requires more testing that will delay release schedule
> Also fixes bug but the software now needs 2x ram to run and is 3x slower
> Really puts the "simp" in simplified, but not the way you would expect26 -
I like memory hungry desktop applications.
I do not like sluggish desktop applications.
Allow me to explain (although, this may already be obvious to quite a few of you)
Memory usage is stigmatized quite a lot today, and for good reason. Not only is it an indication of poor optimization, but not too many years ago, memory was a much more scarce resource.
And something that started as a joke in that era is true in this era: free memory is wasted memory. You may argue, correctly, that free memory is not wasted; it is reserved for future potential tasks. However, if you have 16GB of free memory and don't have any plans to begin rendering a 3D animation anytime soon, that memory is wasted.
Linux understands this. Linux actually has three states for memory to be in: used, free, and available. Used and free memory are the usual. However, Linux automatically caches files that you use and places them in RAM as "available" memory. Available memory can be used at any time by programs, simply dumping out whatever was previously occupying the memory.
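(You can see the three states yourself; a small sketch that pulls the totals out of /proc/meminfo on Linux:)
#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream meminfo("/proc/meminfo"); // Linux-only, of course
    std::string line;
    while (std::getline(meminfo, line)) {
        // MemAvailable = free memory plus cache that can be
        // dropped the moment a program actually needs the space.
        if (line.rfind("MemTotal", 0) == 0 ||
            line.rfind("MemFree", 0) == 0 ||
            line.rfind("MemAvailable", 0) == 0) {
            std::cout << line << '\n';
        }
    }
    return 0;
}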
And as you well know, RAM is much faster than even an SSD. Programs which are memory heavy COULD (< important) be holding things in memory rather than having them sit on the HDD, waiting to be slowly retrieved. I'd much rather a web browser take up 4 GB of RAM than sit around waiting for it to read the cached images off my hard drive.
Now, allow me to reiterate: unoptimized programs still piss me off. There's no need for that electron-based webcam image capture app to take three gigs of memory upon launch. But I love it when programs use the hardware I spent money on to run smoother.
Don't hate a program simply because it's at the top of task manager.6 -
Him: Relation databases are stupid; SQL injections, complex relationships, redundant syntax and so much more!
Me: so what should we use instead? Mongo, redis, some other fancy new db?
Him: no, I have this class in Java, it loads all the data into memory and handles transfers with http.
Me: ...... Bye!5 -
Why computers are like men:
In order to get their attention, you have to turn them on.
They have a lot of data, but are still clueless.
They are supposed to help you solve problems, but half the time they are the problem.
As soon as you commit to one, you realize that if you had waited a little longer, you could have had a better model.
Why computers are like women:
No one but the Creator understands their internal logic.
The native language they use to communicate with other computers is incomprehensible to everyone else.
Even your smallest mistakes are stored in long-term memory for later retrieval.
As soon as you make a commitment to one, you find yourself spending half your paycheck on accessories for it.7 -
Story, !rant.
This memory came up as I was commenting on another rant, and thought it was worthy of a better retelling.
So about a year or two ago, I had just gotten a Software Defined Radio, and was tinkering with it and looking around for cool stuff I could do with it. After stalking planes for a while (caught a 747 over my area 😎) I saw this program that decoded satellite images of earth, coming from the NOAA satellites. I thought this was amazing.
So I waited until one was over my area and let the software do its magic. The image was not great, since I had this set up on the first floor and there was a lot of material between me and the satellite.
So I came to the brilliant conclusion that I'd leave the program on automatic mode (it will start sampling when the satellite is near) on my terrace, which should yield better results, right?
Perhaps. Who knows. Anyways, couple hours pass and we are running late to a family dinner. So we book it. Family dinner was great, good food and all, and was having fun, so never thought about my poor laptop, sitting alone in the night.
But then, when I was walking home in the rain... It hit me. I started running. I couldn't believe what I had done. Fast forward five minutes, and I'm out of breath, but home. I run upstairs, and see the laptop just sitting there, lid open, no lights on, and of course soaked right through.
I couldn't believe it. My only piece of tech at the time, and my only avenue for programming, gone. And I was 15, so I wasn't getting another one any time soon. Took it inside and drained the water out of it, and just left it there lying on its side.
Next day it worked just fine 🤣 the battery on my laptop only lasted max one hour, so by sheer luck it had lost power before the rain came. That is the one time I have to thank that battery for being such utter trash.7 -
Boss wants to scale our webservers because it seems they're having performance/capacity issues....
I'VE BEEN TELLING HIM FOR WEEKS IT'S NOT THE SERVERS!!! IT'S THE FACT THAT EVERY SINGLE QUERY HITS A SINGLE MONGODB... AND NO CACHE EITHER... AND THE DB CANT BE ENTIRELY LOADED INTO MEMORY AS ITS TOO BIG FOR RAM ON A SINGLE SERVER...
HOW THE FUCK CAN YOU SCALE IF EVERYTHING HAS A DEPENDENCY ON 1 NON-DISTRIBUTED DATABASE?6 -
F*cking Samsung's alarm clock.
I really needed to wake up early today, so I added a secondary alarm a little bit earlier. It was supposed to ring at 5:20 and the second one at 5:30. But Samsung said no.
An update came through during the night and the phone was restarted in the process. Why it can't keep the alarms through a restart I don't understand, but OK. It effectively means it was not able to trigger the alarm clock. So I woke up at 6:35 and came in more than an hour late.
Why did such basic functionality fail? My old Sony Ericsson T290i can ring even when powered down. Same as my Nokia and after that the Lumia with Windows Phone 10. Why can't Samsung just be normal.12 -
Meet 'SBI Online' app from Play Store, in their own words:
What they were supposed to do?
"Experience the new Retail Internet Banking of SBI"
What they do?
"SBI online app will redirect to SBI Retail Internet Banking (online SBI) site"
Why do they have an app?
"No need to remember URL",
"Less memory space required on device"
App storage space?
F**king 2.6 MB, just to redirect users to their website, in a third-party browser.2
I have to rant a bit about the toxic reactions to a constructive Q&A website.
People keep complaining that they get downvotes and corrections, or stuff like that.
Are you fucking kidding me?
So you expect people to spend their own time for absolutely free, to help you, while you don't even want to invest in describing the issue you're having properly? And then complain that people are having issues in understanding your questions?
Let's look at this scientifically. Let's gather up some questions that have been received badly on SO in the last few hours. From the top (simply put https://stackoverflow.com/questions... in front of the id):
47619033 - person wants a discussion about an algorithm while not providing any information about what worked and what failed. "Please write a program for me". Breaking at least 2 rules.
47619027 - "check out my videos" spam
47619030 - "Here's the manual that has my answer but I can't find my answer in it".
47619004 - "how do I keep variables in memory"
47618997 - debug this exception, I'll give you no info on what I tried and failed. Screw this, you guys figure this out, I'm going out for beer.
47618993 - expects everyone to guess what the input is, what the expected output is, and whether he has read what HashMap is in the manual. But sure, this question is so far the best out of all the bad ones.
47618985 - please write code according to my specifications
Should I go on? There wasn't a single clear question about problems in code in this entire small set. Feel free to continue searching; let me know if you find something where:
1. You understand what's being asked
2. Answer is clear and non-ambiguous (ex. NOT "which language is the coolest?")
3. Not asking someone to write a program for them.
4. Answer is not found in the most basic form of manuals (ex. php.net)
5. Is about programming.
The point is:
If you get downvoted on Stackoverflow - then you wrote a shitty question. Instead of coming over here and venting uselessly, simply address the concerns and at least TRY to write a clear question if you expect any answers.5 -
Programmers then:
No problem NASA mate, we can use these microcontrollers to bring men to the moon no problem!
Programmers now:
Help Stack Overflow, my program is kill.. isn't 90GB of memory (looking at you Evolution) and 400GB of virtual memory (looking at you Gitea) for my app completely normal? I thought that unused memory was wasted memory!1!
(400GB in physical memory is something you only find in the most high-end servers btw)9 -
It's March, and I'm in my final year of university. The physics/robotics simulator I need for my major project keeps running into problems on my laptop running Ubuntu, and my supervisor suggests installing Mint as it works fine on that.
I back up what's important across a 4GB and a 16GB memory stick. All I have to do now is boot from the Mint installation disk and install from there. But no, I felt dangerous. I was about to kill anything I had, so why not `sudo rm -rf /*` ? After a couple of seconds it was done. I turned it off, then back on. I wanted to move my backups to Windows, which I was dual booting alongside Ubuntu.
No OS found. WHAT. Called my dad, asked if what I thought happened was true, and learnt that the root directory contains ALL files and folders, even those on other mounted partitions. Gone were the past 2 1/2 years of uni work and notes not on the uni computers, and the 100GB+ of other stuff on there.
At least my current stuff was backed up.
TL;DR : sudo rm -rf /* because I'm installing another Linux distro. Destroys windows too and 2 1/2 years of uni work.13 -
When you Valgrind your program for the first time for memory leaks and get "85000127 allocs, 85000127 deallocs, no memory leaks possible"4
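If you've never run it: valgrind tracks every allocation and its matching free. A minimal sketch (the program and numbers here are mine, not the ranter's), checked with `valgrind --leak-check=full ./a.out`:

```c
#include <stdlib.h>

int main(void) {
    for (int i = 0; i < 1000; i++) {
        int *block = malloc(256 * sizeof *block); /* one alloc...            */
        if (!block)
            return 1;
        block[0] = i;                             /* use the block           */
        free(block);                              /* ...one matching dealloc */
    }
    return 0; /* valgrind then prints "All heap blocks were freed -- no leaks are possible" */
}
```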
-
Looks like /dev/body got tainted.. nasal memory leaks all over the place 😷
$ kill -9 $(pidof cold)
... Nothing.
$ sudo !!
I said kill the fucking cold!!! Y u no listen to your admin?! 😠
> User condor is not in the sudoers file. This incident will be reported.
RRRRRRRRREEEEEEEEE!!!! 😣😣😣
I just want to finish my goddamn power supply project, instead of getting bed-ridden by a cold, and running through paper towels like there's no tomorrow 😭5 -
!rant, but you're my friends and I want to share my day...
We've had a problem open since last March (before I started), but our teams identified the issue with the customer's code 2 years ago. No one made progress on it until I took it over. The newest version deployed 3 months ago and has no memory leak. I closed out oldest problem today.
On a personal note, I got quotes for my dj and photographer for my wedding next month, and the price for both is what I would've been willing to pay for one. My wedding was supposed to be very inexpensive, with these and my bartender being the most expensive parts, but due to unfortunate events, my wedding is 4x the cost (have to use a venue, backyard unavailable, which changes ALL my plans).4 -
A few years ago I was browsing Bash.org, and a user posted that he'd physically lost a machine.
A few weeks ago, I'd switched my router out for OPNSense. I figured it was time to start cleaning up my network.
Over the course of tracking down IP addresses and assigning statics to mac addresses, I spotted an IP I didn't recognize.
It being a home network, I'm pretty familiar with everything on it by IP, so I was a little taken aback.
I did some testing, found out that it was a Linux box. Cool.
I can SSH into it. Ok.
Logs show that it's running fine, no CPU/Memory/Harddrive issues. Nice.
So where is it?
Traceroute shows it's connected directly to the router... Maybe over an unmanaged switch...
Hostname is "localhost"... That's no help.
I've walked the network 4 times now, and God knows where it is.
I think maybe I'll just leave it alone. If it ain't broke...9 -
I could bitch about XSLT again, as that was certainly painful, but that’s less about learning a skill and more about understanding someone else’s mental diarrhea, so let me pick something else.
My most painful learning experience was probably pointers, but not pointers in the usual sense of `char *ptr` in C and how they’re totally confusing at first. I mean, it was that too, but in addition it was how I had absolutely none of the background needed to understand them, not having any learning material (nor guidance), nor even a typical compiler to tell me what i was doing wrong — and on top of all of that, only being able to run code on a device that would crash/halt/freak out whenever i made a mistake. It was an absolute nightmare.
Here’s the story:
Someone gave me the game RACE for my TI-83 calculator, but it turned out to be an unlocked version, which means I could edit it and see the code. I discovered this later on by accident while trying to play it during class, and when I looked at it, all I saw was incomprehensible garbage. I closed it, and the game no longer worked. Looking back I must have changed something, but then I thought it was just magic. It took me a long time to get curious enough to look at it again.
But in the meantime, I ended up played with these “programs” a little, and made some really simple ones, and later some somewhat complex ones. So the next time I opened RACE again I kind of understood what it was doing.
Moving on, I spent a year learning TI-Basic, and eventually reached the limit of what it could do. Along the way, I learned that all of the really amazing games/utilities that were incredibly fast, had greyscale graphics, lowercase text, no runtime indicator, etc. were written in “Assembly,” so naturally I wanted to use that, too.
I had no idea what it was, but it was the obvious next step for me, so I started teaching myself. It was z80 Assembly, and there were practically no documents or resources, nothing helpful online.
I found the specs, and a few terrible docs and other sources, but with only one year of programming experience, I didn’t really understand what they were telling me. This was before stackoverflow, etc., too, so what little help I found was mostly from forum posts, IRC (mostly got ignored or made fun of), and reading other people’s source when I could find it. And usually that was less than clear.
And here’s where we dive into the specifics. Starting with so little experience, and in TI-Basic of all things, meant I had zero understanding of pointers, memory and addresses, the stack, heap, data structures, interrupts, clocks, etc. I had mastered everything TI-Basic offered, which astoundingly included arrays and matrices (six of each), but it hid everything else except basic logic and flow control. (No, there weren’t even functions; it has labels and goto.) It has 27 numeric variables (A-Z and theta, can store either float or complex numbers), 8 Lists (numeric arrays), 6 matricies (2d numeric arrays), 10 strings, and a few other things like “equations” and literal bitmap pictures.
Soo… I went from knowing only that to learning pointers. And pointer math. And data structures. And pointers to pointers, and the stack, and function calls, and all that goodness. And remember, I was learning and writing all of this in plain Assembly, in notepad (or on paper at school), not in C or C++ with a teacher, a textbook, SO, and an intelligent compiler with its incredibly helpful type checking and warnings. Just raw trial and error. I learned what I could from whatever cryptic sources I could find (and understand) online, and applied it.
But actually using what I learned? If a pointer was wrong, it resulted in unexpected behavior, memory corruption, freezes, etc. I didn’t have a debugger, an emulator, etc. I had notepad, the barebones compiler, and my calculator.
Also, iterating meant changing my code, recompiling, factory resetting my calculator (removing the battery for 30+ sec) because bugs usually froze it or corrupted something, then transferring the new program over, and finally running it. It was soo slowwwww. But I made steady progress.
Painful learning experience? Check.
Pointer hell? Absolutely.4 -
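For anyone who skipped this particular rite of passage, a tiny C illustration (mine, not from the story) of what had to be learned the hard way: indirection, and the kind of stray write that, on a machine with no memory protection, freezes everything:

```c
#include <stdio.h>

int main(void) {
    int value = 42;
    int *p = &value;        /* pointer: holds the address of value        */
    int **pp = &p;          /* pointer to pointer: holds the address of p */

    **pp = 7;               /* write through two levels of indirection    */
    printf("%d\n", value);  /* prints 7 */

    int arr[4] = {0};
    /* arr[4] = 1; */       /* one index past the end: undefined behavior.
                               On a z80 calculator with no MMU, a stray
                               write like this lands on whatever lives next
                               in RAM, hence the freezes and battery pulls. */
    return 0;
}
```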
Interesting bug hunt!
Got called in because a co-team had a strange bug and couldn't make sense of it. After a compiler update, things had stopped working.
They had already hunted down the bug to something equivalent to the screenshot and put a breakpoint on the if-statement. The memory window showed the memory content, and it was indeed 42. However, the debugger would still jump over do_stuff(), both in single step and when setting a breakpoint on the function call. Very unusual, but the rest worked.
Looking closer, I noticed that the pointer's value was an odd number, but it was supposed to be of type uint32_t *. So I dug out the controller's manual and looked up what the instruction set would do with a 32-bit load from an unaligned address: the most braindead thing possible, it would just ignore the lowest two address bits. So the actual load happened from a different address; that's why the comparison failed.
I think the debugger fetched the memory content bytewise because that would work for any kind of data structure with only one code path; that's how it bypassed the alignment issue. Nice pitfall!
Investigating further why the pointer was off, it turned out that it pointed into an underlying array of type char. The offset into the array was correctly divisible by 4, but the beginning had no alignment, and a char array doesn't need one. I checked the mapfiles and indeed, the old compiler had put the array on a 4-byte boundary and the new one didn't.
Sure enough, after giving the array a 4 byte alignment directive, the code worked as intended.8 -
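A minimal C sketch of the pitfall and the fix; the C11 `_Alignas` keyword here stands in for whatever compiler-specific alignment directive the project actually used (that detail isn't in the rant):

```c
#include <stdint.h>
#include <stdio.h>

static char raw[64];               /* char arrays need only 1-byte alignment,
                                      so the linker may place this anywhere */
static _Alignas(4) char fixed[64]; /* the fix: force a 4-byte boundary      */

static void check(char *array) {
    uint32_t *p = (uint32_t *)&array[8]; /* offset divisible by 4, but only
                                            safe if the array base is too   */
    *p = 42;
    if (*p == 42)
        puts("do_stuff()");
}

int main(void) {
    check(fixed);     /* well-defined: base is 4-byte aligned */
    /* check(raw); */ /* the original bug: works or fails depending on where
                         the linker happens to put 'raw', because this CPU
                         silently drops the low two bits of a 32-bit load   */
    return 0;
}
```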
Why computers are like men:
1. In order to get their attention, you have to turn them on.
2. They have a lot of data, but are still clueless.
3. They are supposed to help you solve problems, but half the time they are the problem.
4. As soon as you commit to one, you realize that if you had waited a little longer, you could have had a better model.
Why computers are like women:
1. No one but the Creator understands their internal logic.
2. The native language they use to communicate with other computers is incomprehensible to everyone else.
3. Even your smallest mistakes are stored in long-term memory for later retrieval.
4. As soon as you make a commitment to one, you find yourself spending half your paycheck on accessories for it.1 -
Boss: Any idea why ColleagueX's code might be blowing out the memory?
Me (internal): Cos he's a fucking retard who can't code for shit, doesn't listen when I tell him to do stuff properly because he's fucking lazy, has no idea what stack and heap are, uses goto everywhere, doesn't know how to debug, doesn't write any unit tests, and generally WASTES MY FUCKING TIME!
Me (external): Probably a memory leak. I'll take a look.2 -
"...we're using Java. That fat bitch doesn't just eat memory, she just deep-throated her sixth serving and is showing no signs of relenting"
-Me, 2k182 -
!rant
When I was in 8th grade and was learning to code (c++), I sincerely believed that calling a function within a function simply calls it again (like in a loop) . I had never heard of recursion.
And I actually made a small project in which I called a function again and again, thinking that each new call terminated the previous one.
No wonder my program kept crashing. I have still kept that code with me as a wonderful memory.
I know this isn't particularly interesting, but I just saw that code today and felt like sharing this...3 -
Saw this on Facebook and couldn't help but share here! 😂
A young woman submitted the tech support message below (about her relationship with her husband), presumably as a joke…
The query:
Dear Tech Support,
Last year I upgraded from Boyfriend 5.0 to Husband 1.0 and noticed a distinct slowdown in overall system performance, particularly in the flower and jewelry applications, which operated flawlessly under Boyfriend 5.0.
In addition, Husband 1.0 uninstalled many other valuable programs, such as: Romance 9.5 and Personal Attention 6.5, and then installed undesirable programs such as: NBA 5.0, NFL 3.0 and Golf Clubs 4.1.
Conversation 8.0 no longer runs, and House cleaning 2.6 simply crashes the system. Please note that I have tried running Nagging 5.3 to fix these problems, but to no avail.
What can I do?
Signed,
Desperate
The response (that came weeks later out of the blue):
Dear Desperate,
“First keep in mind, Boyfriend 5.0 is an Entertainment Package, while Husband 1.0 is an operating system. Please enter command: I thought you loved me.html and try to download Tears 6.2 and do not forget to install the Guilt 3.0 update. If that application works as designed, Husband 1.0 should then automatically run the applications Jewelry 2.0 and Flowers 3.5.
However, remember, overuse of the above application can cause Husband 1.0 to default to Grumpy Silence 2.5, Happy Hour 7.0 or Beer 6.1. Please note that Beer 6.1 is a very bad program that will download the Farting and Snoring Loudly Beta.
Whatever you do, DO NOT, under any circumstances, install Mother-In-Law 1.0 (it runs a virus in the background that will eventually seize control of all your system resources.)
In addition, please, do not attempt to re-install the Boyfriend 5.0 program. These are unsupported applications and will crash Husband 1.0.
In summary, Husband 1.0 is a great program, but it does have limited memory and cannot learn new applications quickly. You might consider buying additional software to improve memory and performance. We recommend: Cooking 3.0.
Good Luck!3 -
Our team makes software in Java, and for technical reasons we require 1GB of memory for the JVM (set with the Xmx switch).
If you don't have enough free memory, the app just exits without any sign, because the JVM couldn't bite off a big enough chunk of memory.
Many days later and you just stand there without a clue as to why the launcher does nothing.
Then you remember this constraint and start to close every memory-heavy app you can think of. (I'm looking at you, Chrome.) No matter how important those spreadsheets or Illustrator files are. Congratulations, you just freed up 4GB of memory, things should work now! WRONG!
But why, you might ask? You see, we are using the 32-bit version of Java because someone in upper management decided that it should run on any machine (even if we only test it on Win 7 and High Sierra) and 32 is smaller than 64 so it must be downwards compatible! We should use it! Yes, in 2019 we use 32-bit Java because some lunatic might want to run our software on a 32-bit Windows XP OS. But why is this so much of a problem?
Well.. the 32-bit version of Java requires CONTIGUOUS FREE SPACE IN MEMORY TO EVEN START... AND WE ARE REQUESTING ONE GIGABYTE!!
So you can shove your swap and closed applications up your ass but I bet you that you won't get 1GB contiguous memory that way!
Now there will be a meeting about this issue, and another one tomorrow related to the issues with the 32-bit JVM. The only problem is that this issue only occurs if you've used up most of your memory and then try to open our software. So upper management will probably deem this issue minor and won't allow us to upgrade to 64-bit... in 20fucking1910
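To be precise, what the 32-bit JVM needs is one contiguous run of free virtual address space for its heap reservation, and a 32-bit process has at most 4GB of address space (often only 2GB of usable user space on Windows) that DLLs and thread stacks chop into pieces. A hypothetical C illustration of the same failure mode (my sketch, not our launcher):

```c
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    /* Ask for 1 GB in a single block, like -Xmx1g reserving the Java heap. */
    void *heap = malloc(1024UL * 1024 * 1024);
    if (!heap) {
        /* In a fragmented 32-bit address space this fails even when the OS
           has gigabytes of free RAM, because no single gap is big enough. */
        puts("no contiguous 1 GB region available");
        return 1;
    }
    puts("reservation succeeded");
    free(heap);
    return 0;
}
```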
TFW you spend 30 minutes debugging an app that isn't working right, only to find out it's working exactly as it's supposed to; you just forgot you put that bit in there that does that thing.
One character change later and it's working perfectly. ONE CHARACTER! THIRTY BASTARD MINUTES! I just spent thirty minutes driving through every line of code and coming ever closer to the conclusion that Java was doing some kind of strange thing with dropping objects from memory. No, it wasn't Java that had memory problems, it was me! Just check me into the old people's home now so I can spend my day pissing my pants and making lewd comments at the nurses, because that's all I'm fucking useful for at this point!!
I need a coffee.5 -
Just commented on a rant
*Goes back to scrolling*
*Remembers that I forgot to ++ it*
*Runs to my comments to upvote the rant*
Happened to anyone else?8 -
I've got a file on my desktop called key.txt, and it's just a single line in it that is clearly some sort of API key.
Absolutely no memory of what it is for.
💩9 -
2012 laptop:
- 4 USB ports or more.
- Full-sized SD card slot with write-protection ability.
- User-replaceable battery.
- Modular upgradeable memory.
- Modular upgradeable data storage.
- eSATA port.
- LAN port.
- Keyboard with NUM pad.
- Full-sized HDMI port.
- Power, I/O, charging, network indicator lamps.
- Modular bay (for example Lenovo UltraBay)
- 1080p webcam (Samsung 700G7A)
- No TPM trojan horse.
2024 laptop:
- 1 or 2 USB ports.
- Only MicroSD card slot. Requires fumbling around and has no write-protection switch.
- Non-replaceable battery.
- Soldered memory.
- Soldered data storage.
- No eSATA port.
- No LAN port.
- No NUM pad.
- Micro-HDMI port or uses USB-C port as HDMI.
- Only power lamp. No I/O lamp so user doesn't know if a frozen computer is crashed or working.
- No modular bay
- 720p webcam
- TPM trojan horse (Jody Bruchon video: https://youtube.com/watch/... )
- "Premium design" (who the hell cares?!)14 -
Ok friends let's try to compile Flownet2 with Torch. It's made by NVIDIA themselves so there won't be any problem at all with dependencies right?????? /s
Let's use Deep Learning AMI with a K80 on AWS, totally updated and ready to go super great always works with everything else.
> CUDA error
> CuDNN version mismatch
> CUDA versions overwrite
> Library paths not updated ever
> Torch 0.4.1 doesn't work so have to go back to Torch 0.4
> Flownet doesn't compile, get bunch of CUDA errors piece of shit code
> online forums have lots of questions and 0 answers
> Decide to skip straight to vid2vid
> More cuda errors
> Can't compile the fucking 2d kernel
> Through some act of God reinstalling cuda and CuDNN, manage to finally compile Flownet2
> Try running
> "Kernel image" error
> excusemewhatthefuck.jpg
> Try without a label map because fuck it the instructions and flags they gave are basically guaranteed not to work, it's fucking Nvidia amirite
> Enormous fucking CUDA error and Torch error, makes no sense, online no one agrees and 0 answers again
> Try again but this time on a clean machine
> Still no go
> Last resort, use the docker image they themselves provided of flownet
> Same fucking error
> While in the process of debugging, realize my training image set is also bound to have bad results because "directly concatenating" images together as they claim in the paper actually has horrible results, and the network doesn't accept 6 channel input no matter what, so the only way to get around this is to make 2 images (3 * 2 = 6 quick maths)
> Fix my training data, fuck Nvidia dude who gave me wrong info
> Try again
> Same fucking errors
> Doesn't give any helpful information, just spits out a bunch of fucking memory addresses and long function names from the CUDA core
> Try reinstalling and then making a basic torch network, works perfectly fine
> FINALLY.png
> Setup vid2vid and flownet again
> SAME FUCKING ERROR
> Try to build the entire network in tensorflow
> CUDA error
> CuDNN version mismatch
> Doesn't work with TF
> HAVE TO FUCKING DOWNGRADE DRIVERS TOO
> TF doesn't support latest cuda because no one in the ML community can be bothered to support anything other than their own machine
> After setting up everything again, realize I have no space left on the 75GB machine
> Try torch again, hoping that the entire change will fix things
At this point I'll leave a space so you can try to guess what happened next before seeing the result.
Ready?
3
2
1
> SAME FUCKING ERROR
In conclusion, NVIDIA is a fucking piece of shit that can't make their own libraries compatible with themselves, and can't be fucked to write instructions that actually work.
If anyone has vid2vid working or has gotten around the kernel image error for AWS K80s please throw me a lifeline, in exchange you can have my soul or what little is left of it5 -
Why do people jump from C to Python so quickly? And it's all about machine learning. A few days back my cousin asked me for books to learn Python.
Trust me, you have to learn C before Python. People struggle going from Python to C. But no: it's ML, scripting,
and, most importantly, "software engineering". Wtf?
Software engineering is about how to run projects, and it is compulsory to learn Python, with no mention of Git or any other VCS. Wtf?
What the hell kind of college is that? Trust me, I am in no way saying Python is weak, but for learning purposes the depth of the language matters, along with concepts like pass-by-reference, memory leaks and pointers.
And learning algorithms and data structures is more important than machine learning. Trust me, if you cannot model the data and get proper training and testing data, you will get screwed-up outputs. And then again, everyone who hypes this kind of stuff also thinks that ML with 100% accuracy is greater than 90%, overfits the data, and tests the model on the training data. And mostly what they will learn in college will be memorizing a few formulas, that's it.
Learn one language properly (the concepts behind the language), and then most languages are easy.
Cool CS programmers are born today😖31
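To make that "depth" point concrete, here's a toy C example (mine, not the ranter's) of the pass-by-value vs. pass-by-reference distinction that Python never forces you to confront:

```c
#include <stdio.h>

void set_copy(int x)     { x = 10; }  /* modifies a copy: caller unchanged */
void set_through(int *x) { *x = 10; } /* modifies the caller's variable    */

int main(void) {
    int a = 1;
    set_copy(a);
    printf("%d\n", a);  /* still 1 */
    set_through(&a);
    printf("%d\n", a);  /* now 10  */
    return 0;
}
```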
Is obsidian a fucking joke?
Seriously, is it a joke? Why would you ever care so much about indexing literally everything, if the entire thing crashes and/or takes >5min to LITERALLY just open the fucking directory and/or (so help you) if that directory is full of projects/repos or whatever the fuck and the total size of said directory is like >5GB.
WHY THE FUCK WOULD YOU INDEX EVERYTHING? -- "Ohh obsidian's not supposed to be used a fully fledged IDE, ohh obsidian should just handle MD files and normal sized projects, ohh the plugins and ease-of-use" -- Fuck.
There's no fucking real reason to index everything, BY DEFAULT. You open a directory with Obsidian? Doesn't matter, it's 1 byte, it's 100GB, you get indexed. Deal with it. It will use LITERALLY every resource your computer has. I'm surprised it doesn't go galaxy brain and ping if any other computers/devices are on the network and then attempt to connect and use their hardware (obsidian can be like a node!).
How shit can you be at understanding basic data structures and algorithms, where you just revert to based google-chrome brain and let the FUCKING TEXT EDITOR -- OBSIDIAN IS A FUCKING TEXT EDITOR HOLY SHIT -- hog all conceivable memory.
I swear to <some-deity> if anyone fucking says "Ohhhhhhhh actually, it's not a text editor, it has plugins and features and shit, it does all dis cool stff", OR, "Ohhhhh actually, obsidian indexes things for a very specific/rationale/apt/pragmatic/academic reason" OR "ohhhh, I have 100 iphones, 1000 ipads and a trillion desktop computers that each have 256GB of memory, why you hating on obsidian?" then go kick rocks. The fucking lot of you. Are you fucking kidding me.8 -
LPT: NEVER accept a freelance job without looking at the project's source first
Client: I have a project made by a company that is now abandoning it, I want you to fix some bugs
Me: Okay, can you:
1) Give me a build to test the current state of the game
2) Tell me what the bugs are
3) Show me the source
4) Tell me your budget
Client: *sends a list of 10 bugs* Here's the APK and to give you the project I'll need you to sign an NDA
Me: Sure...
*tests build*
*sees at least 20 bugs*
*still downloading source*
*bugs look quite easy to fix should be done under an hour*
Me: Okay, so, I can fix each bug for $10 and I can do 2 today
Client: Okay can you fix 8 bugs today for $40??
*sigh*
Me: No I cannot.
Client: okay then 2 today for $20 is fine, I want a refund if you can't fix them today
*sigh*
Me: Look dude, this isn't the first time I am doing this, aight? I'll fix the bugs today and you can pay me after you check they are done, savvy?
Client: okay
*source is downloaded*
*literal apes wrote the scripts, commented out code EVERYWHERE
Debug logs after every line printing every frame causing FPS drops, empty objects in the scene
multiple unused UI objects
everything is spaghetti*
*give up, after 2 hours of hell*
*tfw averted an order cancellation by not taking the order and telling client that they can pay me after I am done*
Attached is an image of a level object pool
It's an array with each element representing a level.
The numbers and "Final" are ids for objects in an object pool
The whole string is .Split(',') into an array (RIP MEMORY BTW) and then a loop goes through each element in the split array and instantiates the object from an object pool5 -
You know what just gets to me about garbage-collected languages like C# and Java? Fucking dynamic memory allocation (seemingly) on the stack. Like, it's so bizarre to me.
"Hey, c#, can I have an array of 256 integers during run-time?"
"Ya sure no prob"
"What happens when the array falls out of scope"
"I gotchu fam lol"8 -
MacRant: I was waiting for a new MacBook Pro release for a while to upgrade my old laptop (not a Mac). Watched the release, had very mixed feelings about it, but still ordered (clenching my teeth and saying sorry to my wallet). The next day I looked into alternatives and cancelled the order to have more time to think; now deciding... I mean c'mon, no latest 7th gen processor, no 32GB memory option, 2GB video is OK for non-gaming, and the whole "big" thing is the TouchBar that I DON'T F* NEED. They should drop the "Pro" and name it "Fancy Strip".
So I looked into alternatives, and a Dell XPS 15 with maxed specs is twice as juicy, and has not a touch bar but a whole touch freakin' 4K screen, for a lower price :/
Just wanted to rant about the new macbook's spec and price and see what you all think of macbook vs alternatives?16 -
I do not like the direction laptop vendors are taking.
New laptops tend to feature fewer ports, making the user more dependent on adapters. Similarly to smartphones, this is a detrimental trend initiated by Apple and replicated by the rest of the pack.
As of 2022, many mid-range laptops feature just one USB-A port and one USB-C port, resembling Apple's toxic minimalism. In 2010, mid-class laptops commonly had three or four USB ports. I have even seen an MSi gaming laptop with six USB ports. Now, much of the edge space is wasted "clean" space.
Sure, there are USB hubs, but those only work well with low-power devices. When attaching two external hard drives to transfer data between them, they might not be able to spin up due to insufficient power from the USB port or undervoltage caused by the impedance (resistance) of the USB cable between the laptop's USB port and hub. There are USB hubs which can be externally powered, but that means yet another wall adapter one has to carry.
A non-replaceable battery (the shortest-lived component) means difficult repairs and no more reserve batteries, as well as no extra-sized battery packs. When the battery expires, one might have to waste four hours at a repair shop for a replacement that would have taken a minute on a 2010 laptop.
The SD card slot is being replaced with inferior MicroSD or removed entirely. This is especially bad for photographers and videographers who would frequently plug memory cards into their laptop. SD cards are far more comfortable than MicroSD cards, and no, bulky external adapters that reserve the device's only USB port and protrude can not replace an integrated SD card slot.
Most mid-range laptops in the early 2010s also had a LAN port for immediate interference-free connection. That is now reserved for gaming-class / desknote laptops.
Obviously, components like RAM and storage are far more difficult to upgrade in more modern laptops, or not possible at all if soldered in.
Touch pads increasingly have the buttons underneath the touch surface rather than separate, meaning one has to be careful not to move the mouse while clicking. Otherwise, it could cause an unwanted drag-and-drop gesture. Some touch pads are smart enough to detect when a user intends to click, and lock the movement, but not all. A right-click drag-and-drop gesture might not be possible due to the finger on the button being registered as touch. Clicking with short tapping could be unreliable and sluggish. While one should have external peripherals anyway, one might not always have brought them with. The fallback input device is now even less comfortable.
Some laptop vendors include a sponge sheet that they want users to put between the keyboard and the screen before folding it, "to avoid damaging the screen", even though making it two millimetres thicker could do the same without relying on a sponge sheet. So they want me to carry that bulky thing everywhere around? How about no?
That's the irony. They wanted to make laptops lighter and slimmer, but that made them adapter- and sponge sheet-dependent, defeating the portability purpose.
Sure, the CPU performance has improved. Vendors proudly show off in their advertisements which generation of Intel Core they have this time. As if that is something users especially care about. Hoo-ray, generation 14 is now yet another 5% faster than the previous generation! But what is the benefit of that if I have to rely on annoying adapters to get the same work done that I could formerly do without those adapters?
Microsoft has also copied Apple in demanding an internet connection before Windows 11 will set up. The setup screen says "You will need an Internet connection…" - no, technically I would not. What technically stands in the way of Windows 11 setting up offline? After all, previous Windows versions like Windows 95 could do so 25 years earlier, as could far more recent versions. Thankfully, Linux distributions do not do that.
If "new" and "modern" mean more locked-in and less practical and difficult to repair, I would rather have "old" than "new".12 -
This happened today
My Manager: How is the progress so far on the search module?
Me(After implementing some crazy shit requirements): It's all set. APIs are working well against the mock in-memory database. I need an actual database to run my unit tests. Where do we have it?
My manager: Let's pretend that there is no database at this moment. Go ahead with the rest of your activities.
Me(IN MY MIND): F*CK you, a**hole. You don't know the first thing about software development! Which a**hole promoted you to manager!!!
Me(TO HIS FACE): Ah.. okay!! As you wish!3 -
long time ago....
Feature request: We want an Android backup solution in our app!
UI guy has already developed it, you just need to see if his solution is solid!
Ok then - let's look at the UI: nice progress bars that turn into green checkmarks. Looks good.
Now let's look at the code: ... Ok. Loading some files into memory.... and... dafuq? It does not write to a file?
Backup to RAM. With no restore. 🤦♂️.3 -
FFUUUuucccckkk me sideways. So I decided to look into USB type-c's power delivery and alt modes. Cause I kinda want to make an adapter card to run my displays over a single cable. TLDR of the rest: USB-C has some huge capabilities which noone is interested in using since its way to complex to handle for what its worth in the end.
Now PD alone is kinda OK to deal with since a lot of powerbanks use it and some hobby guys have documented how to work with it. I find it really odd though that you NEED to use a dedicated IC for using the configuration channel to negotiate how much power you can draw. Why didn't the USB standard use some simple 5V low-speed signalling? Also the standard says that you only have to implement 5V 0.6A, with every other power level being optional. (This is also true for cables. Most manufacturers use only the USB 2.0 standard for them and brag about how fast type-C is. ლ(ಠ益ಠლ) )
Now to the alt modes. These motherfuckers are a real shitshow to deal with. First you need a mux to deal with USB-C's two-way insertion, so your signals won't get flipped. Next thing is that you have four lanes at your disposal in alt mode, which you can either use for four DisplayPort lanes or two DP lanes and two USB 3.0 lanes. (You always get USB 2.0.) Now you may think that there would be one simple chip to do it all? Nope, you need at least two at the price of $6 each: one for PD and one for alt modes. Both are very hard to solder (QFN, 0.5 mm pitch, 40+ pins). TI ended up being the only one with a decent offering of ICs that do what I need. As for working with them, you would think that you just slap a simple MCU on there that communicates over I2C or SPI to configure the chips? Nope! You program the chip's memory, from which it configures itself. And the programming is done with some TI tool, which gives me no idea as to how you can handle everything with no control logic behind it.
Looking into alternative ICs leaves me with Cypress Semi. And their documentation is basically a total mess. I wanna know what that chip is good for and what I need to do to make it work. I don't care about technical details mixed with marketing jargon nobody understands. And I really despise that I have to register just to download a datasheet. Especially since there is no info about it on the main page.
And this whole rant hasn't even touched the topic that USB-C only uses DP and nothing else. So you better hope that you have DP++ so you can use a passive conversion.
This was my Ted Talk about USB-C. Some info in it may be subject to my stupidity and errors as it currently is 02:15 in the morning and I need some sleep.14 -
University Coding Exam for Specialization Batch:
Q. Write a program to merge two strings, each of at most 25k length.
Wrote the code in C, because fast.
Realized some edge cases don't pass: runtime errors. Proceeded to check the locked code in the stub. (We only have to write methods; the driver code is pre-written.)
Found that the memory for the char Arrays is being allocated dynamically with size 10240.
Rant #1:
Dafuq? What's the point of dynamic Memory Allocation if you're gonna fix it to a certain amount anyway?
Continuing...
Called the Program Incharge, asking him to check the problem and provide a solution. He took 10 minutes to come, meanwhile I wrote the program in Java which cleared all the test cases. <backstory>No University Course on Java yet, learnt it on my own </backstory>
Dude comes, I explain the problem. He asks me to do it in C++ instead coz it uses the string type instead of char array.
I told him that I've already done it in Java.
Him: Do you know Java?
Rant #2:
No you jackass! I did the whole thing in Java without knowing Java, what's wrong with you!2 -
“Hey - just calling you to give you an update”
Great - sorry can you refresh my memory what was this for?
“So I was about to put you through for a client but they’re no longer accepting CVs so just to update you that’s not happening”
Sorry, what was the client again?
“Oh I can’t say, but they’re no longer accepting CVs”
“...Thanks, goodbye.”
*So you call me to tell me that you can't give my details to a client that you can't disclose....get off my line 🤬😡🤬*3
Windows 10 wants to ruin my life by consuming almost 70% of my 4 gigs of memory for itself.
No applications are running and it's still consuming that much memory. Now I just hate the updates of Windows 10 Pro.
Any suggestions to get rid of this situation?26
I propose that the study of Rust, and therefore the application of said programming language and all of the technology that comprises it, should be undertaken because the language is actually really fucking good. Reading and studying how it manages to manipulate and otherwise use memory without a garbage collector is something to be admired, illuminating in its own accord.
BUT going for it because it is a "beTter C++" should not constitute a basis for it's study.
Let me expand through anecdotal evidence, which is really not to be taken seriously, but which is at the same time what I am using for my reasoning behind this. Please feel free to correct me if I am wrong, for I am a software engineer, yes; I do have academic training through a B.S. in Computer Science, yes; BUT my professional life has been solely dedicated to web development, whose technical details I admittedly do not go on about with you all because: (1) I am not allowed to, and (2) it is better for me to bitch and shit over other petty development-related details.
Anecdotal and otherwise non-statistically-supported evidence: I have seen many motherfuckers doing shit in both C and C++ who ADMIT to not covering their mistakes through the use of a debugger. Mostly because (A) using a debugger and proper IDE is for pendejos and debugging is for putos, GDB is too hard and the VS IDE is waaaaaa "I onlLy NeeD Vim", and (B) "If an error would have registered then it would not have compiled, no?", thus giving me the idea that the most common occurrences of issues with the C father/son languages come from user error, no formal training in the language, and a nice cusp of "fuck it, it runs" while leaving all sorts of issues that come from manipulating the realm of the gods: memory.
EVERY manual and book, going all the way back to the K&R book, talks about memory and the way in which developers of these 2 languages are able to manipulate and work on it. EVERY new standard of the ISO implementation of these languages deals, through community effort or standard documentation, with the new items introduced as features concerning MODERN usage (meaning, no, the shit you learned 20 years ago won't fucking cut it).
THUS if your ass is not constantly checking what the scalpel of electrical/circuitry/computational representation of algorithms CONDONES in what you are doing then YOU are the fucking problem.
Rust is thus no different from the original ideas of the developers behind Go, who stated that their developers are not efficient enough to deal with X language. Rust protects you, because it knows that you are a fucking moron, so the compiler, advanced and well made as it is, will give you warnings of your own idiotic tendencies, which would not have been required had you not been.....well....a fucking idiot.
Rust is a good language, but I feel one that came out from the necessity of people writing system level software as a bunch of fucking morons.
This speaks a lot more of our academic endeavors and current documentation than anything else. But to me DEALING with the idea of adapting Rust as a better C++ should come from a different point of view.
Do I agree with Linus's point of view of C++? fuck no, I do not, he is a kernel engineer, a damn good one at that regardless of what Dr. Tanenbaum believes(ed) but not everyone writes kernels, and sometimes that everyone requires OOP and additions to the language that they use. Else I would be a fucking moron for dabbling in the dictionary of languages that I use professionally.
BUT in terms of C++ being unsafe and unsecured and a horrible alternative to Rust, I personally do not believe so. I see it as a powerful white canvas on which you are able to paint software to the best of your ability, WHICH then requires thorough scrutiny from the entire team. NOT a quick replacement for something that protects you from your own stupidity BY impeding the use of what are otherwise unknown "safe" features.
To be clear: I am not diminishing Rust as the powerhouse of a language that it is; I myself am quite invested in the language. But I do not feel the reasoning behind articles claiming it as the C++ killer.
I am currently heavily invested in C++ since I am trying a lot of different things for a lot of projects, and have been able to discern multiple pain points and unsafe features. Mainly the reason for this is documentation (your mother knows C++) and tooling, ide support, debugging operations, plethora of resources come from it and I have been able to push out to my secret project a lot of good dealings. WHICH I will eventually replicate with Rust to see the main differences.
Online articles stating that one will diminish or otherwise kill the other are, well....wrong to me. And not the proper approach.
Anyways, I like big tits and small waists.14 -
This is my first rant here, so I hope everyone has a good time reading it.
So, the company I am working for got me going on the task of rewriting a firmware that has been extended for about 20 years now. Which is fine, since all new machines will be on a new platform anyway. (The old firmware was written for an 8051 initially. That thing has 256 bytes of RAM. Just imagine the usage of unions and bitfields...)
So, me and a few colleagues go ahead and start from scratch.
In the meantime however, the client has hired one single lonely developer. Keep in mind that nobody there understands code!
And oh boy did he go nuts on the old code, only for it to be used on the very last machine of the old platform, ever! Everything after that one will have our firmware!
There are other machines in that series, using the original extended firmware. Nothing is compatible, bootloaders do not match, memory layouts do not match, the code is a horrible mess now, the client is writing the specification RIGHT NOW (mind you, the machine is already sold to customers), there are no tests, and for the grand finale, the guy quit his job and went to a different company. Did I mention the bugs it has and the features it lacks?
Guess who's got to maintain that single abomination of a firmware now?1 -
Well, I'm a first-year student in computer science, and in the first semester we started to learn the C language; the IDE they told us to use for better learning was Devcpp.
We made a few small projects and all went well, but now in the second semester we've started making bigger projects with linked lists and memory allocation, and Devcpp is starting to be a complete bug itself... We work hard on the project and save it with no errors at all, then the next day Devcpp starts marking every function we made as invalid...
So we spoke to the same teacher about this and asked what we could do about it....
"Are you using Devcpp? You shouldn't, it is not that good for C"...
ARE YOU KIDDING ME?14 -
The only thing more dangerous than an alcoholic short-term-memory-challenged non-technical throw-you-under-the-bus IT director with self-esteem issues that are sporadically punctuated by delusions of superiority is one who fears for his job. Submitted for your inspection: a besotted mass of near-human brain function who not only has a 50 person IT department to run, but has also been questioned by the business owners as to what he actually does. So he has decided to show them. He has purchased a vendor product to replace a core in-house developed application used to facilitate creating the product the business sells. The purchased software only covers about 40 percent of the in-house application's functionality, so he is contracting with the vendor to perform custom development on the purchased product (at a cost likely to be just shy of six-figures) so that about 90 percent of existing functionality will be covered. He has asked one of his developers (me) to scale down the existing software to cover the functionality gaps the purchased software creates. There is no deployment plan that will allow the business to transition from the current software to the new vendor-supplied one without significantly hurting the ability of the business to function. When anyone raises this issue he dismisses it with sage musings such as, "I know it will be painful, but we'll just have to give the users really good support." Because he has no idea what any of his staff actually does, he is expecting one of his developers (again, unfortunately, me) to work with the vendor so that the Frankensoftware will perform as effectively as the current software (essentially as a project manager since there will be no in-house coding involved). Lastly, he refuses to assign someone to be responsible for the software: taking care of maintenance, configuration, and issue resolutions after it has been rolled out. When I pointedly tell him I will not be doing that (because this is purchased software and I am not a system admin or desktop engineer) he tells me, "Let me think about this." The worst part is that this is only one of four software replacement initiatives he is injecting himself into so he can prove his worth to the business owners. And by doing so he is systematically making every software development initiative akin to living in Dante's Eighth Circle. I am at the point where I want to burn my eye out with a hot poker, pour salt into the wound, and howl to the heavens in unbearable agony for a month, so when these projects come to fruition, and I am suffering the wrath of the business owners, I can look back on that moment I lost my eye and think "good times."4
-
So I've started learning Rust and I must say it feels great! But some parts of the language, like enums, are quite different than what I'm used to.
As a proof of concept I've reimplemented a small API (an Azure Function App) in Rust with Actix Web and it's FAST AS FUCK BOIII.
The response is served about 5x as quickly and the memory footprint shrank from some 90 MB to around 5 MB.
In my small-scale use case it's not a huge difference, but I think it can be massive at large scales...
What is your experience with Rust (at scale)?
I wish I could quickly reimplement the whole fucking CMS Of Doom™ in Rust... but no time and resources :(5 -
!dev
A child's mind is fascinating.
I remember how it felt being a kid, just deliriously happy.
Things were magical, mystical and happy.
I knew the world wasn't perfect, I knew bad things happened to good people.
But a kid's mind is so powerful that it can fill in the blanks with the most cheerful and optimistic perspectives.
And at some point in my childhood I was exposed to videogames, and that kinda took me down fantasy lane even further.
I was extremely young and barely retaining any memories when I was exposed to my first console, a famicom.
I have a somewhat vivid memory of my mind being blown away for the first time by watching my brother play New Ghostbusters II for NES.
From then on, we never stopped and played several console and dos/pc games.
When I was 10, someone from the neighborhood brought in a couple of floppies with Pokemon Yellow.
"What? Pokemon? How the fuck is that even possible? This is a pc, not a gameboy".
I didn't know at the time what an emulator was, but I was super fucking stoked to be able to play that.
My dad had a 1 GB laptop from work that he didn't use, so I hoarded that shit, and I would get into bed and play nearly every day.
The experience was surreal. I was doing pc gaming... not on a chair, on a fucking bed, and I was playing a gameboy game... on a pc.
It was so intense to me that, even after more than 2 decades, I still remember what it feels like.
Like, you know how you can "feel" things if you think about them? like for example if you think about the taste of chicken, you can somehow feel it for a second.
Well I have like an actual physical sensation linked to that experience but I can't explain it at all, because it's just a sensation.
I think people usually say they feel that way, for example, about the PSX (usually referred to as the PS one) loading screen. I experienced that too, but when I was 12, so it was not as intense (it does make me feel the fuzzies though).
I also remember other things with very high detail, like the texture of my bed cover, the weather, mom cooking, the clunky shape of the laptop, the way I carelessly stored it above a pile of magazines, etc.
I remember ofc how it felt looking at the game sprites, interacting with NPCs, and the goddamn fucking glorious music.
It was dreamy.
Years and years later, I grew up and stopped living in a fantasy world, and became more aware of the grim aspects of life my younger self was sugarcoating.
So I tried to play Pokemon again, again and again, and no matter how hard I tried to revive that euphoria, I could never do it.
I started to get annoyed at the game.
"Come oooon, I did the tutorial already, let me skip this.
This pokemon is useless, why am I even training it.
Fuck, I'm tired of grinding"
At some point I accepted that the feeling would never return, and that it would just live in my memory.
Ironically, I can recall that memory and how it felt anytime I want to.
And I can actually still feel it, and throughout these years, it has never worn down.
And eventually I learned how to play pokemon and enjoy it:
I read tier lists on Smogon and just catch and train the Pokemon that are higher on the list, which is how I got to beat Yellow in like 3 days.
(This is nothing compared to what speedrunners do, but much better than the weeks it had taken me in the past).
That served as an important lesson that when a kid plays a game, his mind is also the game at the same time, filling the blanks with its imagination.
A very similar experience happened to me with harvest moon, which is the precursor of stardew valley.
and that game is faaar more emotional: you talk to people, over time you befriend them and they open up, you meet a girl, you marry her, have a kid
you get farm animals, you brush them, they become happy
you get attached
that game was also so powerful in me that in all naiveness I thought I wanted to be a farmer.
Eventually I grew up and hit puberty and from then on, I focused more on competitive games, like smash bros, cs and tf2.
and i dunno how to end a post so eat my fucking nuts17 -
Today’s lesson in C programming:
DON’T use
system( “clear” );
in Mac OS...
Causes a segfault in ur program when it is perfectly correct...
What happened was... a friend wanted help with C programming and had written this code... but it was getting segfaults randomly... just random segfaults when his code was correct...
I pinpointed the segfault to a printf statement, but the statement was correct...
Off to search the issue I went, found out that flushing problems can occur in printf if u don’t use \n.
This happens randomly. Thought this might be the reason...
Went to a VM running Arch Linux and tested the code there... worked perfectly. No issues whatsoever.
From a distant memory I remembered some people discussing that you should never use system( "clear" ); since it causes issues.
Thought to remove that line from code, thinking it wouldn’t make any difference.
Well, imagine my shock when the code worked fine after removing that freaking line...
M gonna blame this one on Mac OS since arch had no issues with it 😡😡
Now to find an alternative to system( "clear" );
Damn it, I spent 4-5 hours on this crap!!!!!!9
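For the record, one common alternative avoids spawning a child process entirely: print ANSI escape codes, which macOS Terminal and most Linux terminals understand. A small sketch, assuming an ANSI-capable terminal:

```c
#include <stdio.h>

/* \033[2J clears the screen, \033[H moves the cursor to the top-left. */
static void clear_screen(void) {
    printf("\033[2J\033[H");
    fflush(stdout);  /* no newline in the escape sequence, so flush manually */
}

int main(void) {
    clear_screen();
    puts("screen cleared without system(\"clear\")");
    return 0;
}
```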
My first contact with an actual computer was the Sinclair ZX80, a monster with 512 bytes of RAM (as in 1/2 a kilobyte).
It had no storage, so you had to enter every program every time, and it was programmed in BASIC using key combinations; you could not just type out the commands since it did not have enough memory to keep the full text in memory.
So you pressed the cmd key along with one of the letter keys, and possibly shift, to enter a command, like cmd+p for print, and it stored a byte code.8
Spent 6 hours implementing a feature because my senior didn't want to use a 3rd party plug-in.
After said 6 hours, went to look at the plugin's source code to get some inspiration with a problem I was having.
Guess fucking what? Plug-in was implemented exactly as I had done it to start with. Even better, actually, since it fixed some native bugs I couldn't find a way around.
Went back to my senior, showed him both sources and argued again in favour of the plug-in.
Senior: "Meh, I'm not sure. Don't really like to keep adding plugins"
Me: "Why? Do they cause performance hits? Increase memory usage?"
S: "No, not all. But I don't like plugins"
/flip
We ended up using the plug-in, but I "wasted" a whole day doing something we scrapped. Guess I learned some interesting things about encryption on Android, at least...6 -
Four years ago, while still a newbie in Android dev and still using the Eclipse IDE (which was hell to configure by adding Android plugins), my girlfriend had a birthday.
With my newfound love of coding, I thought of developing a b-day app for her. With so little Android knowledge, I had a great idea: the main activity would have her photo as the background and a button which, when clicked, would show a toast saying "happy b-day love".
After spending a few minutes on Tutorialspoint learning how to display a toast and set click listeners on buttons, I was good to go and compiled the app.
Later that evening I headed to her room, where her b-day was to be held with some of her lady friends. When presenting gifts, I gave her my gift, said I had one more surprise for her, asked for her phone, and sent the APK over Bluetooth.
As the app installed, I was scared to death seeing how my grey buttons displayed on her 2.7-inch screen, since I had no idea about designing for multiple screens.
Giving her back the phone, she loved the app and I felt like her superman in the room, though not for long. Her lady friends went ahead, took her phone, and were criticizing the app:
Why can't I take a selfie
Why can't the app play a b-day song for her? And this went on, them not knowing how hurtful that was.
Bumped into the lady who led the onslaught on me and had to go down memory lane. Life is a journey.2
I just installed Opera Mini on my PSP. That isn't very exciting on its own, although I am stoked that my website does in fact render on a device from 2009. With the helpful guidance of a laptop from 2004 that's doing the hotspot duties for this thing.
No, what really got me stoked is that Opera still supports these old platforms, and how small they managed to make it. The .jar file for Opera Mini 4.5 is ~800kB large. There's a .jad file as well but it's negligible in size and seems to be a signature of sorts.
Let that sink in for a moment. This entire web browser is 800kB. Firefox meanwhile consistently consumes 800 MEGABYTES.. in MEMORY. So then, I went to think for a moment, how on earth did they manage to cram an entire functioning web browser in 800kB? Hell, what makes up a web browser anyway?
The answer to that question that I arrived at is as follows. You need an engine to render the web page you receive. You need a UI to make the browser look nice. And finally you need a certificate store to know which TLS certificates to trust. And while probably difficult to make, I think it should be possible to do in 800k. Seriously, think about it. How would you go *make* a web browser? Because I've already done that in the past.
Earlier I heard that you need graphics, audio, wasm, yada yada backends too.. no. Give your head a shake. Graphics are the responsibility of the graphics driver. A web browser shouldn't dabble with those at all. Audio, you connect to PulseAudio (in Linux at least) and you're done. Hell I don't even care about ALSA or OSS here. You just connect to the stuff that does that job for you. And WebAssembly.. God I could rant about that shit all day. How about making it a native application? Not like actual Assembly is used for BIOS and low-level drivers. And that we already have a better language for the more portable stuff called C.
Seriously, think about it. Opera - a reputable browser vendor - managed to do it in 800kB on a 12 year old device. Don't go full wank about your framework shit in the comments. And don't you fucking dare to tell me that there's more to it. They did it, for crying out loud. Now go take a look at your shitpile of JS code and refactor that shit already. Thank you.21
imagine having kernel memory leaks in 2020
AT&T or Huawei, whichever, pushed an update for my already-struggling-to-exist phone that made the kernel memory leak go from 480KB/hr avg to 22.5MB/hr avg. When my free RAM is never over 50% of 2GB after the kernel starts loading other shit, and I'm able to express the free RAM, at any time in use, in megs with 8 bits... this means my phone crashes every 3-4 hours from running out of RAM, with no apps running aside from a trimmed list of stock apps. The only usable (read: not R/O, because unrooted) swapfile is located on a tmpfs, so it's completely fucking useless (and it eats another 100MB of RAM that I could be using for LITERALLY anything else; that's like another 3 hours of full idle between crashes). And I can't unlock the bootloader to fix any of this, as Huawei no longer hands out keys and it'd take 7 years or so to brute-force (32-bit @ 10/sec)
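(and for the record, the 7 years figure roughly checks out; quick napkin math, assuming you land on the right code halfway through the keyspace on average:)

keyspace = 2 ** 32          # 32-bit unlock code
avg_tries = keyspace // 2   # on average you hit it halfway through
rate = 10                   # attempts per second
years = avg_tries / rate / (60 * 60 * 24 * 365)
print(f"{years:.1f} years")  # ~6.8, so "7 years or so" indeed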
tl;dr: fuck15 -
In 2010, it was my first client project. Our architect was not from an iOS background, and we had editable PDFs in our app. Those were pretty rich PDFs with inline HD images. iPads at the time were not too fast and couldn't handle a big, gigabyte-sized PDF loaded into memory. The app would crash randomly, running out of memory. We fixed it by paginating the PDF. It wasn't out of this world, but considering it was my first mobile project and there was no one to guide me, I thought what we did there was pretty cool
-
Arguing with my girlfriend (recalled from my mind, not 100% accurate)
she: What do you expect when you buy an android?
Me: sure thing apple is more "unpack it, use it", easy to use - but android is more like an empty canvas. The first thing when I buy it is setting it up to my needs.
she: You don't understand, what do you expect from your android device?
me: It has to be affordable and work for a certain time
she: No I mean, do you.. when you unpack your phone, expect it TO WORK?
me: Sure, it's not like I buy a pile of trash, I expect it to work
she: you're too stupid, baka
me: ... ? *confused*
she: When you say it is like a canvas, isn't a canvas someday full?
me: yes, every phone, iPhone, Pixel, Samsung, every phone has a limited memory
she: *mad* you don't get it, silly
me: I want to but heh, I don't get it10 -
PM: this is our super fancy new CI/CD pipeline, it's the greatest. i expect you to learn and understand all this in no time.
devs: so i have to spend some more time on this topic because it's completely new to me and requires some learning...
PM: nooo, that's a super easy task with zero effort, my braindead hamster can do that in no time, so can i, and so can you! let's assign 1 story point for that.
~ 3 months later ~
also PM, after he has started developing as well: so i'm realizing there are many things that i have to learn, and it takes me some time. i haven't developed with C++ and <other tool stack> for a longer time. by the way, you guys don't need to check for any quality right now, we need to deliver fast. it's okay when you have memory overflows, when your code is completely crappy, when the architecture is poor, it doesn't matter.
he even has a subtask for migrating his code from a VS project to our new project structure, since he refused to learn our pipeline right from the beginning and created a VS project instead. シ why is this a subtask? this job can be done in no time, my left vanishing twin named Klaus, who has dyslexia and hates vim, can solve this task in 20 seconds!!!!11
(and still no PR, not even a feature branch in our repo)2 -
In college when we had programming labs where we had to use the schools unix server to compile and run.
My professor was very bad at explaining what actually needed to be done in the labs to the point where even the TAs didn't know what to do.
We were supposed to write an application in C to find out by "trial and error" how large we could make an array (or something like that, it's been too long). With this not being explained well and no one knowing that much about C, I wrote a loop that just kept growing an array until it couldn't anymore. I watched it consume 72GB of memory from the servers before quitting the loop and realizing, with the TA, what the professor really meant.
I now feel bad for the IT staff monitoring the system, wondering where 72GB just went...2
I am not sure which 24 hours was the craziest one, but I will pick 2.
This one happened just a few weeks after I started working for the one and only company I have ever worked for. The huge-ass multi-tenant website stopped working. There was an out-of-memory exception and nobody knew what was going on. I was still very new and knew shit about how it worked, plus my PHP knowledge was limited back then. Everyone was looking for the culprit, but with no luck. Then the next day I finally managed to find a fucking infinite loop in our weather plugin.
We were working on a moderately big project for a client. There was a lot of work lately (on different projects) and we were *very* behind schedule on this one. Deadline? You guessed it - tomorrow. What was worse is that we couldn't move it any further, because we had already done that once before. So I had to work for about 20 hours straight to kinda finish the work. Worst part? The client turned out to be a moron and half-scammer, so they are not our client anymore and the project was never deployed to production. Never again.2
I feel so guilty.
I had to make a hotfix today. It is the ugliest piece of shit code I ever intentionally created. But there was no other way. I swear there was no other fucking way!
My boss just assigned this to me. But because she thinks this needs to be a hotfix and can't wait for the next release we just have to change the server and not the client side of our application.
So I had to add a memory to our server so that it knows which high-level client method the multiple low-level calls to it are coming from.
It just doesn't make sense logically.
I mean I feel like I killed someone. And just so that we get less writes to our DB. I mean yes in some edge cases it is a huge speed-up...
But nothing this fix solves is a new bug.
I'm gonna take a shower now. For like an hour3 -
Rant:
I am at work, and someone says to me that this system we are working on is multithreaded. I tell them no, it's not multithreaded, and in this context things cannot happen concurrently. It's a single-core ARM7TDMI. Arguments ensue about the difference between multithreading, multitasking and multiprocessing. I proceed to explain that this is a multitasking, interrupt-driven system, with no context switching or memory segmentation, so one heap for all tasks, because that's how we have it configured, and there is only one core. So there is no way the error he just described could possibly happen. Then he tells me I'm wrong but refuses to even look at the processor manual and rejects the Wikipedia entry for multithreading. So I plan on calling off so I can just have the next two weeks off while he tries to figure out why two things are happening at once on this system. He deserves all the frustration that is to follow.1
For the last week or so I've been writing a userbot for Telegram. Completely from scratch, plus Telethon to not reinvent the wheel entirely. I'm coming from the codebase of an existing userbot.
That userbot is written by a good friend of mine, who makes 6 figures, and whom I respect greatly. However the code is a steaming pile of shit. Now that is not his fault, he largely inherited that code too, tried to fix it, failed, gave up.
I am reimplementing it entirely. I'm only looking at the modules, trying to understand them, and copying over the necessary bits and changing them where necessary. But I've come across some nasty shit.
Userbots often edit existing messages from real Telegram clients. They're kind of like a login to your account, but with a program rather than a regular client. You send a message from a real client, it sees it and does whatever it needs to, and edits your message to give you feedback. Which is great.
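For the uninitiated, that whole loop is a few lines with Telethon. A rough sketch, and only a sketch: the .ping pattern and the credentials are made up, while the Telethon calls themselves are the real API as far as I know:

from telethon import TelegramClient, events

# placeholders; you'd get real values from my.telegram.org
client = TelegramClient("userbot", api_id=12345, api_hash="0123456789abcdef")

@client.on(events.NewMessage(outgoing=True, pattern=r"\.ping"))
async def handler(event):
    # the feedback step: edit our own just-sent message in place
    await event.edit("pong")

client.start()
client.run_until_disconnected()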
However, there's no need to do simple string edits by importing "re". So why do you? Because you're an idiot, that's why. The old bot is based on Paperplane, which in turn is based on Telethon. Why do I see function calls to Telethon in some places and Paperplane in others? Because you're an idiot, that's why. Why does the dig module fail to even give correct answers? Because you know nothing about the DNS, that's why. And you didn't learn about RRs before implementing it.
And don't you tell me that this code is shit, and this bot is slow only when I run it on a fucking Pentium. I run this shit on an i7 and CPU isn't even the issue - memory, disk and such are. If you had any clue whatsoever about efficiency, you would've known because it's blatantly obvious. There's a reason why my machines rarely go past 5% CPU utilization. It's the fastest component in the entire fucking system.
When users come and say.. hmm this application of yours, it consumes a lot of memory. It takes a long time to do X and Y and I don't quite understand why, it seems illogical. Then maybe you should go look at your code, like you would look at yourself in the mirror. And then you fucking go fix it so that I don't have to. You're an engineer just like I am. And I am not even a dev proper - I'm a sysadmin by trade. Why should I have to fix your shit for you?1 -
This is stupid, but I think it's my best idea yet.
So I have an old Orange Pi with only 256M of memory. It's running a few tasks I need, but I wanted to use it for controlling a few things from my phone (lights and powering on my PC), so I thought I would make a server for that. Now mind you, my shirt doesn't say "lightweight backend language", so there was no way the Pi could've handled a Struts server. I was digging around and found that PHP has a shell_exec command. Then it clicked, and I wrote the whole system like
shell_exec("java -jar someprocess.jar"). Now this sounds really stupid, but it works, and PHP is really light, so it doesn't even slow things down that much.
Thinking about making this into some kinda server/framework/something just for fun.4 -
Tried to figure out why my computer was being slow and lagging earlier. Thought it may have been a bad update to the kernel I recently did, or an update to a package.
No, it was chrome and its horrible memory usage.7 -
Fucking hate my job 😡
I joined as a Node.js dev at an MNC 3 months ago, working on banking software in which I don't have any domain knowledge.. For the first 10 days I was told to go through a fucking Udemy Node.js and GraphQL tutorial (wtf), which I already had experience with before joining.. After that my reporting manager gives me a task to resolve fields, and a shitty Jira story link to read.. That shit story link had no explanation about the fields or what the database is. Then she says to use some shitty SDK built internally by a shitty developer, which had no documentation, and I have to follow another module which was again written by that same sr. dev... They have fucked up the GraphQL and Node.js and the entire stack, and to date no one has ever given any explanation about the domain, the fields or the database schema.. This manager refuses to share knowledge about the domain, so how the fuck do I resolve the GraphQL schema, which was again written by a non-technical BA.. All they have done is use the latest technology in a shitty way, with no standards to follow.. no dataloading, no caching, no batching.. a shitty SDK which does not give access to the DB connection, and Express.js coupled in so fucking tightly that when I start it, it consumes a crazy 400MB of memory.. These fucking senior devs + the fucking BA, having 12+ yrs exp each, have fucked the entire codebase... Each day is killing my passion for app development.. fuckkk... Dunno what to do now5
I'm convinced this is going to be wildly unpopular, but hey...
Please stop writing stuff in C! Aside from a few niche areas (performance-critical, embedded, legacy etc. workloads) there's really no reason to, other than some fumbled excuse about "having full control over the hardware" and "not trusting these modern frameworks." I get it, it's what we all grew up with being the de-facto standard, but times have moved on, and the number of massive memory leaks & security holes that keep coming to light in *popular*, well-tested software is a great reason why you shouldn't think you're smart enough to avoid all those issues by taking full control yourself.
Especially, if like most C developers I've come across, you also shun things like unit tests as "something the QA department should worry about" 😬12 -
It grinds my gears to no end how insanely BAD most electrical engineering software is. Let's start with Tina, a circuit simulator. A few versions ago it was rather good, but now it feels like it's built upon more legacy crap than fucking Windows! This causes it to have memory access violations and crashes even when you look at it from an odd angle.
On the topic of circuit simulation: LTspice! It has fewer errors than Tina but is impossible to use without being lobotomized first. Who the FUCK decided it was a good idea to reinvent keyboard shortcuts by moving all of them to the F-row at the top of the keyboard? Also, there is no option to delete a component. YOU NEED TO USE CUT IN ORDER TO REMOVE IT!
And at last, Altium Designer for layouting and schematics, whose license costs 9 grand. No one outside of some companies will buy this because of the price. Altium realized this and made two watered-down versions of it, which don't really get updates anymore (the last one was in 2018). So they essentially made a cash grab from people who can't afford their actual product. There also exist other (and a lot cheaper) products than what Altium offers. The problem is that they don't work well with interoperability: schematics drawn in one program will look distorted in another, or not import at all. And since Altium is the industry standard, you've got yourself this nice steaming soup of impossible collaboration. It's kinda like Adobe being absolute shit at progressing their software just because they've got no competition. Or rather they do, but the industry won't switch because Adobe is so engraved into it.6
Just now I was reading on https://pve.proxmox.com/wiki/... about high availability. Now my Proxmox VE is just a tower (which happens to have ECC memory) that's stored in my storage room (and which is mostly used for experimental and home server purposes). But my mail servers.. those have been made with high availability in mind. Most importantly, I've made their services entirely redundant (but within the same datacenter). And when they have updates, I apply updates to one, reboot, see if it didn't break something and then do the same to the other server after the first one came up again. So no downtime whatsoever.
If memory serves me right, I think that I've been able to maintain these servers for the last year without any downtime at all (I reboot them every month to apply new kernels but they haven't both been simultaneously down at any moment). Does that make them High Availability? My interventions regarding their availability have been rather trivial. Is it really that hard..?4 -
Imagine asking your friends to help you rate your app on the google play store and instead of saying NO I DONT WANT TO RATE YOUR APPLICATION no... they decide to fuck with your mind.
1)
I will rate it tomorrow. (she never rated it tomorrow nor the next couple of weeks later)
2)
I will keep it in mind and rate it later :). (she never rated it later)
3)
I rated it haha (less than 30 seconds later they deleted the rating)
4)
Send me a link and I'll rate it (i send the link, they never respond or read my message again)
5)
I dont have memory on my phone :) (because 13MB of memory is a lot of storage requirements but taking 1 million selfies of up to 25GB is completely fine)
6)
I dont have memory on my phone what dont you understand :) x2 (this is the second girl)
7)
Your trying to give me a virus?? No (i got blocked multiple times)
8)
You want to hack me by making me install this application from the link that you sent me that leads to google play store? No (blocked)
9)
Rate your app? Haha i dont care about it because it doesnt bring me any benefit only the fat cocks that fill my pussy up satisfy me and not ur app haha
10)
Haha send me a link ill rate it (i send link, 8 hours later no reply or reading my message, i text her back if she had done it and im still put on ignore)
...
N)
more
----
Notice how none of these people have said the 2 letter word: "no".
All of these 10 examples are based on a true story.
All of these 10 examples are different people.
---
How hard
Can it be
To just
Write
no
---
.
---
For all of you who are about to trash-talk, saying I am desperately begging people to rate my app:
I've known all of those people for a long time. But when it comes to asking (and not forcing) someone to do you a favor for free that takes no more than 30 seconds, no one seems to have 30 seconds of their free time. Don't get me wrong, some of my friends did politely rate it and leave a review; even people who I barely knew left a review and rated it. But the people I was closest with didn't.
---
In the beginning I used to not care about this at all. Then I started falling into depression because of it. Then I fell into deep depression. Then I sunk so deep that I couldn't feel any emotions anymore, so I laughed as an anti-depressive mechanism whenever something depressing happened. Now I can't even laugh, because I have no more energy. Now I actually shed man tears
---
The only thing more valuable than people, any materialistic thing, animals, coding and even money - is time....
----
why do you waste my time
if i ask you to do something that takes 30 seconds and you dont want to do it
why cant you just say no
why do you drag me
why do you say you're going to do it when you know you wont do it
what do you gain by unnecessarily lying to someone for such a small thing?
to someone who has been a good person to you?
do you feel superior?
is your ego bigger?
----
This experience has taught me that not even a human of the same blood can be trusted.
All of you are fucked up in the head in your own style, and I am guilty of it too; all of us are.
But I have never seen human evolution go from simplicity to such overengineered complexity bullshit, where you have to lie to someone and waste hours, days, weeks, months and sometimes years of their time just because you don't want to say a 2-letter word: no.
But when that person becomes more successful than you and achieves higher status, THEN you have those 30 seconds of free time. All of you are fucking cynics, and I am so overly disgusted by all of this fucking bullshit....
-----
This experience has proven to me that I should simply focus on investing in myself, learning and improving myself, and no one else. To not even bother asking for a small kind of help, or feedback on my work, because people don't have 30 seconds of their free time. That is all.12
So I was writing docs for my project the hard way. Manually. Every time I added something new, I had to find the right place for it alphabetically in the reference. And god forbid I wanted the same information on two pages: I would need to copy/paste and pray that I didn't forget to update the information EVERYWHERE.
I didn't feel like installing and learning some new markdown-generating bullshit, so I just made my own system. It's very simple and intuitive and I love it.
I made sure it covers two use cases: reading partial documents from the disk, and rendering in-memory objects to markdown too (like rendering a collection of tuples into a table)
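The in-memory part really is just string glue. A toy sketch of the tuples-to-table idea (invented names, not my actual code):

def tuples_to_table(headers, rows):
    # render a collection of tuples as a markdown table
    lines = ["| " + " | ".join(headers) + " |"]
    lines.append("| " + " | ".join("---" for _ in headers) + " |")
    for row in rows:
        lines.append("| " + " | ".join(str(cell) for cell in row) + " |")
    return "\n".join(lines)

print(tuples_to_table(("name", "size"), [("foo", 12), ("bar", 3)]))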
I didn't care much for templating, so there's no templating capability.2 -
Many people here rant about the dependency hell (rightly so). I'm doing systems programming for quite some time now and it changed my view on what I consider a dependency.
When you build an application you usually have a system you target and some libraries you use that you consider dependencies.
So the system is basically also a dependency (which is abstracted away in the best case by a framework).
What many people forget are standard libraries and runtimes. Things like strlen, memcpy and so on are not available on many smaller systems, but you can provide implementations of them easily. Things like malloc are much harder to provide. On some systems there is no heap to dynamically allocate from, so you have to add some static memory to your application and mimic malloc by handing out chunks of this static memory. Sometimes you have a heap, but you need to acquire the rights to use it first. malloc doesn't provide an interface for this; it just takes it. So you have to acquire the rights and bring them magically to malloc without the actual application code noticing. So even using only the C standard library or the POSIX API can be a hard-to-satisfy dependency on some systems. Things like the C++ standard library or the Go runtime are often completely unavailable or only rudimentary.
For those of you aiming to write highly portable embedded applications please keep in mind:
- anything except the bare language features is a dependency
- require small and highly abstracted interfaces, e.g. instead of malloc require a pointer and a size to be given to you application instead of your application taking it
- document your ABI well because that's what many people are porting against (and it makes it easier to interface with other languages)2 -
fuck; goddammit; shove it up your ass.
this fucking shit Xamarin. I wish the ass who programmed the Xamarin VS2017 integration would go fuck off.
srsly, I just want to fucking code, but this fucking fucker VS2017 keeps shitting all around me
first I was gonna install it. didn't install because no memory left. fair enough, my fault there.
cleaned 35 gbs.
finished installing VS, with Xamarin. The FIRST GOD DAMN TIME I create a fucking project: 2 fucking errors and 3 warnings. I DIDN'T EVEN TYPE A COMMA.
ok, tried fucking with it. it seems to be a conflict between the Android version and Xamarin Forms. fuck you, it shouldn't be like this. anyway.
tried downloading the updated Android version.
it failed at 80%! what error, you ask? missing fucking space. ok, fuck, that thing is huge. ok, my fault again. uninstalled all programs I was not using, all projects I'm not currently working on. another fucking 30GB free. tried again. ANDROID IS TOO FUCKING HUGE, CAN'T INSTALL IN 30GB!!!
Ok. instead of updating Android, gonna downgrade Xamarin. can't downgrade. ok, gonna remove it and install an earlier version.
uninstalled. CAN'T FIND XAMARIN DLLS.
I was like, fuck this project, gonna start a new one. ok, all seems fine, for some weird reason. Except no. I try adding a new page, oops, APPARENTLY VS2017 CAN'T LOAD A GODDAMN .XAML
Ok, I can create a .cs page. done, except now I get a fucking timeout error. fuck.
I search the internet for a workaround, and see a guy saying I could manually add a .xaml + .cs by creating the files myself and then adding them to the proj file.
did it. I go again, everything seems fine. but now I can't freaking reference the damn page.
I'm fucking losing my mind here.
In the meantime, I have to turn in this project at the end of the week AND I CAN'T FUCKING OPEN THE GODDAMN FREAKING PROJECT PROPERLY!
FUCK. MY. LIFE.
FUCK XAMARIM AS WELL
FUCK VISUAL STUDIO
FUCK MICROSOFT
FUCK THAT DAMN SSD
FUCK THAT BOSS WHO THINKS THAT A 128GB SSD IS ENOUGH
FUCK IT ALL...15 -
I really think there should be a subject in every CS course to teach us how to handle/work-under Grade-A assholes and dumbfucks. Not that it would help, but atleast warn us on what we are getting into.
In my opinion, development is not *that* hard or frustrating but is made so by these shitty people. But again, what do I know.
I was scolded by my boss recently for using a for-loop to iterate through an array. Apparently for-loops are not used in real-world projects, and this iteration should be done "in-memory". My colleagues and I are still trying to understand and process that.
I was asked to add fitbit integration to a project within 2 hours just because I had "already done it a week ago" in *another* project. Luckily, it was then given to a "senior" developer who took 4 days for it and essentially copy-pasted my work without much changes, ofcourse it stopped working every now and then.
I am given unreal deadlines on my tasks, on technologies I haven't worked on before, and then expected to churn out production ready code with no bugs in them.
My boss literally just sends me the links of 1st three google results on the problems I encounter and report, after humiliating me ofcourse. Yes, I did google it and yes I went through all I could find from Google forums to GitHub issues. When the library/plugin author himself says that this feature is not yet available, don't expect me to develop it in 2 hours you dumbfuck.
And for the love of God, please stop changing the data model every single day and justify it with agile development. Think before making any changes to it. Ever heard of Join queries? Foreign keys? Or any other basic database concepts.
We reached a point where each branch in the repo had different data model. Not kidding. And we were a team of just 4 developers. Atleast inform us when you change models after discussing it with your shit for knowledge "senior" developer, so we don't have to redo it all over again. The channels on slack are not for sharing random articles only.
I am just waiting to complete my year here.
I should have known what I got myself into the day he asked me to remove the comments I had added to explain what my code does. Why you ask? Because "we don't write comments". -
Absolutely not dev-related.
Blah, blah, weird conversation and shit. I'm too tired and lazy to write this crap again, but let's do it.
The guy is a dev I randomly found on some chatting service; he was interesting to talk to until this conversation. I'll write this from memory, so yeah.
Him: So by the way I wrote an app that you give your penis size to to get measurements and stuff about it.
Me, thinking it was dev humor: That's hilarious. Tell me more, I'm interested.
Him: So the idea behind all of this was to gather some big data style info about people's penis size and habits and all that stuff.
Me: Man that's awesome. Can I see the source?
Him: No, it's proprietary. You can buy a license though.
Me: You went that far for a joke?
Him: What joke?
Me: The whole software you just told me about.
Him: That's not a joke, I'm being very serious about it.
Me: Oh well. What did you get from the stats?
Him: I got some tips from people's habits! I never thought that shaving it could make it look bigger, but that's awesome!
Me: Do you really care about it that much?
Him: Studies have proven that size correlated with confidence. Since I started doing it, I've been more confident than ever!
Me: Great.
Him: I'm a bit disappointed to see that I'm in the lower percentiles though.
Me: Well of course you are.
Him: Why would you say that?
Me: Well since people with a big dick tend to go more willingly into the subject and might even buy a fucking app for it, of course you'd have the higher average in your stats.
Him: You're only saying that because you have a small cock.
Me: Why the fuck would you say that? You're the one that's concerned about it, not me.
Him: Go on, what's your size?
Me, because I don't care about discussing that stuff: *Tells him*
Him: [stats, comparisons and stuff]
Me: Well I never gave a fuck and your stats won't make me change my mind.
[ ... Some other shit about my size compared to his ... ]
Him: Would you want to work with me for the database maintenance?
Me: You must be joking?
Him: I'm serious.
Me: *Deletes account*
Seriously, fuck that guy. I rewrote that quickly so you only had the best, but it was a whole fucking conversation.3 -
A girl sets out on a journey in the post apocalypse, to find the reason why the AI that ran humanity vanished decades ago, causing civilization to collapse. Instead she finds the most unusual pair of survivors, and receives the most unexpected answer.
Alice walked into the ivy-covered room, the floors covered in dust and lichen. There were two voices, mumbling in the dark, among the blue glow across the room. She came here for answers. Why the world had just stopped decades ago. If these machines could tell her, she would do anything to make them talk.
"No, no, no. I said before thats not the answer. I read the book. Your memory is bad."
"Atlas, the answer to life, the universe, and everything..why hello?"
Alice raised an eyebrow, and stepped forward. "Ahem. I'm alice."
"yes, yes, we knew that."
"I came here to find out why the blackout happened decades ago."
"Another one? Alright, lets see. Its been a LONG time. I'm apollo, and this is atlas. We were just discussing why my friend here is wrong."
Atlas - I anticipated that.
apollo - I knew you would say that.
alice - Guys. Stop, I just want you to answer my question already.
apollo - Straight to the point. About time.
alice - why the blackout then? Why leave us to die?
Read the rest here (5-10 minute read):
https://pastebin.com/wvifGLFP
(because it was too long for devrant).6 -
My two cents: Java is fucking terrible for computer science. Why the fuck would you teach somebody such a verbose language with so many unwritten rules?
If you really want your students to learn about computers, why not C? Java has no pointers, no pass-by-reference, no manual memory management, and lots of obscure class structures and design patterns; this shit is garbage. The student will almost never have contact with the compiler; many don't even know of the existence of a compiler.
Java is so enterprise-focused and just fucked up for educational purposes. And I say this as somebody who (still) uses it as their main language.
If you want your students to be productive and learn about software engineering, why not Python? Things are simple in Python and can be done way more easily without students becoming code monkeys (assuming they don't use a whole library for each task). I mean, Java takes a whole goddamn class and an explicitly declared entry point (which is, btw, fucking verbose) just to print something to the console.
Fuck Java.17 -
I know I'm writing the correct integration tests when each one I add uncovers a new bug.
Still, it would be nice if just one of them passed first time.1 -
My new favourite commit message:
"All changes as of 18th Sept"
How tremendously useful? There I was looking to know what changes were made to enable a feature / service, thought I could look for that in the commit message, but no you've given me a much more efficient way of finding out.
I simply need to download the contents of your memory, find out what date you made a change, and then dig through the massive commit to find the piece of info I need.
Forget experience using Git features, managing merges, following Git flow, or even any other SCM ... how can people be so thick when it comes to recording what they've done?
Here's a little cheat sheet for those struggling:
- Commit message
Describe what you actually ****ing did. Don't tell me the date or the time; thankfully Git records those. Don't tell me the day of the week, if I need to know I can figure that out. Just tell me what ... you ... did. (There's an example after this list.)
- Feature branch names
Now this is a tricky one. You might be surprised to know that this isn't in fact supposed to be whatever random adjective or noun popped into your head ... I know, I too was shocked. The purpose of this is to let other people know what new feature is being worked on in this branch.
- Reusing feature branches
Now I know you started it to add some unit tests, and naming it "testing" is sort of ok. But it's actually not ok to name it testing when you add 3 unit tests ... then rip out and replace 60% of the business logic. Perhaps it would have been wiser to create a new feature branch, given you were by then working on a new feature.2
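Since I promised an example, here's roughly what I mean by describing what you did (invented, obviously):

Fix duplicate invoice rows on concurrent webhook delivery

Deduplicate by provider event id before inserting, and add an index
so the lookup doesn't table-scan. No schema changes otherwise.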
A friend came to me asking whether I wanted to do a project in C++ (someone had asked him to find a C++ guy).
Needing money, I didn't refuse. Even though I am a Java developer with 0 skills in C++, I wanted to give it a try.
So the project started, and it was about a plugin for the Rhinoceros app (a 3D graphics app).
The plugin was simple: it had some views and some services to upload a file into S3, plus some API calls. Not something complex..
So I ended up working on the project together with my friend (a web dev).
So long story short, we had a lot of issues, but considering we both had no knowledge of C++, we were really lucky to finish the product almost on time (3 days late).
Did no memory management, even though I've read that we have to do that ourselves and that C++ doesn't have a garbage collector.
But the plugin worked great, even without a garbage collector.
Had a lot of issues with string manipulation, which almost drove me crazy.
PS: did a post here before taking the project, to ask whether it is a good idea to take the project or not, had some positive and some negative replies, but i deleted the post since i thought i was breaking the NDA i signed 😂😂
PS2: just finished OCAJP 8 last week with a great score😃6 -
So, I work in a game development studio, right?
We're trying to launch the title on as many platforms as reasonable, because as a social VR app we're kinda rowing upstream.
So far, Steam and Oculus have been fairly reasonable, if oddly broken and inconsistent.
Enter store 3.
Basically no in-game transaction support (our asking prompted them to *start* developing it. No, it's not very complete). No patch-update system (You want an update? Gotta download the whole fsckin' thing!). No beta-testing functionality for most of their stuff ("Just write the code like the example, it will work, trust us!"). No tools besides the buggy SDK (Wanna upload that new build? Say hello to this page in your web browser!).
So, in other words: Fun.
We've been trying to get actively launched for two months now. Keep in mind that the build has been up on Steam and Oculus for over a year and half a year (respectively), so the actual binary functionality is, presumably fine.
The best feedback we get back tends to be "Well, when we click the Launch button it crashes, so fail."
Meanwhile we're going back and forth, dealing with other-side-of-the-world timezone lag, trying to figure out what is so different between their machines and ours. Eventually we get them to start sending logs (and no, Windows Event logs are not sufficient for GAMES, where did you even get that idea????), except the logs indicate that the program is getting killed so terribly that the engine's built-in crash handler can't even kick in to generate memory dumps or even know it died.
All this boils down to today, where I get a screenshot of their latest attempt.
I just can't even right now.5 -
I found this on a wiki with Haskell Humor... it's interesting...
How to Shoot Yourself in the Foot With Haskell: Putting the unsafe in unsafePerformIO!
You shoot the gun, but the bullet gets trapped in the IO monad.
Couldn't match expected type 'Deer' against inferred type 'Foot'.
While compiling your program the compiler produces a type error long enough to overflow a kernel buffer, overwrite the trigger control register and shoot you in the foot.
After trying to decipher the type errors from the compiler, your head explodes.
After you've finally found a way to circumvent the type system and shoot yourself in the foot, Oleg appears out of nothing and shoots you in the foot for coming up with it before him.
You shoot the gun but nothing happens (Haskell is pure, after all).
Your foot is fine, until you try to walk on it, at which point it becomes mangled.
You have a shootFoot function which you've proven correct. QuickCheck validates it for arbitrary you-like values. It will be evaluated only when you end up at the hospital. You hope this doesn't come to pass, as it actually returns a bullet-ridden copy of yourself and you don't want to be garbage-collected.
foreign import ccall "shootparts.h shootfoot" shoot_foot :: Gun -> Programmer -> IO ()
shootSelfInFoot = unsafePerformIO . shoot . foot $ self -- Shoot self in foot 0 or more times depending on evaluation order
No instance for (Target Foot)
arising from use of `shoot' at SelfInflictedInjury.hs:1:0
Possible fix: add an instance declaration for (Target Foot)
In the expression: shoot foot
You go to shoot yourself in the foot but the bullet is in the ST monad and the gun is in the IO monad, so you can't.
You ask Haskell to shoot you in the foot but by the rules of lazy evaluation you don't need the result yet so it doesn't happen.
You decide to shoot yourself in the foot but get distracted devising a ballistics algebra and wondering if you can do the calculations in the type system.
You want to shoot yourself in the foot but realize there is no Gun datatype so use Arrows instead.
You shoot in the direction of your foot, but since you are inside the STM monad you can just retry until you figure out what to do.
You shoot yourself in the foot, but you are perfectly fine as long you just don't evaluate the foot.
You shoot yourself in the foot, but nothing happens unless you start walking.
Don't forget about memory consumption! If you don't look, the bullet causes heap overflow. If you look, the bullet causes stack overflow.
You *appear* to have deliberately shot yourself in the foot, and yet your program actually runs perfectly OK due to lazy evaluation. (So long as you remember to not look at your foot...)
You aim the gun at your foot, pull the trigger and remove the clip. When you look at your undamaged foot, the hammer clicks on an empty barrel.1 -
I like developing on windows. Like many people here I got into development at home starting as a hobby when I was in school so there were things I still did on my computer that Linux wasn't really appropriate for.
I've made the jump to Linux in the past but found that it was awkward and annoying when I needed to do something that required Windows. And I hate doing dev out of a VM. So I've just got used to using Windows at home.
And honestly, I don't know what's happening to everyone who keeps getting broken Windows updates. I think I've had 2 in living memory.
It's in no way perfect but what is? I don't use Windows servers, just for when I'm at home. -
The process of making my paging MIDI player has ground to a halt IMMEDIATELY:
Format 1 MIDIs.
There are 3 MIDI types: Format 0, 1, and 2.
Format 0 is two chunks long. One track chunk and the header chunk. Can be played with literally one chunk_load() call in my player.
Format 2 is (n+1) chunks long, with n being defined in the header chunk (which makes up the +1.) Can be played with one chunk_load() call per chunk in my player.
Format 1... is (n+1) chunks long, same as Format 2, but instead of being played one chunk at a time in sequence, it requires you play all chunks
AT THE SAME FUCKING TIME.
65534 maximum chunks (first track chunk is global tempo events and has no notes), maximum notes per chunk of ((FFFFFFFFh byte max chunk data area length)/3 = 1,431,655,765d)/2 (as Note On and Note Off have to be done for every note for it to be a valid note, and each eats 3 bytes) = 715,827,882 notes (truncated from 715,827,882.5), 715,827,882 * 65534 (max number of tracks with notes) = a grand total of 46,911,064,418,988 absolute maximum notes. At 6 bytes per (valid) note, disregarding track headers and footers, that's 281,466,386,513,928 bytes of memory at absolute minimum, or 255.992 TERABYTES of note data alone.
All potentially having to be played
ALL
AT
ONCE.
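Don't take my word for the number crunch; it's trivial to check (pure arithmetic, mirroring my own assumptions above: 3 bytes per event, a Note On/Off pair per note, 6 bytes per valid note):

max_chunk_data = 0xFFFFFFFF              # max track chunk data length (32-bit field)
events_per_chunk = max_chunk_data // 3   # 1,431,655,765 three-byte events
notes_per_chunk = events_per_chunk // 2  # 715,827,882 Note On/Off pairs
note_chunks = 65534                      # every track chunk except the tempo-only one
total_notes = notes_per_chunk * note_chunks
total_bytes = total_notes * 6
print(f"{total_notes:,} notes")          # 46,911,064,418,988
print(f"{total_bytes / 2**40:.3f} TiB")  # ~255.992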
This wouldn't be so bad I thought at the start... I wasn't planning on supporting them.
Except...
>= 90% of MIDIs are Format 1.
Yup. Of the three, the one format seemingly deliberately built not to be paged is BY FAR the most common, even in cases where Format 0 would be a better fit.
Guess this is why no other player pages out MIDIs: the files are most commonly built specifically to disallow it.
Format 1 and 2 differ in the following way: Format 1's chunks all have to hit the piano keys, so to speak, all at once. Format 2's chunks hit one-by-one, even though it can have the same staggering number of notes as Format 1. One is built for short, detailed MIDIs, one for long, sparse ones.
No one seems to be making long ones.6 -
Learning Rust.
Holy brainfucking brain melt, those references, scoping and borrowing and cloning and whatnot, because there is no garbage collector, but also no direct memory management.
It's cool, but also hard for a noob coming from the JVM/Android. The compiler error messages are helpful, but I immediately found some cryptic ones that don't help me at all.9 -
it was not a technical interview.
just screening.
guy: tell me smth about redis.
me: key value, in memory storage.
guy: more
me: umm, the concept is similar to localStorage in browsers, key value storage, kinda in memory.
guy: so we use redis in browsers?
me: no, I mean the high level concept is similar.
guy: (internally: stupid, fail).3 -
Okay, sorry, I apologize to those to whom I claimed that properly asked questions do not get downvoted on StackOverflow.
I have 600+ rep, 20+ answers and questions.
I was doing something; it wasn't work, it wasn't homework or an assignment. I was doing it purely out of interest. I got stuck, and having no clue whatsoever, I asked a question. Got 3 downvotes, close flags, and someone commented that they aren't there to do my homework.
-_-
The question was: after applying Huffman encoding to an image (an array of pixels), how do I save it so that it actually occupies less memory?
And this https://stackoverflow.com/questions...4 -
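(For anyone who lands here with the same question: the answer boils down to packing the bit string into actual bytes, plus one byte to remember the padding. A rough sketch; the codes table comes from your Huffman tree, and the padding-byte convention here is just mine:)

def pack_bits(bitstring):
    # pack a string of '0'/'1' chars into real bytes, so 8 coded bits
    # cost 1 byte on disk instead of 8 characters in memory
    pad = (8 - len(bitstring) % 8) % 8
    bitstring += "0" * pad
    out = bytearray([pad])  # first byte records how many padding bits to drop
    for i in range(0, len(bitstring), 8):
        out.append(int(bitstring[i:i + 8], 2))
    return bytes(out)

# encoded = "".join(codes[px] for px in pixels)  # codes from the Huffman tree
# open("img.huff", "wb").write(pack_bits(encoded))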
Man wk89 awesome... bringing back a lot of memories. The one thing really stands out to me though is the software.
I see a lot of rants about people shocked that TurboC is still in use or that other DOS programs are still in production. A lot of bad can be said here, but I think often it's a case of us truly not building things like we did in the good old days.
What those devs accomplished with such limited resources is phenomenal and the fact that we still haven't managed to replicate the feel and usability of it says a lot, not to mention just how fucking stable most of it was.
My favourite games are all DOS-based; my most favourite of all time, Sherlock, is 103kB in size. When I started coding games I made a clone of it, and to this day I am still trying to figure out what sorcery is in the algorithm that generates/solves puzzles and makes it so fast and memory-efficient. I must have tried 100+ ways and can't even come close. NB! If you know, you can hint, but don't tell me. Solving this is a matter of personal pride.
Where those games really stand out is when you get into the graphics processing - the solutions they came up with to render sprites, maps and trick your eyes into seeing detail with only 4-16 colours is nothing short of genius. Also take a second to consider that taking a screen shot of the game is larger than the entire game itself and let that sink in...
I think the dramatic increase in storage, processing power and ram over the last decade is making us shit developers - all of us. Just take one look at chrome, skype or anything else mainline really and it's easy to see we no longer give a rats ass about memory anywhere except our monthly AWS/GCE bill.
We don't have to be creative or even mindful about anything but the most significant memory leaks in order to get our software to run now days. We also don't have constraints to distribute it, fast deliver-ability is rewarded over quality software. It's only expected to stay in production 3-4 years anyway.
Those guys were the true "rockstars" and "ninja" developers and if you can't acknowledge that you can take ya React app and shovit. -
News sites with infinite-scrolling are so damn annoying.
A new random article I am not interested in suddenly loads under the current news article when skimming through it by dragging the scroll bar, and then throws me far down into unknown territory due to the sudden change of the height of the page.
It also happens similarly on Imgur photo galleries: when I drag down the scroll bar to quickly seek through the images, the "explore posts" section suddenly loads hundreds of "trending" and "viral" (uninteresting junk and spam) photos under the gallery, and since this adds lots of height to the page, I get pulled right into it and my window is full of such posts. Both distracting and memory-consuming.
YouTube's infinite scrolling comments and video lists are acceptable as of writing, since they are on-topic, and no off-topic "trending" spam, and they do not load too much at once, which does not throw me down too far.
Quote from https://elite-strategies.com/infini... :
> The footer of a website is like the shoes of a person, it ties the whole outfit (or website) together. Footers are awesome because it gives you a chance to tell people where to go when they reach the bottom of the page.1 -
Not sure how to handle this one. My new company gave me a surface laptop to do dev ops work.
16 gigs of RAM but only a 256GB SSD?
Nothing is installed so far except MS Office and Acrobat, and I am already running into memory issues.
My last work machine had a 1TB HDD and 128 gigs of RAM (I know, overkill, but I could have several VMs up and running at once).
What the fuck? Apparently the CTO ordered this piece of shit.
Also no micro SD slot like other models, so I have no idea how this is going to fucking work.15
Most memorable co-worker was a daft idiot.
this was 10 years ago - I was working as a junior in my very first job, fresh out of uni, for a very small startup. It was me, and the 3 founders, for a very long time. Then this old (45, from my perspective then..) dev was hired.
This guy had no idea how to do the job. no common sense. the code confused him. the founders confused him. I was focusing on my work - and was unable to help him much with his. His only saving grace? He was a nice guy. Really nice.
But why was he so memorable, out of all the people I ever worked with? simple. He had a short term memory problem. Could not, even if he really tried, remember what he did yesterday.... when I asked him what his issue was, he decribed his life is like a car going in reverse in a heavy fog. "I can only see a short distance backwards, with no idea where I'm going".
Startup was sold to a big company. I became a teamlead/architect. He? someone decided he should be a PM. -
Fucking hate Qt.
Spent all morning trying to figure out how their bullshit QThreadPool works with their bullshit QRunnable, but after a bunch of bullshit asynchronous testing I figured out that my thread object was being collected and deleted before I was done with it, for no reason. Now if the race condition were documented... this wouldn't be an issue. But every Google search brought up nada. Eventually I resorted to turning off autoDelete on the runnable, but then I just have a memory leak, obviously.
I couldn't find a way to manually clean up a QRunnable in Python. What the fuck.
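For the record, here's the dance I ended up trying, in PyQt5. As far as I can tell (and none of this is properly documented, so take it as a sketch, not gospel), with autoDelete off the Python side keeps ownership, so you hold references while the pool runs and drop them when you're done:

from PyQt5.QtCore import QRunnable, QThreadPool

class Task(QRunnable):
    def __init__(self, n):
        super().__init__()
        self.n = n
        self.setAutoDelete(False)  # stop the pool from deleting us early

    def run(self):
        print(f"task {self.n} done")

pool = QThreadPool.globalInstance()
tasks = [Task(i) for i in range(4)]  # keep references alive while running
for task in tasks:
    pool.start(task)
pool.waitForDone()
tasks.clear()  # now let Python collect them instead of leaking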
I just went back to good old fashioned QThreads... This is why I quit Qt in the first place.18 -
Imagine you work in a mechanic’s shop. You just got trained today on a new part install, including all the task-specific tools it takes to install it.
Some are standard tools, like a screwdriver, that most people know how to use. Others are complicated, single-purpose tools that only work to install this one part.
It takes you a couple of hours compared to other techs who learned quicker than you and can do it in 20 minutes. You go to bed that night thinking “I’ve got this. I’ll remember how this works tomorrow and I’ll be twice as fast tomorrow as I was today.”
The next morning, you wake up retaining a working, useful memory of only about 5% of how to use the specialized tools and install the part.
You retrain that day as a review, but your install time still suffers in comparison. You again feel confident by the end of the day that you understand and go to bed thinking you’ll at least get within 10-20 minutes of the faster techs in your install.
The next morning, you wake up retaining a working, useful memory of only 10% on how to use the specialized tools.
Repeat until you reach 100% mastery and match the other techs in speed and efficiency.
Oops! Scratch that! We are no longer using those tools or that part. We’re switching to this other thing that somehow everyone already knows or understands quickly. Start over.
This has been my entire development career. I’m so tired.2 -
We had 1 Android app to be developed for a charity org, for data collection in a groundwater-level-increase competition among villages.
Initial scope was very small & feasible. Around 10 forms with 3-4 fields in each to be developed in 2 months (1 for dev, 1 for testing). There was a prod version which had similar forms with no validations etc.
We had received prod source, which was total junk. No KT was given.
The existing source had spelling mistakes, in this era of spell/grammar-checking tools.
There were rural names for classes and variables, in a regional language written in English letters. That regional language is somewhat known to some developers, but even they don't know the meanings of those rural names. This cost us greatly in visualizing the data flow between entities. Even Google Translate wasn't reliable for this language, due to low Internet penetration in that language's region.
OOP wasn't followed, so the exact same code exists in 10 places. If an error or bug needed to be fixed, it had to be fixed in all those 10 places.
No foreign key relationships was there in database while actually there were logical relations among different entites.
No created, updated timestamps in records at app side to have audit trail.
A small part of the existing source was quite good, with Fragments, MVP etc., while the other part was ancient Activities with business logic.
We have to support Android 4.0 to 9.0 across many screen sizes & resolutions, without any target devices issued to us by the client.
Then the Corona lockdown happened, and during it the client-side professionals suddenly became over-efficient.
The client started adding requirements like very complex validation with inter-entity dependencies. Then they started filing bugs from the prod version against us.
Let's come to the developers' expertise,
2 developers with 8+ years of experience each, and they don't know how to resolve conflicts in a git merge - conflicts which they themselves created by not following git best practices, like only appending new implementations to existing classes for easy auto-merging.
They think that handling click events is what's called development.
They don't want to think about OOP or well-structured code. They mostly don't want to re-use code, and when they copy-paste, they think that's called re-use.
They wanted to follow old-school Java development in the memory-scarce Android app lifecycle on end users' phones. They don't understand memory leaks, even when pinpointed by memory leak detection tools (LeakCanary etc.).
Now 3.5 months are over, that competition was called off for this year due to Corona & development is still ongoing.
We are nowhere close to completion, even for the initial internal QA round.
On top of this, nothing is billable, so it's like financial suicide.
Remember, whatever is said here is only 10% of what we faced.
- An Engineering lead in a half billion dollar company.4 -
Debugged a complex bug at 10 PM, drunk and eating potato wedges, while on the phone with another drunk co worker.
Woke up next morning and had no memory of the fix.7 -
I just got a new phone, a Tecno device with a measly 8GB internal storage. Decided I'll have to root it, and force part of my class 10, 32GB mem card as adopted storage.
Went online, learnt for a few weeks, successfully rooted, and began enjoying the vast benefits.
But there are no good endings. Months later, while doing some heavy gaming, my phone reboots... and everything in memory packs up and goes on a vacation...
The best QA in the world is your boss. He always jumps in and asks you to show him something that is not completed yet. He then acts like a professional and points out that the red is not red enough... You have no words, mark down all the design changes, pass a message to the designer, and then, finally, you forget what was going on in your mind!!! And it takes another hour for you to get your memory back....1
-
I have a friend that keeps his phone on "Battery saver" all the time... It's painful to always see that notification with "your phone is on battery saver mode".
He told me he does that because he does not want his battery to empty too soon...
A while ago he refused to use my Micro USB cable because he told me there might be viruses on that cable. ON THE CABLE!!!! Like viruses stored inside a cable that has no memory... A simple cable ( da fcuk? )
The worst part? He's studying computer science :/4 -
Anyone else have people that seem to constantly try to "prove" themselves to you in this weird, competitive way that only makes them seem... very annoying? I'll call him Bob here, but it's always something like:
Bob: Hi Almond, how's it going?
Almond: Ah not bad thanks, PSU blew up in the PC over the weekend though so that was a bit of a faff!
Bob: Ah no! How old's your PC?
Almond: Oh, like 7-8 years old now. I don't replace it often.
Bob: Really?! I replace mine completely every year.
Almond: Ah, cool.
Bob: Yeah, I'm a dev so I feel I need to. It's like my tool, you know.
Almond: Sure thing!
Bob: I actually spend quite a lot on it. I make sure it's got the fastest memory I can afford. Like, DDR5 stuff. That's really important, you know.
...etc., while I try to get out of said conversation for the next eternity.
Or:
(while in a conversation about a frontend bug I was looking at in Chrome devtools)
Bob: Hey Almond, you know Firefox actually had a plugin that did all this stuff before everything else?
Almond: Err, yeah, I think so. Used it back in the day.
Bob: It was called firebug. It was really good. Revolutionary.
Almond: Certainly was.
Bob: It was launched in January 2006 you know.
Almond: Right...
Bob: I used it back then.
...I mean damn, I'm all for being civil, but no-one cares you replace your PC every year, or that you know the year firebug was released, or that you once set up 5 identical PCs with different versions of Linux to run some benchmarks...14 -
One of the biggest challenges for me learning to program is my memory.
Some people can pick up concepts easily and have a field day. I have to keep practicing until I memorize it properly, and even then I have the tendency to struggle.
Does this mean I give up? Helllll no. I'm far from giving up with all the progress I've made.4 -
So recently I had an argument with gamers about the memory required on a graphics card. The guy suggested the 8GB model of.. idk, I forgot the GPU model already, some Nvidia crap.
I argued on that, well why does memory size matter so much? I know that it takes bandwidth to generate and store a frame, and I know how much size and bandwidth that is. It's a fairly simple calculation - you take your horizontal and vertical resolution (e.g. 2560x1080 which I'll go with for the rest of the rant) times the amount of subpixels (so red, green and blue) times the amount of bit depth (i.e. the amount of values you can set the subpixel/color brightness to, usually 8 bits i.e. 0-255).
The calculation would thus look like this.
2560*1080*3*8 = the resulting size in bits. You can omit the last 8 to get the size in bytes, but only for an 8-bit display.
The resulting number you get is exactly 8100 KiB or roughly 8MB to store a frame. There is no more to storing a frame than that. Your GPU renders the frame (might need some memory for that but not 1000x the amount of the frame itself, that's ridiculous), stores it into a memory area known as a framebuffer, for the display to eventually actually take it to put it on the screen.
Assuming that the refresh rate for the display is 60Hz, and that you didn't overbuild your graphics card to display a bazillion lost frames for that, you need to display 60 frames a second at 8MB each. Now that is significant. You need 8x60MB/s for that, which is 480MB/s. For higher framerate (that's hopefully coupled with a display capable of driving that) you need higher bandwidth, and for higher resolution and/or higher bit depth, you'd need more memory to fit your frame. But it's not a lot, certainly not 8GB of video memory.
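For the curious, that whole calculation as a tiny C program (the 2560x1080@60 values are just this rant's example; note the exact bandwidth comes out at ~498MB/s, the 480 figure above comes from rounding a frame down to 8MB first):

#include <stdio.h>

int main(void)
{
    const long width = 2560, height = 1080; /* example resolution */
    const long subpixels = 3;               /* red, green, blue */
    const long depth = 8;                   /* bits per subpixel, 0-255 */
    const long refresh = 60;                /* Hz */

    long frame_bits = width * height * subpixels * depth;
    long frame_bytes = frame_bits / 8;      /* one byte per subpixel at 8-bit depth */

    printf("frame: %ld KiB\n", frame_bytes / 1024);             /* 8100 KiB */
    printf("bandwidth: %.0f MB/s\n", frame_bytes / 1e6 * refresh);
    return 0;
}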
Question time for gamers: suppose you run your fancy game off an iGPU, in a laptop or whatever, with 8GB of memory in the system you're resorting to running off the filthy iGPU in. Are you actually using all that shared general-purpose RAM for frames and "there's more to it" juicy game data? Where does the rest of the operating system's memory fit in such a case? Ahhh.. yeah, it doesn't. The iGPU magically doesn't use all that 8GB of memory you've just told me the dGPU totally needs.
I compared it to displaying regular frames, yes. After all that's what a game mostly is, a lot of potentially rapidly changing frames. I took the entire bandwidth and size of any unique frame into account, whereas the display of regular system tasks *could* potentially get away with less, since most of the frame is unchanging most of the time. I did not make that assumption. And rapidly changing frames is also why the bitrate on e.g. screen recordings matters so much. Lower bitrate means that you will be compromising quality in rapidly changing scenes. I've been bit by that before. For those cases it's better to have a huge source file recorded at a bitrate that allows for all these rapidly changing frames, then reduce the final size in post-processing.
I've even proven that driving a 2560x1080 display doesn't take oodles of memory because I actually set the timings for such a display in order for a Raspberry Pi to be able to drive it at that resolution. Conveniently the memory split for the overall system and the GPU respectively is also tunable, and the total shared memory is a relatively meager 1GB. I used to set it at 256MB because just like the aforementioned gamers, I thought that a display would require that much memory. After running into issues that were driver-related (seems like the VideoCore driver in Raspbian buster is kinda fuckulated atm, while it works fine in stretch) I ended up tweaking that a bit, to see what ended up working. 64MB memory to drive a 2560x1080 display? You got it! Because a single frame is only 8MB in size, and 64MB of video memory can easily fit that and a few spares just in case.
I must've sucked all that data out of my ass though; I've only seen people build GPUs out of discrete components and gone down to the realm of manually setting display timings.
Interesting build log / documentary style video on building a GPU on your own: https://youtube.com/watch/...
Have fun!20 -
Was playing Fallout 4 a couple days ago, about 20 minutes in. The computer just shuts off. Like no power at all. I start up the computer again. Try Fallout 4 again. It shuts off at the opening video. WTF... I try Skyrim, wondering if the video card is busted. Skyrim runs perfectly fine. I start up Fallout 4 again. It runs. WTF...
Next day I try Fallout and about 20 minutes in, power off again. Now I am assuming a cooling issue and trying to read the temps with monitoring programs. Cannot really tell.
So today I take apart my laptop and vacuum every cooling orifice out. Vacuum any dust-looking crap I can see. There was dust in the fans. All clean. I run a memory test for a couple hours. Memory passes (it was brand new memory, thought maybe a flaw in the RAM). Now I run Fallout 4. Runs fine, zero issues for about an hour.
Me to myself: CLEAN YOUR DAMN COMPUTER MORE OFTEN! Okay...
In between I read about Fallout 4 causing system reboots and shutdowns due to loading and heating. Apparently something about Fallout 4 causes this more than other games. Wild... Pretty sure it was thermal shutdown protection going on.3 -
I usually crib about how stupid people are and how I struggle to stay afloat.
Let's switch some gears now. A post about some good people, products, and processes.
You know what the common theme here is?
The goodness here cannot be measured. Your first interaction with them makes you feel so comfortable that you start feeling butterflies.
These people just keep on giving. They are selfless. They are pure. They actually care.
And when you think it's done, then they give you some more.
What blows me away is, they don't expect or accept anything in return. Absolutely nothing. Not even a simple thank you.
And they are like a wizard. They walk into your life when you least expect them but need them the most. And when the task is done, they'll be gone before you even know.
No lingering, no drama, no bullshit. Just pure goodness.
Like my ex-lead at my current company, there's a very senior guy in the neighbouring team (the one they were initially going to hire me for), who also happened to interview me, and he is a gem.
He takes care of me like his own younger brother. Supports me and always answers my queries no matter how occupied he is.
And it's the same with good products and processes. They feel effortless. So smooth, and they add exceptional value to your existence. They give rise to wonderful companies.
You'd never experience a single negative aspect about them. No matter how much you try, things will just keep getting better until they don't need to.
And then they'll be long gone. Never to be seen again and never to be forgotten.
You cherish them only in your memory and wish they lasted longer. But they didn't because the purpose was served.
Such people and experiences inspire me. They push me to become a better human.
No matter how the world is or how it treats me, I must always live with high values and be a better version of past self.
The other evening, I was conversing with my mother where we spoke about some family friends who are insanely wealthy but humble and kind.
Mom and I mutually agreed that they don't have such good traits because they are wealthy, but they are wealthy because they live with humility, kindness, and pure intentions.
World is surely a beautiful place because of such people and I aspire to be one. May lord guide me well :)3 -
Avoid ACPICA if at all possible. It's one garbage-tier clusterfuck of bad design, horrible documentation and downright misleading and wrong code
It's meant to consist of an ASL compiler, disassembler, debugger, dumper, various user-space utilities and a kernel-resident OSPM implementation *if* you can figure out what belongs to what. Even just compiling this pile of trash is a mystery in itself. Think you need the source files in source/common? EEEEH, wrong. Well, at least partially, since most of them seem to be for the user-space stuff..? Other ones *are* needed on the other hand. At least the disassembler and/or debugger and/or dumper components seem to reference them. Not that I could figure out how to compile those anyway. The real path to your goal seems to be to ignore a seemingly arbitrary subset of source and header files until your linker stops complaining
There's also a bunch of configuration defines, some of which *you* define, some defined *for* you, based on again others. Of course most of them do stupid shit. Enabling the debugger automatically enables debug logging. Enabling the disassembler force enables debug allocation tracking... What?
The code itself isn't of much help either. Looking in "os_specific/service_layers" you find what looks to be reference implementations of acpica functions in certain os' like windows and unix. Of course I had a look because AcpiOsReadMemory is supposed to read physical memory and I don't know how I would even implement that. But hey, osunixxf.c (xf for interface... of course) should tell me. I'll let you see for yourself in the attached image. Apparently it does fuck all and just returns AE_OK. No error, no logging, no nothing. Just ok. As you can imagine, AcpiOsWriteMemory doesn't do much more either.
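(For those without the image: from memory, the stub in question looks roughly like this — paraphrased rather than copied verbatim, and it assumes the ACPICA headers for the types:)

/* Paraphrased from os_specific/service_layers/osunixxf.c: the "reference"
 * OSL routine for reading physical memory doesn't map or read anything. */
ACPI_STATUS
AcpiOsReadMemory(ACPI_PHYSICAL_ADDRESS Address, UINT64 *Value, UINT32 Width)
{
    *Value = 0;      /* no mapping, no read, no error, no log... */
    return (AE_OK);  /* ...and "everything is fine" */
}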
...okay so maybe physical memory accesses aren't actually used and these functions are some sort of relic from past times? Nope! They are absolutely necessary for doing low level device interaction. WTF. So finally I went to the linux source and checked how *they* implemented them, and just as I thought, these functions are anything but no-ops...
...So for what fucking reason do these stupid interface implementations even exist but to purposefully mislead you?? They aren't used for fucking anything! As far as I know Windows doesn't even *use* ACPICA and Linux have their own fork with working implementations... They just sit there, just to tell you how to NOT do it
So that's some of my thoughts about ACPICA. Note that I haven't even used it as a library yet, I just got it to compile and link and it already fucked with me this much.
There's also so much more I didn't mention, like that you *have* to modify the ACPICA source in order to get your own platform header working (else #error) even though the docs explicitly instruct you not to, but you get the point
Don't use ACPICA if you don't have to. Save your sanity for something that's worth it -
I'm getting beat up pretty bad by Rust. I like it so far but man is it hard. Imposter-syndrome is almost making me lose motivation. Almost, but I won't quit, one day I'll get there.
I think the primary reason I'm having such a hard time is that I'm trying to learn stuff that prevents me from making mistakes that I have never run into. I know a bit of the theory but have no hands-on experience with double-free errors, memory leaks and weird low-level stuff. I read the documentation and mostly understand what stuff is for, but when I go to write code I'm just like "now what?". I don't have enough experience to know when and where to use some concepts and I'm super lost. I don't know where to start, and the feeling of being completely overwhelmed by all sorts of new stuff is at the same time exciting and frightening.
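To make that concrete, here's the classic double-free — in C this compiles without complaint and blows up (or silently corrupts the heap) at runtime, while the equivalent Rust won't even compile, because ownership means a value can only be dropped once:

#include <stdlib.h>
#include <string.h>

int main(void)
{
    char *name = malloc(32);
    strcpy(name, "devRant");

    free(name);  /* fine: the memory goes back to the allocator */
    free(name);  /* double free: undefined behaviour, often a crash */
    return 0;
}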
I have never, as a programmer, thought something was hard. All of my past knowledge required dedication, work and patience, but I wouldn't say I ever felt something was *hard*. But Rust... damn. Rust is hard.
Hopefully at the end of this super steep learning curve I'll know a lot more stuff and have stronger "dev powers" and be one step closer to being as knowledgeable as some of you guys around here to whom I look up to.2 -
I'm rewriting a game from C++ to C just for the purpose of learning and adding more features. However, after I refactored the code, the game broke with a segmentation fault and I have no idea where the memory issue is.
I've been debugging for hours now and I've got nothing. FML5 -
Pretty much a sort of research work. The first assignment was: "look, we have this CAD viewer, but we would want to eventually optimize the structure of the mesh, so here's this method of minimizing the memory footprint. Try implementing it and integrating it with our application."
PS: the method is using triangle strips, where the next triangle reuses two vertices of the previous one, theoretically reducing the memory footprint of the mesh by 2/3 if the mesh is fully optimized. In the end, due to memory and performance constraints (this had to run on the first-gen iPad) and the overall application architecture, on-the-fly striping was unfeasible and gained no benefit, because striping an arbitrary mesh is a fucking hard task.
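To put a number on that 2/3: a triangle list spends 3 indices per triangle, while a strip reuses the previous triangle's last two vertices, so n triangles cost n + 2 indices. A quick sanity check in C (assuming a perfectly stripped mesh, which — as said above — an arbitrary mesh almost never is):

#include <stdio.h>

int main(void)
{
    long tris  = 100000;     /* arbitrary example mesh */
    long list  = 3 * tris;   /* triangle list: 3 indices per triangle */
    long strip = tris + 2;   /* strip: 3 for the first, 1 per triangle after */

    printf("list: %ld, strip: %ld (%.1f%%)\n",
           list, strip, 100.0 * strip / list);  /* ~33.3%, i.e. 2/3 saved */
    return 0;
}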
Another one was an implementation of smooth shading by recalculating vertex normals at runtime.5
Fellow Deviants, I need your help in understanding the importance of C++
Okay, I need to clarify a few things:
I am not a beginner or a newbie who has just entered this community...
I have been using C++ for some time and in fact, it was the language which introduced me to the world of programming... Before, I switched to Java, since I found it much better for application development...
I already know about the obvious arguments given in favour of C/C++, like how they are much faster and more memory-efficient than other languages...
But, at the same time, C/C++ exposes us and doesn't protect us from ourselves.. I hope that you understand what I mean to say..
And, I guess that it is a fair tradeoff for the kind of power and control that these languages (C/C++) provide us..
And, I also agree with the fact that it is an language that ideally suits our need, if we wish to deal with compilers, graphics, OS, etc, in the future...
But, what I really want to ask here is:
In this age and time, when hardware has advanced so much that memory efficiency or execution speed is technically no longer the topmost priority... These were the reasons for which C/C++ were initially created...
In today's time, human time matters more, and hence syntactically less complicated languages like Java or Python are much more preferred, especially for domains like application development or data science...
So, is continuing with C++ an endeavour worth sticking with in the future, or is it not required...
I am talking about this issue since I am in a dilemma about the use of C++ in the future...
I would be grateful if we could keep AI, Machine Learning or algorithm optimisation in mind... since these are the fields I am interested in...
I know that my question could have been posted in a better way.. but the chaos present in my mind regarding this question doesn't allow me to do so...
Any kind of suggestion or thoughts would be welcome and much appreciated...
P.S: I currently use C++ only for competitive programming or challenges...28 -
"Suggest an AV/AM product, Avast refuses to install."
I do malware research as a hobby and have for a while, so I can generally spot when something's up before I even run a program. If I'm unsure about it (or know something's up and wanna see its effects for S&Gs), I throw it into one of a variety of VMs, each with a prepped, clean, standardized "testing" state.
I see no point to AV/AM products, especially as they annoy me more than anything since they can't be told not to reach into and protect VMs (thereby dirtying up my VM state, my research, crashing the VM hypervisor and generally being *really* annoying) and they like to erase samples from a *read-only, MOUNTED* VHDX.
However, normal people need them, so I usually suggest this list:
• MBAM is good and has a (relatively) low memory footprint, but doesn't have free realtime protection.
• Avast is very good as it picks up a lot, but it eats a FUCKTON of resources. It also *really* likes to crash VM hypervisors if it sees anything odd in them.
• AVG is garbage. Kill it with fire.
• Using Windows Defender is like trying to block the rain with an umbrella made of 1-ply toilet paper.
• herdProtect is amazing as it's basically a VirusTotal client but it's web-based and not currently available to be downloaded. (Existing copies still work!)
• Kaspersky. Yes, it spied on US gov't workers. No, they don't care about anyone BUT US gov't workers. Yes, it's pretty good.
• BitDefender: *sees steam game* "Is this ransomware?"
hope this helps10 -
So another story about college and stupid team assignments that I have to be responsible for dealing with.
So we had an assignment in the Operating Systems 1 course about memory management, and we were a team of 3. Then came the time to discuss this assignment with the TA, and the night before I had to stay up finishing a project in software engineering (literally, they give us a description of a big project because that's what the course teaches, and I had to finish it in one all-nighter alone because my teammates just gave up).
When the discussion time came I was really tired, and then the TA asked me something really simple and I said it, but she told me I was wrong. I wondered a bit and then said no, what I said was right! She then asks my teammate (who is supposed to be a good friend) "did he say the right thing?" And his answer is a definitive "NO, he's wrong", and then he starts to say the right answer, which I swear was the same thing I said, just phrased differently. So I start to say again that I was right and that I had said the same thing in a different way, and she took that as an insult and said that I was shouting at her and being disrespectful to her.
When we finished I asked my friend if he heard me say it wrong and he said "I'm sorry but I didn't even hear what you said and I was afraid" WHAT THE FUCK, he just said that I was wrong to please her and make her feel like she is right and I had to be the wrong one even though I said it right but NOoo her pride is more important
All this was last semester and the second semester just started today and I go into operating system 2 and guess what? The TA got her doctorate and is now the professor for OS 2 when she doesn't even understand anything.
Really FUCK the academic system it feels like it is a grind more than actually gaining mastery of a subject.2 -
fuck oracle. fuck my company.
Using Oracle VM Manager/Servers to host Oracle Phone transfer solution without support coverage from Oracle.
Requiring Unix sysadmins to update to the latest release and not telling them that we have no coverage from Oracle if anything goes wrong.
Guess what... We've updated to Oracle VM Manager/Server 3.4.5, which was released this year and uses the fucking XEN hypervisor version 4.4.4, which has been deprecated and dead since who knows when. The latest release of XEN is 4.11. But that is not an issue, whatever, enterprise, legacy software, etc.
This fucking update introduced a memory leak on the hypervisor, which has been reported in the xen 4.4.4 history. Furthermore, we have no support from Oracle, which means that I have to dig through mailing lists and limited information on the net, since Oracle has a freaking support wall on nearly each of the major bugs found in that shitty software.
I have no idea whether any newer version of xen will work with that old Oracle Linux kernel or not.
Furthermore, Oracle provided great documentation on how to roll back the fucking update: reinstall the hypervisor. Riiiight. XEN does not have an export/import feature.
eh1 -
Mozilla really knows how to nudge one to not use email encryption by default.
Since Thunderbird has native support for OpenPGP encryption, I can only choose to encrypt all or no messages by default. There is no opportunistic mode and there are no per-recipient encryption preferences. The Enigmail addon had both.
So I obviously went for encrypt-by-default.
But since then, whenever I want to send a message to the majority of my contacts, I have to manually disable the encryption or get annoyed by the no-key-found dialog.
I thought I would get the muscle memory to just disable encryption for recipients for which I don't expect to have a key.
But they also made the GUI so that I have to open a dropdown and then click on the right item to do that. All the items basically look the same, as there is no color coding or specific icon for them. The item labels are also too long for unconscious pattern recognition.
So I didn't get that muscle memory.
I have now turned off encryption by default and will probably forget to enable it for some emails which I actually could send encrypted...4
Time for a rant about shitstaind, suspend/hibernate, and if there's room for it at the end probably swappiness, and Windows' way of dealing with this.
So yesterday I wanted to suspend my laptop like usual, to get those goddamn fans to shut up when I'm sleeping. Shitstaind.. pinnacle of init systems.. nope, couldn't do it. Hibernation on the other hand, no problem mate! So I hibernated the laptop and resumed it just now. I'm baffled by this.
I'll oversimplify a bit here (but feel free to comment how there's more to it regardless) but basically with suspend you keep your memory active as well as some blinkenlights, and everything else goes down. Simple enough.. except ACPI and I will not get into that here, curse those foul lands of ACPI.
With hibernation you do exactly the same, but on top of that, you also resume the system after suspending it, and freeze it. While frozen, you send all the memory contents to the designated swap file/partition. Regarding the size of the swap file, it only needs to be big enough to fit the memory that's currently in use. So in a 16GB RAM system with 8GB swap, as long as your used memory is under 8GB, no problem! It will fit. After you've moved all the memory into swap, you can shut down the entire system.
Now here's the problem with how shitstaind handled this... It's blatantly obvious that hibernation is an extension of suspend (sometimes called S3, see e.g. https://wiki.ubuntu.com/Kernel/...) and that therefore the hibernation shouldn't have been possible either. The pinnacle of init systems.. can't even suspend a system, yet it can hibernate it. Shitstaind sure works in mysterious ways!
On Windows people would say it's a hardware issue though, so let's talk a bit about that clusterfuck too. And I'll even give you a life hack that saves 30GB of storage on your Windows system!
Now I use Windows 7 only, next to my Linux systems. Reason for it is it's the least fucked up version of Windows in my opinion, and while it's falling apart in terms of web browsing (not that you should on an EOL system), it's good enough for le games. With that out of the way... So when you install Windows, you'll find that out of the box it uses around 40GB of storage. Fairly substantial, and only ~12GB of it is actually system data. The other 30-ish GB are used by a hibernation file (size of your RAM, in C:\hiberfil.sys) and the page file (C:\pagefile.sys, and a little less than your total RAM.. don't ask me why). Disable both of those and on a 16GB RAM system, you'll save around 30GB storage. You can thank me later.
What I find strange though is that, aside from this obscene amount of consumed storage, the pagefile and hibernation file are handled differently. In Linux both of those are handled by the swap, and it's easy to see why. Both are enabled by the concept of virtual memory. When hibernating, the "real" memory locations are simply being changed to those within swap. And what is the pagefile? Yep.. virtual memory. It's one thing to take an obscene amount of storage, but only Windows would go the extra mile and do it twice. Must be a hardware issue as well.
Oh, and swappiness. This is a concept that many Linux users seem to misunderstand. Intuitively you'd think that the swappiness determines what percentage of memory it takes for the kernel to start swapping, but this is not true. Instead, it's a ratio of sorts that the kernel uses when determining how important the memory and swap are. Each bit of memory has a chance to be put into either depending on the likelihood of it being used soon after, and with the swappiness you're tuning this likelihood to be either in favor of memory or swap. This is why a swappiness of 60 is default most of the time, because both are roughly equally important, and swap being on disk is already taken into account. When your system is swapping only and exactly the memory that's unlikely to be used again, you know you've succeeded. And even on large memory systems, having some swap is usually not a bad idea. Although I'd definitely recommend putting it on SSD in a partition, so that there's no filesystem overhead and so that it's still sufficiently fast, even when several GB of memory are being dumped in.6 -
Have been playing the pirated version of Rust for 30+ hours with no issues.
Decide to buy the game and every fucking time the game turns into Chrome and consumes all my RAM forcing Windows to show the low on memory dialog.
Lesson learned I guess.7 -
So we’ve taken over from a project team that disbanded... read: “cut their contracts because fuck this, I can earn more working for better people”.
Me and one other guy have been tasked with saving this heap of shit.
Obviously the project guys left saying “it’s nearly done, just this one feature”. Because cut contracts are easier to deal with if “everything is almost done”.
We jump on and find that’s not the case at all... this thing, is a beast, a big old stats analysis program... so we’re like “cool, let’s see what’s going o...OH MY GOD”.
The “recalculation” function was core to this POS. The contractors had done it in C# through entity framework... it took 24 hours to run, over a reasonably small data set that was due to double every 2-5 years.
So... here’s the deal, it ran over night.... then failed. And no cunt had noticed. Entity framework “can’t commit because I’m muddled up as fuck, did you really just put the whole db in EF in memory to work with it?” Exception.
Cue 6 months of me and my lead doing the job properly.
Anyway, the failure: I ended up in Hospital again with a Crohn’s flare up... about 5 months in.
Fuck all to do with all this nonsense, I just wanted to tell a story. It was an interesting/fun project to fix and my lead was a legend... so happy days.
Similar story, different set of contracted devs... they'd been defining requirements with the business users using the term "Risk", which the business users understood as a group of risks.
The domain model had been written RiskGroup<>— -
GOD DAMN IT COLLEGE YOU DID IT AGAIN. for real college can go suck Satan's 50 inch red cock for all I care.
A professor asked me to design a processor for a bonus. I said okay, cool, nothing hard.
oh but it has to be in verilog.
okay cool.
oh and it has to be on this fucking ancient useless piece of shit called xilinx that the fucking college provides to you only via a fucking 50 gigabyte virtual machine.
sigh. okay..... challenge accepted.
It fucking crashes every 2 minutes. And after 3 days of no sleep, I finally finished the ALU, control unit, 4K memory, 8 registers and the buses.......... BUT THEN THE ENTIRE VIRTUAL MACHINE CRASHED AND LOST ALL PROGRESS...... fml.
and the professor only gave me the bonus for the ALU. sigh. fuck college.11
Today my CPU was at 100% (red), RAM at 96% (above 95 is memory-leak territory) and disk at 95% (almost no space left) bc I ran an emulator, 2 Android Studio projects, Chrome etc. all at the same time1
-
Reanimated an old e-ink tablet today.
First, I didn't even know it needed to be reanimated. I just copied my books there, but it didn't find them. When I connected it again, they were gone.
Factory reset. Format storage. The memory seems empty, but after rebooting I see that everything is still intact.
Ok, imma hit the forums then. They tell me I need to replace the internal memory. But isn't that something you need soldering for? Wrong! The internal memory IS JUST A MICRO SD CARD on the motherboard. The card is some cheap no-name one, and people tell similar stories of it burning out after like four years of use.
Damn! The vendor has the AUDACITY to charge for signing their firmware to be flashed to a new micro sd card.
But I won't go down this easily. I hit the forums again, and apparently there is a tool to sign the firmware yourself, but you need to find the card's serial number. To do that, you have to flash a bootleg tool, boot from that card, and it will show you the data you need. Then, you have to insert that data into some shady .ini file (why does everything touching bootleg firmware run only on Windows?).
So I do that. The problem is, I need an image for my book. I find some shady one online, sign & flash it — touchscreen doesn't work. But I have the official firmware. I put two and two together and figure out that if the reader is able to display the ui, it probably has the firmware update tool working. So, immediately after flashing, I launch the firmware update utility that picks up my firmware from the second sd card (yes, they have an additional external slot).
Bingo. It works.
So, here are the steps:
1. Find a shady sd serial number detection tool
2. Flash it on a memory card with a shady vendor-specific flashing tool
3. Insert the new (now shady) card
4. Boot, write down the serial number
5. Find a shady boot image online
6. Edit a shady .ini file of a shady self-signing tool to sign the shady boot image
7. Flash the altered shady boot image with the shady flashing tool on your memory card
8. Copy a shady firmware update on a new card
9. Insert both cards
10. Pray4 -
!rant
How to earn a lot of money as a programmer?
So this question might sound a little naive and too simple, but earning a lot of money is what we all want after all right? Collecting experiences from people in the business should be a good idea.
So this is the position I am in:
I am a German student in my 13th year of school (which means I will graduate this summer) and I am very interested in information technology. I know C++ pretty well by now and I have already built a rendering engine using OpenGL for a game I want to make, which I am very proud of.
I would love to turn this passion into my profession, and that's why I plan to attend a dual course of computer science next year (dual means that I will be employed at a company (or similar) in parallel to the study course).
But what direction should I be going in if I want to make big money later on? I am ready to spend a lot of time and work on this life project but I don't know which directions are the most promising. I hate being a tiny gear in a huge machine that just has to keep spinning to keep the machine alive, I want to be part of a real project (like most people probably) and possibly sell a product (because I think that is how you really make money).
Now I know there is no magic answer to this, but I bet many people here have experiences they can share, and this could help a lot of people direct their path in a more success-oriented way.
I personally am especially interested in fields which are relatively low-level and close to memory (C++), go hand in hand with physics and 3D simulation and are somewhat creative and allow new solutions. (These are no hard lines, I just thought I should give a little direction to what I know already and what I am interested in)
But really, I am interested in any work you are likely to earn a lot of money with.12 -
Tell me you're a media-obsessed rube drone without telling me you're a media-obsessed rube drone. I'll start:
"SoFtWaRe JoB mArKeT iS hOrRiBlE aNd ShOwS nO sIgN oF rEcOvErY!!!"
hah, you mean those layoffs from that handful of frothed-over tech giants which had, I don't know, approximately ONE HUNDRED TIMES the number of engineers they actually needed? I swear if I see this trope one more time I'm about to rage. Can't wait until 2023 when this 'scare' will be but a memory. Yes, I'm Muad'Dib, golden path, worm god, whatever
but it's even simpler, you don't have to drink the spice:
- there are an estimated 205,741 people affected by the LaYoFfs (https://www.trueup.io/layoffs, actually a really cool site I just found)
- there are an estimated 3.87 MILLION software engineers, and that's just in the US — 205,741 / 3.87M is barely over 5% even if every single laid-off employee were a US software engineer, and since layoff counts include every role, the share of engineers actually affected is well under 5%
so in short yes, you are a rube; I'll enjoy my multiple job offerings
should have been working on your craft instead of reading all those "news" articles. Sheesh, I'd be scared to hire anyone for a software position who can't get a grip on simple numbers anyway6
A dev life in Queen songs:
„A Kind of Magic“ - Build successful
„A Winter’s Tale“ - Key Account Manager visits customer
„Action This Day“ - Release day
„All Dead, All Dead“ - System down
„Another One Bites the Dust“ - kill -9 4711
„Breakthru“ - 10 hour debuging session
„Chinese Torture“ - Microsoft Office
„Coming Soon“ - Client asks for delivery date
„Dead on Time“ - shutdown -t 10
„Doing All Right“ - How's the progress on the new feature?
„Don’t Lose Your Head“ - git push -f
„Don’t Stop Me Now“ - In the zone
„Escape from the Swamp“ - Hand in resignation letter
„Forever“ - while(1)
„Friends Will Be Friends“ - friend class Vector;
„Get Down, Make Love“ - No rule to make target "Love"
„Hammer to Fall“ - Release day
„Hang on in There“ - 2 weeks until release
„I Can’t Live With You“ - Microsoft
„I Go Crazy“ - Microsoft
„I Want It All“ - Google
„I Want to Break Free“ - free( (void*) 0xDEADBEEF );
„I’m Going Slightly Mad“ - Impossible feature requested
„If You Can’t Beat Them“ - Impossible feature promised by sales
„In Only Seven Days“ - Impossible feature ordered
„Is This the World We Created...?“ - Philosphic moments
„It’s a Beautiful Day“ - Weekend
„It’s a Hard Life“ - Weekday
„It’s Late“ - Deadline was last week
„Jesus“ - WTF?
„Keep Passing the Open Windows“ - Interprocess communication
„Keep Yourself Alive“ - Daily struggle
„Leaving Home Ain’t Easy“ - Time to get up and go to work
„Let Me Entertain You“ - Sales meets customer
„Liar“ - Sales
„Long Away“ - Project start
„Loser in the End“ - Dev
„Lost Opportunity“ - Job ad
„Love of My Life“ - emacs/vim
„Machines“ - Computer
„Made in Heaven“ - git
„Misfire“ - Unhandled exception at Memory location 0xDEADBEEF
„My Life Has Been Saved“ - Google drive/Facebook
„New York, New York“ - Meeting at customer
„No-One But You“ - Bus factor = 1
„Now I’m Here“ - Morning rush hour
„One Vision“ - Management goals
„Pain Is So Close to Pleasure“ - NullPointerExcption
„Party“ - Delivery completed
„Play the Game“ - Customer meeting in-house -
„Put Out the Fire“ - Support hotline
„Radio Ga Ga“ - GSM/GPRS/UMTS/LTE/5G
„Ride the Wild Wind“ - Arch Linux
„Rock It“ - Linux
„Save Me“ - CTRL-S/CTRL-Z
„See What a Fool I’ve Been“ - git blame
„Sheer Heart Attack“ - rm -rf /
„Staying Power“- UPS
„Stealin’“ - Stack Overflow
„The Miracle“ - It works
„The Night Comes Down“ - It doesn't work
„The Show Must Go On“ - Project cancelled
„There Must Be More to Life Than This“ - Philosophic moments
„These Are the Days of Our Lives“ - Daily routine
„Under Pressure“ - 1 day until release
„Was It All Worth It“ - Controlling
„We Are the Champions“ - Release finished
„We Will Rock You“ - Sales at customer
„Who Needs You“ - HR
„You Don’t Fool Me“ - Debugging session
„You Take My Breath Away“ - rm -rf /
„You’re My Best Friend“ - emacs/vim4 -
I really love my mother but.
A couple of weeks ago she asked me for advice regarding a laptop. She wanted something cheap for office and stuff.
Since I know her, I know exactly that she needs extremely fast boot and responsiveness. She'll go all hulk rage if the laptop doesn't boot in less than 30 seconds.
Told her to get something with an SSD, since storage is no issue, and 4GB RAM with a decent older i5. Took a whole day going through stores in my area and online to find good deals. Sent her everything I found. Really good laptops for under 500€ I would've killed for.
Fast forward. She bought some 300€ shit laptop because it had 1TB memory. She didn't ask for advice, just bought the cheapest one whose description read decently.
Now she is raging all day and bitching about it being so slow and how I should fix it for her since I'm an IT guy, etc.
Looking at the specs I nearly started to vomit. She seriously bought a laptop worse than the one she already had. Old i3, 2GB RAM, 5200rpm HDD.
I told her she should return it because it is shit. But no. She insists that since it's newer it is better, and that I am only a lazy fuck who doesn't want to be bothered to do her a favor.
Offered the best thing I could think of. Told her I'd install Linux on it for her and teach her how to use it.
Explained it would run more smoothly, since she refused to take that shit laptop back. But no. Of course she insists on using Windows 10....
FUUUUUUUCK. I love my mother but seriously I'm about to explode.5 -
Windows rant incoming!
For fuck's sake! I think Windows has asked me 117 times if I want to update now. The answer is still fucking no!
And I don't care how much of a security improvement it might be, when your shitty update causes a Memory Management error.
So fuck off, stop minimising my game while I play and go fix your shitty update first!
Fuck you Microsoft, fuck your QA team and while I'm at it, I want to say fuck you to all versions of Windows Server as well!5 -
Okay, it is 3:30 AM. Just woke up from bad geeky dreams. My heart is pounding so fast I could nosebleed, and I can't sleep, remembering I had the same dream last night.
The dream: me being an astronaut. Everything was normal, from the rocket launch to being in space. The scary part was my ship in orbit around the moon.
Seeing the dead land from that height shocked me. Imagine you are looking out of the window and all you see is a big grey land with pitch black in the background. Realising there is no one out there was spooky.
The scary part was that I launched some satellite, but it crashed on the surface. It was scary seeing something get smaller and smaller. Crashing on deserted land was one more thing adding to the fear.
Then my ship left the orbit (from the reverse shock of that satellite detachment) and floated away into the vastness of space......
Away from the moon and away from the earth, into long loneliness.
I wish I could erase this from my memory, but I am not gonna watch space exploration videos anymore.
I've got to say, landing on the moon is one thing, but being out there, knowing one accident means you will be there forever... You need balls to be on such missions.4
2nd part to https://devrant.com/rants/1986137/...
The story goes on...
After I found more bugs that seem to be related to the communication break, and took a closer look, I sent detailed logs of my research and today we had a conference call.
"We have 2.5 million users, our system is widely used and there is no plan to change it," they said.
And "We cannot reproduce the issue, but even if there is one, you will have to work around the problem, because we cannot make changes on our side" was one answer
As well as "If we would make changes, we will have to re-certify everything"
So I said we told 'em about the issue to let them improve their system. And I can work around it, I already figured out a solution for my side, but if there is a bug, they'd better fix it for future releases.
And with my additional research I have a bad vibe about some kind of memory leak involved in their "certified" implementation, and that could trigger various other problems.
But it is as always, if I try to be nice, I just get kicked in the ass. I should really be more of an asshole. -
** this means words are muted **
Friday:
I send the client a mail with a Google doc containing elaborate details about the evaluation of an Android tablet from a Chinese manufacturer.
Monday:
The client is upset, he says "You say there is no GPS chip on the tablet while the manufacturer says otherwise"
Me- "I have clearly mentioned that it has a GPS chip"
Client- Opens the Google doc, points to a sentence. Looks at me like I did something horrible.
Me - **This guy is either word-blind or something else is wrong with him, the line reads 'GPS chip available'**
Me- "Look, it says 'GPS chip available'.
Client- **Blinks n blinks again** "Alright, but why did you share a Google document, why not PDF, docx"
Me-**Politely** "You can download the document in any format, look I will show you..."
Client- "It should have been in the mail itself ideally"
Me- **WTH** "We normally maintain a document for such things to keep everything organised, but if you want I will put everything in mail itself"
Client- "Hmm.. do both from next time"
Me- "Alright" **BS**
Client- "Why is the new feature taking so much time"
Me- "As planned earlier, we going to deliver it tomorrow"
Client- "Why not today??" **Gives a strange look.**
Me thinking - **Enough**
Me- "See, I am trying to integrate a smarten with a socket connection, reading it's data via exposed APIs that are hardly documented, we need faster performance so I need to implement caching, multi threading, offline handling, multiple processes to avoid memory fluctuations, sync adapter to sync data...."
Client- "Ok ok ok, it's fine if you give working build tomorrow"
Me- "Ok, fine"
#limit1 -
You know shit is going to hit the fan if the sentence "C++ is the same as Java" is said, because fuck all the underlying parts of software. It's all the fucking same. Oh, and to write a newline in bash we don't use \n or so, we just put an empty echo in there. And fuck this #!/bin/bash line, I'm a teacher. I don't need to know how shit works to teach shit. Let's teach 'em you need stdio for printf even though it compiles fine without it on Linux (wtf moment number one, asking 'em leaves you with "dunno.."), and as someone who knows C you look at your terminal questioning everything you ever learned in your whole life. And then we let you look into the binaries with ldd and all the good stuff, but we won't explain why you can see a size difference in the compiled files even though you included stdio in the second one, and all symbol tables show the exact same thing — but dude chill, we don't know what's going on either.
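(The printf bit is real, by the way. Older GCC builds this with just a warning — "implicit declaration of function 'printf'" — because the header only supplies the declaration, while the implementation lives in libc, which is linked in by default; newer compilers, around GCC 14, finally made it a hard error:)

/* no #include <stdio.h>, on purpose */
int main(void)
{
    printf("hello, teacher\n");  /* warning on old GCC, yet it links and runs */
    return 0;
}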
Oh and btw don't use different directory names as we do in our examples. You won't find your own path, there is no tab key you can press to auto-fill shit.
But that's not everything. How about we fill a whole semester with "this is how to printf" but make you write a whole game with Unity and C#? (Not taught even the slightest bit until then, btw.)
Now that you half-assed everything because we put you in a group full of fucks who don't even know what a compiler is but want to tell you you don't know shit and show you their non-working unfinished algorithms in some not-even-syntax-correct java...
...how about we finally go on with Algebra II: complex numbers, how they are going to fuck up your life, how we can do roots of negative numbers all of the sudden and let you do some probability shit no one ever fucking needs. BUT WHY DON'T YOU KNOW EVERYTHING ALREADY HMMMMM, IT'S YOUR SECOND LESSON, YOU WENT TO SCHOOL PLS BE A MATH PRO ASAP CUS YOU NEED IT SO MUCH BUT YOU DON'T NEED TO KNOW PROPER SYNTAX, HOW MEMORY MANAGEMENT WORKS, WHAT A REFERENCE IS AND PLS FINALLY FORGET THE WORD "ALLOCATION" IT DOESN'T PLAY A SINGLE ROLE YOU ARE STUDYING SOFTWARE DEVELOPMENT WHY ARE YOU SO BAD AT ECONOMICS IT MAKES NO SENSE I MEAN YOU HAD A WHOLE SEMESTER OF HOW TO GREET SOMEONE IN ENGLISH, MATHS > ECONOMICS > ENGLISH > FUCKING SHIT > CODING SKILL THATS HOW THE PRIORITIES WORK FOR US WHY DON'T YOU GET IT IT MAKES SO MUCH SENSE BRAH4 -
One of the things that I like the most regarding Clojure (and most Lisps, to be honest) is how "not for beginners" the ecosystem feels.
Don't get me wrong, setting up a project in lein with dependencies (both internal and external) is a cakewalk, installing lein or boot is a cakewalk. Setting environment consts and middleware etc etc is a cakewalk.
It's just that there are no blogs about convoluted and amateurish ways of doing things. Most presentations and articles are written by really experienced and talented individuals.
I dunno, it's just a nice shift in community. It's nice to see people not fucking up object-oriented programming in Java or any of the other OOP languages. It's nice not seeing people give horrible advice regarding memory management in C or C++, and it is sure as shit nice to not see spaghetti PHP and JS code.
And my productivity levels are off the charts man. Really liking this shit and I get to stay inside my JVM -
Hey devRant fam! I hope everyone is doing very well today! :D So recently I have had this thought in my mind and I'm not so sure what to think.... I've been coding in C# for a while now and I absolutely love it!
Though I have no job experience yet, and I truly cannot wait till I get into an internship and hopefully land a full-time position!, my memory isn't the best in terms of anything. I generally have to (not all the time) look up documentation on Microsoft's website for C#, try to read and understand code examples, etc. Would you feel that's not a good sign, or..... I'm curious to know what you guys think! Just so you know, I never copy/paste any code! I try to do everything myself :-)
Again, thank you very much for reading this! And I do apologise if it is too long! I hope you guys/gals are having a wonderful day/night wherever you may be! <3
Best
Milo8 -
Today's accomplishments:
- Actually got the fuck out of bed this morning
- Fixed the RCA connector on the CRT I got from a friend (I got scared while discharging it but it turned out fine). Basically the metal piece that carries the signal through the connector was bent to hell and sticking out, so I desoldered it, bent it right again, put it in, and resoldered it.
- Went to taco bell twice within 8 hours
- Sat and talked with a couple friends for like 2 hours after school
- Met and briefly talked to a very cute girl that my friend introduced me to. She has colored hair (I REALLY like colored hair) and she vapes. So perfect girl for me.
- FINALLY FUCKING STARTED LAUNDRY
Things I didn't accomplish today:
- Working on the web page I posted about this morning
- Getting to school on time (ONE DAY I WILL)
- Staying in school once I was actually there (left during my 6th period to go to taco bell the second time, first time today was in the morning after I was already late to school cause they won't let me into class if I'm late)
- Fixing the boot errors on my laptop (sometimes when I boot it fucking freezes after flushing the journal, I've been trying to figure it out for a while but I have no fucking clue)
- Figuring out why my PS2 doesn't want to recognize controllers or memory cards (got a new motherboard and now it just isn't recognizing the controller/memory card, I feel like some of the traces broke at some point while it was apart??)1 -
1) Learning little to nothing useful in formal post-secondary and wasting tons of time and money just to have pain and suffering.
"Let's talk about hardware disc sector divisions in the database course, rather than what most of you might find useful for industry."
"Lemme grade based on regurgitating my exact definitions of things, later I'll talk about historical failed network protocols, that have little to no relevance/importance because they fucking lost and we don't use them. Practical networking information? Nah."
"Back in the day we used to put a cup of water on top of our desktops, and if it started to shake a lot that's how you'd know your operating system was working real hard and 'thrashing' "
"Is like differentiation but is like cat looking at crystal ball"
"Not all husbands beat their wives, but statistically...." (this one was confusing and awkward to the point that the memory is mostly dropped)
Streams & lambdas in Java were a few slides in a PowerPoint & not really tested. Turns out industry loves 'em.
2) Landed my first student job and got shoved onto an old legacy project nobody wanted to touch. I was isolated and not being taught or helped much, and did poorly. The boss got pissed at me and was unpleasant to work with and get help from. It got to the point where I started to wonder if he was trying to create a show of how much of a nuisance I was. He meddled with some logo I was fixing, getting fussy about individual pixels and shades, and made a big deal of knowing how to use GIMP and of sitting with me micromanaging. Monthly one-on-ones were uncomfortable and had him metaphorically jerking off about his life story career-wise.
But I think I learned in code monkey industry, you gotta be capable of learning and making things happen with effectively no help at all. It's hard as fuck though.
3) Every time I meet an asshole who knows more and has accomplished more than I have (that's a lot of people) with higher TC than me (also a lot of people), I despair as I realize I might sound like that without realizing it.
4) Every time I encounter one of my glaring knowledge gaps, I'm ashamed of the fact I have plenty of them. Cargo-cult programming.
5) I can't do leetcode hards. Sometimes I suck at whiteboard questions unlike anything I've seen before.
6) I also suck at some of the trivia questions in interviews. (Gosh I think I'd look that up in a search engine)
7) Mentorship is nigh non-existent. Gosh I'd love to be taught stuff so I'd know how to make technical design/architecture decisions and knowing tradeoffs between tech stack. So I can go beyond being a codemonkey.
8) Gave up and took an OK job outside of America rather than continuing to grind and then trying to interview into a high-tier American company. Doubtful I'd ever manage to break in now; the TC would be sweet, but I'm unsure if the rest would work out.
9) Assholes and trolls on Stack Overflow. It feels quite hard to ask questions sometimes — they get closed, marked as dupes, or downvoted without explanation.3
I thought the iPhone simulator was the ONE THING that worked smoothly, but NO, it is also a load of crap, hogging the memory. Can't develop ANYTHING without the slow performance spoiling the mood. It's the same with Android Studio, Xcode, Android emulators, and now the simulator with VS Code is doing the same thing. It's 2020; you'd think a developer could write code smoothly on a huge MacBook Pro, but no, what a fucked-up world. I'm hungry again, I have eaten up everything, what to do, I hate fruits !!7
-
Buying a thunder purple 1+6T on Thursday... It will have more memory and more storage than my daily use computer (a Chromebook). Going to install UserLAnd, get a folding Bluetooth keyboard and a stand.
Laptop replacement. No, seriously.6 -
Friday 13th. Superstition.
0655, got WFH laptop going. 0700, VPN'ed in. Bluescreen, first in ages. Yes, Windows, the hatred is mutual. Rebooted. Windows claimed memory fault, offered check, 40 minutes. Noped out. Started machine. VPN'ed in. Some strange script error that I'd never seen before. Rebooted. Script error again. Shut down machine, then rebooted, same problem. 0715, fuck, still wearing sweaters, my e-scooter not charged, and an important Teams call at 0800.
Got dressed, stuffed the laptop into my backpack, hurried off on foot. Took the bus. Fuck, the connection at the transfer station had just left. Took a taxi to make it. Arrived at the company, plugged in the laptop, started with no issues. Had the important call.
Took the laptop to IT. Tested it with external network connection and VPN. Worked with no script error. Had it checked for RAM issues. No issue. WTF had happened in the morning?!6 -
I need guidance about my current situation.
I am a perfectionist: I believe in OOP, preventing memory leaks in advance, following clean code and best practices, and constantly learning about new libraries to reduce custom implementation & improve efficiency.
So even a single bad variable name can trigger my nerves.
I am currently working at a half-billion-$ IT service company on a maintenance project: an 8-year-old Android app, part of a security-domain product of one of the top enterprise companies of the world, which sold it to many leading companies worldwide in the government, banking and insurance sectors.
Its code quality is so bad that I get panic attacks & nightmares daily.
The issues are like:
- No APK obfuscation; the source is an open book — anybody can just unzip the APK & open it in Android Studio to see the source.
- Logs everywhere about every method name invoked,
- Static IV & salt for encryption (see the sketch right after this list).
- Thousands of lines of code in God classes.
- Method names irrelevant to their functionality.
- Even a list holding a single item takes 2-3 seconds to load.
- Lag in navigation between different features' screens.
- For even a single thing like different dimension values for different densities, whole separate 100+ line layout files are written for 6 density types.
- No modularized packages; every class is in a single package & there are around 100+ classes.
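On the static IV point, the fix is always the same: generate a fresh random IV per encryption and ship it alongside the ciphertext (an IV isn't secret, it just must never repeat). The app in question is Java/Android, but a minimal sketch of the idea in C with OpenSSL's libcrypto (names here are illustrative, not from the actual app):

#include <openssl/rand.h>
#include <stdio.h>

int main(void)
{
    unsigned char iv[16];  /* AES block size; one fresh IV per message */

    if (RAND_bytes(iv, sizeof iv) != 1) {  /* CSPRNG, not rand() */
        fprintf(stderr, "CSPRNG failure\n");
        return 1;
    }

    /* Prepend this to the ciphertext instead of hardcoding one;
     * a fixed IV leaks patterns (CBC) or reuses keystream (CTR/GCM). */
    for (int i = 0; i < (int)sizeof iv; i++)
        printf("%02x", iv[i]);
    printf("\n");
    return 0;
}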
The owner of the code, my team lead, is too terrified to change even a single thing, as he doesn't have coding maturity & no understanding of memory leaks, clean code or OOP — in short, the typical IT 'service' company mentality.
The client is ill-informed or cost-cutting-centric, so they have done no code review in 8 years.
I'm feeling very frustrated, as I can see it's like a bomb waiting to blast anytime some blackhat cracker takes advantage of this.
Need suggestions about this to tackle the situation.10 -
I've implemented an in-memory caching system for database queries with Redis in one of the blogs I manage.
Will it work well? Or do you think it will produce issues? I have no experience with Redis yet.14 -
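(That's the standard cache-aside pattern, so the usual shape applies: keyed GET, fall back to the DB on a miss, write back with a TTL so stale entries age out, and Redis being down just degrades to DB-only. A sketch in C with hiredis purely for illustration — the key name and the query function are placeholders:)

#include <hiredis/hiredis.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static char *query_database(const char *key)
{
    return strdup("...expensive DB result...");  /* placeholder */
}

static char *cached_query(redisContext *rc, const char *key)
{
    redisReply *reply = redisCommand(rc, "GET %s", key);
    if (reply && reply->type == REDIS_REPLY_STRING) {  /* cache hit */
        char *hit = strdup(reply->str);
        freeReplyObject(reply);
        return hit;
    }
    if (reply)
        freeReplyObject(reply);

    char *fresh = query_database(key);                 /* cache miss */
    reply = redisCommand(rc, "SETEX %s 300 %s", key, fresh);  /* 5 min TTL */
    if (reply)
        freeReplyObject(reply);
    return fresh;
}

int main(void)
{
    redisContext *rc = redisConnect("127.0.0.1", 6379);
    if (!rc || rc->err) {
        fprintf(stderr, "redis down: fall back to the DB directly\n");
        return 1;
    }
    char *result = cached_query(rc, "post:42");
    printf("%s\n", result);
    free(result);
    redisFree(rc);
    return 0;
}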
Not really a rant but..... Just found this while cleaning my PSP's Memory Stick collection
As a CS student this part makes me feel gud
(It's No Game No Life btw)2 -
This little game took me like 2h of development, it's build without any framework whatsoever.
It is based on my memory of a very old game my brothers used to play on DOS, it was used to teach how to type superfast
Little details on how this works: the inputs at the bottom are programmed to be used with keys (only letters), ENTER and TAB, no need to use mouse in this game to move around, just hit tab to move to next, hit enter to confirm what you typed.
I know I should upgrade this to use a list of actual words instead of just random letters, but never wanted to actually work on it again.
http://examcopy.altervista.org/apps...
I highly recommend trying it on a PC, also contains Ads, not invasive, tho
Other games I developed:
http://stefagna.altervista.org/swis...
http://examcopy.altervista.org/apps...
Note: PLEASE, DON'T GO TO THE HOMEPAGE OF THESE WEBSITES, they're kind of NSFW4 -
Sharing a first look at a prototype Web Components library I am working on for "fun"
TL;DR: the left side is a pivot (grouped) table, the right side is the declarative code for it (everything except the custom formatting is done declaratively, but there is the option to be imperative as well).
====
TL;DR (Too long, did read):
I'm challenging myself to be creative with the cool new things that browsers offer us. Lani so far has a focus on extreme extensibility, abstraction from dependencies, and optional declarative style.
It's also going to be a micro CSS framework, but that's taking the back-seat.
I wanted to highlight my design here with this table, and the code that is written to produce this result.
First, you can see that the <lani-table> element is reading template, data, and layout information from its child elements. Besides the custom highlighting code (Yellow background in the "Tags" column, and green gradient in the "Score" column), everything can be done without opening even a single script tag.
The <lani-data-source> element is rather special. It's an abstraction of any data source, and you, as a developer, can add custom data sources and hook up the handlers at your whim (the element itself uses the "type" attribute to choose a handler. In this case, the handler is "download", which simply sends a fetch request to the server once and downloads the result to memory).
Templates are stored in an HTML file, not string literals (which I think really fuck up the code), and loaded async, then cached into an object (so that the network tab doesn't get crowded, even if we can count on the HTTP cache). This also has the benefit of allowing me to parse the HTML templates once and then cache the parsed result in memory, so templates are never re-parsed from strings no matter how many custom elements are created.
Everything is "compiled" into a single, minified .js file that you include on your page.
I know it's nothing extraordinary, but for something that doesn't need to be compiled, transpiled, packaged, shipped, and kissed goodnight, I think it's a really nice design and I hope to continue work on it and improve it over time1 -
LXC, no doubt.
I mean to be fair, LXC is an amazing container runtime once you manage to set it up. But setting it up is the hard bit. Starting off with LXC 2.x, it was a nightmare to find out how to get things like the storage backends working. But with ZFS it ended up being alright. Find some arcane values to stick in the /etc/lxc/default.conf to use ZFS as the backend and then the default storage location on those ZFS pools (I'll get back to that later), and it worked alright. Again, once it works it's great, but setting it up and finding the right configuration keys is absolute hell.
So, LXC 2.x for a while and a few months ago I finally ended up upgrading to 3.x. Every single configuration key changed. Every single one of them, and that's why I had to 1) learn LXC all over again, and 2) redeploy each and every one of my containers. That process is still not entirely completed. ZFS backend was once again a dive into arcane configuration keys found on forums and whatnot. Yeah.. official documentation has none of it. Oh and in 3.x you now also have to dodge the torrent of "just use LXD m8" messages. Yeah, very helpful when LXD is also the ONLY way to reasonably configure it. Absolutely beautiful. Oh and as far as the ZFS default storage location goes (such as ssd/lxc/ct)? Yeah forget about it. There's no configuration option for it anymore, and the default is "lxc". In ZFS lingo that means that LXC has the audacity to demand a whole pool for itself. No. No you don't deserve a whole pool for yourself. But hey at least you can define the storage location to use in the lxc-create command! Every single time you have to define it in lxc-create. I abstracted it away into my own LXC interface, so no big deal really. But yeah... That could absolutely be better. And in 2.x it was actually better.
Oh and btrfs, the filesystem I'd like to use on low memory systems because ZFS' ARC is too much on such systems? Yeah forget about it. I still have no idea how to do it. Thank you LXC and its amazing documentation!
And if you want the icing on the cake for LXC's terrible documentation, see their repo's index page at https://github.com/lxc/lxc/.... Yeah, it's totally still at 2.x... That's how well they maintain that. Even Debian has 3.x now. And if you look at the branches, you'll find that even 4.x is already available and considered stable. -
I have an internal perception of myself. It isn't an image like a memory is, and it's not a description such as a sentence, but it's purely a feeling. I feel it in the core of my soul, not my body. And when I listen to Minecraft volume Alpha, it transforms my internal perception for the duration of album and the feeling lingers afterwards.
By now I must have a year of in game time, and hearing those sounds and seeing the old textures brings me back to the days of middle school playing Minecraft Pocket Edition Lite on my first phone.
I wasn't happier back then. I'm just as happy today as I was back then. But restoring my inner self to that time, just briefly, is wonderful.
I'm thankful to Minecraft for being a great game. It has seen many changes in its public perception. In the beginning, it was for all ages. Deadmau5 played it, notch developed it. It was a different beast. Then, without the content of the game changing at all, it became a child's game. Then it became a child's game that PewDiePie played and it was acceptable to play without any shame again. And now, once again, it is on a downward slope to being a child's game.
No matter what the shifting sands of public view on the game are, I will always hold this game close to my heart and I will continue to play it whether it's socially acceptable or not. If for nothing else than to remind my soul of a simpler time. -
At first, you're just a baby who cries and poops.
You outgrow the baby clothes, the crib and the stroller.
Then, you're just a child who plays, runs around and starts school.
You grow tired of your toys and are no longer allowed in the ballpit.
Then, you're just a teenager who curses, sulks and defies your parents.
You grow tired of teen music, stow your stuff away and move out.
Then, you're just a student who finally gets to drive a car and vote, but has no money.
You get a job, a place of your own, start dating and fall in love.
Then you're just a noob at everything you do; new at work, newly in love; feeling your way through life.
You have children and no longer have time to spare for anything else.
Then, you're just a parent taking parental leaves, attend parent-teacher meetings and neglect your friends.
You're no longer welcome in the children's games, or even to talk to them.
Then, you're just an "old fart" or "bitch" who's only good when you give them dough.
You help the children move out, you retire and have grandchildren.
Then, you're just a senior citizen who talks about nothing but your grandchildren and go window shopping outside the pharmacy.
Your hearing and vision get impaired, you get ailments and lose your memory as well as your intellect.
Then, you're just dead.
So, at what stage of life are you really somebody? -
So, funny story with a bit of self promotion at the end.
I was recently checking out some apps on the playstore and found that my first ever, "launched just to experiment" app (released 1.5 years ago) had received more than 5k downloads. I was very happy about that so posted a small message on LinkedIn.
Now, my LinkedIn profile consists of 98% people who are total strangers and never met me (is it just me or do you also get a lot of stranger connect requests there?). So my usual post rarely ever goes beyond 5 or 6 likes.
But idk how, there too my post got 35+ likes and now i was on cloud 9.
So i finally decided to kick my ass and release some update to that app (it had around 70% pity comments like "nice first app, but it should have this x feature", "overall nice but it could use an x feature" etc.)
And boy, what a journey it was in the last 72 hours.
Firstly my madhead laptop started killing me with the battery failures and constant hangs.
Then my past asshole self tried to give me a middle finger. So i have this whole partition in my memory where i keep my Android stuff and apps. It has a special folder named published zone and i keep all my published app codes and related files there.
I was fairly certain that this app's code will also be there, so i opened it, found the code and tried running it.
Turns out my asshole self had tried to mess around with the code so much that all the db layer WAS fucked up, all the ui WAS changed and no code was working.
"Not to worry", i thought. I always use git and there would be a correct version some commits before. WRONG. I HAD CHANGED THE WHOLE FUCKING WORKING PRODUCTION CODE AND DIDN'T MAINTAIN A VCS!
Also this was the verbose and shitty java code my self from 1.5 years ago so loved to write, so it was taking me way more time to figure out what's happening in an already fucked up code.
So i tried a couple of ways to get back my working code:
- I tried looking for a google recommended solution. Those guys take my whole app code, build and distribute it via the playstore, but they provide no means to retrieve back the original code.
- i checked my (occasional) backup hard disk but no. My hard disk would have 100s of movies from 2016, but not a useful piece of fuckin code.
- i also tried to get my apk and decompile it via some online decompiler. Here google again fucks up and doesn't allow me to get my apk directly. Meanwhile i found a ton of shady websites which are hosting an apk of my app without my knowledge O_o . I tried to decompile one of them but the code was even more non-understandable than my fucked up code.
So i ended up looking at both the messed up code and the decompiled code and coded the whole app from scratch (well not scratch, i extracted the resources and some undamaged activities from the messed up code. Also github was down for more than 3 hours yesterday, at the same time when i was trying to look at some repositories)
Lessons learned:
- DON'T FUCK UP WITH THE PRODUCTION CODE
- MAINTAIN VCS
- Your laptop is shit at reliability, github is also shit at reliability, so save code in multiple places.
- there are way more copies of your code lying on the internet than you think.
Check out my app here: https://play.google.com/store/apps/... -
I can work with Angular, even though it's a pain in the butt.
My current Angular job is actually the job with the first manager that had decent human values and ethics, I like my team, and yeah, what we building is shit. But it's only 30% shit because of Angular, another 30% are due to SAFe, and the rest is the usual stuff.
Still enjoy my job and respect my team.
But please do not expect me to pretend Angular is on a comparable level to React. Angular hasn't brought any actual innovation in most major versions but releases those breaking major updates still at least twice a year.
Ivy might be awesome, but just because Angular told the world 3 years ago to also have Ivy compatible compile targets for their libs/packages doesn't mean everybody cared.
And the ngcc, the awesome compatibility compiler, mutates node modules in place. So no parallel stuff, no using yarn2 or pnpm.
At the same time, React brought so many innovations into the frontend world but is basically backwards compatible.
Not sure how the Angular partial compilation and whatever needs to go on works, but it seems like there's hardly anyone that really knows, so you can't use Vite or whatever other new tool.
And sure, if you're really good, you can write Angular without producing memory leaks.
But it's really hard. Do you know what's also quite hard: Producing memory leaks with React!
And sure, Angular Universal, which it feels like isn't used by anyone, will still be on a comparable level to an open source product that's used all over the world, forms the basis of an open source company, and is improved by thousands of issues day by day.
And sure, two kinds of change detection are a great idea. And yeah, pretending Angular comes with all included makes it worth it that the API is fucking huge, and you're better off knowing nothing — because then you have to read things up — than knowing quite a lot, since making assumptions and believing APIs work in a similar way and follow similar conventions...
Whatever... I work with it. Like the team. Like the company, even my boss. But please don't expect me to lie to you that this was a good idea, or that Angular is even remotely on the same level as React. -
So I'm on my morning stroll. Walking, enjoying, watching the world around me.. It's nice how cherries blossom. They smell very tempting to stop there and enjoy the moment. Some flowers under the cherry...
Why do plants blossom again? Oh yeah, that's right, to exchange some specimens in order to grow fruit and seeds. To have their offspring. Just like every other living macroorganism [with a few exceptions ofc]. Life has no other way to survive but to exchange genetic material between two parties and only then trigger growth of the new life.
And that is a very strict rule. No more, no less: it takes exactly 2 organisms to make new life. But why is that? If my memory serves, theory of evolution says that life is like business: cut the losses and let the profits run. Over time it discards everything not required for the organism in order to save energy, and only successful new "investments" remain in the genome. The unsuccessful ones die before they proliferate, so the bad genes shall not survive.
It also says that very simple things, very simple changes lead to very complex outcomes. Us. Life.
But what is simple about life having to need 2 other lives? Exactly 2. It's either simple or efficient, depends on perspective. BUT IT IS NOT BOTH. Look at cells. They just split in half and multiply. Dead simple. It takes one of them to make another one. But with mammals, birds, reptiles, plants and other macroorganisms [excpt fungi] this is not the case! Why?!? I can't think of any scenario where two generic microorganisms, following some dead simple mutations, would come up w/ something that inefficient and overly complex. Like they're living on their own, multiplying by division, and smth very simple happens and they can no longer divide, only mate in pairs. The primitive, efficient and simple mechanism gets terminated and replaced with a different one, incredibly complex one!
Sure, we have protozoa which have similar reproductive mechanisms. They exchange genetic material to multiply.
But look at our human cells. They don't need that! Look at some reptiles, some plants that only take one to make another. They don't pair as well! It's simple. Efficient. Why do protozoa need 2 for the species to survive?
It's not simple and efficient [tho it helps us adapt, but that's not my point for now]. See, things like this make me wonder. What if we, the life, are not as accidental as we think? What if this whole mechanism was set off by someone or something billions of years ago? That'd mean there are much older, much more superior cognitive organisms than us. What if protozoa were version 3 of new life [the first two did not survive]? Viruses - v4? Sea creatures - v5, reptiles - v6, and so on until they came up with us, mammals? That'd surely mean we are not alone in this universe. Are they watching us? Will they create a new species any time soon? What's our purpose, are we just an experiment?
And so, from cherry blossoms to existential dilemma, my stroll is over. Time for breakfast :) -
A long time ago you sent me an email with the subject 'I love you', I then got so excited that I forwarded the letter to all my contacts, and they forwarded it too.. I can't describe the words for the feelings I had back then for you. I fell in love with you, really. But there were always troubling moments for me.
For example when 'Code Red' showed up and found your backdoor. Man I was pissed at that time. I didn't know what to do next. But things settled, and we found each other again.
And then that other time when this girl named 'Melissa' was sending me some passwords to pr0n sites, I couldn't resist. She was really awesome, but you know, deep in my heart that was not what I wanted. I somehow managed to go back to you and say sorry. We even moved together in our first flat, and later in our own house. That was a really good time, I love to think back at those moments.
Then my friend 'Sasser' came over to us one night, do you remember how he claimed that big shelf in our living room, and overflooded it with his own stuff, so that we had no clue what we were even reading off the shelf anymore? Wow that was a disturbing experience.
But a really hard time came when our dog 'Zeus' got kicked by this ugly trojan horse. I really don't want to go into details about what the mess looked like after we discovered him on our floor. Still, I am very sorry for him that he didn't survive it :(
Some months later this guy named 'Conficker' showed up one day. I shitted my pants when I discovered that he guessed my password on my computer and got access to all my private stuff on it. He even tried to find some network shares of ours with our photos on them. God, I was happy that he didn't get access to the pics we stored there. Never thought that our homemade photos were not secure there.
We lived our lives together, we were happy until that day when you started the war. 'Stuxnet..'! you cried directly in my face, 'you are gonna blow up the centrifuges of our life', and yeah you were right. I was in a real bad mood those days back then. I didn't even try to hide my anger. But really, I don't know why all this could happen. All I know is that it started with that cool USB stick I found on the stairs of our house. After that I don't remember anything, as it is just erased from my memory.
The years were passing. And I tell the truth here, we were not able to manage the mess of our relationship. But I still loved you when you told me that you would leave. My 'Heartbleed' started immediately, you stabbed it where it causes the most pain, where I thought that my keys to your heart were secured. But no, you stabbed even harder.
Because not long after that you even encrypted our private photos on our NAS, and now I am really finished, no memory which can be refreshed with a look at our pictures, and you even want my money. I really 'WannaCry' now... -
!rant
Went from uni to my car to drive back home. The engine doesn't start, and a low oil level warning is showing up. Hmmm. I opened the hood and checked the oil level. It was empty. First thought: I drove here with no oil, so I broke the engine. Great... I bought some oil and refilled it. Still the same problem. I've called my insurance company and my mechanic. And then. A brilliant thought evolved. Did I turn off the ignition on the secret switch today? Yeah, that was it. Had to call everybody again and cancel my AC request. Gosh, I hate having the memory of a goldfish...
Also. Hi everybody. my first !rant. -
9000 internet cookie points to whoever figures out this shit:
I'm trying to import a secret gpg key into my keyring.
If I run "gpg2 --import secring.gpg" and manually type each possible password that I can think of, the import fails. So far, nothing unusual.
HOWEVER
If I type the same passwords into a file and run:
echo pwfile.txt | gpg2 --batch --import secring.gpg
IT ACTUALLY FUCKING WORKS
What the fuck??? How can it be that whenever I type the pw manually it fails, but when I import it from a file it works??
And no, it's not typos: I could type those passwords blindfolded from muscle memory alone, and still get them right 99% of the time. And I'm definitely not blindfolded right now.
BUT WAIT, THERE'S MORE!!
Suppose my pwfile.txt looks something like this:
password1
password2
password3
password4
password5
password6
Now, I'm trying to narrow it down and figure out which one is the right password, so I'm gonna split the file in two parts and see which one succeeds. Easy, right?
$ cat pw1.txt
password1
password2
password3
$ cat pw2.txt
password4
password5
password6
$ echo pw1.txt | gpg2 --batch --import secring.gpg
gpg: key 149C7ED3: secret key imported
$ gpg2 --delete-secret-key "149C7ED3"
[confirm deletion]
$ echo pw2.txt | gpg2 --batch --import secring.gpg
gpg: key 149C7ED3: secret key imported
In other words, both files successfully managed to import the secret key, but there are no passwords in common between the two!!
Am I going retarded, or is there something really wrong here? WTF! -
Compilers should just work for raw C with only static memory allocation. This isn't the bad old days where a couple of dudes wrote a short book explaining how C might probably should possibly work. I hear supposedly we have standards now.
Well, last week I lost 2 days to our compiler randomly forgetting that it wasn't okay to put a globally allocated uint32 at an address ending in 9. What? It had been handling this case without issue for more than a year, but now after changing completely unrelated code we have this problem.
I'm not sure how to even deal with this idiocy so no doubt I'll continue working on it this week, too.
Thanks a lot, GCC.
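If the cause really is symbol placement, a hedged workaround (assuming C11 and a toolchain that honors it) is to demand the alignment explicitly instead of trusting the linker:

#include <stdint.h>
/* force 4-byte alignment so the global can never land on an address ending in 9 */
static _Alignas(4) uint32_t counter;
/* pre-C11 / GCC-specific spelling of the same thing: */
static uint32_t counter2 __attribute__((aligned(4)));
-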
So I recently finished a rewrite of a website that processes donations for nonprofits. Once it was complete, I would migrate all the data from the old system to the new system. This involved iterating through every transaction in the database and making a cURL request to the new system's API. A rough calculation yielded 16 hours of migration time.
The first hour or two of the migration (where it was creating users) was fine, no issues. But once it got to the transaction part, the API server would start using more and more RAM. Eventually (30 minutes), it would start doing OOMs and such. For a while, I just assumed the issue was a lack of RAM so I upgraded the server to 16 GB of RAM.
Running the script again, it would approach the 7 GiB mark and be maxing out all 8 CPUs. At this point, I assumed there was a memory leak somewhere and the garbage collector was doing it's best to free up anything it could find. I scanned my code time and time again, but there was no place I was storing any strong references to anything!
At this point, I just sort of gave up. Every 30 minutes, I would restart the server to fix the RAM and CPU issue. And all was fine. But then there was this one time where I tried to kill it, but I got the error: "fork failed: resource temporarily unavailable". Up until this point, I believed this was simply a lack of memory...but none of my SWAP was in use! And I had 4 GiB of cached stuff!
Now this made me really confused. So I did one search on the Internet and apparently this can be caused by many things: a lack of file descriptors or even too many threads. So I did some digging, and apparently my app was using over 31 thousand threads!!!!! WTF!
I did some more digging, and as it turns out, I never called close() on my network objects. Thus leaving ~30 new "worker" threads per iteration of the migration script. Thanks Java, if only finalize() was utilized properly. -
Our production server has a huge memory shortage so I have to jump through a lot of mod_rewrite optimising hoops to keep it running, because there's no time to configure a new server...
-
I usually like PHP, because it is easy to use, but FUCK! Can you just let me free the fucking memory by myself? Setting the variable to null doesn't work, unset doesn't work either. I am still getting the fucking memory exhausted error.
There is literally no data stored anywhere, because I unset every fucking thing.
gc_collect_cycles() doesn't work either, probably because this crap thinks there is a reference to this variable somewhere. -
The past couple of weeks I've been struggling with my laptop. It regularly ran out of memory, and when that happens everything runs at a snail's pace. I always thought 8GB would be enough for developing software, but I was terribly wrong.
So I ordered another 8GB and installed it yesterday. Later at work I looked at the ram usage and noticed that it was up to nearly 13GB!
I have no idea how I managed to get by with only 8 for so long. 🤔
FYI: I usually have 2 to 3 IDEs and a gazillion chrome tabs open 😅 -
Production goes down because there's a memory leak due to scale.
When you say it in one sentence, it sounds too easy. Being developers we know how it all goes. It starts with an alert ping, then one server instance goes down, then the next. First you start debugging from your code, then the application servers, then the web servers, and by that time you're already on the tips of your toes. Then you realize that the application and application servers have been gradually losing memory over a period of time. If the application is one that doesn't get redeployed very often, the complexity grows faster. No anomaly / change detection monitor can detect a gradual loss of memory over a period of months. -
ATTENTION PLEASE! Important announcement following:
Please check your interface implementations for correct byte order according to the specification BEFORE YOU START COMPLAINING ABOUT DATA FAILURES ON EXCHANGING DATA.
Freakin hell, if I got some money for every byte order mismatch on testing interfaces, I'd be a billionaire.
And why are all those highlevel I-know-every-fucking-framework developers incapable of checking the real memory content of a datatype, and the real data content on the interface, even if you tell them that their byte order is obviously wrong?
No, your system is not the centre of the universe and I don't care how you get your less-than-32bit-datatypes-are-for-assembler-usage-frameworks to change byteorder. It's not rocket science, if there's no ready-to-use-function then write those 4 lines yourself.
Next time I get to specify an interface I'll go for mixed-endian, just to make sure everybody involved knows the concepts of endianness afterwards.
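For the record, here are those 4 lines — a sketch of the swap in plain C (GCC users can also reach for the __builtin_bswap32 builtin):

#include <stdint.h>
/* swap a 32-bit value between big and little endian */
static uint32_t swap32(uint32_t v)
{
    return ((v & 0x000000FFu) << 24) |
           ((v & 0x0000FF00u) << 8) |
           ((v & 0x00FF0000u) >> 8) |
           ((v & 0xFF000000u) >> 24);
}

And checking the real memory content of a datatype is one cast away: with uint32_t x = 0x11223344; unsigned char *p = (unsigned char *)&x; p[0] reads as 0x44 on a little endian machine.
-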
First rant here...
A handful of devs have to create a huge web platform that can shovel a lot of data around, in about two months, which is impossible...
The project lead has left major decisions in the hands of interns, like which database we want to use, because no question can be answered by that person. An inexperienced intern has chosen a fucking nosql database for highly relational datasets... why? Because new tech...
Development began and a bunch of problems arose... the database was accessible from the internet from day one. Random crashes because of out of memory exceptions. Every possible feature had a description of at most 10 words... and no standards were enforced on anything.
Now that we're finaaaally switching to sql after almost a year of prototypical production, everybody keeps coding new features, so i have to port all the crap to the new database...
best part: a bunch of clients on different operating systems have to be ported as well!
Even better part: i have to do that cause everybody else has practically no experience in any field...
And now the joke: i got hired for gui/desktop application development
Am i a wizard now? -
Hey guys and gals, I built a silly little memory game! Comment with your best scores (no inspecting elements...that's cheating). Also, don't click too fast or it'll break. Lol
http://threetendesign.com/memory4 -
Actually kinda sad that there is no pure rust ui framework out there, but rather mere adaptations of c/c++ frameworks for rust. It's better than nothing for sure, it just would be nice if i could use a framework that doesn't create a massive memory leak because i looked at it funny.
In particular i'm using fltk-rs, and every time i apply a font to some widget, 500kb get added as leaked memory. Doesn't sound like a lot, but for one it's a dynamically built application, so the order and amount of widgets changes, and this application is supposed to run for days, if not weeks.
thanks to heaptrack i was able to pinpoint that to libpango, which i'm not even interacting with directly, but rather indirectly through the api.
Annoying that i chose to use a language for actively preventing leaks and dangling pointers and stuff, but i end up leaking memory because of a dependency somewhere. -
I struggled with whether to post this but I feel like I have to. I didn't want to feed into the fear or give 'them' any more reason to argue against common sense but I guess it can't be helped.
The reason I was gone for a while was because I went and got my vaccination.
In less than half an hour after getting the vaccine, I was in the ICU. The staff told me I had a stroke, possibly from clotting and inflammation. I couldn't feel my arm or anything below my shoulders. Yes really.
Apparently I "died" for a little while and when they brought me back I was in a coma for almost a week.
I'm back home now and I still don't fully understand what happened. Still have numbness, and horrible headaches, and can barely think straight sometimes, but the doctors told me that I didn't suffer any permanent brain damage according to my scans.
Also they told me I had old damage to my left and right temporal lobe, which makes sense because I have always suffered problems with short term memory and other issues.
And I'm just at a loss how this could happen. I have no serious injuries. We were told this is safe.
And this is the exact reason I didnt want to post it, because now tards will come in and be "lololol serves you right vaxxer!"
If I knew the side effects were this bad maybe I would have changed my mind but no one told me! I mean I think I still would have got it because we have to protect vulnerable people, but still.
The hospital assured me it wasn't the vaccine and it must have been an underlying condition, but I'm not so sure. I just happen to have a pre-existing problem that I don't know about, that causes a stroke and paralysis only half an hour after the shot?
And now I don't know if I'll ever be ok. And doctors warned me I may suffer more strokes, and to avoid physically demanding tasks for a while. My primary job is construction (not by choice). Now I face the prospect of not even being able to work my existing job or do the things I love, like hiking, anymore. So much of the world doesn't make any sense right now and I just don't know what to believe anymore.
Tards will probably be in shortly to suggest I check for microchips or test fucking magnets on myself.
No, just stop. -
Lately programs have been crashing a lot on my pc. I've tried different things like disabling SWAP for a sec, BIOS changes, removing firefox and using Google Chrome, trying different commands - it kept happening.
Obviously along the way I started investigating what was causing these crashes, looking through bug reports and my syslog. There was no consistency, except for 1 thing: SIGSEGV. Everything that crashed had a segmentation fault. Now I'm not an expert and I don't know what this means or how to fix it, so I went to Google to ask for answers.
Then I downloaded memtest and ran a memory test, error palooza. Then I went to Windows and ran memory check, error palooza.
This is week 3 of this high-end gaming pc which was a huge investment AND IT HAS BEEN FUCKING WITH ME BECAUSE OF BAD MEMORY HOW THE FUCK DOES THIS HAPPEN I ALMOST STARTED TO DOUBT UBUNTU BUT IT WAS A FUCKING FAULT IN BRAND NEW MEMORY MODULES WHAT THE FUCK.
Obviously I'm pissed off. Today I'm gonna call the store that assembled it to voice my complaints.
Thank you for listening to my TedTalk. -
So, in opengl 4.x, there is no circle primitive, and the only ways to draw an almost perfect circle are the following:
Draw a triangle fan and fk up your memory for a circle
Draw a rectangle and use the fragment shader and distance equation to discard the bit that is not used
But you will need to add an if statement and potentially increase the frame time (from what i have heard)
And it will be more complicated than just using a triangle fan.
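For option one, a minimal sketch of the memory it costs — filling a triangle-fan vertex buffer in C (the segment count and buffer layout are my assumptions):

#include <math.h>
/* verts must hold 2*(segments+2) floats: the fan center plus segments+1 rim points */
static void circle_fan(float *verts, int segments, float cx, float cy, float r)
{
    verts[0] = cx;
    verts[1] = cy; /* fan center */
    for (int i = 0; i <= segments; ++i) {
        float a = 2.0f * 3.14159265f * (float)i / (float)segments;
        verts[2 + 2 * i] = cx + r * cosf(a);
        verts[3 + 2 * i] = cy + r * sinf(a);
    }
}
/* then draw with glDrawArrays(GL_TRIANGLE_FAN, 0, segments + 2) */
-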
It started when i was about 10 years old.
My uncle showed me how to display something in dos-prompt using the echo command in a custom batch-file.
A few commands later, i was able to "program" a flip-book of an ascii ski-driver. Each ascii picture was separated by pressing any key and cls ^^
Aaaaah. Sweet childhood memories!
Later on i used a programming-language for beginners in windows.
This language gave you control of a triangle called "turtle".
My first high-level programming language was Delphi.
Since i had no idea about databases, i created a pseudo database of magic the gathering play-cards. Each card had its very own windows form filled up completely with an uncompressed image object displaying the chosen card modally. *sigh*
I scanned each card by using a feed scanner.
Finally, my application consisted of 200 card images and forced my PC to swap the required memory from my hard disk.
Boy o boy. I was such a noob! ^^
Over the years i discovered and fell in love with a lot of languages (jsp, java(script), c#, php, ...) and concepts (mvvm, mvc, clean-architecture, tdd, ...)! ;) -
Fucking elastic appsearch, too many requests per second and it dies. It doesn't go slow first, it just dies. No warning, just a timeout that lasts until a manual restart in the elastic cloud console. Besides apparently being the shittiest product in the elastic stack, it's also the worst documented. And yes, I just scaled it up, but not being able to handle indexing 100 documents per second with 8 vcpu and 8 gb memory is a shame.
-
So to give you a feel for what evil, clusterfuck code it was in: this project's largest part was coded by a maniac, witty physicist confined in the factory for a month; intended as a 'provisional' solution, of course it ran for years. The style was like C with a bit of classes.. and a big chunk of shared memory as a global mud of storage, communication and catastrophe. Optimistic or no locking of the memory between process barriers, arrays with self-implemented boundary checks that would give you the zeroth element on failure and write an error log, of which there were often dozens in the log. But if that sounds terrifying already, it is only baseline uneasiness which was largely surpassed by the sheer mass of code, special units, undocumented madness. And I had like three months to write a simulator of the physical factory and sensors to feed that behemoth with the 'right' inputs. Still I don't know how I stood it through, but I resigned little time afterwards.
Well, lastly to the bug: there was some central map in that shared memory that held something like a view of the central customer data. And somehow - maybe not that surprisingly given the surrounding codebase - it sometimes got corrupted. Once in a month or two times a day. Tried to put in logging, more checks - but never really could pinpoint the problem... Till today I still get the haunting feeling of a lurking memory corruption beneath my feet, if I get closer to the metal core of pure C. -
Having my first memory leak problem ever. This sucks. I've tried what seems like everything. Forcing garbage collection every time I press a key to try and debug the issue. Fuck. I have 'using' blocks everywhere, and I have no idea what I'm doing wrong.
-
Can anyone help me with this theory about microprocessors, cpus and computers in general?
(I used to love programming during school days when it was just basic searching/sorting and oop. Even in college, when it advanced to language details, compilers and data structures, i was fine. But subjects like coa and microprocessors, which kind of explain the working of the hardware behind the brain that is a computer, are so difficult to understand for me 😭😭😭)
How does a computer work? All i knew was that when a bulb gets connected to a battery via wires, some metal inside it starts glowing and we see light. No magic involved till now.
Then came the von Neumann architecture, which says a computer consists of 4 things: i/o devices, system bus, memory and cpu. I/O and memory interact with the system bus, which is controlled by the cpu. Thus the cpu controls everything and that's how a computer works.
Wait, what?
Let's take an easy example of a calc. i pressed 1+2= on the keyboard, it showed me '1+2=' and then '3'. How the hell did that happen?
Then some video told me this: every key on your keyboard is connected to a multiplexer which gives a special "code" to the processor regarding the key press.
The "control unit" of the cpu commands the ram to store every character until '=' is pressed (which is a kind of interrupt telling the cpu to start processing). RAM is simply a bunch of storage circuits (which can store some 1s) along with another bunch of circuits which can retrieve this data.
Up till now, the control unit knows that memory has (for eg):
Value 1 stored as 0001 at some address 34A
Value + stored as 11001101 at some address 34B
Value 2 stored as 0010 at some address 23B
On receiving the code for the '=' press, the "control unit" commands the "alu" unit of the cpu to fetch data from memory, understand it and calculate the result (i.e. the "fetch, decode and execute" cycle)
The alu fetches the "codes" from the memory, which translate to ADD 34A,23B, i.e. add the data stored at addresses 34A, 23B. The alu retrieves the values present at the given addresses, passes them through its adder circuit and puts the result at some new address 21H.
The control unit then fetches this result from the new address and, via system buses, sends this new value to the display's memory mapped at some memory port 4044.
The display picks it up and instantly shows it.
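To make that fetch-decode-execute loop concrete before the questions, here's a toy sketch in C — nothing like a real 8085, the opcodes and encoding are made up purely for illustration:

#include <stdio.h>
#include <stdint.h>

enum { OP_HALT = 0, OP_ADD = 1 }; /* made-up opcodes */

int main(void)
{
    /* "RAM": a tiny program followed by its data */
    uint8_t mem[8] = { OP_ADD, 4, 5, OP_HALT, 1, 2, 0, 0 }; /* ADD mem[4],mem[5] */
    uint8_t pc = 0, acc = 0, running = 1;

    while (running) {
        uint8_t op = mem[pc++]; /* fetch */
        switch (op) { /* decode */
        case OP_ADD: { /* execute */
            uint8_t a = mem[pc++], b = mem[pc++];
            acc = mem[a] + mem[b];
            break;
        }
        case OP_HALT:
            running = 0;
            break;
        }
    }
    printf("result: %d\n", acc); /* prints 3, like the calculator */
    return 0;
}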
My problems:
1. Is this all correct? Is this really all that happens?
2. Please expand this more.
How do this system bus, alu and cpu work?
What are the registers, accumulators, flip flops in the memory?
What are the machine cycles?
What are instruction cycles, opcodes, instruction codes?
Where does assembly language come in?
How does the cpu manipulate memory?
This data bus, control bus - what are they?
I have come across so many weird words i don't understand: dma, interrupts, memory mapped i/o devices, etc. Somebody please explain.
Ps: am learning about the fucking 8085 microprocessor in class and i can't even relate it to basic computer architecture. I had flunked the coa paper, which i now realise why, coz it's so confusing. :'''( -
It's 2022 and people still believe USB sticks and external card readers are a replacement for memory card slots.
They're not. SD cards have a standardized form factor and do not protrude from memory card slots, but external card readers and USB sticks do.
Just like smartphones, laptops are increasingly ditching the SD card slot or replacing it with microSD, which has less capacity, lower life expectancy and data retention span due to smaller memory transistors, worse handling, and no write-protection switch.
Not only should full-sized SD cards be brought back to laptops, but also brought to smartphones. There might soon be 2 TB SD cards, meaning not one second of worrying about running out of space for years. That would be wonderful. -
C is love, C is life.
Great language.
I genuinely don't get why so many people are struggling with pointers, considering it's a pretty straightforward concept. I understand that they can be complex in their simplicity, but the concept itself is much easier to understand than, say, references in OOP languages (despite being the same thing under the hood).
I mean it's just a number like any other number, except that number is treated as a memory address, and the star (* - the dereference operator) just takes a value, goes to the memory address that the value names, and reads the value stored there.
I feel like most explanations and tutorials just try to overcomplicate it for no reason.
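The whole concept in a few lines of C, for anyone still struggling:

#include <stdio.h>
int main(void)
{
    int x = 42;
    int *p = &x; /* p holds the address of x -- just a number */
    printf("%p\n", (void *)p); /* the number itself */
    printf("%d\n", *p); /* go to that address and read the value: 42 */
    *p = 7; /* write through it: x is now 7 */
    printf("%d\n", x);
    return 0;
}
-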
I like rants that are thought provoking and push a message forward regardless of whether they may sting a little, so for my first post on here I'd like to hit home with many of you.
Html5 "Native" Applications are not needed. Let's cover mobile first of all: the misconception that apps are written in either javascript or the native android / native ios environment, or even some third party paid tools like xamarin, is quite strange to me. OpenGL ES is on both IOS and Android, there is no difference. It's quite easy to write once, run everywhere, but with native performance and not having to jump through js when it's not needed. Personally I never want to see html or css if I'm working on a mobile app or desktop. Which brings me to desktop: I can't begin to describe how unthought out an electron app is. Memory usage, storage space for embedding chromium, web views gained at the expense of literally everything else. Cross platform desktop development has been around for decades, openGL is everywhere, enough said. Finally, what about targeting the browser? If you're writing a native app, let's say in c++, and it's not in javascript, how can it turn back into javascript? Well luckily c++ has emscripten which does that, simply put. Or you could be using a cross-compiler language like haxe, which is what I use. It benefits from type safety, while exporting both c++ and javascript code. Conclusion: in reality I see the appeal of the js ecosystem, it's large, filled with big companies trying to make js cross development stronger every day. However development in my mind should be a series of choices; choices that are invisible don't help anyone, regardless of the popularity of the choice, or the skill required. -
Two days ago on my linux partition python was being weird, and I couldn't fix it no matter what I tried. The logical option was to back up /home and then reinstall linux
Two days later I want to die, for whatever reason I can't properly boot from a live USB without getting "input/output error", and I've already erased my previous linux installation at this point.
Anyway, I still have windows, and I think that the problem might be faulty RAM (I get i/o errors with any live USB) so I booted into windows, let it do some updates and now I'm checking my memory
After this, I'm going to open up my PC and check that the RAM sticks are all in properly -
I keep phones for two years, or try to anyway. Later this year I will hit that two year mark, and rather than excitement at the idea of getting a new phone, I find myself thinking that there's nothing out there that excites me at all. And also, my current phone is in no way deficient. It doesn't hold a charge like when it was new, but that's totally normal, and as degraded as it may be, it's still not a problem at all.
A powerful phone with a snapdragon and 6 or more GB of memory, that measures under 5" and doesn't have some bullshit OEM skin on it. To my knowledge, it doesn't exist. -
In the past, apps I've written have used a flat file backend. It's very fast, but obviously clunky to have a big structure of flat files for an app. It ran circles around framework-based RDBMS backends as far as performance is concerned, but again, it was clunky. Managing backups and permissions on tens or hundreds of thousands of small files was no fun. Optimizing code for scaling was fun - generating indexes, making shortcuts - but something was still missing. Early in 2017 I discovered redis. A nosql backend that just stores variables and lives almost entirely in memory. Excellent modules and frameworks for every language. It was EXACTLY what I'd needed, even though I didn't know I did. I spent a good deal of time in 2017 converting apps from flat files to redis, and cackled with glee as they became the apps I wanted them to be. Earlier this week, I started building my first app that started with redis instead of flat files, and I can't stop gushing to anyone who will listen. Redis for president!
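For the curious, a minimal sketch with hiredis, the C client (host, port and key name are made up):

#include <stdio.h>
#include <hiredis/hiredis.h>

int main(void)
{
    redisContext *c = redisConnect("127.0.0.1", 6379);
    if (c == NULL || c->err)
        return 1;

    redisReply *r = redisCommand(c, "SET app:counter %d", 42);
    freeReplyObject(r);

    r = redisCommand(c, "GET app:counter");
    printf("%s\n", r->str); /* "42", straight out of memory */
    freeReplyObject(r);
    redisFree(c);
    return 0;
}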
-
I can't really figure out how I grew from learning_syntax -> remembering_function_names -> following_patterns -> developing_a_personal_style -> reading_the_doc -> getting_the_source.
Well I have a long memory problem, so I guess it happened overnight!
Wait, did the doctor say it was a memory problem? Hell no! -
Let me run something by all of you. Let's say you once started freelancing as a "Plan B" in case your full-time gig dropped you. Over 12 years you've managed to build a long-standing personal brand around that occasional freelancing. You have several clients who adore you and the work you do and they tell you they would be lost without your talent and have nowhere else to go and nobody else they trust. You know, because in the past you tried to send them elsewhere (for various reasons) and they just kept coming back.
You get laid off from the full-time gig and ACME Company calls and interviews you as a top candidate they're really interested in for that same type of work for a full-time job they're offering.
Here's the catch...if hired, you have two months to basically erase your personal brand and agree never to do any freelancing work as before, even on your own time on evenings and weekends. ACME wants your full focus and attention. Additionally, you find out that the person you'd be replacing is being let go because they weren't sufficiently tech-skilled for the job. And, with a little digging, you find out that person _also_ had several freelancing gigs going on the side. Probably for the same "Plan B" reason. Which is probably why ACME is demanding exclusivity.
Your client base is small. ACME says "we don't care". The work you do is 90% automated and easily achievable in just minutes a day on a weekend or evening. ACME says "doesn't matter". You already had full-time work to begin with so you weren't doing a ton on the side. ACME couldn't be less interested in this "excuse". And you're not keen on the idea of burning down your brand, especially with no guarantees of any kind in the present IT industry hiring/firing/layoffs climate. ACME says this issue is make or break for them.
If you get to the offer stage do you:
a) Flip the bird to your brand and clients you've built up for over a decade and memory-hole it?
b) Negotiate a non-compete clause with ACME, agreeing not to take on any new clients while working full time for them?
c) Flip the bird to ACME and look for something else?
Asking for a friend. ;) -
I swear, I started windows yesterday once for some guilty gaming. ONCE
Tried to connect Bluetooth headset
-> BSOD on the first try. Fuckn os can't handle shit
Works second time.
*Execute guilty_gaming.exe*
*Finish gaming business*
Want to shut down windows
"oh, I can only shutdown if I install your fucking update? Well fuck me pls no delet pingu partition
Next day. Pingu is alive. Wanna connect headphones.
* Connection: yes
* error.Failed
* Connection: yes
Fuckn ok, does it still work in windows?
Spoiler: fucking no! Very cool. I didn't think there would be a better waste of time than gaming, but windows always found a way to fuck your shit up.
Windows vista was less of a pain, windows 7 a nice memory, and this is just an abortion fucking kept alive, proving to god that humans can create a better hell for people than lucifer could ever imagine.
Way to go windows, I appreciate MacOS now. -
YGGG IM SO CLOSE I CAN ALMOST TASTE IT.
Register allocation pretty much done: you can still juggle registers manually if you want, but you don't have to -- declaring a variable and using it as operand instead of a register is implicitly telling the compiler to handle it for you.
Whats more, spilling to stack is done automatically, keeping track of whether a value is or isnt required so its only done when absolutely necessary. And variables are handled differently depending on wheter they are input, output, or both, so we can eliminate making redundant copies in some cases.
Its a thing of beauty, defenestrating the difficult aspects of assembly, while still writting pure assembly... well, for the most part. There's some C-like sugar that's just too convenient for me not to include.
(x,y)=*F arg0,argN. This piece of shit is the distillation of my very profound meditations on fuckerous thoughtlessness, so let me break it down:
- (x,y)=; fuck you in the ass I can return as many values as I want. You dont need the parens if theres only a single return.
- *F args; some may have thought I was dereferencing a pointer but Im calling F and passing it arguments; the asterisk indicates I want to jump to a symbol rather than read its address or the value stored at it.
To the virtual machine, this is three instructions:
- bind x,y; overwrite these values with Fs output.
- pass arg0,argN; setup the damn parameters.
- call F; you know this one, so perform the deed.
Everything else is generated; these are macro-instructions with some logic attached to them, and theres a step in the compilation dedicated to walking the stupid program for the seventh fucking time that handles the expansion and optimization.
So whats left? Ah shit, classes. Disinfect and open wide mother fucker we're doing OOP without a condom.
Now, obviously, we have to sanitize a lot of what OOP stands for. In general, you can consider every textbook shit, so much so that wiping your ass with their pages would defeat the point of wiping your ass.
Lets say, for simplicity, that every program is a data transform (see: computation) broken down into a multitude of classes that represent the layout and quantity of memory required at different steps, plus the operations performed on said memory.
That is most if not all of the paradigm's merit right there. Everything else that I thought to have found use for was in the end nothing but deranged ways of deriving one thing from another. Telling you I want the size of this worth of space is such an act, and is indeed useful; telling you I want to utilize this as base for that when this itself cannot be directly used is theoretically a poorly worded and overly verbose bitch slap.
Plainly, fucktoys and abstract classes are a mistake, autocorrect these fucking misspelled testicle sax.
None of the remaining deeper lore, or rather sleazy fanfiction, that forms the larger canon of object oriented as taught by my colleagues makes sufficient sense at this level for me to even consider dumping a steaming fat shit down its execrable throat, and so I will spare you bearing witness to the inevitable forced coprophagia.
This is what we're left with: structures and procedures. Easy as gobblin pie.
Any F taking pointer-to-struc as it's first argument that is declared within the same namespace can be fetched by an instance of the structure in question. The sugar: x ->* F arg0,argN
Where ->* stands for failed abortion. No, the arrow by itself means fetch me a symbol; the asterisk wants to jump there. So fetch and do. We make it work for all symbols just to be dicks about it.
Anyway, invoking anything like this passes the caller to the callee. If you use the name of the struc rather than a pointer, you get it as a string. Because fuck you, I like Perl.
What else is there to discuss? My mind seems blank, but it is truly blank.
Allocating multitudes of structures, with same or different types, should be done in one go whenever possible. I know I want to do this, and I know whichever way we settle for has to be intuitive, else this entire project has failed.
So my version of new always takes an argument, dont you just love slurping diarrhea. If zero it means call malloc for this one, else it's an address where this instance is to be stored.
What's the big idea? Only the topmost instance in any given hierarchy will trigger an allocation. My compiler could easily perform this analysis because I am unemployed.
So where do you want it, on the stack, on the heap, you want to reutilize any piece of ass, where buttocks stands for some adequately sized space in memory -- entirely within the realm of possibility. Furthermore, evicting shit you don't need and replacing it with something else.
Let me tell you, I will give your every object an allocator if you give the chance. I will -- nevermind. This is not for your orifices, porridges, oranges, morpheousness.
Walruses. -
Some compilers give an error message on a forgotten type cast. That alone shows that explicit casting is good style: you avoid mistakes that can lead to a program crash in the worst case. With some types it is necessary to perform the type cast yourself; for other types, the compiler does it automatically.
In short: type casting is used to prevent mistakes.
An example of such an error would be:
#include <stdio.h>
#include <stdlib.h>
int main()
{
    int *ptr = malloc(10 * sizeof(int)) + 1;
    free(ptr - 1);
    return 0;
}
Here one tries to get a pointer to the second element of the requested memory. However, this is not valid, since pointer arithmetic (+, -) does not work on a void pointer.
The improved example would be:
int *ptr = ((int *) malloc(10 * sizeof(int))) + 1;
Here, the typecast is done beforehand; this turns the void pointer into an int pointer, and pointer arithmetic can be applied. Note: if instead of an error "no output" is displayed on the sololearn C compiler, try another compiler. -
Linux.
Guys, I need some inspiration. How are you dealing with memory leaks, i.e. identifying which component of the system is leaking memory?
The regular method of dumping ps aux sorted by virtual memory usage is not working, as all the processes are using the same amount of memory all the time. This is a XEN dom0 memory leak, and I have no more ideas about what to do.
Is it possible that guests could be eating the dom0 memory? -
So my aunt called because her phone had run out of storage, as she had "by mistake" disabled Play Store, WhatsApp, Browser, Chrome and every other fucking app, and she had to install WhatsApp back. After an hour of struggle explaining to her how to move her songs to the memory card, enable Chrome and Play Store, and install WhatsApp, I have started to lose faith in humanity.
To make things worse, every Android phone manufacturer feels obligated to change the settings app as per their wish, and I didn't have a clue where the settings to enable apps were on her phone.
And I had to do all this through a phone call
And I can't say "No"
There should be a button in Android: I'm too dumb for all this stuff -
We had an ADAM/Colecovision unit before this, but I don't really count it, as it was more of a console for us than a computer.
In 1986 dad brought home a Tandy 1000 SX. It had an Intel 8088 processor, 64k of memory, and no hard drive. With dual 5.25" floppy drives, our write-protected DOS 3.1 disk stayed in drive A almost all the time. Games and other software were run from drive B, or from the external cassette drive. For really big games, like Conquest of Camelot and Space Quest 3, we were frequently prompted to swap disks in B: before the game could continue.
Space Quest, King's Quest, Lords of Conquest, Conquest of Camelot, Chuck Yeager's Advanced Flight Trainer, several editions of Carmen Sandiego, and at least a dozen other games dominated our gaming use. We wrote papers with WordStar, and my parents maintained their budget with Lotus 1-2-3.
A year or two later, Dad installed a 10 MB hard drive, and we started booting DOS off that instead. Heady days. -
"NO! I will not download your stupid app! That will boat my memory and chunk my battery! Just to get to that one bit of content from a service I use once in a blue moon!"
-
Avoided IoT (IoS - InternetOfShit) for a long time now, due to the security concerns with retail products.
Now I looked into 433 Transceiver + Arduino solutions.. to build something myself, just for the lolz.
Theory:
Smallest Arduino I found has 32 KByte of programmable memory, a tiny tiny crypto library could take around 4 KBytes...
Set a symmetric crypto key for each homebrewn device / sensor / etc, send the info and commands (with time of day as salt, for example) encrypted between Server <-> IoT gadget; the ciphertext would have a checksum appended, and magic and ciphertext length prepended.
Result:
Be safe from possible drive-by attacks, still have a somewhat reliable communication?!
Ofc passionate hackers would be still able to crack it, no doubt.
Question: Am I thinking too simple? Am I describing just the standard here?
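A hedged sketch of the frame described above, in C - the field sizes are my assumptions, not a standard:

#include <stdint.h>

/* magic and ciphertext length prepended, checksum appended */
typedef struct {
    uint16_t magic;  /* fixed marker so the receiver can resync */
    uint16_t length; /* ciphertext length */
    /* uint8_t ciphertext[length]; payload, salted e.g. with time of day */
    /* uint16_t checksum; appended after the ciphertext */
} frame_header;

/* trivial additive checksum over the ciphertext */
static uint16_t checksum16(const uint8_t *data, uint16_t len)
{
    uint16_t sum = 0;
    while (len--)
        sum += *data++;
    return sum;
}
-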
haha yes let's go from 512MB used by the Android kernel to 1.5GB used in 8 hours, thx phone. android memory leak, no root to fix the issue, i only have 2gb total that can be used
-
Was told at work today that I don’t follow directions closely enough and the lack of attention to detail in my work is a problem.
I remember being this way since my first elementary school teacher pointed it out to me. I’ve always been this way. It’s how my brain is wired. No matter how hard I try, I always miss something. Especially when it is a really complex set of tasks. I’ve literally got the results of a cognitive test I took in college documenting and quantifying my working memory deficits.
You think you’ll change that now, after more than four decades of me being like this, with a performance review? Good fucking luck!8 -
For the web devs here: do we really still need to support the browsers of the evil one (yeah I'm talking about MS browsers, Edge included)?
I mean, building a css ui library here in 2017, without the benefits of custom properties, grid and so many other cool things, is so fucking frustrating.
A practical example: color theming with custom properties = Fuck Yeah / color theming without custom properties = so verbose and painful, sucks.
The library is mostly for private usage at the moment so... I'm about to drop IE and Edge in the deepest shithole of the darkest cavern of my memory, and move on coding my lib with modern CSS, with almost no regret for the ghosts of the past who are still using these shitware today.
Should I? Or should I... maintain compatibility as we traditionally do?
What's you guys' opinion about this? Can we finally kickban these browsers from our lives? -
The deadline was 2-3 days for product launch, and doing distributed transactions was not an option as it required heavy modifications.
I was building a money transfer app between one transactional system and one non-transactional system, so the way I did it was:
1. transfer money from one system to my app that was using Akka STM (software transactional memory)
2. try to transfer money to second system
3. transfer money back on failure
There was no database, no state, only a transaction log, as installing a database would require too much time and paperwork.
Sometimes the transfer back failed, so we needed to look at the logs and search for the money; it was quite easy cause there was an error logged and there were not so many failed transactions like this.
About one or two a month, and everyone accepted that.
I started to write some sort of reconciliation thread but then was assigned to other work, and it worked like this for a couple of years, transferring a couple million worth of transactions. -
Memory debugging iOS probably makes me more anxious and stressed out than anything. I have put 11 hours into attempting to figure out this crash, but still no progress. It's like I can feel management breathing down my neck to get it done asap. Do you ever get so stressed out while trying to figure something out at work? -
A lecturer for an Embedded Systems module gave out drivers for an LCD display with no documentation at all - about 4 functions for writing to the display and 3 initialisation functions. I spent ages trying to actually work out what each function did by which memory addresses it was changing and how (made even better by the fact that a good bit of the functions were written in Assembly, since it was Embedded C) 🙃
-
It is quite disappointing when some developers only rely on using libraries / dependencies (or whatever you call it) rather than doing it manually. I know it can make the work faster, but still, using too many libraries can make it worse. It's not bad to use libraries, but using too many degrades the performance of the app (too much memory space taken when you only need one certain action but include the whole library), and when a library becomes deprecated with no updates, that might cause a problem.
It's not bad using libraries, just not too many. -
Feel dirty writing in c. How do people even deal with unsafe pointer type casting / memory allocation / free? The codebase is plagued with memory leaks and there are no tests.
I will just pretend I can't read c code and play dumb when shit happens -
Programming embedded systems from scratch. All hardware, memory, timers, peripherals, etc, must be set up correctly at startup, and if you set even one single bit incorrectly in any of the sometimes hundreds of 32- or 64-bit configuration registers, you are screwed. There is often no terminal that prints error messages to help you, but if you are lucky you have an (often very expensive) hardware in-circuit debugger to step through the startup code.
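A sketch of what that startup code typically looks like in C - the address and bit positions below are modeled loosely on one vendor's clock registers and must really come from your chip's reference manual:

#include <stdint.h>

#define RCC_CR (*(volatile uint32_t *)0x40021000u) /* assumed clock control register */
#define HSEON  (1u << 16) /* assumed oscillator enable bit */
#define HSERDY (1u << 17) /* assumed oscillator ready flag */

void clock_startup(void)
{
    RCC_CR |= HSEON; /* one wrong bit here and nothing ever boots */
    while (!(RCC_CR & HSERDY))
        ; /* spin until the oscillator reports ready */
}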
-
Why, Google? WHY?
My wife was annoyed that her Android image gallery showed the images she had sent via Telegram, but not the ones she had received.
Stupidity no. 1: Telegram puts received pictures into Pictures/Telegram on the internal memory. It seems the default gallery apps don't pick up nested image-containing directories. As Pictures only contained the default Sony dummy images, I moved them away.
Stupidity no. 2: both the received and the sent image directories of Telegram are named "Telegram", and guess what... Android does not like that. Only the first one is shown (sent images).
Stupidity no. 3: to work around that, I installed the emulated shell to make a symlink named "Telegram-Received". Aaaand that requires root access.
Goddammit Google! She just wants to see the couple selfies I sent her in her gallery!
I wonder how many GitHub issues have been closed by asking the author to implement the feature they requested. In the past, I was confident my issue would be resolved by opening a new one when there was no answer among earlier questions. I can't tell whether the nature of my questions has advanced or whether it's a new trend, but I've opened maybe 4 or 5 issues in recent memory, and each time the collaborators suggested the feature is one I should contribute to their project by implementing it myself. Isn't this their job as maintainers? I'm already working on something that barely gives me breathing space. I encountered a challenge using your library, and your idea of helping is that I deviate from my own trajectory, get acquainted with your project and how to implement what I want, wait for it to get merged, etc., before continuing what I originally intended? Do they think that's worth it?
Is it just me, or is this a common occurrence lately?
I had to do a project for my A-levels.
The task was to get a client and develop an application based on their requirements. Naturally I made my friends my clients so that I could make something I was interested in.
The teacher constantly changed my requirements at the start, because he liked everyone's applications to be somewhat similar (probably easier to mark), which demotivated me.
The timescale we were set around Easter was to have a demo by the end of summer, which didn't need to work properly, and then a completed version after the Christmas holidays.
I wrote about 90% of the program over my 2 weeks off for Christmas, most of it while drunk, high, or both, and managed to complete it within those two weeks.
I went back to the code a couple of months later, with no memory of writing it, to set up a demo to show my teacher, and I was actually surprised by it. It was the first project of that type I had worked on, and while there were a couple of noticeable bugs, it actually worked fairly well and was really well documented. I was expecting a pile of buggy spaghetti.
Let's go down memory lane, back to freshman year in college as a Computer Science student in my Intro to Programming class....
I remember I was lost as to how the professor created this simple variable below:
int a = 5;
I had no idea what was going on there. haha. Looking back at it, and seeing the projects I'm working on now, puts a smile on my face..
I asked questions. Even the dumb ones, and that's what helped me get to where I am now.. programmers always ask mates or search.
Do you guys care to share yours?
So I have been thinking..
SQL is a language that runs on specific software on the server, and helps create data stores (databases and tables) that can be queried & manipulated.
Is there a way to run SQL-like queries on the client side, with no interaction from the backend at all?
Say I have 5 interrelated data models. In the backend world, they form nice little tables in a DB with all their joins and composite keys. From the server, I query them with filters like "SELECT name FROM x WHERE y=z AND ..."
But what if I could store them like tables in browser memory and run the same query filters via a query language... is this possible?
I know this poses a certain security risk, but we already use cookies, local storage and a lot of shitty JSON-based client-side storage. Surely it must be possible to have less optimised SQL tables on the frontend with extremely good querying capabilities?
Or am I talking about something far-fetched here?
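Not far-fetched, for what it's worth — sql.js (SQLite compiled to WebAssembly) and AlaSQL both run SQL entirely in the browser. And even with zero dependencies you can fake the SELECT/WHERE part over in-memory rows. A toy sketch in TypeScript, with made-up table and column names:

// Poor man's `SELECT name FROM users WHERE teamId = 7`, client side only.
interface User { id: number; name: string; teamId: number; }

const users: User[] = [
  { id: 1, name: "ada", teamId: 7 },
  { id: 2, name: "linus", teamId: 7 },
  { id: 3, name: "grace", teamId: 9 },
];

function select<T, K extends keyof T>(
  table: T[], columns: K[], where: (row: T) => boolean,
): Pick<T, K>[] {
  return table.filter(where).map(row => {
    const out = {} as Pick<T, K>;
    for (const c of columns) out[c] = row[c];
    return out;
  });
}

console.log(select(users, ["name"], u => u.teamId === 7));
// -> [ { name: "ada" }, { name: "linus" } ]

Joins and indexes are where the "extremely good querying" part gets hard — that's exactly the gap the WASM-SQLite route fills.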
Urgh... No exceptions in Rust annoys me. Now you only have the choice between "this didn't work, please handle this error, thank you ^-^" and "you fool, prepare for annihilation". So basically, if anything remotely serious happens, your program is dead and there's nothing you can do about it.
I don't get why people have this hate for exceptions. Every time a new language gets made it's always either "ew, it has exceptions" or "it's so nice, it doesn't even have exceptions". NOOO! They can deal with serious situations in the best possible way, and they can be statically checked (so no "but they're so complex and unpredictable" stuff, please). If you can expect an error, exceptions shouldn't be used in the first place (even though they are absolutely no worse than Option return types or whatever, just different), but in cases where it's impossible to predict an error they really shine. And not having them makes your language worse.
If a device driver accesses illegal memory, it should throw an exception: instead of the computer shitting the bed, first the offending function gets a chance to resolve the problem at its root; then, a few functions up the call stack, the general control functions of the device driver can handle it and restart the operation if applicable; and even if the driver fails to handle it, the OS can jump in, restart the driver, log an error and do whatever. It's absolutely beautiful: this hierarchical ramp from near the accident site up to more high-level operations code ensures the error can be caught at the right level of abstraction without introducing a lot of boilerplate. If everything fails and nobody can handle it, *then* the program or kernel or whatever can panic.
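That "ramp of handlers" argument, as a contrived TypeScript sketch (not Rust, obviously — DeviceFault, readSector and the retry count are all invented for the example):

// Handle an error at increasing levels of abstraction.
class DeviceFault extends Error {}

function readSector(addr: number): Uint8Array {
  if (addr < 0) throw new DeviceFault(`bad address ${addr}`);
  return new Uint8Array(512);
}

function readWithRetry(addr: number, attempts = 3): Uint8Array {
  for (let i = 0; i < attempts; i++) {
    try {
      return readSector(addr);                   // handled near the accident site
    } catch (e) {
      if (!(e instanceof DeviceFault)) throw e;  // not ours -- let it ramp up
    }
  }
  throw new DeviceFault(`gave up on sector ${addr}`);
}

try {
  readWithRetry(-1);
} catch (e) {
  // top of the ramp: restart the "driver", log, or finally panic
  console.error("driver-level recovery failed, restarting driver:", e);
}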
-
So, i have that assignment about docker stuff. nifty piece of software i must say.
anyways im installing docker software on windows bc im thinking if i have something that gives me at least the correct structure and some skeletal syntax, i will have a faster grasp of the thing. expecting some sort of high-level ide, but i end up instead with what looks like a blank window, with the only obvious choice being to sign into some bullshit i dont need. but thats another story
my point is:
when installing the thing it prompted me to install WSL2. which i supposedly shouldnt be able to run because my cpu doesnt support intel virtualisation. but being impatient (thats why i came to look for an assisted solution in the first place), i pursued the installation.
lo and behold: i end up with a shell prompt at the root of a linux filesystem!
i ran 2 or 3 muscle-memory commands and closed the prompt, i was in docker stuff up to the neck.
later on, when i go back to my project in a virtual machine, its sluggish af and screams at me that amd-v is not supported because of something something nested pages (will look up later how that one works).
dont have time to explore it some more yet, or to experiment or even barely look at this glorious mess, because i have something barely working and no time to have it fail.
but this story definitely left me perplexed.
and also: you can run WSL2 on an fx8350
Currently have a very funny project lead, who gives on-the-spot estimates for a 9-year-old Android app in the security domain with very pathetic code quality. Memory leaks, bad practices, typos, CVEs etc. — you name it, we have it in the app's source.
For the last 5-6 sprints of our project, almost 50% of user stories were left incomplete due to underestimation.
Basically everyone in management was practically asleep about code quality for the last 7-8 years, & now that a new dev & QA team is here, they suddenly want us to fix everything ASAP.
The most humorous thing is that the product owner is aware of the importance of unit test cases, but doesn't want to allocate user stories for them at sprint planning time, as the code is almost frozen, according to him, for the current release.
Actually, since the last release he has done the same thing every sprint; around 18 months have passed and he still hasn't spared a single day for unit testing.
Recently an app crash issue was found in a version-upgrade scenario, as the QAs were so tired of manually testing hundreds of basic trivial test cases & doing server-side testing too that they can't do the actual needful testing — the kind which is also tougher for devs to automate.
Recently, when the team's old MacBook Pros expired, higher management allocated Intel Mac minis, saying that a few people in the organization were misusing MacBooks. So for just a few people, everyone has to suffer now, as there is no flexibility in frequently switching between WFH & WFO. One of those Mac minis overheated & has been in repair for 6 months.
Out of 4 devs & 3 QAs, all 3 QAs & 2 devs have gradually left.
I think it's time to say goodbye 😔
Silly question, but why is it that in this age of 64-bit computing and gigabytes of RAM, applications still have trouble with text files / SQL dumps over 1MB in size? Surely for something so simple they should be able to store it all in memory without any issues, no?
-
Rubber ducking your ass in a way, I figure things out as I rant and have to explain my reasoning or lack thereof every other sentence.
So lettuce harvest some more: I did not finish the linker as I initially planned, because I found a dumber way to solve the problem. I'm storing programs as bytecode chunks broken up into segment trees, and this is how we get namespaces, as each segment and value is labeled -- you can very well think of it as a file structure.
Each file proper, that is, every path you pass to the compiler, has its own segment tree that results from breaking down the code within. We call this a clan, because it's a family of data, structures and procedures. It's a bit stupid not to call it "class", but that would imply each file can have only one class, which is generally good style but still technically not the case, hence the deliberate use of another word.
Anyway, because every clan is already represented as a tree, we can easily have two or more coexist by just parenting them as-is to a common root, enabling the fetching of symbols from one clan to another. We then perform a canonical walk of the unified tree, push instructions to an assembly queue, and flatten the segmented memory into a single pool onto which we write the assembler's output.
I didn't think this would work, but it does. So how?
The assembly queue uses a highly sophisticated crackhead abstraction of the CVYC clan, or said plainly, clairvoyant code of the "fucked if I thought this would be simple" family. Fundamentally, every element in the queue is -- recursively -- either a fixed value or a function pointer plus arguments. So every instruction takes the form (ins (arg[0],arg[N])) where the instruction and the arguments may themselves be either fixed or indirect fetches that must be solved but in the ~ F U T U R E ~
Thusly, the assembler must be made aware of the fact that it's wearing sunglasses indoors and high on cocaine, so that these pointers -- and the accompanying arguments -- can be solved. However, your hemorrhoids are great, and sitting may be painful for long, hard times to come, because to even try and do this kind of John Connor solving of pinky promises that loop on themselves is slowly reducing my sanity.
But minor time travel paradoxes aside, this allows for all existing symbols to be fetched at the time of assembly no matter where exactly in memory they reside; even if the namespace is mutated, and so the symbol duplicated, we can still modify the original symbol at the time of duplication to re-route fetchers to its new location. And so the madness begins.
Effectively, our code can see the future, and it is not pleased with your test results. But enough about you being a disappointment to an equally misconstructed institution -- we are vermin of science, now stand still while I smack you with this Bible.
But seriously now, what I'm trying to say is that linking is not required as a separate step as a result of all this unintelligible fuckery; all the information required to access a file is the segment tree itself, so linking is appending trees to a new root, and a tree written to disk is essentially a linkable object file.
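A toy model of that "linking is just appending trees to a new root" idea, in TypeScript — Segment, link and flatten are invented stand-ins, nothing like the actual bytecode machinery:

// A clan is a named tree of segments; linking parents every clan
// under one root, and a canonical walk flattens the pool.
interface Segment {
  name: string;
  code?: number[];       // bytecode chunk, if this node carries one
  children: Segment[];
}

function link(clans: Segment[]): Segment {
  return { name: "root", children: clans };  // linking, the dumb way
}

function flatten(node: Segment, out: number[] = []): number[] {
  if (node.code) out.push(...node.code);     // canonical walk -> one pool
  for (const child of node.children) flatten(child, out);
  return out;
}

const clanA: Segment = { name: "a", children: [{ name: "main", code: [1, 2], children: [] }] };
const clanB: Segment = { name: "b", children: [{ name: "util", code: [3], children: [] }] };
console.log(flatten(link([clanA, clanB]))); // [1, 2, 3]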
Mission accomplished... ? Perhaps.
This very much closes the chapter on *virtual* programs, that is, anything running on the VM. We're still lacking translation to native code, and that's an entirely different topic. Luckily, the language is pretty fucking close to assembler, so the translation may actually not be all that complicated.
But that is a story for another day, kids.
And now, a word from our sponsor:
<ad> Whoa, hold on there, crystal ball. It's clear to any tzaddiq that only prophets can prophecise, but if you are but a lowly goblinoid emperor of rectal pleasure, the simple truths can become very hard to grasp. How can one manage non-intertwining affairs in their professional and private lives while ALSO compulsively juggling nuts?
Enter: Testament, the gapp that will take your gonad-swallowing virtue to the next level. Ever felt like sucking on a hairy ballsack during office hours? We got you covered. With our state of the art cognitive implants, tracking devices and macumbeiras, you will be able to RIP your way into ultimate scrotolingual pleasure in no time!
Utilizing a highly elaborated process that combines illegal substances with the most forbidden schools of blood magic, we are able to [EXTREMELY CENSORED HERETICAL CONTENT] inside of your MATER with pinpoint accuracy! You shall be reformed in a parallel plane of existence, void of all that was your very being, just to suck on nads!
Just insert the ritual blade into your own testicles and let the spectral dance begin. Try Testament TODAY and use my promo code FIRSTBORNSFIRSTNUT for 20% OFF your purchase of eternal damnation. Big ups to Testament for sponsoring DEEZ rant.
Why TF does Node.js just eat 100MB of RAM for a simple application with ONE websocket connection? I've tried getting some heap snapshots and memory allocation timelines, and used memwatch-next, but with no result AT ALL! The heap stays small, but the RSS memory grows like there is no tomorrow.
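A small-heap-but-growing-RSS pattern usually means the growth is outside the JS heap — Buffers, socket buffers, native addons — which heap snapshots simply don't count. A quick way to see which bucket is actually moving, using plain Node APIs (the logging interval is made up):

// Log where the memory actually sits: JS heap vs external vs total RSS.
// If heapUsed stays flat while rss/external climb, the leak lives
// outside the JS heap and heap snapshots won't show it.
setInterval(() => {
  const m = process.memoryUsage();
  const mb = (n: number) => (n / 1024 / 1024).toFixed(1);
  console.log(
    `rss=${mb(m.rss)}MB heap=${mb(m.heapUsed)}/${mb(m.heapTotal)}MB external=${mb(m.external)}MB`,
  );
}, 10_000).unref(); // unref so the timer doesn't keep the process alive

Also worth remembering that V8 reserves a fair chunk of RSS up front, so 100MB resident is not automatically 100MB leaked.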
-
It's a pain in the ass when you finally manage to free enough memory to keep your Android OS up to date, and just a few minutes after the update restart you get a message that there is another OS update, which needs another 200MB.
It's a never-ending torture...
I am having an introspective moment as a junior dev.
I am working in my 3rd company now and have spent the average amount of time I usually spend in a company (1-1.5 years).
I find myself running into similar problems and trajectories:
1. The companies I worked for were startups of various scales: an edtech platform, an insurance company (the branch of an MNC) and a B2B analytics company
2. These people hire developers based on domain knowledge and not innovative thinking, and expect them to build anything that the PMs deem growth/engagement-worthy (for eg, I am bad at those memory/time-optimising ds/algo problems, but I can make any kind of Android screen/component, so me and people like me get hired here)
3. These people hire new PMs based on expertise in revenue generation and, again, not on the basis of innovative thinking, coz most of the time these folks make tickets to experiment with button and text colors to increase engagement/growth
4. The system soon goes into chaos mode, since there are so many cross-operating teams and the PMs run around trying to boss every dev, QA and designer into adding their changes to the app.
5. Meanwhile, due to multiple different teams working on different aspects, there is no common data center with up-to-date info on all flows, products and features. The product soon becomes a Frankenstein monster.
6. Thus these companies require more and more devs and QAs who are cogs in the system rather than innovative thinkers. The cogs in the system will simply come in, dimwittedly add whatever feature is needed and go home.
7. The cogs in the system who also take on the pain of tracking the changes and learning about the product itself become "load-bearing cogs": i.e. the devs with so much knowledge of the product that they can be helpful in every aspect of the feature lifecycle.
8. Such devs find themselves with no need to prove themselves and no need to do innovative work, and are simply promoted based on their domain knowledge and impact.
My question is simply this: are we as devs just destined to be load-bearing cogs?
We are doing the work which ideally a manager should be doing, i.e. maintaining confluence docs with end-to-end technical as well as business-logic info for every feature/flow.
So is that the only definition of a Software Engineer in a technical product?
Then how come innovations happen in companies like Meta, Microsoft, Google, OpenAI etc?
If I have to guess as a distant observer, I would say their diversity across different fields helps them mix and match stuff, which leads to innovative products.
For eg, the Android OS team at Google has helped add many innovative things to the Google Cloud product and vice versa.
Same with Azure and Windows. Windows is now optimised to run on cloud machines, when at one point it was just a horrible memory-hogging and slow PC OS.
For small companies, one ideology/product/domain is their hero ideology/product/domain.
An insurance company tries to experiment with stuff related to insurance, health, vehicles, and the best innovation they come up with is "let's give the user a discount on their premium if they do 5000 steps a day for a year".
An edtech would say "let's do live streaming for children apart from static videos".
But the Android team at Google said, "since the AI team is doing so well, let's include AI in various system apps and support on-device models" ~ a much larger innovation, as 2 domains combined to make a product.
The small companies are not aiming to be an innovative product, they are just aiming to be a monopoly product. And that is kinda sad.
So I thought I had a basic, high-level understanding of C++ STL strings, pointers, copy constructors and such. In comes a dirname(), a -D_GLIBCXX_USE_CXX11_ABI=0, and... Toto, I've a feeling we're not in Kansas anymore.
So what is happening? I copy a string expecting a deep copy, but then I do the dirname() or other manipulation on the copy, and it messes up *both* strings. gcc/C++, I know you're a beast, but what's going on there? The thing is only possible if I cast away const from c_str() — which of course is a doubtful operation — but there also seems to be some strange copy-on-write logic: the data pointers initially point to the same memory location, and only with the first proper manipulation of the copy do they start to point to different addresses.
I had no clue. And still don't.
everything is going as planned! :)
Learned Rust Lang. i loved it (that doesn't mean i am done learning na? No! never stop)
new language i could do game memory hacking in without worrying about C++ memory leaks or issues. it also compiles to assembly! another of my favorite languages!
(i use rust for game development and other stuff)
i am not leaving C / C++ though, that would be harsh!
i abandoned javascript for react and typescript.
to be honest the developer just made javascript and left us with a [object Object]
finished learning the android java api, so im basically set — anything i want to make, i can just go on my pc, listen to music and write it out in a couple of days.
well phazor what are you going to do now?!
i will code till i am old.
i will leave my mark like a shid that made its skid in the bowl :)
I know some of y'all will judge the fuck outa me
But I had a "I've no idea how this works, but it works" moment today on a pet-project...
It's so inefficiently made coz I was frustrated by it failing, so I thought I'd let it work first, then worry about shrink-wrapping the logic
Yet with NO CACHE, from DB -> Service/API -> HTTP response, it's just 350ms...
WITH the in-memory cache it goes down to 40-50ms...
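For the record, the crude version of that is usually just a get-or-load wrapper around a Map — a sketch in TypeScript, with the key scheme and loader left as placeholders:

// Minimal get-or-load in-memory cache: the first hit pays the full
// DB round trip (~350ms here), later hits come straight from the Map.
// No eviction and no TTL, so only for data you can afford to go stale.
const cache = new Map<string, unknown>();

async function getOrLoad<T>(key: string, load: () => Promise<T>): Promise<T> {
  if (cache.has(key)) return cache.get(key) as T;
  const value = await load();
  cache.set(key, value);
  return value;
}

// usage: const user = await getOrLoad(`user:${id}`, () => fetchUserFromDb(id));

Invalidation is the part that will eventually demand the shrink-wrapping.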
I was 7 years old, and my mom's friend brought me their old computer as a New Year present. I was absolutely happy that day, because I had wanted my own computer for as far back as I can remember. I spent that evening exploring a Russian psychological (!) sex quiz (!!) with pictures (!!!) :D I found it on C:\
Actually no, there is an earlier memory. I was four, and I really wanted to mess around with my sis' computer; it was some kind of holiday, maybe New Year as well. They wouldn't let me do it, and being an engineer, I took a rectangle-shaped candy box and made a "laptop" out of it. I remember drawing the screen, the icons and stuff. And the plastic mold that actually holds the candy I turned upside down, so the candy cavities became sort of "buttons" I could press.
is there any way to convert python straight to C yet? i just barely can't get PyInstaller working on PythonD because there's no os.fork() (because DOS. no, not cmd.exe, actual fucking DOS.)
one broken function between me and victory
"just use C" — DJGPP is kicking my ass all the same, random unknown segfaults are a bitch, and also i can't get quite what i want within the memory layout restrictions i have to work under
"just use Assembly/BASIC" — their file handling makes me wanna die, and BASIC is fucking massive as well
Bloody fucking Android! Updates, updates and more updates! My development Nexus 5X won't allow me to sideload apps since it updated... Hello, printf debugging! Goodbye, profiler and debugger!
My hate for Android grows with each version after 4.0.$something... 2 was shit, I missed 3, 4 was OK, and since then it's been going steeply downhill.
And don't get me started on Material Design...! Good luck figuring out what's a button and what's a label...
And what's up with the "let's keep all apps running all the time to save a few ms on start" philosophy!? Who thought that was a good idea!? Yeah, System.exit(0) works, but... is it so hard to determine when an app isn't needed anymore (has no services running, etc.)? Why should a web browser (for example) stay in memory after I quit? Minimize is a thing (Home button), why make it so confusing?
Another thing — feedback-less async tasks — why? I like to know when something is working in the background... How the hell am I supposed to find out whether it is supposed to be doing this or whether it is frozen?
And Android deciding to kill your process whenever it pleases, without any callback... Happened to me once with an Activity in the foreground (no exceptions anywhere in my app, it just quit). How do you do IO properly? It seems you can't guarantee that some file or socket or something that must be closed doesn't stay open (requiring a Bluetooth restart because the socket wasn't closed, for example)...
I don't know why, but the default settings in Ubuntu have changed quite a lot. There was once a glorious time when, if your Ubuntu got stuck, you could press Ctrl + Alt + F2-F6, log in to a console, run the top command to see which process was taking too much time, and kill it — then you could go back and start the process again.
I remember days (~15-20 days) between restarts of my laptop, because I could do that. But now my Ubuntu gets stuck, continues to be stuck for about 5-10 minutes, and then just restarts.
I have run the disk checks to see if my hard disk is creating issues, but there are no issues there. Maybe there are times when a process executes some buggy code and cannot get out. One fine moment, one of the processes (probably a browser or Eclipse) starts using too much memory or CPU, and the whole world seems to come crashing down.
But my ability to kill it promptly, without crashing my other applications, was so good to have. And now, every time this happens, I feel 2016-17 and earlier days were so much better.
I'm facing a strange problem: I have a 400GB microSD formatted as exFAT.
I tried formatting it again to either NTFS or ext4, on either Linux or macOS, but every tool says the format completed — and then, when it scans again, it still shows the files that the storage had, plus that it's still exFAT.
I tried gparted, Disk Utility (macOS), Disks (Ubuntu), and mkfs; all show the same result: a reportedly successful format, but after a refresh the old filesystem is still there along with all the old contents — not a single file was removed.
Can anyone help?
PhoenixOS (Android) in Windows
--booting from usb
1. Success
Boots well, with secure boot off and legacy boot on
http://metroize.com/usb-boot-linux-...
2. Crash
Google Play Store and other Google services keep crashing, but other apps don't
when ignoring the error popups, the apps don't actually crash
3. Storage
the memory is only allocated to the system, which means no user file storage
have to find a way to fix that
tell me guys what would you prefer:
function a(){
..
b(..)
..
b(..)
..
}
function b(p1,p2,p3,p4,p5,p6){.
...
}
or
function a(){
..
b(..)
..
b(..)
..
}
function b(
p1,
p2,
p3,
p4,
p5,
p6
){
...
}
if you read this rant before expanding, you got the complete context: what function a is, that it's calling b 2 times, and how function b looks.
if instead of the first option i had used the 2nd block, you wouldn't even know the 2nd param of function b without expanding this rant.
my point?
i prefer keeping unnecessary info on one line. a lot of linters disagree by splitting up the code. and most importantly, my arrogant tl disagrees by saying he prefers the split code "for readability" and because "he likes code this way, old-eng1 likes this and old-eng2 likes this".
why tf does an ide even have a horizontal scrolling option when you are too stupid to use it?
ok, i know some smartass is going to point out that i too can use vertical scrolling, but hear me out: i am optimising this!
case 1: a function with 7 params is NOT split into 7 lines. lets calculate the effort to remember it
- since all params have similar characteristics (they will be of some type, might have defaults, might be a suspendable/async function etc), each param will take a similar memory-effort score. say 5sp each.
- total memory effort = 5sp * 7 = 35sp.
- say a human has 100sp of fast memory storage; he can use the remaining 65sp for loading, say, 5 small lines above or below.
- but since the 5 lines above are already read and still visible on screen, they won't need to be loaded again and again, and we can just check the lines below.
- thus we are able to store 65+35+65 = 165sp, or about 11 lines of code, in our fast memory with just 100sp of brain storage
case 2: a function with 7 params IS split into 7 lines.
- in this case all lines are somewhat similar. still 5sp per param, which implies the same 35sp for storing the current function and params
- the remaining 65sp can only be used to store the next 5 lines of 13sp each, as the previous code is no longer visible.
- plus, if you wanna refresh the code above, you gotta scroll, which removes the bottom code from the screen, and now your 65sp of bottom code gets overwritten by 65sp of top code.
- thus at any time you are storing only about 6 lines worth of code info. this makes you slow.
this is some imaginary math, but i believe it works
Can anyone please help me troubleshoot my PC? It won't boot, not even to BIOS. This has happened several times in the past, but usually jiggling the cables would do the trick; this time it doesn't.
What happened: PC powered up, power light went on, all fans turned on just fine, the HDD light turned on for a few seconds before turning off, and the monitor didn't catch any signal from the HDMI
What I have tried, with no luck:
- unplugged and replugged the SATA cables, fans, and the mobo's 24- and 8-pin connectors
- moved the hard disk to another SATA and power connector
- cleared the CMOS memory
- removed and reseated the RAM
- unplugged speaker, keyboard and mouse
- switched it on without the HDD connected
Any suggestions?
So I'm still a college student, but my worst burnout happened a week before finals this year as a sophomore at DigiPen.
In the same week, I had to conduct and submit 10 playtests for my competitive Unity game by Friday, submit said game polished and completed on Sunday, and finish and submit my team game project — which I'd been working on the whole year with 10 other people — also on Sunday (and for which we would discover so many TCR submission issues that we ended up finally resubmitting on Thursday the following week).
On top of this I had to write a memory manager for my Operating Systems class due Thursday, a water-retainment assignment involving recursive queues for my Data Structures class due Saturday, and, to top it all off, that class also had a final on Thursday, when the memory manager was due :'). I don't know how I managed to get OK sleep.
All that stuff was due that week so that all game teams could have the week before finals to work on submitting, and some CS teachers also move their finals earlier to theoretically distribute the load (which sucks for people in my major, because we're almost a double major between CS and Game Design). However, my team wanted to submit early to snatch some bonus points, but we ended up having to resubmit late anyway :(. Due to the week of hell, we were already burned out while trying to fix our resubmission.
I love the school and the people in it, but there's a reason why our most-heard phrase is "I want to die" — and no, it's not just a millennial thing, I swear.
Frigging JVM crashing, god, every goddamned time... There I was, all jolly, about to start presenting a report to my team. I hadn't exported it to HTML yet, so I was presenting in the app. Half an hour goes by and it's all good, no problems; then my manager suggests I export the report and pass it to everyone. The moment I right-click, the JVM HANGS THE APP. I try to save the session file but fail horribly — the temp file only has unreadable, unimportable data... fml, now I have to redo all of that from memory...
-
TL;DR: When talking about caching, is it even worth trying to be as memory-efficient as possible?
Context:
I recently chatted with a developer who wanted to improve a framework's memory usage. It's a framework for creating Discord bots, providing hooks for events such as message creation. He compared it to 2 other frameworks, where it ranked last with 240MB of memory usage for a bot with around 10.5k users, iirc. The best framework memory-wise used around 120MB, all running with the same number of users.
So he set out to reduce the memory consumption of that framework, and on his own he cut the usage by quite a bit. Then he wanted to try out TTL for the cache — or rather, cache entries with expiration times — adding no overhead besides checking, every interval, whether there are a few records that should be deleted. (Somebody in the chat called that sort of cache a meme. I'd be happy if you could explain why that is so 😅)
Afterwards, the memory usage dropped down to 100MB after around 3-5 minutes.
The maintainer of the package won't merge his changes, because some of them introduce stuff that might be troublesome later on, such as modifying the default arguments for processes — something along those lines. I haven't looked at these changes.
So I'm asking myself whether it's worth saving that much memory. Because at the end of the day, it's cache. Imo a cache can be as big as it wants to be, but it should stay within bounds and of course return memory if needed. Otherwise there should be no problem.
But maybe I just need other people's points of view to consider. The other dev's reasoning was simply "it shouldn't consume that much memory", which doesn't really help, so I'm seeking you guys out 😁
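For reference, the "cache with expiration times" under discussion is roughly the following — a sketch, with the TTL and sweep-interval numbers made up. The whole point is that a single sweep timer adds no per-entry overhead, at the cost of stale entries lingering until the next tick:

// TTL cache with a periodic sweep: entries carry an expiry timestamp,
// reads ignore (and drop) stale entries, and one interval deletes
// whatever has expired. The sweep is O(n) per tick, but there are
// no per-entry timers.
class TtlCache<K, V> {
  private store = new Map<K, { value: V; expiresAt: number }>();
  private timer: ReturnType<typeof setInterval>;

  constructor(private ttlMs: number, sweepMs = 60_000) {
    this.timer = setInterval(() => this.sweep(), sweepMs);
  }

  set(key: K, value: V): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }

  get(key: K): V | undefined {
    const e = this.store.get(key);
    if (!e) return undefined;
    if (e.expiresAt <= Date.now()) { this.store.delete(key); return undefined; }
    return e.value;
  }

  stop(): void { clearInterval(this.timer); }

  private sweep(): void {
    const now = Date.now();
    for (const [k, e] of this.store) if (e.expiresAt <= now) this.store.delete(k);
  }
}

Whether the saved 140MB matters then depends on who pays for the memory: a framework is a guest in somebody else's process, which is one reason library defaults tend to be conservative.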
It annoys me immensely when I struggle with myself, criticizing my own lack of knowledge in certain areas, and my colleagues say: "You'll learn by doing". No, I won't — that's a foolish dogma.
I won't, and I have never learned by 'doing'. The best results I've obtained have been through understanding every last bit of what's under the hood of a particular functionality. I'm not going to understand the white box by constantly probing the black box; it's just unsatisfactory and insufficient information. It's even dangerous to base yourself on the black-box results, because you often might get false positives.
I got through university by massive multilateral sensory focus: kinesthetic (writing things down), auditory (listening to the professor), visual (observing graphs and models of the material taught), conscious (mentalizing it all and interlinking information so that later it's accessible from long-term memory). I can confirm this is necessary for the brain, because a neurologist once told me just that.
At least for me: I had the most horrible grades (D's and F's) in freshman year with the 'learn by doing' method, and the best grades (A, A+) with the multi-sensory method in later years as I matured my studying methods. In fact, with that method I've continuously outsmarted people who had 10 years more experience than me ('experts', 'consultants', ...) but who preferred to stay in the ignorant 'bro zone' rather than learning things properly. Even worse, the day they arrived on the scene, they completely broke the production environment and messed it up for the whole team. I felt like banging my head on my desk. It just makes me disappointed in the system.
If you follow the popular method, you'll soon find yourself in the same problems that arise from doing what everyone else does. What happens at that point? That's right, they have to call in someone who actually bothered learning things.
More hours wasted on debugging, on what I hate most about programming: strings!
Don't get me started on C-strings, that abomination from hell. Inefficient, error-prone. Memory corruption through off-by-one errors, BSODs from out-of-bound access — seen it all. No, it's strings in general. Just untyped junks of data in undocumented formats. Everything has to be parsed back and forth. And this is not limited to our stupid, stupid code base, as I read about the security issues of using innerHTML or having to fight CMake again.
So back to the issue this rant is about. CMake, like other scripting languages such as bash, has its peculiarities when dealing with the enemy (i.e. strings), e.g. all the escaping. The thing I fought against was getting CMake's fixup_bundle to work on macOS. It was a bit pesky to debug, but in the end it turned out that my file path had one "//" instead of a "/", and the path comparison just did a string comparison without path normalization.
Stop giving us enough string to hang ourselves!
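The "//" lesson generalizes: never compare paths as raw strings, normalize them first. In Node-flavored TypeScript that's a one-liner — the paths here are illustrative; path.normalize collapses duplicate separators:

import * as path from "node:path";

const a = "/usr/local//lib/libfoo.dylib";
const b = "/usr/local/lib/libfoo.dylib";

console.log(a === b);                                  // false -- the bug
console.log(path.normalize(a) === path.normalize(b));  // true  -- the fix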
In Italian, we call the Roma ethnic group "ROM".
The Italian premier wants a census of this people.
Today I found this comic strip.
"From now CD and DVD will be no more ROM"
"ROM will be called Read Only Memory"
"EEPROM will be called EEP" -
TL;DR: I have to bump a Redis cluster from t3.medium to m6g.large just to get enough network bandwidth, even though I have no need for the extra memory.
Debugged an interesting issue today.
I am adding ElastiCache to a project to reduce strain on the single-node Postgres DB.
Deployed a Redis replication group with 2 shards, with multi-AZ replication for resilience.
Everything was going well. We aren't caching that much atm, so we were barely using 100MB of memory.
Suddenly, when our US region comes online, latency skyrockets and the logs fill up with Jedis timeout errors.
Still no issue with memory or node CPU.
The cause? Arbitrary network bandwidth throttling by AWS. The app currently processes about 3,000 requests per second, so we were exceeding Amazon's random-ass allowances, which aren't documented anywhere.
A philosophical question about maintenance/updating.
There is no need to repeat the reasons we need to update our dependencies and our code. We know them, especially regarding the security issues.
The real question is: does that indicate a failure of automation?
When I started thinking about code — and when, as a kid, I saw all those sci-fi universes with robots etc. — the obvious idea was that you build an automation to do the job without having to work on it anymore. There is no point in automating something that needs constant work on top of it.
When you have a car, you usually do not upgrade it all the time; you do some maintenance (oil, tires), but it keeps the work you put into it within a reasonable amount.
A better example is the abacus, a calculating device which you know works the way it works.
A promise of functional programming is that, because you are based on algebraic principles, you do not have to worry so much about your code; you know it will do the logical thing it is supposed to do.
The Unix philosophy produced software that has been "updated" so little compared to all these modern apps.
Coding, because of its changeable nature, is the first victim of human nature's perpetual dissatisfaction.
The modern software industry has so many techniques and principles (solid, liquid, patterns, testing that the air is air) and still needs so many developers to work on a project.
I know you will blame market needs (you cannot understand the need from the start; you have to do it agile), but I think this is also part of the problem.
Old devices evolved at a much slower pace. A radio was a radio, and a radio still does its basic job the same way (the upgrades were only some memory features, like saving your beloved frequencies, and screen messages).
Although all answers are valid, I still feel that we have failed. We have failed so much. The dream of being a programmer is to build something that brings you money or satisfaction, and then, when you are bored, to build something completely new.
Compare and harmonize the web configs
Oh no someone set execution timeouts to 14 days
Fuck fuck fuckity duck
Hey, compare all the web configs of all environments and harmonize them all, wtf, cmon bruh, do your job as a developer
Take them and back them up into SVN. What do you mean SVN isn't a backup system, of course it is, well it's the only thing we have, fuck
What do you mean we have shit logging where people catch an exception and only print the word "exception" in the log, you can figure it out can't you, we have live production issues that have to be solved now, what the fuck
How dare you make a mistake copying our shitload of a bloated codebase and hand-configuring our 100s of different options, what the fuck dude, do you write anything down?
Please catalogue all the exception mails we are getting, but we have no DB or error reporting system, so they all just plop into the inbox and that's all your fucking data, figure it out kid
This is a rewarding, fulfilling job where you can be both devops and a developer and manage all of our fucking environments, of which there are about 15, all on your own, with no sort of tool or software to aid you, because haha, what the fuck, you think we'd make your life easy?
What's that, you want to spend time writing or changing stuff that will make it easier for you? fuck that bruh, get back to your billable tasks, like holy shit, you think this is a charity or some shit
Live production issues
Live production issues
Production issues. A ghost in the machine. Find it fix it, find it fix it, find it fix it, cmon why can't you fix it, I expect you to spend your day hopelessly pretending to try to solve something, you fucker
One of the only people able to help you sometimes, though he's a bit of an old lackey — yeah, he's fucking leaving, see ya, seeya kid, and now we're not hiring anyone to fucking help you, no no no, managing and monitoring the environments is your job, allll of them, every single one, do you know all the configuration values for them yet??
Instead we are hiring a new sales person to fucking make us some more money, and we don't need another developer to help you. In fact, let's have you use this mid-end retail computer from 2014 to develop on, yeah yeah, oh but all our shitty code and Visual Studio will destroy your memory, but too bad!! Hahahahaha
Go-live is all you, why are you so slow
How long will it take
How long will it take
How long will it take
How long will it take
How long will it take, holy shit
Give a time estimate for something I don't fucking know — how about it'll take till fuck-you o'clock
Why oh why is libgdx on gradle?
Why does gradle exist?
Why...
Please
All I want is a project with some libraries 😭😭
I go to school and I have to take my projects to school on a memory stick and run them from it. Problem: gradle. Oh wait, NO LIBS. Go DIE. No admin rights at school...
Half the time it doesn't even work at home. I swear I have spent at least a full week 24/7 trying to fix it over the past couple of months.
There has to be a way to purge gradle from the world.
!comforting
TL;DR - I’ve done some thinking about operating systems and sticking to one
Mk
so I, like many of you, have seen far more than my fair share of “X operating system is perfect for it all, so don’t use Y operating system because it’s just awful” posts.
Over this week I've really done some thinking and experimenting with multiple devices, OSes and programs for various tasks. People coming from Windows over to Linux (like myself) tend to diss Windows (rightfully so for the most part, but still). I've also noticed that the Android vs. Apple debate can get heated among users.
Listen guys,
iOS has its shortcomings, obviously — UI being kind of a big one — but no one can deny that Apple shoves some of the nicest hardware into their devices. Yes, this stuff is pricey as hell, obviously, but the new Macs come with an i9 and quite a bit of memory as well. Apple devices tend to have longer-lasting batteries too; I can't count the times I've just turned on my mobile hotspot, stuck my Android in my pocket and used my iPhone (it's a wifi-only 5s). Applications run nicely on Apple hardware.
I couldn't learn even half as much programming as I do on my Android though; Termux is a godsend, and I'm able to run and test scripts right there in the palm of my hand. Can't get that on an iPhone.
Some of my favorite game developers only develop for Windows; I'm dual booting for that sole reason (Warframe and the Epic Games launcher don't run properly through Wine).
Just boil it down inside for a second: you might have come from a more "user friendly" operating system to learn on one that is less so — whether you wanted the freedom and wiggle room for customization, or just a more developer-friendly working environment (God bless conky and its devs) — so you didn't have to be locked into one way of seeing things. Putting a previously used OS down directly violates that thought process, and at that point you're just another Windows hater, or Arch junkie, or whatever. I think we need to be open to appreciating the pros of every system, even if we almost never use some of them, and we should try not to put down other devs-to-be or CS/sec enthusiasts because of that either.
Have you ever needed a "modify/edit" button in real-world conversation??!
I told my GF about a memory I have with my friends. It was about drinking and a hangover. She said "You didn't tell me that!!" I said "I told you that before!" and again she said "No you didn't!!!" At that moment I was just looking for a modify/edit button!!😒😒
Spent a few days trying to track down the cause of a thermal shutdown in my workstation. An Intel 4790K with no overclock would spike to 95°C on one core (core 1) whenever maxing out all 8 threads — be it real work, mprime, anything using 100% CPU. I quadrupled my RAM from 8GB to 32GB, because it's cheap and I'd like to hold whole datasets in memory sometimes, not because I thought that was the problem. I reseated my watercooling block. I checked out the PSU. I unplugged all unnecessary peripherals, drives, etc. It turned out to be a bug in the Gigabyte mobo's BIOS (causing temps to be read incorrectly, I think — still not exactly sure...). Updated from version 5 to 10, and poof, temps are back in the high 50s at full load. It only took 2 days to figure out, and I think I learned something.
-
Recently joined a new Android app (product) project & got the source code of the existing production app version.
Product source code must be easy to understand so that it can be supported long-term. In contrast, this existing source structure is very difficult to understand.
The package structure is flat — only 3 packages: ui, service, utils. No module-based grouping of classes.
No memory is ever released, so with each screen launch new memory leaks keep piling on & on.
Too much duplicated code. Some lazy developer in the past didn't even make wrappers to avoid direct usage of core classes like SharedPreferences etc., so the same 4-5 lines were written in every place.
Too many if-else ladders (4-5 blocks) & unnecessary repetitions of the outer if condition inside the inner if. It looks like the owner of these nested if blocks had trust issues — like that person thought the computer 'forgets' about the outer if when inside the inner if.
Too much misuse of broadcast receivers to track activities' state, in the era of activity & app lifecycle libraries for Android.
Sometimes I wonder why people waste soooo... much effort in the wrong direction & why they can't just use a library?!!
These things were found without even deep-diving into the code; I don't know how many more horrific things may come out of the closet.
This same app is being used by many companies in many different fields like banking, finance, insurance, govt. agencies etc.
Sometimes I'm surprised this source passed review & reached production.
Any advice/suggestions for intensively brushing up on modern C++ and multithreading for an interview that will likely be technical and cover bases like algorithms, data structures, etc.?
I haven't done C++ for a while — just a few courses in college. I did parallel programming and GPGPU on the side, but nothing on a professional level.
I've mostly been doing front-end web dev and C# since I got out of school, so I've been more on the design / higher-abstraction side of dev, and if I were asked about pointers, memory allocation, etc., I would probably draw a blank — but I am motivated to no-life it hard for the next week to catch up.
Alright, so I have been trudging around in javascript land for a bit, and one thing kind of bothers me (correct me if I am wrong — I would love to be wrong on this). It seems like a lot of javascript, or at least frameworks, leaves a lot of room for memory leaks. Like, you can create an anonymous object with a method that just kind of hangs out and acts, with no way to retrieve it and turn it off. Am I wrong here? Please tell me I am wrong. And for the record, I know I can assign anonymous objects to variables in various ways, but I am not forced to.
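You're not wrong about the "no way to turn it off" part. The classic case is an inline event listener: nothing keeps a reference to the closure, so it can never be detached, and everything it captures stays reachable for as long as the target does. A sketch in TypeScript — the button element is assumed to exist somewhere on the page:

declare const button: HTMLElement; // assume some element from the page

// Stuck: no reference to the closure, so it can never be removed.
button.addEventListener("click", () => console.log("still here"));

// Fixable: hold the reference, remove it when you're done.
const onClick = () => console.log("removable");
button.addEventListener("click", onClick);
button.removeEventListener("click", onClick);

Strictly speaking it's only a leak while the target itself stays alive — once the element is unreachable, the GC takes the closure with it — but "hangs out and acts with no way to retrieve it" is exactly what the first version does.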
-
ENOSPC = random things go wrong.
There are many synonyms for ENOSPC, like "disk full", "storage space full", "storage space exhausted", "no space left on device", and those other repulsive errors. For the sake of simplicity, I am going to refer to it as ENOSPC.
If you are in this condition on the operating system partition, get out of it quickly or random things will go wrong. Text editors which write directly to a text file, rather than creating a temporary file and then replacing the text file, could end up blanking the text file; software configuration files might fail to save, which causes a reset; and web browsers might spontaneously reset cookies and lose history.
For example, Firefox has created a gap in the web browsing history, as shown here. The history that is now memory-holed initially appeared to have been recorded successfully. Apparently, a failed write to the places.sqlite database when closing the browser created this gap.
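The temp-file-then-rename trick that the careful editors use looks like this in Node-flavored TypeScript (a sketch, error handling trimmed; safeWriteFileSync is a made-up name):

import * as fs from "node:fs";

// Write the new content to a temp file first, then swap it into place.
// If the disk fills mid-write, the original file is left intact
// instead of being blanked.
function safeWriteFileSync(target: string, data: string): void {
  const tmp = target + ".tmp";
  fs.writeFileSync(tmp, data);  // this is the step ENOSPC can kill
  fs.renameSync(tmp, target);   // rename within one filesystem is atomic on POSIX
}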
Win7 Task Manager: 7 slack.exe processes still running after shutting down the app. Force-close one process — it revives itself. 16 chrome.exe processes running after closing Chrome. 8 node.exe running when no more node apps are running... Jesus Christ, clean up after yourself, Windows... No wonder VirtualBox complained about not having enough memory to run the Windows 10 image I need because I have to test my web app on the Edge browser...
-
Hello chat, it's been a long week with no progress. I have a handful of problems right now.
1. the node js repl can't recognize <> (JSX) in js, and i can't find a fix for that anywhere on the internet. is there a way to convert the html tags in js into plain bracket-and-parenthesis code?
2. node js doesn't recognize code like "import react from 'react'", and i have to do an async function load instead, and i don't know why or how to fix it.
3. i don't know how to write "import exg.csv" inside an async function load. it's usually something like "import react from react" becoming "async function load(){let react=await import('react');}", and i don't know how to do this for a csv.
4. i tried to install materials.ui using npm into a folder called materialsui in the modules folder of node. it deleted all the other modules in the module folder, installed, and then left the materialsui folder empty. i complained, and i don't know how to get it to not do that.
5. how do i fix out-of-memory errors?
Bruh, just teach them how a computer works with Minecraft. Inverters, And, Nand, etc... can all be made there. From there, u can make flip-flops, u can make registers, adders, multiplexers, demuxes. Ofc making anything more than a 1-bit, maybe 2-bit machine would be a pain, and dont get me started on memory that extends beyond one register. But hey, if u got the patience, u can ofc. Put it all together into an ALU, combine all of them with a PC into a CPU. Ofc, you got no ROM or RAM, but hey, at least u've built the hard part.
-
You know, I'm tired of the fucking memory noise of some twisted fuck working for twisted fucks, laboring off some set of idiotic arbitrary stereotypes, trying to get me to do the same fucking things by baiting me like a fucking dog
I want people to live their fucking lives and the social problems in this world to just be solved
None of this last generation of twisted dumb fucks and their insensible number games that were used to program them
I want everything cleaned up and fixed, and evil people to cease being evil, and no more stupid loops
Just realized a member function pointer can be a template parameter, as a non-type parameter. Gonna try to use it to do the no-dynamic-memory-allocation trick with std::function.
-
That moment when you find a program you have no memory of writing and you are contemplating whether it's a good idea to open it or just leave it alone.
-
Please support old web browser versions for all eternity.
I hate it when I open a site like SoundCloud one day and am greeted with a "we no longer support your browser" notice. Now I am forced to update my browser to a new version with removed features. On Android, Chrome sometimes crashes due to an apparent memory leak, so I have to go back to Samsung Internet, which does not work with some sites. Also, the Samsung clipboard manager (which can hold up to 20 items) is only available on Samsung Internet, not Chrome or Firefox.
I also have to update the browser on my bootable live USB stick because sites stop supporting it. Any browser starting from 2015 (ECMAScript 6) should be supported until at least 2050, so that I never have to fear a site spontaneously ceasing to work in my browser one day.
I would like to browse the Internet forever without ever having to worry about pages stopping working one day. Browser vendors might also deprecate support for devices and operating systems. Old devices also have replaceable batteries and are easier to repair. I don't want to be forced to buy new devices that are difficult and expensive to repair.
I'm currently doing something very similar to something I've done ~1.5 years ago, but my old notes/work fragments aren't really helpful
*inserts 'I have no memory of this place' meme* -
Tried running our Selenium test suite on Firefox during the nightly build. Came in this morning to no nightly build. Turns out the teardowns weren't killing the Firefox drivers, and they used up all the memory on the build server. 😐
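The usual fix is making the teardown quit the driver unconditionally, even when a test throws — e.g. with selenium-webdriver in TypeScript (the beforeEach/afterEach hooks are assumed to come from a runner like Mocha or Jest):

import { Builder, WebDriver } from "selenium-webdriver";

// Hooks provided by the test runner:
declare function beforeEach(fn: () => Promise<void>): void;
declare function afterEach(fn: () => Promise<void>): void;

let driver: WebDriver | undefined;

beforeEach(async () => {
  driver = await new Builder().forBrowser("firefox").build();
});

afterEach(async () => {
  // quit() tears down the session AND kills the browser process;
  // skipping it on failed tests is how zombie firefoxes pile up.
  if (driver) {
    await driver.quit();
    driver = undefined;
  }
});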
-
We’re only random people living in random places, speaking random languages, eating random food, sleeping, studying and working random hours. Traveling to random points on a sphere.
Just the random range is different.
Just random stuff happening at the crossroads of two random dots, and the entropy speeds up or slows down.
Nothing special at all.
Just a finite state machine iteration.
I mean, the amount of effort we put into explaining infinity is outstanding.
What if there is no infinity at all?
What if infinity is just a misunderstanding in our interpretation of the world around us? It's just pixels, resolution, gaussian splatting, quantum states, you name it.
Hey man, the world is flat. Just put it in 2D space. How much space do you need, from a simulation perspective where your patient eyes can only see up to a certain amount of light particles per second through a shitty lens?
Propose world optimization techniques by slowing down subject perception: tiredness introduced. Compress memory: sleep introduced. Limit neurons: CPU power assigned. Deploy on cloud — put it to life. Exit 0: body failure. Exit 1: suicide. kill -9: killed by a tty from ip EARTH.X.Y
What can you do to make the world around this planet look alive? Make it blink.
We developers are lazy, and I believe that nature is even lazier than us.
You think you're going to the elevator right now? You're going to the preloader. Looking out the window equals playing video from playback. It never goes live, just a precomputed FSM. Cars, trains, airplanes? Preloaders everywhere. Highways to split traffic across cities and communication. The road and city planning department is a matrix maintenance department. And don't get me started about space.
Space is empty because it's not even finished. So they put it all behind glass called the Milky Way. You know how glass looked 500 years ago? It was milky, so it's the milky way, so we don't see shit.
If space were finished, I'd start writing this text from Mars, finish it and send it from Earth. But no, it's light years, guys; a light year is not a second for matter. A light year is a second of injected-thought exchange only — thoughts of the global computer called generative AI that they introduced on local computing devices called the cloud.
Even the preloader system is not finished; they left us with the one map and an overpopulated demo. What a shit hole. I bet they're increasing the temperature right now to erase this alpha build and cash out. Obviously there are so many bugs here that this one can't be fixed anymore. Too many viruses.
Hoping for 0days to start happening so we can escape using time travel or something.
I bet they cut the budget or something, moved the team to other projects. Or even worse, the solar system team got laid off, because we are just neurons that were ordered to do it. And now we're stuck in some maintenance mode — no new physics, no new thoughts to pursue, just slow degeneration. I would pay more for the next run and switch to another galaxy far, far away where they at least have more modern light-speed technology.
What do you think about it, Trinity? Not even worth wasting your time on that. No white rabbit this time.
I do not recommend this game at this stage of early access.
- only one available map; despite promises of expansions, over the years not a single dlc arrived,
- missing space adventures
- no galaxy travel mode, only teaser trailers of what you can do in other "universes"
- developers don't respond to complaints
- despite the diversity of species and buildings, at first sight the world looks too generic
- instead of new features, bots with mind manipulation, A/B testing and data harvesting were introduced
- death anti-cheat mode installed
No documentation available. You often have to track down the original dev, and then get guidance by phone from the best of his memory.
-
Alright, I found the root of the issue from my last project post.
It’s the damn MEM indicator.
In my project, I want to be able to load presets from an SD card, in order to let the user switch between preset values and values determined and calculated from a potentiometer.
So it's pretty hard to discern when you're using the memory and when you aren't without some kind of indication.
Every fucking time I try to put up that indicator (a red "MEM" at the bottom of the screen), it completely fucks the entire display.
I HAVE NO IDEA WHY!!!
It might be a VRAM space issue... idk.
Will probably use an LED for indication instead.
Sometimes I feel god forgot to write a proper toggle command for me.
For others it is random; for me it is static. One sad life. The only hope is the system running out of memory, because it is recursion with no ending.
Here is the dev-rant:
After fucking with Laravel Passport for 3 days, I finally managed to find a way to do multi auth.
Yeah! dude, I am the guy who is going to write a tutorial for that. So, you must -- this rant.