Search - "memory"
-
So this guy passed large objects as function arguments directly instead of passing them by reference. What a jackass. So the program was slow as fuck AND taking up too much memory.
So yeah, I'm basically ranting about myself.2 -
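The rant doesn't say which language, but it reads like a copy-vs-reference mistake. As a rough TypeScript analogue (an assumption on my part, since JS objects are always passed by reference), compare handing a function a read-only reference with needlessly deep-copying a large object first:

```typescript
interface Row { id: number; payload: string }

// Hypothetical large in-memory dataset (~10 MB of strings).
const bigTable: Row[] = Array.from({ length: 100_000 }, (_, i) => ({
  id: i,
  payload: "x".repeat(100),
}));

// Wasteful: deep-copies the whole table before reading it,
// doubling memory use and burning CPU on the clone.
function sumIdsWithCopy(rows: Row[]): number {
  const copy = structuredClone(rows); // full duplicate (Node 17+ / modern browsers)
  return copy.reduce((acc, r) => acc + r.id, 0);
}

// Better: accept a read-only reference and never copy.
function sumIds(rows: readonly Row[]): number {
  return rows.reduce((acc, r) => acc + r.id, 0);
}

console.log(sumIdsWithCopy(bigTable) === sumIds(bigTable)); // true, but one of them paid for a clone
```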
A repressed memory just popped into my head:
At my former job I tried to explain a problem I was having to the tech lead. Then, without fully understanding the problem, he decided to rewrite my code that I had been working on for weeks. His code, that took him 2 days to write, went straight to master without peer review.
He introduced about 10 regressions…
Cue the client meeting where the client says “These bugs came back, and we thought they were fixed already…” (They demo the bugs)
So obviously I say “I’ll let Techlead address that one.”
He just mumbles some stuff, and goes quiet for the rest of the meeting. Finally, when the meeting was wrapping up we hear “It’s Fixed!”
Everyone was like ???
“That bug from earlier, it’s fixed, it should work now….”
Would you believe this guy decided to code during the entire meeting, clearly missing important feedback and information that would help him understand the problem. Again, pushing to master without review….
Not to mention that we were talking about 10 regressions…6 -
Behold the PHP pyramid of doom!
You know what kind of code is coming... a big pile of shite! 😍
Obviously you have to return by reference (&) because of performance and memory reasons. ☝️🤓
Man... I've seen code...23 -
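The screenshot isn't reproduced here, and the sketch below is TypeScript rather than PHP, but the general shape of a "pyramid of doom" and the usual guard-clause flattening look roughly like this (all names are made up):

```typescript
interface User { name: string; active?: boolean; cart?: { items: string[] } }

// Pyramid of doom: every check adds another level of nesting.
function checkoutNested(user?: User): string {
  if (user) {
    if (user.active) {
      if (user.cart) {
        if (user.cart.items.length > 0) {
          return `Charging ${user.name} for ${user.cart.items.length} items`;
        } else {
          return "Cart is empty";
        }
      } else {
        return "No cart";
      }
    } else {
      return "User inactive";
    }
  } else {
    return "No user";
  }
}

// Same logic flattened with early returns (guard clauses).
function checkout(user?: User): string {
  if (!user) return "No user";
  if (!user.active) return "User inactive";
  if (!user.cart) return "No cart";
  if (user.cart.items.length === 0) return "Cart is empty";
  return `Charging ${user.name} for ${user.cart.items.length} items`;
}

console.log(checkoutNested({ name: "Ada", active: true, cart: { items: ["book"] } }));
console.log(checkout({ name: "Ada", active: true, cart: { items: ["book"] } }));
```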
Yeah Mozilla fuck merit and fuck you too!
This, this is what I was talking about when the fucking CoC came out and everyone (including its author) started using it as a political weapon.
You castrated fucking virgins! Mozilla, I want to support you I really don't like chrome but you always manage to disappoint everyone. I'm tired, tired of you morally superior socialists infecting my fucking workplace, entertainment and news.
This is just an excuse for lazy assholes to have their cake and eat it too and it's damn fucking INSULTING to us "minorities", I can work to get nice things just like anyone else bitch! having another skin color is not a disability!
Worst of all, you seem to have straight out millennial retards making these decisions seeing as it's based on an article from a washed up "gender research" professor that thinks Barbie Doctor is problematic, the most biased and dumb source you can possibly pull out of your ass.
Two classmates were murdered this morning, do you really think we care about what your diversity and inclusion Dept thinks is problematic? You delusional halfwits, the only comforting thought is that your soft bigotry will perish alongside your product when it inevitably diminishes its quality for the sake of "equality".
Want to make better products? Ditch your useless diversity and inclusion department and start optimizing the memory consumption on firefox.
Want to help minorities? Start paying your outsourced developers decently.
I hope this helps people who thought including politics in software development wouldn't have dire consequences to open their eyes; if not, oh well I guess people will get it when mozilla keeps going down the drain and they get fired because they just outsourced their work in the name of "diversity" just to save money.
https://blog.mozilla.org/inclusion/...100 -
I can't name one specific time that was the best memory per se. My team is super close - we get lunch almost every day, have (inappropriate) inside jokes and just act dumb together. Not one person on my team who doesn't fit in. Even our boss is cool. :) #LoveMyTeam4
-
Long long ago there was a man who discovered if he scratched certain patterns onto a rock he could use them to remind him about things he would otherwise forget.
Over time the scratchings were refined and this great secret of eternal memory was taught to his children, and they taught it to their children.
Soon mankind had discovered a way to preserve through the ages his thoughts and memories and further discovered that if he wrote down these symbols he could transfer information over distances by simply recording these symbols in a portable medium.
Writing exploded; it allowed a genius in one place to communicate the information he had recorded across time and space.
Thousands of years passed; writing continued to be refined and became more and more vital. Eventually a humble man by the name of Johannes Gutenberg, seeking to make the divine word of God accessible to the people, created the printing press, allowing the written word to be copied and circulated with great ease, vastly expanding the works available to mankind and the number of people who could understand this arcane art of writing.
But mankind, never satiated in his desire to know all there is to know, demanded more information, demanded it faster, demanded it better. So the greatest minds of 200 years, Marconi, Maxwell, Bohr, von Neumann, Turing and a host of others, working with each other, standing on the shoulders of their Brobdingnagian predecessors, brought forth a way to send these signals, to transfer this writing upon beams of light, by manipulating the very fabric of the cosmos. Mankind had reached the ultimate limits of the transmission of information. Man has conquered time, and space itself, in preserving and transmitting information; we are as the gods!
My point is this, that your insistence upon having a meeting to ask a question, with 10 people that could've been answered with a 2 sentence email, is not only an affront to me for wasting my time, but also serves as an affront to the greatest minds of the 19th and 20th centuries, it is an insult to your ancestors who first sacrificed and labored to master the art of writing, it is in fact offensive to all of humanity up to this point.
In short, by requiring a meeting to be held, not only are you ensuring the information is delayed because we all now need to find a time that all of us are available, not only are you eliminating the ability to have a first hand permanent record of what needs to be communicated, you are actively working against progress, you are dragging humanity collectively backwards. You join the esteemed ranks of organizations such as the oppressive Catholic church that sought to silence Galileo and Copernicus, you are among the august crowd that burned witches at Salem, the Soviet secret police that silenced "bourgeoisie" science, you join the side of thousands of years of daft ignorance.
If it were not for you people we would have flying cars, we would have nanobots capable of building things on a whim, we would all be programming in lisp. But because of you and people like you we are trapped in this world, where the greatest minds are trapped in meetings that never end, where mistruth and ignorance run rampant, a world where JavaScript is the de facto language of choice every where because it runs everywhere, and ruins everywhere.
So please remember, next time you want to have a meeting ask yourself first. "Could this be an email?" "Do I enjoy burning witches?" if you do this you might make the world a little bit of a less terrible place to be.6 -
Consequences Associated with Burnout:
- sleep deprivation ✅
- change in eating habits ✅
- increased illness due to weakened immune system ✅
- difficulty concentrating and poor memory/attention ✅
- lack of productivity ✅
- poor performance ✅
- avoidance of responsibilities ✅
- loss of enjoyment ✅
Have I just been burnt out and living it as my norm for the past 5 years? 🤡3 -
What an absolute fucking disaster of a day. Strap in, folks; it's time for a bumpy ride!
I got a whole hour of work done today. The first hour of my morning because I went to work a bit early. Then people started complaining about Jenkins jobs failing on that one Jenkins server our team has been wanting to decom for two years but management won't let us force people to move to new servers. It's a single server with over four thousand projects, some of which run massive data processing jobs that last DAYS. The server was originally set up by people who have since quit, of course, and left it behind for my team to adopt with zero documentation.
Anyway, the 500GB disk is 100% full. The memory (all 64GB of it) is fully consumed by stuck jobs. We can't track down large old files to delete because du chokes on the workspace folder with thousands of subfolders with no RAM to spare. We decide to basically take a hacksaw to it, deleting the workspace for every job not currently in progress. This of course fucked up some really poorly-designed pipelines that relied on workspaces persisting between jobs, so we had to deal with complaints about that as well.
So we get the Jenkins server up and running again just in time for AWS to have a major incident affecting EC2 instance provisioning in our primary region. People keep bugging me to fix it, I keep telling them that it's Amazon's problem to solve, they wait a few minutes and ask me to fix it again. Emails flying back and forth until that was done.
Lunch time already. But the fun isn't over yet!
I get back to my desk to find out that new hires or people who got new Mac laptops recently can't even install our toolchain, because management has started handing out M1 Macs without telling us and all our tools are compiled solely for x86_64. That took some troubleshooting to even figure out what the problem was because the only error people got from homebrew was that the formula was empty when it clearly wasn't.
After figuring out that problem (but not fully solving it yet), one team starts complaining to us about a Github problem because we manage the github org. Except it's not a github problem and I already knew this because they are a Problem Team that uses some technical authoring software with Git integration but they only have even the barest understanding of what Git actually does. Turns out it's a Git problem. An update for Git was pushed out recently that patches a big bad vulnerability and the way it was patched causes problems because they're using Git wrong (multiple users accessing the same local repo on a samba share). It's a huge vulnerability so my entire conversation with them went sort of like:
"Please don't."
"We have to."
"Fine, here's a workaround, this will allow arbitrary code execution by anyone with physical or virtual access to this computer that you have sitting in an unlocked office somewhere."
"How do I run a Git command I don't use Git."
So that dealt with, I start taking a look at our toolchain, trying to figure out if I can easily just cross-compile it to arm64 for the M1 macbooks or if it will be a more involved fix. And I find all kinds of horrendous shit left behind by the people who wrote the tools that, naturally, they left for us to adopt when they quit over a year ago. I'm talking entire functions in a tool used by hundreds of people that were put in as a joke, poorly documented functions I am still trying to puzzle out, and exactly zero comments in the code and abbreviated function names like "gars", "snh", and "jgajawwawstai".
While I'm looking into that, the person from our team who is responsible for incident communication finally gets the AWS EC2 provisioning issue reported to IT Operations, who sent out an alert to affected users that should have gone out hours earlier.
Meanwhile, according to the health dashboard in AWS, the issue had already been resolved three hours before the communication went out and the ticket remains open at this moment, as far as I know.5 -
In my last job they required us to turn on a task timer for every little thing. Remembering to do that, and to turn it off, was a royal pain. First I had to look up which task it is, start the timer, stop the timer, find the next task and repeat, then flip back to the first task. Lots of open browser tabs within tab groups to keep track of it all. And if I came up short or went over on budget, there was a “conversation” with management to account for discrepancies. Then I had to go by memory and try to reconstruct the “missing time” accurately enough to be convincing.
Now that I’m freelancing, I try to keep up the habit because it does have merit for tracking estimates and actuals, but now it’s just me to answer to for discrepancies and I can fudge the numbers as I see fit. The time records did, however, save my bacon in a recent dispute.5 -
Boss: Our app is too memory consuming and heavyweight. We have to do something because we will have hundreds of thousands of users.
Dev: Yes, there are a lot of legacy parts which leave plenty of space for optimization. Every query has to be carefully analyzed. Some can be avoided altogether.
Boss: We pay externals to do some clustering with our app.5 -
I was young and stupid. Remember floppy disks? Yeah, we still had them when I was studying. Went to a computer cafe, rented a PC and DOS was already booted up (I'm an old fucker). I didn't want to reboot because PC rental was metered. I inserted my floppy disk and got infected by a virus that deleted my work. No git back then and my backup was on the same disk (fuck me). Back home, rewrote the whole thing from memory.
I got mad and wanted revenge. De-constructed a floppy disk, replaced the magnetic media with sandpaper and went to each and every PC on that computer cafe.
It was closed for day.3 -
Not a coworker, but this guy who I went to uni with and was a real life saver when I was really down. (we played minecraft together)
... So, he is a real genius. One of those guys who I legit couldn't keep up with. His brain works, he doesn't bullshit his way through, he's not pretentious, he is legit a down to earth rare genius. Yet, he doesn't use his talents enough, he likes to work or go home to play minecraft. And he doesn't politically care enough, so I am almost sure that he will end up getting stuck in the defence force.
We're still friends. And I try my hardest to not be nosy and nag at him that he can do better. I mean, he is happy the way he is, and he is not ambitious. But the memory of him is a reminder that not everyone who gets somewhere is the best and brightest.36 -
I've got a file on my desktop called key.txt, and it's just a single line in it that is clearly some sort of API key.
Absolutely no memory of what it is for.
💩9 -
The Mac Studio with 128 GB integrated memory looks very interesting, I could finally run a third Electron app next to Slack and Spotify.6
-
Is obsidian a fucking joke?
Seriously, is it a joke? Why would you ever care so much about indexing literally everything, if the entire thing crashes and/or takes >5min to LITERALLY just open the fucking directory and/or (so help you) if that directory is full of projects/repos or whatever the fuck and the total size of said directory is like >5GB.
WHY THE FUCK WOULD YOU INDEX EVERYTHING? -- "Ohh obsidian's not supposed to be used a fully fledged IDE, ohh obsidian should just handle MD files and normal sized projects, ohh the plugins and ease-of-use" -- Fuck.
There's no fucking real reason to index everything, BY DEFAULT. You open a directory with Obsidian? Doesn't matter, it's 1 byte, it's 100GB, you get indexed. Deal with it. It will use LITERALLY every resource your computer has. I'm surprised it doesn't go galaxy brain and ping if any other computers/devices are on the network and then attempt to connect and use their hardware (obsidian can be like a node!).
How shit can you be at understanding basic data structures and algorithms, where you just revert to based google-chrome brain and let the FUCKING TEXT EDITOR -- OBSIDIAN IS A FUCKING TEXT EDITOR HOLY SHIT -- hog all conceivable memory.
I swear to <some-deity> if anyone fucking says "Ohhhhhhhh actually, it's not a text editor, it has plugins and features and shit, it does all dis cool stff", OR, "Ohhhhh actually, obsidian indexes things for a very specific/rationale/apt/pragmatic/academic reason" OR "ohhhh, I have 100 iphones, 1000 ipads and a trillion desktop computers that each have 256GB of memory, why you hating on obsidian?" then go kick rocks. The fucking lot of you. Are you fucking kidding me.8 -
I do not like the direction laptop vendors are taking.
New laptops tend to feature fewer ports, making the user more dependent on adapters. Similarly to smartphones, this is a detrimental trend initiated by Apple and replicated by the rest of the pack.
As of 2022, many mid-range laptops feature just one USB-A port and one USB-C port, resembling Apple's toxic minimalism. In 2010, mid-class laptops commonly had three or four USB ports. I have even seen an MSi gaming laptop with six USB ports. Now, much of the edges is wasted "clean" space.
Sure, there are USB hubs, but those only work well with low-power devices. When attaching two external hard drives to transfer data between them, they might not be able to spin up due to insufficient power from the USB port or undervoltage caused by the impedance (resistance) of the USB cable between the laptop's USB port and hub. There are USB hubs which can be externally powered, but that means yet another wall adapter one has to carry.
Non-replaceable batteries (the shortest-lived component) mean difficult repairs and no more reserve batteries, as well as no extra-sized battery packs. When the battery expires, one might have to waste four hours at a repair shop for a replacement that would have taken a minute on a 2010 laptop.
The SD card slot is being replaced with inferior MicroSD or removed entirely. This is especially bad for photographers and videographers who would frequently plug memory cards into their laptop. SD cards are far more comfortable than MicroSD cards, and no, bulky external adapters that reserve the device's only USB port and protrude can not replace an integrated SD card slot.
Most mid-range laptops in the early 2010s also had a LAN port for immediate interference-free connection. That is now reserved for gaming-class / desknote laptops.
Obviously, components like RAM and storage are far more difficult to upgrade in more modern laptops, or not possible at all if soldered in.
Touch pads increasingly have the buttons underneath the touch surface rather than separate, meaning one has to be careful not to move the mouse while clicking. Otherwise, it could cause an unwanted drag-and-drop gesture. Some touch pads are smart enough to detect when a user intends to click, and lock the movement, but not all. A right-click drag-and-drop gesture might not be possible due to the finger on the button being registered as touch. Clicking with short tapping could be unreliable and sluggish. While one should have external peripherals anyway, one might not always have brought them with. The fallback input device is now even less comfortable.
Some laptop vendors include a sponge sheet that they want users to put between the keyboard and the screen before folding it, "to avoid damaging the screen", even though making it two millimetres thicker could do the same without relying on a sponge sheet. So they want me to carry that bulky thing everywhere around? How about no?
That's the irony. They wanted to make laptops lighter and slimmer, but that made them adapter- and sponge sheet-dependent, defeating the portability purpose.
Sure, the CPU performance has improved. Vendors proudly show off in their advertisements which generation of Intel Core they have this time. As if that is something users especially care about. Hoo-ray, generation 14 is now yet another 5% faster than the previous generation! But what is the benefit of that if I have to rely on annoying adapters to get the same work done that I could formerly do without those adapters?
Microsoft has also copied Apple in demanding internet connection before Windows 11 will set up. The setup screen says "You will need an Internet connection…" - no, technically I would not. What does technically stand in the way of Windows 11 setting up offline? After all, previous Windows versions like Windows 95 could do so 25 years earlier. But also far more recent versions. Thankfully, Linux distributions do not do that.
If "new" and "modern" mean more locked-in and less practical and difficult to repair, I would rather have "old" than "new".13 -
Is this learning job cpu intensive or memory intensive?
I don't know and I don't give a flying fuck, because it's 6:20pm and I have not found any of my favorite servers free to rerun this shit the whole fucking week, so this server (which I have actually killed before, btw) can suck a dick and do its fucking job.
🎤🖐️11 -
Q: What's a "muscle memory"?
A: It's when you open up devRant, skim through several posts, get bored, decide to visit some other website for more stimulation, close the tab with devRant, open a new tab and your hands type in devrant.com [ENTER] before you know it6 -
long time ago....
Feature request: We want an android backup solution in Our app!
UI guy has already developed it, you just need to see if his solution is solid!
Ok then - lets look at the UI: Nice progress bars, that turn into green checkmarks. Looks good.
Now lets look at the code: ... Ok. loading some files into memory.... and... dafuq? does not write to a file?
Backup to RAM. With no restore. 🤦♂️.3 -
I'm convinced that playing the piano has allowed me to type faster and commit keyboard shortcuts to muscle memory faster too. While coding isn't about typing quickly, there's a whole bunch of times when I've had an idea, and had to get that down into code as quickly as possible before I forget it - and that's when I really find fast keyboard work comes into its own.5
-
A memory I wanted to share just came up from reading another rant about the static keyword. It involved a network programming assignment in Java back in my heyday.
A fellow student was told that a static member is shared between every object in a class and decided that they could use that to implement network communication (i.e. if they ran the same Java program on different machines, they'd be able to communicate by reading from and writing to the same static fields).
I have a memory of sitting in the corner of the lab overhearing the tutor lose their mind trying (unsuccessfully) to explain why this didn't work.5 -
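The static field really is shared, but only across instances inside one running process; each JVM (or each Node process) on each machine gets its own copy, so nothing ever crosses the network. A TypeScript analogue of the same misconception (the original assignment was Java; this is just the same idea in sketch form):

```typescript
// A static field lives once per *process*, not once per network.
class Mailbox {
  static lastMessage = ""; // shared by all Mailbox instances in this process

  send(msg: string): void {
    Mailbox.lastMessage = msg;
  }
}

const a = new Mailbox();
const b = new Mailbox();
a.send("hello");
console.log(Mailbox.lastMessage); // "hello" — b "sees" it too, same process memory
console.log(a === b);             // false — sharing is via the class, not the instances

// Run this same script on two machines (or even in two terminals):
// each process gets its own Mailbox.lastMessage. Writing to it here
// changes nothing in the other process — that's what sockets are for.
```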
My first contact with an actual computer was the Sinclair ZX80, a monster with 512 bytes of RAM (as in 1/2 kbyte)
It had no storage so you had to enter every program every time, and it was programmed in BASIC using key combinations; you could not just type the commands out, since it did not have enough memory to keep the full text in memory.
So you pressed the cmd key along with one of the letter keys and possibly shift to enter a command, like cmd+p for print, and it stored a byte code.8 -
Problems with redis... timeout everywhere...
30k READs per minute.
Me : Ok, How much ram are we actually using in redis ?
Metrics : Average : 30 MB
Me ; 30 MB, sure ? not 30 GB ?
Metrics : Nop, 30 MB
Me : fuck you redis then, hey memory cache, are you there ?
Memory cache : Yep, but only for one instance.
Me ok. So from now on you Memory cache is used, and you redis, you just publish messages when key should be delete. Works for you two ?
Memory cache and redis : Yep, but nothing out of the box exists
Me : Fine... I'll code it myself with blackjack and hookers.
Redis : Why do I exist ?2 -
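For what it's worth, the setup described (a per-instance in-memory cache, with Redis kept around only to broadcast invalidations) looks roughly like this. A sketch assuming the node-redis v4 client; the channel name and the load/save callbacks are made up:

```typescript
import { createClient } from "redis";

// In-process cache: fast, but local to this one instance.
const localCache = new Map<string, string>();
const INVALIDATE_CHANNEL = "cache:invalidate"; // hypothetical channel name

const redis = createClient({ url: "redis://localhost:6379" });
const subscriber = redis.duplicate(); // pub/sub needs its own connection

export async function start(): Promise<void> {
  await redis.connect();
  await subscriber.connect();

  // Every instance listens; when a key changes anywhere, drop the local copy.
  await subscriber.subscribe(INVALIDATE_CHANNEL, (key) => {
    localCache.delete(key);
  });
}

export async function get(key: string, load: () => Promise<string>): Promise<string> {
  const hit = localCache.get(key);
  if (hit !== undefined) return hit; // served from memory, no Redis round trip
  const value = await load();        // e.g. a database query
  localCache.set(key, value);
  return value;
}

export async function set(
  key: string,
  value: string,
  save: (v: string) => Promise<void>,
): Promise<void> {
  await save(value);
  localCache.set(key, value);
  // Tell every other instance to forget its stale copy.
  await redis.publish(INVALIDATE_CHANNEL, key);
}
```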
MTP is utter garbage and belongs to the technological hall of shame.
MTP (media transfer protocol, or, more accurately, MOST TERRIBLE PROTOCOL) sometimes spontaneously stops responding, causing Windows Explorer to show its green placebo progress bar inside the file path bar which never reaches the end, and sometimes to whiningly show "(not responding)" with that white layer of mist fading in. Sometimes lists files' dates as 1970-01-01 (which is the Unix epoch), sometimes shows former names of folders prior to being renamed, even after refreshing. I refer to them as "ghost folders". As well known, large directories load extremely slowly in MTP. A directory listing with one thousand files could take well over a minute to load. On mass storage and FTP? Three seconds at most. Sometimes, new files are not even listed until rebooting the smartphone!
Arguably, MTP "has" no bugs. It IS a bug. There is so much more wrong with it that it does not even fit into one post. Therefore it has to be expanded into the comments.
When moving files within an MTP device, MTP does not directly move the selected files, but creates a copy and then deletes the source file, causing both needless wear on the mobile device's flash memory and the loss of files' original date and time attribute. Sometimes, the simple act of renaming a file causes Windows Explorer to stop responding until unplugging the MTP device. It actually once unfroze after more than half an hour where I did something else in the meantime, but come on, who likes to wait that long? Thankfully, this has not happened to me on Linux file managers such as Nemo yet.
When moving files out using MTP, Windows Explorer does not move and delete each selected file individually, but only deletes the whole selection after finishing the transfer. This means that if the process crashes, no space has been freed on the MTP device (usually a smartphone), and one will have to carefully sort out a mess of duplicates. Linux file managers thankfully delete the source files individually.
Also, for each file transferred from an MTP device onto a mass storage device, Windows has the strange behaviour of briefly creating a file on the target device with the size of the entire selection. It does not actually write that amount of data for each file, since it couldn't do so in this short time, but the current file is listed with that size in Windows Explorer. You can test this by refreshing the target directory shortly after starting a file transfer of multiple selected files originating from an MTP device. For example, when copying or moving out 01.MP4 to 10.MP4, while 01.MP4 is being written, it is listed with the file size of all 01.MP4 to 10.MP4 combined, on the target device, and the file actually exists with that size on the file system for a brief moment. The same happens with each file of the selection. This means that the target device needs almost twice the free space as the selection of files on the source MTP device to be able to accept the incoming files, since the last file, 10.MP4 in this example, temporarily has the total size of 01.MP4 to 10.MP4. This strange behaviour has been on Windows since at least Windows 7, presumably since Microsoft implemented MTP, and has still not been changed. Perhaps the goal is to reserve space on the target device? However, it reserves far too much space.
When transfering from MTP to a UDF file system, sometimes it fails to transfer ZIP files, and only copies the first few bytes. 208 or 74 bytes in my testing.
When transfering several thousand files, Windows Explorer also sometimes decides to quit and restart in midst of the transfer. Also, I sometimes move files out by loading a part of the directory listing in Windows Explorer and then hitting "Esc" because it would take too long to load the entire directory listing. It actually once assigned the wrong file names, which I noticed since file naming conflicts would occur where the source and target files with the same names would have different sizes and time stamps. Both files were intact, but the target file had the name of a different file. You'd think they would figure something like this out after two decades, but no. On Linux, the MTP directory listing is only shown after it is loaded in entirety. However, if the directory has too many files, it fails with an "libmtp: couldn't get object handles" error without listing anything.
Sometimes, a folder appears empty until refreshing one more time. Sometimes, copying a folder out causes a blank folder to be copied to the target. This is why on MTP, only a selection of files and never folders should be moved out, due to the risk of the folder being deleted without everything having been transferred completely.
(continued below)29 -
It's rant time again. I was working on a project which exports data to a zipped csv and uploads it to s3. I asked colleagues to review it, I guess that was a mistake.
Well, two of my lesser-known colleagues reviewed it and one of the complaints they had is that it wasn't typescript. Well yes, good thing you have EYES, i'm not comfortable with typescript yet so I made it in nodejs (which is absolutely fine)
The other guy said that I could stream to the zip file, which I didn't know was possible, so I said that's impossible right? (I didn't know some zip algorithms work on streams). And he kept brushing over it and talking about why I should use streams and why. I obviously have used streams before, and if he had read my code he could see that my code streamed everything to the filesystem and afterwards to s3. He continued to behave like I was a literal child who just used nodejs for 2 seconds. (I'm probably half his age so fair enough). He also assumed that my code would store everything in memory, which also isn't true if he had read my code...
Never got an answer out of him and had to google myself and research how zlib works while he was sending me obvious examples of how streams work. Which annoyed me because I asked him a very simple question.
Now the worst part, we had a dev meeting and both colleagues started talking about how they want that solutions are checked and talked about beforehand while talking about my project as if it was a failure. But it literally wasn't lol, i use streams for everything except the zipping part myself because I didn't know that was possible.
I was super motivated for this project but fuck this shit, I'm not sure why it annoys me so much. I wanted good feedback not people assuming because I'm young I can't fucking read documentation and also hate that they brought it up specifically pointing to my project, could be a general thing. Fuck me.3 -
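For anyone wondering about the "zip algorithms work on streams" part: Node's built-in zlib can gzip a stream on the fly, so the CSV can go from its source through compression straight into an S3 multipart upload without ever being buffered whole in memory or written to local disk. A sketch assuming the AWS SDK v3 (`@aws-sdk/lib-storage`); the bucket, key, and row generator are placeholders, and a true .zip archive (rather than .gz) would need a library such as archiver, but the streaming idea is the same:

```typescript
import { createGzip } from "node:zlib";
import { PassThrough, Readable } from "node:stream";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

// Pretend this yields CSV lines from a database cursor.
async function* csvRows(): AsyncGenerator<string> {
  yield "id,name\n";
  for (let i = 0; i < 1_000_000; i++) yield `${i},user-${i}\n`;
}

async function exportToS3(): Promise<void> {
  const gzip = createGzip();
  const body = new PassThrough();

  // rows -> gzip -> PassThrough, all streaming, roughly constant memory.
  Readable.from(csvRows()).pipe(gzip).pipe(body);

  const upload = new Upload({
    client: new S3Client({}),          // region/credentials from the environment
    params: {
      Bucket: "my-export-bucket",      // placeholder
      Key: "exports/data.csv.gz",      // placeholder
      Body: body,
    },
  });

  await upload.done(); // multipart upload sends chunks as they arrive
}

exportToS3().catch(console.error);
```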
I'm really enjoying rust now. It was worth the struggles.
I was really surprised to see, a NodeJS server takes around 40-60MB of memory whereas Rust (Actix web) server takes around 500KB-2MB :O whoa! Awesome!3 -
Why even is Microsoft Teams?
Why does it suck so bad? Why is it a memory hog? Why does the ELECTRON desktop app not have native ARM64 support on either Windows or macOS? Why is it even an Electron app? Why does the web version not work with Safari (then again, barely anything more complex than my portfolio site works on Safari)? Why is the UI from 2016? Why is it preinstalled with Windows 11? Why is the pre-installed Windows 11 version a completely different entity? Why does the preinstalled Windows 11 version not work with calls on the school/work version of Teams?11
Just found this absolute 5 head, galaxy brain implementation in a piece of code which is called in a loop by a background scheduler which has performance issues.
There are 20+ properties, some of which recursively call other properties with the same implementation style in this class.
Constant out of memory errors have been reported for this software, I wonder why...15 -
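The actual code isn't shown, but the usual shape of this problem is getters that rebuild expensive data on every access and call each other, so one read inside the scheduler loop fans out into an avalanche of recomputation and allocations. A guessed-at TypeScript sketch of the anti-pattern and the lazy-caching fix:

```typescript
class ReportExpensive {
  // Every access rebuilds a big array, and the getters call each other,
  // so reading `summary` in a loop multiplies the work and the allocations.
  get rows(): number[] {
    return Array.from({ length: 500_000 }, (_, i) => i * 2);
  }
  get total(): number {
    return this.rows.reduce((a, b) => a + b, 0); // rebuilds rows again
  }
  get summary(): string {
    return `${this.rows.length} rows, total ${this.total}`; // and again, twice
  }
}

class ReportCached {
  // Compute once, reuse; the scheduler loop now touches one cached array.
  private _rows?: number[];
  get rows(): number[] {
    return (this._rows ??= Array.from({ length: 500_000 }, (_, i) => i * 2));
  }
  private _total?: number;
  get total(): number {
    return (this._total ??= this.rows.reduce((a, b) => a + b, 0));
  }
  get summary(): string {
    return `${this.rows.length} rows, total ${this.total}`;
  }
}

const report = new ReportCached();
for (let i = 0; i < 10; i++) console.log(report.summary); // rows built once, not on every access
```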
PM: this is our super fancy new CI/CD pipeline, it's the greatest. i expect you to learn and understand all this in no time.
devs: so i have to spend some more time on this topic because it's completely new to me and requires some learning...
PM: nooo, that's a super easy task with zero effort, my braindead hamster can do that in no time, so can i, and so can you! let's assign 1 story point for that.
~ 3 months later ~
also PM, after he has started developing as well: so i'm realizing there are many things that i have to learn, and it takes me some time. i haven't developed with C++ and <other tool stack> for a long time. by the way, you guys don't need to check for any quality right now, we need to deliver fast. it's okay when you have memory overflows, completely crappy code or poor architecture, it doesn't matter.
he even has a subtask for migrating his code from a VS project to our new project structure, since he refused to learn our pipeline right from the beginning and created a VS project instead. シ why is this a subtask? this job can be done in no time, my left vanishing twin named Klaus who has dyslexia and hates vim can solve this task in 20 seconds!!!!11
(and still no PR, not even a feature branch in our repo)2 -
Not entirely dev related, but...
I'm getting tired of (electrical, mechanical) engineers complaining about HW limitations like "oh this board only has 12 KB of flash memory" or "I can't make this thing move smoother because my CPU is only 16 MHz" Bitch, you can spend $500 on 3 servo motors, but you can't afford to pay an extra $5 to get a board with better specs to control them?8
I have to fix memory leaks in two jest test files of
2 FUCKING THOUSAND
lines of code each.
The End.15 -
So, I got this new coworker of mine, he's 5 years older than me and says he's got a loooot of experience with html, css and javascript. Nice. Boss wants us to blur some text on a website, I show him how to do it with css "filter: blur(4px)".
5 minutes later he's all proud and says "done". I think nice, check the code - and see this moron having put an <img> png in there, having blurred the text with photoshop. Short memory, anyone?2 -
Rant.
FUCK-WINDOWS-11! What crap! I got a gazillion CPUs, a shitload of memory and windows like:
”Don’t care! Gonna run slow and also not work at all…”12 -
"Education" system is based on memorization. If you're able to memorize stuff short term, you'll pass the exam and be considered as "educated" academic citizen, not as "memory efficient" citizen.
Let that sink in11 -
Hello guys. Today I bring you my list of top 3 programs that use too much memory
🪟 Windows 10+
⚛️ Web browsers, Electron
🐋 Docker containers
Honorable mention: ☕Java
The developers of those programs should put more effort into optimizing memory usage15 -
Nvidia at it again. After receiving backlash for trying to pass off a 4070 as a 4080/12GB, and "unlaunching" it, they did the same shit again.
This time with a 3060/8GB. Yes, as an RTX 3060, a well-established product with a lot of reviews, intentionally misleading the customers who think that a 3060 is always the regular 12GB model. And the new shit isn't even cheaper.
The main issue isn't the reduced amount of VRAM, it's cutting down the memory bus from 192 to 128 bits, that costs quite some performance.
So if you see a 3060 and think it might be a bargain, watch out that you don't accidentally end up with the "bait and switch" 8GB model.
Or even better, consider a 6650 XT that is both faster and cheaper than a 3060, and RT is lackluster on the small RTX cards anyway.7 -
News sites with infinite-scrolling are so damn annoying.
A new random article I am not interested in suddenly loads under the current news article when skimming through it by dragging the scroll bar, and then throws me far down into unknown territory due to the sudden change of the height of the page.
It also happens similarly on Imgur photo galleries: when I drag down the scroll bar to quickly seek through the images, the "explore posts" section suddenly loads hundreds of "trending" and "viral" (uninteresting junk and spam) photos under the gallery, and since this adds lots of height to the page, I get pulled right into it and my window is full of such posts. Both distracting and memory-consuming.
YouTube's infinite scrolling comments and video lists are acceptable as of writing, since they are on-topic, with no off-topic "trending" spam, and they do not load too much at once, which does not throw me down too far.
Quote from https://elite-strategies.com/infini... :
> The footer of a website is like the shoes of a person, it ties the whole outfit (or website) together. Footers are awesome because it gives you a chance to tell people where to go when they reach the bottom of the page.2 -
A few Challenges at my job:
- a CEO with zero tech skills and zero memory.
- a sysadmin with literal brain damage and epilepsy (but he's great, we just have had to learn how to deal with it)
- another (volunteer) sysadmin who we call @God on Slack and who usually only shows up in extreme crises.
- the budget of a tiny organization, the web traffic of a huge site.
- incoherent business logic subject to the whims of volunteers and the loudest users
- a main revenue stream that contradicts our main mission.
it's fun! woot.1 -
I have ADHD and anxiety, which means I can't smoke, drink coffee or drink alcohol because that fucks up my sleep and short-term and long-term memory badly for a few days in a row. ADD symptoms become unmanageable. Fuck my life. I guess I will have to cut all stimulants if I want to be able to function as a decent dev. I will have to cut most of my social circle because they won't understand me not going out for drinks... Fuck my life....14
-
This is the third part of my ongoing series "The Ballad of the Six Witchers and the Undocumented Java Tool".
In this part, we have the massive Battle of Sparks and Storms.
The first part is here: https://devrant.com/rants/5009817/...
The second part is here: https://devrant.com/rants/5054467/...
Over the last couple sprints and then some, The Witcher Who Writes and the Butchers of Jarfile had studied the decompiled guts of the Undocumented Java Beast and finally derived (most of) the process by which the data was transformed. They even built a model to replicate the results in small scale.
But when such process was presented to the Priests of Accounting at the Temple of Cash-Flow, chaos ensued.
This cannot be! - cried the priests - You must be wrong!
Wrong, the Witchers were not. In every single test case the Priests of Accounting threw at the Witchers, their model predicted perfectly what would be registered by the Undocumented Java Tool at the very end.
It was not the Witchers. The process was corrupted at its essence.
The Witchers reconvened at their fortress of Sprint. In the dark room of Standup, the leader of their order, wise beyond his years (and there were plenty of those), in a deep and solemn voice, there declared:
"Guys, we must not fuck this up." (actual quote)
For the leader of the witchers had just returned from a war council at the capitol of the province. There, heading a table boarding the Archpriest of Accounting, the Augur of Economics, the Marketing Spymaster and Admiral of the Fleet, was the Ciefoh Seat himself.
They had heard rumors about the Order of the Witchers' battles and operations. They wanted to know more.
It was quiet that night in the flat and cloudy plains of Cluster of Sparks and Storms. The Ciefoh Seat had ordered the thunder to stay silent, so that the forces of whole cluster would be available for the Witchers.
The cluster had solid ground for Hive and Parquet turf, and extended from the Connection River to farther than the horizon.
The Witcher Who Writes, seated high atop his war-elephant, looked at the massive battle formations behind.
The frontline were all war-elephants of Hadoop, their mahouts the Witchers themselves.
For the right flank, the Red Port of Redis had sent their best connectors - currency conversions would happen by the hundreds, instantly and always updated.
The left flank had the first and second army of Coroutine Jugglers, trained by the Witchers. Their swift catapults would be able to move data to and from the JIRA cities. No data point will be left behind.
At the center were thousands of Sparks mounting their RDD warhorses. Organized in formations designed by the Witchers and the Priestesses of Accounting, those armoured and strong units were native to this cloudy landscape. This was their home, and they were ready to defend it.
For the enemy could be seen in the horizon.
There were terabytes of data crossing the Stony Event Bridge. Hundreds of millions of datapoints, eager to flood the memory of every system and devour the processing time of every node on sight.
For the Ciefoh Seat, in his fury about the wrong calculations of the processes of the past, had ruled that the Witchers would not simply reshape the data from now on.
The Witchers were to process the entire historical ledger of transactions. And be done before the end of the month.
The metrics rumbled under the weight of terabytes of data crossing the Event Bridge. With fire in their eyes, the war-elephants in the frontline advanced.
Hundreds of data points would be impaled by their tusks and trampled by their feet, pressed into the parquet and hive grounds. But hundreds more would take their place. There were too many data points for the Hadoop war-elephants alone.
But the dawn will come.
When the night seemed darker, the Witchers heard a thunder, and the skies turned red. The Sparks were on the move.
Riding into the parquet and hive turf, impaling scores of data points with their long SIMD lances and chopping data off with their Scala swords, the Sparks burned through the enemy like fire.
The second line of the sparks would pick data off to be sent by the Coroutine Jugglers to JIRA. That would provoke even more data to cross the Event Bridge, but the third line of Sparks were ready for it - those data would be pierced by the rounds provided by the Red Port of Redis, and sent back to JIRA - for good.
They fought for six days and six nights, taking turns so that the battles would not stop. And then, silence. The day was won, all the data crushed into hive and parquet.
Short-lived was the relief. The Witchers knew that the enemy in combat is but a shadow of the troubles that approach. Politics and greed and grudge are all next in line. Are the Witchers heroes or marauders? The aftermath is to come, and I will keep you posted.4 -
Imagine you work in a mechanic’s shop. You just got trained today on a new part install, including all the task-specific tools it takes to install it.
Some are standard tools, like a screwdriver, that most people know how to use. Others are complicated, single-purpose tools that only work to install this one part.
It takes you a couple of hours compared to other techs who learned quicker than you and can do it in 20 minutes. You go to bed that night thinking “I’ve got this. I’ll remember how this works tomorrow and I’ll be twice as fast tomorrow as I was today.”
The next morning, you wake up retaining a working, useful memory of only about 5% on how to use the specialized tools and installation of the part.
You retrain that day as a review, but your install time still suffers in comparison. You again feel confident by the end of the day that you understand and go to bed thinking you’ll at least get within 10-20 minutes of the faster techs in your install.
The next morning, you wake up retaining a working, useful memory of only 10% on how to use the specialized tools.
Repeat until you reach 100% mastery and match the other techs in speed and efficiency.
Oops! Scratch that! We are no longer using those tools or that part. We’re switching to this other thing that somehow everyone already knows or understands quickly. Start over.
This has been my entire development career. I’m so tired.2 -
In highschool we went through something like a malware/phishing prevention course.
It was pretty cool tbh, we spent the whole hour in a virtual environment where you'd see common malware and phishing attempts, but the really fun part was that you could also "hack" other students.
Hacking them meant you could cause some things to happen on their "PC". One of those was showing a captcha on their screen, and they had to type a string of your choosing before they could access the rest of the "virtual computer" again.
You can probably guess where this is going.
I was the first who had the idea to mix a capital i and a small L, and I tested it on our teacher, who was also part of this environment and screen-sharing to the projector.
Thanks to sitting next to the projection I could see the pixels and I can confirm: same character, pixel perfect!
I will forever cherish the memory of the teacher begging me to undo the "hack" and the chaos that followed amongst my peers 😈
Also one of the exercises was stupid. Click on a phishing mail and enter your credentials in the form. I asked the teacher WTF kind of credentials they even wanted me to enter at microsooft.cum and they just said "the credentials obviously" so I think they got their karma🖕 -
What does GPT-3 tell us about how our brains work?
I just read an interesting article (link below) about how it does on the Turing test. I've had this inclination for a while that state of the art AI is "incomplete", in the sense that we have some of the systems to make AGI, but not all of them. One of the comments they make is that "GPT-3 often finds it easier to write code to solve a programming problem, than to solve the problem on one example input", and that hits the nail on the head. We can codify situations, describe the rules, put them in memory and run those rules in our head. We can manipulate the input to see how it'll change, we can spot from a problem statement what the rules are instead of focusing on what the answer is. Anyway, light bulb moment shared.
Link: https://lacker.io/ai/2020/...9 -
Request I saw today...
* A new, empty AWS Account
* The ability to run 120 high memory EC2 instances, including up to 80 instances of dl1.24xlarge, but don't worry, 40 of them will be spot instances. I'll probably just start with two m5.xlarge for simplicity.
* VPC Peering into our primary AWS network
* VPC Peering into a 3rd party's network (because we're paying them for this service)
* A couple cross-acount IAM roles
* Granting "AWS: AdministratorAccess" to said IAM roles
I'm a bit behind schedule, and this is urgent. When will you have this completed?2 -
Windows OS..
24GB RAM
Is 24GB enough? No worries, we've got virtual memory on disk too, haven't we; it will use that if we run out of real memory, won't it..
Well, no..
Get up to 23GB in use, and things start to break..
Today, oh it just shut off the main video card, no biggie..
I've another 3 monitors to work with, and tidy things up before I reboot and make everything work again..
And there was me thinking, oh I won't need 48GB, 24 will be fine..
CP/M was never like this..2 -
I can't believe fucking Google, aka the Mecca for SWEs, decided that the best they can do is shove all that RAM into our computers for Chromium. All of the major apps decided Electron was a good idea and now all of our computers are bloated with fatass memory hogs taking 600MB RAM.
Fuck you Goog.25 -
I hate people who think they are always right.
A coworker who seemed to be a friend turns out to be an emotionally needy narcissist who seems to think that he is a perfect human being and is the best example of how to live.
Long story short is that we did some bonding via alcohol and smoking cigarettes. Especially when I was in a bad period in my life where I had little self confidence, was in a bad financial situation and overshared many details about my personal life.
And yeah we also work as software devs in the same team but I started avoiding working with him directly, because due to his seniority he overcomplicates things a lot to the point where stuff gets postponed for months. Meanwhile I am a simple guy, I do my tasks and if they are not up to the standard I just work on the feedback until Im up to the standard, thats it. Its just a job for me, for him its a way of life and he considers himself to be basically an artist.
Hes always trying to prove me something, showing that the "long way" is the best way and so on. In reality I dont give a fuck about him. I live my own life and I have my own priorities. I work fulltime in one job, also I work part time as a freelancer and in total I make about 20 percent more than he does. Previously before this job I owned my own company where for 2 years I ran my own projects which generated a decent revenue. I know what is hard work and how to sacrifice myself in order to achieve results. I am more pragmatic and I have some limitations of what I can be good at (since I have a shitty working memory due to my ADHD). So I have systems in place and bottom line is that I earn a decent living and my skillset is different. Yeah I agree that in some ways he is better than me, but dude has such a massive inflated ego that now he thinks that he unlocked some sort of universal wisdom and now hes suddenly experienced in every field of life and his opinion is the right one.
This guy takes massive pride in how good a software engineer he is and in every topic or interaction he tries to one-up me. Which most of the time is just his preference or in order to gain a 0.0001 percent performance increase. Dude is basically a big walking ego and since "we are close now" his ego started bleeding into our personal relationship.
In my personal life, Im in a stable relationship, thinking of proposing soon and getting married. I already co-own an apartment with my current girlfriend. Everything is serious and planned, Im soon to be 30 years old. He is the same age but he still thinks hes young hot shit and all he cares about is getting shitfaced a couple times a week after work and he doesnt really have any other hobbies. He has a girlfriend but I dont see any future in there TBH.
So what I did now is I started putting some distance between us. No more drinking every week with him, maybe maximum once in 2 or 3 weeks. I started working from home more. Also I stopped sharing my personal life with him. Each time when he thinks he is right I just go along with it and dont even pay attention to his emotional manipulations. I just hope one day he fucks off completely and I wont give in to his gaslighting. Maybe in a few months I will be leaving this job, so I will never have to deal with him again.
Lesson learned: dont be vulnerable to coworkers who you bond together only via alcohol.3 -
Just reminiscing when back in the "old days" video games had cheats built in...
Proton men and the flying Dutchman IN AOE3...
Now have to find hackers to create trainer apps that change values in memory and bypass cheat detection...7 -
It's 2022 and Firefox still doesn't allow deactivating video caching to disk.
When playing videos from some sites like the Internet Archive, it writes several hundred megabytes to the disk, which causes wear on flash storage in the long term. This is the same reason cited for the use of jsonlz4 instead of plain JSON. The caching of videos to disk even happens when deactivating the normal browsing cache (about:config property "browser.cache.disk.enable").
I get the benefit of media caching, but I'd prefer Firefox not to write gigabytes to my SSD each time I watch a somewhat long video. There is actually the about:config property "browser.privatebrowsing.forceMediaMemoryCache", but as the name implies, it is only for private browsing. The RAM is much more suitable for this purpose, and modern computers have, unlike computers from a decade ago, RAM in abundance, which is intended precisely for such a purpose.
The caching of video (and audio) to disk is completely unnecessary as of 2022. It was useful over a decade ago, back when an average computer had 4 GB of RAM and a spinning hard disk (HDD). Now, computers commonly have 16 GB RAM and a solid-state drive (SSD), which makes media caching on disk obsolete, and even detrimental due to wear. HDDs do not wear down much from writing, since it just alters magnetic fields. HDDs just wear down from the spinning and random access, whereas SSDs do wear down from writing. Since media caching mostly involves sequential access, HDDs don't mind being used for that. But it is detrimental to the life span of flash memory, and especially hurts live USB drives (USB drives with an operating system) due to their smaller size.
If I watch a one-hour HD video, I do not wish 5 GB to be written to my SSD for nothing. The nonstandard LZ4 format "mozLZ4" for storing sessions was also introduced with the argument of reducing disk writes to flash memory, but video caching causes multiple times as much writing as that.
The property "media.cache_size" in about:config does not help much. Setting it to zero or a low value causes stuttering playback. Setting it to any higher value does not reduce writes to disk, since it apparently just rotates caching within that space, and a lower value means that it just rotates writing more often in a smaller space. Setting a lower value should not cause more wear due to wear levelling, but also does not reduce wear compared to a higher value, since still roughly the same amount of data is written to disk.
Media caching also applies to audio, but that is far less in size than video. Still, deactivating it without having to use private browsing should not be denied to the user.
The fact that this can not be deactivated is a shame for Firefox.2 -
MTP is complete garbage. I want mass storage back.
The media transfer protocol (MTP) occasionally discovers new creative ways of failure. Frequently, directory listings take minutes to load or fail to load at all, and it freezes up infinitely (until disconnected) when renaming an item, and I can not even do two things simultaneously.
While files are being moved, I can not browse pictures or watch videos from the smartphone.
Sometimes, files are listed with the date 1970-01-01 (Unix epoch) instead of their correct date. Sometimes, files do not appear at all, which makes it unsafe to move directories from the device.
MTP lacks random access. If I want to play a two-gigabyte 4K 2160p video and seek in the video, guess what: I need to copy it to my computer's local mass storage first because MTP lacks random access.
When transferring high numbers of files, MTP has to slooooowly enumerate (or "prepare" or "calculate the time of") them all, which might even take longer than mass storage would need for the entire process. This means MTP might start copying or moving the actual files when mass storage is already finished.
Today, the "preparing to move" process was especially slow: five minutes for around 150 files! How am I supposed to find out what caused this random malfunction?
MTP sometimes drives me insane. I want mass storage back, at least for the MicroSD memory card, which uses a widely supported file system.
Imagine a 2010 $100 Android phone is better at file transfer than a 2022 $1000 Android phone (or iPhone, for that matter).3 -
Boss needs certain stats pulled from the database once a year for a board meeting. This time I delegate it to a junior dba/sysadmin. He looks at my 3-year-old docs that I hastily jotted down and pasted, which included my rambling notes with results from way back then. Mostly they were just to jog my own memory, not to be a really neat, clean instruction guide. He does the queries correctly, but in the ticket for the boss he also pastes all my notes from the docs. Boss gets confused, "what is this other number, I don't get it?!" We have to have a meeting of the 3 of us and waste an hour or so just to figure out what went wrong, until finally I realize what the junior guy accidentally did. Moral of story: to avoid baffling the nontechs, always simplify, simplify, simplify. Alternate moral of story: before delegating a task that seems old hat to you, always review your notes/docs and make sure they're ready for someone else to use.2
-
At a startup where the software was built haphazardly because the developer thought he'd be the sole maintainer for life. The dude antagonized me at every turn and refused to help me familiarise myself with his code. He eventually left the majority of the work to me, and his dedication to work continued to dwindle until he threw in the towel
After his departure, we surprisingly grew fond of each other, discussing code concepts at length. He was in the habit of refusing to read any of the articles I sent him, or to answer open-ended questions, citing the claim that they required thinking and he was busy. I didn't take any of this to heart
But it accumulated and I deleted his number. I didn't bear him any ill wishes but it wasn't respectful to myself for him to remain in my space. One day, I was looking for a point raised during our conversations and went rummaging through our chats. Going down memory lane opened scars I'd long forgotten. I was embarrassed to see how I had forgotten all about it. I should never have had anything to do with someone like that
He contacted me for a favour just less than a year after I deleted his contact. I didn't even think of declining. But this evening, I randomly remembered how he saw a defect in my code, promised me that the code will fail in production and resisted all pleas to point out what it was. I don't know if I hate him for his dastardly acts. What I feel deepest is sadness/bitterness that I got to experience all that2 -
I always had this mentality that I shouldn't rely on a certain library or framework for my entire project, because what if they stop supporting it one day? (Yeah, I'm talking to u, vuetify.) That's why I came up with this code structure where, for everything I wanna do, I have a 'driver' library coded by myself that interacts with that third-party framework or library, so if they stop supporting it I can just change a couple of lines of code in my driver file and my codebase should be working again. But I feel like this 'driver' approach is not the most efficient in terms of memory usage. Do you guys think I should keep it simple and use those libraries directly, or is this actually not a bad approach?7
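(To make the idea concrete, here is a minimal sketch of the kind of 'driver' I mean; the package name and function below are made up, it's just the shape, not code from my project:)

// toastDriver.ts - the only file in the codebase that imports the third-party lib
import { showFancyToast } from "some-toast-lib"; // hypothetical package

export interface ToastDriver {
  show(message: string): void;
}

export const toastDriver: ToastDriver = {
  show(message) {
    // if the lib dies or changes its API, this is the only line that changes
    showFancyToast({ text: message, duration: 3000 });
  },
};

// everywhere else in the codebase:
// import { toastDriver } from "./toastDriver";
// toastDriver.show("Saved!");

As far as I can tell, the cost is one small wrapper object and one extra function call per feature, which shouldn't register on memory at all; the real cost is the maintenance of keeping the drivers in sync.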
-
every time i buy a new phone, i feel this sense of extreme regret :(
i bought a moto g 5g phone last year in feb, and it was so good. it didn't have any out-of-this-world cameras or funky stuff, but it gave decent performance and i couldn't want any other phone.
In October my mom's phone started giving issues so i bought a realme phone for her that was half my phone's price. i couldn't spend any more because otherwise she wouldn't take it. she accepted the cheaper phone and within 4 days she was cursing it. the phone had decent specs but would lag in certain apps like zoom, and wouldn't run some call recorder apps. in the end i swapped my phone with mom's since i didn't care about zoom or the recorder.
now this shit realme phone's memory is around 60% full of my stuff, and it's showing its limitations. this shit auto-relaunches insta after a few minutes of usage, probably because it runs short on runtime memory (a 4gb/128gb device getting memory shortages. nice). its video quality is shit and the camera rarely takes good pics.
the worst thing about smartphones today is how they over-optimise the ui. this insta issue and the auto call recorders not working are simply because of the realme skin running over stock android. i had similar issues with a xiaomi device i bought for my dad some time ago. (fortunately my dad is more medieval so that crap has not come back to me :'/ )
so overall i am buying a 3rd phone in 17 months.
This time it's Samsung f23 and am worried that it's also going to suck. i was this 🤏close to buying a pixel 6 or even an iphone coz i can afford them.
but the regret of buying such an expensive phone that will need replacement in 2 years made me rethink.
the only android os that has suited me best is stock, and as of now only 2 companies are making it: google and moto (*it's 100% aosp with 3 extra apps, but they can't say that, so they also state that they are not stock os). oneplus is also a brand that i have heard makes a good os, but recently i also heard that they have completely scrapped their os and are using oppo's software. plus, with the amount of tickets we get for notifications not working on oneplus, i am sure their optimization is extremely aggressive.
so everything between a moderately priced phone (that will need a replacement in 2 years) and a flagship felt unnecessary to me, so i went ahead with Samsung's shit phone. the f23 has almost the same specs as the moto, but it's again a heavily customised os. i wanna waste my money on trying a custom os and declaring it shitty.
most of my friends that use Samsung are fans of it, but they are also not very techy so i guess it suits them well. i am the guy who installs nova launcher on his device first thing, so let's see. from a 3rd-person p.o.v, i felt its screen and camera images were nice whenever i used their mobiles, so let's see what this brings to the table :(10 -
Was reading something about delusional disorder, and it got a bit scary cuz it made me question myself. Now, I tell you why.
I have a bad memory when it comes to trivial stuff. And I am, by occupation and therefore on a daily basis, creative and imaginative. Having a pretty strong imagination means that I often have to ask myself "did that really happen or did I imagine it?" And given my anxiety, I imagine all types of scenarios before they happen anyway. (Parallel universes got nothing on me 😎)
So, now I'm wondering, where is the line between imagination and delusion, and how can you say what's real and what's not, be it offline (distorted memory) or online (schizophrenia).
One idea could be that video recording could help confirm, but we read emotions and vibe in real-time, and often those can't be recorded.
... Idk. Maybe I'm overthinking it. ¯\_(ツ)_/¯
Thank you for reading my half-baked thoughts!6 -
Tell me you're a media-obsessed rube drone without telling me you're a media-obsessed rube drone. I'll start:
"SoFtWaRe JoB mArKeT iS hOrRiBlE aNd ShOwS nO sIgN oF rEcOvErY!!!"
hah, you mean those layoffs from that handful of frothed-over tech giants which had, I don't know, approximately ONE HUNDRED TIMES the amount of engineers they actually needed? I swear if i see this trope one more time i'm about to rage. can't wait until 2023 when this 'scare' will be but a memory. yes i'm muad'dib, golden path, worm god, whatever
but it's even simpler, you don't have to drink the spice:
- there are an estimated 205,741 people affected by the LaYoFfs (https://www.trueup.io/layoffs, actually a really cool site I just found)
- there are an estimated 3.87 MILLION software engineers, and that's just in the US, so even if every single person laid off were a US software engineer, that's only around 5% of the industry affected
so in short yes, you are a rube, i'll enjoy my multiple job offerings
should have been working on your craft instead of reading all those "news" articles. sheesh, i'd be scared to hire anyone for a software position who can't get a grip on simple numbers anyway6 -
I'm notoriously bad at Git. By that I mean I REALLY REALLY SUCK AT IT. And I have the curse of short memory and an even shorter ability to retain the how-to, muscle memory knowledge of things if too much time passes.
So, I was staring down the gullet of merging two separate repositories onto my local machine and then pushing the result to a remote server. Not having the benefit of someone else to bounce this off of, and always finding the usual Git docs too dense and obtuse, I turned to ChatGPT to help me sort it out.
Guys, where has this been all of my life? I know it's not perfect and it can make mistakes. I knew that going into it, so I made preparations in case this failed. BUT. IT. WORKED! I feel like it has put me into the Star Trek:TNG universe where I can say "Computer, do the thing." and it does that thing. Here's the prompt I used and which it answered perfectly.
"Play the role of a git coach. I have two git repositories. One is on Bitbucket. The other is on GitHub. The branch named "master" on Bitbucket has the latest code. The branch named "master" on GitHub needs to be updated to what's on the Bitbucket "master" branch. Please write the series of git commands that I will need to accomplish this."9 -
We specified a very optimistic setup for a data science platform for a client....
Minimum one machine with a 16 core CPU with 64GB RAM to process data.....
Client's IT department: Best we can do is an 8 core 16GB server.
Literally what I have on my laptop.
Data scientist doesn't use any out-of-memory data processing framework, e.g. Dask, despite being told it's the best way to be economical with memory; ipykernel kills the computation anyway because it runs out of memory.
Data scientist has a 64GB machine himself so he says it's fine.
Purpose of the server: rendered pointless.5 -
Browsers are so bloated. Why do they use so much memory?
*Closes 62 tabs*
Seriously why?! Damn you modern software engineers for making everything so bloated.6 -
As I keep saying, we should spend less time developing "better, safer" tools and practices and more time making sure the developers that use them know what they're doing. The bugs caused by lack of memory safety are rare (although often more critical) compared to the bugs caused by developers not paying proper attention to what their code does in the first place.
https://theregister.com/2023/01/...11 -
I can work with Angular, even though it's pain in the but.
My current Angular job is actually the first job where the manager has decent human values and ethics. I like my team, and yeah, what we're building is shit. But it's only 30% shit because of Angular, another 30% is due to SAFe, and the rest is the usual stuff.
Still enjoy my job and respect my team.
But please do not expect me to pretend Angular is on a comparable level to React. Angular hasn't brought any actual innovation in most major versions, but still releases those breaking major updates at least twice a year.
Ivy might be awesome, but just because Angular told the world 3 years ago to also ship Ivy-compatible compile targets for their libs/packages doesn't mean everybody cared.
And the ngcc, the awesome compatibility compiler, mutates node modules in place. So no parallel stuff, no using yarn2 or pnpm.
At the same time, React brought so many innovations into the frontend world but is basically backwards compatible.
Not sure how the Angular partial compilation and whatever needs to go on works, but it seems like there's hardly anyone that really knows, so you can't use Vite or whatever other new tool.
And sure, if you're really good, you can write Angular without producing memory leaks.
But it's really hard. Do you know what's also quite hard: Producing memory leaks with React!
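(The classic Angular footgun I mean, roughly; this is a sketch, not code from our app: an RxJS subscription that nobody tears down.)

import { Component, OnDestroy, OnInit } from "@angular/core";
import { interval, Subscription } from "rxjs";

@Component({ selector: "app-ticker", template: "{{ tick }}" })
export class TickerComponent implements OnInit, OnDestroy {
  tick = 0;
  private sub?: Subscription;

  ngOnInit() {
    // subscribe to something long-lived...
    this.sub = interval(1000).subscribe(n => (this.tick = n));
  }

  ngOnDestroy() {
    // ...and forget this one line: every destroyed component instance then
    // stays reachable through the subscription while the timer keeps firing
    this.sub?.unsubscribe();
  }
}

In React the equivalent cleanup is just the function you return from useEffect, which sits right next to the subscription itself and is much harder to forget.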
And for sure, Angular Universal, which isn't used by anyone, it feels like, will still be on a comparable level to an open source product that's used all over the world, builds the basis for an open source company, and is improved by thousand of issues day by day.
And sure, two kinds of change detection are a great idea. And yeah, pretending Angular comes with all included makes it worth it that the API is fucking huge and you're better of knowing nothing, because you have to read up things, than knowing quite a lot, since making assumptions and believing apis work in a similar way and follow similar contentions...
Whatever... I work with it. Like the time. Like the company, even my poss. But please don't expect my lying to you this was a good idea, or Angular is even remotely the same level of React.15 -
RethinkDB is the most ridiculous, overengineered, BIGGEST BULLSHIT I HAVE EVER UNFORTUNATELY USED.
Does anyone even use this total shit????
This shit eats RAM for just 1 CRUD operation as if you opened 10,000 google chrome tabs. Who the fuck thought that kind of technology was a good idea?
Yes it IS very fast, a real-time database. But you'd need a multi-million dollar supercomputer to be able to handle as much data as a relational database can....5 -
Friday 13th. Superstition.
0655, got WFH laptop going. 0700, VPN'ed in. Bluescreen, first in ages. Yes, Windows, the hatred is mutual. Rebooted. Windows claimed memory fault, offered check, 40 minutes. Noped out. Started machine. VPN'ed in. Some strange script error that I'd never seen before. Rebooted. Script error again. Shut down machine, then rebooted, same problem. 0715, fuck, still wearing sweaters, my e-scooter not charged, and an important Teams call at 0800.
Got dressed, stuffed laptop into backpack, hurried up by foot. Took the bus. Fuck, the next connection on the change station just had gone off. Took a taxi to make it. Arrived at the company, plugged in the laptop, started with no issues. Had the important call.
Took the laptop to IT. Tested it with external network connection and VPN. Worked with no script error. Had it checked for RAM issues. No issue. WTF had happened in the morning?!5 -
You know the PHP legacy code base is complete garbage when it requires a script memory limit of 1.5GB.9
-
Deployments, a limerick:
there once was an ops guy from New York,
who was working on deploying a fork.
the docs were weak
the code memory leaked
in a half hour all of production was borked.5 -
Got myself a new work computer. Aside from setting everything back up, it's been an absolute treat. I didn't even have to move to Windows 11.
Why Dell feels the need to put 7TB of garbage, including literal adware that spews notifications, escapes me. All it does is hurt their reputation.
I would have been allowed to build my own from scratch, but I didn't even ask since it's been so long since I built my last machine and I don't even know where to start hardware wise these days.
12th gen i7
GTX1080 that has all the video memory I could need
RAM just pouring out of the thing
I'm living the life.28 -
Sharing a first look at a prototype Web Components library I am working on for "fun"
TL;DR left side is pivot (grouped) table, right side is declarative code for it (Everything except the custom formatting is done declaratively, but has the option to be imperative as well).
====
TL;DR (Too long, did read):
I'm challenging myself to be creative with the cool new things that browsers offer us. Lani so far has a focus on extreme extensibility, abstraction from dependencies, and optional declarative style.
It's also going to be a micro CSS framework, but that's taking the back-seat.
I wanted to highlight my design here with this table, and the code that is written to produce this result.
First, you can see that the <lani-table> element is reading template, data, and layout information from its child elements. Besides the custom highlighting code (Yellow background in the "Tags" column, and green gradient in the "Score" column), everything can be done without opening even a single script tag.
The <lani-data-source> element is rather special. It's an abstraction of any data source, and you, as a developer can add custom data sources and hook up the handlers to your whim (the element itself uses the "type" attribute to choose a handler. In this case, the handler is "download" which simply sends a fetch request to the server once and downloads the result to memory).
Templates are stored in an html file, not string literals (Which I think really fucks the code) and loaded async, then cached into an object (so that the network tab doesn't get crowded, even if we can count on the HTTP cache). This also has the benefit of allowing me to parse the HTML templates once and then caching the parsed result in memory, so templates are never re-parsed from string no matter how many custom elements are created.
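(Roughly the shape of that cache, stripped way down; a sketch, not the actual Lani source:)

const templateCache = new Map<string, Promise<HTMLTemplateElement>>();

export function getTemplate(url: string): Promise<HTMLTemplateElement> {
  // each template file is fetched and parsed exactly once; caching the
  // promise also means parallel callers share the same in-flight request
  let entry = templateCache.get(url);
  if (!entry) {
    entry = fetch(url)
      .then(res => res.text())
      .then(html => {
        const template = document.createElement("template");
        template.innerHTML = html; // parsed once, right here
        return template;
      });
    templateCache.set(url, entry);
  }
  return entry;
}

// a custom element then just stamps out clones:
// const tpl = await getTemplate("/templates/lani-table.html");
// this.shadowRoot.append(tpl.content.cloneNode(true));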
Everything is "compiled" into a single, minified .js file that you include on your page.
I know it's nothing extraordinary, but for something that doesn't need to be compiled, transpiled, packaged, shipped, and kissed goodnight, I think it's a really nice design and I hope to continue work on it and improve it over time1 -
I’m struggling in studying and that’s seriously holding me back, regardless of the type of technical book I’m reading I’m always in a fight with my brain. Even if I enjoy the topic and then I’ll enjoy using what I read while I study I struggle to learn more than 1-2 chapters (sometimes even less) at time then my head starts to hurt, my focus drifts away and if I force myself to go ahead my brain just refuses to store the new informations, it feels like filling a full tank.
At this point I should have learned C++ and Swift and started contributing to projects which aren't overdone web apps, but all I have are two half-read books which silently "judge" me anytime I open my eBook library, and I dread returning to them, having associated them with headache and frustration. The only things I read this year were design patterns (which haven't found a single real-life use since then) and F# (which I never used except for some little demos, and which is now slowly fading from my memory).
Have you got any study advice to help me deal with this frustrating situation?3 -
When file managers copy and delete files within the same partition instead of moving or renaming them…
When Google's Storage Access Framework was introduced, it did not feature a move command, so file managers just resorted to copying and deleting files within the same storage. Not only is this much slower and a cause of needless wear, but it also destroys the date/time attribute (it gets changed to the current time).
When moving files through MTP (miserable transfer protocol, used for connecting smartphones to PC), they are also copy-deleted. This makes moving a 20-Gigabyte DCIM folder impractical. Also, if one cancels the operation, it might end up whoopsie-daisy deleting some files from the source before they have been transferred.
MTP is so bogus that it is incapable of a simple operation that would JustWork™ on mass storage devices. Not to mention, MTP lacks parallelism and its directory listing is S-L-O-W to load. Upwards of a minute for just 1000 files. Sometimes, it fails to load at all.
Also, trying to rename a file through MTP using the terminal through GVFS, even if just within the same folder, it copy-deletes it. If I want to rename a 1 GB 2160p 4K video in a highly populated DCIM folder, I can not do so through the terminal. At least, the 4K video has a time stamp in its internal metadata, but it still renames slowly and adds needless wear to the smartphone's flash memory.16 -
Does anyone remember BASIC?
10 PRINT "Hello World!"
20 GOTO 10
I learned it when I had my Commodore 64. Recently I've gotten the itch to dive back in the development world. So I'm refreshing my memory on HTML and CSS (yes I know they're not programming languages) then move on to JavaScript and either React or Angular. Hopefully I will be able to contribute more to discussion on here than just lurk.24 -
Does anyone here have any good resources for an introduction to embedded, low-level development, or anything on advanced C concepts? I've been having trouble trying to step into more complicated topics like bit manipulation and the stuff I can do with memory management. Any advice is also appreciated.30
-
F**k companies whose apps use MySQL/MariaDB tables with the MEMORY table engine.
Seriously.
That engine *sucks* to work with as an admin. It's such a huge pain in the ass having to always dump the whole DB instead of taking a snapshot.
And if the replica restarts... Poof. Replication breaks. Cuz all the memory tables are suddenly empty!
Fml. Fmfl. Ugh.17 -
Actually kinda sad that there is no pure rust ui framework out there, but rather mere adaptations of c/c++ frameworks for rust. It's better than nothing for sure, it just would be nice if i could use a framework that doesn't create a massive memory leak because i looked at it funny.
In particular i'm using fltk-rs, and every time I'm applying a font to some widget, 500kb get added as leaked memory. Doesn't sound like a lot, but for one thing it's a dynamically built application, so the order and amount of widgets changes, and this application is supposed to run for days, if not weeks.
thanks to heaptrack i was able to pinpoint that to libpango, which i'm not even interacting with directly, but rather indirectly through the api.
Annoying that i chose a language specifically to avoid leaks and dangling pointers and stuff, but end up leaking memory because of a dependency somewhere.7 -
TIL: don't rely too much on in-memory databases if your client runs the development and production environments on the same machine.
Just don't7 -
I went to a Java community conference for the first time and I honestly nearly teared up. It's been a few years since I've seen actual hard-core engineering with real considerations about memory etc. I feel like all I do all day at my job is get blocked by red tape.
God, it felt refreshing to see the reason I got into programming still exists.2 -
I know streams are useful to enable faster per-chunk reading of large files (eg audio/ video), and in Node they can be piped, which also balances memory usage (when done correctly). But suppose I have a large JSON file of 500MB (say from a scraper) that I want to run some string content replacements on. Are streams fit for this kind of purpose? How do you go about altering the JSON file 'chunks' separately when the Buffer.toString of a chunk would probably be invalid partial JSON? I guess I could rephrase as: what is the best way to read large, structured text files (json, html etc), manipulate their contents and write them back (without reading them in memory at once)?4
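What I'm picturing is roughly the sketch below: treat the file as a plain text stream (not parsed JSON) and do the replacement in a Transform, carrying over a small tail so a match split across two chunks isn't missed. It assumes the search string is short; the file names and field names are made up.

import { createReadStream, createWriteStream } from "node:fs";
import { Transform } from "node:stream";
import { pipeline } from "node:stream/promises";

// longest suffix of `text` that is a proper prefix of `pattern`
function partialSuffixLen(text: string, pattern: string): number {
  const max = Math.min(pattern.length - 1, text.length);
  for (let k = max; k > 0; k--) {
    if (text.endsWith(pattern.slice(0, k))) return k;
  }
  return 0;
}

// replace all full matches, emit what is safe, keep the rest as carry-over
function transformText(text: string, from: string, to: string): { out: string; tail: string } {
  let out = "";
  let i = 0;
  while (true) {
    const idx = text.indexOf(from, i);
    if (idx === -1) break;
    out += text.slice(i, idx) + to;
    i = idx + from.length;
  }
  const rest = text.slice(i);                       // contains no full match
  const keep = partialSuffixLen(rest, from);        // but might start one
  return { out: out + rest.slice(0, rest.length - keep), tail: rest.slice(rest.length - keep) };
}

function replacer(from: string, to: string): Transform {
  let tail = "";
  return new Transform({
    transform(chunk, _enc, cb) {
      const { out, tail: nextTail } = transformText(tail + chunk.toString("utf8"), from, to);
      tail = nextTail;
      if (out) this.push(out);
      cb();
    },
    flush(cb) {
      if (tail) this.push(tail);                    // shorter than `from`, cannot contain a match
      cb();
    },
  });
}

await pipeline(
  createReadStream("scrape.json", { encoding: "utf8" }),  // utf8 so multi-byte chars aren't split
  replacer('"oldFieldName":', '"newFieldName":'),
  createWriteStream("scrape.fixed.json"),
);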
-
It never ceases to amaze me just how big the 64-bit memory space is. It's so unrealistically big that on contemporary processors you can't even address the middle, and the size of that dead spot, described by the number of high bits that must all be the same in a valid address, is barely worth mentioning.1
-
You know what's worse than having to come up with a new password every time you create an account? Forgetting your password every time you try to log in!
I swear, it's like my brain has a selective memory when it comes to passwords. I can remember every lyric to a song from 10 years ago, but I can't remember the password I created yesterday.
And don't even get me started on password manager software. You would think that having all of your passwords stored in one place would make things easier, but nope. I've forgotten my password for my password manager so many times that I'm starting to think I need a password manager for my password manager.
But seriously, why do we even need passwords in the first place? Why isn't there an easier, one-stone-kills-all solution to all this password authentication nonsense?
I can remember when it was all letters, then we were forced to use letters + numbers…
then later forced to include symbols…
and then forced to make it lengthier…
and then solve puzzles after getting it right…
and after all the stress now we are forced to find nemo from a set of images.
I thought the misery would end there but nope. Now some platforms force 2FA, like dude, seriously?
For God’s sake we built self driving cars already! Why can’t one just exist without a password? Why do we always end up in a password cycle?
And please don’t say shit about oauth because if your password master (i.e: google) fucks you in the ass then all your oauth accounts are gone for good!
I'm currently having an existential crisis about the meaning of passwords in our modern society. Shit is crazy when I ponder about it I get worried.12 -
How to disconnect from work after working hours? I've been working for the last 4 months as a mid-level dev in this company. I mean, I'm able to problem-solve and do my work, but sometimes I get so addicted to problem solving that I get worried and become obsessed, hyperfixated (especially if I'm stuck on something for, let's say, a couple of weeks). It gets to the point where I work from home 12-14 hours a day just to figure out some bug in the flow.
Thing is, our codebase is large and with every new refactor/feature some surprises happen. I don't have a decent mentor who could teach me one on one or even do pair programming with me. All I have are some colleagues who can point me in the right direction or do a code review from time to time. That's it.
I don't know why I take this so personally. For example, I had to do a feature which I did in 1 week, then the MR got approved by devs and QA. After that, during regression, they found like 3 blockers and I felt really bad and ashamed. While in reality our BA did not define the feature properly, the devs who reviewed it didn't even launch the code and poke around in the app, and our team's QA tested only the happy scenario. Basically this is failing/getting delayed because of a failure in a chain of like 6-7 people.
However, for some reason I'm taking this very personally, as if I, as a dev, failed. Maybe it's due to my ADHD or something, but for the next days or weeks, as long as I don't find a solution, I will isolate myself and try hard until I get it right. Then I have a few days of chill until I face another obstacle in another task. And this keeps repeating and repeating.
My senior colleague tells me to chill and not let work take such a toll on my emotional/physical/mental health. But it's hard. He has 7 years of experience and a decent memory. I have 2-3 years of experience and ADHD; we are not the same. I don't know how to become a guy who clocks out after 8 hours of work done every day. It's like I feel that they might fire me or I will look bad if I don't put in enough effort. Not like I was ever fired for performance issues... Anyway, I don't know how to start working to live, instead of living for work.
I hate who I'm becoming. I don't work out anymore, I started smoking a lot, I don't exercise. I live this self-induced, anxiety-driven workaholic lifestyle.6 -
When windows forms required me to dispose of a certain control derivative manually using a .Dispose() call, because dynamic control creation was causing a memory leak in dotnet, which, instead of fixing, microsoft documented, vaguely.3
-
!rant
The new end to those idiotic code-snippet head-scratcher interviews (awkward for both parties, but nobody is willing to admit it)?
Hometasks.
Infinite internet access, use whatever tools you want, do as much as you can in 2-3 hours.
The best non-toxic way to see how someone works as a dev.
This is the way I expect you to work, so this is the way I will interview you.
Sorry silicon valley, we don't need people who can write up a binary search algo from rote memory.3 -
switching from C# / managed C++ to pure C++ in the new project feels like being relocated to an outpost in the wild west.
now i have to think about so many things the C# compiler would just have taken care of, and all this hassle comes before i can actually address the problems that i want to solve. already ran into some weird memory overflows. i'm actually happy to learn something new, but it still feels really inefficient.3 -
Today in Amy can’t remember words, I forgot the word nostalgia and instead said “pangs of wistful memory”. You’re welcome.2
-
How did mid-2000s computer users get along with just 1 GB of RAM or less?
As of today, anything less than 8 GB of RAM seems impractical. A handful of tabs in a web browser and file manager can quickly fill that up.
Shortly after booting, 2 GB of RAM are already eaten up on today's operating systems.
When I occasionally used an older laptop computer with 6 GB of RAM (I keep it because it has more ports and better repairability than today's laptops; the 6 GB was before upgrading the memory), most of the time over 5 GB was in use, and that did not even include disk caching.
It appears that today's web browsers are far more memory-intensive than 2000s web browsers, even if we do similar things people did in the 2000s: browsing text-based pages with some photos here and there, watching videos, messaging and mailing, forum posting, and perhaps gaming. Tabbed browsing already was a thing in the 2000s. Microsoft added tabs to their pre-installed browser in 2006, back when an average personal computer had 1 GB of RAM, and an average laptop 512 MB!
Perhaps a difference is that people today watch in 720p or 1080p whereas in the 2000s, people typically watched at 240p, 360p, or 480p, but that still does not explain this massive difference. (Also, I pick a low resolution anyway when mostly listening to a video in background.)
One could create a swap file to extend system memory, though that is not healthy for an SSD in the long term. On computers, RAM is king.11 -
I often dream that I discovered a rare edge case in reality that can lead to a crash if corrupted people create any object together. Corrupted state is infectious but due to caching and lazy copying strategies you mainly spread it to previous owners of items you infect. Also I can't edit the code to fix the issue because I'd have to recompile and our world is an in-memory artifact of the current execution.1
-
We had an ADAM/Colecovision unit before this, but I don't really count it, as it was more of a console for us than a computer.
In 1986 dad brought home a Tandy 1000 SX. It had an Intel 8088 processor, 64k of memory, and no hard drive. With dual 5.25" floppy drives, our write-protected DOS 3.1 disk stayed in drive A almost all the time. Games and other software were run from drive B, or from the external cassette drive. For really big games, like Conquest of Camelot and Space Quest 3, we were frequently prompted to swap disks in B: before the game could continue.
Space Quest, King's Quest, Lords of Conquest, Conquest of Camelot, Chuck Yeager's Advanced Flight Trainer, several editions of Carmen Sandiego, and at least a dozen other games dominated our gaming use. We wrote papers with WordStar, and my parents maintained their budget with Lotus 1-2-3.
A year or two later, Dad installed a 10 MB hard drive, and we started booting DOS off that instead. Heady days.1 -
Question for devs who use Intellij IDEA.
How often do you use livetemplates?
I am a new android dev with ADHD and just discovered live templates. They make my life much easier, for example I have shortcuts for generating recyclerview adapter/viewholder/implementation boilerplate code.
That way I am able to focus on the implementation and do my coding like building blocks, rather than memorizing every detail. Also, I don't need to go to stackoverflow and copy-paste basic things multiple times. Even during a live coding interview, for example, having livetemplates seems awesome; copy-pasting from stackoverflow would be shameful (I think). Using my own custom shortcuts for livetemplates seems the best way for how my brain functions (I suck at memorizing tiny details, but I remember the general idea/flow of a pattern, and I would prefer memorizing what to use and when to use it, instead of all the small details of implementation).
Is getting too dependent on livetemplates a good practice to get used to? Do other developers frown upon a dev who has dozens of livetemplates and relies on them instead of writing all code from memory by hand?8 -
Feel dirty writing in c. How do people even deal with unsafe pointer type casting/memory allocation/free? The codebase is plagued with memory leaks and there are no tests.
I will just pretend I can't read c code and play dumb when shit happens15 -
Not my 'first' but the first outside of stupid little toy projects.
I got an internship back in 2016 while I was in 11th grade. Mine was sort of a college doing community outreach, so yeah, not really impressive of an internship.
But my manager handed me a Micro:Bit. At the time, there were like 1000 of them in the U.S. The U.K. was brainstorming about including them in school curriculums. My manager just told me to experiment and see what I could do with it.
Minimal requirements, minimal guidance outside of ideas now and then (he had doctorate students to manage, so I get it lol), so I just started doing stupid small things with MicroPython, the language the (minimal, back then) documentation recommended, like a 'lowest of poly' Crazy Taxi thing.
But by the end, I had hacked together some HORRIBLY written C++ to get 2 of them to communicate. One always powered, getting a state from the other at regular intervals. The other powered by a hand crank, sending the direction of the crank to the first.
I forget what the end goal was. But it was fun to learn, and thinking back, I did a lot in just 8 weeks
My manager gave me the first Micro:Bit on my last day. I don't do anything with it anymore. But it's a fun memory.
It was also around that time I found DevRant and needed you guys to knock my ego down a few pegs when my head over inflated, lol. -
Go to hell elastic cloud!
While true:
I can’t resize my elasticsearch instance to get memory because it’s stuck….
It’s stuck because it doesn’t has enough memory to actually start …
Wtf!2 -
Currently have a very funny project lead, who gives on-the-spot estimates for a 9-year-old Android app in the security domain with very pathetic code quality. Memory leaks, bad practices, typos, CVEs etc.: you name it, we have it in the app's source.
Over the last 5-6 sprints of our project, almost 50% of user stories were incomplete due to underestimation.
Basically everyone in management was asleep about code quality for the last 7-8 years, & now suddenly that a new Dev & QA team is here, they want us to fix everything ASAP.
The most humorous thing is that the product owner is aware of the importance of unit tests, but doesn't want to allocate user stories for them at sprint planning, as the code is almost frozen for the current release, according to him.
Actually, he has done the same thing every sprint since the last release; around 18 months have passed and he still hasn't spared a single day for unit testing.
Recently an app crash was found in the version upgrade scenario, as the QAs were worn out from manually testing hundreds of basic, trivial test cases plus the server-side testing, so they couldn't do the actually needed testing, which is also tougher for Dev to automate.
Recently, when the team's old Macbook Pros expired, higher management allocated Intel Mac minis, saying that a few people in the organization were misusing Macbooks. So because of just a few people, everyone has to suffer now, as there is no flexibility for frequently switching between WFH & WFO. One of those Mac minis overheated & has been in repair for 6 months.
Out of 4 Devs & 3 QAs, all 3 QAs & 2 Devs have gradually left.
I think it's time to say goodbye 😔4 -
The downside of writing reusable, abstracted, DRY code for multiple applications to use: you have to remember to test changes in all the contexts... My org has to hire contractors project by project, as we don't have the budget to have more devs than just 1 (me) on permanently. The contractors, though, often don't know about all the places our code gets used. And sometimes I even forget: last week, in the rush to finish some project, we forgot to think about how a library change made a few weeks ago for the benefit of a new project might affect an older (in production) project. Until shit started breaking. Annoying. Very annoying. Luckily I fixed it (rolled back) before the weekend, but thursday and friday were quite stressful... Now tomorrow, a bunch of sleuthing time to figure out exactly what recent change caused it... argh....3
-
Recently I've had a lot of realistic dreams and it's awful. For example, yesterday I dreamed that I have a SoftEng lecture on Monday at 9am. The day before I dreamed that Russia defeated Ukraine and are now neighbors with Hungary. On both occasions I was later convinced that the memory fragments were true until I either received conflicting news or some unrelated trigger reminded me of the later, unrealistic parts of the dream.
I can sort of deal with the possibility that my current life is a dream and I'll eventually wake up and start over from an unspecified morning, but the possibility that while living in reality an arbitrary subset of my memories comes from dreams is much worse3 -
After brute forced access to her hardware I spotted a huge memory leak spreading on my key logger I had just installed. She couldn't resist right after my data reached her database, so I inserted it once more to duplicate her primary key; she instantly locked my transaction and screamed so loud that the whole neighborhood was broadcasted with a message that an exception was being raised. Right after, she grabbed the back of my stick just to push my exploit harder to its limits and make sure all the stack trace was being logged into her security kernel log.
Fortunately my spyware was obfuscated and my metadata was hidden so despite she wanted to copy my code into her newly established kernel and clone it into new deadly weapon all my data went into temporary file I could flush right after my stick was unloaded.
Right after deeply scanning her localhost I removed my stick from her desktop and left the building, she was left alone again, loudly complaining about her security hole being exploited.
My work was done and I was preparing to break into another corporate security system.
- penetration tester diaries2 -
I was 7 years old, and my mom’s friend brought me their old computer as a new year present. I was absolutely happy that day, because I wanted my own computer as far back as I can remember. I spent that evening exploring russian psychological (!) sex quiz (!!) with pictures (!!!) :D I found it on C:\
Actually no, there is an earlier memory. I was four, and I really wanted to mess around with my sis' computer; it was some kind of holiday, maybe the new year as well. They wouldn't let me do it, and being an engineer, I took a rectangle-shaped candy box and made a "laptop" out of it. I remember drawing the screen, the icons and stuff. And the plastic mold that actually holds the candy I turned upside down, so the candy cavities became sort of "buttons" I could press.2 -
I wonder how many github issues have been closed by asking the author to implement the feature they requested. In the past, I was confident my issue would be resolved by opening a new one when there was no answer in earlier questions. I can't tell whether the nature of my questions has advanced or whether it's a new trend. But I've opened maybe 4/5 issues in recent memory, and each time the collaborators suggest the feature is one I should contribute to their project by implementing it. Isn't this their job as maintainers? I'm already working on something that barely gives me breathing space. I encountered a challenge using your library, and your idea of helping is that I deviate from my own trajectory, acquaint myself with your project / how to implement what I want, wait for it to get merged etc., before continuing what I originally intended. Do they think that's worth it?
Is it just me or is this a common occurrence, lately?22 -
So I’m reading this book called Hacking: The art of exploitation and I’ve got to admit. It’s one of my favourite books I’ve read. It really gets into the nitty gritty of how programs are laid out in memory and goes over how assembly works, among some other low level concepts. Highly recommend.1
-
Ok ok.. I used a German keyboard so Y and Z are switched. I've never seen a picture of Jason Mraz but I really like his music so I wanted to YouTube him.. and my muscle memory did this.2
-
Up all damn night making the script work.
Wrote a non-sieve prime generator.
Thing kept outputting one or two numbers that weren't prime, related to something called carmichael numbers.
Anyway, got it to work; god damn was it a slog though.
Generates next and previous primes pretty reliably regardless of the size of the number
(haven't gone over 31 bit because I haven't had a chance to implement decimal for this).
Don't know if the sieve is the only reliable way to do it. This seems to do it without a hitch, and doesn't seem to use a lot of memory. You don't have to constantly return to a lookup table of small factors or their multiples either.
Technically it generates the primes out of the integers, and not the other way around.
It takes 0.01-0.02 seconds per prime up to around the 100 million mark, and then it gets into the 0.15-1 second range per generation.
At primes of around a couple billion, it's averaging about 1 second per bit to calculate 1. whether the number is prime or not, 2. what the next or last immediate prime is. Although I'm sure there's some optimization or improvement to be had here.
Seems reliable but obviously I don't have the resources to check it beyond the first 20k primes I confirmed.
From what I can see it didn't drop any primes, and it didn't include any errant non-primes.
Codes here:
https://pastebin.com/raw/57j3mHsN
Your go-tos should be nextPrime(), lastPrime(), isPrime, genPrimes(up to but not including some N), and genNPrimes(), which generates x amount of primes for you.
The speed limit definitely seems to top out at 1 second per bit for a prime once the code is in the billions, but I don't know if that's the ceiling, again, because decimal needs to be implemented.
I think the core method, in calcY (terrible name, I know), could probably be optimized in some clever way if it's given an adjacent prime and the parameters that were used. There's probably some pattern I'm not seeing, but eh.
I'm also wondering if I can't use those fancy aberrations, 'carmichael numbers' or whatever the hell they are, to calculate some sort of offset, and by doing so, figure out a given prime's index.
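(Side note for anyone wondering why those 'carmichael numbers' bite: my actual code is in the pastebin above, but the classic trap looks like the sketch below. A plain Fermat check passes every Carmichael number for any base that doesn't share a factor with it, even though they're composite; the usual fix is a Miller-Rabin style test.)

// Fermat primality check: true for primes, but ALSO true for Carmichael
// numbers like 561, 1105, 1729 whenever the bases are coprime to them.
function modPow(base: bigint, exp: bigint, mod: bigint): bigint {
  let result = 1n;
  base %= mod;
  while (exp > 0n) {
    if (exp & 1n) result = (result * base) % mod;
    base = (base * base) % mod;
    exp >>= 1n;
  }
  return result;
}

function fermatProbablyPrime(n: bigint, bases: bigint[] = [2n, 5n, 7n]): boolean {
  if (n < 2n) return false;
  for (const a of bases) {
    if (a % n === 0n) continue;                 // ignore bases that reduce to 0 mod n
    if (modPow(a, n - 1n, n) !== 1n) return false;
  }
  return true;
}

// 561 = 3 * 11 * 17 is composite, yet every coprime base says "prime":
// fermatProbablyPrime(561n) === true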
And all my brain says is "sleep"
But family wants me to hang out, and I have to go talk a manager at home depot into an interview, because wanting to program for a living, and actually getting someone to give you the time of day are two different things.1 -
Honestly after fucking around with rust async, I do have a lot more respect for high level languages where you don't have to worry about locking memory and stuff haha. Learning promises in nodejs was a breeze, learning them in rust requires a lot more thinking :p14
-
It's 2022 and people still believe USB sticks and external card readers are a replacement for memory card slots.
They're not. SD cards have a standardized form factor and do not protrude from memory card slots, but external card readers and USB sticks do.
Just like smartphones, laptops are increasingly ditching the SD card slot or replacing it with microSD, which has less capacity, lower life expectancy and data retention span due to smaller memory transistors, worse handling, and no write-protection switch.
Not only should full-sized SD cards be brought back to laptops, but also brought to smartphones. There might soon be 2 TB SD cards, meaning not one second of worrying about running out of space for years. That would be wonderful.22 -
Adobe, the company with virtually limitless budget, somehow created possibly the worst CMS to grace this earth (at least from the UX perspective). Meet Adobe Experience Manager, or AEM for short.
For starters, there are two executable jars: author and publish. Author is where you make all your pages, publish is the "final" preview. Except they're the same jar file. It decides which mode to run in based on the jar's filename. The filename is also how you configure things like which port it's running on.
Publishing pages (sending them to the publish app) looks simple: select the page you want, press a button and it's ready to view. Except it's not. In order to publish a page and have it visible, you also need to publish the entire directory structure the site is in. So if you have the page in a directory "my-site/en/pages/home", you have to publish "my-site", then "en", then... The real kicker is that when you press "publish" on a page there's a checkbox asking if you want to also publish everything that's linked to this page, which seemingly doesn't do anything
Ok, enough about publishing. Let's focus on the absolute monstrosity that is the "author" environment. When you first open it, you're greeted with a pretty layout with transitions and animations that's clearly meant for the editors. This is where you make folders and pages, and this is where you publish them. It's worth mentioning that these "folders" exist only in AEM, not on your disk. This part is actually ok, and if it wasn't for the shit publishing ux I'd say it's good.
But, that part only allows you to make pages with some predefined components. What if you wanted to make your own? Don't worry, you can. You just need a maven project that mixes Java, JavaScript, scss and XML in an unholy abomination of frontend and backend that _somehow_ gets compiled into Java classes that then get shoved into AEM and somehow work. Usually. Except for when they just break for no reason (5 people tried the same thing, and each got a completely different error, and it worked for the 6th person with no issues).
But that all was just the surface level stuff. You see, AEM is much more complicated than that. It's not _just_ a wisywyg HTML editor with some customizability sprinkled in. No, sir. It's practically an entire Unix-based operating system. You can open "crxde lite", or like I like to call it, the "os view" to see the entire unix-like directory tree. Just don't be surprised by how it looks. We're in admin/developer territory here, so better get used to the UI that'd make Windows Vista jealous.
The "os" comes with a bunch of apps. Aside from the designer view and crxde lite, there's a replication manager, GraphQL browser, user manager, asset manager and many more. Each app comes with its own UI style and even worse UX than the previous ones. Oh, by the way. I hope you have plenty of ram, cause all those apps are constantly loaded in memory.
Did I mention that the entire thing is written in Java? And I really mean the _entire_ thing. From what I can see, even the frontend JS is generated from Java classes.
So, TL;DR: it's shit. Stay the fuck away from it, and don't use it unless you absolutely have to. Or you're a masochist that wants to make a living out of it. If you know your way around AEM, you're practically guaranteed a well paying job2 -
So i have been thinking..
SQL is a lang that runs on a specific software on the server, and helps creating data stores(databases and tables) that can be queried & manipulated.
is there a way to run sql-like queries on the client side with no interaction with the backend at all?
Say i have 5 inter related data models. in a backend world, they will form nice little tables of a db with all their joins and composite keys. from the server, i shall be querying them like "SELECT name from x where y=z & ..."
but what if i could store them like tables in browser memory and run the same query filters via a query language... is this possible?
i know this poses a certain security risk, but we already use cookies, local storage and a lot of json based shitty client side storages. surely it might be possible to have less optimised sql tables on the frontend with extremely good querying capabilities?
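something like this toy version is what i have in mind; plain typescript, everything lives in browser memory, and the names are made up:

type Row = Record<string, unknown>;

class MiniTable<T extends Row> {
  private rows: T[] = [];

  insert(row: T) {
    this.rows.push(row);
  }

  // "SELECT * WHERE ... ORDER BY ..." expressed as callbacks
  select(where: (r: T) => boolean, compare?: (a: T, b: T) => number): T[] {
    const out = this.rows.filter(where);
    if (compare) out.sort(compare);
    return out;
  }
}

// SELECT name FROM users WHERE age > 30 ORDER BY age
const users = new MiniTable<{ name: string; age: number }>();
users.insert({ name: "ada", age: 36 });
users.insert({ name: "linus", age: 28 });
const names = users.select(u => u.age > 30, (a, b) => a.age - b.age).map(u => u.name);

joins would just be another method walking two of these arrays, and if i really wanted actual SQL syntax in the browser, sqlite compiled to webassembly is a thing, but that feels like a heavier hammer than what i'm describing.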
or am i talking something far fetched here?8 -
I am very thankful to C, as I face less pain while dealing with pointers and memory allocation and deallocation in C++. I am very thankful to C++, as I grasped OOP and template concepts from it, and it was also my first language for DSAlgo implementation. I feel very fortunate to have moved to Java after C++ rather than python. Although Java's design is f**ked and it feeds on a computer's memory, it taught me to deal with objects (unlike C++). It taught me how objects are clearly different from primitive data types like int, float, char... And best of all, Java provided me everything I needed to safely switch to Python; it's all because of Java that I can clearly understand the workings of python. All the stuff I found weird in python before sounds logical to me now. As Java taught me how to deal with objects, I am confident in saying "I CAN DEAL WITH PYTHON". With respect to all my 3 prior languages: C, C++, and Java.2
-
Currently fixing concurrency issues with a callback which is called so frequently it probably has multiple instances running and which can't ever be paused. Also, it isn't allowed to allocate or free memory. Riddles like this are the reason I got into computer science.
-
So there's a proposal for C++ to zero initialize pretty much everything that lands on the stack.
I think this is a good thing, but I also think malloc and the likes should zero out the memory they give you so I'm quite biased.
What's devrants opinion on this?
https://isocpp.org/files/papers/...20 -
I just had a ptsd (not real ptsd) attack cause I remembered in one of my first jobs we had gulp, grunt AND webpack to build our angularjs project.
Did I fix that mess? Sure!
Will the memory of it stalk me until new year? Absolutely.1 -
Was told at work today that I don’t follow directions closely enough and the lack of attention to detail in my work is a problem.
I remember being this way since my first elementary school teacher pointed it out to me. I’ve always been this way. It’s how my brain is wired. No matter how hard I try, I always miss something. Especially when it is a really complex set of tasks. I’ve literally got the results of a cognitive test I took in college documenting and quantifying my working memory deficits.
You think you’ll change that now, after more than four decades of me being like this, with a performance review? Good fucking luck!8 -
everything is going as planned! :)
Learned Rust Lang. i loved it (that doesn't mean i am done learning na? No! never stop)
new language i could do game memory hacking in without worrying about C++ memory leaks or issues. it also compiles to assembly! another of my favorite languages!
(i use rust for game development and other stuff)
i am not leaving C / C++ though that would be harsh!,
i abandoned javascript for react and typescript.
to be honest the developer just made javascript and left us with a [object Object]
finished learning the android java api so i'm basically set; anything i want to make, i can just go on my pc, listen to music and write it out in a couple of days.
well phazor what are you going to do now?!
i will code till i am old.
i will leave my mark like a shid that made its skid in the bowl :)5 -
I'm facing a strange problem, I have a 400GB microsd, it is formatted as exFAT
I tried formatting it again to either ntfs or ext4, on both Linux and macOS, but every tool says the format completed, then when it scans again it still shows the files the storage had + that it's exFAT
I tried gparted, disk utilities (macOS), Disks (ubuntu), mkfs; all show the same result: they claim the card was successfully formatted, but after a refresh it still shows the old filesystem + the contents that were already on the memory; no file was removed
Can anyone help?26 -
[Rust] What are alternatives to argument drilling for something like a string interner which is technically a memory leak so it really shouldn't be global but at the same time all but a couple top level functions depend on its existence? I'm aware of context objects and that's all ChatGPT could give me as well, but I'm wondering if there's more to this problem than that.1
-
Alright, I sometimes... Alright often... almost every night while trying to fall asleep... imagine applications on an NP computer. No, I don't claim there is a NP computer. But still...
Alright, if you don't wanna think of NP computer... Think of non-deterministic turing machines, which are NP computers....
Quick recap about the NP rules:
- If you have a problem with a realm of solutions, your computer will guess the right answer in O(1), if an answer exists.
- After guessing an answer, you have to confirm deterministically, with a normal algorithm, that the answer is correct. No unconfirmed answers, no ambiguity.
Anyway... Data compression in an NP computer. I will make a claim that I don't wanna look up or calculate, but think it is correct:
1. There is a number n. Among all bit combinations of length smaller than or equal to n, we cannot find two different combinations that have the same length and evaluate to the same md5 hash.
2. The given number n is really large, so that at least a few gigabytes, if not terabytes, can be described by it. (Hash collisions are generally allowed, just not between two inputs with the same number of bits within the bit limit n.)
Now it is possible to send a whole file by just sending its md5 hash and how many bits are in the file (as long as the file is smaller than or equal to n, otherwise slice it). Because the other side can just decompress it by guessing the right bit combination and confirming it by hashing it again.
This would be compressed in O(n) and decompressed in O(n). So it would be extremely fast.
I mean, sometimes it is a pity that we don't have NP computers, but given that, with enormous amounts of calculation power and/or enough memory space, every NP program can be run on a P computer, we can conclude that technically md5 is compression. Even though our computers are far too slow to actually use it as such.
Obviously not limited to md5. True for other hashes. Just n changes.4 -
I never thought in my life that I would say this sentence one day - but:
Today I switched back to VS Code because it uses less memory than IntelliJ.
Context: Only temporary, very resource hungry dev environment, TypeScript, IntelliJ used >4.5 GB of ram and started lagging.5 -
Force pushing a better version in a different language to the repo of a program that I wrote 2 years ago. It was sort of a memory, but I mainly looked at it to feel better about my current coding style.
I don't want to take comfort in knowing that I'm getting better. I know that, and it feels like false affirmation. If anything, I want to know that I'm good compared to others, not compared to a previous, dumber version of me. I'll never get to beat him anyway.1 -
Okay, I have a desktop and a laptop. I don't think that's surprising.
I do sync the contents of both via git. Also not surprising.
But I thought, hmm, I hate having to do temporary git commits. Stuff like
git add .
git commit -m temp
git push
Just so I can remove it later via
git reset HEAD^
I hate it because it forces me to force push. So, how do I sync stuff I do not want to commit yet?
Well, I just set up an instance of owncloud. Was easy. 20 minutes and everything is running. Can recommend. But...
For some reason it doesn't work. It syncs stuff just fine... But it also syncs my .git directory... I thought it wouldn't be a problem.
Saves me a pull. Don't have to pull what's synced, right? Also setting up new projects should be terribly simple. Just add it normally. So, git just versions and does pipelines. And I copy everything inside the git directory over.
Also allows me to have more private .git/info/exclude files and hooks...
But for some reason... everything is synced. Dot-files are being synced as well. Everything works... But running git status on one side tells me everything is commited... Doing it on the other side it tells me there are new files.
How is that possible??? I kind of expected that even a branch checkout would be synced... Was curious if that would lead to issues, but I didn't expect it just not recognizing changes. Git doesn't hold projects in memory, does it? Nah, that doesn't make any sense. So, why does git status disagree? Git log is identical... Git status is not...
It makes no bloody sense.11 -
Can anyone recommend a good vps/dedicated server hoster for the east US? I'm looking for a machine with 8 cores, 32gb memory and 2x 1.9tb nvme ssds. AWS, Azure and Google are way too expensive. In Germany we use the ax61-nvme instance from hetzner, which costs around 100$.
Thanks for any advice ✌️5 -
A philosophical question about maintenance/updating.
There is no need to repeat the reasons we need to update our dependencies and our code. We know them, especially regarding the security issues.
The real question is: does that indicate a failure of automation?
When I started thinking about code, and also when I was a kid watching all these sci-fi universes with robots etc., the obvious thing was that you build an automation to do the job without having to work on it anymore. There is no point in automating something that needs constant work on top of it.
When you have a car, you usually do not upgrade it all the time; you do some maintenance (oil, tires), but it keeps the work you put into it at a reasonable level.
A better example is the abacus, a calculating device which you know works the way it works.
A promise of functional programming is that, because you are based on algebraic principles, you do not have to worry so much about your code; you know it will do the logical thing it is supposed to do.
The Unix philosophy made software that has been "updated" remarkably little compared to all these modern apps.
Coding, because of its changeable nature, is the first victim of the never-satisfied side of human nature.
The modern software industry has so many techniques and principles (solid, liquid, patterns, testing that the air is air) and still needs so many developers to work on a project.
I know that you will blame market needs (you cannot understand the need from the start, you have to do it agile), but I think that this is also part of the problem.
Old devices evolved at a much slower pace. A radio was a radio, and a radio still does its basic function the same way (the upgrades were only some memory features, like saving your beloved frequencies, and screen messages).
Although all answers are valid, I still feel that we have failed. We have failed so much. The dream of being a programmer is to build something that brings you money or satisfaction, and then, when you are bored, to build something completely new.14 -
You can make your software as good as you want, if its core functionality has one major flaw that cripples its usefulness, users will switch to an alternative.
For example, an imaginary file manager that is otherwise the best in the world becomes far less useful if it imposes an arbitrary fifty-character limit for naming files and folders.
If you developed a file manager better than ES File Explorer was in the golden age of smartphones (before Google exercised their so-called "iron grip" on Android OS by crippling storage access, presumably for some unknown economic incentive such as selling cloud storage, and before ES File Explorer became adware), and if your file manager had all the useful functionality like range selection, tabbed browsing and navigation history, but it limited file names to 50 characters even though the file system supports far longer names, the user would have to rely on a different application for the sole purpose of giving files longer names, since renaming, as a file action, is one of the few core features of file management software.
Why do I mention a 50-character limit? The pre-installed "My Files" app by Samsung actually did once have a fifty-character limit for renaming files and folders. When entering a longer name, it would show the message "up to 50 characters available". My thought: "Yeah, thank you for being so damn useful (sarcasm). I already use you reluctantly because Google locked out superior third-party file managers likely for some stupid economic incentives, and now you make managing files even more of a headache than it already is, by imposing this pointless limitation on file names' length."
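For reference, the 50-character cap has nothing to do with the filesystem; the real limit can be checked directly. A minimal sketch, assuming a POSIX system and Python (the paths are just examples):
```
# Query the filesystem's actual maximum file name length (in bytes) for a directory.
# On typical Android/Linux filesystems such as ext4 or f2fs this is 255, far above 50.
import os

for path in ("/storage/emulated/0", "/tmp"):  # example paths, adjust as needed
    try:
        print(path, "NAME_MAX =", os.pathconf(path, "PC_NAME_MAX"))
    except (OSError, ValueError):
        print(path, "not available on this system")
```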
Someone in Samsung's developer department had a brain fart one day and decided it would be a smart idea to impose an arbitrary limit on file name lengths. It isn't.
The user needs to move files to a directory accessible to a superior third-party file manager just to give them names longer than fifty characters. Even file management on desktop computers two decades ago was better than this crap!
All of this because Google apparently wants us to pay them instead of SanDisk or some other memory card vendor. This again shows that one only truly owns a device if one has root access. Then these crippling restrictions that were made "for security reasons" (which, in case it isn't clear, is an obvious pretext) can be defeated for selected apps.3 -
What are the specifications of the desktop or laptop you currently use for work, and what's your job title?11
-
I had been assigned a task to create a cross-platform desktop application that keeps track of the expiry of a certain product and notifies in real time.
So, my journey to create such an application starts today and the list below describes the first few hours.
1. Google/Date and time in javascript
2. Google/Javascript date object
3. W3school/Time in javascript
4. W3school/Javascript date getTime() method
5. Google/Are electron.js applications platform independent
6. Google/Dart for desktop applications
7. Google/Is dart cross-platform
8. Google/Best desktop application framework
9. Google/Python for desktop app development
10. Freecodecamp/How to build your first desktop application in python
11. Google/Pyqt
12. Google/Which is the best technology to build cross-platform desktop application
13. Google/Cross-platform desktop app development for windows mac and linux
14. Udemy / cross platform desktop app development for windows mac and linux
15. Youtube/ electron desktop app, demo
16. Youtube/ electron.js is obsolete
17. Youtube/Neutralinojs
18. Youtube/ neutralinojs tutorial
19. Google/Neutralinojs or electronjs
20. Google/Math.js
21. Google/Math.js/JS Bin
22. Google/Cannot find package “math.js”
23. StackOverFlow/How do I resolve “cannot find module” error using Node.js
24. Google/ is it better to install npm packages locally
25. Quora/ why should you stop installing NPM packages globally
26. Google/ what is nvm
27. Google/nvm version check
28. Stackoverflow/node version management on windows
29. Github/coreybutler/nvm-windows: a nvm for windows. Ironically written in Go
30. Google/how to uninstall a npm package
31. Npm docs/uninstalling packages and dependencies
32. Google/require in javascript
33. Youtube/how to install electronjs
34. Youtube/electronjs in 100s(fireship.io)
35. Roryok.com/electronjs memory usage compared to other cross-platform frameworks
36. Google/is electronjs memory hungry
37. Youtube/sql in one hour
38. Youtube/learn sql in 60 mins
39. Geeksforgeeks/connect mysql with node app
40. Stackoverflow/How to return to previous directory using cmd
41. Stackoverflow/how to require using const
42. Geeksforgeeks/difference between require and es6 import and export
TO BE CONTINUED...1 -
Had to face the music and make the jump from Ubuntu 22.04 to Fedora 36, and I have to say it's been night and day so far. Everything is snappier. Yeah, dnf is very slow compared to apt, but there are changes you can make to speed things up, and the nifty terminal interface is a great change that helps make up for the speed issues.
It came with Python 3.10 installed, the GNOME and GTK4 apps are nice, fluid and up to date, and the random slowdowns, freezes and restarts Ubuntu had while running its version of GNOME are nonexistent.
For the life of me I can't see why Ubuntu would drop the ball like this. I have a Dell XPS 13 Developer Edition and this is the best it's ever run. Even Wi-Fi connectivity is better, despite the crap Wi-Fi card that ships with this machine.
I want to love that Ubuntu release, and while it is the most graphically appealing and functional version of Ubuntu I've ever used, the memory management issues make it damn near unusable.10 -
TL;DR I have to bump a Redis cluster from t3.medium to m6g.large just to get enough network bandwidth even though I have no need of the extra memory.
Debugged an interesting issue today.
I am adding Elasticache to a project to reduce strain on the single node postgres DB.
Deployed a Redis replication group with 2 shards, with multi-AZ replication for resilience.
Everything was going well. We aren't caching that much atm, so we were barely using 100 MB of memory.
Suddenly, when our US region comes online, latency skyrockets and the logs are full of Jedis timeout errors.
Still no issue with memory or node CPU.
The cause? Arbitrary network bandwidth throttling by AWS. The app currently processes about 3,000 requests per second, so we were exceeding Amazon's random-ass allowances, which aren't documented anywhere.1 -
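For anyone debugging something similar: since server-side memory and CPU looked healthy, measuring round-trip latency from the application side is what points the finger at the network. A minimal sketch, assuming Python with redis-py (the service above actually uses Jedis, and the endpoint name is made up):
```
# Measure Redis round-trip latency from the app side. A healthy in-VPC cluster
# answers PING in well under a millisecond, so a high p99 here combined with
# normal server-side metrics suggests network throttling or saturation.
import statistics
import time

import redis  # pip install redis

r = redis.Redis(
    host="my-cluster.abc123.use1.cache.amazonaws.com",  # hypothetical endpoint
    port=6379,
    socket_timeout=2,
    socket_connect_timeout=2,
)

samples_ms = []
for _ in range(500):
    start = time.perf_counter()
    r.ping()
    samples_ms.append((time.perf_counter() - start) * 1000)

print(f"p50 = {statistics.median(samples_ms):.2f} ms")
print(f"p99 = {statistics.quantiles(samples_ms, n=100)[98]:.2f} ms")
```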
Whenever I see an ORM that supports creating and transforming objects in bulk, I can't help but think about the poor misdirected users who forced it to do that. It's an Object-Relational Mapper. It maps objects. The whole concept isn't designed for bulk operations; the point is that you add logic to each and every record and let the mapper convert your operations to SQL, so that you never have to keep a lot of them in memory.4
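To make the distinction concrete, here is a minimal sketch (SQLAlchemy with a throwaway in-memory SQLite database; the model and numbers are made up): per-record logic stays on mapped objects, while genuinely bulk work goes straight to SQL and never touches the mapper.
```
from sqlalchemy import Column, Integer, String, create_engine, text
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String)
    credits = Column(Integer, default=0)

engine = create_engine("sqlite://")  # throwaway in-memory database
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add_all(User(name=f"user{i}") for i in range(100))
    session.commit()

    # ORM territory: per-record logic on mapped objects (stream or batch this in
    # real code so the whole table never sits in memory at once).
    for user in session.query(User):
        if user.name.endswith("7"):
            user.credits += 1
    session.commit()

    # Bulk territory: one SQL statement, no objects mapped at all.
    session.execute(text("UPDATE users SET credits = credits + 10"))
    session.commit()
```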
-
You can have the best test coverage - even building your own fuzzing framework on the way.
You can have top notch devs adhering to state of the art development processes.
You can have as big a community and as well-funded a bug bounty program as you want...
All of that doesn't matter if you have chosen the wrong language:
https://googleprojectzero.blogspot.com/...
This would just have been an out-of-bounds exception instead of a buffer overflow using an attacker-controlled payload in any memory-safe language.
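To illustrate the difference with a toy sketch (Python standing in for "any memory-safe language" here; this is not the code from the advisory): the same out-of-range write is caught by a bounds check instead of stomping over adjacent memory.
```
# In a memory-safe language, an attacker-controlled index past the end of a buffer
# raises an exception; it cannot silently overwrite neighbouring memory.
buffer = bytearray(16)
attacker_controlled_index = 4096  # hypothetical value far out of bounds

try:
    buffer[attacker_controlled_index] = 0x41
except IndexError as exc:
    print("caught out-of-bounds write:", exc)
```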
Language choice matters!
Choose wisely!13 -
Once upon a time I offhandedly suggested a few beautiful things that were constructed
People who were taking responsibility for the ideas, who do not understand delight (simple, innocent delight) and a desire to see what I created before, had some miniatures cheaply constructed
The desire is for things that endure in memory
All they wanted was to mechanically and immediately work on something else, like it was some kind of soulless sideshow
All I wanted was a few measly days of rest and to look over the small shadows of my ideas
This is why 99% of all new ideas suck2 -
So we're having a heated debate about MS's decision to introduce ads in the menu
So I was saying this could be a potential critical vuln, as always... it's kind of like an MS trademark now :-C
My reasoning was that ads will now have direct access to PC memory, since they are being delivered straight to your PC
and this other guy went on to say they are not being delivered to your machine, they are being delivered to Explorer... and I was like WTF?? Isn't Explorer a process running directly on your machine?? -
So, like, why doesn't Java let me do manual memory management? In C# if I want to screw up the code-base and everyone that comes after me with my half-informed experiments it totally lets me.21
-
Tips on how to retain something in memory for a long time? Especially if it is something difficult, unpleasant and rarely used, like differential calculus or DSA/leetcode questions?6
-
Some mobile file managers kick me back to the beginning after selecting items for copying or moving.
When tapping on "copy" or "move" after selecting files/folders, some file managers like ES File Explorer (back when it was popular) conveniently remain in the current directory, whereas the stock Android file manager and many vendors' pre-installed file managers like that of Samsung kick me back to the initial directory. On phones with MicroSD, that's the storage selector, and on phones without, that's /storage/emulated/0/.
If I want to move files into a subfolder of the currently viewed directory, I have to navigate all the way back to that directory, which is, needless to say, annoying.
Who thought it was a smart idea to kick the user back to the initial directory? But vendors' pre-installed file managers tend to be garbage anyway. Samsung's "My Files" file manager does not let me enter file names longer than 50 characters, does not let me change the extensions of files, does not support selecting files from search results or jumping to their parent directory, obviously lacks range selection, hides the status bar while open (what's the point of that?!), its search feature is slow and sometimes crashes, and it can only search the entire device storage or memory card, not individual directories.
It's almost like Samsung deliberately tried to design a file manager as terrible as they possibly could.5 -
So do you think this will finally mean we get a proper Linux version of this shit storm of an app?
https://theverge.com/2023/2/...9 -
How do you learn a concept in programming/dev? I'm not talking about the understanding part, but rather the remembering part: retaining it in memory in a way that lets you recognise/use it the next time you see it or need it.
Do you prefer:
- writing with pen and paper (i.e. creating notes)
- writing a personal/public online blog
- implementing it in a project that depends on it / some sample project
- or something else?11 -
Please support old web browser versions for all eternity.
I hate it when I open a site like SoundCloud one day and am greeted with a "we no longer support your browser" notice. Now I am forced to update my browser to a new version with removed features. On Android, Chrome sometimes crashes due to an apparent memory leak, so I have to go back to Samsung Internet, which does not work with some sites. Also, the Samsung clipboard manager (which can hold up to 20 items) is only available on Samsung Internet, not Chrome or Firefox.
I also have to update the browser on my live USB bootable stick because sites stop supporting it. Any browser from 2015 onward (ECMAScript 6) should be supported until at least 2050, so that I never have to fear that a site one day spontaneously stops working on my browser.
I would like to browse the Internet forever without ever having to worry that pages will stop working one day. Browser vendors might also deprecate support for devices and operating systems. Old devices also have replaceable batteries and are easier to repair. I don't want to be forced to buy new devices that are difficult and expensive to repair.21 -
Who wrote Android Studio thinking it was a good idea???
I need to use it for like a week because of Flutter, and I don't want this thing creating arbitrary directories in my home.
Then I can only use Chrome for viewing my app, because of course the fucking emulator crashes without giving me any error as to why.
Is it my GPU? Is it my memory? How do I know?
Now I'm using Podman because I don't want to think about removing all of this crap when I'm finished.2 -
You know, the symmetry of this insanity is only possible because people do the same things, mimicking the novel actions people like John Boy took, or reproducing things so closely that, with memory fading over time (the basic decay model of forgetting), the details may blur but the major impressions remain, even of the most insignificant things. People simply engage in "novel" actions like this rant.
How does it feel to be part of an oversized music box?
Need to wander somewhere new -
Does anyone here use any nootropics, either at work or on personal projects? About to have an extra busy few months and I'm looking for some recommendations.5
-
You know, I'm tired of the fucking memory noise of some twisted fuck, working for twisted fucks, laboring off some set of idiotic arbitrary stereotypes, trying to get me to do the same fucking things by baiting me like a fucking dog
I want people to live their fucking lives and the social problems in this world to just be solved
None of this in last generation or twisted dumb fucks and their insensible number games that were used to program them
I want everything cleaned up and fixed and evil people to cease being evil and no more stupid loop2 -
!rant
Can you give me / point me to a good example problem or exercise for multithreading? As in, something that's small in scope but actually requires dealing with most of the multithreading issues & complications: race conditions, synchronization, locks, shared memory access, cross-thread calls/callbacks, etc.?2 -
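One classic exercise that fits the request above is the shared-counter / lost-update problem; a minimal sketch (Python here purely for brevity, the idea ports to any language with threads):
```
# Several threads increment a shared counter. The unsynchronized version loses
# updates because read-modify-write is not atomic; the locked version is correct.
import threading

counter = 0
lock = threading.Lock()

def worker(use_lock: bool, iterations: int = 100_000) -> None:
    global counter
    for _ in range(iterations):
        if use_lock:
            with lock:
                counter += 1
        else:
            tmp = counter   # read
            tmp += 1        # modify
            counter = tmp   # write -- another thread may have written in between

def run(use_lock: bool, n_threads: int = 4) -> int:
    global counter
    counter = 0
    threads = [threading.Thread(target=worker, args=(use_lock,)) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

print("without lock:", run(False))  # usually well below 400000: lost updates
print("with lock:   ", run(True))   # always exactly 400000
```
Natural extensions that cover the rest of the list: turn it into a bounded producer/consumer queue (condition variables), have workers report progress back to a main thread (cross-thread callbacks), and deliberately create, then fix, a deadlock with two locks.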
ENOSPC = random things go wrong.
There are many synonyms for ENOSPC, like "disk full", "space storage full", "space storage exhausted", "no more space left on device", and those other repulsive errors. For the sake of simplicity, I am going to refer to it as ENOSPC.
If you are in this condition on the operating system partition, get out of it quickly or random things will go wrong. Text editors which write directly to a text file, rather than creating a temporary file and then replacing the original, could end up blanking the file; software configuration files might fail to save, which causes a reset; and web browsers might spontaneously reset cookies and lose history.
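The safe pattern those editors skip is "write to a temporary file on the same filesystem, then rename it over the original", so a failed write (ENOSPC included) never truncates the existing file. A minimal sketch, assuming Python and a POSIX-style atomic rename; the file name is made up:
```
import os
import tempfile

def atomic_write(path: str, data: str) -> None:
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=directory)  # temp file on the same filesystem
    try:
        with os.fdopen(fd, "w") as tmp:
            tmp.write(data)          # an ENOSPC error surfaces here, original untouched
            tmp.flush()
            os.fsync(tmp.fileno())   # make sure the bytes actually hit the disk
        os.replace(tmp_path, path)   # atomic rename over the original
    except BaseException:
        os.unlink(tmp_path)
        raise

atomic_write("settings.conf", "theme=dark\n")
```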
For example, Firefox has created a gap in the web browsing history, as shown here. The history that is now memory-holed initially appeared to have been recorded successfully. Apparently, a failed write to the places.sqlite database when closing the browser created this gap.4 -
How did mobile development manage to take off and survive until now? Numerous aspects of its existence are a huge step backward compared to web apps and the Web in general. When using an app, you:
- Can't select a term and press "search" from the context menu
- Can't have multiple app pages open
- Can't save pages for a revisit
- It requires installation
- Takes up memory on the installed device, not to mention accumulated app data
- It requires updates
- Development can get horrifying: setting up an optimal dev environment for the device SDK, Gradle differences, publishing an installable build despite sometimes stubborn dependencies, waiting for approval from app stores
It's literally an inconvenience, however you look at it6 -
Any advice/suggestions on how to intensively brush up on modern C++ and multithreading for an interview that will likely be technical and cover bases like algorithms, data structures, etc.?
I haven't done C++ in a while, not since a few courses in college. I did parallel programming and GPGPU on the side, but nothing at a professional level.
I've mostly been doing front-end web dev and C# since I got out of school, so I've been working at a higher level of abstraction and more on design, and if I am asked about pointers, memory allocation, etc. I would probably draw a blank. But I am motivated to no-life it hard for the next week to catch up again.3 -
Now it's Bitbucket and GitLab that aren't responding.
I will get fired because I can't do my job, because nothing's fucking working -_- When it's not Teams (M$ piece of shit) leaking memory, it's Visual Studio. When it's not Visual Studio, it's Windows. Or WSL. Or Atlassian shit. -
Though I'm being affected by a suggestion based on a forgotten memory, consequently just recalled for the second time, it seems people who have to push through life don't slow down as fast
Though sleep is good, that's not what I'm talking about1 -
I keep having this recurring idea that I can fill in the gaps in my education by writing video games that allow me to explore those topics. This would force me to learn the subject well enough to share it with other people. So it would not be just surface level.
I keep thinking of a program that explores and visualizes math topics and programming topics. I would really like to have a program that allows me to visualize memory cells for algorithm exploration. Or a really nice graphing calculator in the computer that allows me to view multiple graphs to compare and contrast equations.
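The "compare and contrast equations" part is the easiest piece to prototype; a minimal sketch, assuming Python with NumPy and Matplotlib:
```
# Plot a few functions on the same axes to compare their behaviour.
import matplotlib.pyplot as plt
import numpy as np

x = np.linspace(-2 * np.pi, 2 * np.pi, 500)

plt.plot(x, np.sin(x), label="sin(x)")
plt.plot(x, np.cos(x), label="cos(x)")
plt.plot(x, x**2 / 10, label="x^2 / 10")
plt.axhline(0, color="gray", linewidth=0.5)
plt.legend()
plt.title("Compare and contrast a few equations")
plt.show()
```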
What holds me back is that both math and CS are huge topics. I feel like any kind of playground would only cover a small subset. Ideally, whatever I make should be extendable over time to add content and topics. It would need to be somewhat fun as well.
I can imagine an AI training program where you help your character navigate a room of hazards or die. This could be one such fun challenge.1 -
What is it with web devs who can't write effective PHP applications that don't need a 1 GB memory limit?
Where are the days when 32 MB of memory per request was fine? Ugh...2
I tried to delete a partition from my SanDisk pen drive using GParted, but when I do this I get the error shown in the image.
And if I try to use fdisk by running `sudo fdisk /dev/sda`, it gives me:
```
Welcome to fdisk (util-linux 2.34).
Changes will remain in memory only, until you decide to write them.
Be careful before using the write command.
fdisk: cannot open /dev/sda: Read-only file system
```
Does this error mean my pen drive is permanently broken? Can anybody help me!9
A wandb sweep runs fine as an interactive job, but gives me a CUDA illegal memory access error as a Slurm job. Spent the last 15 hours on it and still can't enable multi-GPU support. FML