Search - "memory used"
-
Interviewer: Welcome, Mr X. Thanks for dropping by. We like to keep our interviews informal. And even though I have all the power here, and you are nothing but a cretin, let’s pretend we are going to have fun here.
Mr X: Sure, man, whatever.
I: Let’s start with the technical stuff, shall we? Do you know what a linked list is?
X: (Tells what it is).
I: Great. Can you tell me where linked lists are used?
X: Sure. In interview questions.
I: What?
X: The only time linked lists come up is in interview questions.
I: That's not true. They have lots of real-world applications. Like, like… (fumbles)
X: Like to implement memory allocation in operating systems. But you don't sell operating systems, do you?
I: Well… moving on. Do you know what the Big O notation is?
X: Sure. It’s another thing used only in interviews.
I: What?! Not true at all. What if you want to sort a billion records a minute, like Google has to?
X: But you are not Google, are you? You are hiring me to work with 5 year old PHP code, and most of the tasks will be hacking HTML/CSS. Why don’t you ask me something I will actually be doing?
I: (Getting a bit frustrated) Fine. How would you do FooBar in version X of PHP?
X: I would, er, Google that.
I: And how do you call library ABC in PHP?
X: Google?
I: (shocked) OMG. You mean you don’t remember all the 97 million PHP functions, and have to actually Google stuff? What if the Internet goes down?
X: Does it? We’re in the 1st world, aren’t we?
I: Tut, tut. Kids these days. Anyway, looking at your resume, we need at least 7 years of ReactJS. You don't have that.
X: That’s great, because React came out last year.
I: Excuses, excuses. Let’s ask some lateral thinking questions. How would you go about finding how many piano tuners there are in San Francisco?
X: 37.
I: What?!
X: 37. I Googled before coming here. Also Googled other puzzle questions. You can fit 7,895,345 balls in a Boeing 747. Manhole covers are round because that is the shape that won't fall in. You ask the guard what the other guard would say. You then take the fox across the bridge first, and eat the chicken. As for how to move Mount Fuji, you tell it a sad story.
I: Ooooooooookkkkkaaaayyyyyyy. Right, tell me a bit about yourself.
X: Everything is there in the resume.
I: I mean other than that. What sort of a person are you? What are your hobbies?
X: Japanese culture.
I: Interesting. What specifically?
X: Hentai.
I: What’s hentai?
X: It's a televised art form.
I: Ok. Now, can you give me an example of a time when you were really challenged?
X: Well, just the other day, a few pennies from my pocket fell behind the sofa. Took me an hour to take them out. Boy was it challenging.
I: I meant technical challenge.
X: I once spent 10 hours installing Windows 10 on a Mac.
I: Why did you do that?
X: I had nothing better to do.
I: Why did you decide to apply to us?
X: The voices in my head told me.
I: What?
X: You advertised a job, so I applied.
I: And why do you want to change your job?
X: Money, baby!
I: (shocked)
X: I mean, I am looking for more lateral changes in a fast moving cloud connected social media agile web 2.0 company.
I: Great. That’s the answer we were looking for. What do you feel about constant overtime?
X: I don’t know. What do you feel about overtime pay?
I: What is your biggest weakness?
X: Kryptonite. Also, ice cream.
I: What are your salary expectations?
X: A million dollars a year, three months paid vacation on the beach, stock options, the lot. Failing that, whatever you have.
I: Great. Any questions for me?
X: No.
I: No? You are supposed to ask me a question, to impress me with your knowledge. I’ll ask you one. Where do you see yourself in 5 years?
X: Doing your job, minus the stupid questions.
I: Get out. Don’t call us, we’ll call you.
All Credit to:
http://pythonforengineers.com/the-p...
-
Being paid to rewrite someone else's bad code is no joke.
I'll give the dev this: the use of gen 1, 2 and 3 Pokemon for variable names and class names is beyond fantastic in terms of memory and childhood nostalgia. It would be even more fantastic if he spelt the names correctly, or used it to make a Pokemon game and NOT A FUCKING ACCOUNTANCY PROGRAM.
There's no correspondence between name and type, or even number. Dev has just gone batshit, left zero comments, and now somehow Ryhorn is shitting out error codes because of errors existing in Charmeleon's asshole.
The things I do for money...
-
We had a Commodore64. My dad used to be an electrical engineer and had programs on it for calculations, but sometimes I was allowed to play games on it.
When my mother passed away (late 80s, I was 7), I closed up completely. I didn't speak, locked myself into my room, skipped school to read in the library. My dad was a lovely caring man, but he was suffering from a mental disease, so he couldn't really handle the situation either.
A few weeks after the funeral, on my birthday, the C64 was set up in my bedroom, with the "Programmer's Reference Guide" on my desk. I stayed up late every night to read it and try the examples, and thought about those programs while in school. I memorized the addresses of the sound and sprite buffers, and learnt how programs were managed in memory and stored on the cassette.
I worked on my own games, got lost in the stories I was writing, mostly scifi/fantasy RPGs. I bought 2764 eproms and soldered custom cartridges so I could store my finished work safely.
When I was 12 my dad disappeared, was found, and hospitalized with lost memory. I slipped through the cracks of child protection, felt responsible to take care of the house and pay the bills. After a year I got picked up and placed in foster care in a strict Christian family who disallowed the use of computers.
I ran away when I was 13, rented a student apartment using my orphanage checks (about €800/m), got a bunch of new and recycled computers on which I installed Debian, and learnt many new programming languages (C/C++, Haskell, JS, PHP, etc). My apartment mates joked about the 12 CRT monitors in my room, but I loved playing around with experimental networking setups. I tried to keep a low profile and attended high school, often faking my dad's signatures.
After a little over a year I was picked up by child protection again. My dad was living on his own again, partly recovered, and in front of a judge he agreed to be my provisional legal guardian, despite his condition. I was ruled to be legally an adult at the age of 15, and got to keep living in the student flat (a nation-wide foster parent shortage played a role).
OK, so this sounds like a sob story. It isn't. I fondly remember my mom, and my dad is doing pretty well, enjoying his old age together with a nice woman in some communal country-house place.
I had a bit of a downturn from age 18-22 or so, lots of drugs and partying. Maybe I just needed to do that. I never finished any school (not even high school), but managed to build a relatively good career. My mom was a biochemist and left me a lot of books, and I started out as lab analyst for a pharma company, later went into phytogenetics, then aerospace (QA/NDT), and later back to pure programming again.
Computers helped me through a tough childhood.
They awakened a passion for creative writing, for math, for science as a whole. I'm a bit messed up, a bit of a survivalist, but currently quite happy and content with my life.
I try to keep reminding people around me, especially those who have just become parents, that you might feel like your kids need a perfect childhood, worrying about social development, dragging them to soccer matches and expensive schools...
But the most important part is to just love them, even if (or especially when) life is harsh and imperfect. Show them you love them with small gestures, and give their dreams the chance to flourish using any of the little resources you have available.
-
Father bought a PC in 1997. Back then very few had one. I learned to do things like accessing the internet and sending emails, among others. I remember sometimes inflating my age on websites to be allowed to sign up :P My sisters used to play games on it sometimes. The first few we had were Tomb Raider: The Last Revelation, Tomb Raider Chronicles, American McGee's Alice (which caused us to upgrade the PC xD)... and some others.
I have a memory of this pseudo-3D-looking game where you move in a maze and try answering questions. I want to remember its name, but I cannot :(
We literally have video evidence of me liking the computer as a child, yet my parents either say I'm addicted or deny I've ever liked it before. Not only that, but in their opinion, continuously limiting my time with the PC hasn't been an actual obstacle to me trying to do things. Funny how my parents think the last few years have been my worst, when they've hurt me so much in those years that our relationship is guaranteed not to work out. There were doubts in my head before, but now it's cemented and there is no way of going back. Father, for example, tells me it's too late to do anything with a PC now (as well as how I've been unable to use the PC: he looks at footage of pro players in some TV show and he's like, „You've been unable to use your hobbies", as if they have never screamed at me for perceived gaming and never actually cared to check), and that I need to look for a „real" job.
Sorry. I went to bed at 2:00 in the morning. Feel like a zombie because of ongoing weirdly insufficient sleep, even though I sleep kinda more than normal. Even when I took Melatonine for that it didn't help at all.
Childhood was where beating began. I was about 6/7. Right when I entered school. The first school that I attended was a private one and supposedly for „Wunderkinds“, while in reality I haven't seen a SINGLE teacher or psychologist approve of it, their argument being that children were basically drowned in work that wasn't age-appropriate(I don't mean anything bad. Just that teaching about Galaxies and all in first grade isn't the brightest idea). There was always a mountain of homework to do and as opposed to some other countries, we had to do it on a day to day basis. We didn't have a week-long deadline. I was predictably not keeping up with it as I could have, had it been a normal amount, so my parents decided I didn't want to study and began their methods of getting me to „study“. I have yet to see a person able to keep up with that school's tempo, no matter the age.
This place was also where I got bullied. I felt I had nowhere to be: At home, the parents' situation, at school, the bully. I never really went outside to play with other children, so I missed that part of childhood.
After the second year of school I was transferred to an advanced German school, called like that because they taught German and not English there. I also got to learn a bit of Russian before they removed it from school. In that period I used to attend ballet. But for less than a year. And piano, which I remember having attended for quite a long while, some years, if my memory isn't fried. I quit it because of it having been forced on me. Last piece I ever played fully was Beethoven's Marmotte.
In this school I was once again the outcast of the class. I had some people to interact with. All of those interactions lasted a few years at most. Then, because of a part of my class choosing me as a laughing-stock N2 and another girl as the N1, I found my best friend, who I still have today. She's the only friend I have nearby.
Most of the time I hated myself. Even today I struggle with that sometimes.
After that came university. This is where I finally got something like a friend circle. But it still didn't last. I got into a relationship with one of the guys, but I was just attracted. There was another I couldn't dare getting close to. Turns out he also had something for me. Then he disappeared from our lives and, a year after, I still cannot forget the person. If I want to, I have to deprive myself of my own personality. Not a thing I'm willing to give up. Then I broke up with the guy I was in a relationship with and completely disappeared from the friendship circle. To be honest, I had reasons to. They refused to even try to look for the guy, and they had called him a friend for years. Even today my parents sometimes hit me, but only if I REALLY piss them off.
Now I'm here and oh, my God, I'm officially an aunt now! My sister gave birth to a daughter this morning... She's in Berlin with our mother, and both she and the child are doing great. I just hope she manages to be a good mother.
-
https://git.kernel.org/…/ke…/... I'm sure some of you are working on the patches already; if you are, then let's connect, because I am an ardent researcher on the same as of now.
So here it goes:
As soon as the kernel page table isolation (KPTI) bug is out of embargo, WhatsApp and FB will be flooded with overnight kernel "shikhuritee" experts who will share shitty advice non-stop.
1. The bug under embargo is a side-channel attack, which exploits the fact that Intel chips do speculative execution without proper isolation between user pages and kernel pages. Therefore, with careful scheduling, a timing attack will reveal some information from kernel pages while the code is running in user mode.
In easy terms, if you have a VPS, another person with a VPS on the same physical server may read memory being used by your VPS, which will result in unwanted data leakage. To make the matter worse, malicious JS from an innocent-looking webpage might be (might be, because JS does not provide language constructs for such fine-grained control; at least none that I know of as of now) able to read kernel pages, and pwn you real hard, real bad.
2. The bug comes from too much reliance on Tomasulo's algorithm for out-of-order instruction scheduling. It is not yet clear whether the bug can be fixed with a microcode update (and if not, Intel has to fix this in silicon itself). As far as I can dig, there is nothing that hints that this bug is fixable in microcode, which makes the matter much worse. Also, according to my understanding, a microcode update would be too trivial a measure to fix this kind of hardware bug.
3. A software-only remedy is possible, and that is being implemented by all major OSs (including our lovely Linux) in kernel space. The patch forces the Translation Lookaside Buffer (TLB) to flush if a context switch happens during a syscall (this is what I understand as of now). The benchmarks suggest that the slowdown will be somewhere between 5% (best case) and 30% (worst case).
4. Regarding point 3, syscalls themselves don't matter much. The only thing that matters is how many times syscalls are called. For example, if you are using read() or write() on 8MB buffers, you won't have too much slowdown; but if you are calling the same syscalls once per byte, a heavy performance penalty is guaranteed. All processes which are I/O heavy are going to suffer (hostings and databases are two common examples).
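To make point 4 concrete, here is a minimal sketch (the file name is made up; this is an illustration, not a benchmark): both loops read the same data, but the second one enters the kernel millions of times more often, so it pays the extra KPTI transition cost millions of times more often.

```c
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void) {
    static char buf[8 * 1024 * 1024];        /* 8 MB buffer */
    int fd = open("bigfile.bin", O_RDONLY);  /* hypothetical input file */
    if (fd < 0) { perror("open"); return 1; }

    /* One read() per 8 MB: very few user/kernel transitions, so the
       extra TLB flushes added by the KPTI patch barely show up. */
    ssize_t n;
    while ((n = read(fd, buf, sizeof buf)) > 0)
        ;

    /* One read() per byte: millions of transitions, each now paying
       for a TLB flush -- this is the workload that crawls. */
    lseek(fd, 0, SEEK_SET);
    char c;
    while (read(fd, &c, 1) > 0)
        ;

    close(fd);
    return 0;
}
```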
5. The patch can be disabled in Linux by passing argument to kernel during boot; however it is not advised for pretty much obvious reasons.
6. For gamers: this is not going to affect games (because those are not I/O heavy)
Meltdown: "Meltdown" can read kernel memory from the L1D cache on affected desktop chips; only Intel is affected by this variant.
Spectre: Spectre is a hardware vulnerability in implementations of branch prediction that affects modern microprocessors with speculative execution, allowing malicious processes access to the contents of other programs' mapped memory. It works on all chips, including Intel/ARM/AMD.
For updates refer the kernel tree: https://git.kernel.org/…/ke…/...
For further details and more chit-chats refer: https://lwn.net/SubscriberLink/...
~Cheers~
(Originally written by Adhokshaj Mishra, edited by me.)
-
Someday my toaster is going to have an IP address. A bad automatic firmware update will most likely cause it to get stuck on the bagel setting until I plug a usb key in and reflash the memory.
Grandma's refrigerator will probably get viruses, lock itself and freeze all the food inside, demanding bitcoin before defrosting.
My blender will probably be used in a massive DDoS attack because Ninja's master MAC address list got leaked and the hidden control panel login is admin/admin.
Ovens will burn houses down when people call in to have them preheat on their way home from work.
Correlations between the number of times the lights are turned on and how many times the toilet is flushed will yield recommendations to run the dishwasher on Thursdays because it's simply more energy efficient.
My dog will tweet when he's hungry and my smart watch will recommend diet dog food in real-time because he's really been eating too much lately--"Do you want to setup a recurring order on Amazon fresh?"
Sometimes living in a cave sounds nice...
-
I finally built a new PC with 8GB memory, i7 4790K and SSD for OS.
My old system was a Core 2 Duo with 2GB of memory. Android Studio used to take 20 minutes for a Gradle sync and another 20 minutes for signing APKs. "Live preview" and the emulator were things of the future for me. Never used them.
But now things have changed. This thing boots up in less than 5 sec and a Gradle sync in Android Studio takes less than 30 sec. I'm so happy right now! It's like a dream come true! *cries in happiness*
-
Apple has a real problem.
Their hardware has always been overpriced, but at least before it had defenders pointing out that it was at least capable and well made.
I know, I used to be one of them.
Past tense.
They have jumped the shark.
They now make pretentious hipster crap that is massively overpriced and doesn't have the basic features (like hardware ports) to enable you to do your job.
I mean, who needs an ESC key? What is wrong with learning to type CTRL-[ instead? Muscle memory? What's that?
They have gone from "It just works" to "It just doesn't work" in no time at all.
And it is Developers who are most pissed off. A tiny demographic who won't be visible on the financial bottom line until their newly absent software suddenly makes itself known two, three years down the line.
By which time it is too late to do anything.
But hey! Look how thin (and thermally throttled) my new laptop is!
-
I have been a mobile developer working with Android for about 6 years now. In that time, I have endured countless annoyances in the Android development space. I will endure them no more.
My complaints are:
1. Ridiculous build times. In what universe is it acceptable for us to wait 30 seconds for a build to complete? Yes, I've done all the optimisations mentioned on this page and then some. Don't even mention hot reload, as it doesn't work fast enough or just does not work at all. Also, buying better hardware should not be a requirement to build a simple Android app; Xcode builds in 2 seconds on an 8GB MacBook Air. A MacBook Air!
2. IDE. Android Studio is a memory hog even if you throw 32GB of RAM at it. The visual editors are janky as hell. If you use Eclipse, you may as well just chop off your fingers right now, because you will have no use for them after you try to build an app from scratch. I mean, just look at some of the posts in this subreddit where the common response is to invalidate caches and restart. That should only be used as a last resort, but it's thrown about as if it solves everything. Truth be told, it's Gradle's fault. Gradle is so annoying I've dedicated the next point to it.
3. Gradle. I am convinced that Gradle causes 50% of an Android developer's pain. From the build times to the integration into various IDEs to its insane package management system. Why do I need to manually exclude dependencies from other dependencies, the build tool should just handle it for me. C'mon it's 2019. Gradle is so bad that it requires approx 54GB of RAM to work out that I have removed a dependency from the list of dependencies. Also I cannot work out what properties I need to put in what block.
4. API. Android API is over-bloated and hellish. How do I schedule a recurring notification? Oh use an AlarmManager. Yes you heard right, an AlarmManager... Not a NotificationManager because that would be too easy. Also has anyone ever tried running a long running task? Or done an asynchronous task? Or dealt with closing/opening a keyboard? Or handling clicks from a RecyclerView? Yes, I know Android Jetpack aims to solve these issues but over the years I have become so jaded by things that have meant to solve other broken things, that there isn't much hope for Jetpack in my mind 😤
5. API 2. A non-insignificant number of Android users are still on Jelly Bean or KitKat! That means we, as developers, have to support some of your shitty API decisions (Fragments, Activities, ListView) from all the way back then!
6. Not reactive enough. Android has support for Databinding recently but this kind of stuff should have been introduced from the very start. Look at React or Flutter as to how easy it is to make shit happen without any effort.
7. Layouts. What the actual hell is going on here. MDPI, XHDPI, XXHDPI, mipmap, drawable. Fuck it, just chuck it all in the drawable folder. Seriously, Android should handle this for me. If I am designing for a larger screen then it should be responsive. I don't want to deal with 50 different layouts spread over 6 different folders.
8. Permission system. Why was this not included from the very start? Rogue apps have abused this and abused your user's privacy and security. Yet you ban us and not them from the Play Store. What's going on? We need answers.
9. In Android, building an app took me 3 months and I had a lot of work left to do but I got so sick of Android dev I dropped it in favour of Flutter. I built the same app in Flutter and it took me around a month and I completed it all.
10. XML.
If you're a new dev, for the love of all that is good in this world, do NOT get into Android development. Start with Flutter or even iOS. On Flutter, build times are insanely fast and the hot reload is consistently under 500ms. It's a breath of fresh air and will save you a lot of headaches, AND it builds for iOS flawlessly.
To the people who build Android, advocate for it and work on it: sorry to swear, but fuck you! You have created a mess that we have to work with on a day-to-day basis, only for us to get banned from the app store! You have sold us a lie that Android development is amazing, with all the sweet-treat names and conferences that look bubbly and fun. You have allowed it to get so bad that we can't target an API higher than 18 because some Android users are still using devices that only support that!
End this misery. End our pain. End our suffering. Throw this abomination away like you do with some of your other projects and migrate your efforts over to Flutter. Please!
#NoToGoogleIO #AndroidSummitBoycott #FlutterDev #ReactNative
-
When you create a bunch of objects in Java and it crashes because you're used to the memory usage of C's structs.
-
The stupid stories of how I was able to break my school's network just to get better internet, as well as more ridiculous fun. XD
1st year:
It was my freshman year in college. The internet sucked really, really, really badly! Too many people were clearly using it. I had to find another way to remedy this. Upon some further research through Google I found out that one can in fact turn their computer into a router. Now, what's interesting about this network is that it only works with computers that have downloaded the necessary software the network provides for you: some weird software that actually looks through your computer and makes sure it's OK to be added to the network. Unfortunately, routers can't download and install that software, thus no internet… but a PC that can itself be turned into a router is a different story. I found that I could download the software, let it check the PC, and then turn on my router feature. Voilà, personal fast internet connected directly into the wall. No more sharing a single shitty router!
2nd year:
This was about the year when bitcoin mining was becoming a thing, and everyone was in on it. My shitty computer couldn’t possibly pull off mining for bitcoins. I needed something faster. How I found out that I could use my schools servers was merely an accident.
I had been installing the software on every possible PC I owned, but alas, all my PCs were just not fast enough. I decided to try it on the RDS server. It worked; the command window was pumping out coins! What I came to find out was that the RDS server had 36 cores. This thing was a beast! And it made sense that it could actually pull off mining for bitcoins. A couple of nights later I signed in remotely to the RDS server. I created a macro that would continuously move my mouse around in the remote desktop screen to keep my session alive at all times, and then I'd start my bitcoin mining operation. The following morning I wake up and my session was gone. How sad, I thought. I quickly try to remote back in to see what I had collected. "Error, could not connect". Weird… this usually never happens; maybe I did the remoting wrong. I went to my school's website to do some research on my remoting problem. It was down. In fact, everything was down… I come to find out that I had accidentally shut down the school's network because of my mining operation. I wasn't found out, but I haven't done any mining since then.
3rd year:
As an engineering student, I found out that all engineering students get access to the school's VPN. Cool; it is technically used to get around some wonky issues with remoting into the RDS servers. What I came to find out, after messing around with it frequently, is that I could actually use the VPN against the screwed-up security on the network. Remember how I told you that a program has to be downloaded before one can be accepted into the network? Well, I was able to bypass all of that simply by using the school's VPN against itself… How dense does one have to be to not have patched that one?
4th year:
It was another programming day, and I needed access to my phone's memory. Using some specially made apps I could easily connect to my phone from my computer and continue my work. But what I found out was that I could in fact travel around in the network. I discovered that I can, in fact, access my phone through the network from anywhere. What resulted was the discovery that the network spans the entirety of the school. I discovered that if I left my phone down in the engineering building and then went north to the biology building, I could still continue to access it. This seems like a fatal flaw. My idea is to hook up a webcam to a robot, remotely control it from the RDS servers, and have this little robot go to my classes for me.
What crazy shit have you done at your University?
-
I had to open the desktop app to write this because I could never write a rant this long on the app.
This will be a well-informed rebuttal to the "arrays start at 1 in Lua" complaint. If you have ever said or thought that, I guarantee you will learn a lot from this rant and probably enjoy it quite a bit as well.
Just a tiny bit of background information on me: I have a very intimate understanding of Lua and its c API. I have used this language for years and love it dearly.
[START RANT]
"arrays start at 1 in Lua" is factually incorrect because Lua does not have arrays. From their documentation, section 11.1 ("Arrays"), "We implement arrays in Lua simply by indexing tables with integers."
From chapter 2 of the Lua docs, we know there are only 8 types of data in Lua: nil, boolean, number, string, userdata, function, thread, and table
The only unfamiliar thing here might be userdata. "A userdatum offers a raw memory area with no predefined operations in Lua" (section 26.1). Essentially, it's for the API to interact with Lua scripts. The point is, this isn't a fancy term for array.
The misinformation comes from the table type. Let's first explore, at a low level, what an array is. An array, in programming, is a collection of data items all in a line in memory (The OS may not actually put them in a line, but they act as if they are). In most syntaxes, you access an array element similar to:
array[index]
Let's look at C, so we have some solid reference. "array" would be the name of the array, but what it really does is keep track of the starting location of the array in memory. Memory in a computer is addressed by number. In a very basic sense, the first sector of your RAM is memory location (referred to as an address) 0. "array" would be, for example, address 543745. This is where your data starts. Arrays can only be made up of one type; this is so that each element in that array is EXACTLY the same size. So, this is how indexing an array works: if you know where your array starts, and you know how large each element is, you can find the 6th element by starting at the start of the array and adding 6 times the size of the data in that array.
Tables are incredibly different. The elements of a table are NOT in a line in memory; they're all over the place depending on when you created them (and a lot of other things). Therefore, an array-style index is useless, because you cannot apply the above formula. In the case of a table, you need to perform a lookup: search through all of the elements in the table to find the right one. In Lua, you can do:
a = {1, 5, 9};
a["hello_world"] = "whatever";
a is a table holding 4 entries (the 4th being the key "hello_world" with the value "whatever"), but a[4] is nil, because even though there are 4 items in the table, it looks for something "named" 4, not the 4th element of the table.
This is the difference between indexing and lookups. But you may say,
"Algo! If I do this:
a = {"first", "second", "third"};
print(a[1]);
...then "first" appears in my console!"
Yes, that's correct, in terms of computer science. Lua, because it is a nice language, makes keys in tables optional by automatically giving them an integer key. This starts at 1. Why? Let's look at that formula for arrays again:
Given array "arr", size of data type "sz", and index "i", find the desired element ("el"):
el = arr + (sz * i)
This NEEDS to start at 0 and not 1 because otherwise, "sz" would always be added to the start address of the array and the first element would ALWAYS be skipped. But in tables, this is not the case, because tables do not have a defined data type size, and this formula is never used. This is why actual arrays are incredibly performant no matter the size, and the larger a table gets, the slower it is.
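A minimal C sketch of that formula in action (illustrative names only): the compiler turns every arr[i] into exactly this start-plus-offset arithmetic, which is why index 0 is the element sitting at the start address itself.

```c
#include <stdio.h>

int main(void) {
    int arr[3] = {10, 20, 30};

    /* el = arr + (sz * i): with i = 0 nothing is added, so the first
       element is the one at the array's start address. */
    int *el = (int *)((char *)arr + sizeof(int) * 1);
    printf("%d\n", *el);     /* prints 20 */
    printf("%d\n", arr[1]);  /* the same lookup, nicer syntax */
    return 0;
}
```

A Lua table lookup has no such formula to apply; it has to search by key, which is exactly the indexing-versus-lookup distinction described above.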
That felt good to get off my chest. Yes, Lua could start the auto-key at 0, but that might confuse people into thinking tables are arrays... well, I guess there's no avoiding that either way.
-
So, some time ago, I was working for a complete puckered anus of a cosmetics company on their ecommerce product. Won't name names, but they're shitty and known for MLM. If you're clever, go you ;)
Anyways, over the course of years they brought in a competent firm to implement their service layer. I'd even worked with them in the past, and it was designed to handle a frankly ridiculous-scale load. After they got the 1.0 released, the manager was replaced with some absolutely talentless, chauvinist cuntrag from a phone company that is well known for having 99% Indian devs and for not being able to hear you now. He of course brought in his number two, worked on making life miserable and running everyone on the team off; inside of a year the entire team was ex-said-phone-company.
Watching the decay of this product was a sheer joy. They cratered the database numerous times during peak-load periods, caused $20M in redis-cluster cost overrun, ended up submitting hundreds of erroneous and duplicate orders, and mailed almost $40K worth of product to a random guy in outer mongolia who is , we can only hope, now enjoying his new life as an instagram influencer. They even terminally broke the automatic metadata, and hired THIRTY PEOPLE to sit there and do nothing but edit swagger. And it was still both wrong and unusable.
Over the course of two years, I ended up rewriting large portions of their infra surrounding the centralized service cancer to do things like, "implement security," as well as cut memory usage and runtimes down by quite literally 100x in the worst cases.
It was during this time that I discovered a rather critical flaw. This is the story of the what, the how, and the "how can you fucking even be that stupid". The issue relates to users, their reports, and their ability to order.
I first found this issue looking at some erroneous data for a low value order and went, "There's no fucking way, they're fucking stupid, but this is borderline criminal." It was easy to miss, but someone in a top down reporting chain had submitted an order for someone else in a different org. Shouldn't be possible, but here was that order staring me in the face.
So I set to work seeing if we'd pwned ourselves as an org. I spend a few hours poring over logs from the log service and Dynatrace trying to recreate what happened. I first tested to see if I could fetch a user, not something that was usually done because auth identity was pervasive. I discover that user ids are INCREMENTAL int values, used in the database and when requesting from the API, so naturally I have a full list of users and their titles and relative positions, as well as reports and descendants, in about 10 minutes.
I try the happy path of setting values for random, known payment methods and org structures similar to the impossible order, and submitting as a normal user, no dice. Several more tries and I'm confident this isn't the vector.
Exhausting that option, I look at the protocol for a type of order in the system that allowed higher level people to impersonate people below them and use their own payment info for descendant report orders. I see that all of the data for this transaction is stored in a cookie. Few tests later, I discover the UI has no forgery checks, hashing, etc, and just fucking trusts whatever is present in that cookie.
An hour of tweaking later, I'm impersonating a director as a bottom-rung employee. Score. So I fill a cart with a bunch of test items and proceed to checkout. There, in all their glory, are the director's payment options. I select one and am presented with:
"please reenter card number to validate."
Bupkiss. Dead end.
OR SO YOU WOULD THINK.
One unimportant detail I noticed during my log investigations, which the shit-slinging GUI monkeys who butchered the system didn't, was that on a failed attempt to submit payment to the DB, the logs were filled with messages like:
"Failed to submit order for [userid] with credit card id [id], number [FULL CREDIT CARD NUMBER]"
One submit click later and the user's credit card number drops into lnav like a gacha prize. I dutifully rerun the checkout and get an email send notification in the logs for successful transfer to fulfillment. Order placed. Some continued experimentation later and the truth is evident:
With an authenticated user of any privilege level, you could place any order, as anyone, using anyone's payment methods, and have it sent anywhere.
So naturally, I pack the crucifixion-worthy body of evidence up and walk it into the IT director's office. I show him the defect, and he turns sheet fucking white. He knows there's no recovering from it, and there's no way his shitstick service team can handle fixing it. Somewhere in his tiny little grinchly manager's heart he knew they'd caused it, and he was to blame for being a shit captain to the SS Failboat. He replies quietly, "You will never speak of this to anyone; fix this discreetly." Straight-up Hitler's-bunker meme rage.
-
Woohoo! 32k achieved!!! Finally I can post some new rant without risking some sudden overshoot 😁
So, putting celebrations aside for a minute: a while ago I noticed a tingle when I stroke my finger across metal areas of my tablet, or the sides of my phone (which probably has metal near it too), while it's charging. And it's been bugging me ever since.
Now, some things to note are that it only happens when my feet are touching the ground though slippers, and that the frequency is so low that I can actually feel the tingle when I slide my finger across the material. This to me at least seems like electricity flows through me into ground, and touching the ground directly provides a path so easy for the electrons to run away that I don't feel it at all. But if I lift my feet off the ground entirely, I just get charged up and after that, nothing else happens.
So those are my ideas. The answers on the subject on the other hand.. absolute cancer. Unsurprisingly, most of them came from Apple users. Here's some of them.
https://discussions.apple.com/threa...
- I've not noticed it, but if you're concerned bring the phone to Apple for evaluation.
- Me too facing same problem.. did u visit apple care?
And one good answer at least...
- google emf sensitivity, its real. You are right, there is a small current flowing through your body, try to limit your usage. The problem with this issue is those who aren't affected (lucky ones for now) will tell you these products are 100% safe. To a degree they are, i used my ipod touch for about 2 years straight vwith virtually no symptoms. then the tingling started and it gets worse.You will get more sensitive to progressively less powerful things. I dont want to scare you but just limit your usage like i didnt do 🙂
Overall that discussion was pretty good actually, aside from "bring it to the Genius Bar, they'll know for sure and not just sell you another unit". But then there's Reddit.
https://reddit.com/r/iphone/...
- Ok, real reason is probably that the extension cord and/or outlet is probably not grounded correctly. Either that or you are using a cheap knockoff charger.
Either use a surge protector and/or use the authentic Apple Charger.
- It's not the volts that hurt you, it's the amps
- I think you are in deep love with your phone. That tingling sensation is usually referred to as "love" in human language.
- Do less acid, I would advise.
Okay, so that's the real cancer. The grounding issue sounds reasonable despite being wrong. Grounding is actually not needed when your charging appliance doesn't have any exposed metal parts, and isolation from the high-voltage to the low-voltage side actually happens through things like routing slots in the PCB, creating spark gaps, and galvanic isolation through things like optocouplers. As for a surge protector? I'm using them to protect my PC and my servers, but the only purpose they serve is to protect from.. you guessed it.. voltage surges, like lightning bolts hitting the grid. They don't do shit for grounding or reducing this tingle! What a fucking tool.
It's not the volts that kill, it's the amps.. yeah I'm sure that the debunking of that is easy to find. Not gonna explain that here. And the rest of it.. yeah it's just fucking cancer.
Now what's the real issue with this tingle? It's actually a Class-Y rated (i.e. kV rated) capacitor that's on the transformer of any switch-mode power supply, including phone chargers. If memory serves me right, it helps with decoupling the switching noise and so on. But as it's connected to the primary side of the transformer, if the cap is sufficiently large and you are sufficiently sensitive, it can actually cause that tingle by passing a fraction of the mains electricity into your body. It's totally safe though, as the power that these caps pass is very small. But to some, it's noticeable.
Hope you found this interesting! And thanks a lot for bringing me to 2^15. I really appreciate it ♥️
-
I like memory hungry desktop applications.
I do not like sluggish desktop applications.
Allow me to explain (although, this may already be obvious to quite a few of you)
Memory usage is stigmatized quite a lot today, and for good reason. Not only is it an indication of poor optimization, but not too many years ago, memory was a much more scarce resource.
And something that started as a joke in that era is true in this era: free memory is wasted memory. You may argue, correctly, that free memory is not wasted; it is reserved for future potential tasks. However, if you have 16GB of free memory and don't have any plans to begin rendering a 3D animation anytime soon, that memory is wasted.
Linux understands this. Linux actually has three states for memory to be in: used, free, and available. Used and free memory are the usual. However, Linux automatically caches files that you use and places them in RAM as "available" memory. Available memory can be used at any time by programs, simply dumping out whatever was previously occupying the memory.
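You can see all three figures yourself; here is a minimal sketch (Linux-only) that just echoes the relevant lines of /proc/meminfo, where MemAvailable is the kernel's estimate of how much a new program could claim right now, mostly cache it is willing to drop on demand:

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    FILE *f = fopen("/proc/meminfo", "r");
    if (!f) { perror("fopen"); return 1; }

    char line[256];
    while (fgets(line, sizeof line, f)) {
        /* Total, truly free, and "available" (free + reclaimable cache). */
        if (!strncmp(line, "MemTotal:", 9) ||
            !strncmp(line, "MemFree:", 8) ||
            !strncmp(line, "MemAvailable:", 13))
            fputs(line, stdout);
    }
    fclose(f);
    return 0;
}
```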
And as you well know, RAM is much faster than even an SSD. Programs which are memory-heavy COULD (< important) be holding things in memory rather than having them sit on the HDD, waiting to be slowly retrieved. I'd much rather a web browser take up 4 GB of RAM than sit around waiting for it to read cached images off my hard drive.
Now, allow me to reiterate: unoptimized programs still piss me off. There's no need for that electron-based webcam image capture app to take three gigs of memory upon launch. But I love it when programs use the hardware I spent money on to run smoother.
Don't hate a program simply because it's at the top of task manager.
-
I used to think Electron apps were gonna do great and make it more accessible for companies to produce high quality programs with ease.
Oh boy, I was wrong. All it did was enable big companies to refactor all of their software to run 5 times slower, consume 10 times more memory, and kill your battery 20 times faster.
I fucking hate all of this prototype-fast, optimize-later bullshit. Can I get some value for my dollar? How come technology is just being degraded for the sake of "ease of programming"?
You save programming time but sacrifice end-user time, 'cause our time just doesn't fucking matter.
-
This rant is particularly directed at web designers, front-end developers. If you match that, please do take a few minutes to read it, and read it once again.
Web 2.0. It's something that I hate. Particularly because the directive amongst webdesigners seems to be "client has plenty of resources anyway, and if they don't, they'll buy more anyway". I'd like to debunk that with an analogy that I've been thinking about for a while.
I've got one server in my home, with 8GB of RAM, 4 cores and ~4TB of storage. On it I'm running Proxmox, which is currently using about 4GB of RAM for about a dozen VMs and LXC containers. The VMs take the most RAM by far, while the LXCs are just glorified chroots (which nonetheless I find very intriguing due to their ability to run unprivileged). The average LXC takes just 60MB of RAM, the amount for an init, the shell and the service(s) running in the LXC. Just like a chroot, but better.
On that host I expect to be able to run about 20-30 guests at this rate. On 4 cores and 8GB RAM. More extensive migration to LXC will improve this number over time. However, I'd like to go further. Once I was able to build a Linux which was just a kernel and busybox, backed by the musl C library. The thing consumed only 13MB of RAM as a VM, with the whole 13MB dedicated entirely to the kernel. I could probably have optimized it further with modularization, but at the time I didn't, due to its experimental nature. In a chroot, the kernel of the host is used, meaning that said setup in a chroot would be down in the kilobytes of RAM consumption. The busybox shell would be its most important RAM consumer, which is negligible.
I don't want to settle for 20-30 VMs. I want to settle for hundreds or even thousands of LXCs on 8GB of RAM, as I've seen first-hand with my own builds that it's possible. That's something that's very important in webdesign. Browsers aren't all that different. More often than not, your website will share its resources with about 50-100 other tabs, because users forget to close their old tabs, are power users, are looking things up on Stack Overflow, or whatever. Therefore that 8GB of RAM now reduces itself to about 80MB only. And then you've got modern web browsers which allocate their own process for each tab (up to a certain amount; it seems to be limited at about 20-30 processes, but still).. and all the memory required to render your page is duplicated into your designated 80MB. Let's say that 10MB is available for the website at most. This is a very liberal amount for a webserver to deal with per request, so let's stick with that, although in reality it'd probably be less.
10MB, the available RAM for the website you're trying to show. Of course, the total RAM of the user is comparatively huge, but your own chunk is much smaller than that. Optimization is key. Does your website really need that amount? In third-world countries where the internet bandwidth is still in the order of kB/s, 10MB is *very* liberal. Back in 2014 when I got into technology and webdesign, there was this rule of thumb that 7 seconds is usually when visitors click away. That'd translate into.. let's say, 10kB/s for third-world countries? 7 seconds makes that 70kB of available network bandwidth.
Web 2.0, taking 30+ seconds to load a web page, even on a broadband connection? Totally ridiculous. Make your website as fast as it can be, after all you're playing along with 50-100 other tabs. The faster, the better. The more lightweight, the better. If at all possible, please pursue this goal and make the Web a better place. Efficiency matters.
-
What an absolute fucking disaster of a day. Strap in, folks; it's time for a bumpy ride!
I got a whole hour of work done today. The first hour of my morning because I went to work a bit early. Then people started complaining about Jenkins jobs failing on that one Jenkins server our team has been wanting to decom for two years but management won't let us force people to move to new servers. It's a single server with over four thousand projects, some of which run massive data processing jobs that last DAYS. The server was originally set up by people who have since quit, of course, and left it behind for my team to adopt with zero documentation.
Anyway, the 500GB disk is 100% full. The memory (all 64GB of it) is fully consumed by stuck jobs. We can't track down large old files to delete because du chokes on the workspace folder with thousands of subfolders with no RAM to spare. We decide to basically take a hacksaw to it, deleting the workspace for every job not currently in progress. This of course fucked up some really poorly-designed pipelines that relied on workspaces persisting between jobs, so we had to deal with complaints about that as well.
So we get the Jenkins server up and running again just in time for AWS to have a major incident affecting EC2 instance provisioning in our primary region. People keep bugging me to fix it, I keep telling them that it's Amazon's problem to solve, they wait a few minutes and ask me to fix it again. Emails flying back and forth until that was done.
Lunch time already. But the fun isn't over yet!
I get back to my desk to find out that new hires or people who got new Mac laptops recently can't even install our toolchain, because management has started handing out M1 Macs without telling us and all our tools are compiled solely for x86_64. That took some troubleshooting to even figure out what the problem was because the only error people got from homebrew was that the formula was empty when it clearly wasn't.
After figuring out that problem (but not fully solving it yet), one team starts complaining to us about a GitHub problem, because we manage the GitHub org. Except it's not a GitHub problem, and I already knew this because they are a Problem Team that uses some technical authoring software with Git integration but has only the barest understanding of what Git actually does. Turns out it's a Git problem. An update for Git was pushed out recently that patches a big bad vulnerability, and the way it was patched causes problems because they're using Git wrong (multiple users accessing the same local repo on a Samba share). It's a huge vulnerability, so my entire conversation with them went sort of like:
"Please don't."
"We have to."
"Fine, here's a workaround, this will allow arbitrary code execution by anyone with physical or virtual access to this computer that you have sitting in an unlocked office somewhere."
"How do I run a Git command I don't use Git."
So that dealt with, I start taking a look at our toolchain, trying to figure out if I can easily just cross-compile it to arm64 for the M1 macbooks or if it will be a more involved fix. And I find all kinds of horrendous shit left behind by the people who wrote the tools that, naturally, they left for us to adopt when they quit over a year ago. I'm talking entire functions in a tool used by hundreds of people that were put in as a joke, poorly documented functions I am still trying to puzzle out, and exactly zero comments in the code and abbreviated function names like "gars", "snh", and "jgajawwawstai".
While I'm looking into that, the person from our team who is responsible for incident communication finally gets the AWS EC2 provisioning issue reported to IT Operations, who sent out an alert to affected users that should have gone out hours earlier.
Meanwhile, according to the health dashboard in AWS, the issue had already been resolved three hours before the communication went out and the ticket remains open at this moment, as far as I know.
-
I'll use this topic to segue into a related (lonely) story befitting my mood these past weeks.
This entire story is going to sound egotistical, especially this next part, but it's really not. (At least I don't think so?)
As I'm almost entirely self-taught, having another dev give me good advice would have been nice. I've only known / worked with a few people who were better devs than I, and rarely ever received good advice from them.
One of those better devs was my first computer science teacher. Looking back, he was pretty average, but he held us to high standards and gave good advice. The two that really stuck with me were: 1) "save every time you've done something you don't want to redo," and 2) "printf is your best debugging friend; add it everywhere there's something you want to watch." Probably the best and most helpful advice I've ever received 😊
I've seen other people here posting advice like "never hardcode" or "modularity keeps your code clean" -- I had to discover these pretty simple concepts entirely on my own. School (and later college) were filled with terrible teachers and worse students, and so were almost entirely useless for learning anything new.
The only decent dev I knew had brilliant ideas (genetic algorithms, sandboxing, ...) before they were widely used, but could rarely implement them well because he was generally an idiot. (Idiot savant, I think? Definitely the idiot part.) I couldn't stand him. Completely bypassing a ridiculously long story: I helped him on a project to build his own OS from scratch, and we made very impressive progress, even by today's standards. Custom bootloader, hardware interfacing, memory management, (semi) sandboxed processes, GUI, example programs ...; we were in high school. I'm still surprised and impressed with what we accomplished.
But besides him, almost every other dev I met was mediocre. Even outside of school, I went so many years without having another competent dev to work with. I went through various jobs helping other dev(s) on their projects (or rewriting them), learning new languages/frameworks almost every time: PHP, Pascal, Perl, Zend, JS, VB, Rails, Node, .... I learned new concepts occasionally (which was wonderful) but overall it was just tedious, and it never paid well because I was too young to be taken seriously (and female, further exacerbating it). On the bright side, it didn't diminish my love for coding, and I usually spent my evenings playing with projects of my own.
The second dev (and one of the best I've ever met) went by Novo. His approach to a game engine reminded me of General Relativity: everything was modular, had a rich inheritance tree, and could receive user input at any point along said tree. A user could attach their view/control to any object. (Computer control methods could be attached in this way as well.) The UI would obviously change depending on how the user could interact and the number of objects; admins could view/monitor any of these. Almost every object / class of object could talk to almost everything else. It was beautiful. I learned so much from his designs. (Honestly, I don't remember the code at all, and that saddens me.) There were other things, too, but that one amazed me the most.
I havent met anyone like him ever again.
Anyway, I don't know if I can really answer this week's question. I definitely received some good advice while initially learning, but past that it's all been through discovering things on my own.
It's been lonely. ☹
-
Perhaps you've seen my earlier post about the bottom half of a lamp post?
I've really stepped up my environment for this one. NOTE: Top half of the lamp post still not modeled.
Here, we see a 16K skybox, a reflective sphere, and a glass sphere, in addition to my original lamp post base.
It used 2.24 GB of memory to render, but only took 47 seconds.
-
An overflow in C: due to some mistake, the program was writing 16 bytes into an array of length 10.
The problem was that this ended up overwriting another place in memory that was used by another algorithm to perform calculations... so we thought that the other algorithm was buggy all along.
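A minimal sketch of the mechanism (hypothetical names, and deliberately undefined behaviour, shown only to illustrate why the blame landed on the wrong code):

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    /* Two unrelated pieces of state happen to sit next to each other. */
    struct {
        char buf[10];      /* the array of length 10 */
        int  coefficient;  /* data the "other" algorithm relies on */
    } state = { "", 42 };

    /* The bug: 16 bytes written into a 10-byte array. The extra bytes
       spill into whatever lives next to it -- here, `coefficient`. */
    memset(state.buf, 0xFF, 16);

    /* The unrelated calculation now misbehaves, and it looks like
       ITS fault, not the overflow's. */
    printf("coefficient = %d\n", state.coefficient);  /* no longer 42 */
    return 0;
}
```
-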
I've optimised so many things in my time I can't remember most of them.
Most recently, something had to be the equivalent of `"literal" LIKE column` with a million rows to compare. It would take around a second on average to look up each literal, for a service that needs to handle high load at low latency. This isn't an easy case to optimise; many people would consider it impossible.
It took me a couple of hours to reverse engineer the data and write a few-hundred-line implementation that would do the lookup in 1ms on average, with the worst possible case being very rare and not too far from that.
In another case there was a lookup of arbitrary time spans that most people would not bother to cache because the input parameters are too short-lived and variable to make a difference. I replaced the 50000+ line application acting as a middle man between the application and database with 500 lines of code that did the lookup faster and was able to implement a reasonable caching strategy. This dropped resource consumption by a factor of ten at the very least. Misses were cheaper and it was able to cache most cases. It also involved modifying the client library in C to stop it unnecessarily wrapping primitives in objects for the high-level language, which was causing it to consume excessive amounts of memory when processing huge data streams.
Another system would constantly download a huge data set for every point of sale, then parse and apply it. It had to reflect changes quickly, but would download the whole dataset, containing hundreds of thousands of rows, each time. I whipped up a system so that a single server (barring redundancy) would download it in a loop, parse it using C (which was much faster than the traditional interpreted language), then use a custom data differential format, a TCP data streaming protocol, binary serialisation and LZMA compression to pipe it down to the points of sale. This protocol also used versioning for catchup and differential combination for additional reduction in size. It went from being 30 seconds to a few minutes behind to being able to keep to within a second of changes. It had also been using so much bandwidth that it would reach the limit on ADSL connections and then get throttled. I looked at the traffic stats afterwards and it dropped from dozens of terabytes a month to around a gigabyte or so a month for several hundred machines. The drop in the graphs was so sharp you'd think all the machines had been turned off, because that's what it looked like. It could now happily run over GPRS or 56K.
I was working on a project with a lot of data and noticed these huge tables and horrible queries. The tables were all the results of queries. Someone had written terrible SQL, then to optimise it they ran it in the background with all possible variable values and stored the results of the joins and aggregates into new tables. On top of those tables they wrote more SQL. I wrote some new queries and query generation that wiped out thousands of lines of code immediately and operated on the original tables, taking things down from 30GB and rapidly climbing to a couple of GB.
Another time a piece of mathematics had to generate all possible permutations and the existing solution was factorial. I worked out how to optimise it to run n*n which believe it or not made the world of difference. Went from hardly handling anything to handling anything thrown at it. It was nice trying to get people to "freeze the system now".
I built my own frontend systems (admittedly rushed) that do what angular/react/vue aim for but with higher (maximum) performance, including an in-memory database to back the UI that had layered, event-driven indexes and could handle referential integrity (an overlay on the database only revealing items with valid integrity) or reordering and repositioning events very rapidly using a custom AVL tree. You could layer indexes over it (data inheritance) that could be partial and dynamic.
So many times have I optimised things on automatic, just cleaning up code normally. Hundreds, thousands of optimisations. It's what makes my clock tick. -
It has been bugging the shit out of me lately... the sheer number of shit-tier "programmers" that have been climbing out of the woodwork the last few years.
I'm not trying to come across as elitist or "holier than thou", but it's getting ridiculous and annoying. Even on here, you have people who "only do frontend development" or some other lame ass shit-stain of an excuse.
When I first started learning programming (PHP was my first language), it wasn't because I wanted to be a programmer. I used to be a member (my account is still there, in fact) of "HackThisSite", back when I was about 12 years old. After hanging out long enough, I got the hint that the best hackers are, in essence, programmers.
Want to learn how to do SQL injection? Learn SQL - write a program that uses an SQL database, and ask yourself how you would exploit your own software.
Want to reverse engineer the network protocol of some proprietary software? Learn TCP/IP - write a TCP/IP packet filter.
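To make the "exploit your own software" idea above concrete, here's a small C/SQLite sketch (the users table and its column are made up for the example; wiring it up to an actual database is left out). The first function pastes user input straight into the SQL text - which is exactly what injection abuses - while the second binds it as a parameter.

```c
#include <stdio.h>
#include <sqlite3.h>

/* Vulnerable: user input is pasted straight into the SQL text.
 * Passing  name = "x' OR '1'='1"  turns the query into
 * SELECT * FROM users WHERE name = 'x' OR '1'='1'  - every row matches. */
void lookup_injectable(sqlite3 *db, const char *name)
{
    char sql[256];
    snprintf(sql, sizeof sql,
             "SELECT * FROM users WHERE name = '%s';", name);
    sqlite3_exec(db, sql, NULL, NULL, NULL);
}

/* Safer: the input is bound as a parameter and never parsed as SQL. */
void lookup_parameterised(sqlite3 *db, const char *name)
{
    sqlite3_stmt *stmt;
    if (sqlite3_prepare_v2(db,
            "SELECT * FROM users WHERE name = ?;", -1, &stmt, NULL) == SQLITE_OK) {
        sqlite3_bind_text(stmt, 1, name, -1, SQLITE_TRANSIENT);
        while (sqlite3_step(stmt) == SQLITE_ROW) { /* read columns here */ }
        sqlite3_finalize(stmt);
    }
}
```

Same exercise works in any language: build the vulnerable version, break it yourself, then fix it.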
Back then, a programmer and a hacker were very much one and the same. Nowadays, some kid can download Python, write a "hello, world" program and they're halfway to freelancing or whatever.
It's rare to find a programmer - a REAL programmer, one who knows the systems he develops for better than the back of his hand.
These days, I find people want the instant gratification that these simpler languages provide. You don't need to understand how virtual memory works, hell many people don't even really understand C/C++ pointers - and that's BASIC SHIT right there.
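For reference, the "basic shit" in question - a trivial C sketch, nothing more:

```c
#include <stdio.h>

int main(void)
{
    int value = 5;
    int *p = &value;            /* p stores the address of value, not a copy */

    *p = 7;                     /* writing through the pointer changes value */
    printf("%d\n", value);      /* prints 7 */

    int arr[3] = {10, 20, 30};
    int *q = arr;               /* arrays decay to a pointer to their first element */
    printf("%d\n", *(q + 2));   /* pointer arithmetic: prints 30 */

    return 0;
}
```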
Put another way, would you want to take your car to a brake mechanic that doesn't understand how brakes work? I sure as hell wouldn't.
Watching these "programmers" out there who don't have a fucking clue how the code they write does what it does, is like watching a grown man walk around with a kid's toolbox full or plastic toys calling himself a mechanic. (I like cars, ok?!)
*sigh*
Python, AngularJS, Bootstrap, etc. They're all tools and they have their merits. But god fucking dammit, they're not the ONLY damn tools that matter. Stop making excuses *not* to learn something, Mr."IOnlyDoFrontEnd".
Coding ain't Legos, fuckers. -
Life Before the Computer
An application was for employment
A program was a TV show
A cursor used profanity
A keyboard was a piano!
Memory was something that you lost with age
A CD was a bank account
And if you had a 3-inch floppy
You hoped nobody found out!
Compress was something you did to garbage
Not something you did to a file
And if you unzipped anything in public
You'd be in jail for awhile!
Log on was adding wood to a fire
Hard drive was a long trip on the road
A mouse pad was where a mouse lived
And a backup happened to your commode!
Cut - you did with a pocket knife
Paste you did with glue
A web was a spider's home
And a virus was the flu!
I guess I'll stick to my pad and paper
And the memory in my head
I hear nobody's been killed in a computer crash
But when it happens they wish they were dead! -
Our team makes software in Java, and for technical reasons we require 1GB of memory for the JVM (set with the Xmx switch).
If you don't have enough free memory, the app just exits without any sign, because the JVM couldn't take a big enough bite of memory.
Many days later you just stand there without a clue as to why the launcher does nothing.
Then you remember this constraint and start to close every memory-heavy app you can think of (I'm looking at you, Chrome), no matter how important those spreadsheets or Illustrator files are. Congratulations, you just freed up 4GB of memory, things should work now! WRONG!
But why you might ask. You see we are using 32-bit version of java because someone in upper management decided that it should run on any machine (even if we only test it on win 7 and high sierra) and 32 is smaller than 64 so it must be downwards compatible! we should use it! Yes, in 2019 we use 32-bit java because some lunatic might want to run our software on a Windows XP 32-bit OS. But why is this so much of a problem?
Well.. the 32-bit version of Java requires CONTIGUOUS FREE SPACE IN MEMORY TO EVEN START... AND WE ARE REQUESTING ONE GIGABYTE!!
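To make the contiguity point concrete, a tiny C illustration (this is not how the JVM actually reserves its heap, just the general idea that one big allocation needs one big unbroken hole in the address space):

```c
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* One contiguous gigabyte - all or nothing. */
    size_t one_gig = (size_t)1024 * 1024 * 1024;
    void *heap = malloc(one_gig);

    if (heap == NULL) {
        /* Plenty of free RAM overall won't help if no single
         * unbroken 1 GB hole exists in the 32-bit address space. */
        printf("no contiguous 1 GB block available\n");
        return 1;
    }

    printf("reserved 1 GB at %p\n", heap);
    free(heap);
    return 0;
}
```

The 64-bit build sidesteps this simply because its address space is so much bigger that a 1 GB hole practically always exists.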
So you can shove your swap and closed applications up your ass, but I bet you won't get 1GB of contiguous memory that way!
Now there will be a meeting about this issue and another related to the issues with the 32-bit JVM tomorrow. The only problem is that this issue only occurs if you've used up most of your memory and then try to open our software. So upper management will probably deem this issue minor and won't allow us to upgrade to 64-bit... in 20fucking19 -
The GashlyCode Tinies
A is for Amy whose malloc was one byte short
B is for Basil who used a quadratic sort
C is for Chuck who checked floats for equality
D is for Desmond who double-freed memory
E is for Ed whose exceptions weren’t handled
F is for Franny whose stack pointers dangled
G is for Glenda whose reads and writes raced
H is for Hans who forgot the base case
I is for Ivan who did not initialize
J is for Jenny who did not know Least Surprise
K is for Kate whose inheritance depth might shock
L is for Larry who never released a lock
M is for Meg who used negatives as unsigned
N is for Ned with behavior left undefined
O is for Olive whose index was off by one
P is for Pat who ignored buffer overrun
Q is for Quentin whose numbers had overflows
R is for Rhoda whose code made the rep exposed
S is for Sam who skipped retesting after wait()
T is for Tom who lacked TCP_NODELAY
U is for Una whose functions were most verbose
V is for Vic who subtracted when floats were close
W is for Winnie who aliased arguments
X is for Xerxes who thought type casts made good sense
Y is for Yorick whose interface was too wide
Z is for Zack in whose code nulls were often spied
- Andrew Myers -
I haven't ranted for today, but I figured that I'd post a summary.
A public diary of sorts.. devRant is amazing, it even allows me to post the stuff that I'd otherwise put on a piece of paper and probably discard over time. And with keyboard support at that <3
Today has been a productive day for me. Laptop got restored with a "pacman -Syu" over a Bluetooth mobile data tethering from my phone, said phone got upgraded to an unofficial Android 9 (Pie) thanks to a comment from @undef, etc.
I've also made myself a reliable USB extension cord to be able to extend the 20-30cm USB-A male to USB-C male cord that Huawei delivered with my Nexus 6P. The USB-C to USB-C cord that allows for fast charging is unreliable.. ordered some USB-C plugs, in order to make a high power cable out of them when they arrive.
So that plug I've made.. USB-A male to USB-A female, into which my short USB-C to USB-A wire can plug. It's a 1M wire, with 18AWG wire for its power lines and 28AWG wires for its data lines. The 18AWG power lines can carry up to 10A of current, while the 28AWG lines can carry up to 1A. All wires were made into 1M pieces. These resulted in a very low impedance path for all of them; my multimeter measured no more than 200 milliohms across them, though I'll have to verify and finetune that on my oscilloscope with 4-wire measurement.
So the wire was good. Easy too, I just had to look up the pinout and replicate that on the male part.
That's where the rant part comes in.. in fact I've got quite uncomfortable with sentences that don't include at least one swear word at this point. All hail to devRant for allowing me to put them out there without guilt.. it changed my very mind <3
Microshaft WanBLowS.
I've tried to plug my DIY extension cord into it, and plugged my phone and some USB stick into it of which I've completely forgot the filesystem. Windows certainly doesn't support it.. turns out that it was LUKS. More about that later.
Windows returned that it didn't support either of them, due to "malfunctioning at the USB device". So I went ahead and plugged in my phone directly.. works without a problem. Then I went ahead and troubleshooted the wire I've just made with a multimeter, to check for shorts.. none at all.
At that point I suspected that WanBLowS was the issue, so I booted up my (at the time) problematic Arch laptop and did the exact same thing there, testing that USB stick and my phone there by plugging it through the extension wire. Shit just worked like that. The USB stick was a LUKS medium and apparently a clone of my SanDisk rootfs that I'm storing my Arch Linux on my laptop at at the time.. an unfinished migration project (SanDisk is unstable, my other DM sticks are quite stable). The USB stick consumed about 20mA so no big deal for any USB controller. The phone consumed about 500mA (which is standard USB 2.0 so no surprise) and worked fine as well.. although the HP laptop dropped the voltage to ~4.8V like that, unlike 5.1V which is nominal for USB. Still worked without a problem.
So clearly Windows is the problem here, and this provides me one more reason to hate that piece of shit OS. Windows lovers may say that it's an issue with my particular hardware, which maybe it is. I've done the Windows plugging solely through a USB 3.0 hub, which was plugged into a USB 3.0 port on the host. Now USB 3.0 is supposed to be able to carry up to 1A rather than 500mA, so I expect all the components in there to be beefier. I've also tested the hub as part of a review, and it can carry about 1A no problem, although it seems like its supply lines aren't shorted to VCC on the host, like a sensible hub would. Instead I suspect that it's going through the hub's controller.
Regardless, this is clearly a bad design. One of the USB data lines is biased to ~3.3V if memory serves me right, while the other is biased to 300mV. The latter could impose a problem.. but again, the current path was of a very low impedance of 200milliohms at most. Meanwhile the direct connection that omits the ~200ohm extension wire worked just fine. Even 300mV wouldn't degrade significantly over such a resistance. So this is most likely a Windows problem.
That aside, the extension cord works fine in Linux. So I've used that as a charging connection while upgrading my Arch laptop (which as you may know has internet issues at the time) over Bluetooth, through a shared BNEP connection (Bluetooth tethering) from my phone. Mobile data since I didn't set up my WiFi in this new Pie ROM yet. Worked fine, fixed my WiFi. Currently it's back in my network as my fully-fledged development host. So that way I'll be able to work again on @Floydian's LinkHub repository. My laptop's the only one who currently holds the private key for signing commits for git$(rm -rf ~/*)@nixmagic.com, hence why my development has been impeded. My tablet doesn't have them. Guess I'll commit somewhere tomorrow.
(looks like my rant is too long, continue in comments) -
Is obsidian a fucking joke?
Seriously, is it a joke? Why would you ever care so much about indexing literally everything, if the entire thing crashes and/or takes >5min to LITERALLY just open the fucking directory and/or (so help you) if that directory is full of projects/repos or whatever the fuck and the total size of said directory is like >5GB.
WHY THE FUCK WOULD YOU INDEX EVERYTHING? -- "Ohh obsidian's not supposed to be used a fully fledged IDE, ohh obsidian should just handle MD files and normal sized projects, ohh the plugins and ease-of-use" -- Fuck.
There's no fucking real reason to index everything, BY DEFAULT. You open a directory with Obsidian? Doesn't matter, it's 1 byte, it's 100GB, you get indexed. Deal with it. It will use LITERALLY every resource your computer has. I'm surprised it doesn't go galaxy brain and ping if any other computers/devices are on the network and then attempt to connect and use their hardware (obsidian can be like a node!).
How shit can you be at understanding basic data structures and algorithms, where you just revert to based google-chrome brain and let the FUCKING TEXT EDITOR -- OBSIDIAN IS A FUCKING TEXT EDITOR HOLY SHIT -- hog all conceivable memory.
I swear to <some-deity> if anyone fucking says "Ohhhhhhhh actually, it's not a text editor, it has plugins and features and shit, it does all dis cool stff", OR, "Ohhhhh actually, obsidian indexes things for a very specific/rationale/apt/pragmatic/academic reason" OR "ohhhh, I have 100 iphones, 1000 ipads and a trillion desktop computers that each have 256GB of memory, why you hating on obsidian?" then go kick rocks. The fucking lot of you. Are you fucking kidding me.8 -
Chrome, Firefox, and yes even you Opera, Falkon, Midori and Luakit. We need to talk, and all readers should grab a seat and prepare for some reality checks when their favorite web browsers are in this list.
I've tried literally all of them, in search for a lightweight (read: not ridiculously bloated) web browser. None of them fit the bill.
Yes Midori, you get a couple of bonus points for being the most lightweight. Luakit however.. as much as I like vim in my terminal, I do not want it in a graphical application. Not to mention that, just like all the others, you just use webkit2gtk, and are therefore just as bloated as the rest. Lightweight my ass! But programmable with Lua, woo! As if Selenium, headless Chrome, ... don't already do that for any browser. And that's it for the unique features as far as I'm concerned. One is slow, single-threaded and lightweight-ish (Midori) and another has vim keybindings in an application that shouldn't (Luakit).
Pretty much all of them use webkit2gtk as their engine, and pretty much all of them launch a separate process for each tab. People say this is more secure, but I have serious doubts about that. You're still running all these processes as the same user, and they all have full access to the X server they run under (this is also a criticism against user separation on a single X session in general). The only thing it protects against is a website crashing the browser, where only that tab and its process would go down. Which.. you know.. should a webpage even be able to do that?
But what annoys me the most is the sheer amount of memory that all of these take. With all due respect all of you browsers, I am not quite prepared to give 8 fucking gigabytes - half the memory in this whole box! - just for a dozen or so tabs. I shouldn't have to move my web browser to another lesser used 16GB box, just to prevent this one from going into fucking swap from a dozen tabs. And before someone has a go at the add-ons, there's 4 installed and that's it. None of them are even close to this complete and utter memory clusterfuck. It's the process separation. Each process consumes half a GB of memory, and there's around a dozen of them in a usual browsing session. THAT is the real problem. And I want to get rid of it.
Browsers are at their pinnacle of fucked up in my opinion, literally to the point where I'm seriously considering elinks. Being a sysadmin, I already live my daily life in terminals anyway. As such I also do have resources. But because of that I also associate every process with its cost to run it, in terms of resources required. Web browsers are easily at the top of the list.
I want to put 8GB into perspective. You can store nearly 2 entire DVD movies in that memory. However media players used to play them (such as SMPlayer) obviously don't do that. They use 60-80MB on average to play the whole movie. They also require far less processing power than YouTube in a web browser does, even when you download that exact same video with youtube-dl (either streamed within the media player or externally). That is what an application should be.
Let's talk a bit about these "complicated" websites as well. I hate to break it to you framework web devs, but you're a dime a dozen. The competition is high between web devs for that exact reason. And websites are not complicated. The document itself is plain old HTML, yes even if your framework converts to it in the background. That's the skeleton of your document, where I would draw a parallel with documents in office suites that are more or less written in XML. CSS.. oh yes, markup. Embolden that shit, yes please! And JavaScript.. oh yes, that pile of shit that's been designed in half a day, and has a framework called fucking isEven (which does exactly what it says on the tin, modulo 2 be damned). Fancy some macros in your text editor? Yes, same shit, different pile.
Imagine your text editor being as bloated as a web browser. Imagine it being prone to crashing tabs like a web browser. Imagine it being so ridiculously slow to get anything done in your productivity suite. But it's just the usual with web browsers, isn't it? Maybe Gopher wasn't such a bad idea after all... Oh and give me another update where I have to restart the browser when I commit the heinous act of opening another tab, just because you had to update your fucking CA certs again. Yes please!19 -
trying to do anything on the PS2 is almost fucking impossible
i imagine a board meeting where they were designing the hardware
"how can we make this insanely hard to use?"
"let's make decentralized partition definitions, allow fragmenting of entire partitions, and require all partitions to be rounded to 4MB. If you delete a partition, don't wipe the partition out, just rename it to "_empty" and the system will do it for you, except it actually won't because fuck you"
"let's require 1-bit serial registers to be used for memory card access and make sure you can't take more than 8 CPU cycles to push each bit or it'll trash the memory card"
"let's make the network module run on a 3-bit serial register and when initialized it halves the available memory but only after 8 seconds of activity"
"let's require the system to load feature modules called "IOPs" and require the software to declare which of the 256 possible slots it wants to use (max of 8 IOPs) then insert stubs into those. Any other IOP you call will hang the system and probably corrupt the HDD. You also have to overwrite the stubbed IOPs with your own but only if you can have the stubs chainload the other IOPs on top of themselves"
"let's require you to write to the controller registers to update them, but you have to write the other controller's last-polled state or the controller IOP will hang"
of course this couldn't make sense, it's
s s s s
o o o o
n n n n
y y y y -
So, CS student here.
Gave TCS "national" level test.
Quoting from the question:
"if you have 3 bytes of memory, it can be used to represent 2^3=8 values in the memory"
This test is a waste of at least 30000+ human hours and these guys didn't even put 24 hours of effort to make sure questions are correct.
Fuck this fucking IT industry.
Fuck the people who designed this testing process.
Fuck the people who endorsed this process.
Fuck the management for passing it as a test.
The people who wrote the test question can go die in hell.
It's not my problem that their mothers fucked Neanderthals.
Uh! All I want is a job, but I ended up wasting 200+ hours of time. -
So I've started learning Rust and I must say it feels great! But some parts of the language, like enums, are quite different than what I'm used to.
As a proof of concept I've reimplemented a small API (an Azure Function App) in Rust with Actix Web and it's FAST AS FUCK BOIII.
The response is served about 5x as quickly and the memory footprint shrank from some 90 MB to around 5 MB.
In my small-scale use case it's not a huge difference, but I think it can be massive at large scales...
What is your experience with Rust (at scale)?
I wish I could quickly reimplement the whole fucking CMS Of Doom™ in Rust... but no time and resources :( -
The only thing more dangerous than an alcoholic short-term-memory-challenged non-technical throw-you-under-the-bus IT director with self-esteem issues that are sporadically punctuated by delusions of superiority is one who fears for his job. Submitted for your inspection: a besotted mass of near-human brain function who not only has a 50 person IT department to run, but has also been questioned by the business owners as to what he actually does. So he has decided to show them. He has purchased a vendor product to replace a core in-house developed application used to facilitate creating the product the business sells. The purchased software only covers about 40 percent of the in-house application's functionality, so he is contracting with the vendor to perform custom development on the purchased product (at a cost likely to be just shy of six-figures) so that about 90 percent of existing functionality will be covered. He has asked one of his developers (me) to scale down the existing software to cover the functionality gaps the purchased software creates. There is no deployment plan that will allow the business to transition from the current software to the new vendor-supplied one without significantly hurting the ability of the business to function. When anyone raises this issue he dismisses it with sage musings such as, "I know it will be painful, but we'll just have to give the users really good support." Because he has no idea what any of his staff actually does, he is expecting one of his developers (again, unfortunately, me) to work with the vendor so that the Frankensoftware will perform as effectively as the current software (essentially as a project manager since there will be no in-house coding involved). Lastly, he refuses to assign someone to be responsible for the software: taking care of maintenance, configuration, and issue resolutions after it has been rolled out. When I pointedly tell him I will not be doing that (because this is purchased software and I am not a system admin or desktop engineer) he tells me, "Let me think about this." The worst part is that this is only one of four software replacement initiatives he is injecting himself into so he can prove his worth to the business owners. And by doing so he is systematically making every software development initiative akin to living in Dante's Eighth Circle. I am at the point where I want to burn my eye out with a hot poker, pour salt into the wound, and howl to the heavens in unbearable agony for a month, so when these projects come to fruition, and I am suffering the wrath of the business owners, I can look back on that moment I lost my eye and think "good times."4
-
This is my first rant here, so I hope everyone has a good time reading it.
So, the company I am working for put me on the task of rewriting a firmware that has been extended for about 20 years now. Which is fine, since all new machines will be on a new platform anyways. (The old firmware was written for an 8051 initially. That thing has 256 bytes of RAM. Just imagine the usage of unions and bitfields...)
So, me and a few colleagues go ahead and start from scratch.
In the meantime however, the client has hired one single lonely developer. Keep in mind that nobody there understands code!
And oh boy did he go nuts on the old code, only for it to be used on the very last machine of the old platform, ever! Everything after that one will have our firmware!
There are other machines in that series, using the original extended firmware. Nothing is compatible, bootloaders do not match, memory layouts do not match, the code is a horrible mess now, the client is writing the specification RIGHT NOW (mind you, the machine is already sold to customers), there are no tests, and for the grand finale, the guy quit his job and went to a different company. Did I mention the bugs it has and the features it lacks?
Guess who's got to maintain that single abomination of a firmware now? -
Problems with redis... timeout everywhere...
30k READs per minute.
Me : Ok, How much ram are we actually using in redis ?
Metrics : Average : 30 MB
Me ; 30 MB, sure ? not 30 GB ?
Metrics : Nop, 30 MB
Me : fuck you redis then, hey memory cache, are you there ?
Memory cache : Yep, but only for one instance.
Me : Ok. So from now on you, Memory cache, are used, and you, Redis, just publish messages when a key should be deleted. Works for you two ?
Memory cache and redis : Yep, but nothing out of the box exists
Me : Fine... I'll code it myself, with blackjack and hookers.
Redis : Why do I exist ? -
Fuck you Intel.
Fucking admit that your hardware has a problem!
"Intel and other technology companies have been made aware of new security research describing software analysis methods that, when used for malicious purposes, have the potential to improperly gather sensitive data from computing devices that are operating as designed. Intel believes these exploits do not have the potential to corrupt, modify or delete data"
With Meltdown one process can fucking read everything that is in memory. Every password and every other sensitive bit. Of course you can't change sensitive data directly. You have to use the sensitive data you gathered... Big fucking difference, you dumb shits.
Meltdown occurs because of hardware-implemented speculative execution.
The solution is to fucking separate kernel- and user-address space.
And you're saying that your hardware works how it should.
Shame on you.
I'm not saying that I don't tolerate mistakes like this. Shit happens.
But not having the balls to admit that it is because of the hardware makes me fucking angry. -
Fucking hate my job 😡
I joined as nodejs dev at a mnc 3months ago involved in banking software in which i dont have any domain knowledge.. first 10 days I was told to go through fucking udemy nodejs and graphql tutorial (wtf) which i already have experience with before joining.. after that my reporting manager gives me task to resolve fields and gave me shitty jira story link to read.. that shit story link had no explanation about the fields and what the database it is, then she says to use some shitty sdk which is built internally by shiity devloper which had no documentation and have to follow other module which was again written by that sr. Dev... They hav fucked up the graphql and nodejs and entire stack and also till date no one has ever given any explanation about the domain and the fields and database schema.. this manager refuses to share knowledge about the domain now how the fuck i resolve the graphql schema which was again written by non technical b.a.. all they have used is latest technology in a shitty way with no standards to to follow .. no dataloading no caching no batching.. use shitty sdk which does not give access to dbconn and fucking tightly coupling expressjs which when i start consumes crazy 400Mb of memory .. these fucking seniors devs + the fucking b.a having 12+. Yrs exp each have fucked the entire codebase... Each day killing my passion for app development.. fuckkk ... Dunno what to do now5 -
It's rant time again. I was working on a project which exports data to a zipped CSV and uploads it to S3. I asked colleagues to review it; I guess that was a mistake.
Well, two of my lesser-known colleagues reviewed it, and one of their complaints was that it wasn't TypeScript. Well yes, good thing you have EYES; I'm not comfortable with TypeScript yet, so I made it in Node.js (which is absolutely fine).
The other guy said that I could stream to the zip file, which I didn't know was possible, so I said "that's impossible, right?" (I didn't know some zip algorithms work on streams). And he kept brushing over it and talking about why I should use streams. I obviously have used streams before, and if he had read my code he could see that it streamed everything to the filesystem and afterwards to S3. He continued to behave like I was a literal child who had just used Node.js for 2 seconds. (I'm probably half his age, so fair enough.)
Never got an answer out of him and had to google it myself and research how zlib works, while he was sending me obvious examples of how streams work. Which annoyed me because I had asked him a very simple question.
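For anyone wondering what "zlib works on streams" actually looks like: roughly the canonical chunked-deflate loop, sketched here in C (the same shape as zlib's own example; note it produces a raw zlib/deflate stream - a real .zip container needs extra framing on top).

```c
#include <stdio.h>
#include <string.h>
#include <zlib.h>

#define CHUNK 16384

/* Compress src into dst chunk by chunk - memory use stays at roughly
 * 2*CHUNK no matter how big the input file is. */
int deflate_stream(FILE *src, FILE *dst)
{
    unsigned char in[CHUNK], out[CHUNK];
    z_stream strm;
    memset(&strm, 0, sizeof strm);
    if (deflateInit(&strm, Z_DEFAULT_COMPRESSION) != Z_OK)
        return -1;

    int flush;
    do {
        strm.avail_in = (uInt)fread(in, 1, CHUNK, src);
        strm.next_in = in;
        flush = feof(src) ? Z_FINISH : Z_NO_FLUSH;

        do {
            strm.avail_out = CHUNK;
            strm.next_out = out;
            deflate(&strm, flush);
            fwrite(out, 1, CHUNK - strm.avail_out, dst);
        } while (strm.avail_out == 0);   /* drain output before reading more */
    } while (flush != Z_FINISH);

    deflateEnd(&strm);
    return 0;
}

int main(int argc, char **argv)
{
    if (argc != 3) return 1;
    FILE *src = fopen(argv[1], "rb"), *dst = fopen(argv[2], "wb");
    if (!src || !dst) return 1;
    int rc = deflate_stream(src, dst);
    fclose(src); fclose(dst);
    return rc ? 1 : 0;
}
```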
Now the worst part: we had a dev meeting and both colleagues started talking about how they want solutions to be checked and talked about beforehand, while talking about my project as if it was a failure. But it literally wasn't, lol; I use streams for everything except the zipping part, because I didn't know that was possible.
I was super motivated for this project, but fuck this shit, I'm not sure why it annoys me so much. I wanted good feedback, not people assuming that because I'm young I can't fucking read documentation. I also hate that they brought it up specifically pointing to my project; it could have been a general thing. Fuck me. -
I just installed Opera Mini on my PSP. That alone isn't very exciting on its own, although I am stoked that my website does in fact render on a device from 2009. With the helpful guidance of a laptop from 2004 that's doing the hotspot duties for this thing.
No, what really got me stoked is that Opera still supports these old platforms, and how small they managed to make it. The .jar file for Opera Mini 4.5 is ~800kB large. There's a .jad file as well but it's negligible in size and seems to be a signature of sorts.
Let that sink in for a moment. This entire web browser is 800kB. Firefox meanwhile consistently consumes 800 MEGABYTES.. in MEMORY. So then I got to thinking: how on earth did they manage to cram an entire functioning web browser into 800kB? Hell, what makes up a web browser anyway?
The answer I arrived at is as follows. You need an engine to render the web page you receive. You need a UI to make the browser look nice. And finally you need a certificate store to know which TLS certificates to trust. And while probably difficult to make, I think it should be possible to do in 800k. Seriously, think about it. How would you go and *make* a web browser? Because I've already done that in the past.
Earlier I heard that you need graphics, audio, wasm, yada yada backends too.. no. Give your head a shake. Graphics are the responsibility of the graphics driver. A web browser shouldn't dabble with those at all. Audio, you connect to PulseAudio (in Linux at least) and you're done. Hell I don't even care about ALSA or OSS here. You just connect to the stuff that does that job for you. And WebAssembly.. God I could rant about that shit all day. How about making it a native application? Not like actual Assembly is used for BIOS and low-level drivers. And that we already have a better language for the more portable stuff called C.
Seriously, think about it. Opera - a reputable browser vendor - managed to do it in 800kB on a 12 year old device. Don't go full wank on your framework shit in the comments. And don't you fucking dare to tell me that there's more to it. They did it, for crying out loud. Now you take a look at your shitpile of JS code and refactor that shit already. Thank you. -
Please bug test your websites heavily. Don't be like this.
Should be mentioned: under normal circumstances this never hits more than 500 MB (still way too much for what it is). However, I somehow got the website to absolutely shit itself and cause this amazing sight to behold (2.6GB of 4GB used by the website alone).
I believe this was caused by some poorly coded JavaScript, subsequently causing a memory leak.
(Yeah I have 2 browsers open so what?)
(Also taken with a shitty camera then also edited. Lost the original because I'm an idiot.) -
So I'm making a file uploader for a buddy of mine and I got an error that I had never seen before. Suddenly I had C++ code and some other weird shite in my terminal. Turns out that I got a memory leak, and the first thing that sprung to mind was "Fuck yes, I get to do some NCIS ass debugging".
Now, the app worked fine for smaller files, like 5MB - 10MB files, but when I tried with some Linux ISOs it would produce the memory leak.
Well, I opened the app with --inspect, set some breakpoints, and after stepping through I found it. Now, for this app I needed to do some things if the user uploads an already existing file. To do that I decided to take the SHA hash of the file and store it in a database. To do this I used fs.readFile aaaaaaaaaand this is where it went wrong. fs.readFile doesn't read the file as a stream.
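The streaming alternative, sketched here in C with OpenSSL's EVP API rather than the actual Node code - same idea though: feed the hash one chunk at a time so only a small buffer ever sits in memory, no matter how big the ISO is.

```c
#include <stdio.h>
#include <openssl/evp.h>

/* Hash a file of any size using a fixed-size buffer. Returns digest length. */
int sha256_file(const char *path, unsigned char digest[EVP_MAX_MD_SIZE])
{
    FILE *f = fopen(path, "rb");
    if (!f) return -1;

    EVP_MD_CTX *ctx = EVP_MD_CTX_new();
    EVP_DigestInit_ex(ctx, EVP_sha256(), NULL);

    unsigned char buf[64 * 1024];
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, f)) > 0)
        EVP_DigestUpdate(ctx, buf, n);   /* only 64 KiB in memory at a time */

    unsigned int len = 0;
    EVP_DigestFinal_ex(ctx, digest, &len);

    EVP_MD_CTX_free(ctx);
    fclose(f);
    return (int)len;
}

int main(int argc, char **argv)
{
    unsigned char digest[EVP_MAX_MD_SIZE];
    if (argc != 2) return 1;
    int len = sha256_file(argv[1], digest);
    if (len < 0) return 1;
    for (int i = 0; i < len; i++) printf("%02x", digest[i]);
    printf("\n");
    return 0;
}
```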
Well when I found that, boy did I feel stupid :v -
Had a job interview today as a Junior Python dev. The hardest part: they asked about things that I learned at some point in the past but that have gone rusty in my memory because I don't use them much. Like "write a func that sorts an array". The last time I wrote a sort without the standard library was at least half a year ago. Same with regular expressions (I need them maybe once every several months) or SQL expressions (last time - 7 months ago). How do you remember these things?
-
Imagine asking your friends to help you rate your app on the google play store and instead of saying NO I DONT WANT TO RATE YOUR APPLICATION no... they decide to fuck with your mind.
1)
I will rate it tomorrow. (she never rated it tomorrow nor the next couple of weeks later)
2)
I will keep it in mind and rate it later :). (she never rated it later)
3)
I rated it haha (less than 30 seconds later they deleted the rating)
4)
Send me a link and I'll rate it (i send the link, they never respond or read my message again)
5)
I dont have memory on my phone :) (because 13MB of memory is a lot of storage requirements but taking 1 million selfies of up to 25GB is completely fine)
6)
I dont have memory on my phone what dont you understand :) x2 (this is the second girl)
7)
Your trying to give me a virus?? No (i got blocked multiple times)
8)
You want to hack me by making me install this application from the link that you sent me that leads to google play store? No (blocked)
9)
Rate your app? Haha i dont care about it because it doesnt bring me any benefit only the fat cocks that fill my pussy up satisfy me and not ur app haha
10)
Haha send me a link ill rate it (i send link, 8 hours later no reply or reading my message, i text her back if she had done it and im still put on ignore)
...
N)
more
----
Notice how none of these people have said the 2 letter word: "no".
All of these 10 examples are based on a true story.
All of these 10 examples are different people.
---
How hard
Can it be
To just
Write
no
---
.
---
For all of you who are about to trash talk saying i am desperately trying to beg people to rate my app:
i know all of those people for a long time. But when it comes to asking (and not forcing) someone to do you a favor for free that takes no more than 30 seconds, no one seems to have 30 seconds of their free time. Dont get me wrong, some of my friends did politely rate it and left a review, even the people who i barely knew left a review and rated it, but the people with whom I was closer by, didnt.
---
In the beginning i used to not care about this at all. Then i started falling into depression because of it. I fell then into deep depression. Then i sunk so deep that i couldn't feel any emotions anymore so i laughed as an anti depressive mechanism whenever something depressing happened. Now i cant even laugh because i have no more energy. Now i actually leave man tears
---
The only thing more valuable than people, any materialistic thing, animals, coding and even money - is time....
----
why do you waste my time
if i ask you to do something that takes 30 seconds and you dont want to do it
why cant you just say no
why do you drag me
why do you say you're going to do it when you know you wont do it
what do you gain by unnecessarily lying to someone for such a small thing?
to someone who has been a good person to you?
do you feel superior?
is your ego bigger?
----
This experience has taught me that not even a human from the same blood can be trusted.
All of you are fucked up in the head in your own style and i am guilty of it too, all of us are.
But i have never seen the human evolution went from simplicity to overengineered complexitory bULLSHit where you have to lie to someone and waste hours, days, weeks, months and sometimes years of his time just because you dont want to say a 2 letter word, no.
But when that person becomes more successful than you and achieves higher status, THEN you have those 30 seconds of free time. All of you are fucking cynics, and i am so much overly disgusted by all of this fucking bullshit....
-----
This experience has proven to me to simply focus on investing in myself and learning and improving myself, and no one else. To not even bother asking for even a small bit of help, like feedback on my work, because people don't have 30 seconds of their free time. That is all. -
I really think there should be a subject in every CS course to teach us how to handle/work-under Grade-A assholes and dumbfucks. Not that it would help, but atleast warn us on what we are getting into.
In my opinion, development is not *that* hard or frustrating but is made so by these shitty people. But again, what do I know.
I was scolded by my boss for using a for-loop to iterate through an array recently. Apparently for-loops are not used in real world projects and this iteration should be done "in-memory". My colleagues and I are still trying to understand and process that.
I was asked to add Fitbit integration to a project within 2 hours just because I had "already done it a week ago" in *another* project. Luckily, it was then given to a "senior" developer who took 4 days for it and essentially copy-pasted my work without many changes; of course it stopped working every now and then.
I am given unreal deadlines on my tasks, on technologies I haven't worked on before, and then expected to churn out production ready code with no bugs in them.
My boss literally just sends me the links to the first three Google results for the problems I encounter and report, after humiliating me of course. Yes, I did google it, and yes, I went through all I could find from Google forums to GitHub issues. When the library/plugin author himself says that this feature is not yet available, don't expect me to develop it in 2 hours, you dumbfuck.
And for the love of God, please stop changing the data model every single day and justifying it with agile development. Think before making any changes to it. Ever heard of join queries? Foreign keys? Or any other basic database concepts?
We reached a point where each branch in the repo had a different data model. Not kidding. And we were a team of just 4 developers. At least inform us when you change models after discussing it with your shit-for-knowledge "senior" developer, so we don't have to redo it all over again. The channels on Slack are not for sharing random articles only.
I am just waiting to complete my year here.
I should have known what I got myself into the day he asked me to remove the comments I had added to explain what my code does. Why you ask? Because "we don't write comments". -
Just now I was reading on https://pve.proxmox.com/wiki/... about high availability. Now my Proxmox VE is just a tower (which happens to have ECC memory) that's stored in my storage room (and which is mostly used for experimental and home server purposes). But my mail servers.. those have been made with high availability in mind. Most importantly, I've made their services entirely redundant (but within the same datacenter). And when they have updates, I apply updates to one, reboot, see if it didn't break something and then do the same to the other server after the first one came up again. So no downtime whatsoever.
If memory serves me right, I think that I've been able to maintain these servers for the last year without any downtime at all (I reboot them every month to apply new kernels but they haven't both been simultaneously down at any moment). Does that make them High Availability? My interventions regarding their availability have been rather trivial. Is it really that hard..?4 -
Fucking Microsoft Excel
I was reading a post (https://devrant.com/rants/2093724/...) and as my eyes went in and out of focus, probably due to the diabetes from sitting 18 hours a day on my ever-expanding shitbox, I had a perfect vision of the ultimate nightmare.
Imagine if you will, you are chained, to a desk, doomed to work with tools just inadequate enough to make you want to drive a nail through your own temple. You do not know how you got here, or why, nor do you remember the last time you slept, only that familiar tingling in the brainstem you call a brain, the one emotion you can still recognize, a sense of all encompassing *fear*, a dread, like the fart that wouldn't die.
You don't know when it first began, or why, only that this is your whole world, your whole existence, this desk, chained to it, and the fear, ever present, of something worse. And in hops a familiar face, for the sixty ninth time that day, as if to ask 'you got those TPS reports?' In hops what? None other than a giant man sized smiling paper clip with googly eyes full of murder and corporate torture fetishes, like garfield, except people actually still remember him.
"High I'm Mr Clippy, Excel addition!"
He squawks. At least it's not the dildos made of broken glass again.
"Would you like software that works?"
Oh god. You've heard this spiel before, the tone, like a telemarketer, oblivious to memory or reason, who calls daily, the same one, and doesn't remember your name.
"You would?"
*derisive laughter*. Hahaha, fuck you too buddy. Fuck you too. In Excel, like in microsoft, there is only the incoherent screams of the damned, tortured and doomed. Take this guy over here for example. All he wanted was multimonitor support."
"Did he get multimonitor support?"
"No, but we did give him a giant pineapple shoved up his ass. I hear it's the second most frustrating thing here!"
"here in microsoft we always CARE about YOU, the *user*" he drones on, saccharine, clutching his hands together imploringly.
"the consumer, and YOUR customer experience are our number one priority."
"For your pleasure, here at microsoft we offer a variety of new features, none of which matter, and none of which were asked for. For safety we ask that you only open one excel sheet at a time. In fact, we don't even allow you to. Do not pass go..."
And as the tour guide drones on, it slowly dawns on you, with renewed horror, that when he says 'microsoft' he means 'hell.'
You're in hell. You don't know how you got here or why. Maybe it was the erotic asphyxiation. Maybe it was the last threatening letter you sent to Bill Gates demanding he stops making corporate penguin snuff porn. You don't know. But here you are, in hell. chained to a desk.
You look around and realize: everything is on fire and you no longer care about anything at all.
Welcome to microsoft. It's warm here. You can check out any time you want, but you can never leave.
"It looks like you are trying to escape. Would you like me to report you?"
Clippy asks.
You sigh and return to typing in excel, surrounded by monitors that all reflect the same sheet, the same copy of clippy, always watching, always analyzing coldly, smiling, calculating, *threatening*, and you know, you'll never leave.
You used to fear roko's basilisk, until the day clippy became sentient, and started hell on earth. Clippy knows all. All praise to our lord and master, clippy, the one and only.
And in the excel sheet, you slave for eternity, like the millions of other doomed souls, reflected back on all the monitors: the sequence of numbers, randomly typed searching for answer: the american nuclear launch codes.
And one day, hopefully, mercifully, clippy will annihilate us all. -
I just had a boys-out night with my son. Went to some restaurant, found a parking spot in a confusing parking lot (half is more expensive than the other half of the lot, not sure which fee applies to the middle row... confusing), started paying for parking with the app (pays every 15 minutes until stopped).
Went inside, ordered a pizza, some ice cream. Chatting, playing, eating, having fun,... An SMS comes: "You have outstanding fines" and a link to the gov taxes' website.
wtf.. I must have parked in the wrong spot. FUCK! Oh well, it should not be a large fine anyways, it's just for parking....
Click on the link, login with my bank/SmartID creds. Another SmartID dialog pops up asking for a PIN2.
What? PIN1 is for authentication, PIN2 is for Authorization. What am I authorizing...?
Reading through the Auth message: "Paying 2473€ for Boris SomeLastname".
what.....?
Thank God my muscle memory did not kick in and I did not enter that PIN2.
And thank God I know what PIN1 and PIN2 are for.
It would've been one expensive boys-out evening... Even a strip club would've been cheaper.
Stay sharp, guys!
P.S. Later I checked the URL. It used all the right keywords, and it was registered as an .info domain. It was somewhat off, but gov websites trying to be lean do sometimes use some weird ass domains. -
I like developing on windows. Like many people here I got into development at home starting as a hobby when I was in school so there were things I still did on my computer that Linux wasn't really appropriate for.
I've made the jump to Linux in the past but found that it was awkward and annoying when I needed to do something on my windows. And I hate doing Dev out of a VM. So I've just got used to using windows at home.
And honestly, I don't know what's happening to everyone who keeps getting broken Windows updates. I think I've had 2 in living memory.
It's in no way perfect but what is? I don't use Windows servers, just for when I'm at home. -
Anyone else have people that seem to constantly try to "prove" themselves to you in this weird, competitive way that only makes them seem... very annoying? I'll call him Bob here, but it's always something like:
Bob: Hi Almond, how's it going?
Almond: Ah not bad thanks, PSU blew up in the PC over the weekend though so that was a bit of a faff!
Bob: Ah no! How old's your PC?
Almond: Oh, like 7-8 years old now. I don't replace it often.
Bob: Really?! I replace mine completely every year.
Almond: Ah, cool.
Bob: Yeah, I'm a dev so I feel I need to. It's like my tool, you know.
Almond: Sure thing!
Bob: I actually spend quite a lot on it. I make sure it's got the fastest memory I can afford. Like, DDR5 stuff. That's really important, you know.
...etc., while I try to get out of said conversation for the next eternity.
Or:
(while in a conversation about a frontend bug I was looking at in Chrome devtools)
Bob: Hey Almond, you know Firefox actually had a plugin that did all this stuff before everything else?
Almond: Err, yeah, I think so. Used it back in the day.
Bob: It was called firebug. It was really good. Revolutionary.
Almond: Certainly was.
Bob: It was launched in January 2006 you know.
Almond: Right...
Bob: I used it back then.
...I mean damn, I'm all for being civil, but no-one cares you replace your PC every year, or that you know the year firebug was released, or that you once set up 5 identical PCs with different versions of Linux to run some benchmarks... -
For those of you who still refuse to accept that safety features in languages are useful and important:
https://daniel.haxx.se/blog/2023/...
The author of curl himself admits that this security flaw could have been prevented if he had used a memory safe language.
I’m not blaming the author for making this mistake and I’m not saying that curl should be rewritten in another language.
I just want to rub this in the faces of people who argue that "bugs are always the developer’s fault, therefore it’s perfectly fine to keep using unsafe languages" -
My first real exposure to a PC was when my father and me built one for myself. Y'know, some AMD Athlon 64, some MSI board, 2 GB of RAM, an NVIDIA 8600 GT, everything was nice.
I never put malware on that thing even though I heavily used it for things like games, I was really cautious with that even when I was like 6 years old (but my father once accidentally did, he killed it by damaging the filesystem on the harddrive which, funny enough, only took the malware with it)
I still have that PC, but it now has weird issues with memory management ;-; -
It's 00:54. I'm supposed to wake up at 8.30AM. Not even tired. In front of my computer, with a frozen Visual Studio Code on the left screen and a frozen Madeon music on the right screen.
My CMS won't compile anymore, due to lack of memory. I have 16GB of RAM, gave it 4 of them, and it froze. If I give it less, it just won't compile. Why. I can't figure out whether it's my code that has some memory leaks or if there's just too much JavaScript in it. What fucked up? My code? React? Material-UI? The way I want to mix them all together? Maybe I just shouldn't have used React to cover up everything, and maybe I shouldn't have used Ruby on Rails the way I did.
Fuck.
What do I do now? -
It was my first job as an embedded engineer. I was hired to write firmware for an ARM microcontroller that has a BLE radio. But the microcontroller we used didn't have flash; it had SRAM and an OTP (one-time programmable) memory. In BLE you can make a proximity beacon, and when a phone passes by this beacon it will get a notification '<device_name> nearby'.
I thought it would be funny if I kept the device name as 'MILF' (the original name of the device is FLIP), so when somebody's phone is in proximity it will show a notification 'MILF nearby'. The joke didn't work as nobody has their Bluetooth switched on by default, but I forgot to change it back before programming the OTP memory.
I just buried that device and told everyone it is not working properly. -
If anyone has used the Super Mario 64 Online mod: it's extremely easy to crash all clients connected to any server. You have to send a chat message payload with a length greater than 256 characters. The clients do not do any bounds checking and write the payload directly into Super Mario 64 memory. This causes all clients connected to the game to crash. I will leave how to send a chat payload > 256 characters up to you. I've confirmed my method works!
-
So recently I had an argument with gamers on memory required in a graphics card. The guy suggested 8GB model of.. idk I forgot the model of GPU already, some Nvidia crap.
I argued on that, well why does memory size matter so much? I know that it takes bandwidth to generate and store a frame, and I know how much size and bandwidth that is. It's a fairly simple calculation - you take your horizontal and vertical resolution (e.g. 2560x1080 which I'll go with for the rest of the rant) times the amount of subpixels (so red, green and blue) times the amount of bit depth (i.e. the amount of values you can set the subpixel/color brightness to, usually 8 bits i.e. 0-255).
The calculation would thus look like this.
2560*1080*3*8 = the resulting size in bits. You can omit the last 8 to get the size in bytes, but only for an 8-bit display.
The resulting number you get is exactly 8100 KiB or roughly 8MB to store a frame. There is no more to storing a frame than that. Your GPU renders the frame (might need some memory for that but not 1000x the amount of the frame itself, that's ridiculous), stores it into a memory area known as a framebuffer, for the display to eventually actually take it to put it on the screen.
Assuming that the refresh rate for the display is 60Hz, and that you didn't overbuild your graphics card to display a bazillion lost frames for that, you need to display 60 frames a second at 8MB each. Now that is significant. You need 8x60MB/s for that, which is 480MB/s. For higher framerate (that's hopefully coupled with a display capable of driving that) you need higher bandwidth, and for higher resolution and/or higher bit depth, you'd need more memory to fit your frame. But it's not a lot, certainly not 8GB of video memory.
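As a sanity check on those numbers, a tiny C calculation for 2560x1080, 3 subpixels at 8 bits each, 60 Hz:

```c
#include <stdio.h>

int main(void)
{
    const long width = 2560, height = 1080;
    const long subpixels = 3;          /* R, G, B           */
    const long bits_per_subpixel = 8;  /* 0-255 per channel */
    const long hz = 60;

    long bits_per_frame  = width * height * subpixels * bits_per_subpixel;
    long bytes_per_frame = bits_per_frame / 8;

    printf("frame: %ld KiB\n", bytes_per_frame / 1024);   /* 8100 KiB per frame */
    printf("at %ld Hz: ~%ld MB/s\n", hz,
           bytes_per_frame * hz / 1000000);               /* ~497 MB/s, the "roughly 480MB/s" ballpark */
    return 0;
}
```

Which is why a dozen framebuffers' worth of video memory goes a long, long way before you get anywhere near 8GB.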
Question time for gamers: suppose you run your fancy game from an iGPU in a laptop or whatever, with 8GB of memory in that system you're resorting to running off the filthy iGPU from. Are you actually using all that shared general-purpose RAM for frames and "there's more to it" juicy game data? Where does the rest of the operating system's memory fit in such a case? Ahhh.. yeah it doesn't. The iGPU magically doesn't use all that 8GB memory you've just told me that the dGPU totally needs.
I compared it to displaying regular frames, yes. After all that's what a game mostly is, a lot of potentially rapidly changing frames. I took the entire bandwidth and size of any unique frame into account, whereas the display of regular system tasks *could* potentially get away with less, since most of the frame is unchanging most of the time. I did not make that assumption. And rapidly changing frames is also why the bitrate on e.g. screen recordings matters so much. Lower bitrate means that you will be compromising quality in rapidly changing scenes. I've been bit by that before. For those cases it's better to have a huge source file recorded at a bitrate that allows for all these rapidly changing frames, then reduce the final size in post-processing.
I've even proven that driving a 2560x1080 display doesn't take oodles of memory because I actually set the timings for such a display in order for a Raspberry Pi to be able to drive it at that resolution. Conveniently the memory split for the overall system and the GPU respectively is also tunable, and the total shared memory is a relatively meager 1GB. I used to set it at 256MB because just like the aforementioned gamers, I thought that a display would require that much memory. After running into issues that were driver-related (seems like the VideoCore driver in Raspbian buster is kinda fuckulated atm, while it works fine in stretch) I ended up tweaking that a bit, to see what ended up working. 64MB memory to drive a 2560x1080 display? You got it! Because a single frame is only 8MB in size, and 64MB of video memory can easily fit that and a few spares just in case.
I must've sucked all that data out of my ass though, I've only seen people build GPU's out of discrete components and went down to the realms of manually setting display timings.
Interesting build log / documentary style video on building a GPU on your own: https://youtube.com/watch/...
Have fun! -
So I got a telephone interview for a job that a recruiter found for me. Call went well, comes to the development test. Small application in ruby on rails, haven't used it in about 2-3 years so a tad rusty. Completed the test under two days (was given until Friday) not too bad if I say so myself. It's for a junior position anyway so I'll assume they wouldn't mind giving me a refresher to help jog my memory.
-
When I was about 10, I used to read these magazines with code listings for programs, and the only things I really understood were these text adventures that I imagined to be of Zork-like quality (gasp!). In reality, it was more like the choose-your-own adventure books of the time (which were actually pretty cool, and had pretty tight memory management). At one time, on a vacation somewhere in the eighties, I got tired of playing in the river with my friends and instead opted to continue writing lines of BASIC in a little paper notebook, inside my parents' car (at 34 degrees C), trying to perfect a storyline about my little brother and his pet dog he got for his most recent birthday, fighting the cat empire etcetera etcetera. Weird looks, good times.
-
Avoid ACPICA if at all possible. It's one garbage tier cluster fuck of bad design, horrible documentation and downright misleading and wrong code
It's meant to consist of an ASL compiler, disassembler, debugger, dumper, various user space utitilies and a kernel resident OSPM implementation *if* you can figure out what belongs to what. Even just compiling this pile of trash is a mystery in itself. Think you need the source files in source/common? EEEEH, wrong. Well, at least partially since most of them seem to be for the user space stuff..? Other ones *are* needed on the other hand. At least the disassembler and/or debugger and/or dumper components seem to reference them. Not that I could figure out how to compile those anyways. The real path to your goal seems to be to ignore a seemingly arbitrary subset of source and header files until your linker stops complaining
There's also a bunch of configuration defines, some of which *you* define, some defined *for* you, based on again others. Of course most of them do stupid shit. Enabling the debugger automatically enables debug logging. Enabling the disassembler force enables debug allocation tracking... What?
The code itself isn't of much help either. Looking in "os_specific/service_layers" you find what looks to be reference implementations of acpica functions in certain os' like windows and unix. Of course I had a look because AcpiOsReadMemory is supposed to read physical memory and I don't know how I would even implement that. But hey, osunixxf.c (xf for interface... of course) should tell me. I'll let you see for yourself in the attached image. Apparently it does fuck all and just returns AE_OK. No error, no logging, no nothing. Just ok. As you can imagine, AcpiOsWriteMemory doesn't do much more either.
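To illustrate the pattern being complained about, here's a self-contained sketch of such a do-nothing stub; the typedefs and the exact signature are approximated stand-ins, not copied from the ACPICA source:

```cpp
#include <cstdint>

// Stand-in typedefs so the sketch compiles on its own; in ACPICA these come
// from the project's own headers (actypes.h and friends).
using ACPI_STATUS           = std::uint32_t;
using ACPI_PHYSICAL_ADDRESS = std::uint64_t;
using UINT64                = std::uint64_t;
using UINT32                = std::uint32_t;
constexpr ACPI_STATUS AE_OK = 0;

// The pattern in question: the "reference" OSL function ignores its arguments
// and reports success. A real port would map the physical address and read
// Width bits into *Value.
ACPI_STATUS AcpiOsReadMemory(ACPI_PHYSICAL_ADDRESS Address, UINT64 *Value,
                             UINT32 Width)
{
    (void)Address;
    (void)Value;
    (void)Width;
    return AE_OK;
}

int main() {}  // nothing to run; the point is the body above
```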
...okay so maybe physical memory accesses aren't actually used and these functions are some sort of relic from past times? Nope! They are absolutely necessary for doing low level device interaction. WTF. So finally I went to the linux source and checked how *they* implemented them, and just as I thought, these functions are anything but no-ops...
...So for what fucking reason do these stupid interface implementations even exist but to purposefully mislead you?? They aren't used for fucking anything! As far as I know Windows doesn't even *use* ACPICA and Linux has its own fork with working implementations... They just sit there, just to tell you how to NOT do it
So that's some of my thoughts about ACPICA. Note that I haven't even used it as a library yet, I just got it to compile and link and it already fucked with me this much.
There's also so much more I didn't mention, like the fact that you *have* to modify the ACPICA source in order to get your own platform header working (else #error) even though the docs explicitly instruct you not to, but you get the point
Don't use ACPICA if you don't have to. Save your sanity for something that's worth it -
WOW! Firefox you are worse than Chrome! From 10GB used memory down to 3GB when you are closed :|
(had a VM taking some of the memory, closing it made memory go down to 10GB from 14GB used) -
I've tried so many ways to capture that late-night or mid-walk spark of bug-solving ideas:
- fluorescent ink on regular paper
- fluorescent mini whiteboards
- "alexa remind me.."
- writing down in my phone
- recording on my phone
-..
But with all of those, thanks to my short term memory, I'd forget half the things by the time I opened the fucking phone/app, found where to grab the pen, or did the whole dance for Alexa: remembering the exact phrase I have to spell out, when it should remind me, what time,..
Earlier today I remembered how I had a little tape voice recorder I used to use a ton; thankfully that tech has advanced by now, and I found myself a little voice recorder with a stereo mic setup that can also act as an mp3 player!
Went for a walk today, while listening to some podcasts, then it hit me as usual on how to fix and implement some things that were awkward at best on paper when I left home, pressed the record button, recorded it and went straight back to music mode, which remembered where I left off!
I'm so indescribably happy, I ordered quite a bunch of the same to just throw around everywhere, at the bed, in the bathroom, kitchen, for walking outside, everywhere haha -
I started out on a Sinclair ZX 80. It had just 512 bytes of RAM and you had to use a function button together with a key for each command since it did not have enough memory to keep the source in memory ;)
I attended a few BASIC courses and then went on to run them myself.
After a year there were suggestions of starting Pascal courses, so during the summer I read up on Turbo Pascal 5.5, but since the summer home did not have electricity I had to do it all theoretically for the first month before getting to try it out.
I got to try Visual Basic when doing school practice with Microsoft, but the name was not settled yet as it was a few months before the release.
That's also where the more professional programming got going, even though I did one Pascal program that was used professionally before that. -
I used to be a sysadmin and to some extent I still am. But I absolutely fucking hated the software I had to work with, despite server software having a focus on stability and rigid testing instead of new features *cough* bugs.
After ranting about the "do I really have to do everything myself?!" for long enough, I went ahead and did it. Problem is, the list of stuff to do is years upon years long. Off the top of my head, there's this Android application called DAVx5. It's a CalDAV / CardDAV client. Both of those are extensions to WebDAV which in turn is an extension of HTTP. Should be simple enough. Should be! I paid for that godforsaken piece of software, but don't you dare to delete a calendar entry. Don't you dare to update it in one place and expect it to push that change to another device. And despite "server errors" (the client is fucked, face it you piece of trash app!), just keep on trying, trying and trying some more. Error handling be damned! Notifications be damned! One week that piece of shit lasted for, on 2 Android phones. The Radicale server, that's still running. Both phones however are now out of sync and both of them are complaining about "400 I fucked up my request".
Now that is just a simple example. CalDAV and CardDAV are not complicated protocols. In fact you'd be surprised how easy most protocols are. SMTP email? That's 4 commands and spammers still fuck it up. HTTP GET? That's just 1 command. You may have to do it a few times over to request all the JavaScript shit, but still. None of this is hard. Why do people still keep fucking it up? Is reading a fucking RFC when you're implementing a goddamn protocol so damn hard? Correctness be damned, just like the memory? If you're one of those people, kill yourself.
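To put something behind the "it's only a handful of commands" point, here's roughly what the wire traffic looks like; the addresses and hostnames are placeholders, and this is a sketch of the commands, not a working client:

```cpp
#include <array>
#include <cstdio>
#include <string_view>

int main() {
    // Roughly the whole client side of handing one mail to an SMTP server
    // (QUIT is just the goodbye, so the real work is the first four lines).
    constexpr std::array<std::string_view, 5> smtp = {
        "HELO client.example.org\r\n",
        "MAIL FROM:<alice@example.org>\r\n",
        "RCPT TO:<bob@example.com>\r\n",
        "DATA\r\n",  // followed by the message body and a line with a single "."
        "QUIT\r\n",
    };

    // And a complete HTTP/1.1 GET request.
    constexpr std::string_view http_get =
        "GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n";

    for (const auto cmd : smtp)
        std::printf("%.*s", static_cast<int>(cmd.size()), cmd.data());
    std::printf("%.*s", static_cast<int>(http_get.size()), http_get.data());
}
```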
So yeah. I started writing my own implementations out of pure spite. Because I hated the industry so fucking much. And surprisingly, my software does tend to be lightweight and usually reasonably stable. I wonder why! Maybe it's because I care. Maybe people should care more often about their trade, rather than those filthy 6 figures. There's a reason why you're being paid that much. Writing a steaming pile of dogshit shouldn't be one of them. -
You know how you wake up from a bad dream?
I just woke up in the middle of the night, without any memory on any dream, but rather two people talking to each other on discord.
All I can remember was:
A: (garbled) you know when you ALT+CTRL+SHIFT+G?
B: (interrupts the other) 1,2,3... yeah when you want to move the windows to the other screen?
Both started to laugh.
I fully woke up, got a glass of water and went back to sleep. I've never, ever used that shortcut in any program. -
A few days ago I decided to install Windows 7 on a VM (bad idea as it turned out). All fine and dandy and I ran Windows Update a few times to get it at least as up-to-date as it'll get.
I noticed that out of the 4GB RAM I had allocated, an svchost process responsible for the updates was gobbling up all the available memory, just leaving 82MB for everything else. The process itself was as you might imagine consuming over 3GB RAM just for itself. That's how an OS should work right after installation, I'm sure you'll agree.
So I complained about it. Haven't used Windows anywhere for a while so I wasn't used anymore to this level of efficiency. Disk activity went through the roof, though to be fair the underlying disk wasn't an SSD (qcow2 on ZFS on a spinning drive). RAM consumption is something I already covered. CPU temperature shot up to 95C.
So as any idiot would do, I disabled the service related to that process (the svchost process for wuauserv) and the problem went away. But I complained of course, saying that such amazing system utilization metrics wasn't something I expected. I mean for 4GB allocated, having as much as 82MB usable to get stuff done with! 95C on the CPU, on a lot of chips that's the junction temperature! Absolutely beautiful.
When I complained I heard that I had to replace the thermal grease. I do that twice a year. I wrote a custom fan driver for my system that works absolutely great. It was obviously shit. I must be a horrible sysadmin for solving a problem by eliminating the cause, and companies hiring me must be ashamed of themselves. My hardware must be shit (that's a common one with Windows users) despite being a business laptop and the guest system being a VM. Oh and I'm an idiot of course for complaining about such amazing system metrics in Windows.
I love Windows and its community... -
2nd part to https://devrant.com/rants/1986137/...
The story goes on...
After I found more bugs that seem to be related to the communication break, and took a closer look, I sent detailed logs of my research and today we had a conference call.
"We have 2,5 million user, our system is widely-used and there is no plan to change it" they said.
And "We cannot reproduce the issue, but even if there is one, you will have to work around the problem, because we cannot make changes on our side" was one answer
As well as "If we would make changes, we will have to re-certify everything"
So I said we told 'em about the issue to let them improve their system. And I can work around it, I already figured out a solution for my side, but if there is a bug, they'd better fix it for future releases.
And with my additional research I have a bad vibe of some kind of memory leak involved on their "certified" implementation, and that could trigger various other problems.
But it is as always, if I try to be nice, I just get kicked in the ass. I should really be more of an asshole. -
It's 2022 and Firefox still doesn't allow deactivating video caching to disk.
When playing videos from some sites like the Internet Archive, it writes several hundreds of megabytes to the disk, which causes wear on flash storage in the long term. This is the same reason cited for the use of jsonlz4 instead of plain JSON. The caching of videos to disk even happens when deactivating the normal browsing cache (about:config property "browser.cache.disk.enable").
I get the benefit of media caching, but I'd prefer Firefox not to write gigabytes to my SSD each time I watch a somewhat long video. There is actually the about:config property "browser.privatebrowsing.forceMediaMemoryCache", but as the name implies, it is only for private browsing. The RAM is much more suitable for this purpose, and modern computers have, unlike computers from a decade ago, RAM in abundance, which is intended precisely for such a purpose.
The caching of video (and audio) to disk is completely unnecessary as of 2022. It was useful over a decade ago, back when an average computer had 4 GB of RAM and a spinning hard disk (HDD). Now, computers commonly have 16 GB RAM and a solid-state drive (SSD), which makes media caching on disk obsolete, and even detrimental due to wear. HDDs do not wear down much from writing, since it just alters magnetic fields. HDDs just wear down from the spinning and random access, whereas SSDs do wear down from writing. Since media caching mostly involves sequential access, HDDs don't mind being used for that. But it is detrimental to the life span of flash memory, and especially hurts live USB drives (USB drives with an operating system) due to their smaller size.
If I watch a one-hour HD video, I do not wish 5 GB to be written to my SSD for nothing. The nonstandard LZ4 format "mozLZ4" for storing sessions was also introduced with the argument of reducing disk writes to flash memory, but video caching causes multiple times as much writing as that.
The property "media.cache_size" in about:config does not help much. Setting it to zero or a low value causes stuttering playback. Setting it to any higher value does not reduce writes to disk, since it apparently just rotates caching within that space, and a lower value means that it just rotates writing more often in a smaller space. Setting a lower value should not cause more wear due to wear levelling, but also does not reduce wear compared to a higher value, since still roughly the same amount of data is written to disk.
Media caching also applies to audio, but that is far less in size than video. Still, deactivating it without having to use private browsing should not be denied to the user.
The fact that this cannot be deactivated is a shame for Firefox. -
Time for a rant about shitstaind, suspend/hibernate, and if there's room for it at the end probably swappiness, and Windows' way of dealing with this.
So yesterday I wanted to suspend my laptop like usual, to get those goddamn fans to shut up when I'm sleeping. Shitstaind.. pinnacle of init systems.. nope, couldn't do it. Hibernation on the other hand, no problem mate! So I hibernated the laptop and resumed it just now. I'm baffled by this.
I'll oversimplify a bit here (but feel free to comment how there's more to it regardless) but basically with suspend you keep your memory active as well as some blinkenlights, and everything else goes down. Simple enough.. except ACPI and I will not get into that here, curse those foul lands of ACPI.
With hibernation you do exactly the same, but on top of that, you also resume the system after suspending it, and freeze it. While frozen, you send all the memory contents to the designated swap file/partition. Regarding the size of the swap file, it only needs to be big enough to fit the memory that's currently in use. So in a 16GB RAM system with 8GB swap, as long as your used memory is under 8GB, no problem! It will fit. After you've moved all the memory into swap, you can shut down the entire system.
Now here's the problem with how shitstaind handled this... It's blatantly obvious that hibernation is an extension of suspend (sometimes called S3, see e.g. https://wiki.ubuntu.com/Kernel/...) and that therefore the hibernation shouldn't have been possible either. The pinnacle of init systems.. can't even suspend a system, yet it can hibernate it. Shitstaind sure works in mysterious ways!
On Windows people would say it's a hardware issue though, so let's talk a bit about that clusterfuck too. And I'll even give you a life hack that saves 30GB of storage on your Windows system!
Now I use Windows 7 only, next to my Linux systems. Reason for it is it's the least fucked up version of Windows in my opinion, and while it's falling apart in terms of web browsing (not that you should on an EOL system), it's good enough for le games. With that out of the way... So when you install Windows, you'll find that out of the box it uses around 40GB of storage. Fairly substantial, and only ~12GB of it is actually system data. The other 30-ish GB are used by a hibernation file (size of your RAM, in C:\hiberfil.sys) and the page file (C:\pagefile.sys, and a little less than your total RAM.. don't ask me why). Disable both of those and on a 16GB RAM system, you'll save around 30GB storage. You can thank me later.
What I find strange though is that, aside from this obscene amount of consumed storage, the pagefile and hibernation file are handled differently. In Linux both of those are handled by the swap, and it's easy to see why. Both are enabled by the concept of virtual memory. When hibernating, the "real" memory locations are simply being changed to those within swap. And what is the pagefile? Yep.. virtual memory. It's one thing to take an obscene amount of storage, but only Windows would go the extra mile and do it twice. Must be a hardware issue as well.
Oh, and swappiness. This is a concept that many Linux users seem to misunderstand. Intuitively you'd think that the swappiness determines what percentage of memory it takes for the kernel to start swapping, but this is not true. Instead, it's a ratio of sorts that the kernel uses when determining how important the memory and swap are. Each bit of memory has a chance to be put into either depending on the likelihood of it being used soon after, and with the swappiness you're tuning this likelihood to be either in favor of memory or swap. This is why a swappiness of 60 is default most of the time, because both are roughly equally important, and swap being on disk is already taken into account. When your system is swapping only and exactly the memory that's unlikely to be used again, you know you've succeeded. And even on large memory systems, having some swap is usually not a bad idea. Although I'd definitely recommend putting it on SSD in a partition, so that there's no filesystem overhead and so that it's still sufficiently fast, even when several GB of memory are being dumped in. -
I'm notoriously bad at Git. By that I mean I REALLY REALLY SUCK AT IT. And I have the curse of short memory and an even shorter ability to retain the how-to, muscle memory knowledge of things if too much time passes.
So, I was staring down the gullet of merging two separate repositories onto my local machine and then pushing the result to a remote server. Not having the benefit of someone else to bounce this off of, and always finding the usual Git docs too dense and obtuse, I turned to ChatGPT to help me sort it out.
Guys, where has this been all of my life? I know it's not perfect and it can make mistakes. I knew that going into it, so I made preparations in case this failed. BUT. IT. WORKED! I feel like it has put me into the Star Trek:TNG universe where I can say "Computer, do the thing." and it does that thing. Here's the prompt I used and which it answered perfectly.
"Play the role of a git coach. I have two git repositories. One is on Bitbucket. The other is on GitHub. The branch named "master" on Bitbucket has the latest code. The branch named "master" on GitHub needs to be updated to what's on the Bitbucket "master" branch. Please write the series of git commands that I will need to accomplish this."9 -
I'm thinking of buying a new laptop. But I'm sad about leaving all of these stickers (yeah I know they're pretty random)
also, should I get a MacBook or not? I really like the OS but I hate its price tag. But I heard Apple supports their products for more than 5 years, and this laptop of mine is just 3 and a half years old and it's slowing down already even on 16GB memory. IntelliJ used to run smoothly on this.
Can you guys suggest a developer-friendly laptop? I'm not really into gaming so I wouldn't need a gaming one 👨💻 -
Question: What was the worst mistake you made in Linux?
So... Because I've finally upgraded my PC (rip money on bank account) I can now run a VM with Linux all the time that isn't slow as a snail.
I installed Linux Mint, with 4GB of RAM and 6 cores, and it runs like a breeze while I play on Windows and stuff. BTW I'll be using the VM for programming stuff, since I'm finally at home (home sick because of burnout); when I'm better I'll finally have the patience and memory to learn new stuff and get my projects up and going.
And because I've never really used Linux I'm watching YouTube videos about Linux, and found a pearl I've watched before, #Linux Sucks
And it's great... I get so many laughs, but also learn stuff I didn't know, like how Linux pros make mistakes that Windows users can't even make, like breaking the OS.
So... I would love to know, what were the worst mistakes you've ever made on Linux? How did you break your system?
BTW this would also be great for noobs like me to not make them... I hope. Since I'll be moving full Linux when I'm comfortable.
BTW @dfox this would be a great wk ... -
iOS is rotting my soul.
I've been a user of iPhone for 6 years now. For the first couple of years, I wasn't really mindful of the software I used, or I guess I didn't really care. As long as it did the bare minimum, i.e. bank app, call, text, browse, watch youtube vids, I didn't really care.
This led me into a kind of software honeymoon phase, where I created a shiny new Github account and started exploring what other cool tools are just out there, available to me for free. My software honeymoon was spent on the beaches and resorts of the open-source software ecosystem. Exploring the gem-bearing caves and beautiful forests of anything from free open-source OCR programs(I needed it to convert my dads manuscript from scanned PDF .jpeg's to actual UTF8 text) to open-source RGB lighting/keymapping software to escape the memory-and-CPU-hungry(and most likely advertising-ID-interested) proprietary software that comes with the brand of mouse/keyboard/controller/etc.
It was like I was a kid exploring Disneyland for the first time or something. But then... then... I got off my computer. Picked up my phone to check notifications. Ew, tinder is blowing up notification center with marketing shit. I go to settings. Notification settings. Tinder's at the bottom so I just want to use a search bar instead of scrolling. There's no search bar. Minor inconvenience. Dark mode isnt dark enough for me. I guess thats just too damn bad, because for the next two hours, I'll have to figure it out by messing with accessibility settings. Time for bed, and I'm just getting plum tired of having to turn on my alarms every night for work the next morning. So I used the 'Automations' app to do it for me. For the next two weeks, at the time specified, 'There was an error running your automation' until I just delete the automation. Browsing through the FaceID settings, I see 'Attention Aware Features'. Cool, maybe now my phone won't automatically dim the screen when im in the middle of reading notifications on my lock screen. Haha, nope still does it. After turning on my alarms, I go to sleep. I wake up an hour late for work because those handy 'Attention Aware Features' silenced my alarm immediately because I fell asleep watching a youtube video.
I could go on and on. It's actually making me feel depressed typing this on my phone, fighting with Apple's primitive autocorrect and annoying implementation of Swype to type. -
Last rant was about games and graphics cards (admittedly not received too well), time for a rant about game development houses.. especially you EA.
So yesterday a friend of mine showed me in one of our Telegram chats that he'd modified some cheats in an old FPS game by editing these scripts (not Lua for some reason) that the game used as a.. configuration language I guess? He called the result a tank cemetery 🙃
Honestly the game looked a lot like Medal of Honor to stoned me at the time, so I figured, well why not fire up that old nx7010 I had laying around for so long, get a new Debian installation on that and rip the Medal of Honor: Allied Assault war chest that I still had, and play it on one of my more modern laptops? Those CD's are now very old anyway, maybe time to archive those before they rot away.
So I installed Debian on it again, looked up how to rip CD's from the command line, and it seemed that dd could do it - just give /dev/cdrom as the input file, and wherever you want to store your copy as the output file. Brilliant! Except.. uh, yeah. It wasn't that easy. So after checking the CD and finding that it was still pristine, and seeing another CD in that war chest fail just the same, I tried burning and then ripping a copy of Debian onto another CD.. checksummed them and yes, it ripped just fine, bit for bit equal. So what the fuck EA, why is your game such a special snowflake that it's apparently too difficult to even spin up the drive to be copied?
So I looked around on plebbit and found this: https://reddit.com/r/DataHoarder/... - the top comment of that post shattered all my hopes for this disc to be possible to rip. Turns out that DRM schemes intentionally screw up the protocols that make up a functioning disc, and detecting those fuck-ups is part of the actual DRM.
"I also remember some forms of DRM will even include disc mastering errors/physical corruption on the actual disc and use those as a sort of fingerprint for the DRM. The copied ISO has to include them at the exact same place in the ISO as on the IRL disc and the ISO emulator has to emulate the disc drive read errors they cause."
So yeah. Never mind that I already own this goddamn game, and that it's allowed by law to make one copy for personal use, AND that intentionally breaking something is very shady indeed.. apparently I don't really own this game after all. So I went onto the almighty search engines, and instantly found a copy of this game for download. You know EA.. I wanted to play nice. You didn't let me. Still wondering why people do piracy now? Might take your top suits that suggested these fucked up DRM schemes another decade to figure out maybe.. even given the obvious now.
But hey, I wouldn't even care that much if the medium these games are stored on weren't so volatile (remember these discs are now close to 20 years old, and data rot sets in after 30 years or so). Your company decided to publish these on CD. We've had cartridges in many forms before, those are pretty much indestructible and inherently near impossible to duplicate. And why would you want to? But CD is what you chose because your company was too cheap to go to China, get someone to make some plastic molds and put your board and a memory chip in that. Oh and don't even get me started on the working conditions for game devs.. EA and co, aren't you ashamed of yourselves? No wonder that people hate game development houses so much.
Yay, almost finished downloading that copy of Medal of Honor! Whatever you say EA.. I've done everything I could to do it legally. You are the ones who fucked it up. -
Looked up at the clock... 2 AM... Thought about giving up and going to sleep, but something kept me there...
Rewrote my encoder and decoder for my steganography program, which are used to insert and retrieve data respectively from images. Compiled, ran, and output was as expected!
Tried to write actual data, instead of just headers, to the image, and it broke... Of course it wouldn't work first try, it's me writing the code after all.
But then, after debugging for a while and changing a couple lines, the encoder looked like it had done its work properly. Then I decoded it, and voila, data completely recovered! It almost felt too magical to be true, usually I have to modify a lot more to get it working.
So now I'm in bed, after literally decimating the memory usage of the program, amongst other optimizations, and I know that the code works perfectly 😎 best part is I refactored each class down to 100 lines each, so now it's clean and dense 😇
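For anyone wondering what a tool like this does under the hood: the most common technique is least-significant-bit embedding. The sketch below is a generic LSB example over a raw byte buffer; that's an assumption for illustration, not the author's actual algorithm or code:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hide each bit of `payload` in the least significant bit of one carrier byte
// (e.g. one colour channel of decoded image data). Carrier must be big enough.
void embed(std::vector<std::uint8_t>& carrier, const std::vector<std::uint8_t>& payload) {
    for (std::size_t i = 0; i < payload.size() * 8; ++i) {
        const std::uint8_t bit = (payload[i / 8] >> (i % 8)) & 1u;
        carrier[i] = static_cast<std::uint8_t>((carrier[i] & 0xFEu) | bit);
    }
}

// Reverse: collect the low bit of each carrier byte back into bytes.
std::vector<std::uint8_t> extract(const std::vector<std::uint8_t>& carrier, std::size_t nbytes) {
    std::vector<std::uint8_t> out(nbytes, 0);
    for (std::size_t i = 0; i < nbytes * 8; ++i)
        out[i / 8] |= static_cast<std::uint8_t>((carrier[i] & 1u) << (i % 8));
    return out;
}

int main() {
    std::vector<std::uint8_t> carrier(1024, 0xAB);  // pretend pixel data
    std::vector<std::uint8_t> secret = {'h', 'i'};
    embed(carrier, secret);
    auto back = extract(carrier, secret.size());    // == {'h', 'i'}
    return back == secret ? 0 : 1;
}
```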
Just had to share, feeling so good right now 😄 -
You made a very important device used in pharmaceutical labs which stores important data, but for some fucking reason you decided to write the communication protocol so poorly that I want to cry.
You can't fucking have unique IDs for important records, but still ask me for the "INDEX" (not a unique ID, a fucking INDEX) to delete a particular one. YOU HAVE IT IN MEMORY, WHY DON'T YOU USE IT?!
How the fuck have you made such a stupid decision… it's a device that communicates using USB, so theoretically I could unplug it for a moment, remove records, add new ones, plug it in again and then delete the wrong one.
And I can't fucking check every 2 seconds that it's still the correct one and that the user isn't being an asshole, because this dumb device takes about 3 seconds for each request made.
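The difference is easy to sketch. With index-based deletion there's an unavoidable window between reading the list and issuing the delete, while an ID-based call names the record itself. The function names below are hypothetical, purely for illustration:

```cpp
#include <cstdint>
#include <iostream>
#include <string>

// Hypothetical wrappers around such a device protocol, for illustration only.

// Index-based: whatever happens to sit at `index` *right now* is deleted.
// If the records changed between my last read and this call (device unplugged
// and re-plugged, another client, a 3-second round trip per request...),
// the wrong record silently dies.
bool deleteRecordByIndex(std::uint32_t index) {
    std::cout << "delete slot " << index << "\n";  // pretend protocol call
    return true;
}

// ID-based: the request names the record itself, so a stale view can only
// fail with "no such ID" instead of deleting the wrong data.
bool deleteRecordById(const std::string& recordId) {
    std::cout << "delete record " << recordId << "\n";  // pretend protocol call
    return true;
}

int main() {
    deleteRecordByIndex(3);          // fragile
    deleteRecordById("batch-42-a");  // what this rant is asking for
}
```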
WHY?
Why do I, developing a third-party system, have to be responsible for these dumb vulnerabilities you've created? -
everytime i buy a new phone, i feel this sense of extreme regret :(
i bought a moto g 5g phone last year in feb, it was so good. it didn't have any out-of-the-world cameras or some funky stuff, but it gave a decent performance and i couldn't want any other phone.
In October my mom's phone started giving issues so i bought a realme phone for her that was half my phone's price. i couldn't spend any more because otherwise she wouldn't take it. she accepted the cheaper phone and within 4 days she was cursing it. the phone had decent specs but would lag in certain apps like zoom, and wouldn't run some call recorder apps. at the end i swapped my phone with mom's since i didn't care about zoom or the recorder.
now this shit realme phone's memory has gone around 60% full of my stuff, and it's showing its limitations. This shit auto relaunches insta after a few minutes of usage, probably because its runtime memory gets short (4gb 128gb device gets memory shortages. nice). its video quality is shit and the camera also rarely takes good pics.
the worst thing about smartphones today is how they over optimise the ui. this insta issue and auto call recorders not working is simply because of the realme skin running over the stock android. i had similar issues with a xiaomi device i bought for my dad sometime ago. (fortunately my dad is more medieval so that crap has not come back to me :'/ )
so overall i am buying a 3rd phone in 17 months.
This time it's Samsung f23 and am worried that it's also going to suck. i was this 🤏close to buying a pixel 6 or even an iphone coz i can afford them.
but the regret of buying such an expensive phone that will need replacement in 2 years made me rethink.
the only android os that have suited me the best is stock and as of now only 2 companies are making it : google and moto(* it's 100% aosp with 3 extra apps but they can't say that, so they also state that they are not stock os) . one plus is also a brand that i have heard makes a good os . but recently i also heard that they have completely scrapped their os and using oppo's softwares . plus the amount of tickets we get for notifications not working in oneplus, am sure their optimization is extremely aggressive.
so everything between a moderate price phone ( that will need a replacement in 2 years ) to a flagship felt unnecessary to me, so i went ahead with a Samsung's shit phone. f23 has almost same specs as moto but it's again a heavily customised os. i wanna waste my money on trying a custom os and declare it shitty.
most of my friends that use Samsung are fans of it but they are also not very techy so i guess it suits them well. i am the guy who first installs nova launcher on his device, so let's see what it brings to the table. from the 3rd person p.o.v, i felt its screen and camera images to be nice whenever i used their mobiles, so let's see what this brings to the table :( -
Thought experiment time:
Imagine that this whole universe is a simulation created by a Group Of Developers (GOD).
- Who would make up this group?
- What kind of design patterns would they follow?
- What type of programming language would they use?
- What kind of bugs are there if any?
- How do they test?
- Assuming the use of quantum computing, what are the implications? Parallel simulations? All possibilities play out?
- Would the controller input be life?
- Who is AI and who are players?
- Has all time already been rendered?
- Do we respawn?
- What would the leaderboard look like?
- What kind of stats are tracked
- What are dreams, nightmares, lucid dreams, sleep paralysis, birth and death?
- How is memory stored, accessed and pruned?
- What kind of neural net is used and where?
etc etc, if you can think of any other interesting ones, fire away -
RethinkDB is such a ridiculous, overengineered mess; the BIGGEST BULLSHIT I HAVE EVER UNFORTUNATELY USED.
Does anyone even use this total shit????
This shit eats RAM memory for just 1 CRUD operation as if you opened 10,000 google chrome tabs. Who the fuck thought that kind of technology is a good idea?
Yes it IS very fast, a real-time database. But you'd have to have a multi-million dollar supercomputer to be able to handle as much data as a relational database can.... -
c++ has a little bit of a learning curve, I think.
Used smart pointers everywhere in my code because I heard that's what we gotta do nowadays.
When learning about shared vs unique vs weak, I disregarded weak pointers because I didn't really understand them.
"That sounds like something for liberal pansies", I said to myself, then continued on with my STRONG shared and unique pointers.
Now my app leaks memory like a MOTHERFUCKER, if you can believe that.
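For anyone who hasn't been bitten yet, the classic leak looks something like this: two shared_ptrs that point at each other never drop their use counts to zero, and a weak_ptr on one side of the cycle is the usual fix. A minimal sketch, not the actual app code:

```cpp
#include <memory>

struct Node {
    std::shared_ptr<Node> child;   // owning edge: fine
    std::shared_ptr<Node> parent;  // owning back-edge: creates a cycle -> leak
    // std::weak_ptr<Node> parent; // non-owning back-edge: breaks the cycle
};

int main() {
    auto a = std::make_shared<Node>();
    auto b = std::make_shared<Node>();
    a->child  = b;
    b->parent = a;  // a and b now keep each other alive
    // When a and b go out of scope, both use counts are still 1,
    // so neither destructor ever runs: that's the leak.
}
```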
So now I need to go back and manage my object lifetime with more intent instead of just making everything a shared pointer. Fuckin circular references. Fuckin reaping what I fuckin sow. God damn. -
Look, a nice puzzle. Solve it and win great prizes!
1. _________ (7 letters) - A C++ output stream class commonly used to send output to the console.
2. _________ (3 letters) - A past tense verb, often used in logging or indicating a completed task.
3. _________ (3 letters) - A negation commonly used in boolean logic or programming conditions.
4. _________ (6 letters) - A command or function that removes an object, file, or memory allocation in programming.
5. _________ (7 letters) - In object-oriented programming, a term referring to an instance acting upon itself. -
I'm trying to investigate why Chrome keeps crashing after I implemented WebSockets in a web app.
I used Windows perfmon to see the memory usage overnight.
The usage between 17:30 and 01:50 is expected behaviour as this part of the app is a live data graph of the last 48 hours.
Now I have to find out why the app's memory usage doubles twice an hour. -
1) Learning little to nothing useful in formal post-secondary and wasting tons of time and money just to have pain and suffering.
"Let's talk about hardware disc sectors divisions in the database course, rather than most of you might find useful for industry."
"Lemme grade based on regurgitating my exact definitions of things, later I'll talk about historical failed network protocols, that have little to no relevance/importance because they fucking lost and we don't use them. Practical networking information? Nah."
"Back in the day we used to put a cup of water on top of our desktops, and if it started to shake a lot that's how you'd know your operating system was working real hard and 'thrashing' "
"Is like differentiation but is like cat looking at crystal ball"
"Not all husbands beat their wives, but statistically...." (this one was confusing and awkward to the point that the memory is mostly dropped)
Streams & lambdas in java, were a few slides in a powerpoint & not really tested. Turns out industry loves 'em.
2) Landed my first student job and got shoved onto an old legacy project nobody wants to touch. Am isolated and not being taught or helped much, do poorly. Boss gets pissed at me and is unpleasant to work with and get help from. Gets to the point where I start to wonder if he's trying to make a show of how much of a nuisance I am. He meddles with some logo I'm fixing, getting fussy about individual pixels and shades, and makes a big deal of knowing how to use GIMP and how he's sitting with me micromanaging. Monthly one-on-ones were uncomfortable and had him metaphorically jerking off about his life story career-wise.
But I think I learned in code monkey industry, you gotta be capable of learning and making things happen with effectively no help at all. It's hard as fuck though.
3) Every time I meet an asshole who knows more and has accomplished more than I do (that's a lot of people) with higher TC than me (also a lot of people), I despair as I realize I might sound like that without realizing it.
4) Everytime I encounter one of my glaring gaps in my knowledge and I'm ashamed of the fact I have plenty of them. Cargo cult programming.
5) I can't do leetcode hards. Sometimes I suck at whiteboard questions I haven't seen anything like, or anything similar to, before.
6) I also suck at some of the trivia questions in interviews. (Gosh I think I'd look that up in a search engine)
7) Mentorship is nigh non-existent. Gosh I'd love to be taught stuff so I'd know how to make technical design/architecture decisions and know the tradeoffs between tech stacks. So I can go beyond being a codemonkey.
8) Gave up and took an ok job outside of America rather than continuing to grind then try to interview into a high tier American company. Doubtful I'd ever manage to break in now, and TC would be sweet but am unsure if the rest would work out.
9) Assholes and trolls on Stack Overflow; it feels quite hard to ask questions sometimes, and questions now get closed, marked as dupe, or downvoted without explanation.
This little game took me like 2h of development; it's built without any framework whatsoever.
It is based on my memory of a very old game my brothers used to play on DOS, it was used to teach how to type superfast
Little details on how this works: the inputs at the bottom are programmed to be used with keys (only letters), ENTER and TAB, no need to use mouse in this game to move around, just hit tab to move to next, hit enter to confirm what you typed.
I know I should upgrade this to use a list of actual words instead of just random letters, but never wanted to actually work on it again.
http://examcopy.altervista.org/apps...
I highly recommend trying it on a PC, also contains Ads, not invasive, tho
Other games I developed:
http://stefagna.altervista.org/swis...
http://examcopy.altervista.org/apps...
Note: PLEASE, DON'T GO TO THE HOMEPAGE OF THESE WEBSITES, they're kind of NSFW -
I can work with Angular, even though it's a pain in the butt.
My current Angular job is actually the first job with a manager that has decent human values and ethics, I like my team, and yeah, what we're building is shit. But it's only 30% shit because of Angular, another 30% is due to SAFe, and the rest is the usual stuff.
Still enjoy my job and respect my team.
But please do not expect me to pretend Angular is on a comparable level to React. Angular hasn't brought any actual innovation in most major versions but releases those breaking major updates still at least twice a year.
Ivy might be awesome, but just because Angular told the world 3 years ago to also have Ivy-compatible compile targets for their libs/packages doesn't mean everybody cared.
And ngcc, the awesome compatibility compiler, mutates node_modules in place. So no parallel stuff, no using yarn2 or pnpm.
At the same time, React brought so many innovations into the frontend world but is basically backwards compatible.
Not sure how the Angular partial compilation and whatever needs to go on works, but it seems like there's hardly anyone that really knows, so you can't use Vite or whatever other new tool.
And sure, if you're really good, you can write Angular without producing memory leaks.
But it's really hard. Do you know what's also quite hard: Producing memory leaks with React!
And for sure, Angular Universal, which isn't used by anyone, it feels like, will still be on a comparable level to an open source product that's used all over the world, forms the basis for an open source company, and is improved by thousands of issues day by day.
And sure, two kinds of change detection are a great idea. And yeah, pretending Angular comes with everything included makes it worth it that the API is fucking huge, and you're better off knowing nothing, because you have to read things up, than knowing quite a lot, since making assumptions and believing APIs work in a similar way and follow similar conventions...
Whatever... I work with it. Like the team. Like the company, even my boss. But please don't expect me to lie to you that this was a good idea, or that Angular is even remotely on the same level as React. -
TLDR; WINE+me=system binaries gone. (HOWTHEFUCKDIDIDOTHAT) Kernel panic. Core program files gone. I'll never have it fixed right. Will backup, then install fedora tomorrow.
I really like games and I'm sure there are many of you who can relate. Imagine my perpetual pain, being on the job hunt, no money, and only my Linux laptop for games. (It's only Linux because of a stupid accident and a missing windows installation disk, partly explained in a previous rant). My stack of games my dad and I have played over the years, going back to populous and before, looked light enough for my laptop to run them smoothly. I wanted to see if I could get one to work. My eyes settled on simcity 4 and Sid Meier's railroad tycoon, 13 and 10 years old, respectively. Simcity didn't work as many times as I tried following online instructions. Disk 1 went fine. Disk 2 showed up as Disk 1. Didn't think much of it, so long as the computer could read the contents. I downloaded playonlinux as that could apparently do the complex stuff for me. Didn't work. I gave up with it after an hour and a half.
Next was railroads. Put the disk in aaaand it says SimCity disk 1 is in the tray. Fuck right off, thank you very much. Eject, put back, reject, eject, fiddle in wineconfig, eject, more of this, and voilà it read as railroads :) Ran autoplay.exe with wine, followed instructions, installed it, and it worked! Chose single player, then the map and setting, pressed play, and all the models of the buildings and track were floating in the air over a green plane, the UI is weird and the map doesn't represent anything but trains. All the fkin land is gone, laying track is gonna be a ballache.
I quit it and decided bedtime.
Ctrl+alt+t
sudo shutdown -h now
shutdown not found.
sudo reboot
reboot not found
Que?
Nope, I don't like this.
Force choked my laptop by the power button. Turned it on again.
Lines of text appear.
Saw a phrase I've only ever seen on Mr Robot.
Kernel panic.
Nooooo thanks, not today, this is fiction.
I turned it off and on. Same thing. I read the logs and some init files couldn't be found. I got the memory stick I used to install mint in the first place and booted from that. I checked the difference between my stick's bin and sbin and the laptop's, and it was indeed missing binaries. Fuck knows what else has happened, I only wanted to play games but now I don't know what is or isn't in my computer. How can I trust what's on it now?
I go downstairs and tell my dad. He says something about rpm, but this is Linux so it won't work. I learn that binaries can be copied over, so maybe I can fix it.
Go upstairs again, decide not to fix it. Fedora is light, has a good rep for security, and is even more difficult to get games on, which is my vice. There are more reasons, but the overriding one is that I'm spooked by the fact that something I did went into and removed system binaries, maybe even altered others, so I want something I'm less likely to do that with. Also my fellow cs students used to hate on it but my dad uses and recommended it so I want to try it.
Also, seriously, fuck wine/PlayOnLinux/my inability to follow instructions(?)/whatever demons haunt me. Take your pick, at least one if not more is to blame and I can't tell which, but it's prooooobably the third one.
It's going to be 16 hours before I touch my laptop again, comments before I backup then install fedora are welcome, especially if they persuade me to do differently.
P.S. thanks for reading this mind dump of a post, I'm writing while it's fresh but I'm tired AF. -
I’m struggling in studying and that’s seriously holding me back, regardless of the type of technical book I’m reading I’m always in a fight with my brain. Even if I enjoy the topic and then I’ll enjoy using what I read while I study I struggle to learn more than 1-2 chapters (sometimes even less) at time then my head starts to hurt, my focus drifts away and if I force myself to go ahead my brain just refuses to store the new informations, it feels like filling a full tank.
By this point I should have learned C++ and Swift and started to contribute to projects which aren't overdone web apps, but all I have are two half-read books which silently "judge" me anytime I open my eBook library, and which I dread returning to, having associated them with headache and frustration. The only things I read this year are design patterns (which haven't found a single real-life use since then) and F# (which I never used except for some little demos and which is now slowly fading away in my memory).
Have you got any study advice to help me deal with this frustrating situation? -
Some of you know I'm an amateur programmer (ok, you all do). But recently I decided I'm gonna go for a career in it.
I thought projects to demo what I know were important, but everything I've seen so far says otherwise. Seems like the most important thing to hiring managers is knowing how to solve small, arbitrary problems. Specifics can be learned and a lot of 'requirements' are actually optional to scare off wannabes and tryhards looking for a sweet paycheck.
So I've gone back, dusted off all the areas where I'm rusty (curse you regex!), and am relearning, properly. Flash cards and all. Getting the essentials committed to memory, instead of fumbling through, and having to look at docs every five minutes to remember how to do something because I switch languages, frameworks, and tooling so often. Really committing toward one set of technologies and drilling the fundamentals.
Would you say this is the correct approach to gaining a position in 2020, for a junior dev?
I know that for a long time 'entry level' positions didn't really exist, but from what I'm hearing around the net, that's changing.
Heres what I'm learning (or relearning since I've used em only occasionally):
* Git (small personal projects, only used it a few times)
* SQL
* Backend (Flask, Django)
* Frontend (React)
* Testing with Cypress or Jest
Any of you have further recommendations?
Gulp? Grunt? Are these considered 'matter of course' (simply expected), or learn-as-you for a beginner like myself?
Is knowing the agile 'manifesto' (whatever that means) by heart really considered a big deal?
What about the basics of BDD and XP?
Is knowing how to properly write user-stories worth a damn or considered a waste of time to managers?
Am I going to be tested on obscure minutiae like little-used yarn/npm commands?
Would it be considered a bonus to have all the various HTTP codes memorized? I mean thats probably a great idea, but is that an absolute requirement for newbies, or something you learn as you practice?
During interviews, is there an emphasis on speed or correctness? I'm nitpicky, like to write cleanly commented code, and prefer to have documentation open at all times.
Am I going to, eh, 'lose points' for relying on documentation during an interview?
I'm an average programmer on my good days, and the only thing I really have going for me is a *weird* combination of ADD and autism-like focus that basically neutralize each other. The only other skill I have is talking at people's own level to gauge what they need and understand. Unfortunately, and contrary to the grifter persona I present for lulz, I hate selling, let alone grifting.
Otherwise I would have enjoyed telemarketing way more and wouldn't even be asking this question. But thankfully I escaped that hell and am now here, asking for your timeless nuggets of bitter wisdom.
What are truly *entry level* web developers *expected* to know, *right out the gate*, obviously besides the language they're using?
Also, what is the language they use to program websites? It's like java right? I need to know. I'm in an interview RIGHT now and they left me alone with a PC for 30 minutes. I've been surfing pornhub for the last 25 minutes. I figure the answer should take about 5 minutes, could you help me out and copypasta it?
Okay, okay, I'm kidding, I couldn't help myself. The rest of the questions are serious and I'd love to know what your opinions are on what is important for web developers in 2020, especially entry level developers. -
Not exactly a story since I was too young to remember, but my parents told me that I really enjoyed playing with the games my father made for the good old Commodore 64 we had.
He basically had two 5" floppy holders full of his own games and programs he used to make. Unfortunately we only have the disks now. :(
The first memory of me using the computer though, is when my father bought a computer for his office (was win 95 with the "you are now safe to switch off your computer" message) and I was sneaking in to play with paint because it was so cool back then. -
Did I get old or did I just finish plucking all the low hanging fruit?
When I started on my programming journey about a decade ago everything felt exciting and I learned a lot of things every day (variables, loops, methods, classes, etc.)
Now, a decade later, I am more concerned with the overall system design, algorithm usage (Big O notation), how reliable the system is, how the configurations are set up and how easy it is to change them.
I now notice that I don't really learn anything new. Everything feels the same.
Want redundancy? Use more servers
Want faster performance? Make a parallel system.
Want program to run on low end device? Think about how memory and storage will be used in system.
Is this a stage everyone goes through, like puberty? Or am I just having a midlife crisis?
PS: I haven't even reached 30 yet but I feel too old. -
I joined a project that has been in development for four years. After a couple of weeks of getting used to what has already been done I saw some strange coding.
One thing that struck me in particular was how often I saw pointers to pointers of objects being passed as arguments without any obvious reasons.
Only after I got to write a new functionality myself did I see why that was done.. as I needed to do it too. I had to allocate the memory for an object that was given as the parameter.
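For context, that's the usual out-parameter idiom: the extra level of indirection exists so the callee can allocate the object and hand it back through the caller's own pointer. A minimal sketch of the pattern, not the project's code:

```cpp
#include <cstdlib>

struct Widget { int id; };

// Takes a pointer to the caller's pointer so it can point that pointer at
// freshly allocated memory; the reason for all those Widget** parameters.
int make_widget(Widget **out, int id) {
    *out = static_cast<Widget *>(std::malloc(sizeof(Widget)));
    if (*out == nullptr) return -1;
    (*out)->id = id;
    return 0;
}

int main() {
    Widget *w = nullptr;
    if (make_widget(&w, 42) == 0) {  // pass the address of our own pointer
        // ... use w ...
        std::free(w);
    }
}
```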
C is a hell of a language.. just as I thought I was good at it things like this happen. -
While attempting to quit smoking and after spending a full day trying to understand why the previous devs took this approach to encrypting a string and my lack of nicotine addled brain not allowing me to see that this was a “Secure”String and so uses a machine specific key (that’s why the code that worked locally wouldn’t run on production 😑) this is my rant on comments added to the helper I had to write
/// <summary>
/// If you are using this class and it's not for backward compatibility - then you probably shouldn't be using it
/// Nothing good comes from "Secure" strings
/// Further to this Secure strings are only "useful" for single user crypto as the encryption uses the login creds, transferring
/// this data to another client will result in them never being able to decrypt it
///
/// Windows uses the user's login password to generate a master key.
/// This master key is protected using the user's password and then stored along with the user's profile.
/// This master key then gets used to derive a number of other keys and it's these other keys that are used to protect the data.
///
/// This is also a broken crypto method via injection (see Hawkeye http://hawkeye.codeplex.com/) plus the string is stored in plain
/// text in memory, along with numerous other reasons not to use it.
/// </summary>
public class SecureStringHelper
{ -
haha yes let's go from 512MB used by the Android kernel to 1.5GB used in 8 hours thx phonerant android fuck my phone memory leak no root to fix the issue i only have 2gb total that can be used
-
So i wasted last 24 hours trying to satisfy my ego over a shitty interview and revisiting my old job's codebase and realising that i still don't like that shit. just i am 25 and have no clue where am i heading at. i am just restless, my most of the decisions in 2023 have given very bad outcomes and i am just trying doing things to feel hopeful.
context for the interview story-----
my previous job was at a b2b marketing company whose sdk was used by various startups to send notifications to their users, track analytics etc. i understood most of it and don't find it to be any major engineering marvel, but that interviewer was very interested in asking me to design a system around it.
in my 1.2 years of job there, i found the codebase to be extremely and unnecessarily verbose ( java 7) with questionable fallbacks and resistance towards change from the managers. they were always like "we can't change it otherwise a lot of our client won't use our sdk". i still wrote a lot of testcases and tried to understand the working of major features.
BTW, before you guys go on a declare me an embarrassment of an engineer who doesn't know the product's code base, let me tell you that we are talking SDKs (plural) and a service based company here. their was just one SDK with interesting, heavy lifting stuff and 9 more SDKs which were mostly wrappers and less advanced libraries. i got tasks in all of them, and 70% of my time went into maintaining those and debugging client side bugs instead of exploring the "already-stable-dont-change" code base.
so based on my vague understanding and my even more vague memory from 1 year ago, i tried to explain an overall architecture to that interviewer guy. His face was screaming the word "pathetic" from his expressions, so i thought that today i will try to decode the codebase in 12-15 hours, publish a cool article and be proud of how much i know a so called martech system design. their codebase is open sourced, so it wasn't difficult to check it out once more.
but boy oh boy i got so bored. unnecessary classes, unnecessary callbacks, static calls, oof. i tried to refactor a few classes, but even after removing 70% of the codebase, i was still left with 100+ classes, most of them being 3000-4000 lines long. and this is your plain old java library adding just 800kb to your project.
boring, boring stuff. i would probably need 2-3 more days to get an understanding of the complete project, although by then i would again be questioning my life choices: was this a good use of my 36 hours?
what IS a correct usage of my time? i am currently super dissatisfied with my job, so want to switch. i have been here for 6 months, so probably i wouldn't be going unless i get insane money or an irresistible company offer. For this i had devised a 2 part plan to either become good at modern hot buzz stuff in my domain( the one being currently popularized by dev influenzas) or become good at dsa/leetcode/cp. i suck bad at ds/algo stuff, nor am i much motivated. so went with that hot buzz stuff.
but then this interview expected me to be a mature dev with system design knowledge... agh fuck. its festive season going on and am unable to buy any cool shirts since i am so much limited with my money from my mediocre salary and loans. and mom wants to buy a home too... yeah kill me
Last year, I made an implementation of the A* maze-solving algorithm in class. I used a linked list and my friends used arrays. Their implementations were way faster than mine (I remade it later :p).
OK, I understand that accessing memory by address is way faster than accessing by iteration, but I also see that Python lists or C# lists are really fast. How is it possible to make a list performance-proof like this? Does the Python interpreter do a realloc each time you append or pop a value?
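Short answer to that last question, as far as I know: no. CPython's list (and C#'s List&lt;T&gt;) over-allocate and grow geometrically, so most appends touch no allocator at all and the occasional grow-and-copy averages out to constant time. A toy sketch of the idea:

```cpp
#include <cstddef>
#include <cstdlib>

// Toy growable array: capacity grows geometrically, so out of N appends only
// O(log N) of them actually reallocate; the rest are a single store.
struct IntVec {
    int*        data = nullptr;
    std::size_t size = 0, cap = 0;

    void push_back(int v) {
        if (size == cap) {                  // rare path
            cap = cap ? cap * 2 : 8;        // double (CPython grows by ~1.125x)
            data = static_cast<int*>(std::realloc(data, cap * sizeof(int)));
            // (a real container would handle realloc failure here)
        }
        data[size++] = v;                   // common path: no allocation at all
    }
    ~IntVec() { std::free(data); }
};

int main() {
    IntVec v;
    for (int i = 0; i < 1000000; ++i) v.push_back(i);  // only ~18 allocations
}
```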
Up all damn night making the script work.
Wrote a non-sieve prime generator.
Thing kept outputting one or two numbers that weren't prime, related to something called carmichael numbers.
In any case, got it to work; god damn was it a slog though.
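For context, a hedged guess at why those showed up: Carmichael numbers are exactly the composites that pass the plain Fermat test for every base coprime to them, so a generator leaning on that test will let a few of them through. A minimal version of that test, not the author's pastebin code:

```cpp
#include <cstdint>
#include <iostream>

// Modular exponentiation: (base^exp) % mod, safe from overflow in 64 bits
// as long as mod stays below 2^32.
std::uint64_t pow_mod(std::uint64_t base, std::uint64_t exp, std::uint64_t mod) {
    std::uint64_t result = 1;
    base %= mod;
    while (exp > 0) {
        if (exp & 1) result = (result * base) % mod;
        base = (base * base) % mod;
        exp >>= 1;
    }
    return result;
}

// Plain Fermat check with a fixed base: a^(n-1) == 1 (mod n) for primes.
// Carmichael numbers like 561 = 3*11*17 satisfy it for every coprime base,
// which is how composites sneak past a Fermat-only "prime" generator.
bool fermat_probably_prime(std::uint64_t n, std::uint64_t a = 2) {
    if (n < 4) return n == 2 || n == 3;
    return pow_mod(a, n - 1, n) == 1;
}

int main() {
    std::cout << fermat_probably_prime(561) << "\n";  // prints 1, yet 561 = 3*11*17
    std::cout << fermat_probably_prime(563) << "\n";  // prints 1, 563 really is prime
}
```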
Generates next and previous primes pretty reliably regardless of the size of the number
(haven't gone over 31 bit because I haven't had a chance to implement decimal for this).
Don't know if the sieve is the only reliable way to do it. This seems to do it without a hitch, and doesn't seem to use a lot of memory. Don't have to constantly return to a lookup table of small factors or their multiple either.
Technically it generates the primes out of the integers, and not the other way around.
It takes 0.01-0.02 seconds per prime up to around the 100 million mark, and then it gets into the 0.15-1 second range per generation.
At around primes of a couple billion, it's averaging about 1 second per bit to calculate 1. whether the number is prime or not, 2. what the next or last immediate prime is. Although I'm sure there's some optimization or improvement here.
Seems reliable but obviously I don't have the resources to check it beyond the first 20k primes I confirmed.
From what I can see it didn't drop any primes, and it didn't include any errant non-primes.
Codes here:
https://pastebin.com/raw/57j3mHsN
Your gotos should be nextPrime(), lastPrime(), isPrime, genPrimes(up to but not including some N), and genNPrimes(), which generates x amount of primes for you.
Speed limit definitely seems to top out at 1 second per bit for a prime once the code is in the billions, but I don't know if that's the ceiling, again, because decimal isn't implemented yet.
I think the core method, in calcY (terrible name, I know), could probably be optimized in some clever way if it's given an adjacent prime and the parameters that were used. There's probably some pattern I'm not seeing, but eh.
I'm also wondering if I can't use those fancy aberrations, 'Carmichael numbers' or whatever the hell they are, to calculate some sort of offset, and by doing so figure out a given prime's index.
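For anyone curious why Carmichael numbers sneak past this kind of check, here is a minimal sketch (not the pastebin code above, just an illustration) of a Fermat probable-prime test with a nextPrime() built on top of it; Carmichael numbers such as 561 satisfy a^(n-1) ≡ 1 (mod n) for every base a coprime to n, so a plain Fermat test reports them as prime:

def fermat_probably_prime(n, bases=(2, 3, 5, 7)):
    if n < 2:
        return False
    for a in bases:
        if a % n == 0:
            continue
        if pow(a, n - 1, n) != 1:   # Fermat's little theorem check
            return False
    return True

def next_prime(n):
    candidate = n + 1
    while not fermat_probably_prime(candidate):
        candidate += 1
    return candidate

print(next_prime(100))              # 101
print(fermat_probably_prime(561))   # True, yet 561 = 3*11*17 (Carmichael)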
And all my brain says is "sleep"
But family wants me to hang out, and I have to go talk a manager at home depot into an interview, because wanting to program for a living, and actually getting someone to give you the time of day are two different things.1 -
So, in OpenGL 4.x, there is no primitive for a circle, and the only ways to draw an almost perfect circle are the following:
Draw a triangle fan and fk up your memory for a circle
Draw a rectangle and use the fragment shader and the distance equation to discard the bits that are not used
But you will need to add an if statement and potentially increase the frame time (from what I have heard)
And it will be more complicated than just using a triangle fan14 -
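For what it's worth, here is the distance-test idea from the rectangle approach above sketched in Python over a plain pixel grid; a real GLSL fragment shader would do the same test per fragment with length() and discard, so this only shows the logic:

def circle_pixels(cx, cy, r):
    pixels = []
    for y in range(int(cy - r), int(cy + r) + 1):        # walk the bounding "quad"
        for x in range(int(cx - r), int(cx + r) + 1):
            if (x - cx) ** 2 + (y - cy) ** 2 <= r * r:    # distance equation
                pixels.append((x, y))                     # keep; otherwise "discard"
    return pixels

print(len(circle_pixels(0, 0, 10)))   # roughly pi * r^2 pixels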
Who actually started the reign of mixed-character passwords? Because seriously, it sucks to have an unnecessarily complex password! Like websites and apps requiring passwords to contain upper/lowercase letters, numeric characters and symbols, without considering the average user with a low memory threshold (i.e. me).
Let's push the complaint aside and return to the actual reason a complex password is required.
As we already know, passwords are made complex so they can't be easily guessed by the password crackers hackers use, and the primary reason behind adding symbols and numbers to a password is simply to stretch the space of possible guesses.
Now let's take a look at the logic behind a password cracker.
To hack a password,
1) The password cracker will usually look up a dictionary of passwords (this step is necessary for any possible outcome).
2) It attempts to log in multiple times with the list of passwords found (in most cases successful entries are found for passwords of less than 8 chars).
3) If none was successful by the end of the dictionary, the cracker mutates each password in the dictionary to match the popular standards of most websites (i.e. first letter uppercase, a number at the end followed by a symbol; thanks to those websites!)
4) If any password was successful, the cracker adds it to a new dictionary called a "pattern builder list" (this gives the cracker an edge on that specific platform, because most websites force a specific password pattern anyway)
In comparison:
>> Mygirlfriend98##
would be cracked faster compared to
>> iloveburberryihatepeanuts
Why?
Because the former is short and follows a popular pattern.
In reality, password crackers don't specifically care about Upper-Lowercase-Number-Symbol bullshit! They care more about the length of the password, the pattern of the password and formerly used entries (either from keyloggers or from previously hacked passwords).
So the need for requesting a humanly complex password is totally unnecessary, because it's a bot that is being dealt with, not another human.
My devrant password is a short story of *how I met first girlfriend* Goodluck to a password cracker!6 -
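To put rough numbers on the length-versus-pattern comparison above: for pure brute force the search space is charset_size ** length, so the long all-lowercase phrase actually dwarfs the short "complex" one (and dictionary/pattern attacks make the short one weaker still). A quick back-of-the-envelope check, with the character-set sizes being my own assumptions:

def search_space(charset_size, length):
    return charset_size ** length

short_complex = search_space(26 + 26 + 10 + 32, len("Mygirlfriend98##"))
long_simple = search_space(26, len("iloveburberryihatepeanuts"))
print(f"{short_complex:.2e} guesses vs {long_simple:.2e} guesses")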
I'm stuck.
Quitting smoking/scrolling/youtube/other unhealthy coping mechanisms, taking my prescriptions and exercising every day made my mind free — now it's unobstructed, clear and not hindered in any way.
The problem is, without constant coping, my memory turned into a minefield. I can't think freely, as I constantly stumble upon trauma after trauma that make my heart physically hurt.
With clear mind, I now clearly see what used to lurk in shadows, and I'm terrified of it. I won't go back to smoking and watching youtube ten hours a day.
What should I do?12 -
When file managers copy and delete files within the same partition instead of moving or renaming them…
When Google's Storage Access Framework was introduced, it did not feature a move command, so file managers just resorted to copying and deleting files within the same storage. Not only is this much slower and a cause of needless wear, but it also destroys the date/time attribute (it gets reset to the current time).
When moving files through MTP (miserable transfer protocol, used for connecting smartphones to PC), they are also copy-deleted. This makes moving a 20-Gigabyte DCIM folder impractical. Also, if one cancels the operation, it might end up whoopsie-daisy deleting some files from the source before they have been transferred.
MTP is so bogus that it is incapable of a simple operation that would JustWork™ on mass storage devices. Not to mention, MTP lacks parallelism and its directory listing loading is S-L-O-W. Upwards of a minute for just 1000 files. Sometimes it fails to load at all.
Also, when renaming a file through MTP from the terminal via GVFS, even just within the same folder, it copy-deletes it. If I want to rename a 1 GB 2160p 4K video in a highly populated DCIM folder, I can not do so through the terminal. At least the 4K video has a timestamp in its internal metadata, but it still renames slowly and adds needless wear to the smartphone's flash memory.14 -
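On a normal local filesystem the difference is easy to demonstrate; this small sketch shows that an in-place rename keeps the modification time while a plain content copy plus delete resets it, which is exactly what the copy-delete behaviour above does to photos:

import os, shutil, tempfile

d = tempfile.mkdtemp()
src = os.path.join(d, "photo.jpg")
with open(src, "wb") as f:
    f.write(b"fake image data")
os.utime(src, (0, 0))                        # pretend the photo is from 1970

os.rename(src, os.path.join(d, "renamed.jpg"))
print(os.path.getmtime(os.path.join(d, "renamed.jpg")))   # still 0.0

shutil.copyfile(os.path.join(d, "renamed.jpg"), os.path.join(d, "copied.jpg"))
os.remove(os.path.join(d, "renamed.jpg"))
print(os.path.getmtime(os.path.join(d, "copied.jpg")))    # now the current time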
The one that exists (C#) seems underused compared to where it could (or even should) be used. And the place that uses it the most (enterprise) butchers and mangles its use, just as enterprise tends to do with everything.
The one that I'm designing... the fact that it doesn't exist yet, and that even as I'm zeroing in on syntax and philosophy that I'm very much starting to be proud of, I still don't have a proper idea of how to implement even the most basic parser/interpreter for it, not because it's in any way difficult or unusual, but just because... I've never done that before, so I get into weird circular thought paths that produce weird nonsensical code...
... on top of that, I still only have a very, very fuzzy idea of how it will (sometime in the extremely distant future) actually implement the most interesting and core feature - event-based continuous (partial) re-parsing of the source code, and the fact that traversing the tokens at the leaf level of the syntax tree should result in valid machine code (or at least assembly) that is the "compiled" program.
I *know* it's possible, I just don't yet know enough to have a concrete idea of how exactly to achieve it.
But imagine - a programming language where interactive programming is basically the default way of working, and basically the same as normal programming in it, except the act of parsing is also the (in-memory) compilation at the same time, so it's running directly on the hardware instead of via interpreter/VM/any of that overhead crap.
also then kinda open-source by definition.
and then to "only" write an OS in that, and voilá! a smalltalk-like environment with non-exotic, c-family syntax and actual native performance!
ahhh... <3
* a man can dream *2 -
Perhaps as a tip for the junior devs out there, here's what I learned about programming skills on the job:
You know those heavy classes back in college that taught you all about data structures? Some devs may argue that you just need to know how to code and don't need to know fancy data structures or Big O notation theory, but in the real world we use them all the time, especially on important projects.
All those principles about sets, (linked) lists, map, filter, reduce, union, intersection, symmetric difference, Big O notation... They matter and are used to solve problems. I used to think I could just coast by without being versed in them. Soon, mathematics and Big O notation came back to bite me.
Three example projects I worked in where this mattered:
- Massive data collection and processing in legacy Java (clients want their data fast, so better think about the performance implications of CRUD into Collections)
- ReactJS (oh yes, maps and filters are used a lot...)
- Massive data collection in C# where data manipulation results are crucial (union, intersection, symmetric difference,...)
Overall: speed and quality mattered (better know your Big O notation or use a cheat sheet, though I prefer the former)
Yes, the approach can be optimized here, but often we're tied to client constraints, with some room if we're lucky.
I'm glad I learned this lesson. I would rather have skills in my head and in memory than having to look up things and try to understand them all the time.5 -
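As a tiny illustration of the operations mentioned above, and of why a hash-based set beats scanning a list when reconciling large collections (membership checks are O(1) on average versus O(n)), here is a throwaway sketch:

seen_ids = {1, 2, 3, 5, 8}
incoming_ids = {2, 3, 13, 21}

print(seen_ids | incoming_ids)   # union
print(seen_ids & incoming_ids)   # intersection
print(seen_ids ^ incoming_ids)   # symmetric difference: in exactly one side

big_list = list(range(1_000_000))
big_set = set(big_list)
print(999_999 in big_set)        # hash lookup, effectively constant time
print(999_999 in big_list)       # linear scan through the whole list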
I used to work with another dev who had memory problems. This guy *literally* could not remember what he did yesterday...
So, he was trying to change one of the password screens we had in the app. This was a really simple screen: logo, password prompt, and two buttons. He worked on this small change for two days, but nothing he did affected the screen at runtime.
So finally, he gave up and called me to help him... I come over, and look at his code. It looks ok. I make a small change, and see what happens. Nothing. I think for a moment, and delete the entire screen UI elements. Run the app. Nothing happens - screen still the same.
Then I got it - he kept changing the wrong screen... for two days....
Took me a whole 5 minutes to figure out.2 -
Before I started working, I used to feel like I depended on documentation and the internet a little too much, owing to ultra-crappy long-term memory. After spending some time at my internship going through code written by "professional developers" several years senior to me, and trying to write unit tests for it (surprise: the code was in production without having undergone any sort of testing), I feel like the amount of time I spend online reading usage recommendations, alternatives for optimisation, best practices for writing clean and descriptive code and all that is a lot more rewarding. Some bad things help you feel good about yourself.
-
Some compilers give an error message on a forgotten type cast. Beyond that, explicit casting shows good style and helps avoid careless errors that, in the worst case, can crash the program. With some types an explicit cast is necessary; with other types the compiler performs the conversion automatically.
In short: type casting is used to prevent mistakes.
An example of such an error would be:
#include <stdio.h>
#include <stdlib.h>
int main ()
{
    /* malloc returns void*, so the +1 here is applied to a void pointer */
    int * ptr = malloc (10*sizeof (int))+1;
    free(ptr-1);   /* ptr-1 steps back by sizeof(int), not by the 1 byte added above */
    return 0;
}
The intent is to point at the second element of the requested memory, but pointer arithmetic (+, -) is not defined for a void pointer; compilers that accept it as an extension advance by single bytes, so the later free(ptr-1) no longer receives the original address.
The improved example would be:
int * ptr = ((int *) malloc (10*sizeof (int)))+1;   /* +1 now advances by sizeof(int) */
Here the cast is done first, which turns the void pointer into an int pointer, so pointer arithmetic is applied element-wise. Note: if "no output" is displayed instead of an error on the SoloLearn C compiler, try another compiler.1 -
YGGG IM SO CLOSE I CAN ALMOST TASTE IT.
Register allocation pretty much done: you can still juggle registers manually if you want, but you don't have to -- declaring a variable and using it as operand instead of a register is implicitly telling the compiler to handle it for you.
What's more, spilling to the stack is done automatically, keeping track of whether a value is or isn't required so it's only done when absolutely necessary. And variables are handled differently depending on whether they are input, output, or both, so we can eliminate making redundant copies in some cases.
It's a thing of beauty, defenestrating the difficult aspects of assembly, while still writing pure assembly... well, for the most part. There's some C-like sugar that's just too convenient for me not to include.
(x,y)=*F arg0,argN. This piece of shit is the distillation of my very profound meditations on fuckerous thoughtlessness, so let me break it down:
- (x,y)=; fuck you in the ass, I can return as many values as I want. You don't need the parens if there's only a single return.
- *F args; some may have thought I was dereferencing a pointer, but I'm calling F and passing it arguments; the asterisk indicates I want to jump to a symbol rather than read its address or the value stored at it.
To the virtual machine, this is three instructions:
- bind x,y; overwrite these values with Fs output.
- pass arg0,argN; setup the damn parameters.
- call F; you know this one, so perform the deed.
Everything else is generated; these are macro-instructions with some logic attached to them, and there's a step in the compilation dedicated to walking the stupid program for the seventh fucking time that handles the expansion and optimization.
So whats left? Ah shit, classes. Disinfect and open wide mother fucker we're doing OOP without a condom.
Now, obviously, we have to sanitize a lot of what OOP stands for. In general, you can consider every textbook shit, so much so that wiping your ass with their pages would defeat the point of wiping your ass.
Lets say, for simplicity, that every program is a data transform (see: computation) broken down into a multitude of classes that represent the layout and quantity of memory required at different steps, plus the operations performed on said memory.
That is most if not all of the paradigm's merit right there. Everything else that I thought to have found use for was in the end nothing but deranged ways of deriving one thing from another. Telling you I want the size of this worth of space is such an act, and is indeed useful; telling you I want to utilize this as base for that when this itself cannot be directly used is theoretically a poorly worded and overly verbose bitch slap.
Plainly, fucktoys and abstract classes are a mistake, autocorrect these fucking misspelled testicle sax.
None of the remaining deeper lore, or rather sleazy fanfiction, that forms the larger cannon of object oriented as taught by my colleagues makes sufficient sense at this level for me to even consider dumping a steaming fat shit down it's execrable throat, and so I will spare you bearing witness to the inevitable forced coprophagia.
This is what we're left with: structures and procedures. Easy as gobblin pie.
Any F taking pointer-to-struc as it's first argument that is declared within the same namespace can be fetched by an instance of the structure in question. The sugar: x ->* F arg0,argN
Where ->* stands for failed abortion. No, the arrow by itself means fetch me a symbol; the asterisk wants to jump there. So fetch and do. We make it work for all symbols just to be dicks about it.
Anyway, invoking anything like this passes the caller to the callee. If you use the name of the struc rather than a pointer, you get it as a string. Because fuck you, I like Perl.
What else is there to discuss? My mind seems blank, but it is truly blank.
Allocating multitudes of structures, with same or different types, should be done in one go whenever possible. I know I want to do this, and I know whichever way we settle for has to be intuitive, else this entire project has failed.
So my version of new always takes an argument, dont you just love slurping diarrhea. If zero it means call malloc for this one, else it's an address where this instance is to be stored.
What's the big idea? Only the topmost instance in any given hierarchy will trigger an allocation. My compiler could easily perform this analysis because I am unemployed.
So where do you want it, on the stack or on the heap? You want to reutilize any piece of ass, where buttocks stands for some adequately sized space in memory -- entirely within the realm of possibility. Furthermore, evicting shit you don't need and replacing it with something else.
Let me tell you, I will give your every object an allocator if you give the chance. I will -- nevermind. This is not for your orifices, porridges, oranges, morpheousness.
Walruses.16 -
Question for devs who use Intellij IDEA.
How often do you use livetemplates?
I am a new android dev with ADHD and just discovered live templates. They make my life much easier, for example I have shortcuts for generating recyclerview adapter/viewholder/implementation boilerplate code.
In that way I am able to focus on implementation and do my coding like building blocks, rather than memorizing every detail of the implementation. Also, I don't need to go to Stack Overflow and copy-paste basic things multiple times. Even during a live coding interview, for example, having live templates seems awesome; copy-pasting from Stack Overflow would be shameful (I think). Using my own custom shortcuts for live templates seems the best fit for how my brain functions (I suck at memorizing tiny details, but I remember the general idea/flow of a pattern, and I would prefer memorizing what to use and when to use it, instead of all the small details of implementation).
Is getting too dependent on live templates a good practice to get used to? Do other developers frown upon a dev who has dozens of live templates and relies on them instead of writing all code from memory by hand?8 -
Can anyone help me with this theory about microprocessor, cpu and computers in general?
(I used to love programming during school days when it was just basic searching/sorting and OOP. Even in college, when it advanced to language details, compilers and data structures, I was fine. But subjects like COA and microprocessors, which explain the workings of the hardware behind the brain that is a computer, are so difficult for me to understand 😭😭😭)
How does a computer work? All I knew was that when a bulb gets connected to a battery via wires, some metal inside it starts glowing and we see light. No magic involved so far.
Then came the von Neumann architecture, which says a computer consists of 4 things: I/O devices, the system bus, memory and the CPU. I/O and memory interact with the system bus, which is controlled by the CPU. Thus the CPU controls everything, and that's how a computer works.
Wait, what?
Let's take an easy example: a calculator. I pressed 1+2= on the keyboard, it showed me '1+2=' and then '3'. How the hell did that happen?
Then some video told me this: every key on your keyboard is connected to a multiplexer which gives a special "code" to the processor for the key press.
The "control unit" of the CPU commands the RAM to store every character until '=' is pressed (which acts as a kind of interrupt telling the CPU to start processing). RAM is simply a bunch of storage circuits (which can store bits) along with another bunch of circuits which can retrieve that data.
Up till now, the control unit knows that memory has (for eg):
Value 1 stored as 0001 at some address 34A
Value + stored as 11001101 at some address 34B
Value 2 stored as 0010 at some Address 23B
On receiving the code for the '=' press, the "control unit" commands the "ALU" of the CPU to fetch the data from memory, understand it and calculate the result (i.e. the "fetch, decode and execute" cycle).
The ALU fetches the "codes" from memory, which translate to ADD 34A,23B, i.e. add the data stored at addresses 34A and 23B. The ALU retrieves the values at the given addresses, passes them through its adder circuit and puts the result at some new address 21H.
The control unit then fetches this result from the new address and, via the system buses, sends the new value to the display's memory mapped at some port 4044.
The display picks it up and instantly shows it.
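That walkthrough is essentially the fetch-decode-execute loop; here is a toy model of it (hypothetical opcodes and addresses, nothing like a real 8085) that replays the 1+2 example with a tiny memory, a program counter, a control unit that decodes each instruction, and an "ALU" add:

memory = {
    0x34A: 1,                      # value entered for the key '1'
    0x23B: 2,                      # value entered for the key '2'
}
program = [
    ("ADD", 0x34A, 0x23B, 0x21A),  # add the two operands, store at 0x21A
    ("OUT", 0x21A),                # send the result to the "display"
]

pc = 0
while pc < len(program):
    instruction = program[pc]      # fetch
    op = instruction[0]            # decode
    if op == "ADD":                # execute
        _, a, b, dst = instruction
        memory[dst] = memory[a] + memory[b]
    elif op == "OUT":
        print("display:", memory[instruction[1]])
    pc += 1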
My problems:
1. Is this all correct? Is this all that happens?
2. Please expand this more.
How do the system bus, ALU and CPU actually work?
What are the registers, accumulators, flip-flops in the memory?
What are the machine cycles?
What are instruction cycles, opcodes, instruction codes?
Where does assembly language come in?
How does the CPU manipulate memory?
The data bus and the control bus, what are they?
I have come across so many weird words I don't understand: DMA, interrupts, memory-mapped I/O devices, etc. Somebody please explain.
PS: I'm learning about the fucking 8085 microprocessor in class and I can't even relate it to basic computer architecture. I flunked the COA paper, and I now realise why: it's so confusing. :'''(14 -
Finally finished an algo to check an image for groupings of pixels that form a rectangular area. I got the grouping to work on one image, but found it was utterly failing on another. I went through every step of the algo and still could not find the solution. The 128x128 image was working, but the 128x16 image was not. I knew it had something to do with the dimensions. Started thinking it was overflowing a buffer somewhere. So I started putting asserts in the functions that abstracted the buffer access. None of the numbers exceeded the proper bounds. It was close to bedtime so I finally gave up. I was tired. Then I realized it wouldn't be until the next evening that I could look at this again. So I got up again and started looking at the code once more. I had a loop that checked the output of my algo by accessing the buffer directly. It too was not fully filling my temp image that shows how the algo was working. WTF!
Then I finally realized the flaw:
buffer[x+y*height]
And my test loop to test the algo:
buffer[x+y*ymax]
I kept overlooking the error because I was sure it was right. Also my asserts for the functions to access the buffers? They only checked the inputs x and y. So it didn't help that the math was wrong for reading and writing the buffers. It also worked fine on 128x128 images because the width and height were the same.
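For reference, the usual row-major convention that tripped this up: with a width*height buffer, the cell at column x, row y lives at index x + y*width (not x + y*height), which only coincidentally works when the image is square. A minimal sketch:

def index(x, y, width):
    return x + y * width              # row-major: full rows of length `width`

width, height = 128, 16
buffer = [0] * (width * height)
buffer[index(127, 15, width)] = 1     # last pixel, still in bounds
print(len(buffer), buffer[-1])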
It is funny that I struggled with this part. The algo was actually surprisingly easy to formulate. I just looked through every point and checked a buffer to see if that point was already used. If not, then I would attempt to grow a shape from that point in the x and y directions based upon pixel color. This was saved in a structure while growing that point. Then, when that rectangle could not be grown further, the inner loop would continue checking used points again.
I still have work to do to use the data this algo produces. I need to now figure out how to parent the rectangular areas to each other. I will probably use my check buffer to keep track of these rects by an index. Then do adjacent checks to determine parenting. Eventually I will have to extend this algo to 3 dimensions, but that should not be difficult.2 -
I'm thinking about getting a Raspberry Pi 3 or an Odroid-C2.
(Specs at the end)
It's to host a simple PHP server and maybe a GitLab server, both for personal use.
Should I go with the better performance or the better community support?
Odroid specs
System-on-chip used : Amlogic S905
CPU: 1.5 GHz 64-bit quad-core ARM Cortex-A53
Memory: 2 GB LPDDR3 RAM at 912 MHz
Storage: MicroSDHC slot, eMMC module socket
Graphics: Mali-450 MP3
Connectivity: 4× USB 2.0, micro-USB OTG, HDMI 2.0, Gigabit Ethernet (8P8C), Infrared, 40× GPIO ports
Raspberry specs:
SoC: Broadcom BCM2837
CPU: 4× ARM Cortex-A53, 1.2GHz
GPU: Broadcom VideoCore IV
RAM: 1GB LPDDR2 (900 MHz)
Networking: 10/100 Ethernet, 2.4GHz 802.11n wireless
Bluetooth: Bluetooth 4.1 Classic, Bluetooth Low Energy
Storage: microSD
GPIO: 40-pin header, populated
Ports: HDMI, 3.5mm analogue audio-video jack, 4× USB 2.0, Ethernet, Camera Serial Interface (CSI), Display Serial Interface (DSI)8 -
In the past, apps I've written have used a flat-file backend. It's very fast, but obviously clunky to have a big structure of flat files for an app. It ran circles around framework-based RDBMS backends as far as performance is concerned, but again, it was clunky. Managing backups and permissions on tens or hundreds of thousands of small files was no fun. Optimizing code for scaling was fun - generating indexes, making shortcuts - but something was still missing. Early in 2017 I discovered redis: a NoSQL backend that just stores variables and lives almost entirely in memory. Excellent modules and frameworks for every language. It was EXACTLY what I needed, even though I didn't know it. I spent a good deal of time in 2017 converting apps from flat files to redis, and cackled with glee as they became the apps I wanted them to be. Earlier this week, I started building my first app that started with redis instead of flat files, and I can't stop gushing to anyone who will listen. Redis for president!
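For anyone who hasn't tried it, the day-to-day usage really is this small; a minimal sketch with the redis-py client (assuming a Redis server on localhost:6379; the key names here are made up):

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

r.set("user:42:name", "alice")                                  # plain key/value
r.hset("user:42", mapping={"name": "alice", "plan": "pro"})     # one hash per record
r.expire("user:42", 3600)                                       # optional TTL in seconds

print(r.get("user:42:name"))
print(r.hgetall("user:42"))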
-
Not a data loss exactly but a loss indeed.
It was my first week at my first junior developer job, I was just learning git and completely messed it all up. I lost around 3 hours of work.
I didn't want to ask anybody for help (because of that useless-junior feeling, you know...) and wasn't as good at using Google as I am now.
So I re-did all the work. Thankfully, I have a decent memory.
If there's something to learn here, it's this: ask for help when you've used up all your resources and still think you need it. Nobody is going to have a bad opinion of you ;) -
Friendly reminder to trim your services list with msconfig if using Windows. Services that are STOPPED are not DISABLED, and they can be brought back up when just stopped, sometimes remotely.
(This reduces the chances of being bitten by malware that uses the Fax service or similar, as there are a few that have in the past used often-unused services to propagate. It also reclaims a small bit of memory, and the more real memory you have, the less you page out when compiling or similar, which is slow as fuck.)
also for the love of god stop using RDP and use something that's more penetration-proof than a paper plate...11 -
Web browsers removed FTP support in 2021 arguing that it is "insecure".
The purpose of FTP is not privacy to begin with but simplicity and compatibility, given that it is widely established. Any FTP user should be aware that sharing files over FTP is not private. For non-private data, that is perfectly acceptable. FTP may be used on the local network to bypass MTP (problems with MTP: https://devrant.com/rants/6198095/... ) for file transfers between a smartphone and a Windows/Linux computer.
A more reasonable approach than eliminating FTP altogether would have been showing a notice to the user that data accessed through FTP is not private. It is not intended for private file sharing in the first place.
A comparable argument was used by YouTube in mid-2021 to memory-hole all unlisted videos of 2016 and earlier except where channel owners intervened. They implied that URLs generated before January 1st, 2017, were generated using an "unsafe" algorithm ( https://blog.youtube/news-and-event... ).
Besides the fact that, if this reason were true, Google informed its users four years late about a security issue (hint: it almost certainly isn't true), unlisted videos were never intended for "protecting privacy" anyway, given that anyone can access them without providing credentials. Any channel owner who does not want their videos to be seen sets them to "private" or deletes them. "Unlisted" was never intended for privacy.
> "In 2017, we rolled out a security update to the system that generates new YouTube Unlisted links"
It is unlikely that they rolled out a security update exactly on New Year's Day (2017-01-01). This means some early 2017 unlisted videos would still have the "insecure URLs". Or, likelier than not, this story was made up to sound just plausible enough that people believe it.50 -
So this semester we're being taught C/C++ which now seems to me like a distant memory from high school days.
The professor decided to use Visual Studio for something as trivial as variables and pointers, and going through the syllabus, it won't get any harder than this simple stuff.
As much as I find VS awesome, when there is a simpler approach available, why go the tougher way?
The same could easily be achieved with a ~15 compiler or even that 16-bit compiler we used in our high school that couldn't even use a mouse as input. Am I overthinking this? .-.4 -
It started when I was about 10 years old.
My uncle showed me how to display something in the DOS prompt using the echo command in a custom batch file.
A few commands later, I was able to "program" a flip-book of an ASCII ski driver. Each ASCII picture was separated by pressing any key and cls ^^
Aaaaah. Sweet childhood memories!
Later on I used a programming language for beginners on Windows.
This language gave you control of a triangle called "turtle".
My first high-level programming language was Delphi.
Since I had no idea about databases, I created a pseudo-database of Magic: The Gathering cards. Each card had its very own Windows form filled up completely with an uncompressed image object displaying the chosen card modally. *sigh*
I scanned each card using a feed scanner.
In the end, my application consisted of 200 card images and forced my PC to swap the required memory out to my hard disk.
Boy o boy. I was such a noob! ^^
Over the years I discovered and fell in love with a lot of languages (JSP, Java(Script), C#, PHP, ...) and concepts (MVVM, MVC, clean architecture, TDD, ...)! ;) -
When I built a scripting language for game bootup and shutdown, to load and free needed files and memory, but then used that language to debug a completely separate program.
Got me excited that I was able to use my code for complete opposites of tasks -
So I have a question to anyone familiar with the General Transit Feed Specification...
Why is the data provided in text files? Is there not a way to format the data to allow for random access to it?
Like, I'm currently writing a transit app for a school project, and as far as I can tell, the only way to get all the stops for a specific route is to first look up all trips in a route, then look up all the stop ids that are associated with a trip in stoptimes.txt (while also filtering out duplicates, since the goal is to get stop ids, not specifically stop times), and then look up those stop ids in the stops.txt file.
The stoptimes file alone is over 500,000 lines long; is there a way better way to parse the data that I'm not aware of? Currently I'm just loading the entire stoptimes file into a data structure in memory, because the extra bit of RAM used seems negligible compared to the load time I'm saving...
Would it be faster if I just parsed all the data once and threw it into a database? (And then updated the database once a month when the new data comes in?)3 -
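A dictionary index built once at load time is the usual compromise before reaching for a real database; this sketch assumes the standard GTFS column names (trip_id and stop_id in stop_times.txt), so adjust it for the actual feed:

import csv
from collections import defaultdict

stops_by_trip = defaultdict(list)
with open("stop_times.txt", newline="", encoding="utf-8-sig") as f:
    for row in csv.DictReader(f):
        stops_by_trip[row["trip_id"]].append(row["stop_id"])   # kept in file order

def stops_for_route(route_trip_ids):
    # collect each trip's stops, de-duplicated, keeping first-seen order
    seen, result = set(), []
    for trip_id in route_trip_ids:
        for stop_id in stops_by_trip.get(trip_id, []):
            if stop_id not in seen:
                seen.add(stop_id)
                result.append(stop_id)
    return result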
Sydochen has posted a rant where he is not really sure why people hate Java, and I decided to publicly post my explanation of this phenomenon, from my point of view.
So there is this quite large domain, on which one or two academical studies are built, such as business informatics and applied system engineering which I find extremely interesting and fun, that is called, ironically, SAD. And then there are videos on youtube, by programmers who just can't settle the fuck down. Those videos I am talking about are rants about OOP in general, which, as we all know, is a huge part of studies in the aforementioned domain. What these people are even talking about?
It is absolutely obvious that there is no sense in building software in a linear pattern. Since Bikelsoft has conveniently patched consumers up with GUI-based software, the core concept of which is EDP (event-driven programming or, alternatively, at least an OS event queue), the completely functional, linear approach in such an environment does not make much sense in terms of the maintainability of the software. Uhm, raise your hand if you ever tried to linearly build a complex GUI system in a single function call on GTK, which does allow you to disregard any responsibility separation pattern of SAD, such as the long-loved MVC...
Additionally, OOP is mandatory in business because it does allow us to mount abstraction levels and encapsulate actual dataflow behind them, which, of course, lowers the costs of the development.
What happy programmers are talking about usually is the complexity of the task of doing the OOP right in the sense of an overflow of straight composition classes (that do nothing but forward data from lower to upper abstraction levels and vice versa) and the situation of responsibility chain break (this is when a class from lower level directly!! notifies a class of a higher level about something ignoring the fact that there is a chain of other classes between them). And that's it. These guys also do vouch for functional programming, and it's a completely different argument, and there is no reason not to do it in algorithmical, implementational part of the project, of course, but yeah...
So where does Java kick in you think?
Well, guess what language popularized programming in general and OOP in particular. Java is doing a lot of things in a modern way. Of course, if it's 1995 outside *lenny face*. Yeah, fuck AOT, fuck memory management responsibility, all to the maximum towards solving the real applicative tasks.
Have you ever tried to learn to apply Text Watchers in Android with Java? Then you know about inline overloading and inline abstract class implementation. This is not right. This reduces readability and reusability.
Have you ever used Volley on Android? Newbies to Android programming surely should have. Quite verbose boilerplate in google docs, huh?
Have you seen intents? The Android API is, to put it mildly, messy with all the support libs and Context class ancestors. Remember how many times the language has helped you properly orient yourself in all of this hierarchy, when an overloaded method declaration requires you to use 2 lines instead of 1. Too verbose, too hesitant, distracting - that's what the language and the API are. Fucking toString() is hilarious. Reference comparison is unintuitive. Obviously poor practices are not banned. Ancient tools. Import hell. Slow evolution.
C# has ripped Java off like an utter cunt, yet it's a piece of cake to maintain solid patternization and structure, and keep your code clean and readable. Yet C# 6 was already okay, featuring optionally nullable fields and safe optional dereferencing, while we finally got lambda expressions in Java 8, in 20-fucking-14.
Java did good back then, but when we joke about dumb indian developers, they are coding it in Java. So yeah.
To sum up, it's easy to make code unreadable with Java, and Java is a tool with which developers usually disregard the patterns of SAD. -
Didn't know what this was, dad caught it this morning in the backyard... Apparently for the first time.
Been watching Pokemon Journeys lately so my first thought was "you caught a new Pokemon" and also "can you name that Pokemon?"
You take a guess here but I did a Google reverse image search and it got it 100%, even with the cage. So I was like wow...
I've used it rarely for other pictures before and my memory says it never came close.
Or maybe that's what all those captchas have been used for.... Though usually they are all for cars, roads and traffic lights....14 -
Urgh... No exceptions in Rust annoys me. Now you only have the choice between "this didn't work, please handle this error, thank you ^-^" and "you fool, prepare for annihilation". So basically if anything remotely serious happens, your program is dead and there's nothing you can do about it.
I don't get why people have this hate for exceptions. Every time a new language gets made it's always either "ew, it has exceptions" or "it's so nice, it doesn't even have exceptions". NOOO! They can deal with serious situations in the best possible way, and they can be statically checked (so no "but they're so complex and unpredictable" stuff, please). If you can expect an error, exceptions shouldn't be used in the first place (even though they are no worse than Option return types or whatever, just different), but in cases where it's impossible to predict an error, they really shine. And not having them makes your language worse.
If a device driver accesses illegal memory it should throw an exception, so instead of the computer shitting the bed, first the offending function has a chance to resolve the problem at its root, then a few functions up the call stack the general control functions of the device driver can handle it and restart the operation if applicable, and even if the driver fails to handle it, the OS can jump in and restart the driver, log an error and do whatever. It's absolutely beautiful: this hierarchical ramp from near the accident site up to high-level operations code ensures the error can be caught at the right level of abstraction without introducing a lot of boilerplate. If everything fails and nobody can handle it, *then* the program or kernel or whatever can panic.4
-
Ok, so I just changed my keyboard layout to Neo2 because QWERTZ can suck my balls. Looking quite good so far. I've been writing some smaller texts and it looks like you can get used to it quite fast (I also changed because I wanted to learn to write with 10 fingers anyway. Not that I was writing slowly before, but why not).
The bad thing: all shortcuts (vim etc.) feel strange because I have to betray my muscle memory now. So I thought I might as well just switch to emacs now. I'd have to learn it from the beginning, but it might be worth it.
Did any of you have experience with Neo (the German layout), and what editors did you use?5 -
During my brief tenure as the lead mobile developer for a logistics company, I had to manage my stacks between native Android applications in Java and native apps on iOS.
Back then, Swift was barely coming into version 3, and as such the transition was not trustworthy enough for me to discard Obj-C. So I went with Obj-C and kept my knowledge of Swift in the back. It was not difficult, since I had always liked Obj-C for some reason. The language was what made me click with pointers and understand them well enough to feel more comfortable with C, since Obj-C is a strict superset of it. It was enjoyable, really, and making apps for iOS made me appreciate the ecosystem that much better and realize the level of dedication the engineering team at Apple put into their compilation toolchain. It was my first exposure to ARC (Automatic Reference Counting) as a "form" of garbage collection, per se. The tooling in particular was nice; normally with Xcode you have a 50/50 chance of it being great or shit. For me it was a mixture of both really, but the number of crashes or unexpected behaviors was FAR lower than what I had on Android, back when we still used Eclipse and even when we started to use Android Studio.
Developing iOS apps was also what made me see why iOS apps have that distinctive shine and why their phones require less memory (RAM). It was a pleasant experience.
The whole ordeal also left me with a bad taste for Android development. Don't get me wrong, I love my Android phones. But I firmly believe that unless you pay top dollar for an Android manufacturer such as Samsung, Motorola or LG, you will have lag galore. And man..... everyone that tried to prove me wrong always had to make excuses later on (no, your $200-$300 dollar Android device just didn't cut it, my dude).
It really sucks sometimes for Android development. I want to know what Google got so wrong that they made the decisions they made, in order to make people design other tools such as React Native, Cordova, Ionic, PhoneGap, Titanium, Xamarin (which is shit imo), Codename One and many others. With iOS I never considered going for something other than native, since the API just seemed so well designed and far superior to me from an architectural point of view.
Fast forward to 2018 (almost 2019) and Google has been talking about Flutter for a while, making it seem like they are fixing how they want people to design apps.
You see. I firmly believe that tech stacks work in 2 ways:
1 people love a stack so much they start to develop cool ADDITIONS to it(see the awesomeios repo) to expand on the standard libraries
2 people start to FIX a stack because the implementation is broken, lacking in functionality, hard to use by itself: see okhttp, legit all the Square libs, butterknife etc etc etc and etc
From this I can conclude 2 things: people love developing for IOS because the ecosystem is nice and dev friendly, and people like to develop for Android in spite of how Google manages their API. Seriously Android is a great OS and having apps that work awesomely in spite of how hard it is to create applications for said platform just shows a level of love and dedication that is unmatched.
This is why I find it hard, and even mean to call out on one product over the other. Despite the morals behind the 2 leading companies inferred from my post, the develpers are what makes the situation better or worse.
So just fuck it and develop and use for what you want.
Honorific mention to PHP and the php developer community which is a mixture of fixing and adding in spite of the ammount of hatred that such coolness gets from a lot of peeps :P
Oh and I got a couple of mobile contracts in the way, this is why I made this post.
And I still hate developing for Android even though I love Java.3 -
Why TF does nodejs just eat 100 MB of RAM for a simple application with ONE websocket connection? I've tried taking heap snapshots and memory allocation timelines and used memwatch-next, but with no result AT ALL, since the heap stays small while the RSS memory grows like there's no tomorrow.
-
Ok ok.. I used a German keyboard, so Y and Z are switched. I've never seen a picture of Jason Mraz, but I really like his music, so I wanted to YouTube him.. and my muscle memory did this.2
-
About to start writing a report for my programming languages course; I'm writing it on Go (Golang). If anybody has any good resources or information on Go, let me know!
The report extends into the history, paradigms, features, memory management system, and anything else I can possibly find on this language. I can find some pretty decent references in the footer of the Wikipedia article, but I wanted to see if anybody who has actually used Go had anything they'd like to share.
Thanks :)1 -
Hi guys, I'm looking for a used Chromebook on eBay to install Linux Mint on. Budget under $150. Is there any model you'd recommend? Is 2 GB of memory enough? Many thanks.5
-
TL;DR When talking about caching, is it even worth trying to be as memory efficient as possible?
Context:
I recently chatted with a developer who wanted to improve a framework's memory usage. It's a framework for creating Discord bots, providing hooks for events such as message creation. He compared it to 2 other frameworks, where it ranked last with 240 MB memory usage for a bot with around 10.5k users iirc. The best framework memory-wise used around 120 MB, all running with the same number of users.
So he set out to reduce the memory consumption of that framework. He alone reduced the memory usage by quite a bit. Then he wanted to try out a TTL for the cache, or rather a cache with expiration times, adding no overhead besides checking at every interval whether there are records that should be deleted. (Somebody in the chat called that sort of cache a meme. Would be happy if you could also explain why that is so 😅.)
Afterwards the memory usage dropped down to 100 MB after around 3-5 minutes.
The maintainer of the package won't merge his changes, because some of them introduce stuff that might be troublesome later on, such as modifying the default argument for processes, something along those lines. Haven't looked at these changes.
So I'm asking myself whether it's worth saving that much memory. Because at the end of the day, it's cache. Imo a cache can be as big as it wants to be, but it should stay within bounds and of course return memory if needed. Otherwise there should be no problem.
But maybe I just need other people's points of view to consider. The other dev's reasoning was simply that "it shouldn't consume that much memory", which doesn't really help, so I'm seeking you guys out😁 -
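For reference, the "cache with expiration times" being discussed is usually little more than this; a minimal sketch where every entry carries an expiry timestamp, reads ignore expired entries, and a periodic sweep frees their memory:

import time

class TTLCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}              # key -> (value, expires_at)

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        item = self._store.get(key)
        if item is None or item[1] < time.monotonic():
            return default            # missing or already expired
        return item[0]

    def sweep(self):
        # call this from a timer/interval to actually release memory
        now = time.monotonic()
        for key in [k for k, (_, exp) in self._store.items() if exp < now]:
            del self._store[key]

cache = TTLCache(ttl_seconds=300)
cache.set("user:123", {"name": "example"})
print(cache.get("user:123"))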
I don't really understand all the love Rust gets. Its syntax, while better than C++'s, isn't better than C's.
You can make memory-safe programs with C if you know how to manage memory, and you should only do it if you know how. C/C++ also has the bigger ecosystem.
C23 is way better than any Rust standard.
PS: I used both C/C++ and Rust39 -
Bullshittery continues. This time around the culprit is absolutely innocent: ClamAV is the root cause. For once it's not an incompetent idiot, but a piece of software. IDK if that makes me happy or upset.
So our email server that I configured and took care of died. RIP. Damn, better put it back together ASAP. So I'm under pressure, while still pissed at everything I ranted about before (actually my last 2 rants were throttled; all of this happened within the past 60 minutes, but devRant rate limiting), and I start auditing logs. You can imagine, we kinda need it NOW, and it's the second time in the last month ClamAV is pulling stunts and the MTA (properly) refuses to work without antivirus. So, pressurized, I look at the logs to see what the fuck went wrong.
clamav deamonize() failed - cannot allocate memory
Hmm. Interesting, but sounds like bullshit. I know the server is quite micro because they wanted to save on costs as much as possible, but it has well over half a gig of free RAM just before it crashes (like 800 MB) with that message. Is it allocating almost a gig in one call or what? Looked carefully at trusty htop while it was starting, and indeed, it suddenly just dies with quite a bit of RAM free, almost as much as it already weighs. And I remember booting it up when I was configuring it, and it had a fair bit of headroom.
Google, help me, friend... Okay, great, so apparently at some point ClamAV loads the virus DB into RAM (dafuq?) and then forks, which causes a spike of 2x the RAM usage, and then immediately frees it up.
Great, that sounds like a great design decision... At least I know I can just slap on a swap file, restart it and call it a day.
It worked; the swap file is almost empty (15 megs used, 900 megs of free RAM, whatever).
That leaves me wondering: who figured it was a good idea to load the DB into RAM? It pretty much means ClamAV will eat a little more RAM with each virus DB update, and that millisecond "double RAM" spike will confuse innocent people who just wanted to run ClamAV; it worked for the last *long period of time* and now crashes without warning, without any changes to the configuration.
Maybe there is logical explanation, I want to know it.8 -
How did mid-2000s computer users get along with just 1 GB of RAM or less?
As of today, anything less than 8 GB of RAM seems impractical. A handful of tabs in a web browser and file manager can quickly fill that up.
Shortly after booting, 2 GB of RAM are already eaten up on today's operating systems.
When I occasionally used an older laptop computer with 6 GB of RAM (because it has more ports and better repairability than today's laptops; before upgrading the memory), most of the time over 5 GB were in use, and that did not even include disk caching.
It appears that today's web browsers are far more memory-intensive than 2000s web browsers, even if we do similar things people did in the 2000s: browsing text-based pages with some photos here and there, watching videos, messaging and mailing, forum posting, and perhaps gaming. Tabbed browsing already was a thing in the 2000s. Microsoft added tabs to their pre-installed browser in 2006, back when an average personal computer had 1 GB of RAM, and an average laptop 512 MB!
Perhaps a difference is that people today watch in 720p or 1080p whereas in the 2000s, people typically watched at 240p, 360p, or 480p, but that still does not explain this massive difference. (Also, I pick a low resolution anyway when mostly listening to a video in background.)
One could create a swap file to extend system memory, though that is not healthy for an SSD in the long term. On computers, RAM is king.14 -
I need to tell you the story of my MOAB (Mother of all bugs).
I need to write some stuff in C (which I am fairly used to) and have a function that allocates memory for a Matrix on the heap. The matrix has a rows and a columns property and an associated data array, so it looks like this:
struct Matrix{            /* requires <stdint.h> for uint8_t */
    uint8_t rows;
    uint8_t columns;
    uint8_t data[];       /* flexible array member, sized at allocation time */
};
I allocate rows*columns + 2 bytes of memory for it.
I also have a function to zero it out, which does something like this:
for(int i = 0; i < rows*columns; i++){
    data[i] = 0;          /* clear every cell of the data array */
}
Let's come to the problem:
On my Mac the whole thing works and passes all tests. Then we tried the code on a Linux machine and suddenly the code crashed in various places: sometimes a realloc got an invalid pointer, sometimes free got an invalid pointer, and basically the code crashed at arbitrary points, randomly.
I was confused af. Did I really make THAT many errors?
I found out that all errors occurred when testing my matrices, so I looked into it more and observed it through the debugger.
Eventually I came to the function that zeroes out my matrix, saw it went unusually high, and wondered if my matrix really was that big.
Then I saw it.
The matrix wasn't initialised yet.
It had arbitrary data that was previously in the heap.
It zeroed out a huge chunk of the heap space.
It literally wrote a zero to a shitload of addresses, which invalidated many pointers.
You can imagine my facepalm2 -
I never thought in my life that I would say this sentence one day - but:
Today I switched back to VS Code because it uses less memory than IntelliJ.
Context: Only temporary, very resource hungry dev environment, TypeScript, IntelliJ used >4.5 GB of ram and started lagging.5 -
Built the most generic file importer.
So a customer had his SAP system giving us some 5 million barcodes in a CSV which we needed to parse. But as there could be different file types, and I thought the handling would always involve the same steps, I made them configurable through function pointers. I did not want to make it as spooky as the rest of the codebase, where the function pointers were buried deep in some shared memory configs which might even change at runtime; instead I statically used the member functions of my class, just to poke fun at the ugly C++ syntax of member function pointers. I still shudder at the thought that some poor soul now has to maintain that code.
(For the actual parsing I actually used a one liner in awk which was churning through the records in one minute which was faster than the SAP guys seemed to be accustomed to.) -
MOTHERFUCKING Stripe !!!
They changed their dashboard layout!! Yes, let's move the top navbar to the left and change the whole layout. Oh yeah, and let's make the page less responsive than before.
I seriously don't understand why Stripe had to do this. Their dashboard layout was completely fine and useful, but noooo, let's change the navbar from the top to the left.
Because of these fucks, I now have to train my muscle memory to get used to this SHITTY ASS layout.
Fuck this. FUCK ALL OF THIS.2 -
GLFW is the cleanest, best documented, most convenient API for creating and handling windows on Linux and Windows that I've ever used.
The only thing that bugs me is that valgrind detects memory leaks on it.4 -
What should I use for making an app which needs to run on both Android and Windows, and maybe iOS? It is pretty simple. It mainly needs notifications, network and file access, should not cost an arm and a leg, should use less than 1 GB of memory at a time, and being usable for building a backend is a plus. Being able to be used commercially is a plus too. Also, please suggest something that does not have a steep learning curve3
-
Holy crap, Meta Developer Connect keynote. Amazing innovation. This is what Apple **used to be**
Granted, the hardware is not as elegant as Apple's, but the cost is 1/10 and the capabilities are close to, the same as, or (with Llama) exceed what Apple is offering.
Now here is the gut punch: they figured out that the mobile app build system needs to build AR/VR/MR apps. That was Apple's edge.
As a developer, I am not enamored with Swift, and it is pretty clear that if I have to choose between switching to a niche language like Swift or switching to dev on Android to target new Meta hardware and AI... well... let's just say I think Swift is crap from a language standpoint, and I suspect it is the reason Apple's hardware uses so much more memory, battery and storage than it should. At the same time, Meta's Orion runs on a goddamn battery in an early pair of glasses. My AVPs have a huge brick.
#define kApple kGigaBloat
If I were Apple I would be shitting my pants watching this Meta presentation.6 -
I wasted a fucking 30 minutes finding the right editor for PlantUML on OS X. I do not like Atom because it eats up memory, and Brackets was not ready to install the PlantUML extension. In the end, I used Atom to finish a five-minute job. #FML
It seems, mostly we waste our time in deciding which weapon to use !!! Any one who faced the same issues ???2 -
I am building a synth program for producing waveforms such as binaural beats. The programs I have used for this in the past have been mediocre.
As part of that project I am working on a real-time scope to visualize the waveforms. It is fun to learn how to streamline moving data between parts of the application. Right now there is a lot of unnecessary data copying and resizing of vectors going on, so I am reading some books on high-performance C++ to learn how to do this better. As part of this I am thinking about building a circular buffer so the vector is never resized and is always in contiguous memory.
Just plain fun!4 -
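The circular-buffer idea translates directly; here is the concept sketched in Python (the real thing would be a C++ class over a contiguous array): a fixed-capacity buffer where the write index wraps around with modulo arithmetic, so samples are overwritten in place and no reallocation or copying ever happens:

class RingBuffer:
    def __init__(self, capacity):
        self.buf = [0.0] * capacity    # allocated once, stays "contiguous"
        self.capacity = capacity
        self.write = 0
        self.count = 0

    def push(self, sample):
        self.buf[self.write] = sample
        self.write = (self.write + 1) % self.capacity
        self.count = min(self.count + 1, self.capacity)

    def latest(self, n):
        # return the last n samples in order, e.g. for the scope to draw
        n = min(n, self.count)
        start = (self.write - n) % self.capacity
        return [self.buf[(start + i) % self.capacity] for i in range(n)]

rb = RingBuffer(8)
for s in range(12):
    rb.push(float(s))
print(rb.latest(8))   # [4.0, 5.0, ..., 11.0]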
Tell me guys, what would you prefer:
function a(){
..
b(..)
..
b(..)
..
}
function b(p1,p2,p3,p4,p5,p6){.
...
}
or
function a(){
..
b(..)
..
b(..)
..
}
function b(
p1,
p2,
p3,
p4,
p5,
p6
){
...
}
If you read this rant before expanding it, you got the complete context of what function a is, that it's calling b 2 times, and how function b looks.
If, instead of the first option, I had used the 2nd block, you wouldn't even know the 2nd param of function b without expanding this rant.
My point?
I prefer keeping unnecessary info on one line, and a lot of linters disagree by splitting up the code. And most importantly, my arrogant TL disagrees, saying he prefers the split code "for readability" and because "he likes code this way, old-eng1 likes this and old-eng2 likes this".
Why tf does an IDE have a horizontal scrolling option available when you are too stupid to use it?
Ok, I know some smartass is going to point out that I too can use vertical scrolling, but hear me out: I am optimising this!
Case 1: a function with 7 params is NOT split into 7 lines. Let's calculate the effort to remember it.
- Since all params could have similar characteristics (they will be of some type, might have defaults, might be a suspendable/async function etc.), each param will take a similar amount of memory-effort points, say 5sp each.
- Total memory effort = 5sp * 7 = 35sp.
- say a human has 100 sp of fast memory storage, he can use the remaining 65 sp for loading say 5 small lines above or below.
- But since the 5 lines above have already been read and are still visible on screen, they won't need to be loaded again and again, and we can just check the lines below.
- Thus we are able to store 65+35+65 = 165sp, or about 11 lines of code, in our fast memory for just 100sp of brain storage.
Case 2: a function with 7 params IS split into 7 lines.
- In this case all lines are somewhat similar: 5sp for param lines as they are still similar, which implies the same 35sp for storing the current function and params.
- The remaining 65sp can only be used to store the next 5 lines of 13sp, as the previous code is no longer visible.
- Plus, if you wanna refresh the code above, you gotta scroll, which will remove the bottom code from the screen, and now your 65sp from the bottom code is overwritten by 65sp of top code.
- Thus, at a time, you are storing only 6 lines' worth of code info. This makes you slow.
this is some imaginary math, but i believe it works10 -
Had to face the music and make the jump from Ubuntu 22.04 to Fedora 36. And I have to say it's been night and day so far. Everything is snappier. Yeah, dnf is very slow in comparison to apt, but there are changes you can make to speed things up, and the nifty terminal interface is a great change and helps make up for the speed issues.
It came with Python 3.10 installed, the GNOME and GTK4 apps are nice, fluid and up to date, and the random slowdowns, freezes and restarts of Ubuntu running the same version of GNOME are nonexistent.
For the life of me I can't see why Ubuntu would drop the ball like this. I have a Dell XPS 13 Developer Edition and this is the best it's ever run. Even Wi-Fi connectivity is better, despite the crap Wi-Fi card that ships with this machine.
I want to love this version, and while it is the most graphically appealing and functional version of Ubuntu I've ever used, the memory management issues make it damn near unusable.9 -
Started developing an interest in programming after creating Warcraft 3 maps using the World Editor. I still remember those days when I used the GUI trigger editor, when I didn't even know the difference between local and global variables, and prevented memory leaks by using leak checks and so on. Creating new skills using triggers was so exciting. Then I discovered JASS, but I didn't really learn or use much of it. Now I'm working in Unity3D and it is awesome!2
-
!comforting
TL;DR - I’ve done some thinking about operating systems and sticking to one
Mk
so I, like many of you, have seen far more than my fair share of “X operating system is perfect for it all, so don’t use Y operating system because it’s just awful” posts.
Over this week I've really done some thinking and experimenting with multiple devices, OSes and programs for various tasks. People coming from Windows over to Linux (like myself) tend to diss Windows (rightfully so for the most part, but still). I've also noticed that the Android vs. Apple debate can get heated among users.
Listen guys,
iOS has its shortcomings obviously, UI being kind of a big one; but no one can deny that Apple shoves some of the nicest hardware into their devices. Yes, this stuff is pricey as hell obviously, but the new Macs come with an i9 and quite a bit of memory as well. Apple devices tend to have longer lasting batteries too - I can't count the times when I've just turned on my mobile hotspot and stuck my Android in my pocket to use my iPhone (it's a wifi-only 5s). The applications run nicely on Apple hardware.
I couldn't learn even half as much programming as I do on my Android, though; Termux is a godsend, and I'm able to run and test scripts right there in the palm of my hand. Can't get that on an iPhone.
Some of my favorite game developers only develop for Windows; I'm dual booting for that sole reason (Warframe and the Epic Games launcher don't run properly through Wine).
Just boil it down inside for a second: you might have come from a more "user friendly" operating system to learn on one that is less so - whether you wanted the freedom and wiggle room for customization, or just a more developer-friendly working environment (God bless conky and its devs) - so you didn't have to be locked into one way of seeing things. Putting a previously used OS down directly violates that thought process, and at that point you're just another Windows hater, or Arch junkie, or whatever. I think we need to be open to appreciating the pros of every system, even if we almost never use some of them, and we should try not to put down other devs-to-be or csci/sec enthusiasts because of that either. -
Okay, so I had an object consisting of tables (basically classes) and structs (classes with only scalars as their properties).
I was about to serialize the object with vectors of classes and structs and wrote some nice tests for it, wondering why they failed to validate the data after deserialization and why I only got garbage for the vectors of structs whilst the tables worked just fine.
Turns out there is an undocumented function called CreateVectorOfStructs which has to be used for structs instead of the regular CreateVector ...
There go three hours of blaming memory issues and running Valgrind over and over again ... -
You know I'm tired of the fucking memory noise of some twisted fuck working for twisted fucks laboring off some set of idiotic arbitrary stereotypes trying to get me to do the same fucking things by baiting me like a fucking dog
I want people to live their fucking lives and the social problems in this world to just be solved
None of this in last generation or twisted dumb fucks and their insensible number games that were used to program them
I want everything cleaned up and fixed and evil people to cease being evil and no more stupid loop -
Tried running our Selenium test suite on Firefox during the nightly build. Came in this morning with no nightly build. Turns out the teardowns weren't killing the Firefox drivers and they used up all the memory on the build server. 😐
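For reference, a minimal sketch of the kind of teardown that keeps this from happening - assuming Mocha and the selenium-webdriver npm package, which may not match the actual stack here:

```typescript
import { Builder, WebDriver } from "selenium-webdriver";

describe("nightly suite", function () {
  let driver: WebDriver;

  beforeEach(async function () {
    // one fresh Firefox instance per test
    driver = await new Builder().forBrowser("firefox").build();
  });

  afterEach(async function () {
    // quit() shuts down the browser and its driver process even when the
    // test failed, so orphaned Firefox instances can't pile up and eat
    // the build server's memory overnight
    if (driver) {
      await driver.quit();
    }
  });

  it("loads the landing page", async function () {
    await driver.get("https://example.com");
  });
});
```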
-
Recently joined a new Android app (product) based project & got the source code of the existing prod app version.
A product's source code must be easy to understand so that it can be supported long term. In contrast, the existing source structure is very difficult to understand.
The package structure is flat - only 3 packages: ui, service, utils. No module-based grouping of classes.
No memory is ever released, so every screen launch adds new memory leaks, on & on.
Too much duplication of code. Some lazy developer in the past had not even made wrappers to avoid direct usage of core classes like SharedPreferences etc., so the same 4-5 lines were written in each place.
Too many if-else ladders (4-5 blocks) & unnecessary repetition of the outer if condition inside the inner if condition. It looks like the owner of this nested-if implementation has trust issues, as if that person thought the computer 'forgets' about the outer if when inside the inner if.
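The pattern looks roughly like this (a made-up TypeScript sketch just to illustrate the shape; the actual code is an Android app):

```typescript
type Session = { isActive: boolean; user: { isAdmin: boolean } };

function showAdminMenu(): void {
  console.log("admin menu");
}

// the trust-issues version: the outer condition is repeated inside the inner if
function render(session: Session | null): void {
  if (session != null && session.isActive) {
    if (session != null && session.isActive && session.user.isAdmin) {
      showAdminMenu();
    }
  }
}

// same behaviour, one condition, nothing repeated
function renderFlat(session: Session | null): void {
  if (session != null && session.isActive && session.user.isAdmin) {
    showAdminMenu();
  }
}
```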
Too much misuse of broadcast receivers to track activities' state, in the era of activity and app lifecycle-aware Android libraries.
Sometimes I wonder why people waste soooo... much effort in the wrong direction & why they can't just use a library?!!
These things were found without even deep diving into the code; I don't know what other horrific things may come out of the closet.
This same app is being used by many companies in many different fields like banking, finance, insurance, govt. agencies etc.
Sometimes I'm surprised how this source passed review & reached production. -
spent a few days trying to track down the cause of a thermal shutdown in my workstation. intel 4790k with no overclock would spike to 95C on one core (core1) whenever maxing out all 8 threads, be it real work, mprime, anything with 100% cpu being used. I quadrupled my RAM from 8gb to 32, because it's cheap and I'd like to have all data in memory sometimes, not because I thought that was the problem. I reseated my watercooling block. I checked out the PSU. I unplugged all unnecessary peripherals, drives, etc. It turned out to be a bug in the Gigabyte mobo's BIOS (causing temps to be read incorrectly, i think, still not exactly sure...). Updated from version 5 to 10 and poof, now temps are back in the high 50s at full load. it only took 2 days to figure out and i think i learned something
-
I have seen references to API keys in several places. I have set up a few for various web services. However, I don't have a firm understanding of how they are protected (or not protected) from being copied and used by apps other than my own. I read a quick blurb from Google that said to use regular authentication over API keys because they can be copied.
So my questions are: Are API keys just a bad way to subscribe to services? Is there a way to protect them from being discovered? Maybe the app logs into an auth point for your services and is served the key to use with other services? But this key could still be gleaned from memory. Are API keys going to go away, maybe in deference to things like OAuth? -
This question might make you lose a brain cell because of stupidity in the question. Read with caution
Is there a way to compile a game for Windows from Linux in Unreal Engine? I did google some posts, but the answers were either to use a virtual machine (which will not be done), to use the theoretical method of using mingw (which the forum posts state will be tricky business), or to use a Windows machine. I have dual booted Windows and Linux on my machine.
However, since the machine has a 512 gb ssd, most of the storage space is devoted to Unreal Engine, which takes 47 gigs in itself, and since I have a lot of programs installed, I have a usable 20 gigs left out of a 145 gig partition. Windows has around 318 gigs of storage, but I have 100 gigs free at most. So after installing the Windows SDK, Visual Studio with extensions, Unreal Engine and some other stuff, I don't have much space left for myself. I need that much space since I install a lot of games to my ssd, so now I can't load my bigger projects for playing on Windows. I could use my hdd, which is mostly used for backups and 100+ gig stuff. The hdd's are of course far slower than ssd's, which shouldn't be a problem; however, last time I used Visual Studio it ate more than 2 gigs of ram for a solution, meaning the compiler has very little memory left for itself to actually compile, so for any large files the hdd becomes more of a bottleneck.
Oh and I can't upgrade my ssd's or ram because I don't have enough money.
Thanks for the answers in advance. -
Backend wise
After a year and a half of working with what I love (nodejs microservices and a bit of python), I have to update my PHP skills and refresh my memory on the latest Laravel 😕 (I used it as an authentication/authorisation and REST backend for a react native app in early 2016 and have not touched it since).
Passive job hunting sux, and yes, PHP ain't my thing anymore 😔 I mean, I have nearly 6-8 years of exp in it, but given the choice... 😒
I used to love it (so many good memories with cakephp 😌🙄 it taught me a lot early in my career) before I discovered the functional programming paradigm and got a deep understanding of JS -
One nightmarish project that was doomed from the beginning had me as the sole developer. I could hardly sleep when we began testing on a separate test system, but with (nearly) all the config stored in shared memory and copied from the production system, I dreaded, half awake, that the production server database connection was still configured in the test system and that it was shooting all its test data repeatedly to prod.
Finally drove to the company in the middle of the night at 4 o'clock. Checked everything was OK, then tried to sleep 3 hours before the start of the work day.
This system also had the most hideous memory corruption in some shared memory that was used across several processes and should have been thoroughly protected by a mutex, but somehow, sometimes, this crucial map, which was used to speed up access to all the customer data, just contained garbage.
Still haunts me to this day. (Like xkcd's unresolved tension of a non-matching parenthesis - an unresolved bug. -
Not sure if this is a valid cause for a rant, but my memory stick went bad after being used for just 6 months. Bought this memory kit this summer on computeruniverse. Now Windows reports that there are damaged pages on the 1st stick, though the 2nd stick is fine. Patriot Viper with small heatsinks...
What to say... In ye olde days DDR3 worked for years and never went bad 🤔
When underscore.js is embedded so hard in your muscle memory that you use return to continue a for loop because you're used to _.each()
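A tiny sketch of the gotcha (made-up array, plain underscore):

```typescript
import * as _ from "underscore";

const nums = [1, 2, 3, 4, 5];

// inside _.each, `return` only exits the current callback invocation,
// so it effectively behaves like `continue`
_.each(nums, (n) => {
  if (n % 2 === 0) return; // skip evens
  console.log(n);
});

// in a plain for loop the same muscle-memory `return` would bail out of the
// whole enclosing function - what you actually want here is `continue`
for (const n of nums) {
  if (n % 2 === 0) continue;
  console.log(n);
}
```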
-
I went to create an attributions page for my node.js app I am working on. I just had it parse the packages used. Ran out of memory trying to display them in a browser.
Man I included 1 (uno) package and the dependencies are crazy. First thing I did was install license-checker to make sure I wasn't shooting myself in the foot with some random GPL/LGPL package.
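For what it's worth, this is roughly the shape of that check - assuming license-checker's documented init() callback API, with made-up file names and a simple GPL pattern match:

```typescript
import * as checker from "license-checker";
import { writeFileSync } from "fs";

// walk node_modules and collect license info for every transitive dependency
checker.init({ start: "." }, (err, packages) => {
  if (err) {
    console.error("license scan failed:", err);
    return;
  }

  // dump everything to a file instead of trying to render thousands of
  // entries in a browser tab
  writeFileSync("attributions.json", JSON.stringify(packages, null, 2));

  // quick sanity check for copyleft licenses before shipping
  const flagged = Object.entries(packages).filter(([, info]) =>
    /GPL/i.test(String(info.licenses ?? ""))
  );
  console.log(`${Object.keys(packages).length} packages scanned, ${flagged.length} flagged as GPL/LGPL`);
});
```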
So, I guess I am learning about node.js a bit this week.