Search - "memory management"
-
Microsoft support: "Your antivirus software is causing problems with the memory management."
Me: "I use Windows defender"
Microsoft support: "Oh..."
Me: 🙃13 -
Interview with a candidate. He calls himself "C++ expert" on his resume. I think: "oh, great, I love C++ too, we will have an interesting conversation!"
Me: let's start with an easy one, what is 'nullptr'?
Him: (...some undecipherable sequence of words that didn't make any sense...)
In my mind: hmm, maybe I misheard him. Let's try again with something simpler and more generic
Me: can you tell me about memory management in C++?
Him: you create objects on the stack with the 'new' keyword and they get automatically released when no other object references them
In my mind: wtf is this guy talking about? Is he confusing C++ with Java? Does he really know C++? Let's make him write some code, just to be sure
Me: can you write a program that prints numbers from 1 to 10?
Ten minutes and twenty mistakes later...
Me: okay, so what is this <int> here in angle brackets? What is a template?
Him: no idea
Me: you wrote 'cout', why sometimes do I see 'std::cout' instead? What is 'std'?
Answer: no idea, never heard of 'std'
I think: on his resume he also said he is a Java expert. Let's see if he knows the difference between the two. He *must* have noticed that one is byte-compiled and the other one is compiled to native code! Otherwise, how does he run his code? He must answer this question correctly:
Me: what is the difference between Java and C++? One has a Virtual Machine, what about the other?
Him: Java has the Java Virtual Machine
Me: yes, and C++?
Him: I guess C++ has a virtual machine too. The C++ Virtual Machine
Me (exhausted): okay, I don't have any other questions, we will let you know
And this is the story of how I got scared of interviews29 -
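For the record, roughly what those questions were fishing for, in one minimal sketch. This is my illustration of the expected answers, not the candidate's code:

```cpp
#include <iostream>  // std::cout lives in namespace std, hence the std:: prefix
#include <vector>    // std::vector is a class template; <int> below is its argument

int main() {
    int on_the_stack = 0;           // automatic storage: freed when the scope ends
    int* on_the_heap = new int(0);  // 'new' allocates on the HEAP, not the stack...
    delete on_the_heap;             // ...and nothing frees it for you: no GC here
    on_the_heap = nullptr;          // nullptr: the typed null pointer constant (C++11)

    std::vector<int> nums;          // the template instantiated with int
    for (int i = 1; i <= 10; ++i) { // the infamous 1-to-10 exercise
        nums.push_back(i);
        std::cout << i << '\n';     // needs std:: unless 'using namespace std'
    }
    (void)on_the_stack;             // silence the unused-variable warning
}
```

(And no, there is no C++ Virtual Machine; it compiles to native code.)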
This post is in memory of all the devRanters fallen in battle against management stress and impossible deadlines.
Let their memories remain, and their loss never be forgotten.7 -
Anyone looking for something interesting to do???
Step 1) Understand how basic circuitry works on a breadboard, nothing too fancy. (Implement NAND, AND, an adder and a subtractor; see the sketch right after this list.)
Step 2) Learn about microprocessors and how an OS works.
Step 3) Learn assembly.
Step 4) Write a basic assembler and understand how loaders and linkers work!
Step 5) Write a kernel with very basic features like memory management and process management, plus some drivers for IO.
Step 6) Write an emulator for some simple system, e.g. CHIP-8!
Step 7) Read about compiler theory and automata.
Step 8) Write a basic Python compiler (not an interpreter) that emits native assembly.
Step 9) Implement a TCP stack.
Step 10) Learn as much as you can about complexity measurement, data structures and algorithms using C or C++; it's very important (familiarity with pointers and thus computer memory).
Step 11) Learn any high-level language of choice, like Python or Ruby.
Step 12) Stop debating over tabs vs spaces, emacs vs vim, Angular vs Vue, PHP vs Python, OOP vs procedural vs functional (just know about all of them and when to use each, but don't fucking debate over which one is superior).
Step 13) Live happily and be healthy.30 -
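In the spirit of Step 1, minus the breadboard: a minimal C++ sketch that builds a half adder out of nothing but NAND, the same exercise the list starts with. No assumptions beyond standard C++.

```cpp
#include <iostream>

// The single primitive everything below is derived from.
bool nand_(bool a, bool b) { return !(a && b); }

bool not_(bool a)         { return nand_(a, a); }
bool and_(bool a, bool b) { return not_(nand_(a, b)); }

// The classic 4-gate XOR, built from NAND only.
bool xor_(bool a, bool b) {
    bool n = nand_(a, b);
    return nand_(nand_(a, n), nand_(b, n));
}

int main() {
    // Half adder: sum is XOR, carry is AND -- same truth table as on the breadboard.
    for (int a = 0; a <= 1; ++a)
        for (int b = 0; b <= 1; ++b)
            std::cout << a << " + " << b
                      << "  ->  sum " << xor_(a, b)
                      << ", carry " << and_(a, b) << '\n';
}
```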
A university that teaches students C++ without teaching an understanding of memory management is pointerless.5 -
I'm getting ridiculously pissed off at Intel's Management Engine (etc.), yet again. I'm learning new terrifying things it does, and about more exploits. Anything this nefarious and overreaching and untouchable is evil by its very nature.
(tl;dr at the bottom.)
I also learned that -- as I suspected -- AMD has their own version of the bloody thing. Apparently theirs is a bit less scary than Intel's since you can ostensibly disable it, but I don't believe that, because spy agencies exist and people are power-hungry and corrupt as hell when they get it.
For those who don't know what the IME is, it's hardware godmode. It's a black box running obfuscated code on a coprocessor that's built into Intel CPUs (all Intel CPUs from 2008 on). It runs code continuously, even when the system is in S3 mode or powered off. As long as the PSU is supplying current, it's running. It has its own MAC and IP address, transmits out-of-band (so the OS can't see its traffic), some chips can even communicate via 3G, and it can accept remote commands, too. It has complete and unfettered access to everything, completely invisible to the OS. It can turn your computer on or off, use all hardware, access and change all data in RAM and storage, etc. And all of this is completely transparent: when the IME interrupts, the CPU stores its state, pauses, runs the SMM (System Management Mode) code, restores the state, and resumes normal operation. Its memory always returns 0xff when read by the OS, and all writes fail. So everything about it is completely hidden from the OS, though the OS can trigger the IME/SMM to run various functions through interrupts, too. But this system is also required for the CPU to even function, so killing it bricks your CPU. Which, ofc, you can do via exploits. Or install ring-2 keyloggers, or do fucking anything else you want to.
tl;dr IME is a hardware godmode, and if someone compromises this (and there have been many exploits), their code runs at ring-2 permissions (above kernel (0), above hypervisor (-1)). They can do anything and everything on/to your system, completely invisibly, and can even install persistent malware that lives inside your bloody cpu. And guess who has keys for this? Go on, guess. you're probably right. Are they completely trustworthy? No? You're probably right again.
There is absolutely no reason for this sort of thing to exist, and its existence can only make things worse. It enables spying of literally all kinds, it enables CPU-resident malware, bricking your physical CPU, reading/modifying anything anywhere, taking control of your hardware, etc. Literal godmode. And some of it cannot be patched, meaning more than a few exploits require replacing your CPU to protect against them.
And why does this exist?
Ostensibly to allow sysadmins to remote-manage fleets of computers, which it does. But it allows fucking everything else, too. And keys to it exist. And people are absolutely not trustworthy, especially those in power, who are most likely to have access to said keys.
The only reason this exists is because fucking power-hungry doucherockets exist.26 -
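For the curious: on Linux you can at least check whether the ME's host-side interface is exposed. A minimal sketch; the /dev/mei* and /sys/class/mei paths are where the kernel's mei driver usually surfaces, but treat the exact paths as assumptions that vary by kernel and configuration.

```cpp
#include <filesystem>
#include <iostream>

// Rough probe: does this box expose Intel's Management Engine Interface?
int main() {
    namespace fs = std::filesystem;
    const char* candidates[] = {"/dev/mei0", "/dev/mei", "/sys/class/mei"};
    bool found = false;
    for (const char* path : candidates) {
        if (fs::exists(path)) {
            std::cout << "MEI interface present: " << path << '\n';
            found = true;
        }
    }
    if (!found)
        std::cout << "No MEI nodes visible. The ME may well still be running;"
                     " this only checks the host-side driver interface.\n";
}
```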
I have been a mobile developer working with Android for about 6 years now. In that time, I have endured countless annoyances in the Android development space. I will endure them no more.
My complaints are:
1. Ridiculous build times. In what universe is it acceptable to wait 30 seconds for a build to complete? Yes, I've done all the optimisations mentioned on this page and then some. Don't even mention hot reload, as it doesn't work fast enough or just doesn't work at all. Also, buying better hardware should not be a requirement for building a simple Android app; Xcode builds in 2 seconds on an 8GB MacBook Air. A MacBook Air!
2. IDE. Android Studio is a memory hog even if you throw 32GB of RAM at it. The visual editors are janky as hell. If you use Eclipse, you may as well just chop off your fingers right now, because you will have no use for them after you try to build an app from scratch. I mean, just look at some of the posts in this subreddit where the common response is to invalidate caches and restart. That should only be used as a last resort, but it's thrown about as if it solves everything. Truth be told, it's Gradle's fault. Gradle is so annoying I've dedicated the next point to it.
3. Gradle. I am convinced that Gradle causes 50% of an Android developer's pain. From the build times, to the integration into various IDEs, to its insane package management system. Why do I need to manually exclude dependencies from other dependencies? The build tool should just handle it for me; c'mon, it's 2019. Gradle is so bad that it requires approx 54GB of RAM to work out that I have removed a dependency from the list of dependencies. Also, I cannot work out which properties I need to put in which block.
4. API. Android API is over-bloated and hellish. How do I schedule a recurring notification? Oh use an AlarmManager. Yes you heard right, an AlarmManager... Not a NotificationManager because that would be too easy. Also has anyone ever tried running a long running task? Or done an asynchronous task? Or dealt with closing/opening a keyboard? Or handling clicks from a RecyclerView? Yes, I know Android Jetpack aims to solve these issues but over the years I have become so jaded by things that have meant to solve other broken things, that there isn't much hope for Jetpack in my mind 😤
5. API 2. A non-insignificant number of Android users are still on Jelly Bean or KitKat! That means we, as developers, have to support some of your shitty API decisions (Fragments, Activities, ListView) from all the way back then!
6. Not reactive enough. Android has support for Databinding recently but this kind of stuff should have been introduced from the very start. Look at React or Flutter as to how easy it is to make shit happen without any effort.
7. Layouts. What the actual hell is going on here. MDPI, XHDPI, XXHDPI, mipmap, drawable. Fuck it, just chuck it all in the drawable folder. Seriously, Android should handle this for me. If I am designing for a larger screen then it should be responsive. I don't want to deal with 50 different layouts spread over 6 different folders.
8. Permission system. Why was this not included from the very start? Rogue apps have abused this, and abused your users' privacy and security. Yet you ban us and not them from the Play Store. What's going on? We need answers.
9. In Android, building an app took me 3 months and I had a lot of work left to do but I got so sick of Android dev I dropped it in favour of Flutter. I built the same app in Flutter and it took me around a month and I completed it all.
10. XML.
If you're a new dev, for the love of all that is good in this world, do NOT get into Android development. Start with Flutter or even iOS. On Flutter, build times are insanely fast and the hot reload is under 500ms, consistently. It's a breath of fresh air, will save you a lot of headaches, AND it builds for iOS flawlessly.
To the people who build Android, advocate it and work on it: sorry to swear, but fuck you! You have created a mess that we have to work with on a day-to-day basis, only for us to get banned from the app store! You have sold us a lie that Android development is amazing, with all the sweet-treat names and conferences that look bubbly and fun. You have let it get so bad that we can't target an API higher than 18 because some Android users are still using devices that only support that!
End this misery. End our pain. End our suffering. Throw this abomination away like you do with some of your other projects and migrate your efforts over to Flutter. Please!
#NoToGoogleIO #AndroidSummitBoycott #FlutterDev #ReactNative16 -
Please, Java and all Java shit, take more memory, I don't need it -_-
16GB doesn't seem to be enough to have a VM and Android Studio open, but it is more than enough to have
1. Visual Studio
2. SQL Server Management Studio
3. VM
4. FireFox
5. Visual Studio code
Fuck. This. Shit!20 -
that's quite accurate :) ofc there are exceptions (looking at you two naughty bois, metaspace and off-heap allocations!), but it's true for the most of it :)5
-
The solution for this one isn't nearly as amusing as the journey.
I was working for one of the largest retailers in NA as an architect. Said retailer had over a thousand big box stores, IT maintenance budget of $200M/year. The kind of place that just reeks of waste and mismanagement at every level.
They had installed a system to distribute training and instructional videos to every store, as well as recorded daily broadcasts to all store employees, as a way of reducing management time spent with employees in the morning. This system had cost a cool 400M USD, not including labor and upgrades for round 1. Round 2 was another 100M to add a storage buffer to each store, because they'd failed to account for the fact that their internet connections at the stores and the outbound pipe from the DC weren't capable of running the public-facing e-commerce and streaming all the video data to every store in realtime. Typical massive enterprise clusterfuck.
Then security gets involved. Each device at stores had a different address on a private megawan. The stores didn't generally phone home, home phoned them as an access control measure; stores calling the DC was verboten. This presented an obvious problem for the video system because it needed to pull updates.
The brilliant Infosys resources had a bright idea to solve this problem:
- Treat each device IP as an access key for that device (avg 15 devices per store).
- Verify the request ip, then issue a redirect with ANOTHER ip unique to that device that the firewall would ingress only to the video subnet
- Do it all with the F5
A few months later, the networking team comes back and announces that after months of work and 10s of people years they can't implement the solution because iRules have a size limit and they would need more than 60,000 lines or 15,000 rules to implement it. Sad trombones all around.
Then, a wild DBA appears, steps up to the plate and says he can solve the problem with the power of ORACLE! A few months later he comes back with some absolutely batshit solution that stored the individual octets of an IPv4 address in a table, with multiple nested queries against that same table to emulate subnet masking through some temp-table-spanning voodoo. Time to complete: 2-4 minutes per request. He too eventually gives up the fight, sort of, in that backhanded way DBAs tend to do everything. I wish I had paid more attention to that abortion, because the rationale and its mechanics were just staggeringly Rube Goldberg and should have been documented for posterity.
So I catch wind of this sitting in a CAB meeting. I hear them talking about how there's "no way to solve this problem, it's too complex, we're going to need a lot more databases to handle this." I tune in and gather all it really needs to do, since the ingress firewall is handling the origin IP checks, is convert the request IP to video ingress IP, 302 and call it a day.
While they're all grandstanding and pontificating, I fire up visual studio and:
- write a method that encodes the incoming request IP into a single uint32
- write an http module that keeps an in-memory dictionary of uint32,string for the request, response, converts the request ip and 302s the call with blackhole support
- convert all the mappings in the spreadsheet attached to the meetings into a csv, dump to disk
- write a wpf application to allow for easily managing the IP database in the short term
- deploy the solution one of our stage boxes
- add a TODO to eventually move this to a database
All this took about 5 minutes. I interrupt their conversation to ask them to retarget their test to the port I exposed on the stage box. Then watch them stare in stunned silence as the crow grows cold.
According to a friend who still works there, that code is still running in production on a single node to this day. And still running on the same static file database.
#TheValueOfEngineers2 -
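For illustration, a minimal sketch of the core trick: collapsing a dotted quad into a single uint32 key and doing the lookup/302. The original was a C# HTTP module; this C++ rendering and the sample addresses are mine, not the production code.

```cpp
#include <cstdint>
#include <cstdio>
#include <string>
#include <unordered_map>

// "a.b.c.d" -> one 32-bit key; returns 0 on malformed input.
uint32_t ipToKey(const std::string& ip) {
    unsigned a, b, c, d;
    if (std::sscanf(ip.c_str(), "%u.%u.%u.%u", &a, &b, &c, &d) != 4 ||
        a > 255 || b > 255 || c > 255 || d > 255)
        return 0;
    return (a << 24) | (b << 16) | (c << 8) | d;
}

int main() {
    // Request-IP -> video-ingress-IP table; loaded from the CSV in the real thing.
    std::unordered_map<uint32_t, std::string> table = {
        {ipToKey("10.1.2.15"), "172.16.9.15"},  // hypothetical store device
    };

    auto hit = table.find(ipToKey("10.1.2.15"));
    if (hit != table.end())
        std::printf("302 -> %s\n", hit->second.c_str());  // redirect to ingress IP
    else
        std::printf("blackhole\n");  // unknown device, drop it
}
```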
Not just another Windows rant:
*Disclaimer* : I'm a full time Linux user for dev work having switched from Windows a couple of years ago. Only open Windows for Photoshop (or games) or when I fuck up my Linux install (Arch user) because I get too adventurous (don't we all)
I have hated Windows 10 from day 1 for being a rebel. Automatic updates and generally so many bugs (especially the 100% disk usage on boot, for who knows how long) really sucked.
It's got ads now, and it's generally much slower than a Windows 8 install would probably be.
The pathetic memory management and the overall slower interface really tick me off. I'm trying to work and get access to web services, and all I get is hangups.
Chrome is my go-to browser for everything and the experience is sub par. We all know it gobbles up RAM but even more on Windows.
My Linux install on the same computer flies with a heavy project open in Android Studio, 25+ tabs in Chrome and a 1080p video playing in the background.
Up until the creators update, UI bugs were a common sight. Things would just stop working if you clicked them multiple times.
But you know what I'm tired of more?
The ignorant pricks who bash it for being Windows. This OS isn't bad. Sure it's not Linux or MacOS but it stands strong.
You are just bashing it because it's not developer-friendly. And it's not! But it never advertises itself as that.
It's a full-fledged OS for everyone. It's not dev-friendly out of the box, but you can make it pretty close; you're just lazy.
People do use Windows to code. If you don't know that, you're ignorant. They also make a living by using Windows all day. How bout tha?
But it tries to make you feel comfortable with the recent bash integration and the plethora of tools that Microsoft builds.
IIS may not be Apache or Nginx but it gets the job done.
Azure uses Windows and it's one of best web services out there. It's freaking amazing with dead simple docs to get up and running with a web app in 10 minutes.
I saw many rants against VS but you know it's one of the best IDEs out there and it runs the best on Windows (for me, at least).
I'm pissed at you - you blind hater you.
Research and appreciate the good qualities in something instead of trying to be the cool but ignorant dev who codes with Linux/Mac but doesn't know shit about the advantages they offer.22 -
What an absolute fucking disaster of a day. Strap in, folks; it's time for a bumpy ride!
I got a whole hour of work done today. The first hour of my morning because I went to work a bit early. Then people started complaining about Jenkins jobs failing on that one Jenkins server our team has been wanting to decom for two years but management won't let us force people to move to new servers. It's a single server with over four thousand projects, some of which run massive data processing jobs that last DAYS. The server was originally set up by people who have since quit, of course, and left it behind for my team to adopt with zero documentation.
Anyway, the 500GB disk is 100% full. The memory (all 64GB of it) is fully consumed by stuck jobs. We can't track down large old files to delete, because du chokes on the workspace folder with its thousands of subfolders with no RAM to spare. We decide to basically take a hacksaw to it, deleting the workspace for every job not currently in progress. This of course fucked up some really poorly-designed pipelines that relied on workspaces persisting between jobs, so we had to deal with complaints about that as well.
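(For anyone in the same disk-full boat: a poor man's du is a few lines of C++17, and it holds almost nothing in memory. A minimal sketch, biggest immediate subdirectories first; error handling is deliberately of the "skip and move on" kind.)

```cpp
#include <cstdint>
#include <filesystem>
#include <functional>
#include <iostream>
#include <map>

int main(int argc, char** argv) {
    namespace fs = std::filesystem;
    fs::path root = argc > 1 ? argv[1] : ".";
    std::multimap<std::uintmax_t, fs::path, std::greater<>> sizes;

    for (const auto& dir : fs::directory_iterator(root)) {
        if (!dir.is_directory()) continue;
        std::uintmax_t total = 0;
        std::error_code ec;  // swallow permission errors instead of throwing
        for (auto it = fs::recursive_directory_iterator(
                 dir.path(), fs::directory_options::skip_permission_denied, ec);
             it != fs::recursive_directory_iterator(); it.increment(ec)) {
            if (it->is_regular_file(ec)) {
                auto sz = it->file_size(ec);
                if (!ec) total += sz;
            }
        }
        sizes.emplace(total, dir.path());  // sorted biggest-first by the comparator
    }
    for (const auto& [bytes, path] : sizes)
        std::cout << bytes / (1024 * 1024) << " MB\t" << path.string() << '\n';
}
```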
So we get the Jenkins server up and running again just in time for AWS to have a major incident affecting EC2 instance provisioning in our primary region. People keep bugging me to fix it, I keep telling them that it's Amazon's problem to solve, they wait a few minutes and ask me to fix it again. Emails flying back and forth until that was done.
Lunch time already. But the fun isn't over yet!
I get back to my desk to find out that new hires or people who got new Mac laptops recently can't even install our toolchain, because management has started handing out M1 Macs without telling us and all our tools are compiled solely for x86_64. That took some troubleshooting to even figure out what the problem was because the only error people got from homebrew was that the formula was empty when it clearly wasn't.
After figuring out that problem (but not fully solving it yet), one team starts complaining to us about a Github problem because we manage the github org. Except it's not a github problem and I already knew this because they are a Problem Team that uses some technical authoring software with Git integration but they only have even the barest understanding of what Git actually does. Turns out it's a Git problem. An update for Git was pushed out recently that patches a big bad vulnerability and the way it was patched causes problems because they're using Git wrong (multiple users accessing the same local repo on a samba share). It's a huge vulnerability so my entire conversation with them went sort of like:
"Please don't."
"We have to."
"Fine, here's a workaround, this will allow arbitrary code execution by anyone with physical or virtual access to this computer that you have sitting in an unlocked office somewhere."
"How do I run a Git command I don't use Git."
So that dealt with, I start taking a look at our toolchain, trying to figure out if I can easily just cross-compile it to arm64 for the M1 macbooks or if it will be a more involved fix. And I find all kinds of horrendous shit left behind by the people who wrote the tools that, naturally, they left for us to adopt when they quit over a year ago. I'm talking entire functions in a tool used by hundreds of people that were put in as a joke, poorly documented functions I am still trying to puzzle out, and exactly zero comments in the code and abbreviated function names like "gars", "snh", and "jgajawwawstai".
While I'm looking into that, the person from our team who is responsible for incident communication finally gets the AWS EC2 provisioning issue reported to IT Operations, who sent out an alert to affected users that should have gone out hours earlier.
Meanwhile, according to the health dashboard in AWS, the issue had already been resolved three hours before the communication went out and the ticket remains open at this moment, as far as I know.5 -
Why is it so important to some people to claim that "HTML and CSS are not programming languages"? I get it, you're a REAL programmer working with arrays, maybe tuples, objects and possibly direct memory management. Who the fuck has a right to call themselves a programmer for writing some brain dead markup or poorly designed selectors, right? Who fucking cares for semantic tags or nested selectors?
Just think for a few seconds about when you were taking your first baby steps to becoming the GOD ROCKING MEMORY HANDLER THAT WRITES _REAL_ CODE that you are today, and how good it felt to be able to create something that appeared on your screen. It felt pretty awesome, yeah?
Now imagine if someone much more experienced than you told you "You're not a real programmer, that is not real programming. You should see what I do, I do real programming".
I think you get it. Why spend your energy spreading bad vibes when you could spend it on something more productive. Like reading up on the new CSS4 specs ;)18 -
You know you've made it when the current state of software starts to depress you. When you start forming your own opinions and saying "wow, these people are full of shit".
Primary example: the overblown web development bullshit. Fuck me, dude, you really don't need that full-featured React, Vue or Angular framework to make sense of shit. You are going over the top for fucking ajax functionality and state management that you could do by yourself without learning a full framework; by the time you finish learning React, you'd probably have been better served by standard vanilla-af JS and server-side rendering.
Our world is full of fads and many talented people who perpetuate them. That's fine; it is the nature of the beast. But a lot... A LOT of software is very POORLY written. And adding levels of abstraction over a very broken paradigm (the web, in this case) does not and will not make it better.
Basically, I am fucking hating being a web developer and want to go back to a time when we cared about how much memory our applications consumed, and didn't worry about the fucking frontend needing the ability to do machine learning.
I want to run sublime.exe and be sure that it is an application native to my system, not a contained web browser implementing my fucking text editor. With 20MB of RAM at most, instead of 500MB. WTF.
I knew I made it when I could read comments on Hacker news and reddit and say "this idiot is full of shit", I knew I made it when I would sigh heavily at the idea of having another project rather than having a fan girl attitude towards it.
I knew I made it when people writing about software development meant shit to me rather than the wonder of what the fuck they were talking about.
I knew I made it when getting laid was more important to me than fucking around with code.
pussy > code
Fuck you.13 -
In my last job they required us to turn on a task timer for every little thing. Remembering to do that, and to turn it off, was a royal pain. First I had to look up which task it was, start the timer, stop the timer, find the next task and repeat, then flip back to the first task. Lots of open browser tabs within tab groups to keep track of it all. And if I came up short or went over budget, there was a "conversation" with management to account for discrepancies. Then I had to go by memory and try to reconstruct the "missing time" accurately enough to be convincing.
Now that I’m freelancing, I try to keep up the habit because it does have merit for tracking estimates and actuals, but now it’s just me to answer to for discrepancies and I can fudge the numbers as I see fit. The time records did, however, save my bacon in a recent dispute.5 -
I'll use this topic to segue into a related (lonely) story befitting my mood these past weeks.
This is entire story going to sound egotistical, especially this next part, but it's really not. (At least I don't think so?)
As I'm almost entirely self-taught, having another dev giving me good advice would have been nice. I've only known / worked with a few people who were better devs than I, and rarely ever received good advice from them.
One of those better devs was my first computer science teacher. Looking back, he was pretty average, but he held us to high standards and gave good advice. The two that really stuck with me were: 1) "save every time you've done something you don't want to redo," and 2) "printf is your best debugging friend; add it everywhere there's something you want to watch." Probably the best and most helpful advice I've ever received 😊
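(In C++ terms, that second piece of advice grows up into a tiny macro. A minimal sketch, nothing more:)

```cpp
#include <iostream>

// Prints file:line, the expression text, and its value to stderr, then
// hands the value back so it can be dropped into any expression.
template <typename T>
const T& dbg(const char* file, int line, const char* expr, const T& value) {
    std::cerr << file << ":" << line << "  " << expr << " = " << value << '\n';
    return value;
}
#define DBG(x) dbg(__FILE__, __LINE__, #x, (x))

int main() {
    int items = 3;
    int total = DBG(items * 7);  // stderr: "main.cpp:14  items * 7 = 21"
    std::cout << total << '\n';
}
```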
I've seen other people here posting advice like "never hardcode" or "modularity keeps your code clean" -- I had to discover these pretty simple concepts entirely on my own. School (and later college) were filled with terrible teachers and worse students, and so were almost entirely useless for learning anything new.
The only decent dev I knew had brilliant ideas (genetic algorithms, sandboxing, ...) before they were widely used, but could rarely implement them well because he was generally an idiot. (Idiot savant, I think? Definitely the idiot part.) I couldn't stand him. Completely bypassing a ridiculously long story: I helped him on a project to build his own OS from scratch, and we made very impressive progress, even by today's standards. Custom bootloader, hardware interfacing, memory management, (semi) sandboxed processes, GUI, example programs...; we were in high school. I'm still surprised and impressed with what we accomplished.
But besides him, almost every other dev I met was mediocre. Even outside of school, I went so many years without having another competent dev to work with. I went through various jobs helping other devs with their projects (or rewriting them), learning new languages/frameworks almost every time: PHP, Pascal, Perl, Zend, JS, VB, Rails, Node, .... I learned new concepts occasionally (which was wonderful), but overall it was just tedious and never paid well, because I was too young to be taken seriously (and female, further exacerbating it). On the bright side, it didn't diminish my love for coding, and I usually spent my evenings playing with projects of my own.
The second dev (and one one of the best I've ever met) went by Novo. His approach to a game engine reminded me of General Relativity: Everything was modular, had a rich inheritance tree, and could receive user input at any point along said tree. A user could attach their view/control to any object. (Computer control methods could be attached in this way as well.) UI would obviously change depending on how the user could interact and the number of objects; admins could view/monitor any of these. Almost every object / class of object could talk to almost everything else. It was beautiful. I learned so much from his designs. (Honestly, I don't remember the code at all, and that saddens me.) There were other things, too, but that one amazed me the most.
I havent met anyone like him ever again.
Anyway, I don't know if I can really answer this week's question. I definitely received some good advice while initially learning, but past that it's all been through discovering things on my own.
It's been lonely. ☹2 -
Soooo I think I have finally come to the point where I may have to create a YouTube channel to teach software engineering from the ground up... and teach it the way the universities and everyone else should be teaching it, so that people have a solid foundation, instead of throwing hello world, loops and variables at folks out of the box without any environment context or low-level understanding of embedded registers, or even logic gates.
That lack of understanding is why soooo many college students and younger folks are actually pretty shitty engineers. Everything is high-level languages and theoretical concepts to them. Nothing practical. That's why there are sooo many Python and Java developers who can't for the life of them understand memory management, low-level hardware interfacing, etc., because the colleges don't teach it the way it used to be taught.
I seriously fear 30 years from now, or sooner, when the few remaining embedded engineers are close to retirement; without those folks, the whole pyramid of electronics falls to pieces.
Java, C#, python, all that shit don’t run on the bare metal... there’s this magical layer of C, and assembler that does all the work just so folks can abstract their thoughts.
Either one of two situations will happen: the price of electronics will rise because the embedded guys are few and far between, so their salaries skyrocket... OR everything starts running shit like Java on the metal, where there is an overabundance of developers; their salaries will be low because there are soo many of them, but the processing power, space, and energy needed to run Java natively cause electronics costs to increase.
but regardless 30 years from now if those script kiddies are building everything I fear it cuz there’s gonna be memory leaks, and overflow issues everywhere.. shit be blowing up more than 4th of July.. lol
Soooo in effort to prevent that and keep the embedded engineers up, or atleast properly educate the script kiddies, I’m gonna make that YouTube channel.. 1 maybe 2 videos a week, 1-2 hours sessions each.. starting at the fucken ground and building up.39 -
Our team makes software in Java, and for technical reasons we require 1GB of memory for the JVM (via the -Xmx switch).
If you don't have enough free memory, the app just exits without any sign, because the JVM couldn't bite off a big enough chunk of memory.
Many days later and you just stand there without a clue as to why the launcher does nothing.
Then you remember this constraint and start to close every memory-heavy app you can think of. (I'm looking at you, Chrome.) No matter how important those spreadsheets or Illustrator files are. Congratulations, you just freed up 4GB of memory; things should work now! WRONG!
But why, you might ask? You see, we are using the 32-bit version of Java, because someone in upper management decided that it should run on any machine (even though we only test it on Win 7 and High Sierra), and 32 is smaller than 64, so it must be downwards compatible! We should use it! Yes, in 2019 we use 32-bit Java because some lunatic might want to run our software on a 32-bit Windows XP. But why is this so much of a problem?
Well... the 32-bit version of Java requires CONTIGUOUS FREE ADDRESS SPACE TO EVEN START... AND WE ARE REQUESTING ONE GIGABYTE!!
So you can shove your swap and your closed applications up your ass, but I bet you won't get 1GB of contiguous address space that way!
Now there will be a meeting about this issue, and another one about the issues with the 32-bit JVM, tomorrow. The only problem is that this issue only occurs if you've used up most of your memory and then try to open our software. So upper management will probably deem this issue minor and won't allow us to upgrade to 64-bit... in 20fucking19.10 -
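The failure mode is easy to reproduce. A minimal sketch that requests exactly one contiguous block, the same thing a -Xmx1g heap asks for up front; build it as a 32-bit binary on a busy machine to watch it fail.

```cpp
#include <cstddef>
#include <iostream>
#include <new>

int main() {
    const std::size_t one_gb = 1024ULL * 1024 * 1024;
    // A single contiguous reservation, like the JVM's pre-reserved heap.
    char* block = new (std::nothrow) char[one_gb];
    if (!block) {
        // In a fragmented 32-bit address space (roughly 2 GB usable per
        // process on Windows), this fails even when plenty of *total*
        // memory is free -- there is just no unbroken 1 GB hole left.
        std::cout << "no contiguous 1 GB region available\n";
        return 1;
    }
    std::cout << "got it\n";
    delete[] block;
}
```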
I'm fairly sure that even if I were to run Windows 10 on a machine with double ECC, it'd still BSOD because of poor memory management. Because why on Earth would Microsoft support such a basic, essential thing properly, hmm?!!
Oh and let's add HDMI to the existing list of ACPI and USB. How difficult can it possibly be to support those basic, most standardized fucking hardware protocols? Pretty fucking insanely hard apparently!!!3 -
A couple days ago, I went through the most embarrassing interview ever. It was with a startup doing both hardware and software, centered on image processing. I really wanted it. Really, really did. It was telephonic, and involved a little bit of coding over docs. In the one hour we talked over the phone, he asked me about 30 questions. I hadn't even heard of the words he said! I've never delved into compilers, lower-level things, or memory management. I could answer about 5 questions, including the "tell me about yourself" question.
So that's about 25 ways I came up with of saying "I don't know" in a span of 60 minutes.3 -
I dive in head first.
Some existing program annoys me, so I get this itch to write a self-hosted Spotify in Go, or a conky with 3D graphics in Rust.
I check the homepage of the language, download the tools, check which IDE is great for it.
Then I just start writing code, following the error corrections thrown by the IDE, doing web searches for all errors. Then when I run into a wall, I might check the reference docs or a udemy course.
Often I don't finish the project, because time is limited and I still have 4 million other things to do and learn, but at least I've learned a new language/tech.
Con: For tech which uses unique paradigms like Rust's memory management or Go's Goroutines, it can be frustrating to bash away at a problem using old assumptions.
Pro: By having a real demand for a product with requirements instead of a hello world or todo app, it's much easier to stay motivated, and you learn beyond what courses would teach you.5 -
For the first time in my life, Windows 10 repair actually fixed the PC?
I was getting bluescreens about UNMOUNTABLE_BOOT_VOLUME and MEMORY_MANAGEMENT; then, when I tried to boot it, it went into "Diagnosing your PC".
Started reading devRant while it ran, it fixed the problem, and the machine works now.5 -
It has been a loooong time since I last had an awesome programming day like today. I've been building an earth-simulator kind of game for mobile, and lots of its core mechanics are starting to take shape. Still a long way to go, but it feels great! Days like these remind me why I started programming. I will have to deal with memory management, because there will be lots of shit happening on screen; not yet sure how to handle that in Corona SDK. Sorry for the good feels.11
-
Windows 10 wants to ruin my life by consuming almost 70% of my 4 gigs of memory for itself.
No applications are running and it's still consuming that much memory. Now I just hate the updates of Windows 10 Pro.
Any suggestions to get rid of this situation?26 -
So, CS student here.
Gave TCS "national" level test.
Quoting from the question:
"if you have 3 bytes of memory, it can be used to represent 2^3=8 values in the memory"
(For the record: 3 bytes is 24 bits, which can represent 2^24 = 16,777,216 values, not 8.)
This test is a waste of at least 30,000+ human hours, and these guys didn't even put in 24 hours of effort to make sure the questions were correct.
Fuck this fucking IT industry.
Fuck the people who designed this testing process.
Fuck the people who endorsed this process.
Fuck the management for passing it as a test.
The people who wrote the test question can go die in hell.
It's not my problem that their mothers fucked Neanderthals.
Uh! All I want is a job, but I ended up wasting 200+ hours of my time.11 -
We all know you can't "learn X programming language in a day" without travelling to the Arctic and catching a day that lasts half a year.
But what's the worst language to try and learn in a day?
I vote C++. Manual memory management, multiple inheritance, static compilation, operator overloading, and generally non-human syntax (like std::cout << "This is how you print!" << std::endl;) make it a difficult one to attempt in a day.26 -
When the hero turns into the villain.
Windows memory management is one of the worst architectural designs in the tech industry to ever exist.
One mistake on my part is that I did not opt for an SSD four years ago when I purchased my current laptop.
The entire experience is irritating.33 -
If anything taught me that garbage collectors aren't the one true answer for memory management then it's got to be modded minecraft
It takes around 10-20 seconds just to disconnect from a game and go back to the main menu.
You also get memory leaks, which result in several-second-long freezes when the GC kicks in, happening every 15 seconds.1 -
Professors today in colleges don't know...
1. the proper names for the outputs of basic shell commands like "ls -l", "cat", "cal" (and they pronounce Linux as "laynux")
2. how memory management works
3. how process scheduling actually takes place, not just the outdated bookish version
4. how to compile a package from scratch, including verifying digital signatures
5. how to read a man page properly, yet they come to take OS labs
6. how to mount different hardware
7. how to check kernel build rules; forget about compiling a custom kernel
n. ....
Yet we are expecting the engineers who are churned out of colleges to be NEXT GEN?!
It is not entirely because of the syllabus; it's also because of professors who haven't updated their knowledge since they got a job, and who therefore cannot impart proper basics to students.
If you want things to change, train students directly in the industry with UPDATED versions of these professors.6 -
I'm getting really tired of those dumbass programmers that do not understand shit and then come to me when production breaks. (I am also a programmer, not really a DevOps engineer, but I'm the least worst at DevOps stuff, so it's my job...).
We're programming some kind of document management tool. Today we had a release, and one of the new features is downloading all of your documents as a zip file, which is asynchronously generated. When it's done, the user gets a mail with the download link to the zip file.
The feature basically works, but today it broke our production service while somebody was running a test of it.
Turns out all the documents are loaded into memory to be zipped. So if you have 2 gigs of documents, a container with memory restrictions in that area will crash.
I asked the programmer who reported this «ops problem» to me why he didn't just shit the files into a temp folder in order to zip them there.
He told me that he wanted to do so, but did not know how to mock this for a unit test, and therefore went to the in-memory «solution», which was easier for him to mock.
For fuck's sake, unit tests and mocks are fucking tools, not ends in themselves! I don't give a fuck about your pointless mocking code when the application crashes!
When I got to deal with such dumbasses, I'd prefer to mock those motherfuckers with a leaky bucket of liquid shit, which basically accomplishes the same task from my perspective: dripping shit all over the place and make everything suck as fuck.3 -
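A minimal sketch of the boring fix: stage documents into a temp directory through a fixed-size copy buffer, so peak memory stays at the buffer size no matter how many gigabytes of documents there are, and let the zip library (whichever one) read from disk. Unit-testing it is a matter of pointing it at a throwaway directory, no mocks required.

```cpp
#include <filesystem>
#include <fstream>
#include <vector>

namespace fs = std::filesystem;

// Copy one document into the staging dir through a bounded 64 KB buffer.
void stage(const fs::path& src, const fs::path& stagingDir) {
    std::ifstream in(src, std::ios::binary);
    std::ofstream out(stagingDir / src.filename(), std::ios::binary);
    std::vector<char> buf(64 * 1024);  // the memory ceiling, whatever the file size
    while (in.read(buf.data(), buf.size()) || in.gcount() > 0)
        out.write(buf.data(), in.gcount());
}

int main() {
    fs::path staging = fs::temp_directory_path() / "zip-staging";
    fs::create_directories(staging);
    stage("documents/report.pdf", staging);  // hypothetical document path
    // ...zip the staging dir from disk, mail the link, fs::remove_all(staging)...
}
```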
My two cents: Java is fucking terrible for computer science. Why the fuck would you teach somebody such a verbose language with so many unwritten rules?
If you really want your students to learn about computers, why not C? Java has no pointers, no pass-by-reference, no memory management, and lots of obscure class structures and design patterns; this shit is garbage. The student will almost never have contact with the compiler; many don't even know a compiler exists.
Java is so enterprise-focused and just fucked up for educational purposes. And I say that as somebody who (still) uses it as their main language.
If you want your students to be productive and learn about software engineering, why not Python? Things are simple in Python and can be done way more easily without students becoming code monkeys (assuming they don't reach for a whole library for every task). I mean, Java takes a whole goddamn class and an explicitly declared entry point, which is btw fucking verbose, just to print something to the console.
Fuck Java.17 -
A friend came to me asking whether I wanted to do a project in C++ (someone had asked him to find a C++ guy).
Needing money, I didn't refuse, even though I'm a Java developer with 0 C++ skills. But I wanted to give it a try.
So the project started, and it was about a plugin for the Rhinoceros app (a 3D graphics app).
The plugin was simple: it had some views, some services to upload a file to S3, and some API calls. Nothing complex...
So I ended up working on the project together with my friend (a web dev).
So long story short, we had a lot of issues, but considering we both had no knowledge of C++, we were really lucky to finish the product almost on time (3 days late).
Did no memory management, even though I've read that we have to do that ourselves and that C++ doesn't have a garbage collector.
But the plugin worked great even without any of that.
Had a lot of issues with string manipulation, which almost drove me crazy.
PS: I did a post here before taking the project, to ask whether it was a good idea to take it or not. Had some positive and some negative replies, but I deleted the post since I thought I was breaking the NDA I signed 😂😂
PS2: just finished OCAJP 8 last week with a great score😃6 -
Is your code green?
I've been thinking a lot about this for the past year. There was recently an article on this on slashdot.
I like optimising things to a reasonable degree and avoiding bloat. What are some signs of code that isn't green?
* Use of technology that says it's fast, without real expert review and measurement. Lots of tech out there claims to be fast but actually isn't, or is fast only by saturating resources while being inefficient.
* It uses caching. Many might find that counterintuitive. In technology it is surprisingly common to see people scale or cache rather than directly fixing the thing that's watt-expensive, which is compounded when the cache has weak coverage.
* It uses scaling. Originally scaling was a last resort. The reason is simple: it introduces excessive complexity. Today it's common to see people scale things rather than make them efficient. You end up needing ten instances when a bit of skill could bring you down to one, which could still scale but likely won't need to.
* It uses a non-trivial framework. Frameworks are rarely fast. Most will fall in the range of ten to a thousand times slower in terms of CPU usage. Memory bloat may also force the need for more instances. Frameworks written in already-slow high-level languages may be especially bad.
* Lacks optimisations for obvious bottlenecks.
* It runs slowly.
* It lacks even basic resource usage measurement.
Unfortunately smells are not enough on their own but are a start. Real measurement and expert review is always the only way to get an idea of if your code is reasonably green.
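(Basic measurement really is cheap to add. A minimal sketch of the sort of timer that answers "did this get faster?", std::chrono only, no profiler required; the workload is a stand-in.)

```cpp
#include <chrono>
#include <iostream>

// Run any callable, print its wall-clock time, pass its result through.
template <typename F>
auto timed(const char* label, F&& work) {
    auto start = std::chrono::steady_clock::now();
    auto result = work();
    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                  std::chrono::steady_clock::now() - start).count();
    std::cerr << label << ": " << ms << " ms\n";
    return result;
}

int main() {
    long sum = timed("hot loop", [] {
        long s = 0;
        for (long i = 0; i < 100000000; ++i) s += i;  // stand-in workload
        return s;
    });
    std::cout << sum << '\n';
}
```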
I find it not uncommon to see things require tens, hundreds, or thousands of times the resources they need, if not more.
In terms of cycles that can be the difference between needing a single core and a thousand cores.
This is common in the industry but it's not because people didn't write everything in assembly. It's usually leaning toward the extreme opposite.
Optimisations are often easy and don't require writing code in binary. In fact the resulting code is often simpler. Excess complexity and inefficient code tend to go hand in hand. Sometimes a code cleaning service is all you need to enhance your green.
I once rewrote, from an interpreted language into C, a data-parsing library that had to parse a hundred MB and was a performance hotspot. I measured it and the results were good. It had been optimised as much as possible in the interpreted version, but the C version was still a minimum of 50 times faster.
I recently stumbled upon someone's attempt to do the same and I was able to optimise the interpreted version in five minutes to be twice as fast as the C++ version.
I see opportunities to optimise everywhere in software. A billion kg of CO2 could easily be saved if a few green-code shops popped up. It's also often a net win: faster software, lower costs, lower management burden... I'm thinking of starting a consultancy.
The problem is, after witnessing the likes of Greta Thunberg, if that's what the next generation has in store, then as far as I'm concerned the world can fucking burn and her generation along with it.6 -
I started writing code at a young age, modding games, building websites, modifying hex files, hacking, etc... I started my career off in high school, though, writing embedded code for a local medical robotics company, and also got tasked with building the mobile app to control these robots and use them for diagnostics, etc... This was before the app bubble, before there were app degrees and that bullshit.
Wanted to teach for the university and research AI...
Well, I dropped out of college after 3 years, cuz I spent more time at work than in class (I was a software consultant in the auto industry in Detroit). I wasn't learning anything I didn't already know or couldn't learn from books or a quick Google search.
I also didn't like the way the professors and the department approached and taught software... none of the kids had a good foundation of what the fuck they were doing, and everyone relied on the goddamn IDEs... so I said fuck it and dropped out after getting into plenty of arguments with the professors and department leads.
I probably should have chosen CE... but whatever. CS imo still needs a solid CE/EE foundation; without it, 30 years from now or sooner, I fear what will become of the electronics industry... when all the current-gen folks are retired and there's nobody left to write the embedded code that literally ALLLLL consumer electronics runs on. Newer generations don't understand pointers, proper memory management, etc.
So I combined my passion for AI with my knowledge of software in general and embedded software, and I've been building my career in the auto industry without a degree. Never looked back.2 -
My first real exposure to a PC was when my father and me built one for myself. Y'know, some AMD Athlon 64, some MSI board, 2 GB of RAM, an NVIDIA 8600 GT, everything was nice.
I never put malware on that thing, even though I heavily used it for things like games; I was really cautious about that even when I was 6 years old. (My father once accidentally did, though; he got rid of it by damaging the filesystem on the hard drive, which, funnily enough, only took the malware with it.)
I still have that PC, but it now has weird issues with memory management ;-; -
Imagine the horror of learning C programming with manual memory management, pointer arithmetic and without your cool utility libraries after programming for 2 years in Python just becoz it's in the fukin syllabus!!13
-
Learning Rust.
Holy brainfucking brain melt, those references, scoping and borrowing and cloning and whatnot, because there is no garbage collector, but also no direct memory management.
It's cool, but also hard for a noob coming from the JVM/Android. The compiler error messages are helpful, but I immediately found some cryptic ones that don't help me at all.9 -
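For readers more at home in C++ than the JVM: the nearest analogy to Rust's ownership and moves is std::unique_ptr. A minimal sketch of that analogy; the key difference is that Rust enforces this at compile time, where C++ only makes misuse undefined behavior.

```cpp
#include <iostream>
#include <memory>

int main() {
    auto a = std::make_unique<int>(42);  // 'a' owns the allocation
    auto b = std::move(a);               // ownership moves to 'b'; 'a' is now null

    // std::cout << *a;                  // UB in C++ (null deref). The analogous
    //                                   // use-after-move in Rust is a *compile
    //                                   // error*, courtesy of the borrow checker.

    std::cout << *b << '\n';             // prints 42; freed automatically at scope exit
}
```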
Back when I was still in school for comp sci, we had an advanced software engineering and design class in C++. At that point everyone was expected to be proficient enough with C++ to work with whatever the instructor would throw at us, and pretty much everyone was, since past classes had included a lot of C++ development. Proficient in the academic sense, of course, rather than for actual real-world development.
Our teacher would mix a lot of physics and mathematics into what we were doing, something I greatly enjoyed, while also attaching real-world value to C++ best practices that help you avoid common pitfalls in the language. Since most bugs seemed to be memory-based, he was particularly strict about that.
One classmate, a good friend and an actual proper developer nowadays, would ALWAYS forget to free his resources... ALWAYS. For whatever fucking reason he would just ignore that shit, regardless of how much the instructor made a point of it.
At one point, during a virtual lecture, the dude addressed a couple of students, but when he got to my boy in particular he said: "You are the reason why people are praying to Mozilla and Hoare to release Rust as fast as possible into a suitable alternative for high-performance code in C++. WHY won't you pay attention to how you deal with memory management?"
And it stuck with me. I am merely a recreational C++ dev; most of my professional work is web development, so I can't attest to all the additional unsafe code people encounter in the wild when dealing with C++ at a professional level.
But in terms of the common criticisms of C and C++, for which working with memory is so important, wouldn't you guys say the problem comes more from people just not knowing what they are doing than from a fault of the language itself?
I see the merits and beauty of Rust, I truly do. It is a fantastic language, with a standardized build system and a lot of good design put into it. But I can't really fathom it being the C++ killer; if anything, the real C++ killers are bad devs who just don't know what they're doing or miss shit.
What do y'all ninjas think?8 -
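For what it's worth, the classic answer to the classmate who "ALWAYS forgets to free" is RAII: tie the resource to an object's lifetime and the language releases it for you, deterministically, no GC involved. A minimal sketch with a hypothetical file wrapper:

```cpp
#include <cstdio>

// The destructor runs on every scope exit -- early return, exception, whatever --
// so releasing the resource is no longer something you can forget.
struct File {
    std::FILE* f;
    explicit File(const char* path) : f(std::fopen(path, "r")) {}
    ~File() { if (f) std::fclose(f); }
    File(const File&) = delete;             // no accidental double-close
    File& operator=(const File&) = delete;
};

int main() {
    File log("data.txt");  // hypothetical file name
    if (log.f) {
        // ...read from log.f...
    }
}  // fclose happens here, always
```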
When I was about 10, I used to read these magazines with code listings for programs, and the only things I really understood were these text adventures that I imagined to be of Zork-like quality (gasp!). In reality, it was more like the choose-your-own adventure books of the time (which were actually pretty cool, and had pretty tight memory management). At one time, on a vacation somewhere in the eighties, I got tired of playing in the river with my friends and instead opted to continue writing lines of BASIC in a little paper notebook, inside my parents' car (at 34 degrees C), trying to perfect a storyline about my little brother and his pet dog he got for his most recent birthday, fighting the cat empire etcetera etcetera. Weird looks, good times.
-
Trying to develop Java applications in Windows...
sed 's/Java applications/anything/g'
'sed' is not recognized as an internal or external command, operable program or batch file
*sigh*3 -
So another story about college and stupid team assignments that I have to be responsible for dealing with.
So we had an assignment in the Operating Systems 1 course about memory management, and we were a team of 3. Then came the time to discuss this assignment with the TA, and the night before I had to stay up finishing a project in software engineering (they literally just hand us the description of a big project, because that's what the course "teaches", and I had to finish it in one all-nighter alone because my teammates just gave up).
When the discussion time came I was really tired, and the TA asked me something really simple. I answered, but she told me I was wrong, so I thought about it a bit and then insisted that what I said was right! She then asked my teammate (who is supposed to be a good friend), "did he say the right thing?", and his answer was a definitive "NO, he's wrong". Then he started giving the right answer, which I swear was the same thing I'd said in different words, so I repeated that I was right and said I had just phrased it differently. She took that as an insult and said I was shouting at her and being disrespectful.
When we finished I asked my friend if he heard me say it wrong and he said "I'm sorry but I didn't even hear what you said and I was afraid" WHAT THE FUCK, he just said that I was wrong to please her and make her feel like she is right and I had to be the wrong one even though I said it right but NOoo her pride is more important
All this was last semester, and the second semester just started today. I walk into Operating Systems 2, and guess what? The TA got her doctorate and is now the professor for OS 2, when she doesn't even understand anything.
Really, FUCK the academic system. It feels more like a grind than actually gaining mastery of a subject.2 -
I would like to stop and genuinely thank the devs and anyone that contributed to NW.js for allowing users to work outside the sandbox. Fucking sandboxes these days make developing editors and tooling a bunch of bullshit hassle. I understand why, but it makes an entire class of software that much more difficult to develop.
And on a semirelated note, I decided to go with nw.js because unlike electron, I don't have to tell users "just install these two gigabytes of npm dependencies *from off the net after already downloading the main application*, dependencies that could break at any time at all for any reason."
Does anyone even bundle their dependencies any more or is this something only clinically insane people like myself do?
Because last I checked most users still don't know how to debug console autobarf when a single command goes awry due to something obscure like a version conflict between two brittle cogs in the organ grinder known as package management.
Edit: also, nw.js startup times and memory requirements are relatively sane compared to electron.3 -
Windows rant incoming!
For fuck's sake! I think Windows has asked me 117 times if I want to update now. The answer is still fucking no!
And I don't care how much of a security improvement it might be, when your shitty update causes a Memory Management error.
So fuck off, stop minimising my game while I play and go fix your shitty update first!
Fuck you Microsoft, fuck your QA team and while I'm at it, I want to say fuck you to all versions of Windows Server as well!5 -
One of the things that I like the most about Clojure (and most Lisps, to be honest) is how "not for beginners" the ecosystem feels.
Don't get me wrong: setting up a project in lein with dependencies (both internal and external) is a cakewalk, installing lein or boot is a cakewalk, and setting environment consts, middleware, etc. etc. is a cakewalk.
It's just that there are no blogs about convoluted and amateurish ways of doing things. Most presentations and articles are written by really experienced and talented individuals.
I dunno, it's just a nice shift in community. It's nice to see people not fucking up object-oriented programming in Java or any of the other OOP languages. It's nice not seeing people give horrible advice regarding memory management in C or C++, and it is sure as shit nice to not see spaghetti PHP and JS code.
And my productivity levels are off the charts man. Really liking this shit and I get to stay inside my JVM -
A dev life in Queen songs:
„A Kind of Magic“ - Build successful
„A Winter’s Tale“ - Key Account Manager visits customer
„Action This Day“ - Release day
„All Dead, All Dead“ - System down
„Another One Bites the Dust“ - kill -9 4711
„Breakthru“ - 10 hour debuging session
„Chinese Torture“ - Microsoft Office
„Coming Soon“ - Client asks for delivery date
„Dead on Time“ - shutdown -t 10
„Doing All Right“ - How's the progress on the new feature?
„Don’t Lose Your Head“ - git push -f
„Don’t Stop Me Now“ - In the zone
„Escape from the Swamp“ - Hand in resignation letter
„Forever“ - while(1)
„Friends Will Be Friends“ - friend class Vector;
„Get Down, Make Love“ - No rule to make target "Love"
„Hammer to Fall“ - Release day
„Hang on in There“ - 2 weeks until release
„I Can’t Live With You“- Microsoft
„I Go Crazy“ - Microsoft
„I Want It All“ - Google
„I Want to Break Free“ - free( (void*) 0xDEADBEEF );
„I’m Going Slightly Mad“ - Impossible feature requested
„If You Can’t Beat Them“ - Impossible feature promised by sales
„In Only Seven Days“ - Impossible feature ordered
„Is This the World We Created...?“ - Philosophic moments
„It’s a Beautiful Day“ - Weekend
„It’s a Hard Life“ - Weekday
„It’s Late“ - Deadline was last week
„Jesus“ - WTF?
„Keep Passing the Open Windows“ - Interprocess communication
„Keep Yourself Alive“ - Daily struggle
„Leaving Home Ain’t Easy“ - Time to get up and go to work
„Let Me Entertain You“ - Sales meets customer
„Liar“ - Sales
„Long Away“ - Project start
„Loser in the End“ - Dev
„Lost Opportunity“ - Job ad
„Love of My Life“ - emacs/vim
„Machines“ - Computer
„Made in Heaven“ - git
„Misfire“ - Unhandled exception at memory location 0xDEADBEEF
„My Life Has Been Saved“ - Google drive/Facebook
„New York, New York“ - Meeting at customer
„No-One But You“ - Bus factor = 1
„Now I’m Here“ - Morning rush hour
„One Vision“ - Management goals
„Pain Is So Close to Pleasure“ - NullPointerException
„Party“ - Delivery completed
„Play the Game“ - Customer meeting in-house -
„Put Out the Fire“ - Support hotline
„Radio Ga Ga“ - GSM/GPRS/UMTS/LTE/5G
„Ride the Wild Wind“ - Arch Linux
„Rock It“ - Linux
„Save Me“ - CTRL-S/CTRL-Z
„See What a Fool I’ve Been“ - git blame
„Sheer Heart Attack“ - rm -rf /
„Staying Power“- UPS
„Stealin’“ - Stack Overflow
„The Miracle“ - It works
„The Night Comes Down“ - It doesn't work
„The Show Must Go On“ - Project cancelled
„There Must Be More to Life Than This“ - Philosophic moments
„These Are the Days of Our Lives“ - Daily routine
„Under Pressure“ - 1 day until release
„Was It All Worth It“ - Controlling
„We Are the Champions“ - Release finished
„We Will Rock You“ - Sales at customer
„Who Needs You“ - HR
„You Don’t Fool Me“ - Debugging session
„You Take My Breath Away“ - rm -rf /
„You’re My Best Friend“ - emacs/vim4 -
C++ might give you very fine-grained control over memory, but it's an absolute pain in the ass. When the source has thousands of lines of code, it's close to impossible to manage the memory by hand. The errors are so vague.
And porting the code to a new hardware is an absolute nightmare.1 -
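To be fair, modern C++ makes that manageable if you let RAII do the bookkeeping instead of tracking every allocation by hand. A minimal sketch of what I mean (assuming C++14 for std::make_unique; Sensor is just a made-up example type):

    #include <iostream>
    #include <memory>
    #include <vector>

    struct Sensor {
        explicit Sensor(int id) : id(id) { std::cout << "open " << id << "\n"; }
        ~Sensor() { std::cout << "close " << id << "\n"; }
        int id;
    };

    int main() {
        auto s = std::make_unique<Sensor>(1);  // freed automatically, even on early return or exception

        std::vector<std::unique_ptr<Sensor>> fleet;  // same idea, scaled up
        fleet.push_back(std::make_unique<Sensor>(2));
        fleet.push_back(std::make_unique<Sensor>(3));
    }  // everything is released here: no manual delete, no leak, no vague double-free error

Once ownership is expressed in types like this, the compiler does the memory management for you, and most of the vague errors disappear along with the manual frees that caused them.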
You know shit is going to hit the fan when the sentence "c++ is the same as java" is said, because fuck all the underlying parts of software, it's all the fucking same. Oh, and to write a newline in bash we don't use \n or anything, we just put an empty echo in there. And fuck this #!/bin/bash line, I'm a teacher, I don't need to know how shit works to teach shit. Let's teach 'em you need stdio for printf even tho it compiles fine without it on linux (wtf moment number one, asking 'em leaves you with "dunno..."), and as someone who knows C you look at your terminal questioning everything you ever learned in your whole life. And then we let you look into the binaries with ldd and all the good stuff, but we won't explain why you can see a size difference in the compiled files even tho you included stdio in the second one, and all symbol tables show the exact same thing, but dude chill, we don't know what's going on either.
Oh and btw, don't use directory names different from the ones in our examples. You won't find your own path; there is no tab key you can press to auto-fill shit.
But that's not everything. How about we fill a whole semester with "this is how to printf" but make you write a whole game with unity and c#. (not taught even the slightest bit until then btw)
Now that you half-assed everything because we put you in a group full of fucks who don't even know what a compiler is but want to tell you you don't know shit and show you their non-working unfinished algorithms in some not-even-syntax-correct java...
...how about we finally go on with Algebra II: complex numbers, how they are going to fuck up your life, how we can do roots of negative numbers all of the sudden and let you do some probability shit no one ever fucking needs. BUT WHY DON'T YOU KNOW EVERYTHING ALREADY HMMMMM, IT'S YOUR SECOND LESSON, YOU WENT TO SCHOOL PLS BE A MATH PRO ASAP CUS YOU NEED IT SO MUCH BUT YOU DON'T NEED TO KNOW PROPER SYNTAX, HOW MEMORY MANAGEMENT WORKS, WHAT A REFERENCE IS AND PLS FINALLY FORGET THE WORD "ALLOCATION" IT DOESN'T PLAY A SINGLE ROLE YOU ARE STUDYING SOFTWARE DEVELOPMENT WHY ARE YOU SO BAD AT ECONOMICS IT MAKES NO SENSE I MEAN YOU HAD A WHOLE SEMESTER OF HOW TO GREET SOMEONE IN ENGLISH, MATHS > ECONOMICS > ENGLISH > FUCKING SHIT > CODING SKILL THATS HOW THE PRIORITIES WORK FOR US WHY DON'T YOU GET IT IT MAKES SO MUCH SENSE BRAH4 -
C++. Damn the pointers. It's because I learned Java before C++, so the memory management in C++ never clicked for me: object creation, memory allocation, deallocation and everything.4
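What finally made the difference for me was writing the three lifetimes side by side. A tiny sketch (Point is just a hypothetical type):

    #include <memory>

    struct Point {
        Point(int x, int y) : x(x), y(y) {}
        int x, y;
    };

    int main() {
        Point a(1, 2);                           // stack: destroyed automatically at end of scope

        Point* b = new Point(3, 4);              // heap: looks like Java's `new`, but YOU must free it
        delete b;                                // forget this line and it leaks

        auto c = std::make_unique<Point>(5, 6);  // heap, but freed automatically: closest to the Java feel
        return 0;
    }

Coming from Java, the trap is that `new` looks familiar but carries none of the garbage collector's safety net.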
-
Today I watched "the birth and death of js":
https://destroyallsoftware.com/talk...
Here Gary Bernhardt talks about compiling executables to asm.js and about running the compiled files using a js interpreter that can be included in the kernel.
Eventually, some responsibility can be moved from the kernel to this interpreter, responsibility like virtual memory and trap management.
This speech aims to be fun, so not everything should be taken seriously...
but...
but...
(Forgive me)
...this trick seems to be a nice idea, and projects like Node OS work along the same lines.
So now, would you even consider this? Or is it just something that will be nothing more than the craziness of a madman?1
Our client has put me and some others through at least 10 hours of passive lectures on non-development server management, nothing hands-on included. I realized today that I may be responsible for replicating some of the management processes (while supervised), and I think I start tomorrow. I didn't take notes on any of these meetings because the level of detail was overwhelming. I suck at communicating on time--can I still tell my PM I have about 5% memory retention on these lectures or did I royally screw up?
-
Does anyone here have any good resources for an introduction to embedded, low-level development, or anything on advanced C concepts? I've been having trouble stepping into more complicated topics like bit manipulation and the things I can do with memory management. Also, any advice is appreciated.30
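For anyone else stuck on the same thing, the one part I did manage to figure out is that bit manipulation mostly boils down to three mask idioms. A minimal sketch (written as C++ here; the same lines work in plain C):

    #include <cstdio>

    unsigned set_bit(unsigned v, int n)   { return v |  (1u << n); }  // force bit n to 1
    unsigned clear_bit(unsigned v, int n) { return v & ~(1u << n); }  // force bit n to 0
    bool     test_bit(unsigned v, int n)  { return (v >> n) & 1u; }   // read bit n

    int main() {
        unsigned reg = 0;                    // pretend this is a hardware register
        reg = set_bit(reg, 3);               // 0b1000 == 8
        std::printf("%u %d\n", reg, test_bit(reg, 3));  // prints: 8 1
        reg = clear_bit(reg, 3);             // back to 0
        return 0;
    }

Nearly every register-poking trick in embedded code seems to be some combination of these three.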
-
How do you organize your downloads folder?
Personally, I make a new folder with some name (although the name actually being useful is rare), then just select all of my files and dump them there. Finding a file sucks so much though; I can never remember their names, so I just look through the folders at the icons and hope I find the file I'm looking for. This mess that is my downloads folder has led to looking 5 times in a folder to find a file.
My DOS VM is more organized than that...
Speaking of DOS, managing memory in it is hell. I've never had memmaker detect 64MB of RAM; giving the VM 96MB of RAM made it detect 2 more MB or something.5
for my job I need to know,
Programming, C#, Optimization, Multithreading and Async code, Working with certain tools, Reading difficult written code, Understanding physics, Networking, Rendering, Code loops, Memory management, Profiling tools, Being able to make Jira tickets and read Jira tickets, Understanding source control branching, merging, push and pull, Bug fixing.
And I write almost 1 line of code a week on average..
I'm a programmer.2 -
How about incompetent management? The company absolutely murders any possible increase in productivity. Laptop provided? Slow as balls; takes minutes to log in. I get a Mac for mobile development and that's OK (SSD and adequate memory), but I'm primarily a .NET dev. Can't get on the network with a virtual machine. They won't install even a managed image. So I can't use databases, because they're all AD-authenticated. Got a virtual desktop environment, and that performs even worse than the laptop. Add the assault on local administration rights and the monitoring software that constantly thrashes any memory and hard drive headroom, and I'm about to quit over all this... All of this decided by a non-developer, without asking for our opinions. Yay, large enterprises
-
I'm thinking of designing a programming language.
I want it to have easy-to-read syntax like Python, inheritance and interfaces like Java, and more advanced concepts like pointers and manual memory management like C++.
I was originally going to write my own compiler, but I figured it's not worth reinventing the wheel. So the current plan is to basically just create a parser that turns a source file into C++ code, which is then compiled with g++. The only problem I can think of with that is catching runtime errors.
How does this language sound?
My goal is a language that is as easy to read as Python but with the speed of a compiled program and the ability to use it for embedded projects. Reading larger C++ projects can be quite time consuming, so I figure the trade-off of taking a little longer to write more obvious code is better than having a lot of syntax that can be tough to walk through the logic of (I find this often with C and C++; it's not that I don't figure it out, it just takes longer than reading and understanding Python does.)4
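To make the transpile-to-C++ plan concrete, here is a hand-written sketch of the kind of output the parser could emit (the commented source syntax is hypothetical, not a real language yet):

    #include <iostream>

    // source (hypothetical syntax):
    //     def add(a: int, b: int) -> int:
    //         return a + b
    //     print(add(2, 3))

    // generated C++:
    int add(int a, int b) {
        return a + b;
    }

    int main() {
        std::cout << add(2, 3) << "\n";  // prints 5
        return 0;
    }

For the runtime-error problem, one common trick is to have the generator emit #line directives, so g++ diagnostics and debugger line tables point back at the original source file instead of the generated C++.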
Identifying when to start a project over because it has gotten out of hand with workarounds and memory management issues.
-
When a DevOps engineer finds memory leaks in the application that crash services, and management responds with "Let's Scale The Application".1
-
I fucking hate being put on the spot. I'm trying my best over here to learn and improve but I don't know my entire project by memory and how every single little thing works, and it makes me feel like shit constantly having to say "I don't know" when asked about task estimates and work difficulty
Now I've made myself look like an incompetent moron, because it's stressful and I screwed up the one thing I was left in charge of
Christ man since when did programming become a social management activity?4 -
Garbage collection incentivizes shit and cuckold programmers. Change my mind.
The reason, basically: it's easy to design a bad architecture, and the potential bugs are just delayed, waiting to happen later. There are also still resources like database connections, whose management works more or less like memory, and which you never learn to handle properly because of GC.15
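Case in point, a minimal C++ sketch (Connection is a made-up stand-in for a real DB handle) of the deterministic cleanup a GC never forces you to learn:

    #include <iostream>

    class Connection {
    public:
        Connection()  { std::cout << "connect\n"; }
        ~Connection() { std::cout << "disconnect\n"; }  // runs at scope exit, exception or not
        void query(const char* sql) { std::cout << "run: " << sql << "\n"; }
    };

    int main() {
        Connection db;         // acquired here
        db.query("SELECT 1");
        return 0;
    }                          // released right here, deterministically, not whenever a collector gets around to it

Memory is the one resource a GC handles for you; connections, sockets and file handles still need exactly this discipline.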
Sydochen has posted a rant where he is not really sure why people hate Java, and I decided to publicly post my explanation of this phenomenon, from my point of view.
So there is this quite large domain, on which one or two academical studies are built, such as business informatics and applied system engineering which I find extremely interesting and fun, that is called, ironically, SAD. And then there are videos on youtube, by programmers who just can't settle the fuck down. Those videos I am talking about are rants about OOP in general, which, as we all know, is a huge part of studies in the aforementioned domain. What these people are even talking about?
It's absolutely obvious that there is no sense in building software in a linear pattern. Since Bikelsoft has conveniently patched consumers up with GUI-based software, the core concept of which is EDP (event-driven programming or, at the very least, OS event queueing), a completely functional, linear approach in such an environment does not make much sense in terms of the maintainability of the software. Uhm, raise your hand if you ever tried to linearly build a complex GUI system in a single function call on GTK, which does allow you to disregard any responsibility-separation pattern of SAD, such as the long-loved MVC...
Additionally, OOP is mandatory in business because it allows us to mount abstraction levels and encapsulate the actual dataflow behind them, which, of course, lowers development costs.
What happy programmers usually talk about is the complexity of doing OOP right, in the sense of an overflow of straight composition classes (that do nothing but forward data from lower to upper abstraction levels and vice versa) and the broken responsibility chain (when a class from a lower level directly!! notifies a class of a higher level about something, ignoring the fact that there is a chain of other classes between them). And that's it. These guys also vouch for functional programming, which is a completely different argument; there is no reason not to use it in the algorithmic, implementational part of the project, of course, but yeah...
So where does Java kick in you think?
Well, guess what language popularized programming in general and OOP in particular. Java is doing a lot of things in a modern way. Of course, if it's 1995 outside *lenny face*. Yeah, fuck AOT, fuck memory management responsibility, all to the maximum towards solving the real applicative tasks.
Have you ever tried to learn to apply TextWatchers in Android with Java? Then you know about inline method overriding and inline abstract class implementation. This is not right. It reduces readability and reusability.
Have you ever used Volley on Android? Newbies to Android programming surely should have. Quite verbose boilerplate in google docs, huh?
Have you seen intents? The Android API is, to put it mildly, messy, with all the support libs and Context class ancestors. Remember how many times the language has helped you properly orient yourself in all of this hierarchy, while an overriding method declaration requires you to use 2 lines instead of 1. Too verbose, too hesitant, distracting - that's what the lang and the API are. Fucking toString() is hilarious. Reference comparison is unintuitive. Obviously poor practices are not banned. Ancient tools. Import hell. Slow evolution.
C# has ripped Java off like an utter cunt, yet it's a piece of cake to maintain solid patternization and structure there, and to keep your code clean and readable. C# 6 already featured optionally nullable fields and safe optional dereferencing, while we finally got lambda expressions in Java 8, in 20-fucking-14.
Java did good back then, but when we joke about dumb indian developers, they are coding it in Java. So yeah.
To sum up, it's easy to make code unreadable with Java, and Java is a tool with which developers usually disregard the patterns of SAD. -
Memory debugging on iOS probably makes me more anxious and stressed out than anything. I have put 11 hours into attempting to figure out this crash, but still no progress. It's like I can feel management breathing down my neck to get it done asap. You ever get so stressed out while trying to figure something out at work?3
About to start writing a report for my programming languages course, I’m writing it over GoLang, If anybody has any good resources for any information on Go, let me know!
The report covers the history, paradigms, features, memory management system, and anything else I can possibly find about the language. I can find some pretty decent references in the footer of the Wikipedia article, but I wanted to see if anybody who has actually used Go has anything they'd like to share.
Thanks :)1 -
Things I say to my clients when I know that a reboot is required to fix their issue but I don't have enough evidence to prove it to them :
"... On any computing platform, we noted that the only solution to infinite loops (and similar behaviors) under cooperative preemption is to reboot the machine. While you may scoff at this hack, researchers have shown that reboot (or in general, starting over some piece of software) can be a hugely useful tool in building robust systems.
Specifically, reboot is useful because it moves software back to a known and likely more tested state. Reboots also reclaim stale or leaked resources (e.g., memory) which may otherwise be hard to handle. Finally, reboots are easy to automate. For all of these reasons, it is not uncommon in large-scale cluster Internet services for system management software to periodically reboot sets of machines in order to reset them and thus obtain the advantages listed above.
Thus, when you indeed perform a reboot, you are not just enacting some ugly hack. Rather, you are using a time-tested approach to improving the behavior of a computer system."
😎1 -
Currently having very funny project lead, who gives on the spot estimates for 9 years old very pathetic quality code having Android app in security domain. Memory leaks, bad practices, typos, CVEs etc. you name it we have it in our source of the app.
Over the last 5-6 sprints of our project, almost 50% of user stories were incomplete due to underestimation.
Basically, everyone in management has been asleep about code quality for the last 7-8 years, & now that a new Dev & QA team is here, they want us to fix everything ASAP.
The most humorous thing is that the product owner is aware of the importance of unit test cases, but doesn't want to allocate user stories for them at sprint planning, since according to him the code is almost frozen for the current release.
Actually, he has done the same thing every sprint since the last release; around 18 months have passed & he still hasn't spared a single day for unit testing.
Recently an app crash was found in the version-upgrade scenario, because the QAs are so worn out manually testing hundreds of basic, trivial test cases (plus server-side testing) that they can't do the actual needful testing, which is also the kind that's tougher for Devs to automate.
Recently, when the team's old MacBook Pros expired, higher management allocated Intel Mac minis, saying that a few people in the organization were misusing MacBooks. So everyone has to suffer for just a few people, as there is no flexibility for frequently switching between WFH & WFO. 1 of those Mac minis overheated & has been in repair for 6 months.
Out of 4 Devs & 3 QAs, all 3 QAs & 2 Devs had left gradually.
I think it's time to say goodbye 😔3 -
so recently i have been through memory management hell. maybe i should rethink pointers and stuff.
long story short: i made a calendar app using SDL2, and the program is written in C++. when rendering textures using fonts, i dereferenced null pointers to the font.
i will implement events in the future and if you have any suggestions or some advice leave it in the comments, the feedback helps me a lot!
anyways you might give it a try (i am sorry about the makefile not working, i built the app on windows and needed to link against the folder where sdl is located): https://github.com/zetef/calviewer3 -
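For anyone hitting the same wall, the fix on my end was simply guarding the font load before using it. A minimal sketch (SDL2_ttf; the font path is made up):

    #include <SDL2/SDL_ttf.h>
    #include <cstdio>

    int main() {
        if (TTF_Init() != 0) {
            std::printf("TTF_Init failed: %s\n", TTF_GetError());
            return 1;
        }

        // a bad path returns NULL here instead of crashing later in the render call
        TTF_Font* font = TTF_OpenFont("assets/some_font.ttf", 16);
        if (!font) {
            std::printf("font load failed: %s\n", TTF_GetError());
            TTF_Quit();
            return 1;
        }

        // ... create surfaces/textures with the (guaranteed non-null) font ...

        TTF_CloseFont(font);
        TTF_Quit();
        return 0;
    }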
I signed up for a uni course on MATLAB, and the prof just changed his mind because he is braindead enough not to know that MATLAB is not free, and decided to go with fucking C, because, you know, there hasn't been a fucking assignment in my uni that wasn't in C except the OOP one. Miss me with that shit. I'm now going to develop automatic memory management and OOP libraries in C, because if I write another program without proper OOP I will kill myself.4
-
Microsoft announced a new security feature for the Windows operating system.
According to a report by ZDNet: named "Hardware-Enforced Stack Protection", the feature allows applications to use the local CPU hardware to protect their code while it runs in memory. As the name says, its primary role is to protect the memory stack (where an app's return addresses and call data are stored during execution).
"Hardware-Enforced Stack Protection" works by enforcing strict management of the memory stack through a combination of modern CPU hardware and shadow stacks (copies of a program's intended execution flow).
The new "Hardware-Enforced Stack Protection" feature plans to use the hardware-based security features in modern CPUs to keep a copy of the app's shadow stack (intended code execution flow) in a hardware-secured environment.
Microsoft says that this will prevent malware from hijacking an app's code by exploiting common memory bugs such as stack buffer overflows, dangling pointers, or uninitialized variables which could allow attackers to hijack an app's normal code execution flow. Any modifications that don't match the shadow stacks are ignored, effectively shutting down any exploit attempts.5 -
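For context, a deliberately buggy minimal sketch of the class of bug this targets; the overflow below can clobber the saved return address, which is exactly the mismatch against the shadow stack that the hardware would flag:

    #include <cstdio>
    #include <cstring>

    void parse(const char* input) {
        char buf[8];
        std::strcpy(buf, input);  // BUG: no bounds check; anything past 7 chars spills over buf
                                  // and can overwrite the saved return address on the stack
        std::printf("parsed: %s\n", buf);
    }

    int main() {
        parse("AAAAAAAAAAAAAAAAAAAAAAAA");  // corrupts the stack frame; control flow becomes hijackable
        return 0;
    }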
[CONCEITED RANT]
I'm frustrated that I'm better than 99% of the programmers I have ever worked with.
Yes, it might sound so conceited.
I work mainly with the C#/.NET ecosystem as a fullstack dev (so also SQL, backend, frontend, etc.), but I'm also forced to use that abhorrent horror that is JS and Angular.
I write readable code, I write easy code that works and rarely, RARELY, causes any problem. The only fancy stuff I do is using the new language features that come with new C# versions, which in the latest versions were mostly syntactic sugar to make code shorter/more readable/easier.
Most people I have ever worked with (and that's a lot of people) try to overdo, overengineer and overcomplicate code, subdividing it into methods when not needed, fragmenting the code and putting in tons of variables.
People only needed me to explain my code when the codebase was huge (200K+ lines, mostly written by me), so they didn't have to spend hours understanding what's going on, or, if the customer requested a new technology, to explain that technology so they didn't have to study it (which is perfectly understandable). (For example, I was once forced to use the DevExpress package because they wanted to port a huge application from .NET 4.5 to .NET 8, and rewriting the whole DevExpress logic had a HUGE impact on costs, so I explained it thoroughly and supported the team during development because they didn't know DevExpress.)
I don't write genius code or clever tricks and patterns. My code works, doesn't create memory leaks or slowness, and mostly passes unit tests on the first run. Of course I also introduce bugs and everything, but that's part of the process.
The point is that other people make unreadable code, and when their code gets passed around you hear rising chaos, people cursing "WTF does this even mean, why did he put that here, what the heck is this even supposed to do", you get the drill. And this happens when I read everyone else's code too.
But the opposite doesn't happen. My code is often readable because I only do code triple backflips on personal projects, where I don't have to explain anything to anyone and I can learn new things and new coding styles.
Instead, people want to impress at work, and this results in unintelligible, chaotic code, full of bugs, that people can't read. They want to mix in the coolest technologies because they feel their virtual penis growing as they show off that they are the latest bleeding-edge technology experts and all.
They want to experiment on business code at the expense of all the other poor devils who will have to manage it.
Heck, I even worked with a few Microsoft MVPs.
Those are deadly. They're super-fast, high-code-throughput people who combine lots of stuff.
Then they leave the problems to you once they're gone.
This MVP guy, on a big digital paperwork-acquisition project for a big company that I got called in to work on, which consisted of a backend and a frontend web portal, pushed at all costs to put another CDN web project and another Identity Server project in the middle, the CDN to do caching "to make it faster" and Identity Server for SSO (single sign-on).
We had to do gruesome work to deal with the browser's poor cache management, and when he left, the SSO server started to loop after authentication at random intervals, and I had to fix that nasty stuff he put in with days of debugging.
People definitely can't code, except me.
They have this "first of the class" syndrome, which goes beyond the extent their skill allows, and they try to do code backflips when they can't even do code push-ups, to put it in physical-exercise terms.
And most people are like this. They will deny it and won't admit it; they believe they're good at it, but in reality they aren't.
There are some geniuses out there who write revolutionary code and maybe need to write horrible code to do amazing stuff, and that's ok. And there are also a few people like me, with whom you can work and produce great stuff.
I found one colleague like this, and we had an $800,000 (yes, 800k) project in .NET technology, which consisted of the renewal of 56 web services, 3 web portals and 2 WinForms applications for our country's main railway transport system. We worked on it as a pair, with a PM from the railway company.
It was estimated at 14 months of work, we took 11, and everything worked wonders. We had a ton of fun doing it, partly because their PM was a cool guy, and we delivered an awesome project with a codebase that was a jewel. The only thing you couldn't grasp from reading the code is how railway systems work, and that's the only difficult part.
Sigh, these people are making me sick of this job.11
You can make your software as good as you want; if its core functionality has one major flaw that cripples its usefulness, users will switch to an alternative.
For example, an imaginary file manager that is otherwise the best in the world becomes far less useful if it imposes an arbitrary fifty-character limit for naming files and folders.
If you developed a file manager better than ES File Explorer was in the golden age of smartphones (before Google exercised their so-called "iron grip" on Android OS by crippling storage access, presumably for some unknown economic incentive such as selling cloud storage, and before ES File Explorer became adware), and if your file manager had all the useful functionality like range selection and tabbed browsing and navigation history, but it limited file names to 50 characters even though the file system supports far longer names, the user would have to rely on a different application for the sole purpose of giving files longer names, since renaming, as a file action, is one of the few core features of file management software.
Why do I mention a 50-character limit? The pre-installed "My Files" app by Samsung actually did once have a fifty-character limit for renaming files and folders. When entering a longer name, it would show the message "up to 50 characters available". My thought: "Yeah, thank you for being so damn useful (sarcasm). I already use you reluctantly because Google locked out superior third-party file managers likely for some stupid economic incentives, and now you make managing files even more of a headache than it already is, by imposing this pointless limitation on file names' length."
Someone at Samsung's developer department had a brain fart one day and decided it would be a smart idea to impose an arbitrary limit on file name lengths. It isn't.
The user needs to move files to a directory accessible to a superior third-party file manager just to give a file a name longer than fifty characters. Even file management on desktop computers two decades ago was better than this crap!
All of this because Google apparently wants us to pay them instead of SanDisk or some other memory card vendor. This again shows that one only truly owns a device if one has root access. Then these crippling restrictions that were made "for security reasons" (which, in case it isn't clear, is an obvious pretext) can be defeated for selected apps.2 -
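The arbitrariness is easy to demonstrate, since common filesystems allow roughly 255 bytes per file name. A quick C++17 sketch (file names are made up) showing that a 100-character rename is fine at the OS level, so the 50-character cap exists only in the app:

    #include <filesystem>
    #include <fstream>
    #include <iostream>
    #include <string>

    int main() {
        namespace fs = std::filesystem;
        std::ofstream("short.txt") << "hi";     // create a throwaway file

        std::string longName(100, 'a');         // 100 chars: double Samsung's cap,
        longName += ".txt";                     // still far under ext4/NTFS/exFAT limits

        std::error_code ec;
        fs::rename("short.txt", longName, ec);  // the filesystem has no problem with this
        std::cout << (ec ? ec.message() : std::string("renamed fine")) << "\n";
        return 0;
    }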
I had been assigned a task to create a cross-platform desktop application that keeps track of the expiry of a certain product and notify in real-time.
So, my journey to create such an application starts today and the list below describes the first few hours.
1. Google/Date and time in javascript
2. Google/Javascript date object
3. W3school/Time in javascript
4. W3school/Javascript date getTime() method
5. Google/Are electron.js applications platform independent
6. Google/Dart for desktop applications
7. Google/Is dart cross-platform
8. Google/Best desktop application framework
9. Google/Python for desktop app development
10. Freecodecamp/How to build your first desktop application in python
11. Google/Pyqt
12. Google/Which is the best technology to build cross-platform desktop application
13. Google/Cross-platform desktop app development for windows mac and linux
14. Udemy / cross platform desktop app development for windows mac and linux
15. Youtube/ electron desktop app, demo
16. Youtube/ electron.js is obsolete
17. Youtube/Neutralinojs
18. Youtube/ neutralinojs tutorial
19. Google/Neutralinojs or electronjs
20. Google/Math.js
21. Google/Math.js/JS Bin
22. Google/Cannot find package “math.js”
23. StackOverFlow/How do I resolve “cannot find module” error using Node.js
24. Google/ is it better to install npm packages locally
25. Quora/ why should you stop installing NPM packages globally
26. Google/ what is nvm
27. Google/nvm version check
28. Stackoverflow/node version management on windows
29. Github/coreybutler/nvm-windows: a nvm for windows. Ironically written in Go
30. Google/how to uninstall a npm package
31. Npm docs/uninstalling packages and dependencies
32. Google/require in javascript
33. Youtube/how to install electronjs
34. Youtube/electronjs in 100s(fireship.io)
35. Roryok.com/electronjs memory usage compared to other cross-platform frameworks
36. Google/is electronjs memory hungry
37. Youtube/sql in one hour
38. Youtube/learn sql in 60 mins
39. Geeksforgeeks/connect mysql with node app
40. Stackoverflow/How to return to previous directory using cmd
41. Stackoverflow/how to require using const
42. Geeksforgeeks/difference between require and es6 import and export
TO BE CONTINUED...1 -
Debugging a WCF app, I could not for the life of me get its main window up. Then I realized it was freaking Photoshop that had placed itself on top of it. PS is great for editing images, but its window and memory management is useless, at least on Windows, having originally been developed for the Mac.
-
Had to face the music and make the jump from Ubuntu 22.04 to Fedora 36, and I have to say it's been night and day so far. Everything is snappier. Yeah, dnf is very slow compared to apt, but there are changes you can make to speed things up, and the nifty terminal interface is a great change that helps make up for the speed issues.
It came with Python 3.10 installed, the Gnome and gtk4 apps are nice, fluid and up to date, and the random slowdowns, freezes and restarts of Ubuntu running its version of Gnome are nonexistent.
For the life of me I can't see why Ubuntu would drop the ball like this. I have a Dell XPS 13 Developer Edition and this is the best it's ever run. Even wifi connectivity is better despite the crap WiFi card that ships with this machine.
I want to love this version, and while it is the most graphically appealing and functional version of Ubuntu I've ever used, the memory management issues make it damn near unusable.9
Not a rant but just curious to see what mobile OS do all you devs on devRant use?
I recently moved from Android to iOS. Doing so has made me realise how much I loved using Android but due to its inherent stuttery lag, inferior standby time and memory management, I will be staying with iOS for now.
What are your reasons for using the mobile OS that you are using now?7
In Python my favs are lxml and sqlalchemy. Very simple to use, and they solve the XML and database problems very easily.
In C++ my fav is the Qt system of libraries. Very nice solutions for structuring an application, cross platform, good memory management, COW if you want it, etc. Every release gets better and better. Oh and boost should get an honorable mention. I really like the ranges library and some of the more esoteric libs it provides.2 -
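Qt's COW is worth a quick illustration. A minimal sketch with QString (assumes QtCore is available and linked):

    #include <QString>
    #include <QDebug>

    int main() {
        QString a(100000, QChar('x'));  // one big allocation
        QString b = a;                  // no copy yet: both share the same buffer

        qDebug() << (a.constData() == b.constData());  // true: shared

        b[0] = QChar('y');              // first write detaches: the deep copy happens only now
        qDebug() << (a.constData() == b.constData());  // false: detached
        return 0;
    }

That implicit sharing is why passing Qt value types around by value is usually cheap.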
So, like, why doesn't Java let me do manual memory management? In C#, if I want to screw up the code-base and everyone that comes after me with my half-informed experiments, it totally lets me.21
-
Calling C a "high level language" is complete bullshit. 99.9% of all code is written in C or in languages at a higher level than C.
What counts as a "high level language" is not objectively definable. So this arbitrary division splits programming languages into two halves of astronomically different sizes.
It may have been a good decision in the 70s but it's completely off nowadays. I propose to draw the line between languages with manual and languages with automatic memory management.10 -
I’ve made a list of things I want to learn. Languages, frameworks, etc and i don’t really have too many things on the list that way I can learn them well.
I’ve been struggling with this choice because I’m honestly not sure whether or not I should consider Rust to be on the list.
I like the modern features it contains, and I don’t mind the syntax.
I don’t like it’s way of memory management, I’ve heard it’s performance can be very lacking, and I’ve heard a lot of negative things about the compiler and the efficiency of the language (although I feel like efficiency comes down to the person and how the code is written)
So please redpill me on Rust and try to convince me to add it to my list because it’s close.2