Search - "memory safe"
-
Woohoo! 32k achieved!!! Finally I can post some new rant without risking some sudden overshoot 😁
So putting celebrations aside for a minute, a while ago I noticed a tingle when I stroke my finger across metal areas of my tablet, or the sides of my phone (which probably has metal near it too) while it's charging. And it's been bugging me ever since.
Now, some things to note are that it only happens when my feet are touching the ground through slippers, and that the frequency is so low that I can actually feel the tingle when I slide my finger across the material. This to me at least seems like electricity flows through me into ground, and touching the ground directly provides a path so easy for the electrons to run away that I don't feel it at all. But if I lift my feet off the ground entirely, I just get charged up and after that, nothing else happens.
So those are my ideas. The answers on the subject on the other hand.. absolute cancer. Unsurprisingly, most of them came from Apple users. Here's some of them.
https://discussions.apple.com/threa...
- I've not noticed it, but if you're concerned bring the phone to Apple for evaluation.
- Me too facing same problem.. did u visit apple care?
And one good answer at least...
- google emf sensitivity, its real. You are right, there is a small current flowing through your body, try to limit your usage. The problem with this issue is those who aren't affected (lucky ones for now) will tell you these products are 100% safe. To a degree they are, i used my ipod touch for about 2 years straight vwith virtually no symptoms. then the tingling started and it gets worse.You will get more sensitive to progressively less powerful things. I dont want to scare you but just limit your usage like i didnt do 🙂
Overall that discussion was pretty good actually, aside from "bring it to the Genius Bar, they'll know for sure and not just sell you another unit". But then there's Reddit.
https://reddit.com/r/iphone/...
- Ok, real reason is probably that the extension cord and/or outlet is probably not grounded correctly. Either that or you are using a cheap knockoff charger.
Either use a surge protector and/or use the authentic Apple Charger.
- It's not the volts that hurt you, it's the amps
- I think you are in deep love with your phone. That tingling sensation is usually referred to as "love" in human language.
- Do less acid, I would advise.
Okay, so that's the real cancer. Grounding issue sounds reasonable despite it being wrong. Grounding is actually not needed when your charging appliance doesn't have any exposed metal parts. And isolation from the high voltage to the low voltage side actually happens through things like routing slots into the PCB to create spark gaps, and using galvanic isolation through things like optocouplers. As for a surge protector? I'm using them to protect my PC and my servers, but the only purpose they serve is to protect from.. you guessed it.. voltage surges, like lightning bolts hitting the grid. They don't do shit for grounding or reducing this tingle! What a fucking tool.
It's not the volts that kill, it's the amps.. yeah I'm sure that the debunking of that is easy to find. Not gonna explain that here. And the rest of it.. yeah it's just fucking cancer.
Now what's the real issue with this tingle? It's actually a Class-Y rated (i.e. kV rated) capacitor that sits on the transformer of any switch-mode power supply, including phone chargers. If memory serves me right, it helps with decoupling the switching noise and so on. But as it bridges the primary and the secondary side of the transformer, if the cap is sufficiently large and you are sufficiently sensitive, it can actually cause that tingle by passing a fraction of the mains electricity into your body. It's totally safe though, as the current that these caps pass is very small. But to some, it's noticeable.
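To put a rough number on it, here's a back-of-the-envelope sketch (my own illustration, not from the rant) assuming a typical 2.2 nF Y-capacitor sitting across 230 V / 50 Hz mains:

```rust
use std::f64::consts::PI;

fn main() {
    // Assumed, typical values -- not measured from any specific charger.
    let c = 2.2e-9; // Y-capacitor, farads
    let v = 230.0;  // mains voltage, volts RMS
    let f = 50.0;   // mains frequency, hertz

    // Leakage current through the capacitor: I = 2 * pi * f * C * V
    let i = 2.0 * PI * f * c * v;
    println!("leakage current ~ {:.2} mA", i * 1000.0); // ~0.16 mA
}
```

That lands around 0.16 mA: right around the threshold where a sensitive person can feel it while sliding a finger over metal, but orders of magnitude below anything dangerous.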
Hope you found this interesting! And thanks a lot for bringing me to 2^15. I really appreciate it ♥️
-
Programmers then:
No problem NASA mate, we can use these microcontrollers to bring men to the moon no problem!
Programmers now:
Help Stack Overflow, my program is kill.. isn't 90GB (looking at you Evolution) and 400GB of virtual memory (looking at you Gitea) for my app completely normal? I thought that unused memory was wasted memory!1!
(400GB in physical memory is something you only find in the most high-end servers btw)
-
Today was a day at work that I felt like I made a significant contribution. It was not a lot of code. Actually it was a difference of 3 characters.
I am developing an industrial server so that my employer can provide access to their machines to enterprise industrial systems. You know, the big boys toys. Probably in fucking java...
Anyway, I am putting this server on an embedded system. So naturally you want to see how much serving a server can serve. In this case the device is more processor starved than memory starved. So I bumped up the speed of the serving from 1000 ms to 100 ms per sample. This caused the processor to jump from 8% of one core (as read from top) to 70%. Okay, 10x the sampling, roughly 10x the CPU usage. That is good. I know some basic metrics for a certain amount of data for a couple of different sampling rates.
Now, I realized this really was not that much activity for this processor. I mean, it didn't seem to me that it "took much" to see a large increase of processor usage. So I started wondering about another process on the system that was eating 60 to 70% all the time. I knew it updated a screen that showed some rarely needed data, among the things it controlled. Most of the time it will be in a cabinet hidden from the world. I started looking at this code and figured out where the display code was being called.
This is where it gets interesting. I didn't write this code. Another really good programmer I work with wrote this. It also seemed to be a pretty standard approach. It had a timer that fired an event every 50 ms. This is 20 times per second. So 20 fps if you will. I thought: what would happen if I changed this to 250 ms? So I did. It dropped the processor usage to 15%! WTF?! I showed another programmer: WTF?! I showed the guy who wrote it: WTF?! I asked what does it do? He said all it does is update the display. He said: let's take it to 1000 ms! I was hesitant, but okay. It dropped to 5%!
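The original code isn't quoted in the rant, so this is only a hypothetical sketch of the pattern being described -- a periodic display-refresh tick where the interval is the single knob that was changed:

```rust
use std::{thread, time::Duration};

// Hypothetical stand-in for the display-refresh timer described above.
// The refresh work runs once per tick, so the tick period scales CPU
// usage almost linearly: 50 ms (~20 fps) vs 1000 ms (1 fps).
fn main() {
    let tick = Duration::from_millis(1000); // was 50
    loop {
        redraw_display();
        thread::sleep(tick);
    }
}

fn redraw_display() {
    // placeholder for the expensive screen update
}
```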
What is funny is several people all said: This is running kinda hot. It really shouldn't be this hot.
Don't assume; if you have a hunch, play with it if it's safe to do so. You might just shave off 55 to 60% CPU usage on your system.
So the code I ended up changing: "50" to "1000".
-
I propose that the study of Rust, and therefore the application of said programming language and all of the technology that comprises it, should be made because the language is actually really fucking good. Reading and studying how it manages to manipulate and otherwise use memory without a garbage collector is something to be admired, illuminating in its own accord.
BUT going for it because it is a "beTter C++" should not constitute a basis for its study.
Let me expand through anecdotal evidence, which is really not to be taken seriously, but at the same time is what I am using for my reasoning behind this. Please feel free to correct me if I am wrong, for I am a software engineer, yes, I do have academic training through a B.S. in Computer Science, yes, BUT my professional life has been solely dedicated to web development, which admittedly I do not go on about technical details of with you all because (1) I am not allowed to, and (2) it is better for me to bitch and shit over other petty development related details.
Anecdotal and otherwise non statistically supported evidence: I have seen many motherfuckers doing shit in both C and C++ that ADMIT to not covering their mistakes through the use of a debugger. Mostly because (A) using a debugger and a proper IDE is for pendejos, debugging is for putos, GDB is too hard and the VS IDE is waaaaaa "I onlLy NeeD Vim", and (B) "if an error would have registered then it would not have compiled, no?", thus giving me the idea that the most common occurrences of issues through the use of the C father/son languages come from user error, no formal training in the language, and a nice cusp of "fuck it, it runs" while leaving all sorts of issues that come from manipulating the realm of the gods: memory.
EVERY manual and book, coming all the way back to the K&R book, talks about memory and the way in which developers of these 2 languages are able to manipulate and work on it. EVERY new ISO standard of these languages deals, through community effort and standard documentation, with new features aimed at MODERN usage (meaning: no, the shit you learned 20 years ago won't fucking cut it).
THUS if your ass is not constantly checking what the scalpel of electrical/circuitry/computational representation of algorithms CONDONES in what you are doing then YOU are the fucking problem.
Rust is thus no different from the original ideas of the developers behind Go when they stated that their developers were not efficient enough to deal with X language. Rust protects you, because it knows that you are a fucking moron, so the compiler, advanced and well made as it is, will give you warnings of your own idiotic tendencies, which would not have been required had you not been.....well....a fucking idiot.
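A minimal sketch (my own, not from the rant) of the kind of hand-holding meant here -- the equivalent C++ compiles happily and hands you a dangling reference, while the Rust compiler simply refuses the program:

```rust
fn main() {
    let reference;
    {
        let value = String::from("short-lived");
        reference = &value; // error[E0597]: `value` does not live long enough
    }
    // In C++ the analogous code is a dangling reference and undefined behaviour
    // at runtime; here it never gets past the borrow checker.
    println!("{}", reference);
}
```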
Rust is a good language, but I feel it is one that came out of the necessity of people writing system level software like a bunch of fucking morons.
This speaks a lot more of our academic endeavors and current documentation than anything else. But to me, DEALING with the idea of adopting Rust as a better C++ should come from a different point of view.
Do I agree with Linus's point of view of C++? fuck no, I do not, he is a kernel engineer, a damn good one at that regardless of what Dr. Tanenbaum believes(ed) but not everyone writes kernels, and sometimes that everyone requires OOP and additions to the language that they use. Else I would be a fucking moron for dabbling in the dictionary of languages that I use professionally.
BUT in terms of C++ being unsafe and unsecured and a horrible alternative to Rust, I personally do not believe so. I see it as a powerful white canvas, in which you are able to paint software to the best of your ability, WHICH then requires thorough scrutiny from the entire team. NOT a quick replacement for something that protects you from your own stupidity BY imposing the use of what are otherwise unknown "safe" features.
To be clear: I am not diminishing Rust as the powerhouse of a language that it is; I am myself quite invested in the language. But I do not feel the reason/need behind articles claiming it to be the C++ killer.
I am currently heavily invested in C++ since I am trying a lot of different things for a lot of projects, and have been able to discern multiple pain points and unsafe features. Mainly the reason for this is documentation (your mother knows C++) and tooling: IDE support, debugging operations, and the plethora of resources that come with it, and I have been able to push a lot of good dealings out to my secret project. WHICH I will eventually replicate with Rust to see the main differences.
Online articles stating that one will delimit or otherwise kill the other are, well.... wrong to me. And not the proper approach.
Anyways, I like big tits and small waists.
-
For those of you who still refuse to accept that safety features in languages are useful and important:
https://daniel.haxx.se/blog/2023/...
The author of curl himself admits that this security flaw could have been prevented if he had used a memory safe language.
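The flaw discussed in that post is a heap buffer overflow; as a rough sketch (my own illustration, not curl's actual code), here is why the same class of mistake is hard to make silently in a memory-safe language -- the copy is bounds checked, so the worst case is a panic or an error you are forced to handle, not corrupted memory:

```rust
fn main() {
    let mut buffer = [0u8; 16];
    // Hypothetical attacker-controlled input, longer than the buffer.
    let hostname = b"this-hostname-is-way-too-long-for-the-buffer";

    // In C, a memcpy/strcpy here silently writes past the end of the buffer.
    // In Rust, the slice bounds are checked, so you either handle the error...
    match buffer.get_mut(..hostname.len()) {
        Some(dst) => dst.copy_from_slice(hostname),
        None => eprintln!("input too long for buffer, rejected"),
    }
    // ...or an unchecked index panics with a clear message instead of
    // handing control to attacker-controlled data.
}
```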
I'm not blaming the author for making this mistake and I'm not saying that curl should be rewritten in another language.
I just want to rub this in the faces of people who argue that "bugs are always the developer's fault, therefore it's perfectly fine to keep using unsafe languages".
-
*leaning back in the story chair*
One night, a long time ago, I was playing computer games with my closest friends through the night. We would meet for a whole weekend, extended through some holiday, to excessively celebrate our collaborative and competitive gaming skills. In other words we would definitely kick each other's asses all the time. Laughing at each other for every kill we made and game we won. Crying for every kill received and game lost. A great fun that was.
Sleep level through the first 48 hours was around 0 hours. After some fresh air I thought it would be a very good idea to sit down, taking the time to eventually change all my accounts passwords including the password safe master password. Of course I also had to generate a new key file. You can't be too serious about security these days.
One additional 48 hours, including 13 hours of sleep, some good rounds of Call of Duty, Counter Strike and Crashday, plus an insane Star Wars marathon in between, later...
I woke up. A tiring but fun weekend was over again. After I got the usual cereals for breakfast I sat down to work on one of my theory Magic decks. I opened the browser, navigated to the web page and opened my password manager. I type in the password as usual.
Error: incorrect password.
I retry about 20 times. Each time getting more and more terrified.
WTF? Did I change my password or what?...
Fuck.
Ffuck fuck fuck FUCKK.
I've reset and now forgotten my master password. I completely lost memory of that moment. I'm screwed.
---
Disclaimer: sure it's in my brain, but it's still data right?
I remembered the situation but until today I can't remember which password I set.
Fun fact. I also could not remember the contents of Episode 6 by the time we started the movie, although I'd seen the movie about 10 - 15 times up to that point. Just brain afk.
-
Tell me you're a media-obsessed rube drone without telling me you're a media-obsessed rube drone. I'll start:
"SoFtWaRe JoB mArKeT iS hOrRiBlE aNd ShOwS nO sIgN oF rEcOvErY!!!"
hah, you mean those layoffs from that handful of frothed-over tech giants which had, I don't know, approximately ONE HUNDRED TIMES the amount of engineers they actually needed? I swear if i see this trope one more time i'm about to rage. can't wait until 2023 when this 'scare' will be but a memory. yes i'm muad'dib, golden path, worm god, whatever
but it's even simpler, you don't have to drink the spice:
- there are an estimated 205,741 people affected by the LaYoFfs (https://www.trueup.io/layoffs, actually a really cool site I just found)
- there are an estimated 3.87 MILLION software engineers, and that's just in the US, so it's safe to say less than 5% of the industry has been affected
so in short yes, you are a rube, i'll enjoy my multiple job offerings
should have been working on your craft instead of reading all those "news" articles. sheesh, i'd be scared to hire anyone for a software position who can't get a grip on simple numbers anyway
-
Go, Rust, C#, Swift, Java, Ruby, Python, Delphi/Object Pascal, Ada.
Those are the "approved" memory safe languages of the us gov. Seriously, go fuck yourself.
https://readwrite.com/the-nsa-list-...
-
Not exactly a story since I was too young to remember, but my parents told me that I really enjoyed playing with the games my father made for the good old Commodore 64 we had.
He basically had two 5" floppy holders full of his own games and programs he used to make. Unfortunately we only have the disks now. :(
The first memory of me using the computer though, is when my father bought a computer for his office (it was Win 95 with the "you are now safe to switch off your computer" message) and I was sneaking in to play with Paint because it was so cool back then.
-
I can work with Angular, even though it's a pain in the butt.
My current Angular job is actually the first job where the manager has decent human values and ethics, I like my team, and yeah, what we're building is shit. But it's only 30% shit because of Angular, another 30% is due to SAFe, and the rest is the usual stuff.
Still enjoy my job and respect my team.
But please do not expect me to pretend Angular is on a comparable level to React. Angular hasn't brought any actual innovation in most major versions, but still releases those breaking major updates at least twice a year.
Ivy might be awesome, but just because Angular told the world 3 years ago to also ship Ivy-compatible compile targets for their libs/packages doesn't mean everybody cared.
And the ngcc, the awesome compatibility compiler, mutates node_modules in place. So no parallel stuff, no using yarn2 or pnpm.
At the same time, React brought so many innovations into the frontend world but is basically backwards compatible.
Not sure how the Angular partial compilation and whatever needs to go on works, but it seems like there's hardly anyone that really knows, so you can't use Vite or whatever other new tool.
And sure, if you're really good, you can write Angular without producing memory leaks.
But it's really hard. Do you know what's also quite hard: Producing memory leaks with React!
And for sure, Angular Universal, which isn't used by anyone, it feels like, will still be on a comparable level to an open source product that's used all over the world, builds the basis for an open source company, and is improved by thousands of issues day by day.
And sure, two kinds of change detection are a great idea. And yeah, pretending Angular comes with everything included makes it worth it that the API is fucking huge and you're better off knowing nothing, because you have to read things up, than knowing quite a lot, since making assumptions and believing APIs work in a similar way and follow similar conventions...
Whatever... I work with it. Like the team. Like the company, even my boss. But please don't expect me to lie to you that this was a good idea, or that Angular is even remotely on the same level as React.
-
Years ago when I was a kid I was into making things to solve problems. Earlier in my life I remember as a kid seeing a neighbor shove 120VAC into the ground to get worms to come up to the surface. I also remember taking a worm and slapping it on one of the posts and shocking the shit out of myself. Apparently that lesson did not stick.
As a teenager I wanted a device similar to this. So I wired a 120VAC plug and cord to a 1/4" audio connector. I threw this in a box and forgot about it. Years later my sister went through my things looking for a power plug for a synthesizer we had. It had an audio 1/4" plug for headphones. She told me she plugged this cord she found of mine into the synth and it started smoking. She went to pull the plug out and shocked the shit out of her. I don't think that the synth ever worked correctly after that.
Well, today I was thinking fondly about that story. I mean, who wouldn't think fondly about shocking the shit out of your sister (she didn't die, so it's okay). However, it was dangerous. Really really dangerous.
The lesson I can take from this memory is this: if you know a software interface (or an electrical one) is not safe then don't build it. Someone will try and use that shit years later and really fuck some stuff up. I have to wonder. What kind of software traps have I built in the past that are yet to be discovered?
-
I feel like writing or telling people about the time I jumped from Windows 7 Ultimate to Windows 10. (I'm not against 10, but I'm never updating after what happened to me)
It all starts when none of my games will play due to a possible issue with my graphics card. I look up "3D source game bug" and not many results pop up. I go on Microsoft's Qna areas and ask this question but to my surprise nothing they say would make sense. "Clean the pins of your graphics card, make sure you verify the games on Steam". I verified the games and they checked out as perfectly fine. I don't have access to my graphics card because this is a laptop, sadly not a tower.
Two months pass and my computer is already showing signs of stress, like it didn't want to live in a sense. It was three times slower than when I was on Windows 7 and it was unallocating areas of my main hard drive where I could make virtual hard drives.
Instantly I start looking up Linux distros and find Linux Mint. 17.3 was the current version at the time. I downloaded it and burned it onto a DVD-rom and rebooted my computer. I loaded into the disc and to my surprise it seemed almost like Windows 7 apart from the Linux part. I grab my external hard drive and partition it to hold the Linux distro and leave it plugged in incase Windows 10 does actually fail.
On December 19, a few months after Windows 10 had released. I start my laptop to try and continue my studies in video game development. But to my surprise, Windows 10 had finally crashed permanently. The screen flickered blue and black, and an error box saying Loginui.exe failed to start. I look at it for a solid minute as my computer had just committed suicide in a sense.
I reboot thinking it would fix the error but it didn't. I couldn't log in anymore.
I force shutdown the laptop and turn it back on putting it into safe mode.
To my surprise loginui.exe works and I sign in. I look at my desktop, the space wallpaper I always admired, the sound files, screen shots I had saved.
I go into file explorer and grab everything out of my default hard drive Windows was installed on. Nothing but 400gb got left behind and that was mainly garbage prototypes I had made and Windows itself. I formatted my external hard drive and placed everything on it. Escaping Windows 10 with around 100GB of useful data I looked at the final shutdown button I would look at.
I click it and try to boot into normal Windows 10. But it doesn't work. It flickers and the error pops up once more.
I force it to shutdown and insert the previous Linux Mint disc I made and format the default hard drive through Linux. I was done. 10 gave me a lot of shit. Java wouldn't work, my games has a functional UI but no screen popped up except a black abyss and it wouldn't even let me try to update my graphics card, apparently my AMD Radeon 5450 was up to date at the AMD Radeon 5000's.
I installed Linux Mint and thinking the games would actually play I open steam and Launch Half-Life 2 to check if Linux would be nicer to me than Windows 10 had been.
To my surprise the game ran. The scene from Highway 17 popped on screen and the UI was fully functional. But it was playing at 10-15fps rather than the usual 60-70fps. I keep looking at my drivers and see my graphics card isn't in use. I do some research and it turns out I have a hybrid laptop.
Intel HD Graphics and an AMD Radeon 5450, and it was using the Intel and not the AMD. Months of testing and attempts at getting the games to work at high frame rates pass and the damn thing still functions at a terribly low fps. Finally I give up. I ask my mom for a Windows 7 disc and she says we can't afford it. A few months pass and I finally get a Windows 7 installation disc through money I've saved up. Proudly I put it into my optical disc drive and install it to my main hard drive, deleting Linux completely. I announced to all my friends my computer was back in working order and I install everything I needed: Steam, Skype, Blender, and Unity as well as all my games. I test Half-Life 2 and it's running exceptionally smoothly, I test Minecraft at max settings and it's working beautifully. The computer was functioning properly once again and my life as a developer started as I modeled things in Blender, learned beginner C# and learned a lot of Batch. Today the computer still runs at a great speed and I warn others of what happened to me after I installed Windows 10 on my machine if they are thinking of switching from 7 or 8 on an older machine.
Truly the damage to my data cannot be undone. But the memory of the maintenance, work and tests is a memory of how Windows 10 ruined me, and every night before the one year anniversary of Windows 10's release, I took out the battery of my laptop and unplugged it from the AC power, just so Windows 10 doesn't show its DLLs, batch scripts, VBS scripts, anything on my computer. But now, after all this has happened and I have recovered, I only have a story to tell.
-
I struggled with whether to post this but I feel like I have to. I didnt want to feed into the fear or give 'them' any more reason to argue against common sense, but I guess it cant be helped.
The reason I was gone for a while was because I went and got my vaccination.
In less than half hour after getting the vaccine, I was in the ICU. The staff told me I had a stroke possibly from clotting and inflamation. I couldnt feel my arm or anything below my shoulders. Yes really.
Apparently I "died" for a little while and when they brought me back I was in a coma for almost a week.
I'm back home now and I still dont fully understand what happened. Still have numbness, and horrible headaches, and can barely think straight sometimes, but the doctors told me that I didnt suffer any permanent brain damage according to my scans.
Also they told me I had old damage to my left and right temporal lobe, which makes sense because I have always suffered problems with short term memory and other issues.
And I'm just at a loss how this could happen. I have no serious injuries. We were told this is safe.
And this is the exact reason I didnt want to post it, because now tards will come in and be "lololol serves you right vaxxer!"
If I knew the side effects were this bad maybe I would have changed my mind but no one told me! I mean I think I still would have got it because we have to protect vulnerable people, but still.
The hospital assured me it wasnt the vaccine and must have been an underlaying condition, but I'm not so sure. I just happen to have a pre-existing problem that I dont know about that causes a stroke and paralysis only half an hour after the shot?
And now I dont know if I'll ever be ok. And doctors warned me I may suffer more strokes and to avoid physically demanding tasks for a while. My primary job is construction (not by chooce). Now I face the prospect of not even being able to work my existing job or do the things I love, like hiking, anymore. So much of the world doesnt make any sense right now and I just dont know what to believe anymore.
Tards will probably be in shortly to suggest I check for microchips or test fucking magnets on myself.
No, just stop.
-
Avoided IoT(IoS - InternetOfShit) for a long time now, due to the security concerns with retail products.
Now I looked into 433 Transceiver + Arduino solutions.. to build something myself, just for the lolz.
Theory:
The smallest Arduino I found has 32 KByte of programmable memory; a tiny tiny crypto library could take around 4 KByte...
Set a symmetric crypto key for each homebrewed device / sensor / etc, send the info and commands (with time of day as salt for example) encrypted between Server <-> IoT gadget; the ciphertext would have a checksum appended, and magic and ciphertext length prepended.
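Just to make the frame layout concrete, here's a rough sketch of the described packet (magic and length prepended, checksum appended); it's in Rust for illustration rather than actual AVR C, and the field sizes, magic value and checksum are made-up assumptions:

```rust
// Illustrative frame builder for the scheme described above.
// Field sizes, the magic value and the checksum are assumptions.
const MAGIC: u16 = 0xA55A;

fn build_frame(ciphertext: &[u8]) -> Vec<u8> {
    let mut frame = Vec::with_capacity(4 + ciphertext.len() + 1);
    frame.extend_from_slice(&MAGIC.to_be_bytes());                      // magic (prepended)
    frame.extend_from_slice(&(ciphertext.len() as u16).to_be_bytes()); // ciphertext length (prepended)
    frame.extend_from_slice(ciphertext);                                // encrypted payload
    let checksum = ciphertext.iter().fold(0u8, |acc, b| acc.wrapping_add(*b));
    frame.push(checksum);                                               // checksum (appended)
    frame
}

fn main() {
    let frame = build_frame(b"example ciphertext");
    println!("{} byte frame: {:02x?}", frame.len(), frame);
}
```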
Result:
Be safe from possible drive-by attacks, still have a somewhat reliable communication?!
Ofc passionate hackers would be still able to crack it, no doubt.
Question: Am I thinking too simple? Am I describing just the standard here?
-
Sydochen has posted a rant where he is not really sure why people hate Java, and I decided to publicly post my explanation of this phenomenon, from my point of view.
So there is this quite large domain, on which one or two academic disciplines are built, such as business informatics and applied system engineering, which I find extremely interesting and fun, that is called, ironically, SAD.
It is absolutely obvious that there is no sense in making software in a linear pattern. Since Bikelsoft has conveniently patched consumers up with GUI based software, the core concept of which is EDP (event driven programming, or at least OS event queueing), the completely functional, linear approach in such an environment does not make much sense in terms of the maintainability of the software. Uhm, raise your hand if you ever tried to linearly build a complex GUI system in a single function call on GTK, which does allow you to disregard any responsibility separation pattern of SAD, such as the long loved MVC...
Additionally, OOP is mandatory in business because it does allow us to mount abstraction levels and encapsulate actual dataflow behind them, which, of course, lowers the costs of the development.
What happy programmers usually talk about is the complexity of the task of doing OOP right, in the sense of an overflow of straight composition classes (that do nothing but forward data from lower to upper abstraction levels and vice versa) and the situation of a responsibility chain break (this is when a class from a lower level directly!! notifies a class of a higher level about something, ignoring the fact that there is a chain of other classes between them). And that's it. These guys also do vouch for functional programming, and it's a completely different argument, and there is no reason not to do it in the algorithmic, implementational part of the project, of course, but yeah...
So where does Java kick in you think?
Well, guess what language popularized programming in general and OOP in particular. Java is doing a lot of things in a modern way. Of course, if it's 1995 outside *lenny face*. Yeah, fuck AOT, fuck memory management responsibility, all to the maximum towards solving the real applicative tasks.
Have you ever tried to learn to apply TextWatchers in Android with Java? Then you know about inline overriding and inline abstract class implementation. This is not right. This reduces readability and reusability.
Have you ever used Volley on Android? Newbies to Android programming surely should have. Quite verbose boilerplate in google docs, huh?
Have you seen intents? The Android API is, to put it mildly, messy with all the support libs and Context class ancestors. Remember how many times the language has helped you to properly orient yourself in all of this hierarchy, when overriding a method declaration requires you to use 2 lines instead of 1. Too verbose, too hesitant, distracting - that's what the lang and the API are. Fucking toString() is hilarious. Reference comparison is unintuitive. Obviously poor practices are not banned. Ancient tools. Import hell. Slow evolution.
C# has ripped Java off like an utter cunt, yet it's a piece of cake to maintain solid patternization and structure, and keep your code clean and readable. C# 6 already featured optionally nullable fields and safe optional dereferencing, while we only finally got lambda expressions in Java 8, in 20-fucking-14.
Java did good back then, but when we joke about dumb indian developers, they are coding it in Java. So yeah.
To sum up, it's easy to make code unreadable with Java, and Java is a tool with which developers usually disregard the patterns of SAD.
-
A friend sent me a link to a torrent of a documentary. Because of the memory I have of Napster days and my fear of being that one guy that gets caught, plus the constant specter of malware, I've never been brave enough to use torrent links and software, so I've never looked into it. Is it safe these days? How do I do this and what do you all recommend for getting at the content safely?
-
I don't really understand all that love Rust gets. Its syntax, better than C++'s, isn't better than C syntax.
You can make memory safe programs with C, just if you know how to manage memory, and you should only do it if you know how. Bigger ecosystem for C/C++.
C23 is waay better than any Rust standard.
PS: I used both C/C++ and Rust
-
Oh... I don't know what to pick...
So I will pick 3 projects from my 3 stages of my dev "career".
1. Right after I discovered programming and learned how if, while and similar structures worked. The language was Object Pascal with Delphi 2007.
That was a "safe" with a stupidly complicated lock (text inputs, sliders, etc.); it opened a secret folder in the end.
2. It was embedded code for an ATmega8 AVR, Atmel Studio, pure C but without memory management (I didn't even know that existed).
It was a Pip-Boy knockoff: a 16x2 display and a small keyboard connected to the Arduino-like board that I made on a protoboard.
It wasn't that much of a Pip-Boy, it was more of a showoff of the ATmega8's internal systems (ADC, timers, interrupts and such).
3. DataLab: after helping my friend with his master's thesis (we met on Discord, long story, I was in high school) I decided that MATLAB is shit and I created a visual scripting environment, language C# .NET 4 (in the latest version).
I remade the whole program from scratch once, significantly improving everything (code reuse, better algorithms, data processing, code readability and edge cases). I picked up good practices from everywhere. I learned how to use git.
The DataLab project looks just like LabVIEW (I didn't notice that it even existed...). It is frozen now because of my mental status, but I'm planning on using it in my CV when I look for jobs during the holidays. There are many things that I can improve in that program but... first I have to fix myself.
-
You can have the best test coverage - even building your own fuzzing framework along the way.
You can have top notch devs adhering to state of the art development processes.
You can have as big a community and as well-funded a bugbounty program as you want...
All of that doesn't matter if you have chosen the wrong language:
https://googleprojectzero.blogspot.com/...
In any memory-safe language, this would just have been an out-of-bounds exception instead of a buffer overflow with an attacker-controlled payload.
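A tiny sketch of that contrast (my own illustration, not the code from the linked post): with an attacker-controlled index, C happily reads or writes whatever happens to sit at that offset, while a memory-safe language turns the same access into a deterministic, catchable failure:

```rust
fn main() {
    let table = [10u8, 20, 30, 40];
    let attacker_controlled_index = 1_000_000usize;

    // A direct `table[attacker_controlled_index]` would panic with a clear
    // out-of-bounds message; `get` makes the failure an explicit value instead.
    match table.get(attacker_controlled_index) {
        Some(value) => println!("value: {}", value),
        None => eprintln!("index out of bounds, request rejected"),
    }
}
```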
Language choice matters!
Choose wisely!