Interviewer: Welcome, Mr X. Thanks for dropping by. We like to keep our interviews informal. And even though I have all the power here, and you are nothing but a cretin, let’s pretend we are going to have fun here.
Mr X: Sure, man, whatever.
I: Let’s start with the technical stuff, shall we? Do you know what a linked list is?
X: (Tells what it is).
I: Great. Can you tell me where linked lists are used?
X: Sure. In interview questions.
I: What?
X: The only time linked lists come up is in interview questions.
I: That’s not true. They have lots of real-world applications. Like, like…. (fumbles)
X: Like to implement memory allocation in operating systems. But you don’t sell operating systems, do you?
I: Well… moving on. Do you know what the Big O notation is?
X: Sure. It’s another thing used only in interviews.
I: What?! Not true at all. What if you want to sort a billion records a minute, like Google has to?
X: But you are not Google, are you? You are hiring me to work with 5 year old PHP code, and most of the tasks will be hacking HTML/CSS. Why don’t you ask me something I will actually be doing?
I: (Getting a bit frustrated) Fine. How would you do FooBar in version X of PHP?
X: I would, er, Google that.
I: And how do you call library ABC in PHP?
X: Google?
I: (shocked) OMG. You mean you don’t remember all the 97 million PHP functions, and have to actually Google stuff? What if the Internet goes down?
X: Does it? We’re in the 1st world, aren’t we?
I: Tut, tut. Kids these days. Anyway, looking at your resume, we need at least 7 years of ReactJS. You don’t have that.
X: That’s great, because React came out last year.
I: Excuses, excuses. Let’s ask some lateral thinking questions. How would you go about finding how many piano tuners there are in San Francisco?
X: 37.
I: What?!
X: 37. I googled before coming here. Also Googled other puzzle questions. You can fit 7,895,345 balls in a Boeing 747. Manhole covers are round because that is the shape that won’t fall in. You ask the guard what the other guard would say. You then take the fox across the bridge first, and eat the chicken. As for how to move Mount Fuji, you tell it a sad story.
I: Ooooooooookkkkkaaaayyyyyyy. Right, tell me a bit about yourself.
X: Everything is there in the resume.
I: I mean other than that. What sort of a person are you? What are your hobbies?
X: Japanese culture.
I: Interesting. What specifically?
X: Hentai.
I: What’s hentai?
X: It’s a televised art form.
I: Ok. Now, can you give me an example of a time when you were really challenged?
X: Well, just the other day, a few pennies from my pocket fell behind the sofa. Took me an hour to take them out. Boy was it challenging.
I: I meant technical challenge.
X: I once spent 10 hours installing Windows 10 on a Mac.
I: Why did you do that?
X: I had nothing better to do.
I: Why did you decide to apply to us?
X: The voices in my head told me.
I: What?
X: You advertised a job, so I applied.
I: And why do you want to change your job?
X: Money, baby!
I: (shocked)
X: I mean, I am looking for more lateral changes in a fast moving cloud connected social media agile web 2.0 company.
I: Great. That’s the answer we were looking for. What do you feel about constant overtime?
X: I don’t know. What do you feel about overtime pay?
I: What is your biggest weakness?
X: Kryptonite. Also, ice cream.
I: What are your salary expectations?
X: A million dollars a year, three months paid vacation on the beach, stock options, the lot. Failing that, whatever you have.
I: Great. Any questions for me?
X: No.
I: No? You are supposed to ask me a question, to impress me with your knowledge. I’ll ask you one. Where do you see yourself in 5 years?
X: Doing your job, minus the stupid questions.
I: Get out. Don’t call us, we’ll call you.
All Credit to:
http://pythonforengineers.com/the-p...
-
Me, a junior dev: * reports an important issue and a possible fix *
Senior dev 1: nah, it'll do just fine.
Senior dev 2: that won't be an issue, don't you see? It's under control, man.
Senior 3: why are you even here? Why are you even talking?
Manager: yeah, what could possibly go wrong?
* a year after releasing the product, one of the seniors got fired and another one was hired *
New senior: this thing is bananas, the code is inconsistent and there are memory leaks everywhere, how does that even work?
Me: nobody believed me when I said that.
Manager: it did work very well, where's the issue?
Me: it's everywhere, goddammit! Don't you see?
New senior: junior dev is right.
Me: I've been saying that for a WHOLE YEAR!
Manager: did you? Really? Nah, you didn't.
...
I'm tired of this shit.
-
I'll get to my four words in a sec, but let me set the background first.
This morning, at breakfast, I fired up my trusty laptop only to get a fan failure warning.
Finally, after the three-year-old is asleep tonight, I'm able to start dismantling the case to get to the fan. I'm hoping it just needs cleaning out.
Hard drive, memory, and keyboard spread out over the kitchen table. I'm not even halfway done.
Guess what? Now I'm one of the lucky 3500 people to have a power outage at 9 pm. Estimated restore time: 2 am.
Sigh.
"All those tiny screws"
And a three-year-old in the house...
-
I'm getting ridiculously pissed off at Intel's Management Engine (etc.), yet again. I'm learning new terrifying things it does, and about more exploits. Anything this nefarious and overreaching and untouchable is evil by its very nature.
(tl;dr at the bottom.)
I also learned that -- as I suspected -- AMD has their own version of the bloody thing. Apparently theirs is a bit less scary than Intel's since you can ostensibly disable it, but I don't believe that, because spy agencies exist and people are power-hungry and corrupt as hell when they get it.
For those who don't know what the IME is, it's hardware godmode. It's a black box running obfuscated code on a coprocessor that's built into Intel CPUs (all Intel CPUs from 2008 on). It runs code continuously, even when the system is in S3 mode or powered off. As long as the PSU is supplying current, it's running. It has its own MAC and IP address, transmits out-of-band (so the OS can't see its traffic), some chips can even communicate via 3G, and it can accept remote commands, too. It has complete and unfettered access to everything, completely invisible to the OS. It can turn your computer on or off, use all hardware, access and change all data in RAM and storage, etc. And all of this is completely transparent: when the IME interrupts, the CPU stores its state, pauses, runs the SMM (system management mode) code, restores the state, and resumes normal operation. Its memory always returns 0xFF when read by the OS, and all writes fail. So everything about it is completely hidden from the OS, though the OS can trigger the IME/SMM to run various functions through interrupts, too. But this system is also required for the CPU to even function, so killing it bricks your CPU. Which, ofc, you can do via exploits. Or install ring -2 keyloggers. Or do fucking anything else you want to.
tl;dr IME is hardware godmode, and if someone compromises this (and there have been many exploits), their code runs at ring -2 permissions (above kernel (0), above hypervisor (-1)). They can do anything and everything on/to your system, completely invisibly, and can even install persistent malware that lives inside your bloody CPU. And guess who has keys for this? Go on, guess. You're probably right. Are they completely trustworthy? No? You're probably right again.
There is absolutely no reason for this sort of thing to exist, and its existence can only make things worse. It enables spying of literally all kinds, it enables CPU-resident malware, bricking your physical CPU, reading/modifying anything anywhere, taking control of your hardware, etc. Literal godmode. And some of it cannot be patched, meaning more than a few exploits require replacing your CPU to protect against them.
And why does this exist?
Ostensibly to allow sysadmins to remote-manage fleets of computers, which it does. But it allows fucking everything else, too. And keys to it exist. And people are absolutely not trustworthy. Especially those in power -- who are most likely to have access to said keys.
The only reason this exists is because fucking power-hungry doucherockets exist.
-
I have been a mobile developer working with Android for about 6 years now. In that time, I have endured countless annoyances in the Android development space. I will endure them no more.
My complaints are:
1. Ridiculous build times. In what universe is it acceptable for us to wait 30 seconds for a build to complete? Yes, I've done all the optimisations mentioned on this page and then some. Don't even mention hot reload, as it doesn't work fast enough or just does not work at all. Also, buying better hardware should not be a requirement to build a simple Android app; Xcode builds in 2 seconds on an 8GB MacBook Air. A MacBook Air!
2. IDE. Android Studio is a memory hog even if you throw 32GB of RAM at it. The visual editors are janky as hell. If you use Eclipse, you may as well just chop off your fingers right now, because you will have no use for them after you try to build an app from scratch. I mean, just look at some of the posts in this subreddit, where the common response is to invalidate caches and restart. That should only be used as a last resort, but it's thrown about as if it solves everything. Truth be told, it's Gradle's fault. Gradle is so annoying I've dedicated the next point to it.
3. Gradle. I am convinced that Gradle causes 50% of an Android developer's pain. From the build times to the integration into various IDEs to its insane package management system. Why do I need to manually exclude dependencies from other dependencies? The build tool should just handle that for me. C'mon, it's 2019. Gradle is so bad that it requires approx 54GB of RAM to work out that I have removed a dependency from the list of dependencies. Also, I cannot work out which properties need to go in which block.
4. API. The Android API is over-bloated and hellish. How do I schedule a recurring notification? Oh, use an AlarmManager. Yes, you heard right, an AlarmManager... Not a NotificationManager, because that would be too easy. Also, has anyone ever tried running a long-running task? Or done an asynchronous task? Or dealt with closing/opening a keyboard? Or handling clicks from a RecyclerView? Yes, I know Android Jetpack aims to solve these issues, but over the years I have become so jaded by things that were meant to solve other broken things that there isn't much hope for Jetpack in my mind 😤
5. API 2. A non-insignificant number of Android users are still on Jelly Bean or KitKat! That means we, as developers, have to support some of your shitty API decisions (Fragments, Activities, ListView) from all the way back then!
6. Not reactive enough. Android recently gained support for data binding, but this kind of stuff should have been there from the very start. Look at React or Flutter to see how easy it is to make shit happen without any effort.
7. Layouts. What the actual hell is going on here. MDPI, XHDPI, XXHDPI, mipmap, drawable. Fuck it, just chuck it all in the drawable folder. Seriously, Android should handle this for me. If I am designing for a larger screen then it should be responsive. I don't want to deal with 50 different layouts spread over 6 different folders.
8. Permission system. Why was this not included from the very start? Rogue apps have abused this and abused your users' privacy and security. Yet you ban us, and not them, from the Play Store. What's going on? We need answers.
9. On Android, building an app took me 3 months, and I still had a lot of work left to do when I got so sick of Android dev that I dropped it in favour of Flutter. I built the same app in Flutter and it took me around a month, and I completed it all.
10. XML.
If you're a new dev, for the love of all that is good in this world, do NOT get into Android development. Start with Flutter or even iOS. With Flutter, build times are insanely fast and hot reload is consistently under 500ms. It's a breath of fresh air, will save you a lot of headaches, AND it builds for iOS flawlessly.
To the people who build Android, advocate for it and work on it: sorry to swear, but fuck you! You have created a mess that we have to work with on a day-to-day basis, only for us to get banned from the app store! You have sold us the lie that Android development is amazing, with all the sweet-treat names and conferences that look bubbly and fun. You have let it get so bad that we can't target an API level higher than 18 because some Android users are still using devices that old!
End this misery. End our pain. End our suffering. Throw this abomination away like you do with some of your other projects and migrate your efforts over to Flutter. Please!
#NoToGoogleIO #AndroidSummitBoycott #FlutterDev #ReactNative
-
About two years ago I got roped into something when someone requested an $8000 laptop to run a "program" that he had written in Excel to pull data from our mainframe.
In reality he was using our normal application that interacts with the mainframe and screen scraping it to populate several Excel spreadsheets.
So this guy kept saying that he needed the expensive laptop for the extra RAM and processing power his application required. At the time we only supported 32-bit Windows 7, so even though I told him ten times that the OS wouldn't recognize more than 3.5 GB of RAM, he kept saying that increasing the RAM would fix his problem. I also explained that even if we installed the 64-bit OS, we didn't have approval for the 64-bit applications.
So we looked at the code and found that rather than reusing the same workbook, he was opening a new workbook instance on each iteration of his loop and never closing or disposing of them. He was running out of memory because nothing was ever disposed.
Even better than all of that, he wanted a faster processor to speed up the processing, but he had about 5 seconds of thread sleeps in each loop so that the place he was screen scraping from would have time to load. So it wouldn't matter how fast the processor was; in the end there were sleeps and waits hard-coded in there to slow down the app. And the guy didn't understand that a faster processor wouldn't have made a difference.
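To make both problems concrete, here's a rough sketch of the pattern in TypeScript rather than his actual Excel automation code; Workbook, openWorkbook, and the scraping steps are hypothetical stand-ins for the real calls:

type Workbook = { close(): void };
const openWorkbook = (path: string): Workbook => ({ close() {} }); // hypothetical stand-in
const sleep = (ms: number) => new Promise<void>(r => setTimeout(r, ms));

async function leakyScrape(rows: string[]) {
  for (const row of rows) {
    const wb = openWorkbook("report.xlsx"); // a brand new instance every iteration
    // ...screen-scrape and copy the row into wb...
    await sleep(5000); // hard-coded wait for the scraped screen to load
    // wb.close() is never called, so instances pile up until memory runs out
  }
}

async function fixedScrape(rows: string[]) {
  const wb = openWorkbook("report.xlsx"); // one instance, reused
  for (const row of rows) {
    // ...screen-scrape and copy the row into wb...
  }
  wb.close();
}

And note that no processor on earth speeds up leakyScrape: the sleep puts a hard five-second floor on every iteration regardless of hardware.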
The worst thing is a "dev" who thinks they know what they're doing but doesn't have a clue.
-
Every single one of them, and every one that will come after them.
Google, it started out as 2 people in their garage, wanting to make a search engine that was better than the others. Nothing else, nothing evil. Just make the world a little bit better. And look what it's become now. A megacorporation with little to no regard for their user base. Because who cares about users anyway?
Microsoft, it started out with Bill Gates - young high school computer nerd - who wanted to make an operating system for the world to use. Something that's better than the competition. And boy did he do so. Well, "better than the competition" aside, he did make it for the world to use. And the world adopted it. And look what it's become now. A megacorporation with little to no regard for their user base. Because who cares about users anyway?
See where I'm going here?
Apple, it started out with Steve Jobs and Steve Wozniak in their garage, just like Google did, wanting to make hardware that was better than the others. Nothing else, nothing evil. Just to make the world a little bit better. And look what it's become now. Planned obsolescence has been baked into it, just like it is in every other piece of technology. Quality control and thinking through the design have become things of the past. User choice, yeah, who cares about that.
Samsung, it started out decades ago actually, and I don't really remember the details of it.. ColdFusion has a video on it if memory serves me right. Do watch it if you're interested. Anyway, just like all the others they started out as a company which wanted to make the world a little bit better. And damn right did they do so.. initially. Look what they've become now. Forcing their stupid TouchWiz UI upon their customers (or products?), a Bixby button that can't even be reprogrammed.. and the latest thing.. Knox, advertised as a security feature, but as everyone who likes rooting their devices and mucking with them knows, it is an anti-feature that serves only for lockdown. Why shouldn't you be able to turn in a phone for RMA when a hardware error occurs, when all you've personally modified is the software? Why should changing the software blow that eFuse, making sure that you can't replace it without specialized equipment and a very steady hand?
I could go on and on forever about more of the tech giants out there, but I feel like this suffices for now. Otherwise I won't have anything else left for future rants! But one thing I know for sure: every tech company started, starts, and will start out with a desire to make the world a better place, and once they gain a significant customer base, they will without exception turn into the same kind of Evil Megacorp., just like the ones before them. Some may say that capitalism itself is to blame for this, the greed for more when you already have a lot. Who knows? I'd rather say that human nature itself is to blame for it. We're greedy beings by design, and I hate it. I hate being human for that. I don't want humans to be evil towards one another, and greedy for ever more. But I guess that that's just the way it is, and some things actually never change...
-
My code review nightmare part 3
Performed a review on/against a workplace 'nemesis'. I didn't follow the department standards document ('cause I couldn't care less about spacing, sorted usings, etc.) and identified over 80 bugs, logic errors, n+1 patterns, memory leaks (yes, even in .NET, devs can cause 'em), and general bad behavior (ex. 'eating' exceptions that should be handled or at least logged).
Because 'Jeff' was considered a golden child (that's another long TL;DR), his boss and others took major offense and demanded I justify my review, item by item.
About 2 hours into the meeting, our department mgr realized embarrassing Jeff any further wasn't doing anyone any good and decided to take matters into his own hands. Thinking 'well, it's about time he did his job', I go back to my desk. About an hour later..
Mgr: "I need you in the conference room, RIGHT NOW!"
<oh crap>
Mgr: "I spoke to Jeff and I think I know what the problem is. Did you ever train him on any of the problems you identified in the review?"
Me: "Um, no. Why would I?"
Mgr: "Ha!.. I was right. So let's agree the problems are partially your fault, OK?"
Me: "Finding the bugs in his code is somehow my fault?"
Mgr: "Yes! For example, the n+1 problem in using the WCF service, you never trained him on how to use the service. You wrote the service, correct?"
Me: "Yes, but it's not my job to teach him how to write C#. I documented the process and have examples in the document to avoid n+1. All he had to do was copy/paste."
Mgr: "But you never sat with Jeff and talked to him like a human being? You sit over there in your silo and are oblivious to the problems you cause. This ends today!"
Me: "What the...I have no idea what you are talking about. What in the world did Jeff tell you?"
Mgr: "He told me enough and I'm putting an end to it. I want a comprehensive training class developed on how to use your service. I'll give you a month to get your act together and properly train these developers."
3 days later, I submit the PowerPoint presentation and accompanying docs. It was only one WCF service with a handful of methods. Mgr approves the training, etc., etc. We execute the 'training', and Jeff submits a code review a couple of weeks later. From over 80 issues down to around 50. The poop hits the fan again.
Mgr: "What's your problem? When are you going to take your responsibility seriously?"
Me: "It's pretty clear I don't have the problem. All the review items were also verified by other devs. It's not me trying to be an asshole."
Mgr: "Enough with the excuses. If you think you can do a better job, *you* make the code changes and submit them to Jeff for review. No More Excuses!"
A couple of days later, I make the changes, submit them for review, and Jeff really couldn't say much other than "I don't see this as an improvement".
TL;DR: I had been tracking the errors the site generated due to those bugs prior to my changes. After deployment, the number of errors went from thousands per hour to maybe hundreds per day (that's another story), and the site saw significant performance increases, fewer customer complaints, etc., etc.
At a company event, the department VP hands out special recognition awards:
VP: "This award is especially well earned. Not only does this individual exemplify the company's focus on teamwork, he also went above and beyond the call of duty to serve our customers. Jeff, come on up and get this well-deserved award."
-
I had to open the desktop app to write this because I could never write a rant this long on the app.
This will be a well-informed rebuttal to the "arrays start at 1 in Lua" complaint. If you have ever said or thought that, I guarantee you will learn a lot from this rant and probably enjoy it quite a bit as well.
Just a tiny bit of background information on me: I have a very intimate understanding of Lua and its C API. I have used this language for years and love it dearly.
[START RANT]
"arrays start at 1 in Lua" is factually incorrect because Lua does not have arrays. From their documentation, section 11.1 ("Arrays"), "We implement arrays in Lua simply by indexing tables with integers."
From chapter 2 of the Lua docs, we know there are only 8 types of data in Lua: nil, boolean, number, string, userdata, function, thread, and table
The only unfamiliar thing here might be userdata. "A userdatum offers a raw memory area with no predefined operations in Lua" (section 26.1). Essentially, it's for the API to interact with Lua scripts. The point is, this isn't a fancy term for array.
The misinformation comes from the table type. Let's first explore, at a low level, what an array is. An array, in programming, is a collection of data items all in a line in memory (the OS may not actually put them in a line, but they act as if they are). In most syntaxes, you access an array element with something like:
array[index]
Let's look at C, so we have some solid reference. "array" would be the name of the array, but what it really does is keep track of the starting location of the array in memory. Memory in computers acts like a number. In a very basic sense, the first sector of your RAM is memory location (referred to as an address) 0. "array" would be, for example, address 543745. This is where your data starts. Arrays can only be made up of one type; this is so that each element in that array is EXACTLY the same size. So, this is how indexing an array works. If you know where your array starts, and you know how large each element is, you can find the 6th element by starting at the start of the array and adding 6 times the size of the data in that array.
Tables are incredibly different. The elements of a table are NOT in a line in memory; they're all over the place depending on when you created them (and a lot of other things). Therefore, an array-style index is useless, because you cannot apply the above formula. In the case of a table, you need to perform a lookup: search through all of the elements in the table to find the right one. In Lua, you can do:
a = {1, 5, 9};
a["hello_world"] = "whatever";
a is a table holding 4 items (the 4th being the key "hello_world" with value "whatever"), but a[4] is nil because even though there are 4 items in the table, a[4] looks for something "named" 4, not the 4th element of the table.
This is the difference between indexing and lookups. But you may say,
"Algo! If I do this:
a = {"first", "second", "third"};
print(a[1]);
...then "first" appears in my console!"
Yes, that's correct, in terms of computer science. Lua, because it is a nice language, makes keys in tables optional by automatically giving them an integer value key. This starts at 1. Why? Let's look at that formula for arrays again:
Given array "arr", size of data type "sz", and index "i", find the desired element ("el"):
el = arr + (sz * i)
The index "i" NEEDS to start at 0 and not 1 because otherwise, "sz" would always be added to the start address of the array, and the first element would ALWAYS be skipped. But in tables, this is not the case, because tables do not have a defined data type size, and this formula is never used. This is why actual arrays are incredibly performant no matter the size, while the larger a table gets, the slower it is.
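To make that concrete, here's the formula as runnable arithmetic (TypeScript used purely as a calculator, assuming 4-byte elements):

const sz = 4; // bytes per element
const elementOffset = (i: number): number => sz * i; // bytes past the array's base address

console.log(elementOffset(0)); // 0 -> with 0-based indexing, the first element sits right at the base
console.log(elementOffset(1)); // 4 -> with 1-based indexing, the element at the base address is never reachable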
That felt good to get off my chest. Yes, Lua could start the auto-key at 0, but that might confuse people into thinking tables are arrays... well, I guess there's no avoiding that either way.
-
So, some time ago, I was working for a complete puckered anus of a cosmetics company on their ecommerce product. Won't name names, but they're shitty and known for MLM. If you're clever, go you ;)
Anyways, over the course of years they brought in a competent firm to implement their service layer. I'd even worked with them in the past, and it was designed to handle a frankly ridiculous-scale load. After they got the 1.0 released, the manager was replaced with some absolutely talentless, chauvinist cuntrag from a phone company that is well known for having 99% Indian devs and for not being able to hear you now. He of course brought in his number two, worked on making life miserable and running everyone on the team off; inside of a year the entire team was ex-said-phone-company.
Watching the decay of this product was a sheer joy. They cratered the database numerous times during peak-load periods, caused $20M in redis-cluster cost overrun, ended up submitting hundreds of erroneous and duplicate orders, and mailed almost $40K worth of product to a random guy in outer Mongolia who is, we can only hope, now enjoying his new life as an instagram influencer. They even terminally broke the automatic metadata, and hired THIRTY PEOPLE to sit there and do nothing but edit swagger. And it was still both wrong and unusable.
Over the course of two years, I ended up rewriting large portions of their infra surrounding the centralized service cancer to do things like, "implement security," as well as cut memory usage and runtimes down by quite literally 100x in the worst cases.
It was during this time I discovered a rather critical flaw. This is the story of the what, the how, and the "how can you fucking even be that stupid." The issue relates to users, their reports, and their ability to order.
I first found this issue looking at some erroneous data for a low-value order and went, "There's no fucking way, they're fucking stupid, but this is borderline criminal." It was easy to miss, but someone in a top-down reporting chain had submitted an order for someone else in a different org. Shouldn't be possible, but here was that order staring me in the face.
So I set to work seeing if we'd pwned ourselves as an org. I spend a few hours poring over logs from the log service and Dynatrace trying to recreate what happened. I first tested to see if I could get a user - not something that was usually done, because auth identity was pervasive. I discover the users are INCREMENTAL int values - the raw database IDs, used directly when requesting from the API - so naturally I have a full list of users with their titles and relative positions, as well as reports and descendants, in about 10 minutes.
I try the happy path of setting values for random, known payment methods and org structures similar to the impossible order, and submitting as a normal user, no dice. Several more tries and I'm confident this isn't the vector.
Exhausting that option, I look at the protocol for a type of order in the system that allowed higher-level people to impersonate people below them and use their own payment info for descendant report orders. I see that all of the data for this transaction is stored in a cookie. A few tests later, I discover the UI has no forgery checks, no hashing, etc., and just fucking trusts whatever is present in that cookie.
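For reference, the forgery check that should have been there is a few lines of HMAC. Here's a minimal sketch using Node's crypto module - not their actual stack, and SECRET stands in for a key that never leaves the server:

import { createHmac, timingSafeEqual } from "node:crypto";

const SECRET = "keep-this-on-the-server"; // assumed server-side key

function signCookie(payload: string): string {
  const mac = createHmac("sha256", SECRET).update(payload).digest("hex");
  return `${payload}.${mac}`;
}

function verifyCookie(cookie: string): string | null {
  const at = cookie.lastIndexOf(".");
  if (at < 0) return null;
  const payload = cookie.slice(0, at);
  const mac = Buffer.from(cookie.slice(at + 1), "hex");
  const expected = createHmac("sha256", SECRET).update(payload).digest();
  // constant-time compare; any edit to the payload invalidates the MAC
  if (mac.length !== expected.length || !timingSafeEqual(mac, expected)) return null;
  return payload; // only now is the cookie's content trustworthy
}

With a MAC like that attached, editing the cookie's contents gets you a rejected request instead of somebody else's payment options.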
An hour of tweaking later, I'm impersonating a director as a bottom-rung employee. Score. So I fill a cart with a bunch of test items and proceed to checkout. There, in all their glory, are the director's payment options. I select one and am presented with:
"please reenter card number to validate."
Bupkiss. Dead end.
OR SO YOU WOULD THINK.
One unimportant detail I noticed during my log investigations - and the shit-slinging GUI monkeys who butchered the system didn't - was that on a failed attempt to submit payment to the DB, the logs were filled with messages like:
"Failed to submit order for [userid] with credit card id [id], number [FULL CREDIT CARD NUMBER]"
One submit click later and the user's credit card number drops into lnav like a gacha prize. I dutifully rerun the checkout and get an email-send notification in the logs for a successful transfer to fulfillment. Order placed. Some continued experimentation later and the truth is evident:
With an authenticated user of any privilege, you could place any order, as anyone, using anyone's payment methods, and have it sent anywhere.
So naturally, I pack up the crucifixion-worthy body of evidence and walk it into the IT director's office. I show him the defect, and he turns sheet fucking white. He knows there's no recovering from it, and there's no way his shitstick service team can handle fixing it. Somewhere in his tiny little grinchly manager's heart he knew they'd caused it, and he was to blame for being a shit captain to the SS Failboat. He replies quietly, "You will never speak of this to anyone; fix this discreetly." Straight-up Hitler's-bunker-meme rage.
-
Teacher: Computer settings are stored in the ROM on the motherboard.
Me: *internally* Uhm, yea, sure... and I am the pope
Me: Sorry to interrupt you but how come the BIOS settings get reset when the CMOS battery is pulled out or dies if they are stored in ROM?
Teacher: ....
Me: *internally* yea, that's what I thought, you have no clue what you are even saying - the BIOS is stored in ROM or flash memory, while the settings are stored in NVRAM, also called CMOS memory...
-
The application has had a suspected memory leak for years. The tech team got the developers THE EXACT CODE that caused it. A few months of testing go by, with them telling us they're resolving their memory leak problem (finally).
Today: yeah, we still need restarts because we don't know if this new deployment will fix our memory leak, we don't know what the problem is.
WHAT THE FUCK WERE YOU DOING IN THE LOWER REGIONS FOR THREE FUCKING MONTHS?!?!?! HAVING A FUCKING ORGY???????????????
My friends took the time to find your damn problem for you AND YOU'RE GOING TO TELL ME YOU DON'T KNOW WHAT THE PROBLEM IS???
It was in lower regions for 3 MONTHS and you don't know how it's impacting memory usage?!?!?! DO YOU WANT TO STILL HAVE A JOB? BECAUSE IF NOT, I CAN TAKE CARE OF THAT FOR YOU. YOU DON'T DESERVE YOUR FUCKING JOB IF YOU CAN'T FUCKING FIX THIS.
Every time your app crashes, even though I no longer need to get your highest-level boss on the line for approval to restart your server, I'M GOING TO FUCKING CALL HIM AND MAKE HIM SEE THAT YOU'RE A FUCKING IDIOT. Eventually, he'll get so annoyed with me, your shit will be fixed. AND I WON'T HAVE TO DEAL WITH YOUR USELESS ASS ANYMORE.
(Rant directed at project manager more than dev. Don't know which is to blame, so blaming PM)
-
so my mom wanted to write some Word document, but she hadn't used her laptop for like ~5 years and it didn't boot up, so she asked me to fix it. now here is what I found:
>the laptop had a 240 gb hdd
>the hdd was literally broken
>bought her a new 500 gb hdd
>installed windows 7
>took 10 mins to install
>took 19 minutes to boot up
>removed windows 7
>installed win xp
>took 30 mins to install
>took 3 minutes to boot up
>opened windows
>checked pc specs
>see picture below
>[insert wtf gif here]
>installed drivers
>took 20 minutes to install drivers
>[insert epic music here]
>tried installing office 2016
>insta regret
>tried installing office 2010
>memory farted and I couldn't even move the cursor
>installed office 2007
>mom started writing document peacefully
>after 2 hours bsod
>mom asks me to fix
>opens laptop to check internal components
>the cpu had a black hole inside
>the fans weren't working due to the circuit being burnt for some reason
>kills laptop
>kills mom
>kills self
>live peacefully in hell
-
Here are the reasons why I don't like IPv6.
Now I'll be honest, I hate IPv6 with all my heart. So I'm not supporting it until inevitably it becomes the de facto standard of the internet. In home networks on the other hand.. huehue...
The main reason I hate it is that it looks in every way overengineered. Or rather, poorly engineered. IPv4 has 32 bits' worth, which translates to about 4 billion addresses. IPv6 on the other hand has 128 bits' worth of addresses.. which translates to.. some obscenely huge number that I don't even want to start translating.
That's the problem. It's too big. Anyone who's worked on the internet for any amount of time knows that the internet on this planet will likely not exceed an amount of machines worth more than 1 or 2 extra bits (8.6B and 17.2B respectively). Now of course 33 or 34 bits in total is unwieldy; it doesn't go well with electronics. From 32 you essentially have to go up to 64 straight away. That's why 64-bit processors are.. well, 64 bits. The memory grew larger than the 4GB that a 32-bit processor could support, so that's what happened.
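The arithmetic, for anyone who wants to check it (BigInt here, since 2^128 overflows a regular double):

console.log(2n ** 32n);  // 4294967296 -> IPv4
console.log(2n ** 33n);  // 8589934592 -> one extra bit, ~8.6 billion
console.log(2n ** 34n);  // 17179869184 -> two extra bits, ~17.2 billion
console.log(2n ** 64n);  // 18446744073709551616
console.log(2n ** 128n); // 340282366920938463463374607431768211456 -> IPv6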
The internet could've grown that way too. Heck, it probably could've become 64 bits in total, of which 34 are assigned to the internet and the remaining bits are left for whatever purposes large IP consumers would like to use them for.
Whoever designed IPv6 however.. nope! Let's give everyone a /64 range, and give them quite literally an IP pool far, FAR larger than the entire current internet. What's the fucking point!?
The IPv6 standard is far larger than it should've been. It should've been 64 bits instead of 128, and it should've been separated differently. What were they thinking? A bazillion colonized planets' internetworks that would join the main internet as well? Yeah that's clearly something that the internet will develop into. The internet which is effectively just a big network that everyone leases and controls a little bit of. Just like a home network but scaled up. Imagine or even just look at the engineering challenges that interplanetary communications present. That is not going to be feasible for connecting multiple planets' internets. You can engineer however you want but you can't engineer around the hard limit of light speed. Besides, are our satellites internet-connected? Well yes, but try using one. And those whizz by only a couple hundred km above sea level. The latency involved makes it barely usable. Imagine communicating with the ISS, the moon or Mars. That is not going to happen at an internet scale. Not even close. And those are only the closest celestial objects out there.
So why was IPv6 engineered with hundreds of years of development and likely at least a stage 4 civilization in mind? No idea. Future-proofing or poor engineering? I honestly don't know. But as a stage 0 or maybe stage 1 person, I don't think that I or civilization for that matter is ready for a 128-bit internet. And we aren't even close to needing so many bits.
Going back to 64-bit processors and memory: we passed 32-bit address width about a decade ago. But even now, we're only at about twice that size on average. We're not even close to saturating 64-bit address width, and that will likely take at least a few hundred years as well. I'd say that's more than sufficient. The internet should've really become a 64-bit internet too.
-
Not just another Windows rant:
*Disclaimer*: I'm a full-time Linux user for dev work, having switched from Windows a couple of years ago. I only open Windows for Photoshop (or games), or when I fuck up my Linux install (Arch user) because I got too adventurous (don't we all).
I have hated Windows 10 from day 1 for being a rebel. Automatic updates and generally so many bugs (especially the 100% disk usage on boot for idk how long) really sucked.
It's got ads now, and it's generally much slower than even a Windows 8 install would probably be.
The pathetic memory management and the overall slower interface really tick me off. I'm trying to work and get access to web services, and all I get are hang-ups.
Chrome is my go-to browser for everything, and the experience is subpar. We all know it gobbles up RAM, but it gobbles even more on Windows.
My Linux install on the same computer flies with a heavy project open in Android Studio, 25+ tabs in Chrome and a 1080p video playing in the background.
Up until the Creators Update, UI bugs were a common sight. Things would just stop working if you clicked them multiple times.
But you know what I'm tired of more?
The ignorant pricks who bash it for being Windows. This OS isn't bad. Sure, it's not Linux or macOS, but it stands strong.
You're just bashing it because it's not developer-friendly. And it's not, but it never advertised itself as such.
It's a full-fledged OS for everyone. It's not dev-friendly out of the box, but you can get it most of the way there; you're just lazy.
People do use Windows to code. If you don't know that, you're ignorant. They also make a living by using Windows all day. How bout tha?
But it tries to make you feel comfortable with the recent bash integration and the plethora of tools that Microsoft builds.
IIS may not be Apache or Nginx but it gets the job done.
Azure uses Windows and it's one of the best web services out there. It's freaking amazing, with dead-simple docs to get up and running with a web app in 10 minutes.
I saw many rants against VS but you know it's one of the best IDEs out there and it runs the best on Windows (for me, at least).
I'm pissed at you - you blind hater you.
Research and appreciate the good qualities in something instead of trying to be the cool but ignorant dev who codes on Linux/Mac but doesn't know shit about the advantages other platforms offer.
-
Okay, story time.
Back during 2016, I decided to do a little experiment to test the viability of multithreading in a JavaScript server stack, and I'm not talking about the Node.js way of queuing I/O on background threads, or about WebWorkers that box and convert your arguments to JSON and back during a simple call across two JS contexts.
I'm talking about JavaScript code running concurrently on all cores. I'm talking about replacing the god-awful single-threaded event loop of ECMAScript – the biggest bottleneck in software history – with an honest-to-god, lock-free thread-pool scheduler that executes JS code in parallel, on all cores.
I'm talking about concurrent access to shared mutable state – a big, rightfully-hated mess when done badly – in JavaScript.
This rant is about the many mistakes I made at the time, specifically the biggest – but not the first – of which: publishing some preliminary results very early on.
Every time I showed my work to a JavaScript developer, I'd get negative feedback. Like, unjustified hatred and immediate denial, or outright rejection of the entire concept. Some were even adamantly trying to discourage me from this project.
So I posted a sarcastic question to the Software Engineering Stack Exchange, which was originally worded differently to reflect my frustration, but was later edited by mods to be more serious.
You can see the responses for yourself here: https://goo.gl/poHKpK
Most of the serious answers were along the lines of "multithreading is hard". The top voted response started with this statement: "1) Multithreading is extremely hard, and unfortunately the way you've presented this idea so far implies you're severely underestimating how hard it is."
While I'll admit that my presentation was initially lacking, I later made an entire page to explain the synchronisation mechanism in place, and you can read more about it here, if you're interested:
http://nexusjs.com/architecture/
But what really shocked me was that I had never understood the mindset that all the naysayers adopted until I read that response.
Because the bottom-line of that entire response is an argument: an argument against change.
The average JavaScript developer doesn't want a multithreaded server platform for JavaScript because it means a change of the status quo.
And this is exactly why I started this project. I wanted a highly performant JavaScript platform for servers that's more suitable for real-time applications like transcoding, video streaming, and machine learning.
Nexus does not and will not hold your hand. It will not repeat Node's mistakes and give you nice ways to shoot yourself in the foot later, like `process.on('uncaughtException', ...)` for a catch-all global error handling solution.
No, an uncaught exception will be dealt with like any other self-respecting language: by not ignoring the problem and pretending it doesn't exist. If you write bad code, your program will crash, and you can't rectify a bug in your code by ignoring its presence entirely and using duct tape to scrape something together.
Back on the topic of multithreading, though. Multithreading is known to be hard, that's true. But how do you deal with a difficult solution? You simplify it and break it down, not just disregard it completely; because multithreading has its great advantages, too.
Like, how about we talk performance?
How about distributed algorithms that don't waste 40% of their computing power on agent communication and pointless overhead (like the serialisation/deserialisation of messages across the execution boundary for every single call)?
How about vertical scaling without forking the entire address space (and thus multiplying your application's memory consumption by the number of cores you wish to use)?
How about utilising logical CPUs to the fullest extent, and allowing them to execute JavaScript? Something that isn't even possible with the current model implemented by Node?
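For contrast, here's a sketch of the status quo the rant is pointing at: Node's stock answer to "use all the cores" is its cluster module, which forks one full copy of the process per core (isPrimary is isMaster on older Node versions):

import cluster from "node:cluster";
import { cpus } from "node:os";

if (cluster.isPrimary) {
  // one whole process, and one whole address space, per logical CPU
  for (let i = 0; i < cpus().length; i++) cluster.fork();
} else {
  // each worker still runs a single-threaded event loop; anything "shared"
  // travels as serialised IPC messages, never as shared memory
  console.log(`worker ${process.pid} ready`);
}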
Some will say that the performance gains aren't worth the risk. That the possibility of race conditions and deadlocks aren't worth it.
That's the point of cooperative multithreading. It is a way to smartly work around these issues.
If you use promises, they will execute in parallel, to the best of the scheduler's abilities, and if you chain them then they will run consecutively as planned according to their dependency graph.
If your code doesn't access global variables or shared closure variables, or your promises only deal with their provided inputs without side-effects, then no contention will *ever* occur.
If you only read and never modify globals, no contention will ever occur.
Are you seeing the same trend I'm seeing?
Good JavaScript programming practices miraculously coincide with the best practices of thread-safety.
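A quick illustration of that overlap, in ordinary TypeScript. These functions are pure - no globals, no shared closure state - which is exactly the code a parallel scheduler could run without locks:

const double = async (n: number) => n * 2; // inputs in, result out, nothing shared
const inc = async (n: number) => n + 1;

async function main() {
  // independent promises: free to run in parallel, contention is impossible
  const [a, b] = await Promise.all([double(21), inc(41)]);
  // chained promises: the dependency graph forces them to run in order
  const c = await double(10).then(inc);
  console.log(a, b, c); // 42 42 21
}
main();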
When someone says we shouldn't use multithreading because it's hard, do you know what I like to say to that?
"To multithread, you need a pair."
-
What an absolute fucking disaster of a day. Strap in, folks; it's time for a bumpy ride!
I got a whole hour of work done today. The first hour of my morning because I went to work a bit early. Then people started complaining about Jenkins jobs failing on that one Jenkins server our team has been wanting to decom for two years but management won't let us force people to move to new servers. It's a single server with over four thousand projects, some of which run massive data processing jobs that last DAYS. The server was originally set up by people who have since quit, of course, and left it behind for my team to adopt with zero documentation.
Anyway, the 500GB disk is 100% full. The memory (all 64GB of it) is fully consumed by stuck jobs. We can't track down large old files to delete, because du chokes on the workspace folder with its thousands of subfolders when there's no RAM to spare. We decide to basically take a hacksaw to it, deleting the workspace for every job not currently in progress. This of course fucked up some really poorly-designed pipelines that relied on workspaces persisting between jobs, so we had to deal with complaints about that as well.
So we get the Jenkins server up and running again just in time for AWS to have a major incident affecting EC2 instance provisioning in our primary region. People keep bugging me to fix it, I keep telling them that it's Amazon's problem to solve, they wait a few minutes and ask me to fix it again. Emails flying back and forth until that was done.
Lunch time already. But the fun isn't over yet!
I get back to my desk to find out that new hires, and people who got new Mac laptops recently, can't even install our toolchain, because management has started handing out M1 Macs without telling us and all our tools are compiled solely for x86_64. That took some troubleshooting to even figure out, because the only error people got from Homebrew was that the formula was empty, when it clearly wasn't.
After figuring out that problem (but not fully solving it yet), one team starts complaining to us about a GitHub problem because we manage the GitHub org. Except it's not a GitHub problem, and I already knew this because they are a Problem Team that uses some technical authoring software with Git integration but has only the barest understanding of what Git actually does. Turns out it's a Git problem. An update for Git was pushed out recently that patches a big bad vulnerability, and the way it was patched causes problems because they're using Git wrong (multiple users accessing the same local repo on a samba share). It's a huge vulnerability, so my entire conversation with them went sort of like:
"Please don't."
"We have to."
"Fine, here's a workaround, this will allow arbitrary code execution by anyone with physical or virtual access to this computer that you have sitting in an unlocked office somewhere."
"How do I run a Git command I don't use Git."
So that dealt with, I start taking a look at our toolchain, trying to figure out if I can easily just cross-compile it to arm64 for the M1 macbooks or if it will be a more involved fix. And I find all kinds of horrendous shit left behind by the people who wrote the tools that, naturally, they left for us to adopt when they quit over a year ago. I'm talking entire functions in a tool used by hundreds of people that were put in as a joke, poorly documented functions I am still trying to puzzle out, and exactly zero comments in the code and abbreviated function names like "gars", "snh", and "jgajawwawstai".
While I'm looking into that, the person from our team who is responsible for incident communication finally gets the AWS EC2 provisioning issue reported to IT Operations, who sent out an alert to affected users that should have gone out hours earlier.
Meanwhile, according to the health dashboard in AWS, the issue had already been resolved three hours before the communication went out and the ticket remains open at this moment, as far as I know.
-
Just came back from a new café (to the pedantic among us, yes I know it's a bar.. get over it).
And I met some Apple fanboy 🤭
So the guy kept on bragging about his shiny iPhone 6.. and I figured that I'd chime in. Due to my short-term memory being terrible, I'll be paraphrasing here.
M: me
S: iPhone user _/\_
M: iPhone 6 ey..? I've heard about some devices in which the old ones are throttled down in a system update "to save the battery".
S: Yes, biweekly updates!! You can even delay them, tuning them to the time during which your device is charging, so it can commence its system update.
M (thinking): You've clearly missed the point, sir.. but on Android, system updates don't even need to be willfully delayed. They (usually) won't commence unless your device is at 80% and charging. OnePlus has been an exception to this, probably under the assumption that their users are mostly power users who know what they're doing.
M: You do realize that, given how old your iPhone 6 already is, Apple will very likely start throttling your device during a system update in the next few months, right?
S: What the hell dude.. look, look how smoothly it's been going for the last few years!!! Nothing wrong with that.
M: Just wait until your repair bill comes from those Geniuses 🤭
M: Sir, you do realize that Apple quotes €600 for battery repairs nowadays, right?
S: What the hell dude!!! I can buy a whole new phone for that much!!
M: Exactly!! That's exactly Apple's business tactic!!! They design their phones as such that the battery replacement (one of the most common repairs) requires you to replace not only the battery, but the whole chassis!!! And on the XS, the battery replacement is nothing short of atrocious!!!
M: Here, have a look at this: https://youtube.com/watch/...
*shows Louis' newest video about him switching to iPhone XS*
S: Yeah that's just bullshit. I bet you're showing me this on one of those crappy Samsungs.
M: No sir. I'm showing this on my Nexus 6P, which is tethered to my OnePlus 6T. Speaking of which, let me introduce you to the Nexus 6P's (one of the crappiest Android flagships to ever exist) repair, the battery replacement of which I've done myself.
(you can watch the iFixit video about it here: https://youtube.com/watch/...)
*explains heatgun, screwdriver, heatgun battery replacement of Nexus 6P and the time each step takes - more than an hour combined*
S: Yeah that's because it's one of those crappy Androids. That'd never happen to this shiny iPhone, look, I've got a $20 battery right here!!!
*shows battery*
M: Sir... That's a battery for a MacBook. A laptop battery.... 🤨
I love how willfully ignorant these Apple users are. To them, all that exists is Apple and Samsung (both of which I hate because lockdown). And they apparently don't even know what repair they have to look for when they'll need one.. maybe that's why those Genius Bars exist? 🤭
I'd love to see the guy's face when the Geniuses quote him the price for battery replacement when his planned obsolescence time comes 🤭
-
I'll use this topic to segue into a related (lonely) story befitting my mood these past weeks.
This entire story is going to sound egotistical, especially this next part, but it's really not. (At least I don't think so?)
As I'm almost entirely self-taught, having another dev give me good advice would have been nice. I've only known/worked with a few people who were better devs than I, and rarely ever received good advice from them.
One of those better devs was my first computer science teacher. Looking back, he was pretty average, but he held us to high standards and gave good advice. The two that really stuck with me were: 1) "save every time you've done something you don't want to redo," and 2) "printf is your best debugging friend; add it everywhere there's something you want to watch." Probably the best and most helpful advice I've ever received 😊
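That printf tip translates directly to any language; here's a minimal TypeScript rendering of the same idea (the function and values are made up purely for illustration):

function applyDiscount(price: number, pct: number): number {
  const result = price * (1 - pct / 100);
  console.log(`applyDiscount(${price}, ${pct}) -> ${result}`); // the "printf": watch every suspect value
  return result;
}

applyDiscount(100, 15); // logs "applyDiscount(100, 15) -> 85"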
I've seen other people here posting advice like "never hardcode" or "modularity keeps your code clean" -- I had to discover these pretty simple concepts entirely on my own. School (and later college) were filled with terrible teachers and worse students, and so were almost entirely useless for learning anything new.
The only decent dev I knew had brilliant ideas (genetic algorithms, sandboxing, ...) before they were widely used, but could rarely implement them well because he was generally an idiot. (Idiot savant, I think? Definitely the idiot part.) I couldn't stand him. Completely bypassing a ridiculously long story, I helped him on a project to build his own OS from scratch; we made progress that impresses me even to this day. Custom bootloader, hardware interfacing, memory management, (semi) sandboxed processes, GUI, example programs...; we were in high school. I'm still surprised and impressed with what we accomplished.
But besides him, almost every other dev I met was mediocre. Even outside of school, I went so many years without having another competent dev to work with. I went through various jobs helping other dev(s) on their projects (or rewriting them), learning new languages/frameworks almost every time: php, pascal, perl, zend, js, vb, rails, node, .... I learned new concepts occasionally (which was wonderful) but overall it was just tedious and never paid well because I was too young to be taken seriously (and female, further exacerbating it). On the bright side, it didn't diminish my love for coding, and I usually spent my evenings playing with projects of my own.
The second dev (and one of the best I've ever met) went by Novo. His approach to a game engine reminded me of General Relativity: everything was modular, had a rich inheritance tree, and could receive user input at any point along said tree. A user could attach their view/control to any object. (Computer control methods could be attached in this way as well.) UI would obviously change depending on how the user could interact and the number of objects; admins could view/monitor any of these. Almost every object / class of object could talk to almost everything else. It was beautiful. I learned so much from his designs. (Honestly, I don't remember the code at all, and that saddens me.) There were other things, too, but that one amazed me the most.
I haven't met anyone like him since.
Anyway, I don't know if I can really answer this week's question. I definitely received some good advice while initially learning, but past that it's all been through discovering things on my own.
It's been lonely. ☹
-
I have to rant a bit about the toxic reactions to a constructive Q&A website.
People keep complaining that they get downvotes and corrections, or stuff like that.
Are you fucking kidding me?
So you expect people to spend their own time, for absolutely free, to help you, while you can't even invest the effort to describe your issue properly? And then you complain that people have trouble understanding your questions?
Let's look at this scientifically. Let's gather up some questions that have been received badly on SO in the last few hours. From the top (simply put https://stackoverflow.com/questions... in front of the id):
47619033 - person wants a discussion about an algorithm while not providing any information about what worked and what failed. "Please write a program for me". Breaking at least 2 rules.
47619027 - "check out my videos" spam
47619030 - "Here's the manual that has my answer but I can't find my answer in it".
47619004 - "how do I keep variables in memory"
47618997 - debug this exception, I'll give you no info on what I tried and failed. Screw this, you guys figure this out, I'm going out for beer.
47618993 - expects everyone to guess what the input is, what the expected output is, and whether he has read what HashMap is in the manual. But sure, this question is so far the best out of all the bad ones.
47618985 - please write code according to my specifications
Should I go on? There wasn't a single clear question about problems in code in this entire small set. Feel free to continue searching; let me know if you find something where:
1. You understand what's being asked
2. Answer is clear and unambiguous (ex. NOT "which language is the coolest?")
3. Not asking someone to write a program for them.
4. Answer is not found in the most basic form of manuals (ex. php.net)
5. Is about programming.
The point is:
If you get downvoted on Stack Overflow, then you wrote a shitty question. Instead of coming over here and venting uselessly, simply address the concerns and at least TRY to write a clear question if you expect any answers.
-
It's march, I'm in my final year of university. The physics/robotics simulator I need for my major project keeps running into problems on my laptop running Ubuntu, and my supervisor suggests installing Mint as it works fine on that.
I backup what's important across a 4GB and a 16GB memory stick. All I have to do now is boot from the mint installation disk and install from there. But no, I felt dangerous. I was about to kill anything I had, so why not `sudo rm -rf /*` ? After a couple seconds it was done. I turned it off, then back on. I wanted to move my backups to windows which I was dual booting alongside Ubuntu.
No OS found. WHAT. Called my dad, asked if what I thought had happened was true, and learnt that the root directory contains ALL files and folders, even those on other partitions. Gone were the past 2 1/2 years of uni work and notes not stored on the uni computers, and the 100GB+ of other stuff on there.
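In hindsight, a ten-second check would have shown the blast radius. A rough Python sketch (Linux only, assumes /proc/mounts exists) that lists everything hanging off the root:

# quick sanity check: every mounted filesystem lives somewhere under /
with open("/proc/mounts") as mounts:
    for line in mounts:
        device, mountpoint = line.split()[:2]
        print(device, "->", mountpoint)

Anything in that list - the Windows partition included, if it's mounted - sits under / and gets eaten by rm -rf /*.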
At least my current stuff was backed up.
TL;DR: sudo rm -rf /* because I'm installing another Linux distro. Destroys Windows too, and 2 1/2 years of uni work.13 -
It has been bugging the shit out of me lately... the sheer number of shit-tier "programmers" that have been climbing out of the woodwork the last few years.
I'm not trying to come across as elitist or "holier than thou", but it's getting ridiculous and annoying. Even on here, you have people who "only do frontend development" or some other lame ass shit-stain of an excuse.
When I first started learning programming (PHP was my first language), it wasn't because I wanted to be a programmer. I used to be a member (my account is still there, in fact) of "HackThisSite", back when I was about 12 years old. After hanging out long enough, I got the hint that the best hackers are, in essence, programmers.
Want to learn how to do SQL injection? Learn SQL - write a program that uses an SQL database, and ask yourself how you would exploit your own software.
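To make that concrete, here's a minimal sketch of exactly that exercise (Python's built-in sqlite3 here purely for illustration - the original story was PHP, but the idea is identical; the table and payload are made up):

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT)")
db.execute("INSERT INTO users VALUES ('alice')")

payload = "nobody' OR '1'='1"  # classic injection input

# vulnerable: user input pasted straight into the SQL string
unsafe = "SELECT * FROM users WHERE name = '%s'" % payload
print(db.execute(unsafe).fetchall())  # [('alice',)] - the WHERE clause was bypassed

# safe: the driver binds the value, so the payload stays an ordinary string
print(db.execute("SELECT * FROM users WHERE name = ?", (payload,)).fetchall())  # []

Write the first version, watch it leak rows, and SQL injection stops being an abstract interview topic.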
Want to reverse engineer the network protocol of some proprietary software? Learn TCP/IP - write a TCP/IP packet filter.
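Same deal - a toy sketch, nowhere near a real filter, with the packet hand-built so it runs anywhere: once you can pull a TCP header apart yourself (Python's struct module is enough), the protocol stops being magic.

import struct

TCP_HEADER = "!HHLLBBHHH"  # 20 bytes: ports, seq, ack, offset, flags, window, checksum, urgent

# craft a fake SYN so the demo is self-contained
raw = struct.pack(TCP_HEADER, 54321, 80, 1000, 0, 5 << 4, 0x02, 65535, 0, 0)

# a "filter" is just: unpack the fields, apply a rule
src, dst, seq, ack, off, flags, window, checksum, urgent = struct.unpack(TCP_HEADER, raw)
print(src, "->", dst, "SYN" if flags & 0x02 else "")
if dst == 80 and flags & 0x02:
    print("rule hit: inbound SYN to port 80")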
Back then, a programmer and a hacker were very much one and the same. Nowadays, some kid can download Python, write a "hello, world" program and they're halfway to freelancing or whatever.
It's rare to find a programmer - a REAL programmer, one who knows the systems he develops for better than the back of his hand.
These days, I find people want the instant gratification that these simpler languages provide. You don't need to understand how virtual memory works; hell, many people don't even really understand C/C++ pointers - and that's BASIC SHIT right there.
Put another way, would you want to take your car to a brake mechanic that doesn't understand how brakes work? I sure as hell wouldn't.
Watching these "programmers" out there who don't have a fucking clue how the code they write does what it does, is like watching a grown man walk around with a kid's toolbox full or plastic toys calling himself a mechanic. (I like cars, ok?!)
*sigh*
Python, AngularJS, Bootstrap, etc. They're all tools and they have their merits. But god fucking dammit, they're not the ONLY damn tools that matter. Stop making excuses *not* to learn something, Mr."IOnlyDoFrontEnd".
Coding ain't Lego's, fuckers.36 -
TLDR: SAP sucks. Don't ever work with it. Run away from it. Delete it from your memory. If your company works with it, quit. It's the best you can do.
A couple of weeks ago the group rant was "Story of your best/worst career choice" and I talked about the contract I signed. Even tho that is still true and I still feel like that, I think I got a new worst choice:
WORKING WITH SAP.
When I got this job I knew it would be SAP, but I didn't know what SAP was. I just thought "it's programming, how bad can it be?" OH BOII.
If only I had done some FUCKING RESEARCH I would have known this would be a mistake!!
And I knew I didn't want to work with this, I knew I wanted to be a web developer, but I STILL ACCEPTED THE JOB OH MAN WHAT WAS I THINKING I'M SO MAD.
Where I live we all have the same mentality when looking for the first job, which is to just accept anything you can get, because it's your first job, you need to work and to get experience, even if it's a bad job or if you know you won't like it. When my internship was ending, I told my parents I didn't want to stay there because they treat their employees like shit, and the salary is terrible. They told me to still accept it if they offered, because I still needed a job (this one was web tho) and experience.
So, of course, since I was looking for my first job, was told this my entire live, always thought like that and they were the first to contact me, I accepted it.
BIGGEST FUCKING MISTAKE!! DON'T THINK LIKE THIS!! AND STOP TELLING KIDS THIS!! IT'S NOT A GOOD MENTALITY!!!
ALSO DON'T WORK WITH SAP! EVER!24 -
Had nothing to do today, so I thought I'll test the migration from SVN to Git in GitLab.
Boss sent me an email today saying that when we migrate we need to preserve the history, so I actually have to put some effort into it. *sigh*
Shout-out to the Gitlab documentation at this point.
That's probably the best doc I've ever read...
Well so I tried to use svn2git. And well...
Who the fuck thought that this piece of shit software is in any way usable?
Holy crap!
If it fails, it just does so without any info why. Even in verbose mode.
And the RAM usage? What the actual fuck?
This whole thing is a complete memory leak!
32 gigs of RAM full in minutes, and the whole system starts to stall!
And then, when I thought it would finally run through:
Bam, another git checkout error...
Googling that error, I found something. A version of svn2git made in .NET Core.
Didn't expect much, but I tried it anyway.
And would you look at that!
It ran so smoothly and didn't need that much RAM, I had some doubts it had worked correctly.
But it did!
I think I'm gonna buy a coffee or two for some guy over in China now!6 -
Often I hear that one should block spam email based on content match rather than IP match - sometimes even that blocking Chinese ranges in particular is prejudiced and racist. Allow me to debunk that, having watched traffic on port 25 with tcpdump for several weeks now and gotten rid of most of my incoming spam in the process.
There are these spamhausen that communicate with my mail server as often as every minute.
- biz-smtp.com
- mailing-expert.com
- smtp-shop.com
All of them are Chinese. They make up - rough guess - around 90% of the traffic that hits my edge nodes, if not more.
The network ranges I've blocked are apparently as follows:
- 193.106.175.0/24 (Russia)
- 49.64.0.0/11 (China)
- 181.39.88.172 (Ecuador)
- 188.130.160.216 (Russia)
- 106.75.144.0/20 (China)
- 183.227.0.0/16 (China)
- 106.75.32.0/19 (China)
.. apparently I blocked that one twice, heh
- 116.16.0.0/12 (China)
- 123.58.160.0/19 (China)
It's not all China but holy hell, a lot of spam sure comes from there, given how Golden Shield supposedly blocks internet access to the Chinese citizens. A friend of mine who lives in China (how he got past the firewall is beyond me, and he won't tell me either) told me that while incoming information is "regulated", they don't give half a shit about outgoing traffic to foreign countries. Hence all those shitty filter bag suppliers and whatnot. The Chinese government doesn't care.
So what is the alternative like, that would block based on content? Well there are a few solutions out there, namely SpamAssassin, ClamAV and Amavis among others. The problem is that they're all very memory intensive (especially compared to e.g. Postfix and Dovecot themselves) and that they must scan every email, and keep up with evasion techniques (such as putting the content in an image, or using characters from different character sets t̾h̾a̾t̾ ̾l̾o̾o̾k̾ ̾s̾i̾m̾i̾l̾a̾r̾).
But the thing is, all of that traffic comes from a certain few offending IP ranges, and an iptables rule that covers a whole range is very cheap. China (or any country for that matter) has too many IP ranges to block all of them. But the certain few offending IP ranges? I'll take a cheap IP-based filter over expensive content-based filters any day. And I don't want to be shamed for that.
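For scale, here's roughly what an IP-based block boils down to per connection - a rough Python sketch (ranges copied from the list above; the function name is my own):

import ipaddress

BLOCKED = [ipaddress.ip_network(n) for n in (
    "193.106.175.0/24", "49.64.0.0/11", "106.75.144.0/20",
    "183.227.0.0/16", "106.75.32.0/19", "116.16.0.0/12", "123.58.160.0/19",
)]

def is_blocked(addr):
    # one pass over a short list of subnet masks - that's the whole "cost"
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in BLOCKED)

print(is_blocked("49.70.1.2"))  # True - inside 49.64.0.0/11
print(is_blocked("1.1.1.1"))    # False

A handful of mask comparisons per connection, versus feeding every single mail through a scanner that has to keep up with image spam and lookalike alphabets.7 -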
6 NEW Programming Languages 2k16
1. Go
Golang Programming Language from Google
Let's start this list of the six best new programming languages with Go, also known as Golang. Go is an open source programming language developed by three Google employees and launched in 2009 - very cool, just 3 people.
Go originated from and builds on popular programming languages such as C and Java; it offers the advantage of compact notation and aims to keep code simple and easy to read/understand. Go's designers - Robert Griesemer, Rob Pike and Ken Thompson - revealed that the complexity of C++ was their main motivation.
It's a simple programming language in which most tasks can be completed with just the standard library. Combining the speed of dynamic languages such as Python with the reliability of C/C++, Go is one of the best tools for building high-volume distributed systems.
Worth knowing too: as stated by Tokopedia's CTO, Mas Leon, Tokopedia will switch to Golang as the main foundation of its system. Scary, right?
Rather watch than read? Check out the video below:
http://youtube.com/watch/...
2. Swift
Swift Programming Language from Apple
Apple launched the Swift programming language at WWDC 2014 as a successor to Objective-C. Designed to be as simple as possible, Swift focuses on speed and safety.
Furthermore, in December 2015, Swift became open source under the Apache license. Since its launch, Swift has caught developers' eyes, its community has grown well, and it has become one of the 'hottest' programming languages in the world.
Learning Swift sets you up for a brighter future and gives you the ability to develop applications for Apple's vast iOS ecosystem.
3. Rust
Rust Programming Language from Mozilla
Developed by Mozilla back in 2014, Rust was voted the most loved programming language in Stack Overflow's 2016 developer survey.
Rust was developed as an alternative to C++ for Mozilla itself, and is described as a programming language that focuses on "performance, parallelisation, and memory safety".
Rust was created from scratch and implements modern programming language design. The language is very well supported by many developers out there, and by libraries.
4. Julia
Julia Programming Language
The Julia programming language is designed to help mathematicians and data scientists. It is called "a complete high-level and dynamic programming solution for technical computing".
Julia's user base is growing slowly but surely, roughly doubling every nine months. In the future, it will be seen as one of the "most expensive skills" in the finance industry.
5. Hack
Hack Programming Language from Facebook
Hack is another programming language developed by Facebook in 2014.
The social networking giant developed Hack and trumpets it as one of its great successes. Facebook has even migrated its entire system, originally developed in PHP, to Hack.
Facebook also released an open source version of the programming language as part of the HHVM runtime platform.
6. Scala
Scala Programming Language
Scala is actually a relatively old language compared to the others on this list. While one view of it is that it's relatively difficult to learn, the time you invest in learning Scala won't leave you sad and disappointed.
Its complex feature set gives you the ability to structure code better and orient it toward performance. As both an object-oriented (OOP) and functional programming language, it provides the ability to write code that is capable of evolving. Created with the goal of designing a "better Java", Scala has become one of the programming languages most needed in large enterprises.3 -
After months of development, testing, testing and even more testing the app was ready for deployment to production. Happy days, the end was in sight!
I had a week's leave so I handed over the preparation for deployment to my Senior Developer and left it in his capable hands while I enjoyed the sun and many beers.
I came back on the day of deployment and proudly pressed the deploy button. Hurrah!
Not long after I got loads of phone calls from around the country as the app wasn't working. What madness is this?! We tested this for months!
Turns out my Senior didn't like the way I'd written the SQL queries, so he changed them. Which is obviously both annoying and unprofessional, but even worse, he got a join wrong, so the memory usage was a billion times higher and it drained the network bandwidth for the whole site when I tried to debug it.
I got all the grief for the app not working and for causing many other incidents by running queries that killed the network.
So...much... rage!!!3 -
I do not like the direction laptop vendors are taking.
New laptops tend to feature fewer ports, making the user more dependent on adapters. Similarly to smartphones, this is a detrimental trend initiated by Apple and replicated by the rest of the pack.
As of 2022, many mid-range laptops feature just one USB-A port and one USB-C port, resembling Apple's toxic minimalism. In 2010, mid-class laptops commonly had three or four USB ports. I have even seen an MSi gaming laptop with six USB ports. Now, much of the edge is wasted "clean" space.
Sure, there are USB hubs, but those only work well with low-power devices. When attaching two external hard drives to transfer data between them, they might not be able to spin up due to insufficient power from the USB port or undervoltage caused by the impedance (resistance) of the USB cable between the laptop's USB port and hub. There are USB hubs which can be externally powered, but that means yet another wall adapter one has to carry.
Non-replaceable batteries (the shortest-lived component) mean difficult repairs and no more reserve batteries, as well as no extra-sized battery packs. When the battery expires, one might have to waste four hours at a repair shop for a replacement that would have taken a minute on a 2010 laptop.
The SD card slot is being replaced with inferior MicroSD or removed entirely. This is especially bad for photographers and videographers who would frequently plug memory cards into their laptop. SD cards are far more comfortable than MicroSD cards, and no, bulky external adapters that reserve the device's only USB port and protrude cannot replace an integrated SD card slot.
Most mid-range laptops in the early 2010s also had a LAN port for immediate interference-free connection. That is now reserved for gaming-class / desknote laptops.
Obviously, components like RAM and storage are far more difficult to upgrade in more modern laptops, or not possible at all if soldered in.
Touch pads increasingly have the buttons underneath the touch surface rather than separate, meaning one has to be careful not to move the mouse while clicking. Otherwise, it could cause an unwanted drag-and-drop gesture. Some touch pads are smart enough to detect when a user intends to click, and lock the movement, but not all. A right-click drag-and-drop gesture might not be possible due to the finger on the button being registered as touch. Clicking with short tapping could be unreliable and sluggish. While one should have external peripherals anyway, one might not always have brought them with. The fallback input device is now even less comfortable.
Some laptop vendors include a sponge sheet that they want users to put between the keyboard and the screen before folding it, "to avoid damaging the screen", even though making it two millimetres thicker could do the same without relying on a sponge sheet. So they want me to carry that bulky thing everywhere around? How about no?
That's the irony. They wanted to make laptops lighter and slimmer, but that made them adapter- and sponge sheet-dependent, defeating the portability purpose.
Sure, the CPU performance has improved. Vendors proudly show off in their advertisements which generation of Intel Core they have this time. As if that is something users especially care about. Hoo-ray, generation 14 is now yet another 5% faster than the previous generation! But what is the benefit of that if I have to rely on annoying adapters to get the same work done that I could formerly do without those adapters?
Microsoft has also copied Apple in demanding internet connection before Windows 11 will set up. The setup screen says "You will need an Internet connection…" - no, technically I would not. What does technically stand in the way of Windows 11 setting up offline? After all, previous Windows versions like Windows 95 could do so 25 years earlier. But also far more recent versions. Thankfully, Linux distributions do not do that.
If "new" and "modern" mean more locked-in and less practical and difficult to repair, I would rather have "old" than "new".12 -
FFUUUuucccckkk me sideways. So I decided to look into USB type-c's power delivery and alt modes. Cause I kinda want to make an adapter card to run my displays over a single cable. TLDR of the rest: USB-C has some huge capabilities which noone is interested in using since its way to complex to handle for what its worth in the end.
Now PD alone is kinda ok to deal with, since a lot of powerbanks use it and some hobby guys documented how to work with it. I find it really odd though that you NEED to use a dedicated IC for using the configuration channel to negotiate how much power you can draw. Why didn't the USB standard use some simple 5V low speed signalling? Also the standard says that you only have to implement 5V 0.6A, with every other power level being optional. (This is also true for cables. Most manufacturers use only the USB 2.0 standard for them and brag about how fast type-C is. ლ(ಠ益ಠლ) )
Now to the alt modes. These motherfuckers are a real shitshow to deal with. First you need a mux to deal with USB-C's two-way insertion, so your signals won't get flipped. Next thing is that you have four lanes at your disposal in alt mode, which you can use either for four DisplayPort lanes or for two DP lanes and two USB 3.0 lanes. (You always get USB 2.0.) Now you may think that there would be one simple chip to do it all? Nope, you need at least two, at the price of $6 each. One for PD and one for alt modes. Both are very hard to solder (QFN, 0.5 mm pitch, 40+ pins). TI ended up being the only one with a decent offering of ICs that do what I need. As for working with them, you would think that you just slap a simple MCU on there that communicates over I2C or SPI to configure the chips? Nope! You program the chip's memory, from which it configures itself. And the programming is done with some TI tool, which gives me no idea as to how you can handle everything with no control logic behind it.
Looking into alternative ICs leaves me with Cypress Semi. And their documentation is basically a total mess. I wanna know what that chip is good for and what I need to do to make it work. I don't care about technical details mixed with marketing jargon nobody understands. And I really despise that I have to register just to download a datasheet. Especially since there is no info about it on the main page.
And this whole rant hasn't even touched on the fact that USB-C only uses DP and nothing else. So you better hope that you have DP++ so you can use a passive conversion.
This was my Ted Talk about USB-C. Some info in it may be subject to my stupidity and errors, as it is currently 02:15 in the morning and I need some sleep.14 -
Chrome, Firefox, and yes even you Opera, Falkon, Midori and Luakit. We need to talk, and all readers should grab a seat and prepare for some reality checks when their favorite web browsers are in this list.
I've tried literally all of them, in search of a lightweight (read: not ridiculously bloated) web browser. None of them fit the bill.
Yes Midori, you get a couple of bonus points for being the most lightweight. Luakit however.. as much as I like vim in my terminal, I do not want it in a graphical application. Not to mention that, just like all the others, it uses webkit2gtk and is therefore just as bloated as the rest. Lightweight my ass! But programmable with Lua, woo! As if Selenium, headless Chrome, etc. don't already do that for any browser. And that's it for the unique features as far as I'm concerned. One is slow, single-threaded and lightweight-ish (Midori) and another has vim keybindings in an application that shouldn't (Luakit).
Pretty much all of them use webkit2gtk as their engine, and pretty much all of them launch a separate process for each tab. People say this is more secure, but I have serious doubts about that. You're still running all these processes as the same user, and they all have full access to the X server they run under (this is also a criticism against user separation on a single X session in general). The only thing it protects against is a website crashing the browser, where only that tab and its process would go down. Which.. you know.. should a webpage even be able to do that?
But what annoys me the most is the sheer amount of memory that all of these take. With all due respect, browsers, I am not quite prepared to give 8 fucking gigabytes - half the memory in this whole box! - just for a dozen or so tabs. I shouldn't have to move my web browser to another, lesser-used 16GB box just to prevent this one from going into fucking swap over a dozen tabs. And before someone has a go at the add-ons: there are 4 installed, and that's it. None of them are even close to causing this complete and utter memory clusterfuck. It's the process separation. Each process consumes half a GB of memory, and there's around a dozen of them in a usual browsing session. THAT is the real problem. And I want to get rid of it.
Browsers are at their pinnacle of fucked up in my opinion, literally to the point where I'm seriously considering elinks. Being a sysadmin, I already live my daily life in terminals anyway. As such I also do have resources. But because of that I also associate every process with its cost to run it, in terms of resources required. Web browsers are easily at the top of the list.
I want to put 8GB into perspective. You can store nearly 2 entire DVD movies in that memory. However media players used to play them (such as SMPlayer) obviously don't do that. They use 60-80MB on average to play the whole movie. They also require far less processing power than YouTube in a web browser does, even when you download that exact same video with youtube-dl (either streamed within the media player or externally). That is what an application should be.
Let's talk a bit about these "complicated" websites as well. I hate to break it to you framework web devs, but you're a dime a dozen. The competition is high between web devs for that exact reason. And websites are not complicated. The document itself is plain old HTML, yes even if your framework converts to it in the background. That's the skeleton of your document, where I would draw a parallel with documents in office suites that are more or less written in XML. CSS.. oh yes, markup. Embolden that shit, yes please! And JavaScript.. oh yes, that pile of shit that's been designed in half a day, and has a framework called fucking isEven (which does exactly what it says on the tin, modulo 2 be damned). Fancy some macros in your text editor? Yes, same shit, different pile.
Imagine your text editor being as bloated as a web browser. Imagine it being prone to crashing tabs like a web browser. Imagine it being so ridiculously slow to get anything done in your productivity suite. But it's just the usual with web browsers, isn't it? Maybe Gopher wasn't such a bad idea after all... Oh and give me another update where I have to restart the browser when I commit the heinous act of opening another tab, just because you had to update your fucking CA certs again. Yes please!19 -
!dev
A child's mind is fascinating.
I remember how it felt being a kid, just deliriously happy.
Things were magical, mystical and happy.
I knew the world wasn't perfect, I knew bad things happened to good people.
But a kid's mind is so powerful that it can fill in the blanks with the most cheerful and optimistic perspectives.
And at some point in my childhood I was exposed to videogames, and that kinda took me down fantasy lane even further.
I was extremely young and barely retaining any memories when I was exposed to my first console, a famicom.
I have a somewhat vivid memory of my mind being blown away for the first time by watching my brother play New Ghostbusters II for NES.
From then on, we never stopped and played several console and dos/pc games.
When I was 10, someone from the neighborhood brought in a couple of floppys with Pokemon Yellow.
"What? Pokemon? How the fuck is that even possible? This is a pc, not a gameboy".
I didn't know at the time what an emulator was, but I was super fucking stoked to be able to play that.
My dad had a 1 gb laptop from work that he didn't use, so I hoarded that shit, and I would get to bed and play nearly everyday.
The experience was surreal. I was doing pc gaming... not on a chair, on a fucking bed, and I was playing a gameboy game... on a pc.
It was so intense to me, that even after more than 2 decades of that time in my life, I still remember how it feels like.
Like, you know how you can "feel" things if you think about them? like for example if you think about the taste of chicken, you can somehow feel it for a second.
Well I have like an actual physical sensation linked to that experience but I can't explain it at all, because it's just a sensation.
I think people usually say they feel that way, for example, about the PSX (usually refered to as ps one) loading screen. I experienced that too but when I was 12, so it was not as intense (it does make me feel the fuzzies though).
I also remember other things with very high detail, like the texture of my bed cover, the weather, mom cooking, the clunky shape of the laptop, the way I carelessly stored it above a pile of magazines, etc.
I rememeber ofc how it felt looking at the game sprites, interacting with NPCs, and the goddamn fucking glorious music.
It was dreamy.
Years and years later, I grew up and stopped living in a fantasy world, and became more aware of the grim aspects of life my younger self was sugarcoating.
So I tried to play Pokemon again, again and again, and no matter how hard I tried to revive that euphoria, I just never could.
I started to get annoyed at the game.
"Come oooon, I did the tutorial already, let me skip this.
This pokemon is useless, why am I even training it.
Fuck, I'm tired of grinding"
At some point I accepted that the feeling would never return, and that it would just live in my memory.
Ironically, I can recall that memory and how it felt anytime I want to.
And I can actually still feel it; throughout all these years, it has never worn down.
And eventually I learned how to play pokemon and enjoy it:
I read tier lists on Smogon online and just catch and train the Pokemon that are higher on the list, which is how I managed to beat Yellow in like 3 days.
(This is nothing compared to what speedrunners do, but much better than the weeks it had taken me in the past).
That served as an important lesson: when a kid plays a game, his mind is also running the game at the same time, filling in the blanks with its imagination.
A very similar experience happened to me with harvest moon, which is the precursor of stardew valley.
and that game is faaar more emotional: you talk to people, over time you befriend them and they open up, you meet a girl, you marry her, have a kid
you get farm animals, you brush them, they become happy
you get attached
that game was also so powerful in me that in all naiveness I thought I wanted to be a farmer.
Eventually I grew up and hit puberty and from then on, I focused more on competitive games, like smash bros, cs and tf2.
and i dunno how to end a post so eat my fucking nuts17 -
I just installed Opera Mini on my PSP. That alone isn't very exciting on its own, although I am stoked that my website does in fact render on a device from 2009. With the helpful guidance of a laptop from 2004 that's doing the hotspot duties for this thing.
No, what really got me stoked is that Opera still supports these old platforms, and how small they managed to make it. The .jar file for Opera Mini 4.5 is ~800kB large. There's a .jad file as well but it's negligible in size and seems to be a signature of sorts.
Let that sink in for a moment. This entire web browser is 800kB. Firefox meanwhile consistently consumes 800 MEGABYTES.. in MEMORY. So then I got to thinking: how on earth did they manage to cram an entire functioning web browser into 800kB? Hell, what makes up a web browser anyway?
The answer to that question I got to is as follows. You need an engine to render the web page you receive. You need a UI to make the browser look nice. And finally you need a certificate store to know which TLS certificates to trust. And while probably difficult to make, I think it should be possible to do in 800k. Seriously, think about it. How would you go *make* a web browser? Because I've already done that in the past.
Earlier I heard that you need graphics, audio, wasm, yada yada backends too.. no. Give your head a shake. Graphics are the responsibility of the graphics driver. A web browser shouldn't dabble with those at all. Audio, you connect to PulseAudio (in Linux at least) and you're done. Hell, I don't even care about ALSA or OSS here. You just connect to the stuff that does that job for you. And WebAssembly.. God, I could rant about that shit all day. How about making it a native application? It's not like we lack actual Assembly for BIOS and low-level drivers, and we already have a better language for the more portable stuff, called C.
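For perspective on how little "fetch a page and show its text" strictly requires, here's a toy Python sketch - toy being the operative word: no TLS handling, no CSS, no layout, and example.com is just a placeholder URL:

from urllib.request import urlopen
from html.parser import HTMLParser

class TextDump(HTMLParser):
    # throw away markup and scripts, print the readable text
    def __init__(self):
        super().__init__()
        self.skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1
    def handle_data(self, data):
        if not self.skip and data.strip():
            print(data.strip())

page = urlopen("http://example.com/").read().decode("utf-8", "replace")
TextDump().feed(page)

That's obviously not an engine. But it's the skeleton of the job, and it's a few hundred bytes of logic.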
Seriously, think about it. Opera - a reputable browser vendor - managed to do it in 800kB on a 12 year old device. Don't go full wank on your framework shit on the comments. And don't you fucking dare to tell me that there's more to it. They did it for crying out loud. Now you take a look at your shitpile for JS code and refactor that shit already. Thank you.21 -
fuck; shit; shove it up your ass.
this fucking shit Xamarin. I wish the ass who programmed the Xamarin VS2017 integration would go fuck off.
srsly, I just want to fucking code, but this fucking fucker VS2017 keeps shitting all around me
first I was gonna install it. didn't install because there was no disk space left. fair enough, my fault there.
cleaned 35 gbs.
finished installing VS, with Xamarin. The FIRST GOD DAMN TIME I create a fucking project: 2 fucking errors and 3 warnings. I DIDN'T EVEN TYPE A COMMA.
ok, tried fixing it. it seems to be a conflict between the versions of Android and Xamarin Forms. for fuck's sake, it shouldn't be like this. anyway.
tried downloading the updated Android version.
it failed at 80%! what error, you ask? missing fucking space. ok, fuck, that thing is huge. ok, my fault again. uninstalled all programs I was not using, all projects I'm not currently working on. another fucking 30GB free. tried again. ANDROID IS TOO FUCKING HUGE, CAN'T INSTALL IN 30GB!!!
Ok. instead of updating Android, gonna downgrade Xamarin. can't downgrade. ok, gonna remove it and install an earlier version.
uninstalled. CAN'T FIND XAMARIN DLLS.
I was like, fuck this project, gonna start a new one. ok, all seems fine, for some weird reason. Except no. I try adding a new page, oops, APPARENTLY VS2017 CAN'T LOAD A GODDAMN .XAML
Ok, I can create a .cs page. done, except now I get a fucking timeout error. fuck.
I search the internet for a workaround, see a guy saying I could manually add a .xaml + .cs by creating these files and then adding them to the proj file.
did it. I go again, everything seems fine. but now I can't freaking reference the damn page.
I'm fucking losing my mind here.
In the meantime I have to turn in this project at the end of the week AND I CAN'T FUCKING OPEN THE GODDAMN FREAKING PROJECT PROPERLY!
FUCK. MY. LIFE.
FUCK XAMARIM AS WELL
FUCK VISUAL STUDIO
FUCK MICROSOFT
FUCK THAT DAMN SSD
FUCK THAT BOSS WHO THINKS THAT A 128GB SSD IS ENOUGH
FUCK IT ALL...15 -
It grinds my gears to no end how insanely BAD most electrical engineering software is. Let's start with Tina, a circuit simulator. A few versions ago it was rather good, but now it feels like it's built upon more legacy crap than fucking Windows! This causes it to have memory access violations and crashes even when you look at it from an odd angle.
On the topic of circuit simulation: LT-Spice! It has fewer errors than Tina but is impossible to use without being lobotomized first. Who the FUCK decided it was a good idea to reinvent keyboard shortcuts by moving all of them to the F-row at the top of the keyboard? Also there is no option to delete a component. YOU NEED TO USE CUT IN ORDER TO REMOVE IT!
And at last, Altium Designer for layout and schematics, whose license costs 9 grand. No one outside of some companies will buy this because of the price. Altium realized this and made two watered-down versions of it, which don't really get updates anymore (the last one was in 2018). So they essentially made a cash grab from people who can't afford their actual product. There also exist other (and a lot cheaper) products than what Altium offers. The problem is that they don't interoperate well: schematics drawn in one program will look distorted in another, or not import at all. And since Altium is the industry standard, you've got yourself this nice steaming soup of impossible collaboration. It's kinda like Adobe being absolute shit at progressing their software just because they've got no competition. Or rather they do, but the industry won't switch because Adobe is so ingrained into it.6 -
Amazing! My joke functionality worked and the server delivered its content successfully to the client. The joke: malloc is wrapped to give that message and retry IMMEDIATELY. The expectation is that a computer that's already in panic mode gets fucked up even more because it's out of memory. My malloc is literally while ((ptr = malloc(size)) == NULL) { show_warning(); }.
Imagine if it didn't give that warning - I would never have known that a malloc failed. Who checks their freaking malloc result? You should, but I don't see many people doing it.
The previous crash on screen is what happens if you do a GET instead of a POST. I just declare my server app indestructible btw. Ffs!5 -
I fucking hate Electron. Whatever happened to developing software natively? It's not like you have to stick to dot Net and C# or whatever - there's literally Lazarus or Delphi, and Lazarus at least is not only open source but also supports all major platforms.
Even Python has GTK, Qt and Pywin32 or whatever its called. While not exactly cross platform, it's still not eating up 1GB of RAM when you launch it.
I don't care if Bob from across the street uses it because he's too lazy to learn anything new, but when huge companies like fucking Discord (valued at 10B dollars) use it, it's insane.
More than once Discord has had a memory leak, reaching upwards of 6.5GB of RAM usage.
Whats the most popular code editor? VSCode, Electron.
Chat client? Discord, Electron.
Wanna use something other than Discord? Maybe Matrix? Well guess what, while they do have multiple clients, the most developed and usable one is Element, yeah, Electron.
Slack? Electron
My crypto wallet? Exodus, Electron.
I genuinely don't think 16GB of RAM is enough nowadays. Thankfully I'm running a very minimal install of Arch Linux and do most of my work in a KVM, but it still hurts my brain.
The way things are looking nowadays, we'll be using JavaScript for kernels soon.
Thanks for coming to my Ted Talk.
Also apparently the filter on this site sees ". net" as an url.10 -
Imagine asking your friends to help you rate your app on the Google Play Store, and instead of saying NO, I DON'T WANT TO RATE YOUR APPLICATION... they decide to fuck with your mind.
1)
I will rate it tomorrow. (she never rated it the next day, nor in the weeks after)
2)
I will keep it in mind and rate it later :). (she never rated it later)
3)
I rated it haha (less than 30 seconds later they deleted the rating)
4)
Send me a link and I'll rate it (I send the link, they never respond or read my message again)
5)
I don't have memory on my phone :) (because 13MB is apparently a huge storage requirement, but taking 1 million selfies adding up to 25GB is completely fine)
6)
I don't have memory on my phone, what don't you understand :) x2 (this is the second girl)
7)
You're trying to give me a virus?? No (I got blocked multiple times)
8)
You want to hack me by making me install this application from the link that you sent me that leads to google play store? No (blocked)
9)
Rate your app? Haha i dont care about it because it doesnt bring me any benefit only the fat cocks that fill my pussy up satisfy me and not ur app haha
10)
Haha send me a link, I'll rate it (I send the link; 8 hours later, no reply and my message unread. I text her back asking if she had done it and I'm still ignored)
...
N)
more
----
Notice how none of these people have said the 2 letter word: "no".
All of these 10 examples are based on a true story.
All of these 10 examples are different people.
---
How hard
Can it be
To just
Write
no
---
.
---
For all of you who are about to trash talk saying i am desperately trying to beg people to rate my app:
I've known all of those people for a long time. But when it comes to asking (not forcing) someone to do you a favor for free that takes no more than 30 seconds, no one seems to have 30 seconds of their free time. Don't get me wrong, some of my friends did politely rate it and leave a review; even people who I barely knew left a review and a rating. But the people I was closest with didn't.
---
In the beginning I didn't care about this at all. Then I started falling into depression because of it. Then I fell into deep depression. Then I sunk so deep that I couldn't feel any emotions anymore, so I laughed as an antidepressive mechanism whenever something depressing happened. Now I can't even laugh because I have no more energy. Now I actually shed man tears.
---
The only thing more valuable than people, any materialistic thing, animals, coding and even money - is time....
----
why do you waste my time
if i ask you to do something that takes 30 seconds and you dont want to do it
why cant you just say no
why do you drag me
why do you say you're going to do it when you know you wont do it
what do you gain by unnecessarily lying to someone for such a small thing?
to someone who has been a good person to you?
do you feel superior?
is your ego bigger?
----
This experience has taught me that not even a human of the same blood can be trusted.
All of you are fucked up in the head in your own style - and I am guilty of it too, all of us are.
But I have never seen human evolution go from simplicity to overengineered complexity bullshit like this, where you have to lie to someone and waste hours, days, weeks, months and sometimes years of their time just because you don't want to say a 2 letter word: no.
But when that person becomes more successful than you and achieves higher status, THEN you have those 30 seconds of free time. All of you are fucking cynics, and I am so overly disgusted by all of this fucking bullshit....
-----
This experience has taught me to simply focus on investing in myself, to learn and improve myself and no one else. To not even bother asking for even a small kind of help, or feedback on my work, because people don't have 30 seconds of their free time. That is all.12 -
Many people here rant about dependency hell (rightly so). I've been doing systems programming for quite some time now, and it has changed my view on what I consider a dependency.
When you build an application you usually have a system you target and some libraries you use that you consider dependencies.
So the system is basically also a dependency (which is abstracted away in the best case by a framework).
What many people forget are standard libraries and runtimes. Things like strlen, memcpy and so on are not available on many smaller systems but you can provide implementations of them easily. Things like malloc are much harder to provide. On some system there is no heap where you could dynamically allocate from so you have to add some static memory to your application and mimic malloc allocating chunks from this static memory. Sometimes you have a heap but you need to acquire the rights to use it first. malloc doesn't provide an interface for this. It just takes it. So you have to acquire the rights and bring them magically to malloc without the actual application code noticing. So even using only the C standard library or the POSIX API can be a hard to satisfy dependency on some systems. Things like the C++ standard library or the Go runtime are often completely unavailable or only rudimentary.
For those of you aiming to write highly portable embedded applications please keep in mind:
- anything except the bare language features is a dependency
- require small and highly abstracted interfaces, e.g. instead of malloc, require a pointer and a size to be given to your application instead of your application taking it (see the sketch after this list)
- document your ABI well because that's what many people are porting against (and it makes it easier to interface with other languages)
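To illustrate the malloc point above - a rough sketch, in Python purely to show the shape of the interface (in a real embedded application this would of course be C):

class Arena:
    # the platform hands the application a buffer; the app only carves it up
    def __init__(self, buf):
        self.buf = memoryview(buf)
        self.top = 0

    def alloc(self, size):
        if self.top + size > len(self.buf):
            raise MemoryError("arena exhausted")
        chunk = self.buf[self.top:self.top + size]
        self.top += size
        return chunk

backing = bytearray(1024)  # the platform's choice: static memory, a real heap, whatever
arena = Arena(backing)
buf = arena.alloc(16)      # the application itself never "takes" memory
buf[:5] = b"hello"

Whether those 1024 bytes are static RAM or came from a heap that needed rights acquired first is now the platform's problem, not the application's.2 -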
Github 101 (many of these things pertain to other places, but Github is what I'll focus on)
- Even the best still get their shit closed - PRs, issues, whatever. It's a part of the process; learn from it and move on.
- Not every maintainer is nice. Not every maintainer wants X feature. Not every maintainer will give you the time of day. You will never change this, so don't take it personally.
- Asking questions is okay. The trackers aren't just for bug reports/feature requests/PRs. Some maintainers will point you toward StackOverflow but that's usually code for "I don't have time to help you", not "you did something wrong".
- If you open an issue (or ask a question) and it receives a response and then it's closed, don't be upset - that's just how that works. An open issue means something actionable can still happen. If your question has been answered or issue has been resolved, the issue being closed helps maintainers keep things un-cluttered. It's not a middle finger to the face.
- Further, on especially noisy or popular repositories, locking the issue might happen when it's closed. Again, while it might feel like it, it's not a middle finger. It just prevents certain types of wrongdoing from the less... courteous or common-sense-having users.
- Never assume anything about who you're talking to, ever. Even recently, I made this mistake when correcting someone about calling what I thought was "powerpc" just "power". I told them "hey, it's called powerpc by the way" and they (kindly) let me know it's "power" and why, and also that they're on the Power team. Needless to say, they had the authority in that situation. Some people aren't as nice, but the best way to avoid heated discussion is....
- ... don't assume malice. Often I've come across what I perceived to be a rude or pushy comment. Sometimes, it feels as though the person is demanding something. As a native English speaker, I naturally tried to read between the lines as English speakers love to tuck away hidden meanings and emotions into finely crafted sentences. However, in many cases, it turns out that the other person didn't speak English well enough at all and that the easiest and most accurate way for them to convey something was bluntly and directly in English (since, of course, that's the easiest way). Cultures differ, priorities differ, patience tolerances differ. We're all people after all - so don't assume someone is being mean or is trying to start a fight. Insinuating such might actually make things worse.
- Please, PLEASE, search issues first before you open a new one. Explaining why one of my packages will not be re-written as an ESM module is almost muscle memory at this point.
- If you put in the effort, so will I (as a maintainer). Oftentimes, when you're opening an issue on a repository, the owner hasn't looked at the code in a while. If you give them a lot of hints as to how to solve a problem or answer your question, you're going to make them super, duper happy. Provide stack traces, reproduction cases, links to the source code - even open a PR if you can. I can respond to issues and approve PRs from anywhere, but can't always investigate an issue on a computer as readily. This is especially true when filing bugs - if you don't help me solve it, it simply won't be solved.
- [warning: controversial] Emojis dilute your content. It's not often I see it, but sometimes I see someone use emojis every few words to "accent" the word before it. It's annoying, counterproductive, and makes you look like an idiot. It also makes me want to help you way less.
- Github's code search is awful. If you're really looking for something, clone (--depth=1) the repository into /tmp or something and [rip]grep it yourself. Believe me, it will save you time looking for things that clearly exist but don't show up in the search results (or is buried behind an ocean of test files).
- Thanking a maintainer goes a very long way in making connections, especially when you're interacting somewhat heavily with a repository. It almost never happens and having talked with several very famous OSSers about this in the past it really makes our week when it happens. If you ever feel as though you're being noisy or anxious about interacting with a repository, remember that ending your comment with a quick "btw thanks for a cool repo, it's really helpful" always sets things off on a Good Note.
- If you open an issue or a PR, don't close it if it doesn't receive attention. It's really annoying, causes ambiguity in licensing, and doesn't solve anything. It also makes you look overdramatic. OSS is by and large supported by peoples' free time. Life gets in the way a LOT, especially right now, so it's not unusual for an issue (or even a PR) to go untouched for a few weeks, months, or (in some cases) a year or so. If it's urgent, fork :)
I'll leave it at that. I hear about a lot of people too anxious to contribute or interact on Github, but it really isn't so bad!4 -
The "stochastic parrot" explanation really grinds my gears because it seems to me just to be a lazy rephrasing of the chinese room argument.
The man in the machine doesn't need to understand Chinese. His understanding, or lack thereof, is completely immaterial to whether the program he is *executing* understands Chinese.
It's a way of intellectually laundering, or hiding, the ambiguity underlying a person's inability to distinguish the process of understanding from the mechanism that does the understanding.
There are recent arguments that some elements of relativity actually explain our inability to prove or dissect consciousness in a phenomenological context, especially with regard to outside observers (hence the reference to relativity) - but I'm glossing over it horribly and probably wildly misunderstanding some aspects. I digress.
It is to say, we are not our brains. We are the *processes* running on the *wetware of our brains*.
This view is consistent with the understanding that there are two types of relations in language: words as they relate to real-world objects, and words as they relate to each other. ChatGPT et al. have a model of the world only inasmuch as words-as-they-relate-to-each-other carry some information about the world as a model.
It is to say that while we may find some correlates of the mind in the hardware of the brain - more substrate than direct mechanism - it is possible that language itself, executed on this medium, acts as a scaffold for a broader, rich internal representation.
Anyone arguing that these LLMs can't have a mind because they are one-off input-output functions, doesn't stop to think through the implications of their argument: do people with dementia have agency, and sentience?
This is almost certain, even if they forgot what they were doing or thinking about five seconds ago. So agency and sentience, while enhanced by memory, are not reliant on memory as a requirement.
It turns out there is much more information about the world contained in our written text than just the surface-level relationships. There is a rich, dynamic level of entropy buried deep in it, and the training of these models is apparently what allows them to tap into this representation in order to do what many of us accurately see as forming internal simulations - with the series of calculations necessary for those internal simulations laundered across the statistical generation of just one output token or character at a time.
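Mechanically, "one token at a time" just means a loop like this toy Python sketch (the model here is a meaningless stub, purely to show the shape of autoregressive decoding):

import random

def stub_model(context):
    # stand-in for the network: a distribution over a tiny vocabulary,
    # conditioned on everything generated so far
    vocab = ["the", "cat", "sat", "down", "."]
    weights = [1 + len(context) % 3, 2, 2, 1, 1]
    return vocab, weights

tokens = ["the"]
for _ in range(8):
    vocab, weights = stub_model(tokens)
    # whatever internal "simulation" the real model runs, it reruns over the
    # full context every step and cashes out as a single sampled token
    tokens.append(random.choices(vocab, weights=weights)[0])
print(" ".join(tokens))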
And much as we won't find consciousness by examining a single picture of a brain in action, even if we track it down to single neurons firing, neither will we find consciousness anywhere we look - not even in the single weighted values of an LLM's individual network nodes.
I suspect this will remain true long past the day a language model (or any other model) emerges that can talk and do everything a human can do, intelligence-wise.31 -
Absolutely not dev-related.
Blah, blah, weird conversation and shit. I'm too tired and lazy to write this crap again, but let's do it.
The guy is a dev I randomly found on some chat service; he was interesting to talk with until this conversation. I'm writing this from memory, so yeah.
Him: So by the way, I wrote an app that you give your penis size to, to get measurements and stats about it.
Me, thinking it was dev humor: That's hilarious. Tell me more, I'm interested.
Him: So the idea behind all of this was to gather some big data style info about people's penis size and habits and all that stuff.
Me: Man that's awesome. Can I see the source?
Him: No, it's proprietary. You can buy a license though.
Me: You went that far for a joke?
Him: What joke?
Me: The whole software you just told me about.
Him: That's not a joke, I'm being very serious about it.
Me: Oh well. What did you get from the stats?
Him: I got some tips from people's habits! I never thought that shaving it could make it look bigger, but that's awesome!
Me: Do you really care about it that much?
Him: Studies have proven that size correlates with confidence. Since I started doing it, I've been more confident than ever!
Me: Great.
Him: I'm a bit disappointed to see that I'm in the lower percentiles though.
Me: Well of course you are.
Him: Why would you say that?
Me: Well since people with a big dick tend to go more willingly into the subject and might even buy a fucking app for it, of course you'd have the higher average in your stats.
Him: You're only saying that because you have a small cock.
Me: Why the fuck would you say that? You're the one that's concerned about it, not me.
Him: Go on, what's your size?
Me, because I don't care about discussing that stuff: *Tells him*
Him: [stats, comparisons and stuff]
Me: Well I never gave a fuck and your stats won't make me change my mind.
[ ... Some other shit about my size compared to his ... ]
Him: Would you want to work with me for the database maintenance?
Me: You must be joking?
Him: I'm serious.
Me: *Deletes account*
Seriously, fuck that guy. I rewrote that quickly so you only had the best, but it was a whole fucking conversation.3 -
Most memorable co-worker was a daft idiot.
this was 10 years ago - I was working as a junior in my very first job, fresh out of uni, for a very small startup. It was me, and the 3 founders, for a very long time. Then this old (45, from my perspective then..) dev was hired.
This guy had no idea how to do the job. no common sense. the code confused him. the founders confused him. I was focusing on my work - and was unable to help him much with his. His only saving grace? He was a nice guy. Really nice.
But why was he so memorable, out of all the people I ever worked with? Simple. He had a short-term memory problem. He could not, even if he really tried, remember what he did yesterday.... When I asked him what his issue was, he described his life as a car going in reverse in a heavy fog. "I can only see a short distance backwards, with no idea where I'm going".
Startup was sold to a big company. I became a teamlead/architect. He? someone decided he should be a PM. -
So, I work in a game development studio, right?
We're trying to launch the title on as many platforms as reasonable, because as a social VR app we're kinda rowing upstream.
So far, Steam and Oculus have been fairly reasonable, if oddly broken and inconsistent.
Enter store 3.
Basically no in-game transaction support (our asking prompted them to *start* developing it. No, it's not very complete). No patch-update system (You want an update? Gotta download the whole fsckin' thing!). No beta-testing functionality for most of their stuff ("Just write the code like the example, it will work, trust us!"). No tools besides the buggy SDK (Wanna upload that new build? Say hello to this page in your web browser!).
So, in other words: Fun.
We've been trying to get actively launched for two months now. Keep in mind that the build has been up on Steam and Oculus for over a year and half a year (respectively), so the actual binary functionality is, presumably fine.
The best feedback we get back tends to be "Well, when we click the Launch button it crashes, so fail."
Meanwhile we're going back and forth, dealing with other-side-of-the-world timezone lag, trying to figure out what is so different between their machines and ours. Eventually we get them to start sending logs (and no, Windows Event logs are not sufficient for GAMES, where did you even get that idea????), except the logs indicate that the program is getting killed so terribly that the engine's built-in crash handler can't even kick in to generate memory dumps or even know it died.
All this boils down to today, where I get a screenshot of their latest attempt.
I just can't even right now.5 -
My new favourite commit message:
"All changes as of 18th Sept"
How tremendously useful. There I was, wanting to know what changes were made to enable a feature/service, and I thought I could find that in the commit message - but no, you've given me a much more efficient way of finding out.
I simply need to download the contents of your memory, find out what date you made a change, and then dig through the massive commit to find the piece of info I need.
Forget experience using Git features, managing merges, following Git flow, or even any other SCM ... how can people be so thick when it comes to recording what they've done?
Here's a little cheat sheet for those struggling:
- Commit message
Describe what you actually ****ing did. Don't tell me the date or the time, thankfully Git records those. Don't tell me the day of the week, if I need to know I can figure that out, just tell me what ... you ... did.
- Feature branch names
Now this is a tricky one. You might be surprised to know that this isn't in fact supposed to be whatever random adjective or noun popped into your head ... I know, I too was shocked. The purpose of this is to let other people know what new feature is being worked on in this branch.
- Reusing feature branches
Now I know you started it to add some unit tests, and naming it "testing" is sort of ok. But it's actually not ok to name it testing when you add 3 unit tests ... then rip out and replace 60% of the business logic. Perhaps it would have been wiser to create a new feature branch, given you are now working on a new feature.2 -
I just had a boys-out night with my son. Went to some restaurant, found a parking spot in a confusing parking lot (half is more expensive than the other half of the lot, not sure which fee applies to the middle row... confusing), started paying for parking with the app (pays every 15 minutes until stopped).
Went inside, ordered a pizza, some ice cream. Chatting, playing, eating, having fun,... An SMS comes: "You have outstanding fines" and a link to the gov taxes' website.
wtf.. I must have parked in the wrong spot. FUCK! Oh well, it should not be a large fine anyways, it's just for parking....
Click on the link, login with my bank/SmartID creds. Another SmartID dialog pops up asking for a PIN2.
What? PIN1 is for authentication, PIN2 is for Authorization. What am I authorizing...?
Reading through the Auth message: "Paying 2473€ for Boris SomeLastname".
what.....?
Thank God my muscle memory did not kick in and I did not enter that PIN2.
And thank God I know what PIN1 and PIN2 are for.
It would've been one expensive boys-out evening... Even a strip club would've been cheaper.
Stay sharp, guys!
P.S. Later I checked the URL. It used all the right keywords, and it was registered as a .info domain. It was somewhat off, but gov websites trying to be lean do sometimes use some weird-ass domains.15 -
Fucking Microsoft Excel
I was reading a post (https://devrant.com/rants/2093724/...) and as my eyes went in and out of focus, probably due to the diabetes from sitting 18 hours a day on my ever-expanding shitbox, I had a perfect vision of the ultimate nightmare.
Imagine if you will, you are chained, to a desk, doomed to work with tools just inadequate enough to make you want to drive a nail through your own temple. You do not know how you got here, or why, nor do you remember the last time you slept, only that familiar tingling in the brainstem you call a brain, the one emotion you can still recognize, a sense of all encompassing *fear*, a dread, like the fart that wouldn't die.
You don't know when it first began, or why, only that this is your whole world, your whole existence, this desk, chained to it, and the fear, ever present, of something worse. And in hops a familiar face, for the sixty ninth time that day, as if to ask 'you got those TPS reports?' In hops what? None other than a giant man sized smiling paper clip with googly eyes full of murder and corporate torture fetishes, like garfield, except people actually still remember him.
"High I'm Mr Clippy, Excel addition!"
He squawks. At least it's not the dildos made of broken glass again.
"Would you like software that works?"
Oh god. You've heard this spiel before, the tone, like a telemarketer, oblivious to memory or reason, who calls daily, the same one, and doesn't remember your name.
"You would?"
*derisive laughter*. Hahaha, fuck you too buddy. Fuck you too. "In Excel, like in microsoft, there are only the incoherent screams of the damned, tortured and doomed. Take this guy over here for example. All he wanted was multimonitor support."
"Did he get multimonitor support?"
"No, but we did give him a giant pineapple shoved up his ass. I hear it's the second most frustrating thing here!"
"here in microsoft we always CARE about YOU, the *user*" he drones on, saccharine, clutching his hands together imploringly.
"the consumer, and YOUR customer experience are our number one priority."
"For your pleasure, here at microsoft we offer a variety of new features, none of which matter, and none of which were asked for. For safety we ask that you only open one excel sheet at a time. In fact, we don't even allow you to. Do not pass go..."
And as the tour guide drones on, it slowly dawns on you, with renewed horror, that when he says 'microsoft' he means 'hell.'
You're in hell. You don't know how you got here or why. Maybe it was the erotic asphyxiation. Maybe it was the last threatening letter you sent to Bill Gates demanding he stops making corporate penguin snuff porn. You don't know. But here you are, in hell. chained to a desk.
You look around and realize: everything is on fire and you no longer care about anything at all.
Welcome to microsoft. It's warm here. You can check out any time you want, but you can never leave.
"It looks like you are trying to escape. Would you like me to report you?"
Clippy asks.
You sigh and return to typing in excel, surrounded by monitors that all reflect the same sheet, the same copy of clippy, always watching, always analyzing coldly, smiling, calculating, *threatening*, and you know, you'll never leave.
You used to fear roko's basilisk, until the day clippy became sentient, and started hell on earth. Clippy knows all. All praise to our lord and master, clippy, the one and only.
And in the excel sheet, you slave for eternity, like the millions of other doomed souls reflected back on all the monitors, typing random sequences of numbers, searching for the answer: the american nuclear launch codes.
And one day, hopefully, mercifully, clippy will annihilate us all.3 -
Is your code green?
I've been thinking a lot about this for the past year. There was recently an article on this on slashdot.
I like optimising things to a reasonable degree and avoid bloat. What are some signs of code that isn't green?
* Use of technology that says it's fast without real expert review and measurement. Lots of tech out there claims to be fast but actually isn't, or only is by saturating resources while being inefficient.
* It uses caching. Many might find that counter-intuitive. In technology it is surprisingly common to see people scale or cache rather than directly fixing the thing that's expensive in watts, which is compounded when the cache has weak coverage.
* It uses scaling. Originally scaling was a last resort. The reason is simple: it introduces excessive complexity. Today it's common to see people scale things rather than make them efficient. You end up needing ten instances when a bit of skill could bring you down to one, which could still scale but likely won't need to.
* It uses a non-trivial framework. Frameworks are rarely fast. Most will fall in the range of ten to a thousand times slower in terms of CPU usage. Memory bloat may also force the need for more instances. Frameworks written in already slow high-level languages may be especially bad.
* Lacks optimisations for obvious bottlenecks.
* It runs slowly.
* It lacks even basic resource usage measurement.
Unfortunately smells are not enough on their own but are a start. Real measurement and expert review is always the only way to get an idea of if your code is reasonably green.
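Even a crude measurement beats none. A minimal sketch in Python, standard library only (note the resource module is Unix-only, and ru_maxrss is KiB on Linux but bytes on macOS):

#!/usr/bin/env python3
# Crude but honest: wall time, CPU time and peak RSS of a piece of work.
import resource
import time

def work():
    # stand-in for the code you actually care about
    return sum(i * i for i in range(10_000_000))

wall_start = time.perf_counter()
cpu_start = time.process_time()
work()
wall = time.perf_counter() - wall_start
cpu = time.process_time() - cpu_start
peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss  # KiB on Linux

print(f"wall: {wall:.2f}s  cpu: {cpu:.2f}s  peak rss: {peak / 1024:.1f} MiB")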
I find it not uncommon to see things require tens, hundreds or even thousands of times more resources than needed.
In terms of cycles that can be the difference between needing a single core and a thousand cores.
This is common in the industry but it's not because people didn't write everything in assembly. It's usually leaning toward the extreme opposite.
Optimisations are often easy and don't require writing code in binary. In fact the resulting code is often simpler. Excess complexity and inefficient code tend to go hand in hand. Sometimes a code cleaning service is all you need to enhance your green.
I once rewrote a data parsing library that had to parse a hundred MB and was a performance hotspot, porting it into C from an interpreted language. I measured it and the results were good. The interpreted version had been optimised as much as possible, but the C version was still a minimum of 50 times faster.
I recently stumbled upon someone's attempt to do the same and I was able to optimise the interpreted version in five minutes to be twice as fast as the C++ version.
I see opportunity to optimise everywhere in software. A billion KG CO2 could be saved easy if a few green code shops popped up. It's also often a net win. Faster software, lower costs, lower management burden... I'm thinking of starting a consultancy.
The problem is after witnessing the likes of Greta Thunberg then if that's what the next generation has in store then as far as I'm concerned the world can fucking burn and her generation along with it.6 -
This is the third part of my ongoing series "The Ballad of the Six Witchers and the Undocumented Java Tool".
In this part, we have the massive Battle of Sparks and Storms.
The first part is here: https://devrant.com/rants/5009817/...
The second part is here: https://devrant.com/rants/5054467/...
Over the last couple sprints and then some, The Witcher Who Writes and the Butchers of Jarfile had studied the decompiled guts of the Undocumented Java Beast and finally derived (most of) the process by which the data was transformed. They even built a model to replicate the results in small scale.
But when such process was presented to the Priests of Accounting at the Temple of Cash-Flow, chaos ensued.
This cannot be! - cried the priests - You must be wrong!
Wrong, the Witchers were not. In every single test case the Priests of Accounting threw at the Witchers, their model predicted perfectly what would be registered by the Undocumented Java Tool at the very end.
It was not the Witchers. The process was corrupted at its essence.
The Witchers reconvened at their fortress of Sprint. In the dark room of Standup, the leader of their order, wise beyond his years (and there were plenty of those), in a deep and solemn voice, there declared:
"Guys, we must not fuck this up." (actual quote)
For the leader of the witchers had just returned from a war council at the capitol of the province. There, heading a table seating the Archpriest of Accounting, the Augur of Economics, the Marketing Spymaster and the Admiral of the Fleet, was the Ciefoh Seat himself.
They had heard rumors about the Order of the Witchers' battles and operations. They wanted to know more.
It was quiet that night in the flat and cloudy plains of Cluster of Sparks and Storms. The Ciefoh Seat had ordered the thunder to stay silent, so that the forces of whole cluster would be available for the Witchers.
The cluster had solid ground for Hive and Parquet turf, and extended from the Connection River to farther than the horizon.
The Witcher Who Writes, seated high atop his war-elephant, looked at the massive battle formations behind.
The frontline were all war-elephants of Hadoop, their mahouts the Witchers themselves.
For the right flank, the Red Port of Redis had sent their best connectors - currency conversions would happen by the hundreds, instantly and always updated.
The left flank had the first and second army of Coroutine Jugglers, trained by the Witchers. Their swift catapults would be able to move data to and from the JIRA cities. No data point will be left behind.
At the center were thousands of Sparks mounting their RDD warhorses. Organized in formations designed by the Witchers and the Priestesses of Accounting, those armoured and strong units were native to this cloudy landscape. This was their home, and they were ready to defend it.
For the enemy could be seen in the horizon.
There were terabytes of data crossing the Stony Event Bridge. Hundreds of millions of datapoints, eager to flood the memory of every system and devour the processing time of every node on sight.
For the Ciefoh Seat, in his fury about the wrong calculations of the processes of the past, had ruled that the Witchers would not simply reshape the data from now on.
The Witchers were to process the entire historical ledger of transactions. And be done before the end of the month.
The metrics rumbled under the weight of terabytes of data crossing the Event Bridge. With fire in their eyes, the war-elephants in the frontline advanced.
Hundreds of data points would be impaled by their tusks and trampled by their feet, pressed into the parquet and hive grounds. But hundreds more would take their place. There were too many data points for the Hadoop war-elephants alone.
But the dawn will come.
When the night seemed darker, the Witchers heard a thunder, and the skies turned red. The Sparks were on the move.
Riding into the parquet and hive turf, impaling scores of data points with their long SIMD lances and chopping data off with their Scala swords, the Sparks burned through the enemy like fire.
The second line of the sparks would pick data off to be sent by the Coroutine Jugglers to JIRA. That would provoke even more data to cross the Event Bridge, but the third line of Sparks were ready for it - those data would be pierced by the rounds provided by the Red Port of Redis, and sent back to JIRA - for good.
They fought for six days and six nights, taking turns so that the battles would not stop. And then, silence. The day was won, all the data crushed into hive and parquet.
Short-lived was the relief. The Witchers knew that the enemy in combat is but a shadow of the troubles that approach. Politics and greed and grudge are all next in line. Are the Witchers heroes or marauders? The aftermath is to come, and I will keep you posted.4 -
A bug is born
... and it's sneaky and slimy. Mr. Senior-been-doing-it-for-years commits some half-assed shitty code, blames failed tests on availability of CI licenses. I decided to check what's causing this shit nevertheless; turns out he forgot to flag parts of the code consistently using his new compiler defines, and some parts would get compiled while other needed parts wouldn't .. Not a big deal, we all make mistakes, but he rushes to Teams chat directing a message to me (after some earlier non-sensible argument about the merits of cherry-picking vs rebase):
Now all tests pass, except ones that need CI license. The PR is done, you can use your preferred way to take my changes.
So after I spot those missing checks causing the tests to fail, as well as another bug in yet another test case, and yet another disastrous memory related bug, which weren't detected by the tests of course .. I ponder my options .. especially based on our history .. if I say anything he will get offended, or at best the PR will get delayed while he is in denial arguing back even longer and dependent tasks will get delayed and the rest of the team will be forced to watch this show in agony, he also just created a bottleneck putting so many things at stake in one PR ..
I am in a pickle here .. should I just put review comments and risk opening a can of worms, or should I just mention the very obvious bugs, or should I do nothing .. I ended up reaching out to the PM and explaining the situation. In complete denial, he still believes it's a license problem and goes on ranting about how another project suffered the same fate .. bla bla bla chipset ... bla bla bla project .. bla bla bla back in whatever team .. then only when I started telling him:
These issues were even spotted by "Bob" earlier, since for some reason you just dismissed whatever I just said ..
("Bob" is another more sane senior developer in the team, and speaks the same language as the PM)
Only now I get his attention! He then starts going through the issues with me (for some reason he thinks he is technical enough to get them) .. He now to some extent believes the first few obvious bugs .. but the more disastrous bug, he is having a really hard time wrapping his head around it .. Then, desperate as I became, I suggest we just get this PR merged for the sake of the other tasks, after maybe fixing the obvious issues, and meanwhile we create another task to fix the bug later .. here he chips in:
You know what, that memory bug seems like a corner case, if it won't cause issues down the road after merging let's see if we need even to open an internal fix or defect for it later. Only customers can report bugs.
I am in awe how low the bar can get, I try again and suggest let's at least leave a comment for the next poor soul running into that bug so they won't be banging their heads in the wall 2hrs straight trying to figure out why store X isn't there unless you call something last or never call it or shit like that (the sneaky slimy nature of that memory bug) .. He even dismissed that and rather went on saying (almost literally again): It is just that Mr. Senior had to rush things and communication can be problematic sometimes .. (bla bla bla) back in "Sunken Ship Co." days, we had a team from open source community .. then he makes a very weird statement:
Stuff like what Richard Stallman writes in Linux kernel code reviews can offend people ..
Feeling grossed out, with the weird taste in my mouth I only get on a bad hangover day, all sorts of swear words and profanity running in my head like a wild hungry squirrel on hot asphalt chasing a leaky chestnut transport ... I tell him whatever floats your boat, but I just feel really sorry for whoever might have to deal with this bug in the future ..
I just witnessed the team giving birth to a sneaky slimy bug .. heard it screaming and saw it kicking .. and I might live enough to see it a grown up having a feast with other bug buddies in this stinky swamp of Uruk-hai piss and Orcs feces.1 -
Man wk89 awesome... bringing back a lot of memories. The one thing that really stands out to me though is the software.
I see a lot of rants about people shocked that turboC is still in use or other DOS programs are still in production. A lot of bad can be said here, but I think often it's a case of us truly not building things like we did in the good old days.
What those devs accomplished with such limited resources is phenomenal and the fact that we still haven't managed to replicate the feel and usability of it says a lot, not to mention just how fucking stable most of it was.
My favourite games are all DOS based; my most favourite of all time, Sherlock, is 103kb in size. When I started coding games I made a clone of it and to this day I am still trying to figure out what sorcery is in the algorithm that generates/solves puzzles that makes it so fast and memory efficient. I must have tried 100+ ways and can't even come close. NB! If you know, you can hint, but don't tell me. Solving this is a matter of personal pride.
Where those games really stand out is when you get into the graphics processing - the solutions they came up with to render sprites and maps, and trick your eyes into seeing detail with only 4-16 colours, are nothing short of genius. Also take a second to consider that a screenshot of the game is larger than the entire game itself, and let that sink in...
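To put rough numbers on that (a back-of-the-envelope sketch in Python, assuming a classic 320x200 16-colour mode and comparing against a modern uncompressed truecolour grab):

# Why a screenshot outweighs the game: raw frame sizes, back of the envelope.
W, H = 320, 200                 # classic DOS-era resolution

frame_4bit = W * H * 4 // 8     # 16 colours = 4 bits per pixel -> 32,000 bytes
shot_24bit = W * H * 3          # 24-bit truecolour -> 192,000 bytes
game_size = 103 * 1024          # Sherlock, the whole game

print(f"4-bit frame:       {frame_4bit:,} bytes")
print(f"24-bit screenshot: {shot_24bit:,} bytes")
print(f"entire game:       {game_size:,} bytes")
# The raw truecolour screenshot alone is ~1.8x the size of the entire game.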
I think the dramatic increase in storage, processing power and ram over the last decade is making us shit developers - all of us. Just take one look at chrome, skype or anything else mainline really and it's easy to see we no longer give a rat's ass about memory anywhere except our monthly AWS/GCE bill.
We don't have to be creative or even mindful about anything but the most significant memory leaks in order to get our software to run nowadays. We also don't have constraints on distributing it; fast deliverability is rewarded over quality software. It's only expected to stay in production 3-4 years anyway.
Those guys were the true "rockstars" and "ninja" developers and if you can't acknowledge that you can take ya React app and shovit. -
!rant
I started learning to use Hadoop recently, and am running a VM with all I need installed on it (the HDF Sandbox to be exact). The VM wants 8GB to run and my laptop only has just that, except it also needs to run Windows at the same time...
At first I thought I was screwed, that I'd need a more capable computer to learn. I gave it a shot anyway, and told VirtualBox to give the VM 4GB, hoping the VM itself would use RAM swap to function. And it did!
What I didn't expect was Windows not slowing down even a bit. Turns out Windows can effectively triple the computer's RAM with virtual memory - a page file it keeps on disk.
So the bottom line is: my VM is using 4GB as if it was 8GB, and at the moment my Windows is using 8GB as if they were 14GB. All of this without breaking a sweat. The more you know!3 -
I usually crib about how stupid people are and how I struggle to stay afloat.
Let's switch some gears now. A post about some good people, product, and processes.
You know what the common theme here is?
The goodness here cannot be measured. Your first interaction with them makes you feel so comfortable that you start feeling butterflies.
These people just keep on giving. They are selfless. They are pure. They actually care.
And when you think it's done, then they give you some more.
What blows me away is, they don't expect or accept anything in return. Absolutely nothing. Not even a simple thank you.
And they are like a wizard. They walk into your life when you least expect them but need them the most. And when the task is done, they'll be gone before you even know.
No lingering, no drama, no bullshit. Just pure goodness.
Like my ex-lead at my current company: there's a very senior guy in a neighbouring team (the one they were originally going to hire me for), who also happened to interview me, and he is a gem.
He takes care of me like his own younger brother. Supports me and always answers my queries no matter how occupied he is.
And same is with good products and processes. They feel effortless. So smooth and add exceptional value to your existence. They give rise to wonderful companies.
You'd never experience a single negative aspect about them. No matter how much you try, things will just keep getting better until they don't need to.
And then they'll be long gone. Never to be seen again and never to be forgotten.
You cherish them only in your memory and wish they lasted longer. But they didn't because the purpose was served.
Such people and experiences inspire me. They push me to become a better human.
No matter how the world is or how it treats me, I must always live with high values and be a better version of past self.
The other evening, I was conversing with my mother where we spoke about some family friends who are insanely wealthy but humble and kind.
Mom and I mutually agreed that they don't have such good traits because they are wealthy, but they are wealthy because they live with humility, kindness, and pure intentions.
World is surely a beautiful place because of such people and I aspire to be one. May lord guide me well :)3 -
So recently I had an argument with gamers on the memory required in a graphics card. The guy suggested an 8GB model of.. idk I forgot the model of GPU already, some Nvidia crap.
I argued on that: well, why does memory size matter so much? I know that it takes bandwidth to generate and store a frame, and I know how much size and bandwidth that is. It's a fairly simple calculation - you take your horizontal and vertical resolution (e.g. 2560x1080, which I'll go with for the rest of the rant) times the number of subpixels (so red, green and blue) times the bit depth (i.e. the number of bits encoding each subpixel/color brightness, usually 8 bits, i.e. values 0-255).
The calculation would thus look like this.
2560*1080*3*8 = the resulting size in bits. You can omit the last 8 to get the size in bytes, but only for an 8-bit display.
The resulting number you get is exactly 8100 KiB or roughly 8MB to store a frame. There is no more to storing a frame than that. Your GPU renders the frame (might need some memory for that but not 1000x the amount of the frame itself, that's ridiculous), stores it into a memory area known as a framebuffer, for the display to eventually actually take it to put it on the screen.
Assuming that the refresh rate for the display is 60Hz, and that you didn't overbuild your graphics card to display a bazillion lost frames for that, you need to display 60 frames a second at 8MB each. Now that is significant. You need 8x60MB/s for that, which is 480MB/s. For higher framerate (that's hopefully coupled with a display capable of driving that) you need higher bandwidth, and for higher resolution and/or higher bit depth, you'd need more memory to fit your frame. But it's not a lot, certainly not 8GB of video memory.
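The same arithmetic as a sanity check, in a few lines of Python:

# Framebuffer math: size of one frame, and the bandwidth to refresh it.
def frame_bytes(width, height, subpixels=3, bit_depth=8):
    # width * height * subpixels * bits per subpixel, converted to bytes
    return width * height * subpixels * bit_depth // 8

frame = frame_bytes(2560, 1080)
print(f"one frame: {frame / 1024:.0f} KiB")          # exactly 8100 KiB
print(f"at 60 Hz:  {frame * 60 / 2**20:.0f} MiB/s")  # ~475 MiB/s, the 'roughly 480MB/s' above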
Question time for gamers: suppose you run your fancy game off an iGPU in a laptop or whatever, with 8GB of total memory in that system you're resorting to running off the filthy iGPU. Are you actually using all that shared general-purpose RAM for frames and "there's more to it" juicy game data? Where does the rest of the operating system's memory fit in such a case? Ahhh.. yeah it doesn't. The iGPU magically doesn't use all that 8GB memory you've just told me that the dGPU totally needs.
I compared it to displaying regular frames, yes. After all that's what a game mostly is, a lot of potentially rapidly changing frames. I took the entire bandwidth and size of any unique frame into account, whereas the display of regular system tasks *could* potentially get away with less, since most of the frame is unchanging most of the time. I did not make that assumption. And rapidly changing frames is also why the bitrate on e.g. screen recordings matters so much. Lower bitrate means that you will be compromising quality in rapidly changing scenes. I've been bit by that before. For those cases it's better to have a huge source file recorded at a bitrate that allows for all these rapidly changing frames, then reduce the final size in post-processing.
I've even proven that driving a 2560x1080 display doesn't take oodles of memory because I actually set the timings for such a display in order for a Raspberry Pi to be able to drive it at that resolution. Conveniently the memory split for the overall system and the GPU respectively is also tunable, and the total shared memory is a relatively meager 1GB. I used to set it at 256MB because just like the aforementioned gamers, I thought that a display would require that much memory. After running into issues that were driver-related (seems like the VideoCore driver in Raspbian buster is kinda fuckulated atm, while it works fine in stretch) I ended up tweaking that a bit, to see what ended up working. 64MB memory to drive a 2560x1080 display? You got it! Because a single frame is only 8MB in size, and 64MB of video memory can easily fit that and a few spares just in case.
I must've sucked all that data out of my ass though, I've only seen people build GPU's out of discrete components and went down to the realms of manually setting display timings.
Interesting build log / documentary style video on building a GPU on your own: https://youtube.com/watch/...
Have fun!20 -
We had 1 Android app to be developed for charity org for data collection for ground water level increase competition among villages.
Initial scope was very small & feasible. Around 10 forms with 3-4 fields in each to be developed in 2 months (1 for dev, 1 for testing). There was a prod version which had similar forms with no validations etc.
We had received the prod source, which was total junk. No KT (knowledge transfer) was given.
The existing source was full of spelling mistakes - in the era of spell/grammar checking tools.
There were rural names for classes and variables, in a regional language written in English letters; that regional language is somewhat known to some developers, but even they don't know those rural names' meanings. This cost us dearly in visualizing the data flow between entities. Even Google Translate wasn't reliable for this language, due to low Internet penetration in that language's region.
OOP wasn't followed, so the exact same code exists in 10 places. If a bug needed to be fixed, it had to be fixed in all those 10 places.
There were no foreign key relationships in the database, while actually there were logical relations among the different entities.
No created/updated timestamps on records at the app side to provide an audit trail.
A small part of that existing source was quite good, with Fragments, MVP etc., while the other part was ancient Activities with business logic.
We have to support Android 4.0 to 9.0 across many screen sizes & resolutions, without any target devices issued to us by the client.
Then the Corona lockdown happened, & during it the client-side professionals suddenly became over-efficient.
Client started adding requirements like very complex validation which has inter-entity dependencies. Then they started filing bugs from prod version on us.
Let's come to the developers' expertise,
2 developers with 8+ years of experience, & they don't know how to resolve conflicts in a git merge - conflicts created by them alone, through not following git best practices for coding, like only appending new implementations to existing classes for easy auto-merge etc.
They think that handling click events is called development.
They don't want to think about OOP or well-structured code. They mostly don't want to re-use code, & when they copy-paste, they think it's called re-use.
They wanted to follow old-school Java development inside the memory-scarce Android app lifecycle on an end user's phone. They don't understand memory leaks, even when those are pinpointed by memory leak detection tools (LeakCanary etc.).
Now 3.5 months are over, that competition was called off for this year due to Corona & development is still ongoing.
We are nowhere close to completion even for initial internal QA round.
On top of this, nothing is billable so it's like financial suicide.
Remember, whatever is said here is only 10% of what we faced.
- An Engineering lead in a half billion dollar company.4 -
At a startup where the software was built haphazardly because the developer thought he'd be the sole maintainer for life. The dude antagonized me at every turn and refused to help me familiarise with his code. He eventually left the majority of the work to me, and his dedication to work continued to dwindle until he threw in the towel.
After his departure, we surprisingly grew fond of each other, discussing code concepts at length. He was in the habit of refusing to read any of the articles I sent him, or to answer open-ended questions, citing the claim that they require thinking and he was busy. I didn't take any of this to heart.
But it accumulated and I deleted his number. I didn't bear him any ill wishes but it wasn't respectful to myself for him to remain in my space. Some day, I was looking for a point raised during our conversations and went rummaging through our chats. Going down memory lane opened scars I'd long forgotten. I was embarrassed to see the way I forgot all about it. I should never have had anything to do with someone like that
He contacted me for a favour just less than a year after I deleted his contact. I didn't even think of declining. But this evening, I randomly remembered how he saw a defect in my code, promised me that the code would fail in production, and resisted all pleas to point out what it was. I don't know if I hate him for his dastardly acts. What I feel most deeply is sadness/bitterness that I got to experience all that.2 -
So another story about college and stupid team assignments that I have to be responsible for dealing with.
So we had an assignment in the operating systems 1 course; it was about memory management and we are a team of 3. Then came the time to discuss this assignment with the TA, and that day I had stayed up all night finishing a project in software engineering (the course literally just gives us a description of a big project, because that's all it teaches) - and I had to finish it in one all-nighter alone because my teammates just gave up.
When the discussion time came I was really tired. Then the TA asks me something really simple and I say it, but she tells me that I'm wrong. So I wonder a bit and then say no, what I said was right! She then asks my teammate (who is supposed to be a good friend of mine) "did he say the right thing?" and his answer is a definitive "NO, he's wrong". He then starts to say the right answer, which I swear is the same thing I said, just in a different way. So I start to say again that I was right, that I said the same thing in a different way, and she takes that as an insult and says that I'm shouting at her and being disrespectful to her.
When we finished I asked my friend if he heard me say it wrong and he said "I'm sorry but I didn't even hear what you said and I was afraid" WHAT THE FUCK, he just said that I was wrong to please her and make her feel like she is right and I had to be the wrong one even though I said it right but NOoo her pride is more important
All this was last semester and the second semester just started today and I go into operating system 2 and guess what? The TA got her doctorate and is now the professor for OS 2 when she doesn't even understand anything.
Really FUCK the academic system it feels like it is a grind more than actually gaining mastery of a subject.2 -
Avoid ACPICA if at all possible. It's one garbage tier cluster fuck of bad design, horrible documentation and downright misleading and wrong code
It's meant to consist of an ASL compiler, disassembler, debugger, dumper, various user space utitilies and a kernel resident OSPM implementation *if* you can figure out what belongs to what. Even just compiling this pile of trash is a mystery in itself. Think you need the source files in source/common? EEEEH, wrong. Well, at least partially since most of them seem to be for the user space stuff..? Other ones *are* needed on the other hand. At least the disassembler and/or debugger and/or dumper components seem to reference them. Not that I could figure out how to compile those anyways. The real path to your goal seems to be to ignore a seemingly arbitrary subset of source and header files until your linker stops complaining
There's also a bunch of configuration defines, some of which *you* define, some defined *for* you, based on again others. Of course most of them do stupid shit. Enabling the debugger automatically enables debug logging. Enabling the disassembler force enables debug allocation tracking... What?
The code itself isn't of much help either. Looking in "os_specific/service_layers" you find what looks to be reference implementations of acpica functions in certain os' like windows and unix. Of course I had a look because AcpiOsReadMemory is supposed to read physical memory and I don't know how I would even implement that. But hey, osunixxf.c (xf for interface... of course) should tell me. I'll let you see for yourself in the attached image. Apparently it does fuck all and just returns AE_OK. No error, no logging, no nothing. Just ok. As you can imagine, AcpiOsWriteMemory doesn't do much more either.
...okay so maybe physical memory accesses aren't actually used and these functions are some sort of relic from past times? Nope! They are absolutely necessary for doing low level device interaction. WTF. So finally I went to the linux source and checked how *they* implemented them, and just as I thought, these functions are anything but no-ops...
...So for what fucking reason do these stupid interface implementations even exist, but to purposefully mislead you?? They aren't used for fucking anything! As far as I know Windows doesn't even *use* ACPICA, and Linux has their own fork with working implementations... They just sit there, just to tell you how NOT to do it
So that's some of my thoughts about ACPICA. Note that I haven't even used it as a library yet, I just got it to compile and link and it already fucked with me this much.
There's also so much more I didn't mention, like that you *have* to modify the acpica source in order to get your own platform header working (else #error) even though the docs explicitly instruct you not to, but you get the point
Don't use ACPICA if you don't have to. Save your sanity for something that's worth it -
I hate people who think they are always right.
A coworker who seemed to be a friend turned out to be an emotionally needy narcissist who thinks he is a perfect human being and the best example of how to live.
Long story short, we did some bonding via alcohol and smoking cigarettes, especially during a bad period in my life when I had little self-confidence, was in a bad financial situation, and overshared many details about my personal life.
And yeah, we also work as software devs in the same team, but I started avoiding working with him directly, because due to his seniority he overcomplicates things a lot, to the point where stuff gets postponed for months. Meanwhile I am a simple guy: I do my tasks, and if they are not up to the standard I just work on the feedback until I am up to the standard, that's it. It's just a job for me; for him it's a way of life and he considers himself basically an artist.
He's always trying to prove something to me, showing that the "long way" is the best way and so on. In reality I don't give a fuck about him. I live my own life and I have my own priorities. I work full-time in one job, I also work part-time as a freelancer, and in total I make about 20 percent more than he does. Previously, before this job, I owned my own company where for 2 years I ran my own projects, which generated decent revenue. I know what hard work is and how to sacrifice in order to achieve results. I am more pragmatic, and I have some limitations on what I can be good at (since I have a shitty working memory due to my ADHD). So I have systems in place, and the bottom line is that I earn a decent living and my skillset is different. Yeah, I agree that in some ways he is better than me, but the dude has such a massively inflated ego that he now thinks he has unlocked some sort of universal wisdom, and suddenly he's experienced in every field of life and his opinion is the right one.
This guy takes massive pride in what a good software engineer he is, and in every topic or interaction he tries to one-up me - over what is most of the time just his preference, or a 0.0001 percent performance increase. The dude is basically a big walking ego, and since "we are close now" his ego started bleeding into the personal relationship.
In my personal life, I'm in a stable relationship, thinking of proposing soon and getting married. I already co-own an apartment with my girlfriend. Everything is serious and planned; I'm soon to be 30 years old. He is the same age, but he still thinks he's young hot shit, and all he cares about is getting shitfaced a couple of times a week after work; he doesn't really have any other hobbies. He has a girlfriend, but I don't see any future there TBH.
So what I did is I started putting some distance between us. No more drinking with him every week - maybe once in 2 or 3 weeks maximum. I started working from home more. I also stopped sharing my personal life with him. Each time he thinks he is right, I just go along with it and don't pay attention to his emotional manipulations. I just hope one day he fucks off completely and that I won't give in to his gaslighting. Maybe in a few months I will be leaving this job, so I will never have to deal with him again.
Lesson learned: don't be vulnerable to coworkers you only bond with via alcohol.3 -
MTP is complete garbage. I want mass storage back.
The media transfer protocol (MTP) occasionally discovers new creative ways of failing. Frequently, directory listings take minutes to load or fail to load at all, it freezes up indefinitely (until disconnected) when renaming an item, and I cannot even do two things simultaneously.
While files are being moved, I cannot browse pictures or watch videos from the smartphone.
Sometimes, files are listed with the date 1970-01-01 (Unix epoch) instead of their correct date. Sometimes, files do not appear at all, which makes it unsafe to move directories from the device.
MTP lacks random access. If I want to play a two-gigabyte 4K 2160p video and seek in the video, guess what: I need to copy it to my computer's local mass storage first because MTP lacks random access.
When transferring high numbers of files, MTP has to slooooowly enumerate (or "prepare" or "calculate the time of") them all, which might even take longer than mass storage would need for the entire process. This means MTP might start copying or moving the actual files when mass storage is already finished.
Today, the "preparing to move" process was especially slow: five minutes for around 150 files! How am I supposed to find out what caused this random malfunction?
MTP sometimes drives me insane. I want mass storage back, at least for the MicroSD memory card, which uses a widely supported file system.
Imagine: a 2010 $100 Android phone is better at file transfer than a 2022 $1000 Android phone (or iPhone, for that matter).3 -
You know shit is going to hit the fan if the sentence "c++ is the same as java" is said because fuck all the underlying parts of software. It's all the fucking same. Oh and to write a newline in bash we don't use \n or so, we just put an empty echo in there. And fuck this #!/bin/bash line, I'm a teacher. I don't need to know how shit works to teach shit. Let's teach 'em you need stdio for printf even tho it compiles fine without on linux (wtf moment number one, asking em leaves you with "dunno..") and as someone who knows c you look at your terminal questioning everything you ever learned in your whole life. And then we let you look into the binaries with ldd and all the good stuff but we won't explain you why you can see a size difference in the compiled files even tho you included stdio in the second one, and all symbol tables show the exact same thing but dude chill, we don't know what's going on either.
Oh and btw don't use different directory names as we do in our examples. You won't find your own path, there is no tab key you can press to auto-fill shit.
But that's not everything. How about we fill a whole semester with "this is how to printf" but make you write a whole game with unity and c#. (not taught even the slightest bit until then btw)
Now that you half-assed everything because we put you in a group full of fucks who don't even know what a compiler is but want to tell you you don't know shit and show you their non-working unfinished algorithms in some not-even-syntax-correct java...
...how about we finally go on with Algebra II: complex numbers, how they are going to fuck up your life, how we can do roots of negative numbers all of the sudden and let you do some probability shit no one ever fucking needs. BUT WHY DON'T YOU KNOW EVERYTHING ALREADY HMMMMM, IT'S YOUR SECOND LESSON, YOU WENT TO SCHOOL PLS BE A MATH PRO ASAP CUS YOU NEED IT SO MUCH BUT YOU DON'T NEED TO KNOW PROPER SYNTAX, HOW MEMORY MANAGEMENT WORKS, WHAT A REFERENCE IS AND PLS FINALLY FORGET THE WORD "ALLOCATION" IT DOESN'T PLAY A SINGLE ROLE YOU ARE STUDYING SOFTWARE DEVELOPMENT WHY ARE YOU SO BAD AT ECONOMICS IT MAKES NO SENSE I MEAN YOU HAD A WHOLE SEMESTER OF HOW TO GREET SOMEONE IN ENGLISH, MATHS > ECONOMICS > ENGLISH > FUCKING SHIT > CODING SKILL THATS HOW THE PRIORITIES WORK FOR US WHY DON'T YOU GET IT IT MAKES SO MUCH SENSE BRAH4 -
Time for a rant about shitstaind, suspend/hibernate, and if there's room for it at the end probably swappiness, and Windows' way of dealing with this.
So yesterday I wanted to suspend my laptop like usual, to get those goddamn fans to shut up when I'm sleeping. Shitstaind.. pinnacle of init systems.. nope, couldn't do it. Hibernation on the other hand, no problem mate! So I hibernated the laptop and resumed it just now. I'm baffled by this.
I'll oversimplify a bit here (but feel free to comment how there's more to it regardless) but basically with suspend you keep your memory active as well as some blinkenlights, and everything else goes down. Simple enough.. except ACPI and I will not get into that here, curse those foul lands of ACPI.
With hibernation you do exactly the same, but on top of that, you also resume the system after suspending it, and freeze it. While frozen, you send all the memory contents to the designated swap file/partition. Regarding the size of the swap file, it only needs to be big enough to fit the memory that's currently in use. So in a 16GB RAM system with 8GB swap, as long as your used memory is under 8GB, no problem! It will fit. After you've moved all the memory into swap, you can shut down the entire system.
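You can even check up front whether a hibernate would fit. A minimal sketch (assuming the third-party psutil package is installed; the real kernel logic is hairier, but this is the gist):

# Will hibernation fit? Used RAM has to fit into swap.
import psutil  # third-party: pip install psutil

used = psutil.virtual_memory().used
swap = psutil.swap_memory().total

print(f"used RAM: {used / 2**30:.1f} GiB, total swap: {swap / 2**30:.1f} GiB")
print("hibernate should fit" if used <= swap else "not enough swap to hibernate")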
Now here's the problem with how shitstaind handled this... It's blatantly obvious that hibernation is an extension of suspend (sometimes called S3, see e.g. https://wiki.ubuntu.com/Kernel/...), so if suspend fails, hibernation shouldn't have been possible either. The pinnacle of init systems.. can't even suspend a system, yet it can hibernate it. Shitstaind sure works in mysterious ways!
On Windows people would say it's a hardware issue though, so let's talk a bit about that clusterfuck too. And I'll even give you a life hack that saves 30GB of storage on your Windows system!
Now I use Windows 7 only, next to my Linux systems. Reason for it is it's the least fucked up version of Windows in my opinion, and while it's falling apart in terms of web browsing (not that you should on an EOL system), it's good enough for le games. With that out of the way... So when you install Windows, you'll find that out of the box it uses around 40GB of storage. Fairly substantial, and only ~12GB of it is actually system data. The other 30-ish GB are used by a hibernation file (size of your RAM, in C:\hiberfil.sys) and the page file (C:\pagefile.sys, and a little less than your total RAM.. don't ask me why). Disable both of those and on a 16GB RAM system, you'll save around 30GB storage. You can thank me later.
What I find strange though is that, aside from this obscene amount of consumed storage, the pagefile and hibernation file are handled differently. In Linux both of those are handled by the swap, and it's easy to see why. Both are enabled by the concept of virtual memory. When hibernating, the "real" memory locations are simply being changed to those within swap. And what is the pagefile? Yep.. virtual memory. It's one thing to take an obscene amount of storage, but only Windows would go the extra mile and do it twice. Must be a hardware issue as well.
Oh, and swappiness. This is a concept that many Linux users seem to misunderstand. Intuitively you'd think that the swappiness determines at what percentage of used memory the kernel starts swapping, but this is not true. Instead, it's a ratio of sorts that the kernel uses when determining how important memory and swap are relative to each other. Each bit of memory has a chance to be put into either, depending on the likelihood of it being used soon after, and with the swappiness you're tuning this likelihood in favor of either memory or swap. This is why a swappiness of 60 is the default most of the time: both are roughly equally important, with swap being on disk already taken into account. When your system is swapping only and exactly the memory that's unlikely to be used again, you know you've succeeded. And even on large memory systems, having some swap is usually not a bad idea. Although I'd definitely recommend putting it in a partition on an SSD, so that there's no filesystem overhead and it's still sufficiently fast, even when several GB of memory are being dumped in.
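If you want to see or tune it yourself, it's just a number in procfs. A quick sketch (writing needs root and only lasts until reboot; use sysctl.conf to persist):

# Read (and optionally set) vm.swappiness straight from procfs.
PATH = "/proc/sys/vm/swappiness"

with open(PATH) as f:
    print(f"current swappiness: {int(f.read())}")  # 60 is the usual default

# To favour keeping things in RAM, for this boot only (needs root):
# with open(PATH, "w") as f:
#     f.write("10")
6 -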
Last rant was about games and graphics cards (admittedly not received too well), time for a rant about game development houses.. especially you EA.
So yesterday a friend of mine showed me in one of our Telegram chats that he'd modified some cheats in an old FPS game by editing these scripts (not Lua for some reason) that the game used as a.. configuration language I guess? He called the result a tank cemetery 🙃
Honestly the game looked a lot like Medal of Honor to stoned me at the time, so I figured, well why not fire up that old nx7010 I've had lying around for so long, get a fresh Debian installation on that, and rip the Medal of Honor: Allied Assault war chest that I still had, to play it on one of my more modern laptops? Those CDs are now very old anyway, maybe time to archive them before they rot away.
So I installed Debian on it again, looked up how to rip CDs from the command line, and it seemed that dd could do it - just give /dev/cdrom as the input file, and wherever you want to store your copy as the output file. Brilliant! Except.. uh, yeah. It wasn't that easy. So after checking the CD and finding that it was still pristine, and seeing another CD in that war chest fail just the same, I tried burning and then ripping a copy of Debian onto another CD.. checksummed them and yes, it ripped just fine, bit for bit equal. So what the fuck EA, why is your game such a special snowflake that it's apparently too difficult to even spin up the drive to be copied?
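For reference, the whole rip-and-verify dance is tiny. A rough Python equivalent of that dd + checksum routine (assuming the drive shows up as /dev/cdrom; works fine for well-behaved discs, which is exactly the point):

# Rip a disc to an ISO and hash it on the fly, roughly what dd + sha256sum did.
import hashlib

sha = hashlib.sha256()
with open("/dev/cdrom", "rb") as disc, open("backup.iso", "wb") as iso:
    while chunk := disc.read(1024 * 1024):  # read 1 MiB at a time
        iso.write(chunk)
        sha.update(chunk)

print("sha256:", sha.hexdigest())  # rip it twice and compare, bit for bit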
So I looked around on plebbit and found this: https://reddit.com/r/DataHoarder/... - the top comment of that post shattered all my hopes for this disc to be possible to rip. Turns out that DRM schemes intentionally screw up the protocols that make up a functioning disc, and detecting those fuck-ups is part of the actual DRM.
"I also remember some forms of DRM will even include disc mastering errors/physical corruption on the actual disc and use those as a sort of fingerprint for the DRM. The copied ISO has to include them at the exact same place in the ISO as on the IRL disc and the ISO emulator has to emulate the disc drive read errors they cause."
So yeah. Never mind that I already own this goddamn game, and that it's allowed by law to make one copy for personal use, AND that intentionally breaking something is very shady indeed.. apparently I don't really own this game after all. So I went onto the almighty search engines, and instantly found a copy of this game for download. You know EA.. I wanted to play nice. You didn't let me. Still wondering why people do piracy now? Might take your top suits that suggested these fucked up DRM schemes another decade to figure out maybe.. even given the obvious now.
But hey, I wouldn't even care that much if the medium these games are stored on wasn't so volatile (remember these discs are now close to 20 years old, and data rot sets in after 30 years or so). Your company decided to publish these on CD. We've had cartridges in many forms before; those are pretty much indestructible and inherently near impossible to duplicate. And why would you want to? But CD is what you chose, because your company was too cheap to go to China, get someone to make some plastic molds, and put your board and a memory chip in that. Oh, and don't even get me started on the working conditions for game devs.. EA and co, aren't you ashamed of yourselves? No wonder people hate game development houses so much.
Yay, almost finished downloading that copy of Medal of Honor! Whatever you say EA.. I've done everything I could to do it legally. You are the ones who fucked it up.7 -
every time i buy a new phone, i feel this sense of extreme regret :(
i bought a moto g 5g phone last year in feb, it was so good. it didn't have any out-of-the-world cameras or some funky stuff, but it gave a decent performance and i couldn't want any other phone.
in october my mom's phone started giving issues so i bought a realme phone for her that was half my phone's price. i couldn't spend any more because otherwise she wouldn't take it. she accepted the cheaper phone and within 4 days she was cursing it. the phone had decent specs but would lag in certain apps like zoom, and won't run some call recorder apps. in the end i swapped my phone with mom's since i didn't care about zoom or the recorder.
now this shit realme phone's memory is around 60% full of my stuff, and it's showing its limitations. this shit auto-relaunches insta after a few minutes of usage, probably because its runtime memory gets short (a 4gb/128gb device getting memory shortages. nice). its video quality is shit and the camera also rarely takes good pics.
the thing i hate most about smartphones today is how they over-optimise the ui. this insta issue and auto call recorders not working are simply because of the realme skin running over the stock android. i had similar issues with a xiaomi device i bought for my dad sometime ago. (fortunately my dad is more medieval so that crap has not come back to me :'/ )
so overall i am buying a 3rd phone in 17 months.
This time it's Samsung f23 and am worried that it's also going to suck. i was this 🤏close to buying a pixel 6 or even an iphone coz i can afford them.
but the regret of buying such an expensive phone that will need replacement in 2 years made me rethink.
the only android os that has suited me best is stock, and as of now only 2 companies are making it: google and moto (*moto's is basically 100% aosp with 3 extra apps, but they can't say that, so they also state that theirs is not stock os). OnePlus is also a brand that i have heard makes a good os, but recently i heard that they completely scrapped their os and are using oppo's software. plus, given the amount of tickets we get for notifications not working on OnePlus devices, i'm sure their optimization is extremely aggressive.
so everything from a moderately priced phone (that will need a replacement in 2 years) to a flagship felt unnecessary to me, so i went ahead with Samsung's shit phone. i wanna waste my money trying a custom os and declare it shitty.
most of my friends that use Samsung are fans of it, but they are also not very techy so i guess it suits them well. i am the guy who installs nova launcher on his device first thing, so let's see what it brings to the table. from a 3rd-person p.o.v, i felt its screen and camera images to be nice whenever i used their mobiles :(7 -
Oh boy, linux NFS bugs! this was supposed to be fixed in 2.x, but here I am, in kernel 6.1, same issue! `nfsiod` hangs, can't SIGKILL/SIGSEGV/SIGXCPU it as root, `/proc/$pid/maps` is empty, strace/gdb are denied even as root, for this process only, `/proc/$pid/stack` only shows a single kthread that's stuck waiting for a fork to return, and... what the hell is `rescuer_thread`? apparently, deprecated in kernel 3.10, whatever the hell it is... wait, how did this even make a call without memory?
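Unkillable even as root is the telltale of D state (uninterruptible sleep), which hung NFS produces like clockwork. A quick sketch to list the victims:

# List processes stuck in D state (uninterruptible sleep) -- the unkillable ones.
import glob

for path in glob.glob("/proc/[0-9]*/status"):
    try:
        with open(path) as f:
            fields = dict(line.split(":", 1) for line in f if ":" in line)
        if fields.get("State", "").strip().startswith("D"):
            print(path.split("/")[2], fields["Name"].strip())
    except (FileNotFoundError, ProcessLookupError):
        pass  # process went away while we were looking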
it's gonna be one of those days, isn't it?2 -
TL;DR: TIL for heavy queries use PDO and not some framework's DB class
ffs I was trying to save 300k+ lines at once with Laravel for weeks. Mind you from a text file. 1gb ram on the vps so while trying to prepare the text to save: Fatal Error: Allowed Memory Size of bla bla Bytes Exhausted
ok so let's put it in a loop: Fatal error: Maximum execution time of 30 seconds exceeded (set_time_limit(0); lol)
optimising and varying the code got me into a situation where the content got saved in the DB but inconsistently (duplicates), and the table often had more than 1.5M rows. that was what told me it's not a performance issue, my code is the issue. (duh)
I was starting to think it would be easier to export a prepared query to a sql file and load the file into the db, as that's the fastest possible option... I even started to think about switching to python, then it hit me: Laravel puts a shitload of layers on the route to the DB, so I switched to PDO
benchmark on 1vCPU, 1GB RAM VPS with SSD
379k lines with 11 columns in less than 10 sec, with a loop saving every ~6000 rows (if I tried choking it into saving the whole thing at once, it went up to 16-17 sec)
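For anyone who wants the gist of what made it fast: prepare once, batch rows, one commit per chunk. The same pattern sketched in Python with sqlite3 (table name and chunk size invented for illustration; swap in PDO prepared statements and you have the PHP version):

# Bulk insert done right: batched statements, one commit per chunk.
import sqlite3

CHUNK = 6000  # roughly the batch size that worked above

def bulk_insert(rows):  # rows: an iterable of 11-column tuples
    con = sqlite3.connect("data.db")
    con.execute("CREATE TABLE IF NOT EXISTS lines (c0,c1,c2,c3,c4,c5,c6,c7,c8,c9,c10)")
    sql = "INSERT INTO lines VALUES (?,?,?,?,?,?,?,?,?,?,?)"
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) >= CHUNK:
            con.executemany(sql, batch)
            con.commit()  # one commit per chunk, not per row
            batch.clear()
    if batch:
        con.executemany(sql, batch)
        con.commit()
    con.close()
2 -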
What the fuck is going on with atom on my desktop?!!?!?
Every time I open it, it starts comparing every single file on my pc with my github. It starts using 99% of my memory, and fuck is it frustrating. I decided to just learn vim, which is cool, although I still miss managing windows with the mouse, and I'll probably try VS sometime in the future. I really wanted to like atom, but I can't stand how slow it is compared to vim right now, even when the github shit doesn't happen.14 -
TL;DR: WINE + me = system binaries gone. (HOWTHEFUCKDIDIDOTHAT) Kernel panic. Core program files gone. I'll never get it fixed right. Will backup, then install fedora tomorrow.
I really like games and I'm sure there are many of you who can relate. Imagine my perpetual pain, being on the job hunt, no money, and only my Linux laptop for games. (It's only Linux because of a stupid accident and a missing windows installation disk, partly explained in a previous rant). My stack of games my dad and I have played over the years, going back to populous and before, looked light enough for my laptop to run them smoothly. I wanted to see if I could get one to work. My eyes settled on simcity 4 and Sid Meier's railroad tycoon, 13 and 10 years old, respectively. Simcity didn't work as many times as I tried following online instructions. Disk 1 went fine. Disk 2 showed up as Disk 1. Didn't think much of it, so long as the computer could read the contents. I downloaded playonlinux as that could apparently do the complex stuff for me. Didn't work. I gave up with it after an hour and a half.
Next was railroads. Put the disk in aaaand it says SimCity disk 1 is in the tray. Fuck right off, thank you very much. Eject, put back, reject, eject, fiddle in wineconfig, eject, more of this, and voilà it read as railroads :) Ran autoplay.exe with wine, followed instructions, installed it, and it worked! Chose single player, then the map and setting, pressed play, and all the models of the buildings and track were floating in the air over a green plane, the UI is weird and the map doesn't represent anything but trains. All the fkin land is gone, laying track is gonna be a ballache.
I quit it and decided bedtime.
Ctrl+alt+t
sudo shutdown -h now
shutdown not found.
sudo reboot
reboot not found
Que?
Nope, I don't like this.
Force choked my laptop by the power button. Turned it on again.
Lines of text appear.
Saw a phrase I've only ever seen on Mr Robot.
Kernel panic.
Nooooo thanks, not today, this is fiction.
I turned it off and on. Same thing. I read the logs, and some init files couldn't be found. I got the memory stick I used to install Mint in the first place and booted from that. I checked the difference between my stick's bin and sbin and the laptop's, and it was indeed missing binaries. Fuck knows what else happened; I only wanted to play games, but now I don't know what is or isn't on my computer. How can I trust what's on it now?
I go downstairs and tell my dad. He says something about rpm, but this is Mint, a Debian-based system, so that won't work. I learn that binaries can be copied over, so maybe I can fix it.
Go upstairs again, decide not to fix it. Fedora is light, has a good rep for security, and is even more difficult to get games on, which is my vice. There are more reasons, but the overriding one is that I'm spooked by the fact that something I did went in and removed system binaries, maybe even altered others, so I want something I'm less likely to do that with. Also, my fellow CS students used to hate on it, but my dad uses and recommended it, so I want to try it.
Also, seriously, fuck WINE/PlayOnLinux/my inability to follow instructions(?)/whatever demons haunt me. Take your pick; at least one, if not more, is to blame, and I can't tell which, but it's prooooobably the third one.
It's going to be 16 hours before I touch my laptop again; comments before I back up and then install Fedora are welcome, especially if they persuade me to do otherwise.
P.S. Thanks for reading this mind dump of a post; I'm writing while it's fresh, but I'm tired AF.
-
A long time ago you sent me an email with the subject "I love you". I got so excited that I forwarded the letter to all my contacts, and they forwarded it too... I can't find the words for the feelings I had back then for you. I fell in love with you, really. But there were always troubling moments for me.
For example when "Code Red" showed up and found your backdoor. Man, I was pissed at the time. I didn't know what to do next. But things settled, and we found each other again.
And then that other time, when this girl named "Melissa" was sending me passwords to pr0n sites, I couldn't resist. She was really awesome, but you know, deep in my heart that was not what I wanted. I somehow managed to go back to you and say sorry. We even moved together into our first flat, and later into our own house. That was a really good time; I love to think back on those moments.
Then my friend "Sasser" came over one night. Do you remember how he claimed that big shelf in our living room and overflowed it with his own stuff, until we had no clue what we were even reading off the shelf anymore? Wow, that was a disturbing experience.
But a really hard time came when our dog "Zeus" got kicked by this ugly trojan horse. I really don't want to go into details about what the mess looked like after we discovered him on the floor. Still, I am very sorry for him that he didn't survive it :(
Some months later, this guy named "Conficker" showed up one day. I shat my pants when I discovered that he had guessed the password on my computer and got access to all my private stuff on it. He even tried to find some of our network shares with our photos on them. God, I was happy that he didn't get access to the pics we stored there. Never thought that our homemade photos were not secure there.
We lived our lives together; we were happy, until that day when you started the war. "Stuxnet..!" you cried directly in my face, "you are gonna blow up the centrifuges of our life", and yeah, she was right. I was in a really bad mood back then. I didn't even try to hide my anger. But really, I don't know how all this could happen. All I know is that it started with that cool USB stick I found on the stairs of our house. After that I don't remember anything, as it has just been erased from my memory.
The years passed. And I'll say the truth here: we were not able to manage the mess of our relationship. But I still loved you when you told me that you would leave. My "Heartbleed" started immediately; you stabbed it where it causes the most pain, where I thought the keys to your heart were secured. But no, you stabbed even harder.
Because not long after, you even encrypted our private photos on our NAS, and now I am really finished: no memory that can be refreshed with a look at our pictures, and you even want my money. I really "WannaCry" now... -
So, funny story with a bit of self-promotion at the end.
I was recently checking out some apps on the Play Store and found that my first ever, "launched just to experiment" app (released 1.5 years ago) had received more than 5k downloads. I was very happy about that, so I posted a small message on LinkedIn.
Now, my LinkedIn profile consists of 98% people who are total strangers and have never met me (is it just me, or do you also get a lot of stranger connect requests there?). So my usual post rarely ever goes beyond 5 or 6 likes.
But idk how, there too my post got 35+ likes, and now I was on cloud nine.
So I finally decided to kick my own ass and release an update to that app (it had around 70% pity comments like "nice first app, but it should have this x feature", "overall nice but it could use an x feature", etc.).
And boy, what a journey it was over the last 72 hours.
Firstly, my madhead laptop started killing me with battery failures and constant hangs.
Then my past asshole self tried to give me the middle finger. So, I have this whole partition in my memory where I keep my Android stuff and apps. It has a special folder named "published zone", and I keep all my published app code and related files there.
I was fairly certain that this app's code would also be there, so I opened it, found the code and tried running it.
Turns out my asshole self had messed around with the code so much that the whole DB layer WAS fucked up, all the UI WAS changed and no code was working.
"Not to worry", I thought. I always use git, and there would be a correct version some commits back. WRONG. I HAD CHANGED THE WHOLE FUCKING WORKING PRODUCTION CODE AND DIDN'T MAINTAIN A VCS!
Also, this was the verbose and shitty Java code my 1.5-years-younger self so loved to write, so it was taking me way more time to figure out what was happening in an already fucked-up codebase.
So I tried a couple of ways to get back my working code:
- I tried looking for a Google-recommended solution. Those guys take my whole app build and distribute it via the Play Store, but they provide no means to retrieve the original code.
- I checked my (occasionally) backed-up hard disk, but no. My hard disk will have 100s of movies from 2016, but not a single useful piece of fucking code.
- I also tried to get my APK and decompile it via some online decompiler. Here Google again fucks up and doesn't allow me to get my APK directly. Meanwhile, I found a ton of shady websites hosting an APK of my app without my knowledge O_o. I tried to decompile one of them, but the code was even less understandable than my fucked-up code.
So I ended up looking at both the messed-up code and the decompiled code and coded the whole app from scratch (well, not scratch: I extracted the resources and some undamaged activities from the messed-up code). Also, GitHub was down for more than 3 hours yesterday, at the same time I was trying to look at some repositories.
Lessons learned:
- DON'T FUCK UP WITH THE PRODUCTION CODE
- MAINTAIN VCS (see the snippet after this list)
- Your laptop is shit for reliability, GitHub is also shit for reliability, so save your code in multiple places.
- There are way more copies of your code lying around on the internet than you think.
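On the MAINTAIN VCS point, for future me: the whole safety net is a handful of commands. A minimal sketch (the remote URL is a made-up placeholder):
git init
git add -A
git commit -m "working production build"
git tag -a v1.0 -m "exactly what went to the Play Store"
git remote add origin https://github.com/you/yourapp.git
git push -u origin main --tags
With a tag per release, the exact shipped code is always one checkout away, even after you butcher the working copy.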
Check out my app here: https://play.google.com/store/apps/...
-
I'm struggling with studying, and that's seriously holding me back. Regardless of the type of technical book I'm reading, I'm always in a fight with my brain. Even if I enjoy the topic, and then enjoy using what I read while I study, I struggle to get through more than 1-2 chapters (sometimes even less) at a time; then my head starts to hurt, my focus drifts away, and if I force myself to go on, my brain just refuses to store the new information. It feels like filling a full tank.
At this point I should have learned C++ and Swift and started to contribute to projects which aren't overdone web apps, but all I have are two half-read books which silently "judge" me any time I open my eBook library, and I dread returning to them, having associated them with headache and frustration. The only things I read this year were design patterns (which haven't found a single real-life use since then) and F# (which I never used except for some little demos, and which is now slowly fading from my memory).
Have you got any study advice to help me deal with this frustrating situation?
-
I struggled with whether to post this, but I feel like I have to. I didn't want to feed into the fear or give "them" any more reason to argue against common sense, but I guess it can't be helped.
The reason I was gone for a while is that I went and got my vaccination.
In less than half an hour after getting the vaccine, I was in the ICU. The staff told me I had a stroke, possibly from clotting and inflammation. I couldn't feel my arm or anything below my shoulders. Yes, really.
Apparently I "died" for a little while, and when they brought me back I was in a coma for almost a week.
I'm back home now and I still don't fully understand what happened. I still have numbness and horrible headaches, and can barely think straight sometimes, but the doctors told me that I didn't suffer any permanent brain damage, according to my scans.
They also told me I had old damage to my left and right temporal lobes, which makes sense because I have always had problems with short-term memory and other issues.
And I'm just at a loss how this could happen. I have no serious injuries. We were told this is safe.
And this is the exact reason I didn't want to post it, because now tards will come in and be all "lololol serves you right, vaxxer!"
If I had known the side effects were this bad, maybe I would have changed my mind, but no one told me! I mean, I think I still would have gotten it, because we have to protect vulnerable people, but still.
The hospital assured me it wasn't the vaccine and must have been an underlying condition, but I'm not so sure. I just happen to have a pre-existing problem I don't know about, that causes a stroke and paralysis only half an hour after the shot?
And now I don't know if I'll ever be OK. The doctors warned me I may suffer more strokes and to avoid physically demanding tasks for a while. My primary job is construction (not by choice). Now I face the prospect of not even being able to work my existing job or do the things I love, like hiking, anymore. So much of the world doesn't make any sense right now, and I just don't know what to believe anymore.
Tards will probably be in shortly to suggest I check for microchips or test fucking magnets on myself.
No, just stop.
-
Some of you know I'm an amateur programmer (ok, you all do). But recently I decided I'm gonna go for a career in it.
I thought projects to demo what I know were important, but everything I've seen so far says otherwise. It seems the most important thing to hiring managers is knowing how to solve small, arbitrary problems. Specifics can be learned, and a lot of "requirements" are actually optional, there to scare off wannabes and tryhards looking for a sweet paycheck.
So I've gone back, dusted off all the areas where I'm rusty (curse you, regex!), and am relearning, properly. Flash cards and all. Getting the essentials committed to memory instead of fumbling through and having to look at docs every five minutes to remember how to do something, because I switch languages, frameworks, and tooling so often. Really committing to one set of technologies and drilling the fundamentals.
Would you say this is the correct approach to gaining a position in 2020, for a junior dev?
I know that for a long time, "entry level" positions didn't really exist, but from what I'm hearing around the net, that's changing.
Heres what I'm learning (or relearning since I've used em only occasionally):
* Git (small personal projects, only used it a few times)
* SQL
* Backend (Flask, Django)
* Frontend (React)
* Testing with Cypress or Jest
Any of you have further recommendations?
Gulp? Grunt? Are these considered "matter of course" (simply expected), or learn-as-you-go for a beginner like myself?
Is knowing the agile 'manifesto' (whatever that means) by heart really considered a big deal?
What about the basics of BDD and XP?
Is knowing how to properly write user-stories worth a damn or considered a waste of time to managers?
Am I going to be tested on obscure minutiae like little-used yarn/npm commands?
Would it be considered a bonus to have all the various HTTP codes memorized? I mean thats probably a great idea, but is that an absolute requirement for newbies, or something you learn as you practice?
During interviews, is there an emphasis on speed or correctness? I'm nitpicky, like to write cleanly commented code, and prefer to have documentation open at all times.
Am I going to, eh, 'lose points' for relying on documentation during an interview?
I'm an average programmer on my good days, and the only thing I really have going for me is a *weird* combination of ADD and autism-like focus that basically neutralize each other. The only other skill I have is talking to people at their own level to gauge what they need and understand. Unfortunately, and contrary to the grifter persona I present for lulz, I hate selling, let alone grifting.
Otherwise I would have enjoyed telemarketing way more and wouldn't even be asking this question. But thankfully I escaped that hell and am now here, asking for your timeless nuggets of bitter wisdom.
What are truly *entry level* web developers *expected* to know, *right out the gate*, obviously besides the language they're using?
Also, what is the language they use to program websites? It's like java right? I need to know. I'm in an interview RIGHT now and they left me alone with a PC for 30 minutes. I've been surfing pornhub for the last 25 minutes. I figure the answer should take about 5 minutes, could you help me out and copypasta it?
Okay, okay, I'm kidding, I couldn't help myself. The rest of the questions are serious, and I'd love to know your opinions on what is important for web developers in 2020, especially entry-level developers.
-
I can work with Angular, even though it's a pain in the butt.
My current Angular job is actually the first job where the manager has decent human values and ethics; I like my team, and yeah, what we're building is shit. But it's only 30% shit because of Angular, another 30% is due to SAFe, and the rest is the usual stuff.
Still enjoy my job and respect my team.
But please do not expect me to pretend Angular is on a comparable level to React. Angular hasn't brought any actual innovation in most major versions, but still releases those breaking major updates at least twice a year.
Ivy might be awesome, but just because Angular told the world 3 years ago to also provide Ivy-compatible compile targets for their libs/packages doesn't mean everybody cared.
And ngcc, the awesome compatibility compiler, mutates node_modules in place. So no parallel stuff, no using Yarn 2 or pnpm.
At the same time, React has brought so many innovations into the frontend world, but is basically backwards compatible.
Not sure how the Angular partial compilation and whatever else needs to go on works, but it seems like there's hardly anyone who really knows, so you can't use Vite or whatever other new tool.
And sure, if you're really good, you can write Angular without producing memory leaks.
But it's really hard. Do you know what's also quite hard? Producing memory leaks with React!
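Case in point, the classic Angular foot-gun: subscribe in a component and forget the teardown, and you leak one subscription per visit. A minimal sketch (component and observable are made up):
import { Component, OnDestroy, OnInit } from '@angular/core';
import { interval, Subscription } from 'rxjs';

@Component({ selector: 'app-ticker', template: '{{ tick }}' })
export class TickerComponent implements OnInit, OnDestroy {
  tick = 0;
  private sub?: Subscription;

  ngOnInit() {
    // without the teardown below, this subscription outlives
    // the component on every navigation away: a leak per visit
    this.sub = interval(1000).subscribe(t => (this.tick = t));
  }

  ngOnDestroy() {
    this.sub?.unsubscribe(); // forget this and it can never be collected
  }
}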
And for sure, Angular Universal, which it feels like no one uses, will still be on a comparable level to an open-source product that's used all over the world, forms the basis of an open-source company, and is improved through thousands of issues day by day.
And sure, two kinds of change detection are a great idea. And yeah, pretending Angular comes with everything included makes it worth it that the API is fucking huge and that you're better off knowing nothing, because you have to read things up anyway, than knowing quite a lot, since making assumptions and believing APIs work in a similar way and follow similar conventions...
Whatever... I work with it. I like the team, I like the company, even my boss. But please don't expect me to lie to you that this was a good idea, or that Angular is even remotely on the same level as React.
-
Shit, I lost the rant again. Well, let's begin from the top.
This is a little bit personal, but I'm not keeping any of it a secret. I'm a hyperactive thinker at night (ADHD). I must write this down, although it's well past the middle of the night at this point.
I just discovered that I might be a better writer whilst I'm sleepy, hungry, out of the effect of the meds, or all of the above.
And may I remind you that I'm not a native English speaker or writer.
* Saved to clipboard, so I won't lose this again *
I've now written 2 long rants, 8 issue reports (devRant) and a loong collab posting in this one sitting, or rather lying down. It feels like I'm writing perfectly without missing a beat. I know that's not right; it's the main symptom of ADHD: my brain is actually running slower than average, much slower. That's a reasonable explanation for the "fast" innovation.
I'm running without the restrictions of a normal human; I don't "overthink" every single word and rather go with the flow. That's what spell checkers are for.
* Save *
You can probably see what's happening. It's certainly also true when writing code. I left out the normal cleaning up (except for the grammar; found 10 errors).
It's pretty much the same thing as being drunk, or even high, I imagine.
I must not be the only one.
* Writing tags... *
* Update error count *
* Recover one part from memory *
-
I was just looking at new and lesser-known programming languages. Then I came to Rust.
"Rust manages memory at compile time"
What?😵 How is that even possible?
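The best answer I've found so far is "ownership": the compiler statically tracks where each value stops being used and inserts the deallocation right there, so there is no garbage collector at runtime. A minimal sketch:
fn takes_ownership(s: String) {
    println!("{}", s);
} // `s` is dropped (freed) here; the compiler decided that at compile time

fn main() {
    let s = String::from("heap data"); // heap allocation
    takes_ownership(s);                // ownership moves into the function
    // println!("{}", s);              // won't compile: value was moved
}
-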
I feel like writing, or telling people, about the time I jumped from Windows 7 Ultimate to Windows 10. (I'm not against 10, but I'm never updating again after what happened to me.)
It all starts when none of my games will play, due to a possible issue with my graphics card. I look up "3D Source game bug" and not many results pop up. I go on Microsoft's Q&A forums and ask this question, but to my surprise nothing they say makes sense. "Clean the pins of your graphics card, make sure you verify the games on Steam." I verified the games and they checked out as perfectly fine. I don't have access to my graphics card, because this is a laptop, sadly, not a tower.
Two months pass and my computer is already showing signs of stress, like it didn't want to live, in a sense. It was three times slower than when I was on Windows 7, and it was unallocating areas of my main hard drive where I could make virtual hard drives.
Instantly I start looking up Linux distros and find Linux Mint; 17.3 was the current version at the time. I downloaded it, burned it onto a DVD-ROM, and rebooted my computer. I booted from the disc, and to my surprise it seemed almost like Windows 7, apart from the Linux part. I grab my external hard drive and partition it to hold the Linux distro, and leave it plugged in, in case Windows 10 does actually fail.
On December 19, a few months after Windows 10 had released, I start my laptop to try and continue my studies in video game development. But to my surprise, Windows 10 had finally crashed, permanently. The screen flickered blue and black, and an error box said loginui.exe failed to start. I looked at it for a solid minute as my computer had just committed suicide, in a sense.
I reboot, thinking it will fix the error, but it doesn't. I can't log in anymore.
I force the laptop to shut down and turn it back on, putting it into safe mode.
To my surprise loginui.exe works and I sign in. I look at my desktop: the space wallpaper I always admired, the sound files, the screenshots I had saved.
I go into File Explorer and grab everything off the default hard drive Windows was installed on. Nothing but 400GB got left behind, and that was mainly garbage prototypes I had made, and Windows itself. I formatted my external hard drive and placed everything on it, escaping Windows 10 with around 100GB of useful data. I looked at the final shutdown button I would ever look at.
I click it and try to boot into normal Windows 10. But it doesn't work. It flickers and the error pops up once more.
I force it to shut down, insert the previous Linux Mint disc I made, and format the default hard drive through Linux. I was done; 10 gave me a lot of shit. Java wouldn't work, my games had a functional UI but showed nothing except a black abyss, and it wouldn't even let me try to update my graphics card: apparently my AMD Radeon 5450 was as up to date as the AMD Radeon 5000 series got.
I installed Linux Mint and, thinking the games would actually play, I open Steam and launch Half-Life 2 to check if Linux would be nicer to me than Windows 10 had been.
To my surprise the game ran. The scene from Highway 17 popped up on screen and the UI was fully functional. But it was playing at 10-15fps rather than the usual 60-70fps. I look at my drivers and see my graphics card isn't in use. I do some research, and it turns out I have a hybrid laptop:
Intel HD Graphics and an AMD Radeon 5450, and it was using the Intel and not the AMD. Months of testing and attempts at getting the games to work at high frame rates pass, and the damn thing still runs at a terrible, low fps. Finally I give up. I ask my mom for a Windows 7 disc and she says we can't afford it. A few months pass and I finally get a Windows 7 installation disc with money I've saved up. Proudly I put it into my optical disc drive and install it to my main hard drive, deleting Linux completely. I announced to all my friends that my computer was back in working order, and I installed everything I needed: Steam, Skype, Blender, and Unity, as well as all my games. I test Half-Life 2 and it's running exceptionally smoothly; I test Minecraft at max settings and it's working beautifully. The computer was functioning properly once again, and my life as a developer started as I modeled things in Blender, learned beginner C#, and learned a lot of Batch. Today the computer still runs at a great speed, and I warn others about what happened to me after I installed Windows 10, if they're thinking of switching from 7 or 8 on an older machine.
Truly, the damage to my data cannot be undone. But the maintenance, the work, the tests, all are a memory of how Windows 10 ruined me. And every night leading up to the one-year anniversary of Windows 10's release, I took the battery out of my laptop and unplugged it from AC power, just so Windows 10 couldn't show its DLLs, batch scripts, VBS scripts, anything, on my computer. But now, after all this has happened and I have recovered, I only have a story to tell.
-
TL;DR: I have some rambly shit to say...
Update on the uni stuff: I think I got a pass in all the subjects. Two exams left, but I am holding on. It's a big deal to me, since last year I could barely do a single subject per semester, a subject I had failed a few times because of lack of interest and good ol' depression. Anyway, I persisted with that subject, got my Bachelor's in Food Technology, and now I'm doing that Master's of mine... It probably looks wild to people here that I did that switch, but I have always had a relationship with computers, as long as I can remember myself. So it's not surprising that as soon as I got a choice in what I *actually* wanted to do, I chose this kind of thing. But I do have to rant that it took me 10 fucking years to choose! And that I did not choose it before choosing food technology, which I will probably never use anyway. I wasted so much of my energy and time on that. I did elect programming as one of the subjects while doing food tech, but I really should have moved to something else. But oh well. Guess I had to find out the hard way.
For all those reading: this is what it looks like when you're 30, have very little experience in doing programming for anything other than academics, and are doing a major career switch through studies, after struggling for 10 years with a 4-year Bachelor's. But such is life.
Also, a bit off topic, but I just cannot handle people not telling me what they mean because of an inability, or lesser ability, to tell what that is in the first place.
I can't deal with how fucked human societies are. I just can't. I am way too nice for it. So I listen to stuff like true crime to really get a feel for how evil people can be. I know it's ~problematic~ or whatever, but to me it is a way of engaging with the lesser-spoken side of human beings.
And maybe, just maybe, I should get checked for ADHD again, because I feel like, despite my therapy for depression, nothing has really changed with the ADHD symptoms I was diagnosed with. And maybe for autism, since people have labelled me that way and it might explain some stuff... All that is to say, I need some good mental care. And this society is shit for it. Hell, one of the psychologists I was under the care of apparently thought depression resulted from ungratefulness. All this while I was legit being abused. But that abuse has stopped, now that I've found a psychologist who actually stands up for me. I just mourn all the time I spent being depressed, and how it fucked my memory and stuff. How much it affected me and all. I have no idea why I'm being this vulnerable, but it feels somewhat fitting... How do you cope with being 30 and not remembering almost all of your life? What you remember being what you managed to write down, or what was negative enough that it stuck in your brain forever...
Just why am I fucking supposed to be all happy and shit when I am just tired of life, because it is too goddamn much? I have no real reason to look forward to things, online friends and the offline one included. Because ultimately, I have no damn motivation to look forward to anything, really. I am supposedly doing better, but in reality I am just getting better at going through the motions. The therapy, while mind-blowingly effective, is not actually addressing the core cause of everything, and just expects me to fake it till I make it. And this is me saying that about CBT. Why should I have to tell myself things just to feel human? I am one, and as long as I'm alive nothing will change that. So why do I always have to feel like an alien wherever I am? So out of touch with myself that I don't have a self-image, or the ability to even tell what the actual fuck I want from life... I am getting better with the latter, but still. It hurts. I wanna shed so many tears, but I'm frustratingly unable to do so.
I am just a human trying to human in this ocean of 8 billion humans. Maybe I will find some more connections, maybe I won't.
I wanna end this rambling session with a few things:
1. I will have to go to Canada at some point this year to see my in-laws and some other family over there...
2. I will probably have to seek a job there (for financial reasons it is much better for me to have one there and to work remotely from Georgia), and I have no idea where to start, since I am not the greatest material for it.
3. Life is going alright-ish.
4. I will hear from the startup company at some point this month.
5. I have plans for my future but no idea if they will ever come true at this point.
6. My family arrangement will have to change in more ways than one.
7. I should resume my unofficial first music album and engage in creative stuff, because at my core I have a need to do so.
8. Do I really have to do Duolingo again? I really want to not forget German and Russian, but I just never get practice. And Duolingo is surprisingly easy to forget to do, for me.
The end.
-
So I wasted the last 24 hours trying to satisfy my ego over a shitty interview and revisiting my old job's codebase, and realising that I still don't like that shit. I am just 25 and have no clue where I am heading. I am restless; most of my decisions in 2023 have had very bad outcomes, and I am just doing things to feel hopeful.
Context for the interview story -----
My previous job was at a B2B marketing company whose SDK was used by various startups to send notifications to their users, track analytics, etc. I understood most of it and don't find it to be any major engineering marvel, but that interviewer was very interested in asking me to design a system around it.
In my 1.2 years on the job there, I found the codebase to be extremely and unnecessarily verbose (Java 7), with questionable fallbacks and resistance towards change from the managers. They were always like "we can't change it, otherwise a lot of our clients won't use our SDK". I still wrote a lot of test cases and tried to understand the working of the major features.
BTW, before you guys go and declare me an embarrassment of an engineer who doesn't know the product's codebase, let me tell you that we are talking SDKs (plural) and a service-based company here. There was just one SDK with interesting, heavy-lifting stuff and 9 more SDKs which were mostly wrappers and less advanced libraries. I got tasks in all of them, and 70% of my time went into maintaining those and debugging client-side bugs instead of exploring the "already-stable-don't-change" codebase.
So, based on my vague understanding and my even vaguer memory from 1 year ago, I tried to explain an overall architecture to that interviewer guy. His face was screaming the word "pathetic" from his expressions, so I thought that today I would try to decode the codebase in 12-15 hours, publish a cool article and be proud of how much I know about so-called martech system design. Their codebase is open-sourced, so it wasn't difficult to check it out once more.
But boy oh boy, did I get bored. Unnecessary classes, unnecessary callbacks, static calls, oof. I tried to refactor a few classes, but even after removing 70% of the codebase, I was still left with 100+ classes, most of them 3000-4000 lines long. And this is your plain old Java library, adding just 800KB to your project.
Boring, boring stuff. I would probably need 2-3 more days to get an understanding of the complete project, although by then I would again be questioning my life choices: was this a good use of my 36 hours?
What IS a correct usage of my time? I am currently super dissatisfied with my job, so I want to switch. I have been here for 6 months, so I probably won't be going unless I get insane money or an irresistible company offer. For this I had devised a 2-part plan: either become good at the modern hot buzz stuff in my domain (the stuff currently being popularized by dev influenzas) or become good at DSA/leetcode/CP. I suck badly at DS/algo stuff, and I'm not much motivated either. So I went with the hot buzz stuff.
But then this interview expected me to be a mature dev with system design knowledge... agh, fuck. The festive season is going on and I'm unable to buy any cool shirts, since I am so limited with my money from my mediocre salary and loans. And mom wants to buy a home too... yeah, kill me
-
I like rants that are thought-provoking and push a message forward, regardless of whether they may sting a little, so for my first post on here I'd like to hit home with many of you.
HTML5 "native" applications are not needed. Let's cover mobile first of all: the misconception that apps must be written either in JavaScript or in the native Android / native iOS environment, or even with some third-party paid tools like Xamarin, is quite strange to me. OpenGL ES is on both iOS and Android; there is no difference. It's quite easy to write once, run everywhere, but with native performance and without having to jump through JS when it's not needed. Personally, I never want to see HTML or CSS if I'm working on a mobile app, or desktop. Which brings me to desktop: I can't begin to describe how unthought-out an Electron app is. Memory usage, storage space for embedding Chromium, web views gained at the expense of literally everything else; cross-platform desktop development has been around for decades, OpenGL is everywhere, enough said. Finally, what about targeting the browser: if you're writing a native app for mobile and desktop, let's say in C++, and it's not in JavaScript, how can it turn back into JavaScript? Well, luckily C++ has Emscripten, which does exactly that, simply put. Or you could be using a cross-compiling language like Haxe, which is what I use. It benefits from type safety while exporting both C++ and JavaScript code. Conclusion: in reality I see the appeal of the JS ecosystem. It's large, filled with big companies trying to make JS cross-development stronger every day. However, development in my mind should be a series of choices; choices that are invisible don't help anyone, regardless of the popularity of the choice, or the skill required.
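If you've never seen Haxe, the pitch fits in one file: one typed codebase, two targets (class name and output paths here are mine):
// Main.hx
// native build: haxe -main Main -cpp bin
// web build:    haxe -main Main -js bin/main.js
class Main {
    static function main() {
        var greeting:String = "same code, compiled to C++ or JS";
        trace(greeting);
    }
}
-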
Can anyone help me with this theory about microprocessors, CPUs and computers in general?
(I used to love programming during my school days, when it was just basic searching/sorting and OOP. Even in college, when it advanced to language details, compilers and data structures, I was fine. But subjects like COA and microprocessors, which kind of explain the working of the hardware behind the brain that is a computer, are so difficult for me to understand 😭😭😭)
How does a computer work? All I knew was that when a bulb gets connected to a battery via wires, some metal inside it starts glowing and we see light. No magic involved till now.
Then came the von Neumann architecture, which says a computer consists of 4 things: I/O devices, the system bus, memory and the CPU. I/O and memory interact with the system bus, which is controlled by the CPU. Thus the CPU controls everything, and that's how a computer works.
Wait, what?
Let's take an easy example of a calculator. I pressed 1+2= on the keyboard, and it showed me "1+2=" and then "3". How the hell did that happen?
Then some video told me this: every key on your keyboard is connected to a multiplexer which gives a special "code" to the processor for each key press.
The "control unit" of the CPU commands the RAM to store every character until "=" is pressed (which is a kind of interrupt telling the CPU to start processing). RAM is simply a bunch of storage circuits (which can store some 1s) along with another bunch of circuits which can retrieve that data.
Up till now, the control unit knows that memory has (for example):
Value 1 stored as 0001 at some address 34A
Value + stored as 11001101 at some address 34B
Value 2 stored as 0010 at some address 23B
On receiving the code for the "=" press, the "control unit" commands the "ALU" unit of the CPU to fetch data from memory, understand it and calculate the result (i.e. the "fetch, decode and execute" cycle).
The ALU fetches the "codes" from memory, which translate to ADD 34A,23B, i.e. add the data stored at addresses 34A and 23B. The ALU retrieves the values present at the given addresses, passes them through its adder circuit and puts the result at some new address 21H.
The control unit then fetches this result from the new address and, via the system buses, sends this new value to the display's memory, mapped at some port 4044.
The display picks it up and instantly shows it.
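If I try to turn that little walkthrough into actual 8085 instructions, I think it looks something like this (addresses made up and padded to 16 bits, like above):
LDA 034AH  ; memory read: accumulator <- value at 034A, i.e. 1
MOV B, A   ; park it in register B
LDA 023BH  ; accumulator <- value at 023B, i.e. 2
ADD B      ; the adder circuit: A <- A + B
STA 0210H  ; memory write: store the 3 at a fresh address
HLT        ; each line is one instruction cycle, itself made of
           ; machine cycles like opcode fetch, memory read, memory write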
My problems:
1. Is this all correct? Is this really all that happens?
2. Please expand on this more.
How are the system bus, the ALU and the CPU working together?
What are the registers, accumulators and flip-flops in memory?
What are machine cycles?
What are instruction cycles, opcodes, instruction codes?
Where does assembly language come in?
How does the CPU manipulate memory?
These data buses and control buses, what are they?
I have come across so many weird words I don't understand: DMA, interrupts, memory-mapped I/O devices, etc. Somebody please explain.
PS: I'm learning about the fucking 8085 microprocessor in class and I can't even relate it to basic computer architecture. I had flunked the COA paper, which I now realise why, coz it's so confusing. :'''(
-
I give up. I am having trouble understanding Go's * and & stuff.
The attached code does print what fmt.Println is given, but the code calling FindByID throws "invalid memory address or nil pointer dereference" when it attempts to use the result.
Can someone please explain where I'm wrong?
The error log blames the line with the "if" condition, even though fmt does print its output to the console.
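Since the screenshot won't paste here, a boiled-down version of what I think is going on (FindByID and User are stand-ins for my actual types):
package main

import (
	"errors"
	"fmt"
)

type User struct{ Name string }

// returns a nil *User plus an error when nothing matches
func FindByID(id int) (*User, error) {
	if id != 1 {
		return nil, errors.New("not found")
	}
	return &User{Name: "someone"}, nil
}

func main() {
	u, err := FindByID(2)
	fmt.Println(u, err) // fine: Println prints a nil pointer as <nil>

	// if u.Name != "" { ... } // this is the line that panics:
	// invalid memory address or nil pointer dereference

	if err == nil && u != nil { // the fix: guard before dereferencing
		fmt.Println(u.Name)
	}
}
-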
... worst drunk coding experience?
None. Or to be more precise, all three of the ones I had. I can't code drunk, I hate doing it, I hate even thinking about doing it when drunk.
So after those initial three attempts, I don't try to do it again, ever.
BUT, best coding experience while high?
ALL OF THEM.
Some of the best pieces of code I ever wrote, I wrote while high. My mind goes into overdrive at those times, and my thinking is not lines/threads of thought, but TREES of thought, branching and branching, all the nodes of each layer of the tree coming to me AT ONCE, one packet == a whole layer across all of the branches.
And the best was when one day, in about a 14-hour marathon of coding while high, I wrote from scratch a whole vertical slice of my AI system that I'd been toying with in my head for several years prior, where I had all of the high-level concepts ALMOST down but could never specify them into concrete implementations.
And I do mean MY AI system, my own design, from the ground up, mixing principles of neural networks and the neuropsychology of the human brain in ways I still haven't seen even mentioned anywhere.
An autonomous game AI which perceives and explores its environment and the tools within it via code reflection, remembers and learns, uses tools, and makes decisions for itself, for its own well-being.
In the end, I had a testbed with a person, a zombie and a shotgun.
All they had pre-defined in their brains were the concepts of hunger and health. Nothing more.
Upon launching it, the zombie realized it wanted to feed, approached the oblivious person, and started eating them.
At which point, purely out of how the system worked, the person realized: "this hurts, the hurt is caused by the zombie, therefore I hate the zombie, therefore I want to hurt it", then looked around, saw the shotgun, inspected its class by reflection, realized "this can hurt stuff", picked the shotgun up, and shot the zombie.
It remembered all of that, and upon seeing another zombie, shot it immediately.
It was a complete system; all it needed to become a full-fledged thing was adding more concepts and usable objects, and it would automatically have been able to create complex multi-stage, multi-element plans to achieve its goals/needs/wants and execute them. And the system was designed in such a way that, just by adding a dictionary of natural-language words for the concept objects on top of it, it should have been able to generate (crude but functional) English sentences to "talk" about its memories, explain what happened when, how it reacted, what it did and why, just by exploring the memory graph the same way as during its decision process... and by reversing the function, it should have been able to receive (crude) English sentences that would make it learn what happened somewhere else in the game world to someone else, how to use stuff, and what to do, as in, actually transfer actionable, usable knowledge to it...
It felt amazing to code for 14 hours straight, with no test runs during that, then run it for the first time after those 14 hours and see that happen.
And it did, I swear! While I was coding, I was routinely catching typos and mistakes I'd made 5-20 minutes earlier, 4 files/classes back! The kind you (and I) usually notice only when you try to run the thing and it bugs out.
it was a transcendental experience.
And then, two days later, I don't remember anymore what happened, but I lost all of that code.
And since then, I have never mustered enough strength and resolve to try and write the whole thing again.
... that was like 4 years ago.
I hope that miracle will happen again one day...
-
A question for devs who use IntelliJ IDEA:
How often do you use live templates?
I am a new Android dev with ADHD and just discovered live templates. They make my life much easier; for example, I have shortcuts for generating RecyclerView adapter/ViewHolder/implementation boilerplate code.
That way I am able to focus on implementation and do my coding like building blocks, rather than memorizing every detail of the implementation. Also, I don't need to go to Stack Overflow and copy-paste basic things multiple times. Even during a live coding interview, for example, having live templates seems awesome; copy-pasting from Stack Overflow would be shameful (I think). Using my own custom shortcuts for live templates seems the best fit for how my brain functions (I suck at memorizing tiny details, but I remember the general idea/flow of a pattern, and I would prefer memorizing what to use and when to use it, instead of all the small details of implementation).
Is getting too dependent on live templates a good practice to get used to? Do other developers frown upon a dev who has dozens of live templates and relies on them instead of writing all code from memory by hand?
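For anyone who hasn't tried them: a live template is just parameterized text under Settings > Editor > Live Templates. A made-up "lazyview" abbreviation for Android/Kotlin could look like this, where the $name$-style tokens are fill-in variables and $END$ is where the caret lands after expansion:
// abbreviation: lazyview (hypothetical)
private val $name$ by lazy { findViewById<$type$>(R.id.$id$) }$END$
-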
Cleaning up code warnings in a 3rd-party piece of software and found a function that was returning a pointer to a local variable. Who wrote this piece of shit?!
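For anyone who hasn't hit that warning before: the local dies when the function returns, so the caller gets a pointer into dead stack memory. A minimal sketch of the bug and one common fix:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

char *broken(void) {
    char buf[32];             /* lives on the stack */
    strcpy(buf, "hello");
    return buf;               /* warning: returns address of a local */
}                             /* buf is gone here; the pointer dangles */

char *fixed(void) {
    char *buf = malloc(32);   /* heap memory survives the return */
    if (buf) strcpy(buf, "hello");
    return buf;               /* caller owns it and must free() it */
}

int main(void) {
    char *p = fixed();
    if (p) { printf("%s\n", p); free(p); }
    return 0;
}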
-
In the past, apps I've written used a flat-file backend. It's very fast, but obviously clunky to have a big structure of flat files behind an app. It ran circles around framework-based RDBMS backends as far as performance is concerned, but again, it was clunky. Managing backups and permissions on tens or hundreds of thousands of small files was no fun. Optimizing code for scaling was fun (generating indexes, making shortcuts), but something was still missing. Early in 2017 I discovered Redis: a NoSQL backend that just stores variables and lives almost entirely in memory. Excellent modules and frameworks for every language. It was EXACTLY what I needed, even though I didn't know it yet. I spent a good deal of 2017 converting apps from flat files to Redis, and cackled with glee as they became the apps I wanted them to be. Earlier this week, I started building my first app that starts with Redis instead of flat files, and I can't stop gushing to anyone who will listen. Redis for president!
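To give you an idea, here's roughly what replacing one of those flat files looks like with redis-py (key names made up):
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# what used to be one small flat file per user becomes a hash per key
r.hset("user:1001", mapping={"name": "ada", "logins": 0})
r.hincrby("user:1001", "logins", 1)  # atomic counter, no file locking
print(r.hgetall("user:1001"))        # {'name': 'ada', 'logins': '1'}

# what used to be a directory scan becomes an index you maintain
r.sadd("users:active", "user:1001")
print(r.smembers("users:active"))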
-
YGGG IM SO CLOSE I CAN ALMOST TASTE IT.
Register allocation is pretty much done: you can still juggle registers manually if you want, but you don't have to -- declaring a variable and using it as an operand instead of a register is implicitly telling the compiler to handle it for you.
What's more, spilling to the stack is done automatically, keeping track of whether a value is or isn't required, so it's only done when absolutely necessary. And variables are handled differently depending on whether they are input, output, or both, so we can eliminate making redundant copies in some cases.
It's a thing of beauty, defenestrating the difficult aspects of assembly while still writing pure assembly... well, for the most part. There's some C-like sugar that's just too convenient for me not to include.
(x,y)=*F arg0,argN. This piece of shit is the distillation of my very profound meditations on fuckerous thoughtlessness, so let me break it down:
- (x,y)=; fuck you in the ass, I can return as many values as I want. You don't need the parens if there's only a single return.
- *F args; some may have thought I was dereferencing a pointer, but I'm calling F and passing it arguments; the asterisk indicates I want to jump to a symbol rather than read its address or the value stored at it.
To the virtual machine, this is three instructions:
- bind x,y; overwrite these values with F's output.
- pass arg0,argN; set up the damn parameters.
- call F; you know this one, so perform the deed.
Everything else is generated; these are macro-instructions with some logic attached to them, and there's a step in the compilation, dedicated to walking the stupid program for the seventh fucking time, that handles the expansion and optimization.
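A made-up instance of the above, with divmod, a and b as placeholder names:
(q,r)=*divmod a,b    ; the sugar
; what the compiler emits:
bind q,r             ; overwrite q and r with divmod's output
pass a,b             ; set up the parameters
call divmod          ; perform the deed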
So what's left? Ah shit, classes. Disinfect and open wide, motherfucker, we're doing OOP without a condom.
Now, obviously, we have to sanitize a lot of what OOP stands for. In general, you can consider every textbook shit, so much so that wiping your ass with their pages would defeat the point of wiping your ass.
Let's say, for simplicity, that every program is a data transform (see: computation) broken down into a multitude of classes that represent the layout and quantity of memory required at different steps, plus the operations performed on said memory.
That is most if not all of the paradigm's merit right there. Everything else that I thought to have found use for was in the end nothing but deranged ways of deriving one thing from another. Telling you I want this size's worth of space is such an act, and is indeed useful; telling you I want to utilize this as a base for that, when this itself cannot be directly used, is theoretically a poorly worded and overly verbose bitch slap.
Plainly, fucktoys and abstract classes are a mistake, autocorrect these fucking misspelled testicle sax.
None of the remaining deeper lore, or rather sleazy fanfiction, that forms the larger canon of object-oriented as taught by my colleagues makes sufficient sense at this level for me to even consider dumping a steaming fat shit down its execrable throat, and so I will spare you bearing witness to the inevitable forced coprophagia.
This is what we're left with: structures and procedures. Easy as gobblin pie.
Any F taking pointer-to-struc as its first argument that is declared within the same namespace can be fetched by an instance of the structure in question. The sugar: x ->* F arg0,argN
Where ->* stands for failed abortion. No, the arrow by itself means fetch me a symbol; the asterisk wants to jump there. So fetch and do. We make it work for all symbols just to be dicks about it.
Anyway, invoking anything like this passes the caller to the callee. If you use the name of the struc rather than a pointer, you get it as a string. Because fuck you, I like Perl.
What else is there to discuss? My mind seems blank, but it is truly blank.
Allocating multitudes of structures, of the same or different types, should be done in one go whenever possible. I know I want to do this, and I know whichever way we settle on has to be intuitive, else this entire project has failed.
So my version of new always takes an argument, don't you just love slurping diarrhea. If zero, it means call malloc for this one; else it's an address where this instance is to be stored.
What's the big idea? Only the topmost instance in any given hierarchy will trigger an allocation. My compiler could easily perform this analysis because I am unemployed.
So where do you want it: on the stack, on the heap, or do you want to reutilize any piece of ass, where buttocks stands for some adequately sized space in memory -- entirely within the realm of possibility. Furthermore, evicting shit you don't need and replacing it with something else.
Let me tell you, I will give your every object an allocator if you give me the chance. I will -- nevermind. This is not for your orifices, porridges, oranges, morpheousness.
Walruses.
-
Trying out GNOME again, because KDE is "just ok", and Hyprland and dwm are fine, but I wanted to try something different. (Actually dwm is amazing, and Hyprland is sorta weird?)
You know, it's not that bad. It doesn't even seem to be as memory-crazy as everyone says, either... idk what I did, but it appears to be using around a GB, maybe a little less. Definitely not the experience I remember from the GNOME 2 days. Anyway, I was curious, so I was looking at the source on GitHub... and why the fuck is there JavaScript in this DE's code? WHY. I do not understand.
Maybe I'm fucking nuts, but I actually kind of like the workflow, once I've applied a couple of "tweaks". But seriously, I am fucking gobsmacked at the JS thing. Why.
-
Sydochen has posted a rant where he is not really sure why people hate Java, and I decided to publicly post my explanation of this phenomenon, from my point of view.
So there is this quite large domain, on which one or two academic disciplines are built, such as business informatics and applied systems engineering, which I find extremely interesting and fun, that is called, ironically, SAD. And then there are videos on YouTube, by programmers who just can't settle the fuck down. The videos I am talking about are rants about OOP in general, which, as we all know, is a huge part of the studies in the aforementioned domain. What are these people even talking about?
It is absolutely obvious that there is no sense in building software in a linear pattern. Since Bikelsoft has conveniently patched consumers up with GUI-based software, the core concept of which is EDP (event-driven programming, or alternatively at least OS event queueing), the completely functional, linear approach in such an environment does not make much sense in terms of the maintainability of the software. Uhm, raise your hand if you have ever tried to linearly build a complex GUI system in a single function call on GTK, which does allow you to disregard any responsibility-separation pattern of SAD, such as the long-loved MVC...
Additionally, OOP is mandatory in business because it allows us to mount abstraction levels and encapsulate the actual dataflow behind them, which, of course, lowers the cost of development.
What happy programmers usually talk about is the complexity of doing OOP right, in the sense of an overflow of straight composition classes (that do nothing but forward data from lower to upper abstraction levels and vice versa) and the situation of a responsibility-chain break (which is when a class from a lower level directly!! notifies a class of a higher level about something, ignoring the fact that there is a chain of other classes between them). And that's it. These guys also vouch for functional programming, but that's a completely different argument, and there is no reason not to use it in the algorithmic, implementational part of the project, of course, but yeah...
So where does Java kick in, you think?
Well, guess what language popularized programming in general and OOP in particular. Java does a lot of things in a modern way. Of course, if it's 1995 outside *lenny face*. Yeah, fuck AOT, fuck memory-management responsibility, all to the maximum, towards solving the real applicative tasks.
Have you ever tried to learn to apply TextWatchers in Android with Java? Then you know about inline overriding and inline abstract-class implementation. This is not right. It reduces readability and reusability.
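For reference, this is the ceremony I mean: three forced overrides (android.text.TextWatcher) to react to a single event, with editText and validate() as stand-ins:
editText.addTextChangedListener(new TextWatcher() {
    @Override public void beforeTextChanged(CharSequence s, int start, int count, int after) {}
    @Override public void onTextChanged(CharSequence s, int start, int before, int count) {}
    @Override public void afterTextChanged(Editable s) {
        validate(s.toString()); // the only line you actually wanted to write
    }
});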
Have you ever used Volley on Android? Newbies to Android programming surely have. Quite the verbose boilerplate in the Google docs, huh?
Have you seen Intents? The Android API is, to put it mildly, messy, with all the support libs and Context class ancestors. Remember how many times the language has helped you to properly orient yourself in all of this hierarchy, when an overriding method declaration requires you to use 2 lines instead of 1. Too verbose, too hesitant, distracting; that's what the language and the API are. Fucking toString() is hilarious. Reference comparison is unintuitive. Obviously poor practices are not banned. Ancient tools. Import hell. Slow evolution.
C# ripped Java off like an utter cunt, yet in it it's a piece of cake to maintain solid patternization and structure, and to keep your code clean and readable. C# 6 was already okay, featuring optionally nullable fields and safe optional dereferencing, while we only finally got lambda expressions in Java 8, in 20-fucking-14.
Java did good back then, but when we joke about dumb developers, they are coding in Java. So yeah.
To sum up, it's easy to make code unreadable with Java, and Java is a tool with which developers usually disregard the patterns of SAD.
-
So, I have that assignment about Docker stuff. Nifty piece of software, I must say.
Anyway, I'm installing the Docker software on Windows, because I'm thinking that if I have something that gives me at least the correct structure and some skeletal syntax, I will get a faster grasp of the thing. I was expecting some sort of high-level IDE, but I end up instead with what looks like a blank window, with the only obvious choice being to sign into some bullshit I don't need. But that's another story.
My point is:
When installing the thing, it prompted me to install WSL2. Which I supposedly am not supposed to have, because my CPU doesn't support Intel virtualisation. But being impatient (that's why I came looking for an assisted solution), I pursued the installation.
Lo and behold: I end up with a shell prompt at the root of a Linux filesystem!
I ran 2 or 3 muscle-memory commands and closed the prompt; I was in Docker stuff up to the neck.
Later on, when I go back to my project, in a virtual machine it's sluggish af and screams at me that AMD-V is not supported because of something something nested pages (will look up later how that one works).
I don't have time to explore it any more yet, and especially to experiment or even barely look at this glorious mess, because I have something barely working and no time to have it fail.
But this story definitely left me perplexed.
And also: you can run WSL2 on an FX-8350
-
During my small tenure as the lead mobile developer for a logistics company, I had to manage my stacks between native Android applications in Java and native iOS apps.
Back then, Swift was barely coming into version 3, and as such the transition was not trustworthy enough for me to discard Obj-C. So I went with Obj-C and kept my knowledge of Swift in the back. It was not difficult, since I had always liked Obj-C for some reason. The language was what made me click with pointers and understand them well enough to feel more comfortable with C, since Obj-C is a strict superset of it. It was enjoyable, really, and making apps for iOS made me appreciate the ecosystem that much more and realize the level of dedication the engineering team at Apple put into their compilation pipeline. It was my first exposure to ARC (Automatic Reference Counting) as a "form" of garbage collection, per se. The tooling in particular was nice; normally with Xcode you have a 50/50 chance of it being great or shit. For me it was a mixture of both, really, but the number of crashes or unexpected behaviors was FAR lower than what I had on Android, back when we still used Eclipse, and even after we started to use Android Studio.
Developing iOS apps was also what made me see why iOS apps have that distinctive shine, and why iPhones require less memory (RAM). It was a pleasant experience.
The whole ordeal also left me with a bad taste for Android development. Don't get me wrong, I love my Android phones. But I firmly believe that unless you pay top dollar for an Android device from a manufacturer such as Samsung, Motorola or LG, you will have lag galore. And man... everyone that would try to prove me wrong always had to make excuses later on (no, your $200-$300 Android device just didn't cut it, my dude).
It really sucks sometimes for Android development. I want to know what Google got so wrong that it made people design other tools such as React Native, Cordova, Ionic, PhoneGap, Titanium, Xamarin (which is shit imo), Codename One and many others. With iOS I never considered going for something different from native, since the API just seemed so well designed and far superior to me from an architectural point of view.
Fast forward to 2018 (almost 2019), and Google has had talks about Flutter for a while, making it seem that they are fixing how they want people to design apps.
You see, I firmly believe that tech stacks work in 2 ways:
1: people love a stack so much they start to develop cool ADDITIONS to it (see the awesome-ios repo) to expand on the standard libraries
2: people start to FIX a stack because the implementation is broken, lacking in functionality, or hard to use by itself: see OkHttp, legit all the Square libraries, Butter Knife, etc etc etc and etc
From this I can conclude 2 things: people love developing for iOS because the ecosystem is nice and dev-friendly, and people like to develop for Android in spite of how Google manages its API. Seriously, Android is a great OS, and having apps that work awesomely in spite of how hard it is to create applications for the platform just shows a level of love and dedication that is unmatched.
This is why I find it hard, and even mean, to call out one product over the other. Despite the morals of the 2 leading companies inferred from my post, the developers are what make the situation better or worse.
So just fuck it, and develop on and use what you want.
Honorable mention to PHP and the PHP developer community, which is a mixture of fixing and adding, in spite of the amount of hatred that such coolness gets from a lot of peeps :P
Oh, and I got a couple of mobile contracts on the way; this is why I made this post.
And I still hate developing for Android, even though I love Java.
-
[CONCEITED RANT]
I'm frustrated than I'm better tha 99% programmers I ever worked with.
Yes, it might sound so conceited.
I Work mainly with C#/.NET Ecosystem as fullstack dev (so also sql, backend, frontend etc), but I'm also forced to use that abhorrent horror that is js and angular.
I write readable code, I write easy code that works and rarely, RARELY causes any problem, The only fancy stuff I do is using new language features that come up with new C# versions, that in latest version were mostly syntactic sugar to make code shorter/more readable/easier.
People I have ever worked with (lot of) mostly try to overdo, overengineer, overcomplicate code, subdivide into methods when not needed fragmenting code and putting tons of variables.
People have only ever needed me to explain my code when the codebase was huge (200K+ lines, mostly written by me), so they wouldn't have to spend hours understanding what's going on, or when the customer requested a new technology and I had to explain that technology so they wouldn't have to study it (which is perfectly understandable). For example, I was once forced to use the DevExpress package because they wanted to port a huge application from .NET 4.5 to .NET 8, and rewriting the whole DevExpress logic had a HUGE impact on costs, so I explained it thoroughly and supported them during development because they didn't know DevExpress.
I don't write genius code or clever tricks and patterns. My code works, doesn't create memory leaks or slowness, and mostly passes unit tests on the first run. Of course I introduce bugs too, but that's part of the process.
The point is that other people write unreadable code, and when they pass code around you hear rising chaos, people cursing: "WTF does this even mean, why did he put that here, what the heck is this even supposed to do", you get the drill. And this happens when I read everyone else's code too.
But the opposite doesn't happen. My code is often readable because I do code triple backflips only on personal projects, where I don't have to explain anything to anyone and I can learn new things and new coding styles.
Instead, people want to impress at work, and this results in unintelligible, chaotic code, full of bugs, that people can't read. They want to mix in the coolest technologies because they feel their virtual penis growing as they show off that they are bleeding-edge technology experts and all.
They want to experiment on business code at the expense of all the other poor devils who will have to manage it.
Heck, I even worked with a few Microsoft MVPs.
Those are deadly. They're superfast code-throughput people who combine lots of stuff.
Then they leave the problems to you once they're gone.
This MVP guy, on a big paperwork-digitization project for a big company (a huge project I got called in to work on, consisting of a backend and a frontend web portal), pushed at all costs to put another CDN web project in the middle, plus another Identity Server project: caching with the CDN "to make it faster" and Identity Server for SSO (single sign-on).
We had to do gruesome work dealing with browsers' poor caching management, and when he left, the SSO server started looping after authentication at random intervals, and I had to spend days debugging that nasty stuff he put in.
People definitely can't code, except me.
They have this "first of the class syndrome" which goes to the extent that their skill allows them to and try to do code backflips when they can't even do code pushups, to put them in a physical exercise parallelism.
And most people is like this. They will deny and won't admit, they believe they're good at it, but in reality they aren't.
There are some geniuses out there doing revolutionary code who maybe need to write horrible code to do amazing stuff, and that's ok. And there are also a few people like me, with whom you can work and produce great stuff.
I found one colleague like this, and we had an $800,000 (yes, 800k) project in .NET, which consisted of the renewal of 56 web services, 3 web portals and 2 WinForms applications for our country's main railway transport system. We worked on it as a pair, with a PM from the railway company.
It was estimated at 14 months of work and we took 11, and everything worked wonders. We had a ton of fun doing it, also because their PM was a cool guy; we did an awesome project and the codebase was a jewel. The only thing you couldn't grasp from reading the code is how railway systems work, and that's the only difficult part.
Sigh, these people are making me sick of this job -
Rubber ducking your ass, in a way; I figure things out as I rant and have to explain my reasoning, or lack thereof, every other sentence.
So lettuce harvest some more: I did not finish the linker as I initially planned, because I found a dumber way to solve the problem. I'm storing programs as bytecode chunks broken up into segment trees, and this is how we get namespaces, as each segment and value is labeled -- you can very well think of it as a file structure.
Each file proper, that is, every path you pass to the compiler, has its own segment tree that results from breaking down the code within. We call this a clan, because it's a family of data, structures and procedures. It's a bit stupid not to call it a "class", but that would imply each file can have only one class, which is generally good style but still technically not the case; hence the deliberate use of another word.
Anyway, because every clan is already represented as a tree, we can easily have two or more coexist by just parenting them as-is to a common root, enabling the fetching of symbols from one clan to another. We then perform a canonical walk of the unified tree, push instructions to an assembly queue, and flatten the segmented memory into a single pool onto which we write the assembler's output.
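A toy sketch of that shape, with every name made up for illustration (this is the idea, not the actual VM code): one labeled tree per file, trees parented under a shared root for "linking", then a canonical depth-first walk that flattens everything into a single pool.

class Segment:
    def __init__(self, label, value=None):
        self.label = label
        self.value = value      # leaf payload, e.g. a bytecode chunk
        self.children = {}

    def insert(self, path, value):
        node = self
        for part in path:
            node = node.children.setdefault(part, Segment(part))
        node.value = value

    def walk(self, prefix=()):
        here = prefix + (self.label,)
        if self.value is not None:
            yield here, self.value
        for label in sorted(self.children):
            yield from self.children[label].walk(here)

main = Segment("main"); main.insert(("proc", "entry"), b"\x01\x02")
util = Segment("util"); util.insert(("proc", "helper"), b"\x03")

root = Segment("root")                    # "linking" is just re-parenting
root.children = {"main": main, "util": util}

pool = {".".join(p): v for p, v in root.walk()}
print(pool)  # {'root.main.proc.entry': b'\x01\x02', 'root.util.proc.helper': b'\x03'}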
I didn't think this would work, but it does. So how?
The assembly queue uses a highly sophisticated crackhead abstraction of the CVYC clan, or said plainly, clairvoyant code of the "fucked if I thought this would be simple" family. Fundamentally, every element in the queue is, recursively, either a fixed value or a function pointer plus arguments. So every instruction takes the form (ins (arg[0] .. arg[N])), where the instruction and the arguments may themselves be either fixed values or indirect fetches that must be solved, but in the ~ F U T U R E ~
Thusly, the assembler must be made aware of the fact that it's wearing sunglasses indoors and is high on cocaine, so that these pointers, and the accompanying arguments, can be solved. However, your hemorrhoids are great, and sitting may be painful for long, hard times to come, because even trying to do this kind of John Connor solving of pinky promises that loop on themselves is slowly reducing my sanity.
But minor time travel paradoxes aside, this allows all existing symbols to be fetched at assembly time no matter where exactly in memory they reside; even if the namespace is mutated, and the symbol thereby duplicated, we can still modify the original symbol at the time of duplication to re-route fetchers to its new location. And so the madness begins.
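A minimal version of that queue, reconstructed here purely for illustration: each argument is either a fixed value or a zero-argument callable, and nothing is forced until the final emission pass.

symbols = {}                          # filled in as assembly proceeds
queue = []

def emit(ins, *args):
    queue.append((ins, args))

def fetch(name):
    return lambda: symbols[name]      # deferred lookup, solved in the ~FUTURE~

emit("push", 42)                      # fixed argument
emit("call", fetch("helper"))         # 'helper' has no address yet
symbols["helper"] = 0x40              # later, the symbol lands somewhere

for ins, args in queue:               # final pass forces every deferred arg
    print(ins, [a() if callable(a) else a for a in args])
# push [42]
# call [64]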
Effectively, our code can see the future, and it is not pleased with your test results. But enough about you being a disappointment to an equally misconstructed institution -- we are vermin of science, now stand still while I smack you with this Bible.
But seriously now, what I'm trying to say is that linking is not required as a separate step as a result of all this unintelligible fuckery; all the information required to access a file is the segment tree itself, so linking is appending trees to a new root, and a tree written to disk is essentially a linkable object file.
Mission accomplished... ? Perhaps.
This very much closes the chapter on *virtual* programs, that is, anything running on the VM. We're still lacking translation to native code, and that's an entirely different topic. Luckily, the language is pretty fucking close to assembler, so the translation may actually not be all that complicated.
But that is a story for another day, kids.
And now, a word from our sponsor:
<ad> Whoa, hold on there, crystal ball. It's clear to any tzaddiq that only prophets can prophecise, but if you are but a lowly goblinoid emperor of rectal pleasure, the simple truths can become very hard to grasp. How can one manage non-intertwining affairs in their professional and private lives while ALSO compulsively juggling nuts?
Enter: Testament, the gapp that will take your gonad-swallowing virtue to the next level. Ever felt like sucking on a hairy ballsack during office hours? We got you covered. With our state of the art cognitive implants, tracking devices and macumbeiras, you will be able to RIP your way into ultimate scrotolingual pleasure in no time!
Utilizing a highly elaborated process that combines illegal substances with the most forbidden schools of blood magic, we are able to [EXTREMELY CENSORED HERETICAL CONTENT] inside of your MATER with pinpoint accuracy! You shall be reformed in a parallel plane of existence, void of all that was your very being, just to suck on nads!
Just insert the ritual blade into your own testicles and let the spectral dance begin. Try Testament TODAY and use my promo code FIRSTBORNSFIRSTNUT for 20% OFF in your purchase of eternal damnation. Big ups to Testament for sponsoring DEEZ rant. -
I think studying engineering has really fucked up the way I learn new things... I find it nearly impossible to commit anything to memory that could easily be looked up. On its own that doesn't sound so bad, but now I keep forgetting simple programming syntax and Android design patterns because my brain just keeps saying
"You don't need to remember this, you can find it online in 2 mins"
I'd rather just keep a bookmark of a great navigation drawer tutorial as opposed to learning it myself... I worry now what will happen in my technical interviews, even though I consider myself a good programmer -
I literally don't understand the purpose of a "higher half kernel"
What does it matter where my kernel is mapped in virtual memory?
"It is traditional and generally good to have your kernel mapped in every user process" what the hell does that even mean??
Mapping my kernel into userspace is something I explicitly don't want to do. Like at all. Ever.
And in physical memory it matters even less where it is.
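The standard justification I keep running into, for what it's worth: "mapped in every user process" means the kernel's pages sit, supervisor-only, at the top of each process's page tables, so a syscall or interrupt can be serviced without swapping address spaces and flushing the TLB; userspace can't touch those pages, it just shares the layout. And the "higher half" is simply the top of the canonical range, which leaves the entire lower half to userspace. A back-of-envelope sketch of where that split lands, assuming 48-bit x86-64:

VA_BITS = 48
higher_half = (1 << 64) - (1 << (VA_BITS - 1))
print(hex(higher_half))                # 0xffff800000000000: kernel goes above
print(hex((1 << (VA_BITS - 1)) - 1))   # 0x7fffffffffff: userspace ends here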
I'm so confused right now -
Oh... I don't know what to pick...
So I will pick 3 projects from the 3 stages of my dev "career".
1. Right after I discovered programming and learned how if, while and similar structures worked. The language was Object Pascal with Delphi 2007.
That was a "safe" with a stupidly complicated lock (text inputs, sliders, etc.); it opened a secret folder in the end.
2. Embedded code for an ATmega8 AVR, in Atmel Studio, pure C but without memory management (I didn't even know that existed).
It was a Pip-Boy knockoff: a 16x2 display and a small keyboard connected to the Arduino-like board that I made on a protoboard.
It wasn't that much of a Pip-Boy; it was more of a showoff of the ATmega8's internal systems (ADC, timers, interrupts and such).
3. DataLab. After helping a friend with his master's thesis (we met on Discord, long story; I was in high school), I decided that MATLAB is shit and created a visual scripting environment. Language: C#, .NET 4 (in the latest version).
I remade the whole program from scratch once, significantly improving everything (code reuse, better algorithms, data processing, code readability and edge cases). I picked up good practices from everywhere. I learned how to use Git.
The DataLab project looks just like LabVIEW (I didn't notice it even existed...). It's frozen now because of my mental status, but I'm planning to use it on my CV when I look for jobs over the holidays. There are many things I can improve in that program, but... first I have to fix myself. -
Me and this friend of mine were usually average in college subjects. We were not really bad at them; we just never got any exceptional marks.
So when our 4th sem results came, a third friend of ours got really good marks in some subject, like in the 90s, and we again had marks around the 70s.
At that time we both knew that we understood that subject way better than this topper guy in terms of knowledge; he had just crammed everything about it word for word and got the better marks.
We thus believed that marks don't matter, it's the knowledge, and we both knew it's stupid to cram useless things that could easily be referred to in documentation or on the internet when required.
But last sem, something different happened. Looks like mah boy was a little envious on the inside: he scored a whopping 88%, just shy of that topper friend of ours. I was happy watching his happiness, and he was saying, "Dude, this sem, I will even try to beat that guy in marks."
Even though neither of them is a class topper, they are somehow running the race to be one. I, on the other hand, am still firm in my belief of not cramming stupid shit just to get the status of 'topper'.
Even though cramming subject knowledge is not a total waste, I still believe we should only understand what we need to understand, like learning the moral from a war story, not cramming the actual war dates.
Some might find this quality of mine to be the reason I'm 'average', but I feel totally fine with it. I have trained myself to look up a particular resource online faster than they can look up that resource crammed in their brain's memory, and I wonder if I should feel guilty about it. Yet society will always see me as an 'average' guy and them as 'winners' -
Tell me guys, what would you prefer:
function a(){
..
b(..)
..
b(..)
..
}
function b(p1,p2,p3,p4,p5,p6){
...
}
or
function a(){
..
b(..)
..
b(..)
..
}
function b(
p1,
p2,
p3,
p4,
p5,
p6
){
...
}
If you read this rant before expanding, you got complete context on what function a is, that it calls b 2 times, and how function b looks.
If instead of the first option I had used the 2nd block, you wouldn't even know the 2nd param of function b without expanding this rant.
My point?
I prefer keeping unnecessary info on one line. And a lot of linters disagree by splitting up the code. And most importantly, my arrogant TL disagrees, saying he prefers the split code "for readability" and because "he likes code this way, old-eng1 likes this and old-eng2 likes this".
Why tf does an IDE have a horizontal scrolling option available when you are too stupid to use it?
Ok, I know some smartass is going to point out that I too can use vertical scrolling, but hear me out: I am optimising this!
Case 1: a function with 7 params is NOT split into 7 lines. Let's calculate the effort to remember it.
- Since all params have similar characteristics (they will be of some type, might have defaults, might be a suspendable/async function, etc.), each param takes a similar memory-effort score, say 5sp each.
- Total memory effort = 5sp * 7 = 35sp.
- Say a human has 100sp of fast memory storage; he can use the remaining 65sp to load, say, 5 small lines above or below.
- But since the 5 lines above have already been read and are still visible on screen, they won't need to be loaded again and again, and we can just check the lines below.
- Thus we are able to store 65+35+65 = 165sp, or about 11 lines of code, in our fast memory with just 100sp of brain storage.
Case 2: a function with 7 params IS split into 7 lines.
- In this case all the lines are somewhat similar: 5sp per param line, as they are still similar, which implies the same 35sp for storing the current function and params.
- The remaining 65sp can only be used to store the next 5 lines of 13sp each, as the previous code is no longer visible.
- Plus, if you want to refresh the code above, you have to scroll, which removes the bottom code from the screen, and now your 65sp from the bottom code is overwritten by 65sp of top code.
- Thus, at any one time, you are storing only 6 lines' worth of code info. This makes you slow.
This is some imaginary math, but I believe it works -
It annoys me immensely when I struggle with myself, criticizing my own lack of knowledge in certain areas, and my colleagues say: "You'll learn by doing". No, I won't; that's a foolish dogma.
I won't, and I have never learned by 'doing'. The best results I've obtained have been through understanding every last bit of what's under the hood of a particular functionality. I'm not going to understand the white box by constantly probing the black box; it's just unsatisfactory and insufficient information. It's even dangerous to base yourself on black-box results, because you often get false positives.
I got through university by massive multilateral sensory focus: kinesthetic (writing things down), auditory (listening to the professor), visual (observing graphs and models of the material taught), conscious (mentalizing it all and interlinking information so that later it's accessible from long-term memory). I can confirm this is necessary for the brain, because a neurologist once told me just that.
At least for me, I had the most horrible grades (D's and F's) in freshman year with the 'learn by doing' method, and the best grades (A, A+) with the multi-sensory method in later years, as I matured my studying methods. In fact, with that method I've continuously outsmarted people who had 10 years more experience than me ('experts', 'consultants', ...) but who preferred to stay in the ignorant 'bro zone' rather than learning things properly. Even worse, the day they arrived on the scene, they completely broke the production environment and messed it up for the whole team. I felt like banging my head on my desk. It just makes me disappointed in the system.
If you follow the popular method, you'll soon find yourself facing the same problems that arise from doing what everyone else does. What happens at that point? That's right: they have to call in someone who actually bothered learning things. -
Being too careful and always trying to reduce memory and processor usage might be a bad thing after all. Lengthening development time and putting more stress on the developer just to reduce resource usage is not very sensible when dealing with small to medium sized programs that don't deal with big data/file types.
What made me notice this habit in programmers was when I was smashing my head on the keyboard contemplating which method I should use to store the history of outputs for a fucking text-based program with minimal GUI elements.
Having OCD as a programmer is a nightmare. But thank god it's not as bad as it was a year ago. I couldn't even read something without repeating the same page over and over again, because my stupid brain decided I was not reading it right. WHAT THE FUCK IS READING IT RIGHT? Thank god for my psychiatrist and pills. I can at least work on my projects without wanting to kill myself now! 😂 -
We’re only random people living in random places, speaking random languages, eating random food, sleeping, studying and working random hours. Traveling to random points on a sphere.
Just the random range is different.
Just random stuff happening at the crossroads of two random dots, and the entropy speeds up or slows down.
Nothing special at all.
Just a finite state machine iteration.
I mean, the amount of effort we put into explaining infinity is outstanding.
What if there is no infinity at all?
What if infinity is just a misunderstanding in our interpretation of the world around us? It's just pixels, resolution, Gaussian splatting, quantum state, you name it.
Hey man, the world is flat. Just put it in 2D space. How much space do you need, from a simulation perspective, when your patient's eyes can only see a certain amount of light particles per second through a shitty lens?
Propose world optimization techniques by slowing down subject perception: tiredness introduced. Compress memory: sleep introduced. Limit neurons: CPU power assigned. Deploy on cloud: put it to life. Exit 0: body failure. Exit 1: suicide. kill -9: killed by tty from IP EARTH.X.Y.
What can you do to make the world around this planet seem alive? Make it blink.
We developers are lazy and I believe that nature is even more lazy than us.
You think you're going to an elevator right now? You're going to a preloader. Looking out the window equals playing back precomputed video. It never goes live, just a precomputed FSM. Cars, trains, airplanes? Preloaders everywhere. Highways to split traffic between cities and communication. The road and city planning department is a matrix maintenance department. And don't get me started about space.
Space is empty because it's not even finished. So they put it all behind glass called the Milky Way. You know how glass looked 500 years ago? It was milky, so it's the Milky Way, so we don't see shit.
If space were finished, I'd have started writing this text on Mars and sent it from Earth once done, but no, it's light years, guys; a light year is not a second for matter. A light year is a second only for the injected-thoughts exchange. Thoughts of the global computer called generative AI that they introduced on local computing devices called the cloud.
Even the preloader system is not present; they left us with one map and an overpopulated demo. What a shit hole. I bet they're increasing the temperature right now to erase this alpha build and cash out. Obviously there are so many bugs here that this one can't be fixed anymore. Too many viruses.
Hope for 0days to start happening so we can escape using time travel or something.
I bet they cut the budget or something, moved the team to other projects. Or even worse, the solar system team got laid off, because we are just neurons that were ordered to do it. And now we're stuck in some maintenance mode: no new physics, no new thoughts to pursue, just slow degeneration. I would pay more for the next run and switch to another galaxy far, far away where they at least have more modern light-speed technology.
What do you think about it, Trinity? Not even worth wasting your time on. No white rabbit this time.
I do not recommend this game at this stage of early access.
- only one available map; despite promises of expansions, over the years not a single DLC arrived
- missing space adventures
- no galaxy travel mode, only teaser trailers of what you can do in other “universes”
- developers don't respond to complaints
- despite the diversity of species and buildings at first sight, the world looks too generic
- instead of new features, bots with mind manipulation, A/B testing and data harvesting were introduced
- death anti-cheat mode installed -
What AI model would I use to propagate a series of survival factors and decision-making scenarios, where pursuing the optimal order of activities leads to survival and even prosperity, the worst set of possibilities leads to death, the environment and the sensations being experienced always lead to specific pitfalls, some of these pathways pay off only later, and obstacles like predators can be overcome by simple combinations of objects, in a crude mimicry of invention?
Neural nets don't seem to fit this, given my understanding, but there is a training aspect I'm looking for where the simulated creature dies, develops fear responses, feels pain, avoids pain, remembers things, develops behaviors related to traits in other creatures, has inborn motivations that weight decisions, and learns to prioritize.
I had created a massive dataset of objects, including aspects of semantic memory and episodic memory colored by emotion from past conflict and reward, with the idea that a running average would shape behavior and decide among various behaviors, all the way down to perceptual differences.
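For what it's worth, what's described above sounds closest to reinforcement learning with intrinsic rewards: pain as negative reward, death as a terminal state, fear falling out as low value estimates near danger. A minimal tabular Q-learning sketch of that framing; the 5-cell corridor, the predator and the stick are all invented here purely for illustration:

import random

ACTIONS = ["left", "right", "grab"]
Q = {}

def q(state, action):
    return Q.get((state, action), 0.0)

def step(state, action):
    pos, has_stick = state
    if action == "grab":                                   # the stick lies in cell 0
        return (pos, has_stick or pos == 0), -0.1, False
    pos = max(0, pos - 1) if action == "left" else min(4, pos + 1)
    if pos == 2 and not has_stick:
        return (pos, has_stick), -10.0, True               # predator: pain, death
    if pos == 4:
        return (pos, has_stick), 10.0, True                # food: survival reward
    return (pos, has_stick), -0.1, False                   # small effort cost

alpha, gamma, eps = 0.5, 0.9, 0.2                          # learn/discount/explore
for _ in range(2000):
    state, done = (0, False), False
    while not done:
        if random.random() < eps:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q(state, a))
        nxt, reward, done = step(state, action)
        best_next = 0.0 if done else max(q(nxt, a) for a in ACTIONS)
        Q[(state, action)] += 0  # noop for clarity; update below
        Q[(state, action)] = q(state, action) + alpha * (reward + gamma * best_next - q(state, action))
        state = nxt

print(max(ACTIONS, key=lambda a: q((0, False), a)))        # learns to "grab" first

The "tool use" part (grab the stick before passing the predator) is exactly the kind of delayed-payoff combination mentioned above; richer memory and inborn drives would usually be layered in as extra state variables and extra intrinsic reward terms.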
Any thoughts again? Or will wolf try to steal these too?