Search - "low memory"
-
We had a Commodore64. My dad used to be an electrical engineer and had programs on it for calculations, but sometimes I was allowed to play games on it.
When my mother passed away (late 80s, I was 7), I closed up completely. I didn't speak, locked myself into my room, skipped school to read in the library. My dad was a lovely caring man, but he was suffering from a mental disease, so he couldn't really handle the situation either.
A few weeks after the funeral, on my birthday, the C64 was set up in my bedroom, with the "Programmer's Reference Guide" on my desk. I stayed up late every night to read it and try the examples, and thought about those programs while in school. I memorized the addresses of the sound and sprite buffers, and learnt how programs were managed in memory and stored on the cassette.
I worked on my own games, got lost in the stories I was writing, mostly scifi/fantasy RPGs. I bought 2764 EPROMs and soldered custom cartridges so I could store my finished work safely.
When I was 12 my dad disappeared, was found, and hospitalized with lost memory. I slipped through the cracks of child protection, felt responsible to take care of the house and pay the bills. After a year I got picked up and placed in foster care in a strict Christian family who disallowed the use of computers.
I ran away when I was 13, rented a student apartment using my orphanage checks (about €800/m), got a bunch of new and recycled computers on which I installed Debian, and learnt many new programming languages (C/C++, Haskell, JS, PHP, etc). My apartment mates joked about the 12 CRT monitors in my room, but I loved playing around with experimental networking setups. I tried to keep a low profile and attended high school, often faking my dad's signatures.
After a little over a year I was picked up by child protection again. My dad was living on his own again, partly recovered, and in front of a judge he agreed to be my provisional legal guardian, despite his condition. I was ruled to be legally an adult at the age of 15, and got to keep living in the student flat (a nationwide foster parent shortage played a role).
OK, so this sounds like a sob story. It isn't. I fondly remember my mom, and my dad is doing pretty well, enjoying his old age together with a nice woman in some communal landhouse place.
I had a bit of a downturn from age 18-22 or so, lots of drugs and partying. Maybe I just needed to do that. I never finished any school (not even high school), but managed to build a relatively good career. My mom was a biochemist and left me a lot of books, and I started out as lab analyst for a pharma company, later went into phytogenetics, then aerospace (QA/NDT), and later back to pure programming again.
Computers helped me through a tough childhood.
They awakened a passion for creative writing, for math, for science as a whole. I'm a bit messed up, a bit of a survivalist, but currently quite happy and content with my life.
I try to keep reminding people around me, especially those who have just become parents, that you might feel like your kids need a perfect childhood, worrying about social development, dragging them to soccer matches and expensive schools...
But the most important part is to just love them, even if (or especially when) life is harsh and imperfect. Show them you love them with small gestures, and give their dreams the chance to flourish using any of the little resources you have available.
-
Lads, I will be real with you: some of you show absolute contempt for the actual academic study of the field.
In a previous rant from another ranter, the question of coming up with a binary search implementation was thrown around.
Asking a senior in the field of software engineering and computer science such a question should get a simple answer, depending of course on the type of job application in question. Especially if you are applying as a SENIOR.
I am tired of this strange self-learner mentality that those who have a degree or a deep grasp of these fundamental concepts are somehow beneath you because you learned to push out a website using the New Boston tutorials on YouTube. FOR every field THAT MATTERS a license or degree is held in high regard.
"Oh I didn't go to school, shit is for suckers, but I learned how to chop people up and kinda fix them from some tutorials on YouTube" <---- try that for a medical position.
"Nah it's cool, I can fix your brakes, learned how to do it by reading blogs on the internet" <--- maintenance shop
"Sure, I can write the controller processing code for that Boeing plane! Just got done with a low level tutorial on some websites! What can go wrong!"
(The same goes for military devices which in the past have actually killed mfkers in the U.S)
Just recently a series of people were sent to jail because of a bug in software. Industries NEED to make sure a mfker has aaaall of the bells and whistles needed for running and creating software.
During my master's degree, it fucking FASCINATED me how many mfkers were absolutely, completely NEW to the concept of testing code, some of them with years in the field.
And I know what you are thinking: "fuck you, I am fucking awesome" <--- I AM SURE YOU BLOODY WELL ARE, but we live on a planet with billions of people, and millions of them have fallen through the cracks into software related positions as well as complete degrees, and the degree at LEAST has a SPECTACULAR barrier of entry during that intro to Algos and DS that a lot of bitches fail.
NOTE: NOT knowing the ABSTRACTIONS over the tools that we use WILL eventually bite you in the ASS because you do not fucking KNOW how these are implemented internally.
Why do you think compiler designers, kernel designers and embedded developers make the BANK they make? Because they don't know memory efficient ways of deploying a product with minimal overhead, without proper data structures and algorithmic thinking? NOT EVERYTHING IS SHITTY WEB DEVELOPMENT
SO, if a mfker talks shit about a so called SENIOR for not knowing that first mamase mamasa bloody simple-as-shit algorithm THROWN at you in the first 10 pages of an algo and DS book, then y'all should be offended at the mfker saying that he is a SENIOR, because these SENIORS are the same mfkers that at one point in time try to teach other people.
These SENIORS are the same mfkers that left me a FUCKING HORRIBLE AND USELESS MESS OF SPAGHETTI CODE
Especially to most PHP developers (my main area): y'all would have been well motherfucking served by learning how not to forLoop the fuck out of tables consisting of over 50k interconnected records, WHAT THE FUCK
"LeaRniNG tHiS iS noT neeDed!!" yes IT fucking IS
being able to code a binary search (in that example) from scratch lets me know fucking EXACTLY how solid your thought process is when facing a hard challenge, knowing the base motherfucking case of a LinkedList will damn well make you understand WHAT is going on with your abstractions so as not to fucking violate memory constraints, this-shit-is-important.
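And for the record, this is roughly all that's being asked for, a from-scratch binary search over a sorted array, sketched here in C:

#include <stddef.h>

/* Classic binary search over a sorted int array.
 * Returns the index of key, or -1 if it is not present. */
static int binary_search(const int *arr, size_t len, int key)
{
    size_t lo = 0, hi = len;               /* search window is [lo, hi) */

    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;   /* written this way to avoid overflow */
        if (arr[mid] == key)
            return (int)mid;
        else if (arr[mid] < key)
            lo = mid + 1;                  /* key can only be in the upper half */
        else
            hi = mid;                      /* key can only be in the lower half */
    }
    return -1;                             /* not found */
}

Twenty-odd lines. If writing that, or explaining why mid is computed the way it is, derails a self-proclaimed SENIOR, that tells you everything you need to know.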
So, will your royal majesties at least for the sake of completeness look into a couple of very well made youtube or book tutorials concerning the topic?
You can code an entire website, fine as shit, but you will get tested by my ass in terms of security and best practices, you will get run through these questions, and it had very motherfucking well better be as efficient as I think it should be (I HIRE, NOT YOU, or your fucking blog posts concerning how much MY degree was not needed; oh and btw, MY degree is what made sure I was able to make SUCH decisions)
This will make a loooooooot of mfkers salty, don't worry, I will still accept you as an interview candidate, but if you think you are good enough without a degree, or better than me (has happened, told that to my face by a candidate) then get fucking ready to receive a question concerning: BASIC FUCKING COMPUTER SCIENCE TOPICS
* gays away into the night
-
I had to open the desktop app to write this because I could never write a rant this long on the app.
This will be a well-informed rebuttal to the "arrays start at 1 in Lua" complaint. If you have ever said or thought that, I guarantee you will learn a lot from this rant and probably enjoy it quite a bit as well.
Just a tiny bit of background information on me: I have a very intimate understanding of Lua and its c API. I have used this language for years and love it dearly.
[START RANT]
"arrays start at 1 in Lua" is factually incorrect because Lua does not have arrays. From their documentation, section 11.1 ("Arrays"), "We implement arrays in Lua simply by indexing tables with integers."
From chapter 2 of the Lua docs, we know there are only 8 types of data in Lua: nil, boolean, number, string, userdata, function, thread, and table
The only unfamiliar thing here might be userdata. "A userdatum offers a raw memory area with no predefined operations in Lua" (section 26.1). Essentially, it's for the API to interact with Lua scripts. The point is, this isn't a fancy term for array.
The misinformation comes from the table type. Let's first explore, at a low level, what an array is. An array, in programming, is a collection of data items all in a line in memory (The OS may not actually put them in a line, but they act as if they are). In most syntaxes, you access an array element similar to:
array[index]
Let's look at C, so we have some solid reference. "array" would be the name of the array, but what it really does is keep track of the starting location of the array in memory. Memory in computers acts like a number. In a very basic sense, the first sector of your RAM is memory location (referred to as an address) 0. "array" would be, for example, address 543745. This is where your data starts. Arrays can only be made up of one type; this is so that each element in that array is EXACTLY the same size. So, this is how indexing an array works. If you know where your array starts, and you know how large each element is, you can find the 6th element by starting at the start of the array and adding 6 times the size of the data in that array.
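To make that concrete, here is a tiny C sketch of the "start address plus element size times index" rule (nothing Lua-specific, just the raw array math):

#include <assert.h>
#include <stdio.h>

int main(void)
{
    int arr[8] = {10, 20, 30, 40, 50, 60, 70, 80};

    /* arr is just a starting address; element i lives at
       start + i * sizeof(element), so index 0 IS the start. */
    assert((char *)&arr[6] == (char *)arr + 6 * sizeof(int));

    printf("%d\n", *(arr + 6));   /* exactly the same as arr[6], prints 70 */
    return 0;
}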
Tables are incredibly different. The elements of a table are NOT in a line in memory; they're all over the place depending on when you created them (and a lot of other things). Therefore, an array-style index is useless, because you cannot apply the above formula. In the case of a table, you need to perform a lookup: search through all of the elements in the table to find the right one. In Lua, you can do:
a = {1, 5, 9};
a["hello_world"] = "whatever";
a is now a table holding 4 items (the 4th being the key "hello_world" with value "whatever"), but a[4] is nil because even though there are 4 items in the table, the lookup searches for something "named" 4, not for the 4th element of the table.
This is the difference between indexing and lookups. But you may say,
"Algo! If I do this:
a = {"first", "second", "third"};
print(a[1]);
...then "first" appears in my console!"
Yes, that's correct, in terms of computer science. Lua, because it is a nice language, makes keys in tables optional by automatically giving them an integer value key. This starts at 1. Why? Let's look at that formula for arrays again:
Given array "arr", size of data type "sz", and index "i", find the desired element ("el"):
el = arr + (sz * i)
This NEEDS to start at 0 and not 1 because otherwise, "sz" would always be added to the start address of the array and the first element would ALWAYS be skipped. But in tables, this is not the case, because tables do not have a defined data type size, and this formula is never used. This is why actual arrays are incredibly performant no matter the size, and the larger a table gets, the slower it is.
That felt good to get off my chest. Yes, Lua could start the auto-key at 0, but that might confuse people into thinking tables are arrays... well, I guess there's no avoiding that either way.
-
So, some time ago, I was working for a complete puckered anus of a cosmetics company on their ecommerce product. Won't name names, but they're shitty and known for MLM. If you're clever, go you ;)
Anyways, over the course of years they brought in a competent firm to implement their service layer. I'd even worked with them in the past and it was designed to handle a frankly ridiculous-scale load. After they got the 1.0 released, the manager was replaced with some absolutely talentless, chauvinist cuntrag from a phone company that is well known for having 99% indian devs and for not being able to hear you now. He of course brought in his number two, set about making life miserable and running everyone on the team off; inside of a year the entire team was ex-said-phone-company.
Watching the decay of this product was a sheer joy. They cratered the database numerous times during peak-load periods, caused $20M in redis-cluster cost overruns, ended up submitting hundreds of erroneous and duplicate orders, and mailed almost $40K worth of product to a random guy in Outer Mongolia who is, we can only hope, now enjoying his new life as an instagram influencer. They even terminally broke the automatic metadata, and hired THIRTY PEOPLE to sit there and do nothing but edit swagger. And it was still both wrong and unusable.
Over the course of two years, I ended up rewriting large portions of their infra surrounding the centralized service cancer to do things like, "implement security," as well as cut memory usage and runtimes down by quite literally 100x in the worst cases.
It was during this time I discovered a rather critical flaw. This is the story of the what, the how, and the how-can-you-fucking-even-be-that-stupid. The issue relates to users and their reports and their ability to order.
I first found this issue looking at some erroneous data for a low value order and went, "There's no fucking way, they're fucking stupid, but this is borderline criminal." It was easy to miss, but someone in a top down reporting chain had submitted an order for someone else in a different org. Shouldn't be possible, but here was that order staring me in the face.
So I set to work seeing if we'd pwned ourselves as an org. I spent a few hours poring over logs from the log service and dynatrace trying to recreate what happened. I first tested to see if I could get a user, not something that was usually done because auth identity was pervasive. I discovered the users are identified by INCREMENTAL int values, used as ids in the database and exposed when requesting from the API, so naturally I have a full list of users and their title and relative position, as well as reports and descendants, in about 10 minutes.
I try the happy path of setting values for random, known payment methods and org structures similar to the impossible order, and submitting as a normal user, no dice. Several more tries and I'm confident this isn't the vector.
Exhausting that option, I look at the protocol for a type of order in the system that allowed higher level people to impersonate people below them and use their own payment info for descendant report orders. I see that all of the data for this transaction is stored in a cookie. Few tests later, I discover the UI has no forgery checks, hashing, etc, and just fucking trusts whatever is present in that cookie.
An hour of tweaking later, I'm impersonating a director as a bottom rung employee. Score. So I fill a cart with a bunch of test items and proceed to checkout. There, in all its glory are the director's payment options. I select one and am presented with:
"please reenter card number to validate."
Bupkiss. Dead end.
OR SO YOU WOULD THINK.
One unimportant detail I noticed during my log investigations that the shit slinging GUI monkeys who butchered the system didn't was, on a failed attempt to submit payment in the DB, the logs were filled with messages like:
"Failed to submit order for [userid] with credit card id [id], number [FULL CREDIT CARD NUMBER]"
One submit click later and the user's credit card number drops into lnav like a gacha prize. I dutifully rerun the checkout and got an email send notification in the logs for successful transfer to fulfillment. Order placed. Some continued experimentation later and the truth is evident:
With an authenticated user of any privilege, you could place any order, as anyone, using anyone's payment methods, and have it sent anywhere.
So naturally, I pack the crucifixion-worthy body of evidence up and walk it into the IT director's office. I show him the defect, and he turns sheet fucking white. He knows there's no recovering from it, and there's no way his shitstick service team can handle fixing it. Somewhere in his tiny little grinchly manager's heart he knew they'd caused it, and he was to blame for being a shit captain to the SS Failboat. He replies quietly, "You will never speak of this to anyone, fix this discreetly." Straight up hitler's bunker meme rage.
-
Woohoo! 32k achieved!!! Finally I can post some new rant without risking some sudden overshoot 😁
So putting celebrations aside for a minute: a while ago I noticed a tingle when I stroke my finger across metal areas of my tablet, or the sides of my phone (which probably has metal near it too), while it's charging. And it's been bugging me ever since.
Now, some things to note are that it only happens when my feet are touching the ground through slippers, and that the frequency is so low that I can actually feel the tingle when I slide my finger across the material. This to me at least seems like electricity flows through me into ground, and touching the ground directly provides a path so easy for the electrons to run away that I don't feel it at all. But if I lift my feet off the ground entirely, I just get charged up and after that, nothing else happens.
So those are my ideas. The answers on the subject on the other hand.. absolute cancer. Unsurprisingly, most of them came from Apple users. Here's some of them.
https://discussions.apple.com/threa...
- I've not noticed it, but if you're concerned bring the phone to Apple for evaluation.
- Me too facing same problem.. did u visit apple care?
And one good answer at least...
- google emf sensitivity, its real. You are right, there is a small current flowing through your body, try to limit your usage. The problem with this issue is those who aren't affected (lucky ones for now) will tell you these products are 100% safe. To a degree they are, i used my ipod touch for about 2 years straight with virtually no symptoms. then the tingling started and it gets worse. You will get more sensitive to progressively less powerful things. I dont want to scare you but just limit your usage like i didnt do 🙂
Overall that discussion was pretty good actually, aside from "bring it to the Genius Bar, they'll know for sure and not just sell you another unit". But then there's Reddit.
https://reddit.com/r/iphone/...
- Ok, real reason is probably that the extension cord and/or outlet is probably not grounded correctly. Either that or you are using a cheap knockoff charger.
Either use a surge protector and/or use the authentic Apple Charger.
- It's not the volts that hurt you, it's the amps
- I think you are in deep love with your phone. That tingling sensation is usually referred to as "love" in human language.
- Do less acid, I would advise.
Okay, so that's the real cancer. The grounding issue sounds reasonable despite being wrong. Grounding is actually not needed when your charging appliance doesn't have any exposed metal parts. And isolation from the high voltage to the low voltage side actually happens through things like routing slots in the PCB, creating spark gaps, and using galvanic isolation through things like optocouplers. As for a surge protector? I'm using them to protect my PC and my servers, but the only purpose they serve is to protect from.. you guessed it.. voltage surges, like lightning bolts hitting the grid. They don't do shit for grounding or reducing this tingle! What a fucking tool.
It's not the volts that kill, it's the amps.. yeah I'm sure that the debunking of that is easy to find. Not gonna explain that here. And the rest of it.. yeah it's just fucking cancer.
Now what's the real issue with this tingle? It's actually a Class-Y rated (i.e. kV rated) capacitor that's on the transformer of any switch-mode power supply, including phone chargers. If memory serves me right, it helps with decoupling the switching noise and so on. But as it's connected to the primary side of the transformer, if the cap is sufficiently large and you are sufficiently sensitive, it can actually cause that tingle by passing a fraction of the mains electricity into your body. It's totally safe though, as the power that these caps pass is very small. But to some, it's noticeable.
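For a rough ballpark (assuming a typical 2.2 nF Y-cap and 230 V / 50 Hz mains, both just representative values):

I ≈ 2πfCV = 2π × 50 Hz × 2.2 nF × 230 V ≈ 0.16 mA

That's below the ~0.5-1 mA usually quoted as the threshold of perception, which is why it comes across as a faint tingle while sliding a finger rather than an actual shock.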
Hope you found this interesting! And thanks a lot for bringing me to 2^15. I really appreciate it ♥️
-
Soooo I think I have finally come to the point where I may have to create a YouTube channel, to teach software engineering from the ground up... and teach it the way the universities and everyone else should be teaching it, so that people have a solid foundation, instead of throwing hello world, loops and variables at folks out of the box without any environment context or any low level embedded register, even logic gate, understanding
That lack of understanding is why soooo many college students and younger folks are actually pretty shitty engineers. Everything is high level languages and theoretical concepts to them. Nothing practical, and that's why there are sooo many python and java developers that can't for the life of them understand memory management, low level hardware interfacing etc, because the colleges don't teach it the way it used to be taught.
I seriously fear 30 years from now, or sooner, when there are only a few embedded engineers left until retirement, as without those folks the whole pyramid of electronics falls to pieces.
Java, C#, python, all that shit doesn't run on the bare metal... there's this magical layer of C and assembler that does all the work just so folks can abstract their thoughts.
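And that layer isn't magic, it's mostly just poking memory-mapped registers, which is exactly the kind of thing I'd start the channel with. A toy blink loop in C (the addresses and the pin number here are made up for a hypothetical MCU; the real ones come out of the chip's reference manual):

#include <stdint.h>

/* Hypothetical register addresses -- replace with your chip's real ones. */
#define GPIOA_DIR  (*(volatile uint32_t *)0x40020000u)  /* direction register */
#define GPIOA_OUT  (*(volatile uint32_t *)0x40020004u)  /* output data register */
#define LED_PIN    (1u << 5)                            /* assume an LED on pin 5 */

static void delay(volatile uint32_t n) { while (n--) { } }  /* crude busy-wait */

int main(void)
{
    GPIOA_DIR |= LED_PIN;        /* configure the pin as an output */
    for (;;) {
        GPIOA_OUT ^= LED_PIN;    /* toggle the LED by flipping one bit */
        delay(500000);
    }
}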
Either one of two situations will happen.. the price of electronics will rise because the embedded guys are few and far between, therefore salaries skyrocket... OR everything starts running shit like java on the metal, where there is an overabundance of developers, so their salaries will be low because there are soo many, but the processing power, space, and energy needed to run java natively causes electronics costs to increase
but regardless, 30 years from now, if those script kiddies are building everything, I fear it, cuz there are gonna be memory leaks and overflow issues everywhere.. shit be blowing up more than 4th of July.. lol
Soooo in an effort to prevent that and keep the embedded engineers up, or at least properly educate the script kiddies, I'm gonna make that YouTube channel.. 1 maybe 2 videos a week, 1-2 hour sessions each.. starting at the fucken ground and building up.
-
I've optimised so many things in my time I can't remember most of them.
Most recently, something had to be the equivalent of `"literal" LIKE column` with a million rows to compare. It would take around a second on average per literal to look up, for a service that needs to be high load and low latency. This isn't an easy case to optimise; many people would consider it impossible.
It took me a couple of hours to reverse engineer the data and write a few-hundred-line implementation that would look it up in 1ms on average, with the worst possible case being very rare and not too distant from this.
In another case there was a lookup of arbitrary time spans that most people would not bother to cache because the input parameters are too short-lived and variable to make a difference. I replaced the 50000+ line application acting as a middle man between the application and the database with 500 lines of code that did the lookup faster and was able to implement a reasonable caching strategy. This dropped resource consumption by a factor of ten at least. Misses were cheaper and it was able to cache most cases. It also involved modifying the client library in C to stop it unnecessarily wrapping primitives in objects for the high level language, which was causing it to consume excessive amounts of memory when processing huge data streams.
Another system would constantly download a huge data set for every point of sale, then parse and apply it. It had to reflect changes quickly but would download the whole dataset, containing hundreds of thousands of rows, each time. I whipped up a system so that a single server (barring redundancy) would download it in a loop, parse it using C which was much faster than the traditional interpreted language, then use a custom data differential format, a TCP data streaming protocol, binary serialisation and LZMA compression to pipe it down to the points of sale. This protocol also used versioning for catchup and differential combination for additional reduction in size. It went from being 30 seconds to a few minutes behind to being able to keep up to within a second of changes. It was also using so much bandwidth that it would reach the limit on ADSL connections then get throttled. I looked at the traffic stats afterwards and it dropped from dozens of terabytes a month to around a gigabyte or so a month for several hundred machines. The drop in the graphs was such that you'd think all the machines had been turned off, because that's what it looked like. It could now happily run over GPRS or 56K.
I was working on a project with a lot of data and noticed these huge tables and horrible queries. The tables were all the results of queries. Someone wrote terrible SQL, then to optimise it ran it in the background with all possible variable values and stored the results of joins and aggregates into new tables. On top of those tables they wrote more SQL. I wrote some new queries and query generation that wiped out thousands of lines of code immediately and operated on the original tables, taking things down from 30GB and rapidly climbing to a couple of GB.
Another time a piece of mathematics had to generate all possible permutations and the existing solution was factorial. I worked out how to optimise it to run in n*n, which believe it or not made a world of difference. It went from hardly handling anything to handling anything thrown at it. It was nice trying to get people to "freeze the system now".
I built my own frontend systems (admittedly rushed) that do what angular/react/vue aim for but with higher (maximum) performance, including an in-memory database to back the UI that had layered event-driven indexes and could handle referential integrity (an overlay on the database only revealing items with valid integrity) or reordering and repositioning events very rapidly using a custom AVL tree. You could layer indexes over it (data inheritance) that could be partial and dynamic.
So many times have I optimised things on automatic, just cleaning up code normally. Hundreds, thousands of optimisations. It's what makes my clock tick.
-
I haven't ranted for today, but I figured that I'd post a summary.
A public diary of sorts.. devRant is amazing, it even allows me to post the stuff that I'd otherwise put on a piece of paper and probably discard over time. And with keyboard support at that <3
Today has been a productive day for me. Laptop got restored with a "pacman -Syu" over a Bluetooth mobile data tethering from my phone, said phone got upgraded to an unofficial Android 9 (Pie) thanks to a comment from @undef, etc.
I've also made myself a reliable USB extension cord to be able to extend the 20-30cm USB-A male to USB-C male cord that Huawei delivered with my Nexus 6P. The USB-C to USB-C cord that allows for fast charging is unreliable.. ordered some USB-C plugs for that, in order to make some high power wire with that when they arrive.
So that plug I've made.. USB-A male to USB-A female, in which my short USB-C to USB-A wire can plug in. It's a 1M wire, with 18AWG wire for its power lines and 28AWG wires for its data lines. The 18AWG power lines can carry up to 10A of current, while the 28AWG lines can carry up to 1A. All wires were made into 1M pieces. These resulted in a very low impedance path for all of them, my multimeter measured no more than 200 milliohms across them, though I'll have to verify and finetune that on my oscilloscope with 4-wire measurement.
So the wire was good. Easy too, I just had to look up the pinout and replicate that on the male part.
That's where the rant part comes in.. in fact I've got quite uncomfortable with sentences that don't include at least one swear word at this point. All hail to devRant for allowing me to put them out there without guilt.. it changed my very mind <3
Microshaft WanBLowS.
I've tried to plug my DIY extension cord into it, and plugged my phone and some USB stick into it, of which I'd completely forgotten the filesystem. Windows certainly doesn't support it.. turns out that it was LUKS. More about that later.
Windows returned that it didn't support either of them, due to "malfunctioning at the USB device". So I went ahead and plugged in my phone directly.. works without a problem. Then I went ahead and troubleshot the wire I'd just made with a multimeter, to check for shorts.. none at all.
At that point I suspected that WanBLowS was the issue, so I booted up my (at the time) problematic Arch laptop and did the exact same thing there, testing that USB stick and my phone there by plugging it through the extension wire. Shit just worked like that. The USB stick was a LUKS medium and apparently a clone of my SanDisk rootfs that I'm storing my Arch Linux on my laptop at at the time.. an unfinished migration project (SanDisk is unstable, my other DM sticks are quite stable). The USB stick consumed about 20mA so no big deal for any USB controller. The phone consumed about 500mA (which is standard USB 2.0 so no surprise) and worked fine as well.. although the HP laptop dropped the voltage to ~4.8V like that, unlike 5.1V which is nominal for USB. Still worked without a problem.
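Quick sanity check on the cord itself, using the ~0.2 Ω loop resistance I measured and the ~500 mA the phone pulls:

V_drop = I × R ≈ 0.5 A × 0.2 Ω = 0.1 V

So my wire only accounts for about 0.1 V of sag; the rest of the drop down to ~4.8 V has to come from the laptop's port and hub, not from the extension cord.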
So clearly Windows is the problem here, and this provides me one more reason to hate that piece of shit OS. Windows lovers may say that it's an issue with my particular hardware, which maybe it is. I've done the Windows plugging solely through a USB 3.0 hub, which was plugged into a USB 3.0 port on the host. Now USB 3.0 is supposed to be able to carry up to 1A rather than 500mA, so I expect all the components in there to be beefier. I've also tested the hub as part of a review, and it can carry about 1A no problem, although it seems like its supply lines aren't shorted to VCC on the host, like a sensible hub would. Instead I suspect that it's going through the hub's controller.
Regardless, this is clearly a bad design. One of the USB data lines is biased to ~3.3V if memory serves me right, while the other is biased to 300mV. The latter could pose a problem.. but again, the current path had a very low impedance of 200 milliohms at most. Meanwhile the direct connection that omits the ~200 milliohm extension wire worked just fine. Even 300mV wouldn't degrade significantly over such a resistance. So this is most likely a Windows problem.
That aside, the extension cord works fine in Linux. So I've used that as a charging connection while upgrading my Arch laptop (which as you may know has internet issues at the time) over Bluetooth, through a shared BNEP connection (Bluetooth tethering) from my phone. Mobile data since I didn't set up my WiFi in this new Pie ROM yet. Worked fine, fixed my WiFi. Currently it's back in my network as my fully-fledged development host. So that way I'll be able to work again on @Floydian's LinkHub repository. My laptop's the only one who currently holds the private key for signing commits for git$(rm -rf ~/*)@nixmagic.com, hence why my development has been impeded. My tablet doesn't have them. Guess I'll commit somewhere tomorrow.
(looks like my rant is too long, continue in comments)
-
I am writing a graphics library for low-power, small-memory-footprint microcontrollers to implement Pebble-watch-like vector animations (the attached gif shows what I am trying to implement)
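The core of it is basically keyframe interpolation without floats, so the math stays cheap on FPU-less chips. A simplified sketch of the kind of primitive I mean (just a sketch, not finished library code):

#include <stdint.h>

typedef int32_t fx16_t;              /* 16.16 fixed point */
#define FX_ONE (1 << 16)

/* Linear interpolation between two keyframe values, t in [0, FX_ONE]. */
static inline fx16_t fx_lerp(fx16_t a, fx16_t b, fx16_t t)
{
    /* a + (b - a) * t, widened to 64 bits so the multiply can't overflow */
    return a + (fx16_t)(((int64_t)(b - a) * t) >> 16);
}

/* Interpolate one vector point between two keyframes. */
typedef struct { fx16_t x, y; } point_fx;

static point_fx point_lerp(point_fx p0, point_fx p1, fx16_t t)
{
    point_fx out = { fx_lerp(p0.x, p1.x, t), fx_lerp(p0.y, p1.y, t) };
    return out;
}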
But I couldn’t find time 😔5 -
I do not like the direction laptop vendors are taking.
New laptops tend to feature fewer ports, making the user more dependent on adapters. Similarly to smartphones, this is a detrimental trend initiated by Apple and replicated by the rest of the pack.
As of 2022, many mid-range laptops feature just one USB-A port and one USB-C port, resembling Apple's toxic minimalism. In 2010, mid-class laptops commonly had three or four USB ports. I have even seen an MSi gaming laptop with six USB ports. Now, much of the edges is wasted "clean" space.
Sure, there are USB hubs, but those only work well with low-power devices. When attaching two external hard drives to transfer data between them, they might not be able to spin up due to insufficient power from the USB port or undervoltage caused by the impedance (resistance) of the USB cable between the laptop's USB port and hub. There are USB hubs which can be externally powered, but that means yet another wall adapter one has to carry.
A non-replaceable battery [the shortest-lived component] means difficult repairs and no more reserve batteries, as well as no extra-sized battery packs. When the battery expires, one might have to waste four hours at a repair shop for a replacement that would have taken a minute on a 2010 laptop.
The SD card slot is being replaced with inferior MicroSD or removed entirely. This is especially bad for photographers and videographers who would frequently plug memory cards into their laptop. SD cards are far more comfortable than MicroSD cards, and no, bulky external adapters that reserve the device's only USB port and protrude can not replace an integrated SD card slot.
Most mid-range laptops in the early 2010s also had a LAN port for immediate interference-free connection. That is now reserved for gaming-class / desknote laptops.
Obviously, components like RAM and storage are far more difficult to upgrade in more modern laptops, or not possible at all if soldered in.
Touch pads increasingly have the buttons underneath the touch surface rather than separate, meaning one has to be careful not to move the mouse while clicking. Otherwise, it could cause an unwanted drag-and-drop gesture. Some touch pads are smart enough to detect when a user intends to click and lock the movement, but not all. A right-click drag-and-drop gesture might not be possible due to the finger on the button being registered as a touch. Clicking with a short tap can be unreliable and sluggish. While one should have external peripherals anyway, one might not always have brought them along. The fallback input device is now even less comfortable.
Some laptop vendors include a sponge sheet that they want users to put between the keyboard and the screen before folding it, "to avoid damaging the screen", even though making the laptop two millimetres thicker could achieve the same without relying on a sponge sheet. So they want me to carry that bulky thing around everywhere? How about no?
That's the irony. They wanted to make laptops lighter and slimmer, but that made them adapter- and sponge sheet-dependent, defeating the portability purpose.
Sure, the CPU performance has improved. Vendors proudly show off in their advertisements which generation of Intel Core they have this time. As if that is something users especially care about. Hoo-ray, generation 14 is now yet another 5% faster than the previous generation! But what is the benefit of that if I have to rely on annoying adapters to get the same work done that I could formerly do without those adapters?
Microsoft has also copied Apple in demanding an internet connection before Windows 11 will set up. The setup screen says "You will need an Internet connection…" - no, technically I would not. What technically stands in the way of Windows 11 setting up offline? After all, previous Windows versions like Windows 95 could do so 25 years earlier. But so could far more recent versions. Thankfully, Linux distributions do not do that.
If "new" and "modern" mean more locked-in, less practical, and harder to repair, I would rather have "old" than "new".
-
FFUUUuucccckkk me sideways. So I decided to look into USB type-C's power delivery and alt modes, because I kinda want to make an adapter card to run my displays over a single cable. TLDR of the rest: USB-C has some huge capabilities which no one is interested in using, since it's way too complex to handle for what it's worth in the end.
Now PD alone is kinda OK to deal with, since a lot of powerbanks use it and some hobby guys have documented how to work with it. I find it really odd though that you NEED to use a dedicated IC for using the configuration channel to negotiate how much power you can draw. Why didn't the USB standard use some simple 5V low speed signalling? Also, the standard says that you only have to implement 5V 0.6A, with every other power level being optional. (This is also true for cables. Most manufacturers use only the USB 2.0 standard for them and brag about how fast type-C is. ლ(ಠ益ಠლ) )
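To be fair, the plain current advertisement part can be done without a PD IC: a sink can just measure the voltage that the source's Rp pull-up produces on the CC pin with any old ADC. It's only the higher PD voltages and power levels that need the actual BMC-coded protocol on CC, and that's where the dedicated chip comes in. A sketch (the ~0.66 V and ~1.23 V thresholds are what I remember from the Type-C spec, so double-check them; adc_read_cc_mv() stands in for whatever ADC routine your MCU has):

#include <stdint.h>

/* Stand-in for the MCU's ADC routine: returns the CC pin voltage in mV. */
extern uint16_t adc_read_cc_mv(void);

typedef enum { CC_NOT_ATTACHED, CC_DEFAULT_USB, CC_1A5, CC_3A0 } cc_advert_t;

/* Sink-side detection of the source's Rp current advertisement. */
static cc_advert_t read_source_advertisement(void)
{
    uint16_t mv = adc_read_cc_mv();

    if (mv > 1230) return CC_3A0;          /* Rp advertising 3.0 A */
    if (mv > 660)  return CC_1A5;          /* Rp advertising 1.5 A */
    if (mv > 200)  return CC_DEFAULT_USB;  /* default USB power (500/900 mA) */
    return CC_NOT_ATTACHED;                /* no pull-up seen on this CC pin */
}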
Now to the alt modes. These motherfuckers are a real shitshow to deal with. First you need a mux to deal with USB-C's two-way insertion, so your signals won't get flipped. Next thing is that you have four lanes at your disposal in alt mode, which you can either use for four DisplayPort lanes or for two DP lanes and two USB 3.0 lanes. (You always get USB 2.0.) Now you may think that there would be one simple chip to do it all? Nope, you need at least two at the price of $6 each: one for PD and one for alt modes. Both are very hard to solder (QFN, 0.5 mm pitch, 40+ pins). TI ended up being the only one with a decent offering of ICs that do what I need. As for working with them, you would think that you just slap a simple MCU on there that communicates over I2C or SPI to configure the chips? Nope! You program the chip's memory, from which it configures itself. And the programming is done with some TI tool, which gives me no idea as to how you can handle everything with no control logic behind it.
Looking into alternative ICs leaves me with Cypress Semi. And their documentation is basically a total mess. I wanna know what that chip is good for and what I need to do to make it work. I don't care about technical details mixed with marketing jargon nobody understands. And I really despise that I have to register just to download a datasheet, especially since there is no info about it on the main page.
And this whole rant hasn't even touched the topic that USB-C alt mode only carries DP and nothing else. So you better hope that you have DP++ so you can use a passive conversion.
This was my TED Talk about USB-C. Some info in it may be subject to my stupidity and errors, as it is currently 02:15 in the morning and I need some sleep.
-
The advantage of programming in Android Studio is that you can read a chapter of your favorite book, watch a series or do a little exercise... Thank you gradle
-
I just installed Opera Mini on my PSP. That isn't very exciting on its own, although I am stoked that my website does in fact render on a device from 2009, with the helpful guidance of a laptop from 2004 that's doing the hotspot duties for this thing.
No, what really got me stoked is that Opera still supports these old platforms, and how small they managed to make it. The .jar file for Opera Mini 4.5 is ~800kB large. There's a .jad file as well but it's negligible in size and seems to be a signature of sorts.
Let that sink in for a moment. This entire web browser is 800kB. Firefox meanwhile consistently consumes 800 MEGABYTES.. in MEMORY. So then I got to thinking for a moment: how on earth did they manage to cram an entire functioning web browser into 800kB? Hell, what makes up a web browser anyway?
The answer to that question I arrived at is as follows. You need an engine to render the web page you receive. You need a UI to make the browser look nice. And finally you need a certificate store to know which TLS certificates to trust. And while probably difficult to make, I think it should be possible to do in 800k. Seriously, think about it. How would you go *about making* a web browser? Because I've already done that in the past.
Earlier I heard that you need graphics, audio, wasm, yada yada backends too.. no. Give your head a shake. Graphics are the responsibility of the graphics driver. A web browser shouldn't dabble with those at all. Audio, you connect to PulseAudio (in Linux at least) and you're done. Hell, I don't even care about ALSA or OSS here. You just connect to the stuff that does that job for you. And WebAssembly.. God, I could rant about that shit all day. How about making it a native application? It's not like it's actual Assembly, the kind used for BIOS and low-level drivers. And we already have a better language for the more portable stuff, called C.
Seriously, think about it. Opera - a reputable browser vendor - managed to do it in 800kB on a 12 year old device. Don't go full wank about your framework shit in the comments. And don't you fucking dare tell me that there's more to it. They did it, for crying out loud. Now take a look at your shitpile of JS code and refactor that shit already. Thank you.
-
Sooooo I got new RAM for my low-budget server today.
Now introducing: 24 GB DDR2 RAM
XD yes I know DDR2 is slow, but it's very, very cheap ECC memory.
(still not enough for Chrome)
-
I feel so guilty.
I had to make a hotfix today. It is the ugliest piece of shit code I ever intentionally created. But there was no other way. I swear there was no other fucking way!
My boss just assigned this to me. But because she thinks this needs to be a hotfix and can't wait for the next release we just have to change the server and not the client side of our application.
So I had to give our server a memory, so that it knows which high level method on the client the multiple low level calls to it are coming from.
It just doesn't make sense logically.
I mean I feel like I killed someone. And just so that we get fewer writes to our DB. I mean yes, in some edge cases it is a huge speed-up...
But nothing this fix solves is a new bug.
I'm gonna take a shower now. For like an hour
-
WINDOWS 10 low on memory error! How do I fucking get rid of this? It keeps restarting my computer 😫😫😫! PHOTOSHOP USER
-
A bug is born
... and it's sneaky and slimy. Mr. Senior-been-doing-it-for-years commits some half-assed shitty code, and blames the failed tests on the availability of CI licenses. I decided to check what was causing this shit nevertheless; turns out he forgot to flag parts of the code consistently using his new compiler defines, so some parts would get compiled while others that were needed wouldn't ..
Now all tests pass, except ones that need CI license. The PR is done, you can use your preferred way to take my changes.
So after I spot those missing checks causing the tests to fail, as well as another bug in yet another test case, and yet another disastrous memory related bug, which weren't detected by the tests of course .. I ponder my options .. especially based on our history .. if I say anything he will get offended, or at best the PR will get delayed while he is in denial arguing back even longer and dependent tasks will get delayed and the rest of the team will be forced to watch this show in agony, he also just created a bottleneck putting so many things at stake in one PR ..
I am in a pickle here .. should I just put review comments and risk opening a can of worms, or should I just mention the very obvious bugs, or should I do nothing at all .. I end up reaching out to the PM and explaining the situation. In complete denial, he still believes it's a license problem and goes on ranting about how another project suffered the same fate .. bla bla bla chipset ... bla bla bla project .. bla bla bla back in whatever team .. then only when I started telling him:
These issues were even spotted by "Bob" earlier, since for some reason you just dismissed whatever I just said ..
("Bob" is another more sane senior developer in the team, and speaks the same language as the PM)
Only now do I get his attention! He then starts going through the issues with me (for some reason he thinks he is technical enough to get them) .. He now, to some extent, believes the first few obvious bugs .. but the more disastrous bug he is having a really hard time wrapping his head around .. Then, desperate as I had become, I suggest we just get this PR merged for the sake of the other tasks, after maybe fixing the obvious issues, and meanwhile we create another task to fix the bug later .. here he chips in:
You know what, that memory bug seems like a corner case, if it won't cause issues down the road after merging let's see if we need even to open an internal fix or defect for it later. Only customers can report bugs.
I am in awe at how low the bar can get. I try again and suggest we at least leave a comment for the next poor soul running into that bug, so they won't be banging their head against the wall for 2 hours straight trying to figure out why store X isn't there unless you call something last, or never call it, or shit like that (the sneaky slimy nature of that memory bug) .. He even dismissed that and rather went on saying (almost literally again): It is just that Mr. Senior had to rush things and communication can be problematic sometimes .. (bla bla bla) back in the "Sunken Ship Co." days, we had a team from the open source community ..
Stuff like what Richard Stallman writes in Linux kernel code reviews can offend people ..
Feeling grossed out, and with the weird taste in my mouth that I only get on a bad hangover day, all sorts of swear words and profanity running through my head like a wild hungry squirrel on hot asphalt chasing a leaky chestnut transport ... I tell him whatever floats your boat, but I just feel really sorry for whoever might have to deal with this bug in the future ..
I just witnessed the team giving birth to a sneaky slimy bug .. heard it screaming and saw it kicking .. and I might live long enough to see it all grown up, having a feast with other bug buddies in this stinky swamp of Uruk-hai piss and Orc feces.
-
"Suggest an AV/AM product, Avast refuses to install."
I do malware research as a hobby and have for a while, so I can generally spot when something's up before I even run a program. If I'm unsure about it (or know something's up and wanna see its effects for S&Gs) I throw it into one of a variety of VMs, each with a prepped, clean, standardized "testing" state.
I see no point to AV/AM products, especially as they annoy me more than anything since they can't be told not to reach into and protect VMs (thereby dirtying up my VM state, my research, crashing the VM hypervisor and generally being *really* annoying) and they like to erase samples from a *read-only, MOUNTED* VHDX.
However, normal people need them, so I usually suggest this list:
• MBAM is good and has a (relatively) low memory footprint, but doesn't have free realtime protection.
• Avast is very good as it picks up a lot, but it eats a FUCKTON of resources. It also *really* likes to crash VM hypervisors if it sees anything odd in them.
• AVG is garbage. Kill it with fire.
• Using Windows Defender is like trying to block the rain with an umbrella made of 1-ply toilet paper.
• herdProtect is amazing as it's basically a VirusTotal client but it's web-based and not currently available to be downloaded. (Existing copies still work!)
• Kaspersky. Yes, it spied on US gov't workers. No, they don't care about anyone BUT US gov't workers. Yes, it's pretty good.
• BitDefender: *sees steam game* "Is this ransomware?"
hope this helps
-
We had one Android app to be developed for a charity org, for data collection in a groundwater-level-increase competition among villages.
Initial scope was very small & feasible. Around 10 forms with 3-4 fields in each to be developed in 2 months (1 for dev, 1 for testing). There was a prod version which had similar forms with no validations etc.
We had received the prod source, which was total junk. No KT was given.
The existing source was full of spelling mistakes, in the era of spell/grammar checking tools.
There were rural names for classes and variables, written in a regional language using English letters. That regional language is somewhat known to some developers, but even they don't know the meanings of those rural names. This cost us greatly in visualizing the data flow between entities. Even Google Translate wasn't reliable for this language, due to low Internet penetration in that language's region.
OOP wasn't followed, so the exact same code exists in 10 places. If an error or bug needed to be fixed, it had to be fixed in all those 10 places.
There were no foreign key relationships in the database, while there actually were logical relations among the different entities.
There were no created/updated timestamps in records on the app side to provide an audit trail.
A small part of the existing source was quite good, with Fragments, MVP etc., while the other part was ancient Activities with business logic.
We had to support Android 4.0 to 9.0 across many screen sizes & resolutions, without any target devices issued to us by the client.
Then the Corona lockdown happened, & during it the client-side professionals suddenly became over-efficient.
The client started adding requirements like very complex validation with inter-entity dependencies. Then they started filing bugs from the prod version against us.
Let's come to the developers' expertise,
2 developers with 8+ years of experience who don't know how to resolve conflicts in a git merge, conflicts which they themselves created by not following git-friendly coding practices, like only appending new implementations to existing classes for easy auto-merge etc.
They think that handling click events is what development is.
They don't want to think about OOP or well structured code. They mostly don't want to re-use code, & when they copy-paste, they think that's called re-use.
They wanted to follow old-school Java development within the memory-scarce Android app lifecycle on end-user phones. They don't understand memory leaks, even when they're pinpointed by memory leak detection tools (LeakCanary etc.).
Now 3.5 months are over, that competition was called off for this year due to Corona, & development is still ongoing.
We are nowhere close to completion, even for the initial internal QA round.
On top of this, nothing is billable, so it's like financial suicide.
Remember, whatever is said here is only 10% of what we faced.
- An Engineering lead in a half-billion-dollar company.
-
I'm getting beat up pretty bad by Rust. I like it so far but man is it hard. Imposter-syndrome is almost making me lose motivation. Almost, but I won't quit, one day I'll get there.
I think the primary reason I'm having such a hard time is that I'm trying to learn stuff that prevents me from making mistakes that I have never actually run into. I know a bit of the theory but have no hands-on experience with double-free errors, memory leaks and weird low-level stuff. I read the documentation and mostly understand what stuff is for, but when I go to write code I'm just like "now what?". I don't have enough experience to know when and where to use some concepts and I'm super lost. I don't know where to start, and the feeling of being completely overwhelmed by all sorts of new stuff is at the same time exciting and frightening.
I have never, as a programmer, thought something was hard. All of my past knowledge required dedication, work and patience, but I wouldn't say I ever felt something was *hard*. But Rust... damn. Rust is hard.
Hopefully at the end of this super steep learning curve I'll know a lot more stuff, have stronger "dev powers" and be one step closer to being as knowledgeable as some of you guys around here whom I look up to.
-
!rant
How to earn a lot of money as a programmer?
So this question might sound a little naive and too simple, but earning a lot of money is what we all want after all right? Collecting experiences from people in the business should be a good idea.
So this is the position I am in:
I am a German student in my 13th year of school (which means I will graduate this summer) and I am very interested in information technology. I know C++ pretty well by now and I have already built a rendering engine using OpenGL for a game I want to make, which I am very proud of.
I would love to turn this passion into my profession and that's why I plan to attend a dual course of computer science next year (dual means that I will be employed at a company (or similar) in parallel to my studies).
But what direction should I be going in if I want to make big money later on? I am ready to spend a lot of time and work on this life project but I don't know which directions are the most promising. I hate being a tiny gear in a huge machine that just has to keep spinning to keep the machine alive, I want to be part of a real project (like most people probably) and possibly sell a product (because I think that is how you really make money).
Now I know there is no magic answer to this, but I bet many people here have had experiences they can share, and this could help a lot of people direct their path in a more success-oriented way.
I personally am especially interested in fields which are relatively low-level and close to memory (C++), go hand in hand with physics and 3D simulation, and are somewhat creative and allow new solutions. (These are not hard lines, I just thought I should give a little direction as to what I know already and what I am interested in.)
But really, I am interested in any work you are likely to earn a lot of money with.
-
Avoid ACPICA if at all possible. It's one garbage tier cluster fuck of bad design, horrible documentation and downright misleading and wrong code
It's meant to consist of an ASL compiler, disassembler, debugger, dumper, various user space utilities and a kernel-resident OSPM implementation, *if* you can figure out what belongs to what. Even just compiling this pile of trash is a mystery in itself. Think you need the source files in source/common? EEEEH, wrong. Well, at least partially, since most of them seem to be for the user space stuff..? Other ones *are* needed on the other hand. At least the disassembler and/or debugger and/or dumper components seem to reference them. Not that I could figure out how to compile those anyway. The real path to your goal seems to be to ignore a seemingly arbitrary subset of source and header files until your linker stops complaining
There's also a bunch of configuration defines, some of which *you* define, some defined *for* you, based, again, on others. Of course most of them do stupid shit. Enabling the debugger automatically enables debug logging. Enabling the disassembler force-enables debug allocation tracking... What?
The code itself isn't of much help either. Looking in "os_specific/service_layers" you find what look to be reference implementations of ACPICA functions for certain OSes like Windows and Unix. Of course I had a look, because AcpiOsReadMemory is supposed to read physical memory and I don't know how I would even implement that. But hey, osunixxf.c (xf for interface... of course) should tell me. I'll let you see for yourself in the attached image. Apparently it does fuck all and just returns AE_OK. No error, no logging, no nothing. Just OK. As you can imagine, AcpiOsWriteMemory doesn't do much more either.
...okay, so maybe physical memory accesses aren't actually used and these functions are some sort of relic from past times? Nope! They are absolutely necessary for doing low level device interaction. WTF. So finally I went to the Linux source and checked how *they* implemented them, and just as I thought, these functions are anything but no-ops...
...So for what fucking reason do these stupid interface implementations even exist but to purposefully mislead you?? They aren't used for fucking anything! As far as I know Windows doesn't even *use* ACPICA, and Linux has its own fork with working implementations... They just sit there, just to tell you how NOT to do it
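For contrast, a working AcpiOsReadMemory can't just return AE_OK; it has to map the physical address and actually read it, roughly the way the Linux port does. A rough sketch (the kmap_physical/kunmap_physical helpers are placeholders for whatever your kernel provides, and the signature is from my memory of the ACPICA headers, so verify it):

#include "acpi.h"   /* ACPICA public headers */

/* Placeholder helpers -- substitute your kernel's own physical-mapping API. */
extern void *kmap_physical(ACPI_PHYSICAL_ADDRESS addr, ACPI_SIZE len);
extern void  kunmap_physical(void *virt, ACPI_SIZE len);

ACPI_STATUS AcpiOsReadMemory(ACPI_PHYSICAL_ADDRESS Address, UINT64 *Value, UINT32 Width)
{
    void *virt = kmap_physical(Address, Width / 8);
    if (!virt)
        return AE_ERROR;

    switch (Width) {
    case 8:  *Value = *(volatile UINT8  *)virt; break;
    case 16: *Value = *(volatile UINT16 *)virt; break;
    case 32: *Value = *(volatile UINT32 *)virt; break;
    case 64: *Value = *(volatile UINT64 *)virt; break;
    default:
        kunmap_physical(virt, Width / 8);
        return AE_BAD_PARAMETER;
    }

    kunmap_physical(virt, Width / 8);
    return AE_OK;
}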
So that's some of my thoughts about ACPICA. Note that I haven't even used it as a library yet, I just got it to compile and link and it already fucked with me this much.
There's also so much more I didn't mention, like that you *have* to modify the ACPICA source in order to get your own platform header working (else #error), even though the docs explicitly instruct you not to, but you get the point
Don't use ACPICA if you don't have to. Save your sanity for something that's worth it -
<opinion>
You may be a prod ninja, but I believe that every dev should have a decent level of exposure to a low-level language (or several). Sure, you can make an HTTP server, do sentiment analysis, topic modeling, set up multi-node clusters, write ORM queries against DBs and all sorts of awesome stuff with Python/Ruby/PHP/JS/Go etc., but none of them teaches you what happens at the kernel level. Things like memory leaks, threading, multiprocessing and memory allocation can only be learnt properly from a low-level language (see the sketch after this block).
</opinion>
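Not part of the opinion above, just a minimal, made-up C sketch of the kind of bug a high-level runtime quietly absorbs for you and C forces you to notice:

/* This loop leaks memory: the buffer returned by malloc() is never freed.
 * Valgrind or AddressSanitizer will flag it immediately. */
#include <stdio.h>
#include <stdlib.h>

static char *make_greeting(const char *name)
{
    char *buf = malloc(64);          /* heap allocation the caller now owns */
    if (buf == NULL)
        return NULL;
    snprintf(buf, 64, "hello, %s", name);
    return buf;
}

int main(void)
{
    for (int i = 0; i < 1000000; i++) {
        char *msg = make_greeting("world");
        (void)msg;                   /* BUG: missing free(msg) -> leak */
    }
    return 0;
}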
P.S. Not a C/C++ fanboy. I'm a python dev 😄5 -
When you're talking about how awful it is and they think you mean your desk
I have an HP Stream
This thing is a McFucking™ potato (and it's HP so I can't even get inside and replace the stock parts with good ones without breaking it)9 -
Have been playing the pirated version of Rust for 30+ hours with no issues.
Decide to buy the game, and every fucking time the game turns into Chrome and consumes all my RAM, forcing Windows to show the low-on-memory dialog.
Lesson learned I guess.7 -
It's 2022 and Firefox still doesn't allow deactivating video caching to disk.
When playing videos from some sites like the Internet Archive, it writes several hundreds of megabytes to the disk, which causes wear on flash storage in the long term. This is the same reason cited for the use of jsonlz4 instead of plain JSON. The caching of videos to disk even happens when deactivating the normal browsing cache (about:config property "browser.cache.disk.enable").
I get the benefit of media caching, but I'd prefer Firefox not to write gigabytes to my SSD each time I watch a somewhat long video. There is actually the about:config property "browser.privatebrowsing.forceMediaMemoryCache", but as the name implies, it is only for private browsing. The RAM is much more suitable for this purpose, and modern computers have, unlike computers from a decade ago, RAM in abundance, which is intended precisely for such a purpose.
The caching of video (and audio) to disk is completely unnecessary as of 2022. It was useful over a decade ago, back when an average computer had 4 GB of RAM and a spinning hard disk (HDD). Now, computers commonly have 16 GB RAM and a solid-state drive (SSD), which makes media caching on disk obsolete, and even detrimental due to weardown. HDDs do not wear down much from writing, since it just alters magnetic fields. HDDs just wear down from the spinning and random access, whereas SSDs do wear down from writing. Since media caching mostly involves sequential access, HDDs don't mind being used for that. But it is detrimental to the life span of flash memory, and especially hurts live USB drives (USB drives with an operating system) due to their smaller size.
If I watch a one-hour HD video, I do not wish 5 GB to be written to my SSD for nothing. The nonstandard LZ4 format "mozLZ4" for storing sessions was also introduced with the argument of reducing disk writes to flash memory, but video caching causes multiple times as much writing as that.
The property "media.cache_size" in about:config does not help much. Setting it to zero or a low value causes stuttering playback. Setting it to any higher value does not reduce writes to disk, since it apparently just rotates caching within that space, and a lower value means that it just rotates writing more often in a smaller space. Setting a lower value should not cause more wear due to wear levelling, but also does not reduce wear compared to a higher value, since still roughly the same amount of data is written to disk.
Media caching also applies to audio, but that is far less in size than video. Still, deactivating it without having to use private browsing should not be denied to the user.
The fact that this can not be deactivated is a shame for Firefox.2 -
If you look at the "lightweight business laptops" or business netbooks section of the market, you'll notice almost all of them seem to have 4 GB.
Bitch, that's barely enough to run Windows 10.
Looks like a market opening. If I were still doing upgrades and repairs I'd blame everything on low memory (well, a lot of slowness can loosely be attributed to lack of memory) and upsell new machines with more and better RAM. Target 6 GB, which is cheaper than 8 and offers a minor but noticeable boost, just enough to passably justify the increased cost to whoever is responsible for authorizing the upgrade.
I don't understand what's so hard to grasp about this. It's like companies trip over dollars to pick up dimes.2
Does anyone here have any good resources for an introduction to embedded, low-level development, or anything on advanced C concepts? I've been having trouble trying to step into more complicated topics like bit manipulation and the stuff I can do with memory management. Any advice is also appreciated.30
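(For context, the bit manipulation in question looks something like this tiny C sketch; the "status register" and flag names are made up, but these idioms come up constantly in embedded work.)

#include <stdint.h>
#include <stdio.h>

#define FLAG_READY  (1u << 0)
#define FLAG_ERROR  (1u << 3)

int main(void)
{
    uint8_t reg = 0;              /* pretend this is a device status register */

    reg |= FLAG_READY;            /* set a bit    */
    reg &= (uint8_t)~FLAG_ERROR;  /* clear a bit  */
    reg ^= FLAG_READY;            /* toggle a bit */

    if (reg & FLAG_ERROR)         /* test a bit   */
        puts("error flag set");

    printf("register = 0x%02x\n", reg);
    return 0;
}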
-
LXC, no doubt.
I mean to be fair, LXC is an amazing container runtime once you manage to set it up. But setting it up is the hard bit. Starting off with LXC 2.x, it was a nightmare to find out how to get things like the storage backends working. But with ZFS it ended up being alright. Find some arcane values to stick in the /etc/lxc/default.conf to use ZFS as the backend and then the default storage location on those ZFS pools (I'll get back to that later), and it worked alright. Again, once it works it's great, but setting it up and finding the right configuration keys is absolute hell.
So, LXC 2.x for a while and a few months ago I finally ended up upgrading to 3.x. Every single configuration key changed. Every single one of them, and that's why I had to 1) learn LXC all over again, and 2) redeploy each and every one of my containers. That process is still not entirely completed. ZFS backend was once again a dive into arcane configuration keys found on forums and whatnot. Yeah.. official documentation has none of it. Oh and in 3.x you now also have to dodge the torrent of "just use LXD m8" messages. Yeah, very helpful when LXD is also the ONLY way to reasonably configure it. Absolutely beautiful. Oh and as far as the ZFS default storage location goes (such as ssd/lxc/ct)? Yeah forget about it. There's no configuration option for it anymore, and the default is "lxc". In ZFS lingo that means that LXC has the audacity to demand a whole pool for itself. No. No you don't deserve a whole pool for yourself. But hey at least you can define the storage location to use in the lxc-create command! Every single time you have to define it in lxc-create. I abstracted it away into my own LXC interface, so no big deal really. But yeah... That could absolutely be better. And in 2.x it was actually better.
Oh and btrfs, the filesystem I'd like to use on low memory systems because ZFS' ARC is too much on such systems? Yeah forget about it. I still have no idea how to do it. Thank you LXC and its amazing documentation!
And if you want the icing on the cake for LXC's terrible documentation, see their repo's index page at https://github.com/lxc/lxc/.... Yeah, it's totally still at 2.x... That's how well they maintain that. Even Debian has 3.x now. And if you look at the branches, you'll find that even 4.x is already available and considered stable. -
Did I get old or did I just finish plucking all the low hanging fruit?
When I started on the programming journey about a decade ago, everything felt exciting and I learnt a lot of new things every day (variables, loops, methods, classes, etc.).
Now, a decade later, I am more concerned with the overall system design, algorithm usage (Big O notation), how reliable the system is, how the configurations are set up and how easy they are to change.
I now notice that I don't really learn anything new. Everything feels the same.
Want redundancy? Use more servers.
Want faster performance? Make a parallel system.
Want the program to run on a low-end device? Think about how memory and storage will be used by the system.
Is this a stage everyone goes through, like puberty? Or am I just having a mid-life crisis?
PS: I haven't even reached 30 yet but I feel too old.4
I feel like writing about, or telling people about, the time I jumped from Windows 7 Ultimate to Windows 10. (I'm not against 10, but I'm never updating after what happened to me.)
It all starts when none of my games will play, due to a possible issue with my graphics card. I look up "3D source game bug" and not many results pop up. I go to Microsoft's Q&A areas and ask the question, but to my surprise nothing they say makes sense: "Clean the pins of your graphics card, make sure you verify the games on Steam". I verified the games and they checked out as perfectly fine. I don't have access to my graphics card because this is a laptop, sadly not a tower.
Two months pass and my computer is already showing signs of stress, like it didn't want to live in a sense. It was three times slower than when I was on Windows 7 and it was unallocating areas of my main hard drive where I could make virtual hard drives.
Instantly I start looking up Linux distros and find Linux Mint. 17.3 was the current version at the time. I downloaded it and burned it onto a DVD-ROM and rebooted my computer. I loaded into the disc and to my surprise it seemed almost like Windows 7, apart from the Linux part. I grab my external hard drive and partition it to hold the Linux distro and leave it plugged in, in case Windows 10 does actually fail.
On December 19, a few months after Windows 10 was released, I start my laptop to try and continue my studies in video game development. But to my surprise, Windows 10 had finally crashed permanently. The screen flickered blue and black, and an error box appeared saying Loginui.exe failed to start. I look at it for a solid minute as my computer had, in a sense, just committed suicide.
I reboot thinking it would fix the error but it didn't. I couldn't log in anymore.
I force shutdown the laptop and turn it back on putting it into safe mode.
To my surprise loginui.exe works and I sign in. I look at my desktop, the space wallpaper I always admired, the sound files, screen shots I had saved.
I go into File Explorer and grab everything out of the default hard drive Windows was installed on. Nothing but 400 GB got left behind, and that was mainly garbage prototypes I had made and Windows itself. I formatted my external hard drive and placed everything on it. Escaping Windows 10 with around 100 GB of useful data, I looked at the shutdown button I would be clicking for the final time.
I click it and try to boot into normal Windows 10. But it doesn't work. It flickers and the error pops up once more.
I force it to shut down, insert the Linux Mint disc I made earlier, and format the default hard drive through Linux. I was done. 10 gave me a lot of shit. Java wouldn't work, my games had a functional UI but nothing popped up on screen except a black abyss, and it wouldn't even let me try to update my graphics card; apparently my AMD Radeon 5450 was up to date according to the AMD Radeon 5000 series drivers.
I installed Linux Mint and, thinking the games would actually play, I open Steam and launch Half-Life 2 to check if Linux would be nicer to me than Windows 10 had been.
To my surprise the game ran. The scene from Highway 17 popped up on screen and the UI was fully functional. But it was playing at 10-15 fps rather than the usual 60-70 fps. I keep looking at my drivers and see my graphics card isn't in use. I do some research and it turns out I have a hybrid laptop.
Intel HD Graphics and an AMD Radeon 5450, and it was using the Intel and not the AMD. Months of testing and attempts at getting the games to work at high frame rates pass, and the damn thing still runs at a terribly low fps. Finally I give up. I ask my mom for a Windows 7 disc and she says we can't afford it. A few months pass and I finally get a Windows 7 installation disc with money I've saved up. Proudly I put it into my optical disc drive and install it to my main hard drive, deleting Linux completely. I announced to all my friends that my computer was back in working order and I install everything I needed: Steam, Skype, Blender, and Unity, as well as all my games. I test Half-Life 2 and it's running exceptionally smoothly; I test Minecraft at max settings and it's working beautifully. The computer was functioning properly once again and my life as a developer started as I modeled things in Blender, learned beginner C# and learned a lot of Batch. Today the computer still runs at a great speed, and I warn others about what happened to me after I installed Windows 10 on my machine, if they are thinking of switching from 7 or 8 on an older machine.
Truly, the damage to my data cannot be undone. But the maintenance, the work, the tests, all are a memory of how Windows 10 ruined me, and every night before the one-year anniversary of Windows 10's release, I took out the battery of my laptop and unplugged it from AC power, just so Windows 10 couldn't show its DLLs, batch scripts, VBS scripts, anything on my computer. But now, after all this has happened and I have recovered, I only have a story to tell5
!rant
Went from uni to my car to drive back home. The engine doesn't start, and a low-oil-level warning is showing up. Hmmm. I opened the hood and checked the oil level. It was empty. First thought: I drove here with no oil, so I broke the engine. Great... I bought some oil and refilled it. Still the same problem. I called my insurance company and my mechanic. And then a brilliant thought evolved: did I turn off the ignition with the secret switch today? Yep, that was it. Had to call everybody again and cancel my AC request. Gosh, I hate having the memory of a goldfish...3
Also: hi everybody, my first !rant3
Who actually started the reign of mixed-character passwords? Because seriously, it sucks to have an unnecessarily complex password! Like websites and apps requiring passwords to contain upper/lowercase letters, numeric characters and symbols, without considering the average user with a low memory threshold (i.e., me).
Let's push the complaint aside and return to the actual reason a complex password is required.
As we already know, passwords are made complex so they can't be easily guessed by the password crackers hackers use, and the primary reason for adding symbols and numbers to a password is simply to stretch the space of possible guesses.
Now let's take a look at the logic behind a password cracker.
To hack a password,
1) The password cracker will usually look up a dictionary of passwords (this step is necessary for any possible outcome).
2) It attempts to log in multiple times with the list of passwords found (in most cases successful entries are found for passwords shorter than 8 chars).
3) If none was successful by the end of the dictionary, the cracker mutates each password in the dictionary to match the popular standards of most websites (i.e., first letter uppercase, a number at the end followed by a symbol. Thanks to those websites!)
4) If any password was successful, the cracker adds it to a new dictionary called a "pattern builder list" (this gives the cracker an edge on that specific platform, because most websites force a specific password pattern anyway)
In comparison:
>> Mygirlfriend98##
would be cracked faster compared to
>> iloveburberryihatepeanuts
Why?
Because the former is short and follows a popular pattern.
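To put rough numbers on that (my own back-of-the-envelope sketch, not gospel; the charset sizes are assumptions), here's a tiny C program comparing the raw brute-force search space of the two examples:

/* Brute-force search space = charset_size ^ length; printed in bits (log2)
 * so the numbers stay readable. Pattern and dictionary attacks shrink the
 * first one much further, which is the real point. Link with -lm. */
#include <math.h>
#include <stdio.h>

static double keyspace_bits(double charset_size, double length)
{
    return length * log2(charset_size);   /* log2(charset^length) */
}

int main(void)
{
    /* "Mygirlfriend98##": 16 chars drawn from ~94 printable characters */
    printf("short + mixed charset : %.0f bits\n", keyspace_bits(94, 16));

    /* "iloveburberryihatepeanuts": 25 chars, lowercase letters only */
    printf("long  + lowercase only: %.0f bits\n", keyspace_bits(26, 25));

    return 0;
}

It prints roughly 105 bits for the short mixed-charset password versus about 118 bits for the long lowercase one, and that's before dictionary and pattern attacks cut the first one down even further.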
In reality, password crackers don't specifically care about the uppercase-lowercase-number-symbol bullshit! They care more about the length of the password, the pattern of the password, and previously used entries (either from keyloggers or from previously hacked passwords).
So requiring a humanly complex password is largely pointless, because it's a bot that is being dealt with, not another human.
My devRant password is a short story of *how I met my first girlfriend*. Good luck to a password cracker!6
I'm thinking about what language to dive into next.
I already have a pretty good knowledge of Go and mediocre knowledge of C and Java.
So far I thought about...
1. C++, as I need it for school and it runs on literally anything.
2. Rust, as it seems to be spreading and the combination of low-level control, memory safety and abstraction seems pretty appealing to me.
3. Kotlin, specifically Kotlin/Native, as it combines Java-like high-level programming with native speed.
4. Nim, as it combines high-level techniques with C-like freedom.
What do you people recommend, or would you suggest something completely different?6
Wouldn't Node.js be a good fit for a performant message queue? I mean, it's async from the bottom up, has a low memory footprint and good Rx support. Or what specifics does a good MQ require?1
-
Sometimes my hatred for code is so.. overwhelming that I think I need a sabbatical or should even stop altogether.
Let's face it. All code sucks. Just on different levels.
Want to go all bare metal? Love low-level bit fiddling? Well, have fun hunting for concurrency and memory corruption bugs. Still feel confident? Get ulcers from a large C/C++ code base already in production, where something in the shared-memory, function-pointer magic is not totally right?
So you strive for cleaner abstractions, fancy the high-level stuff? Well, can you make sense of gcc's template error messages? Are you ready for the monad, leaving behind the mundane everyday programmers who still wonder about the scope of x and xs?
Wherever you go, isn't it a stinking shit pile of entropy and arbitrary human-made conventions? You're just getting more familiar with them, so you don't question them; they become your second skin, you become proficient - congrats, you're a member of the 1337.7
I'm thinking about getting a Raspberry Pi 3 or an Odroid-C2.
(Specs at the end)
It's to host a simple PHP server and maybe a GitLab server, both for personal use.
Should I go with the better performance or the better community support?
Odroid specs
System-on-chip used : Amlogic S905
CPU: 1.5 GHz 64-bit quad-core ARM Cortex-A53
Memory: 2 GB LPDDR3 RAM at 912 MHz
Storage: MicroSDHC slot, eMMC module socket
Graphics: Mali-450 MP3
Connectivity: 4× USB 2.0, micro-USB OTG, HDMI 2.0, Gigabit Ethernet (8P8C), Infrared, 40× GPIO ports
Raspberry specs:
SoC: Broadcom BCM2837
CPU: 4× ARM Cortex-A53, 1.2GHz
GPU: Broadcom VideoCore IV
RAM: 1GB LPDDR2 (900 MHz)
Networking: 10/100 Ethernet, 2.4GHz 802.11n wireless
Bluetooth: Bluetooth 4.1 Classic, Bluetooth Low Energy
Storage: microSD
GPIO: 40-pin header, populated
Ports: HDMI, 3.5mm analogue audio-video jack, 4× USB 2.0, Ethernet, Camera Serial Interface (CSI), Display Serial Interface (DSI)8 -
WE: javaagent-based monitoring, as seen in this screenshot <attached>, is reporting a full old-gen, a full young-gen, one of the survivor spaces full, and sky-rocketing full GCs right before the service outage.
WE: container monitoring in this screenshot <attached> shows that the application's memory very suddenly peaked at the MAX value and plateaued there. Then container monitoring goes blank, suggesting a complete outage of a few minutes. After that, monitoring starts again with memory usage reported at low levels and immediately spiking back to MAX again, suggesting the container crashed and had been respawned by an orchestrator. This repeats a few times throughout the day.
they: I did not find any evidence of the application running out of memory. Maybe our monitoring is not working correctly?
we: *considering updating our resumes* -
So I’m reading this book called Hacking: The art of exploitation and I’ve got to admit. It’s one of my favourite books I’ve read. It really gets into the nitty gritty of how programs are laid out in memory and goes over how assembly works, among some other low level concepts. Highly recommend.1
-
A beginner learning Java here. I have been beating around the bush on the internet for the past decade. As per my understanding up to now: take a bottle of water. The bottle may be considered a CLASS and the water in it the objects (atoms); objects may be of the same kind, or may differ in some properties. Another way of understanding it: Human Being is a CLASS, and male and female are objects of the class Human Being. Here again, the objects may differ in properties such as gender, age and body parts. A Zoo might be a class, and the animals its objects: elephants (objects), tigers (objects) and others too. The human properties above can also be added in the Zoo class: male, female, body parts, age, eating habits, crawlers, four-legged, two-legged, flying, water animals, mammals, herbivores, carnivores... whatever. This is my understanding so far. Corrections are always welcome; I will be happy if my answer gets improved, comment below.
And for the basic level:
Learn about input and output devices.
Then, memory-wise: cache (quick access), RAM (temporary memory for runtime access) and hard disk (permanent memory) all sit in the machine around the CPU. To express the above memory types clearly from my own experience: I am writing this answer on my phone with mobile data on. If I suddenly switch my phone off and on during this time, the cache is there for instant access to navigation, network, etc.; RAM is temporary, so my Quora answer will be lost, as it was being stored in RAM before the switch-off; but my Quora app, my gallery and the rest are on permanent internal storage (generally hard disks on a PC) and won't be affected. All of this happens around the CPU, right? Okay, now one question: who manages all these commands, inputs and outputs? That's the software: Windows, macOS/iOS, or Android for mobiles. These are the managers of the computer's components for the different OS's.
Java is a high-level language, whereas computers understand only binary, a low-level language of 0's and 1's. They understand only things like 00101, 1110000101, 0010, 1100 (let these be ABCD in binary). Numbers are coded in 0's and 1's, lowercase letters too, and the other symbols as well. Our programs will be converted into bytecode for the JVM (Java Virtual Machine). The program we write is given to the JVM, which acts as an interpreter. But not in C.
Let us C…
Do comment. Thank you6 -
I bought a computer a while back off Kijiji (Canada's Craigslist) for a really good price. Today I decided I was going to upgrade the RAM, since I got a sick deal on some Corsair Vengeance 8 GB sticks online...
And just before installing it, I realized the fucker had used low-profile RAM in his build for a reason: he (for some fucking reason) decided to route the airflow for the system by placing the cooling fan directly over the first 2 memory slots.
Guess whose 5-minute memory upgrade just turned into an hour of re-routing all the airflow in the PC and having to redo all the fan wiring.
I shouldn't complain, I mean I got this computer a couple years back for like $400, but still, wtf man...4 -
Are there any best practices for binding C libraries to a higher-level language?
Like, regarding concurrency safety, memory safety, general fault tolerance, and glue data structures with low overhead?2
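(Not an authoritative answer, just a sketch of one common pattern; all names here are made up. An opaque handle, explicit create/destroy, plain integer status codes and caller-owned buffers keep the FFI surface thin and easy to wrap from Python's ctypes, Go's cgo, JNA and the like.)

/* Hypothetical FFI-friendly C API: opaque handle, explicit lifetime,
 * status codes, caller-owned buffers. Compiles and runs as-is. */
#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>

typedef struct mylib_ctx { int calls; } mylib_ctx;   /* opaque to callers */

typedef enum { MYLIB_OK = 0, MYLIB_ERR_NOMEM, MYLIB_ERR_INVALID } mylib_status;

mylib_status mylib_create(mylib_ctx **out)
{
    if (out == NULL) return MYLIB_ERR_INVALID;
    *out = calloc(1, sizeof(**out));
    return *out ? MYLIB_OK : MYLIB_ERR_NOMEM;
}

void mylib_destroy(mylib_ctx *ctx) { free(ctx); }

/* Example operation: uppercase `in` into the caller-provided `out` buffer. */
mylib_status mylib_process(mylib_ctx *ctx, const char *in, size_t in_len,
                           char *out, size_t out_cap, size_t *out_len)
{
    if (ctx == NULL || in == NULL || out == NULL || out_cap < in_len)
        return MYLIB_ERR_INVALID;
    for (size_t i = 0; i < in_len; i++)
        out[i] = (char)toupper((unsigned char)in[i]);
    *out_len = in_len;
    ctx->calls++;
    return MYLIB_OK;
}

int main(void)   /* tiny self-test so the sketch actually runs */
{
    mylib_ctx *ctx = NULL;
    char out[16];
    size_t n;
    if (mylib_create(&ctx) == MYLIB_OK &&
        mylib_process(ctx, "ffi", 3, out, sizeof(out), &n) == MYLIB_OK)
        printf("%.*s\n", (int)n, out);
    mylib_destroy(ctx);
    return 0;
}

The idea is to keep ownership obvious: the host language ties mylib_destroy to its own finalizer/defer mechanism, and no memory crosses the boundary unless the caller allocated it.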
Wow, yesterday was fun!
I had a rather buggy piece of code, it was bad when I first wrote it, and then I fixed it up, and it was still bad. Now I rewrote almost all of it, and it's much better.
Bad? How? Well, it was in Go, and it's basically an agent meant to execute tasks one at a time and report the results back home (live). Now, while it worked, it was really flimsy: race conditions, way too much blocking, bad logic, and some very bad bugs.
So I had to rewrite it. Time for a quick primer on the design: you have a queue, a task gets added to the queue, the task manager runs the task. In the meantime, the agent is polling the host with the latest output from the task, and also receives new tasks to run (if there are any).
Seems like something that's meant for a messaging queue, you ask? Well, that would be true if each task were able to run on any random agent, but each task is only meant to run on the agent it's assigned to (the tasks are of an administrative nature, a la apt-get), so having a whole separate service is a tad overkill.
So rewriting required rethinking how the tasks are executed by the task manager. I spent a day on this, it was fun, and I ended up copying Go contexts (very simple model, very useful). Why copy and not reuse? Because this is meant to be low-memory code, so any extra parts are problematic, and I didn't really see a use for having a whole context, I just needed a way to announce that a task is done.
Anyways, if you're interested in seeing how the implementation worked out: https://github.com/chabad360/covey/...1
How did mid-2000s computer users get along with just 1 GB of RAM or less?
As of today, anything less than 8 GB of RAM seems impractical. A handful of tabs in a web browser and file manager can quickly fill that up.
Shortly after booting, 2 GB of RAM are already eaten up on today's operating systems.
When I occasionally used an older laptop computer with 6 GB of RAM (because it has more ports and better repairability than today's laptops; before upgrading the memory), most of the time over 5 GB were in use, and that did not even include disk caching.
It appears that today's web browsers are far more memory-intensive than 2000s web browsers, even if we do similar things people did in the 2000s: browsing text-based pages with some photos here and there, watching videos, messaging and mailing, forum posting, and perhaps gaming. Tabbed browsing already was a thing in the 2000s. Microsoft added tabs to their pre-installed browser in 2006, back when an average personal computer had 1 GB of RAM, and an average laptop 512 MB!
Perhaps a difference is that people today watch in 720p or 1080p whereas in the 2000s, people typically watched at 240p, 360p, or 480p, but that still does not explain this massive difference. (Also, I pick a low resolution anyway when mostly listening to a video in background.)
One could create a swap file to extend system memory, though that is not healthy for an SSD in the long term. On computers, RAM is king.14 -
Should I be bothered that I have added 200 lines of badly written code to my Android project (which had readable, organized code before), making it almost unreadable? BTW the app's memory usage is still as low as before.3
-
This question might make you lose a brain cell because of stupidity in the question. Read with caution
Is there a way to compile a game for Windows from Linux in Unreal Engine? I did google some posts, but the answer was either to use a virtual machine (which will not be done), to use the theoretical method of using MinGW (though the forum posts state that it will be tricky business), or to use a Windows machine. I have dual-booted Windows and Linux on my machine.
However, since the machine has a 512 GB SSD, most of the storage space is devoted to Unreal Engine, which takes 47 gigs by itself, and since I have a lot of programs installed I have a usable 20 gigs left out of a 145 gig partition. Windows has around 318 gigs of storage, but I have 100 gigs free at most. So after installing the Windows SDK, Visual Studio with extensions, Unreal Engine and some other stuff, I don't have much space left for myself. I need that much space since I install a lot of games to my SSD. So now I can't load my bigger projects for playing on my Windows install. I could use my HDD, which is mostly used for backups and 100+ gig stuff. HDDs are of course far slower than SSDs, which shouldn't be a problem, except that last time I used Visual Studio it ate more than 2 gigs of RAM for a solution, meaning the compiler has very little memory left to actually compile with, so for any large files the HDD becomes more of a bottleneck.
Oh, and I can't upgrade my SSDs or RAM because I don't have enough money.
Thanks for the answers in advance4