Search - "memory is crap"
-
In a user-interface design meeting over a regulatory compliance implementation:
User: “We’ll need to input a city.”
Dev: “Should we validate that city against the state, zip code, and country?”
User: “You are going to make me enter all that data? Ugh…then make it a drop-down. I select the city and the state, zip code auto-fill. I don’t want to make a mistake typing any of that data in.”
Me: “I don’t think a drop-down of every city in the US is feasible.”
Manager: “Why? There cannot be that many. Drop-down is fine. What about the button? We have a few icons to choose from…”
Me: “Uh…yeah…there are thousands of cities in the US. Way too much data for anyone to realistically scroll through.”
Dev: “They won’t have to scroll, I’ll filter the list when they start typing.”
Me: “That’s not really the issue and if they are typing the city anyway, just let them type it in.”
User: “What if I mistype Ch1cago? We could inadvertently be out of compliance. The system should never open the company up for federal lawsuits.”
Me: “If we’re hiring individuals responsible for legal compliance who can’t spell Chicago, we should be sued by the federal government. We should validate the data the best we can, but it is ultimately your department’s responsibility for data accuracy.”
Manager: “Now now…it’s all our responsibility. What is wrong with a few thousand item drop-down?”
Me: “Um, memory, network bandwidth, database storage, who maintains this list of cities? A lot of time and resources could be saved by simply paying attention.”
Manager: “Memory? Well, memory is cheap. If the workstation needs more memory, we’ll add more”
Dev: “Creating a drop-down is easy and selecting thousands of rows from the database should be fast enough. If the selection is slow, I’ll put it in a thread.”
DBA: “Table won’t be that big and won’t take up much disk space. We’ll need to set up stored procedures and data import jobs from somewhere to maintain the data. New cities, name changes, etc.”
Manager: “And if the network starts becoming too slow, we’ll have the Networking dept. open up the valves.”
Me: “Am I the only one seeing all the moving parts we’re introducing just to keep someone from misspelling ‘Chicago’? I’ll admit I’m wrong or maybe I’m not looking at the problem correctly. The point of redesigning the compliance system is to make it simpler, not more complex.”
Manager: “I’m missing the point to why we’re still talking about this. Decision has been made. Drop-down of all cities in the US. Moving on to the button’s icon ..”
Me: “Where is the list of cities going to come from?”
<few seconds of silence>
Dev: “Post office I guess.”
Me: “You guess?…OK…Who is going to manage this list of cities? The manager responsible for regulations?”
User: “Thousands of cities? Oh no…no one in our area has time for that. The system should do it.”
Me: “OK, the system. That falls on the DBA. Are you going to be responsible for keeping the data accurate? What is going to audit the cities to make sure the names are spelled correctly and associated with the correct state?”
DBA: “Uh..I don’t know…um…I can set up a job to run every night”
Me: “A job to do what? Validate the data against what?”
Manager: “Do you have a point? No one said it would be easy and all of those details can be answered later.”
Me: “Almost done, and this should be easy. How many cities do we currently have to maintain compliance?”
User: “Maybe 4 or 5. Not many. Regulations are mostly on a state level.”
Me: “When was the last time we created a new city compliance?”
User: “Maybe, 8 years ago. It was before I started.”
Me: “So we’re creating all this complexity for data that, realistically, probably won’t ever change?”
User: “Oh crap, you’re right. What the hell was I thinking…Scratch the drop-down idea. I doubt we’ll have a new city regulation anytime soon, and how hard is it to type in a city?”
Manager: “OK, are we done wasting everyone’s time on this? No drop-down of cities...next …Let’s get back to the button’s icon …”
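For what it's worth, validating against the four or five cities we actually regulate is a few lines, not a subsystem. A minimal sketch (the city list and function name are hypothetical):

```c
#include <stddef.h>
#include <string.h>

/* The handful of cities we actually have compliance rules for.
   Hypothetical list -- the real one would come from the compliance dept. */
static const char *compliance_cities[] = {
    "Chicago", "New York", "Los Angeles", "Houston",
};

/* Returns 1 if 'city' is on the allowlist, 0 otherwise. */
int is_compliance_city(const char *city)
{
    size_t n = sizeof(compliance_cities) / sizeof(compliance_cities[0]);
    for (size_t i = 0; i < n; i++)
        if (strcmp(compliance_cities[i], city) == 0)
            return 1;
    return 0;
}
```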
Simplicity 1, complexity 0.
-
Apple has a real problem.
Their hardware has always been overpriced, but at least before it had defenders pointing out that it was at least capable and well made.
I know, I used to be one of them.
Past tense.
They have jumped the shark.
They now make pretentious hipster crap that is massively overpriced and doesn't have the basic features (like hardware ports) to enable you to do your job.
I mean, who needs an ESC key? What is wrong with learning to type CTRL-[ instead? Muscle memory? What's that?
They have gone from "It just works" to "It just doesn't work" in no time at all.
And it is Developers who are most pissed off. A tiny demographic who won't be visible on the financial bottom line until their newly absent software suddenly makes itself known two, three years down the line.
By which time it is too late to do anything.
But hey! Look how thin (and thermally throttled) my new laptop is!
-
Guys...
DevRant logged in automatically on my new phone. I don't know how I feel about this.
Or maybe I've already forgotten.
Help. I'm getting old and senile.
-
My code review nightmare part 3
Performed a review on/against a workplace 'nemesis'. I didn't follow the department standards document (cause I couldn't care less about spacing, sorted usings, etc.) and identified over 80 bugs, logic errors, n+1 patterns, memory leaks (yes, even in .NET devs can cause 'em), and general bad behavior (e.g. 'eating' exceptions that should be handled or at least logged).
Because 'Jeff' was considered a golden child (that's another long TL;DR), his boss and others took major offense and demanded I justify my review, item by item.
About 2 hours into the meeting, our department mgr realized embarrassing Jeff any further wasn't doing anyone any good and decided to take matters into his own hands. Thinking 'well, its about time he did his job', I go back to my desk. About an hour later..
Mgr: "I need you in the conference room, RIGHT NOW!"
<oh crap>
Mgr: "I spoke to Jeff and I think I know what the problem is. Did you ever train him on any of the problems you identified in the review?"
Me: "Um, no. Why would I?"
Mgr: "Ha!..I was right. So lets agree the problems are partially your fault, OK?"
Me: "Finding the bugs in his code is somehow my fault?"
Mgr: "Yes! For example, the n+1 problem in using the WCF service, you never trained him on how to use the service. You wrote the service, correct?"
Me: "Yes, but it's not my job to teach him how to write C#. I documented the process and have examples in the document to avoid n+1. All he had to do was copy/paste."
Mgr: "But you never sat with Jeff and talked to him like a human being? You sit over there in your silo and are oblivious to the problems you cause. This ends today!"
Me: "What the...I have no idea what you are talking about. What in the world did Jeff tell you?"
Mgr: "He told me enough and I'm putting an end to it. I want a compressive training class developed on how to use your service. I'll give you a month to get your act together and properly train these developers."
3 days later, I submit the PowerPoint presentation and accompanying docs. It was only one WCF service with a handful of methods. Mgr approves the training, etc., etc., we execute the 'training', and Jeff submits a code review a couple of weeks later. From over 80 issues down to around 50. The poop hits the fan again.
Mgr: "What's your problem? When are you going to take your responsibility seriously?"
Me: "Its pretty clear I don't have the problem. All the review items were also verified by other devs. Its not me trying to be an asshole."
Mgr: "Enough with the excuses. If you think you can do a better job *you* make the code changes and submit them for Jeff for review. No More Excuses!"
A couple of days later, I make the changes, submit them for review, and Jeff really couldn't say too much other than "I don't see this as an improvement".
TL;DR, I had been tracking the errors generated by the site due to the bugs prior to my changes. After deployment, # of errors went from thousands per hour to maybe hundreds per day (that's another story) and the site saw significant performance increases, fewer customer complaints, etc..etc.
At a company event, the department VP hands out special recognition awards:
VP: "This award is especially well earned. Not only does this individual exemplify the company's focus on teamwork, he also went above and beyond the call of duty to serve our customers. Jeff, come on up and get this well deserved award."19 -
Magento is a special kind of tool.
- >20GiB of files? ✔
- >1 GB database? ✔
- Memory needed for scripts >768 MB? ✔
- Script max. exec. time 5 hours? ✔
- Slow ass website? FUCKING ✔
- Slower deployment than a vote on a country wide legislation? FUCKING ✔
- Shitty crap pile of STD-ridden code? I BET YOUR STINKING ✔
Magento, sincerely, please die in agony.
-
Had nothing to do today, so I thought I'd test the migration of SVN to Git in GitLab.
Boss sent me a mail today saying that when I migrate we need to preserve the history, so I actually have to put some effort into it. *sigh*
Shout-out to the GitLab documentation at this point.
That's probably the best doc I've ever read...
Well so I tried to use svn2git. And well...
Who the fuck thought that this piece of shit software is in any way usable?
Holy crap!
If it fails, it just does so without any info why. Even in verbose mode.
And the RAM usage? What the actual fuck?
This whole thing is a complete memory leak!
32 gigs of RAM full in minutes and the whole system starts to stall!
And then, when I thought it finally ran through...
Bam, another git checkout error...
Googling for that error, I then found something: a version of svn2git made in .NET Core.
Didn't expect much but I tried it anyway.
And would you look at that!
It ran so smoothly and needed so little RAM that I had some doubt it had worked correctly.
But it did!
I think I'm gonna buy a coffee or two for some guy over in China now!
-
Today’s lesson in C programming:
DON’T use
system("clear");
on macOS...
Causes a segfault in your program when it is perfectly correct...
What happened was... a friend wanted help with C programming and had written this code... but it was segfaulting randomly... just random segfaults when his code was correct...
I pinpointed the segfault to a printf statement, but the statement was correct...
Off to search the issue I went, and found out that flushing problems can occur with printf if you don't use \n.
This happens randomly. Thought this might be the reason...
Went to a VM running Arch Linux and tested the code there... worked perfectly. No issues whatsoever.
From a distant memory I remembered some people saying to never use system("clear"); since it causes issues.
Thought to remove that line from the code, thinking it wouldn't make any difference.
Well, imagine my shock when the code worked fine after removing that freaking line...
I'm gonna blame this one on macOS since Arch had no issues with it 😡😡
Now to find an alternative to system("clear");
Dammit, I spent 4-5 hours on this crap!!!!!!
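For anyone hunting the same alternative: ANSI escape codes clear the screen without spawning a shell. A minimal sketch, assuming a VT100-compatible terminal (macOS Terminal and most Linux terminals are):

```c
#include <stdio.h>

/* "\033[2J" clears the screen, "\033[H" moves the cursor home.
   No child process, no shell, nothing for the OS to choke on. */
void clear_screen(void)
{
    printf("\033[2J\033[H");
    fflush(stdout);
}

int main(void)
{
    clear_screen();
    return 0;
}
```
-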
It grinds my gears to no end how insanely BAD most electrical engineering software is. Let's start with Tina, a circuit simulator. A few versions ago it was rather good, but now it feels like it's built upon more legacy crap than fucking Windows! This causes it to have memory access violations and crashes even when you look at it from an odd angle.
On the topic of circuit simulation: LT-Spice! It has fewer errors than Tina but is impossible to use without being lobotomized first. Who the FUCK decided it was a good idea to reinvent keyboard shortcuts by moving all of them to the F-row at the top of the keyboard? Also there is no option to delete a component. YOU NEED TO USE CUT IN ORDER TO REMOVE IT!
And at last, Altium Designer for layouting and schematics, whose license costs 9 grand. No one outside of some companies will buy this because of the price. Altium realized this and made two watered-down versions of it, which don't really get updates anymore (the last one was in 2018). So they essentially made a cash grab from people who can't afford their actual product. There also exist other (and a lot cheaper) products than what Altium offers. The problem is that they don't work well with interoperability. Schematics drawn in one program will look distorted in another or not import at all. And since Altium is the industry standard, you've got yourself this nice steaming soup of impossible collaboration. It's kinda like Adobe being absolute shit at progressing their software just because they've got no competition. Or rather they do, but the industry won't switch because Adobe is so ingrained into it.
-
Absolutely not dev-related.
Blah, blah, weird conversation and shit. I'm too tired and lazy to write this crap again, but let's do it.
The guy is a dev I randomly found on some chatting service; he was interesting to talk with until this conversation. I'll write this from memory, so yeah.
Him: So by the way, I wrote an app that you give your penis size to, to get measurements and stuff about it.
Me, thinking it was dev humor: That's hilarious. Tell me more, I'm interested.
Him: So the idea behind all of this was to gather some big data style info about people's penis size and habits and all that stuff.
Me: Man that's awesome. Can I see the source?
Him: No, it's proprietary. You can buy a license though.
Me: You went that far for a joke?
Him: What joke?
Me: The whole software you just told me about.
Him: That's not a joke, I'm being very serious about it.
Me: Oh well. What did you get from the stats?
Him: I got some tips from people's habits! I never thought that shaving it could make it look bigger, but that's awesome!
Me: Do you really care about it that much?
Him: Studies have proven that size correlates with confidence. Since I started doing it, I've been more confident than ever!
Me: Great.
Him: I'm a bit disappointed to see that I'm in the lower percentiles though.
Me: Well of course you are.
Him: Why would you say that?
Me: Well since people with a big dick tend to go more willingly into the subject and might even buy a fucking app for it, of course you'd have the higher average in your stats.
Him: You're only saying that because you have a small cock.
Me: Why the fuck would you say that? You're the one that's concerned about it, not me.
Him: Go on, what's your size?
Me, because I don't care about discussing that stuff: *Tells him*
Him: [stats, comparisons and stuff]
Me: Well I never gave a fuck and your stats won't make me change my mind.
[ ... Some other shit about my size compared to his ... ]
Him: Would you want to work with me for the database maintenance?
Me: You must be joking?
Him: I'm serious.
Me: *Deletes account*
Seriously, fuck that guy. I rewrote that quickly so you only got the best parts, but it was a whole fucking conversation.
-
So recently I had an argument with gamers about the memory required in a graphics card. The guy suggested the 8GB model of... idk, I forgot the GPU model already, some Nvidia crap.
I argued against that: why does memory size matter so much? I know that it takes bandwidth to generate and store a frame, and I know how much size and bandwidth that is. It's a fairly simple calculation - you take your horizontal and vertical resolution (e.g. 2560x1080, which I'll go with for the rest of the rant) times the number of subpixels (so red, green and blue) times the bit depth (i.e. the number of values you can set the subpixel/color brightness to, usually 8 bits, i.e. 0-255).
The calculation would thus look like this.
2560*1080*3*8 = the resulting size in bits. You can omit the last 8 to get the size in bytes, but only for an 8-bit display.
The resulting number you get is exactly 8100 KiB or roughly 8MB to store a frame. There is no more to storing a frame than that. Your GPU renders the frame (might need some memory for that but not 1000x the amount of the frame itself, that's ridiculous), stores it into a memory area known as a framebuffer, for the display to eventually actually take it to put it on the screen.
Assuming that the refresh rate for the display is 60Hz, and that you didn't overbuild your graphics card to display a bazillion lost frames for that, you need to display 60 frames a second at 8MB each. Now that is significant. You need 8x60MB/s for that, which is 480MB/s. For higher framerate (that's hopefully coupled with a display capable of driving that) you need higher bandwidth, and for higher resolution and/or higher bit depth, you'd need more memory to fit your frame. But it's not a lot, certainly not 8GB of video memory.
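If you want to check the math yourself, here's the whole calculation as a trivial C program (a sketch of the arithmetic above, nothing more):

```c
#include <stdio.h>

int main(void)
{
    /* 2560x1080 display, 3 subpixels (RGB), 8-bit depth, 60 Hz refresh. */
    const long width = 2560, height = 1080, subpixels = 3, bit_depth = 8;
    const long refresh_hz = 60;

    long bits_per_frame  = width * height * subpixels * bit_depth;
    long bytes_per_frame = bits_per_frame / 8;

    /* Prints 8100 KiB (~8.3 MB) per frame... */
    printf("Frame size: %ld KiB (~%.1f MB)\n",
           bytes_per_frame / 1024, bytes_per_frame / 1e6);

    /* ...and ~498 MB/s of bandwidth at 60 Hz (the 480 MB/s figure
       above uses the rounded 8 MB per frame). */
    printf("Bandwidth at %ld Hz: ~%.0f MB/s\n",
           refresh_hz, refresh_hz * bytes_per_frame / 1e6);
    return 0;
}
```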
Question time for gamers: suppose you run your fancy game on an iGPU in a laptop or whatever, with 8GB of memory in the system whose filthy iGPU you're resorting to. Are you actually using all that shared general-purpose RAM for frames and "there's more to it" juicy game data? Where does the rest of the operating system's memory fit in such a case? Ahhh.. yeah it doesn't. The iGPU magically doesn't use all that 8GB memory you've just told me that the dGPU totally needs.
I compared it to displaying regular frames, yes. After all that's what a game mostly is, a lot of potentially rapidly changing frames. I took the entire bandwidth and size of any unique frame into account, whereas the display of regular system tasks *could* potentially get away with less, since most of the frame is unchanging most of the time. I did not make that assumption. And rapidly changing frames is also why the bitrate on e.g. screen recordings matters so much. Lower bitrate means that you will be compromising quality in rapidly changing scenes. I've been bit by that before. For those cases it's better to have a huge source file recorded at a bitrate that allows for all these rapidly changing frames, then reduce the final size in post-processing.
I've even proven that driving a 2560x1080 display doesn't take oodles of memory because I actually set the timings for such a display in order for a Raspberry Pi to be able to drive it at that resolution. Conveniently the memory split for the overall system and the GPU respectively is also tunable, and the total shared memory is a relatively meager 1GB. I used to set it at 256MB because just like the aforementioned gamers, I thought that a display would require that much memory. After running into issues that were driver-related (seems like the VideoCore driver in Raspbian buster is kinda fuckulated atm, while it works fine in stretch) I ended up tweaking that a bit, to see what ended up working. 64MB memory to drive a 2560x1080 display? You got it! Because a single frame is only 8MB in size, and 64MB of video memory can easily fit that and a few spares just in case.
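The memory split itself is a one-liner in the Pi's config, for reference (a sketch, assuming a stock Raspbian install):

```
# /boot/config.txt
gpu_mem=64   # MB reserved for the GPU; plenty for an ~8 MB framebuffer
```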
I must've sucked all that data out of my ass though; I've only seen people build GPUs out of discrete components and gone down to the realm of manually setting display timings.
Interesting build log / documentary style video on building a GPU on your own: https://youtube.com/watch/...
Have fun!
-
Was playing Fallout 4 a couple days ago. About 20 minutes in, the computer just shuts off. Like no power at all. I start up the computer again. Try Fallout 4 again. It shuts off at the opening video. WTF... I try Skyrim, wondering if the video card is busted. Skyrim runs perfectly fine. I start up Fallout 4 again. It runs. WTF...
Next day I try Fallout and about 20 minutes in, power off again. Now I am assuming a cooling issue and trying to see temps with monitoring programs. Cannot really tell.
So today I take apart my laptop and vacuum every cooling orifice out. Vacuum any dust-looking crap I can see. There was dust in the fans. All clean. I run a memory test for a couple hours. Memory passes (it was brand new memory; thought maybe a flaw in the RAM). Now I run Fallout 4. Runs fine, zero issues for about an hour.
Me to myself: CLEAN YOUR DAMN COMPUTER MORE OFTEN! Okay...
In between, I read about Fallout 4 causing system reboots and shutdowns due to loading and heating. Apparently something about Fallout 4 causes this more than other games. Wild... Pretty sure it was thermal shutdown protection going on.
-
I thought the iPhone simulator was ONE THING that worked smoothly, but NO, it is also a load of crap, hogging away the memory. Can't develop ANYTHING without spoiling the mood because of this slow performance. It's the same with Android Studio, Xcode, Android emulators, and now the simulator with VS Code is doing the same thing. It's 2020, you'd think a developer could write code smoothly on a huge MacBook Pro, but no. What a fucked up world. I'm hungry again, I have eaten up everything, what to do, I hate fruits!!
-
Every time I buy a new phone, I feel this sense of extreme regret :(
I bought a Moto G 5G phone last year in Feb. It was so good. It didn't have any out-of-this-world cameras or funky stuff, but it gave decent performance and I couldn't want any other phone.
In October my mom's phone started giving issues, so I bought a Realme phone for her that was half my phone's price. I couldn't spend any more because otherwise she wouldn't take it. She accepted the cheaper phone, and within 4 days she was cursing it. The phone had decent specs but would lag in certain apps like Zoom, and wouldn't run some call recorder apps. In the end I swapped my phone with mom's since I didn't care about Zoom or the recorder.
Now this shit Realme phone's memory has gotten around 60% full of my stuff, and it's showing its limitations. This shit auto-relaunches Insta after a few minutes of usage, probably because its runtime memory runs short (a 4GB/128GB device gets memory shortages. Nice.). Its video quality is shit and the camera also rarely takes good pics.
The thing I hate most about smartphones today is how they over-optimise the UI. This Insta issue and auto call recorders not working is simply because of the Realme skin running over stock Android. I had similar issues with a Xiaomi device I bought for my dad some time ago. (Fortunately my dad is more medieval, so that crap has not come back to me :'/ )
So overall I am buying a 3rd phone in 17 months.
This time it's a Samsung F23 and I am worried that it's also going to suck. I was this 🤏 close to buying a Pixel 6 or even an iPhone, coz I can afford them.
But the regret of buying such an expensive phone that will need replacement in 2 years made me rethink.
The only Android OS that has suited me best is stock, and as of now only 2 companies are making it: Google and Moto (* it's 100% AOSP with 3 extra apps, but they can't say that, so they also state that they are not stock OS). OnePlus is also a brand that I have heard makes a good OS, but recently I also heard that they have completely scrapped their OS and are using Oppo's software. Plus, given the amount of tickets we get for notifications not working on OnePlus, I am sure their optimization is extremely aggressive.
So everything between a moderately priced phone (that will need a replacement in 2 years) and a flagship felt unnecessary to me, so I went ahead with Samsung's shit phone. The F23 has almost the same specs as the Moto, but it's again a heavily customised OS. I wanna waste my money on trying a custom OS and declare it shitty.
Most of my friends that use Samsung are fans of it, but they are also not very techy, so I guess it suits them well. I am the guy who first installs Nova Launcher on his device, so let's see what it brings to the table. From a 3rd-person p.o.v., I felt its screen and camera images to be nice whenever I used their phones, so let's see what this brings to the table :(
-
The one that exists (C#) seems underused compared to where it could (or even should) be used. And the place that uses it the most (enterprise) butchers and mangles its use, just as enterprise tends to do with everything.
The one that I'm designing... the fact that it doesn't exist yet, and that even as I'm zeroing in on syntax and philosophy that I'm very much starting to be proud of, I still don't have a proper idea of how to implement even the most basic parser/interpreter for it, not because it's in any way difficult or unusual, but just because... I've never done that before, so I get into weird circular thought paths that produce weird nonsensical code...
... on top of that, I still only have a very, very fuzzy idea of how it will (sometime in the extremely distant future) actually implement the most interesting and core feature - event-based continuous (partial) re-parsing of the source code, and the fact that traversing the tokens at the leaf level of the syntax tree should result in valid machine code (or at least assembly) that is the "compiled" program.
I *know* it's possible, I just don't yet know enough to have a concrete idea of how exactly to achieve it.
But imagine - a programming language where interactive programming is basically the default way of working and basically the same as normal programming in it, except the act of parsing is also the (in-memory) compilation at the same time, so it's running directly on the hardware instead of via an interpreter/VM/any of that overhead crap.
Also then kinda open-source by definition.
And then to "only" write an OS in that, and voilà! A Smalltalk-like environment with non-exotic, C-family syntax and actual native performance!
Ahhh... <3
* a man can dream *
-
I usually like PHP, because it is easy to use, but FUCK! Can you just let me free the fucking memory by myself? Setting the variable to null doesn't work; unset doesn't work either. I am still getting a fucking memory exhausted error.
There is literally no data stored anywhere, because I unset every fucking thing.
gc_collect_cycles() doesn't work either, probably because this crap thinks there is a reference to this variable somewhere.
-
First rant here...
A handful of devs have to create a huge web platform that can shovel a lot of data around in about two months, which is impossible...
The project lead has left major decisions in the hands of interns, like the database we want to use, because no question can be answered by that person. An inexperienced intern has chosen a fucking NoSQL database for highly relational datasets... why? Because new tech...
Development began and a bunch of problems arose... the database was accessible from the internet from day one. Random crashes because of out-of-memory exceptions. Every possible feature had a description of at most 10 words... and no standards were enforced on anything.
Now that we're finaaaally switching to SQL after almost a year of prototypical production, everybody keeps coding new features, so I have to port all the crap to the new database...
Best part: a bunch of clients on different operating systems have to be ported as well!
Even better part: I have to do that cause everybody else has practically no experience in any field...
And now the joke: I got hired for GUI/desktop application development.
Am I a wizard now?
-
I hate this crap and wish people would stop doing it. It makes my brain bleed and doesn't prevent any difficult-to-find bugs.
if (TERMINAL_COUNT <= index_thing)
English doesn't work that way, and I don't know about you, but this crap is just awkward as hell. Sweet Jesus, I wish there was less cargo cult programming in the world. Just because you saw something in a blog that convinced you that reverse comparisons are best doesn't mean they actually are. Use a damn static analysis tool to catch accidental assignments in expressions; don't twist my brain to interpret your weird phrasing of comparison operators. Some of your code reviewers may be dyslexic and have enough problems as it is.
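Case in point: modern compilers already flag the accidental assignment that Yoda conditions are supposed to prevent. A minimal sketch (hypothetical names, compile with -Wall):

```c
#include <stdio.h>

enum { TERMINAL_COUNT = 8 };

int main(void)
{
    int index_thing = 9;

    /* Natural order -- reads the way English does. */
    if (index_thing >= TERMINAL_COUNT)
        puts("index out of range");

    /* The typo Yoda conditions guard against is '=' instead of '=='.
       gcc and clang warn about it under -Wall (-Wparentheses), so the
       tooling catches it without torturing the reader:

       if (index_thing = TERMINAL_COUNT) { ... }   // <- warning here
    */
    return 0;
}
```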
And now for the mini-rant that I'm actually here for: You know what makes for difficult-to-find bugs? (Hint: it sure as hell isn't an assignment in an expression.) Releasing an RTEMS semaphore you've never obtained. You'd think that would cause some kind of panic or assert failure, but nope. Instead it causes... misaligned address exceptions? In statically allocated global memory? WTF??
-
Had to face the music and make the jump from Ubuntu 22.04 to Fedora 36, and I have to say it's been night and day so far. Everything is snappier. Yeah, dnf is very slow in comparison to apt, but there are changes you can make to speed things up (sketch below), and the nifty terminal interface is a great change and helps make up for the speed issues.
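For the curious, the usual dnf speed-ups are a two-line config change. A sketch, assuming a stock Fedora 36 install:

```
# /etc/dnf/dnf.conf -- append under the [main] section
max_parallel_downloads=10
fastestmirror=True
```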
Came with Python 3.10 installed, the GNOME and GTK4 apps are nice, fluid and up to date, and the random slowdowns, freezing and restarts of Ubuntu running the same version of GNOME are nonexistent.
For the life of me I can't see why Ubuntu would drop the ball like this. I have a Dell XPS 13 Developer Edition and this is the best it's ever run. Even WiFi connectivity is better despite the crap WiFi card that ships with this machine.
I wanted to love Ubuntu 22.04, and while it is the most graphically appealing and functional version of Ubuntu I've ever used, the memory management issues make it damn near unusable.
-
Holy crap, Meta Developer Connect keynote. Amazing innovation. This is what Apple **used to be**
Granted, the hardware is not as elegant as Apple's, but the cost is 1/10 and the capabilities are close to, the same as, or exceed (Llama) what Apple is offering.
Now here is the gut punch: they figured out that the mobile app build system needs to build AR/VR/MR apps. That was Apple's edge.
As a developer, I am not enamored with Swift, and it is pretty clear that if I have to change and use a niche language like Swift, or change and do dev on Android, to target new Meta hardware and AI... well... let's just say I think Swift is crap from a language standpoint, and I suspect it is the reason Apple's hardware uses so much more memory, battery and storage than it should. At the same time, Meta's Orion runs on a goddamn battery in an early pair of glasses. My AVPs have a huge brick.
#define kApple kGigaBloat
If I were Apple I would be shitting my pants watching this Meta presentation.
-
You can make your software as good as you want, if its core functionality has one major flaw that cripples its usefulness, users will switch to an alternative.
For example, an imaginary file manager that is otherwise the best in the world becomes far less useful if it imposes an arbitrary fifty-character limit for naming files and folders.
If you developed a file manager better than ES File Explorer was in the golden age of smartphones (before Google exercised their so-called "iron grip" on Android OS by crippling storage access, presumably for some unknown economic incentive such as selling cloud storage, and before ES File Explorer became adware), and if your file manager had all the useful functionality like range selection and tabbed browsing and navigation history, but it limited file names to 50 characters even though the file system supports far longer names, the user would have to rely on a different application for the sole purpose of giving files longer names, since renaming, as a file action, is one of the few core features of file management software.
Why do I mention a 50-character limit? The pre-installed "My Files" app by Samsung actually did once have a fifty-character limit for renaming files and folders. When entering a longer name, it would show the message "up to 50 characters available". My thought: "Yeah, thank you for being so damn useful (sarcasm). I already use you reluctantly because Google locked out superior third-party file managers likely for some stupid economic incentives, and now you make managing files even more of a headache than it already is, by imposing this pointless limitation on file names' length."
Someone at Samsung's developer department had a brain fart one day and decided it would be a smart idea to impose an arbitrary limit on file name lengths. It isn't.
The user needs to move files to a directory accessible to a superior third-party file manager just to give a file a name longer than fifty characters. Even file management on desktop computers two decades ago was better than this crap!
All of this because Google apparently wants us to pay them instead of SanDisk or some other memory card vendor. This again shows that one only truly owns a device if one has root access. Then these crippling restrictions that were made "for security reasons" (which, in case it isn't clear, is an obvious pretext) can be defeated for selected apps.