One of our web developers reported a bug with my image API, which shrinks large images down to thumbnail size. It basically looked like this: img = ResizeImage(largeImage, 50); // shrink the image by 50%
The 'bug' was that he passed in the thumbnail image, requested a 300% increase, and the result was too pixelated.
I tried to explain that if you need the larger image, use the image from disk (since the images were already sized optimally for display); the API was just for resizing downward.
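For anyone curious, a downscale-only contract like that is trivial to enforce. A rough sketch of the idea in Python with Pillow (hypothetical names, not the actual API from this story):

    from PIL import Image

    def resize_image(image: Image.Image, percent: int) -> Image.Image:
        # This API only shrinks; going bigger means going back to the file on disk.
        if not 0 < percent <= 100:
            raise ValueError("resize_image only scales downward")
        new_size = (max(1, image.width * percent // 100),
                    max(1, image.height * percent // 100))
        return image.resize(new_size, Image.LANCZOS)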
Thinking I was done, the next day I was called into a large conference room with the company vice-president, two of the web-dev managers, and several of the web developers.
VP: "I received an alarming email saying you refused to fix that bug in your code. Is that correct?"
Me: "Bug? No, there is no bug. The image api is executing just as it is supposed to."
MGR1: "Uh...no it isn't. Images using *your* code are pixelated and unfit for our site and our customers."
MGR2: "Yes, I looked at your code and don't understand what the big deal is. Looks like a simple fix."
<web developers nodding their heads>
Me: "OK, I'll bite. What is the simple fix?"
<MGR2 looks over at one of the devs>
Dev1: "Well, for example, if we request an image resize of 300, and the image is only 50x50, only increase the size by 10. Maybe 15."
Me: "Wow..OK. So what if the image is, for example, 640x480?"
MGR1: "75. Maybe 80 if it's a picture of boots."
VP: "Oh yes, boots. We need good pictures of boots."
Me: "I'm not exactly sure how to break this to you, but my code doesn't do 'maybe'. I mean, you have the image from disk.
You obviously used the api to create the thumbnail, but are trying to use the thumbnail to go back to the regular size. Why not use the original image?"
<Web-Dev managers look awkwardly towards the web devs>
Dev3: "Yea, well uh...um...that would require us to create a variable or something to store the original image. The place in the code where we need the regular image, it's easier to call your method."
Me: "Um, not really. You still have to resolve the product name from the URL path. Deriving the original file name is what you are doing already. Just do the same thing in your part of the code."
Dev2: "But we'd have to change our code"
Mgr2: "I know..I know. How about if we, for example, send you 12345.jpg and request a resize greater than 100, you go to disk and look for that image?"
<VP, mgrs, and devs nod happily>
Me: "Um, no, that won't work. All I see is the image stream. I have no idea what the file is, and the API shouldn't be guessing, going to disk or anything like that."
Dev1: "What if we pass you the file name?"
<VP, mgrs, and devs nod happily again>
Me: "No, that would break the API contract and ...uh..wait...I'm familiar with your code. How about I make the change? I'm pretty sure I'll only have to change one method"
VP: "What! No...it’s gotta be more than that. Our site is huge."
<Mgrs and devs grumble and shift around in their chairs>
Me: "I'm done talking about this. I can change your code for you or you can do it. There is no bug and I'm not changing the api because you can't use it correctly."
Later I discovered they stopped using the resize API and wrote dynamic HTML to 'resize' the images on the client (download the 5+ meg images and use the length and width properties).
-
I have to let it out. It's been brewing for years now.
Why does MySQL still exist?
Really, WHY?!
It was lousy as hell 8 years ago, and since then it hasn't changed one bit. Why do people use it?
First off, it doesn't conform to standards, allowing you to aggregate without explicitly grouping, in which case you get god knows what type of shit in there, and then everybody asks why the numbers are so weird.
Second... it's $(CURRENT_YEAR) for fuck's sake! This is the era of large data sets and complex requirements on those data sets. Just an hour on SO will show you dozens of poor people trying to do with MySQL what MySQL just can't do, because it's stupid.
Recursion? 4 lines in any other large RDBMS, and tough luck in MySQL. So what next? Are you supposed to use Lemograph alongside MySQL just because you don't know that PostgreSQL is free and super fast?
Window functions to mix rows and do neat stuff? Naaah, who the hell needs that, right? Who needs to find the products ordered by the customer with the biggest order anyway? Oh you need that actually? Well you should write 3-4 queries, nest them in an incredibly fucked up way, summon a demon and feed it the first menstrual blood of your virgin daughter.
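For the record, both complaints are one short query in standard SQL. A sketch using Python's bundled sqlite3, which has had recursive CTEs for ages and window functions since SQLite 3.25 (the table and data here are made up):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE orders (customer TEXT, product TEXT, amount REAL);
        INSERT INTO orders VALUES
            ('alice', 'boots', 120), ('bob', 'socks', 5), ('alice', 'laces', 10);
    """)

    # Recursion: a 1..10 sequence in four lines, no demons required.
    print(conn.execute("""
        WITH RECURSIVE seq(n) AS (
            SELECT 1 UNION ALL SELECT n + 1 FROM seq WHERE n < 10
        ) SELECT n FROM seq
    """).fetchall())

    # Window function: each product next to its customer's order total,
    # biggest spender first.
    print(conn.execute("""
        SELECT customer, product,
               SUM(amount) OVER (PARTITION BY customer) AS customer_total
        FROM orders ORDER BY customer_total DESC
    """).fetchall())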
There used to be excuses: "but but but, shared hosting only has MySQL". Which was wrong, by the way. It was true only for the big hosting names, and for people who didn't bother searching for alternatives. And now it's even better, since VPS and PaaS solutions are available at prices lower than shared hosting, giving you better speed, performance and stability than shared hosting ever did.
"But but but Wordpress uses MySQL" - well then kill it! There are other platforms out there that aren't just outrageously horrible inside and out. Wordpress is crap, and work on it pays crap. Learn Laravel, Symfony, Zend, or even Drupal. You'll be able to create much more value than with those shitty Wordpress sites that nobody ever visits or pays money on.
"But but but my client wants some static pages presented beside their online shop" - so why use Wordpress then? Static pages are static pages. Whip up a basic MVC set-up in literally any framework out there, avoid MySQL, include a basic ACL package for that framework, create a controller where you add a CKEditor to edit page content, stick a nice template from themeforest on that page, and be done with that shit! Save the mock-up for later use if you do that stuff often. Or if you're too lazy to even do that, then take up Drupal.
But sure, this is getting a bit out of scope. I actually don't care where you insert content for your few pages. It can be a JSON file for all I care. But if I catch you doing an e-commerce solution, or anything more than just text storage, on MySQL, I'll literally start re-assessing your ability to think rationally.
-
Watching the Dutch government trying to get through the public procurement process for a "corona app" is equal parts hilarious and terrifying.
7 large IT firms screaming that they're going to make the perfect app.
Presentations with happy guitar strumming advertisement videos about how everyone will feel healthy, picnicking on green sunny meadows with laughing families, if only their app is installed on every citizen's phone.
Luckily, also plenty of security and privacy experts completely body-bagging these firms.
"It will connect people to fight this disease together" -- "BUT HOW" -- "The magic of Bluetooth. And maybe... machine learning. Oh! And blockchain!" -- "BUT HOW" -- "Shut up give us money, we promise, our app is going to cure the planet"
You got salesmen, promising their app will be ready in 2 weeks, although they can't even show any screenshots yet.
You got politicians mispronouncing technical terminology, trying hard to look as informed as possible.
You got TV presenters polling population support for "The App" by interviewing the most digitally oblivious people.
One of the app development firms (using some blockchain-based crap) promised transparency about their source code for auditing.... so they committed their source to GitHub, including a backup file from one of their other apps containing 200 emails/passwords.
It's kind of entertaining... in the same way as a surgery documentary about the removal of glass shards from a sexually adventurous guy's butthole.
Imma keep watching out of morbid fascination.... from a very safe distance, far away from the blood and shit that's splattering against the walls.
And my phone -- keep your filthy infected bytes away from my sweet baby.
I'll stick with social distancing, regular hand washing, working from home and limited supermarket trips, thank you very much.
-
I started to download a large file... I left my PC on overnight....... WINDOWS 10 UPDATE, MOTHERFUCKER!!!
-
*Opens some Computerphile video on YouTube in Chrome Canary*
CPU > hey ho dude, wait a minute..! I can't process all of this in realtime!!! >_<
Alright.. I think I've still got a copy of all their videos sitting somewhere in the file server.. perhaps I could use that instead.
*Opens said video from the file server in SMPlayer*
CPU > aah, thanks man. Now I can allocate 15-ish % of my resources to that and give you a good watching experience.
Web browsers are really great at being the most general-purpose document viewers, application execution environments (remote code execution engines, as someone here called it), and overall one of the most versatile programs in any PC's standard software suite.
But that comes at a price.. performance. And definitely when it comes to featureful fucking WordPress shitsites (shites?), bloated YouTube, Google, Facebook, and all that fucking garbage.. I fucking hate web browsers and this "Web 2.0" that people keep on talking about. Your boatload of JavaScript frameworks just to ease your own fucking development has a real impact when it happens on dozens of tabs, you know.
Besides, can't those framework creators just make it into a "compiler"* of sorts? So that front-end devs can flail their dicks in a shit-infested environment full of libraries and frameworks all they want, but the framework converts it into plain JS code that the web server can then serve. Or better yet, the JavaScript standard could be improved to actually be usable on its own!
Look, I'm not a front-end dev. Heck, I'm not even a dev to begin with. But what I do know is that efficiency matters, especially at large scale. Web browsers being so overgeneralized and web devs adding a boatload of fucking libraries or frameworks or whatever, it adds up, both to the CPU's and my own temper.
(*) Quote marks because source-code-to-source-code isn't really compiling, but then uglified JS looks worse than machine code anyway, so meh :/
-
Been reviewing ALOT of client code and supplier’s lately. I just want to sit in the corner and cry.
Somewhere along the line the education system has failed a generation of software engineers.
I am an embedded C programmer, so I'm pretty low level, but I have worked up, down and across the abstractions in the industry. The high-level guys, I think, don't make these same mistakes thanks to the stuff they learn in CS courses regarding OOD, i.e. how to properly architect software in a modular way.
I think it may be that too often the embedded software is written by EEs and not CEs, and due to their curriculum they lack good software architecture design.
Too often I will see huge functions with large blocks of copy-pasted code where the only difference is a variable name. All stuff that can be turned into tables and iterated through, so the function ends up less than 20 lines long; that's a 100x improvement when the function started out at 2000 lines because they decided to hard-code everything and not let the code and processor do what they're good at.
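To make the table-driven point concrete, here's the shape of it sketched in Python for brevity (in C it's an array of structs plus one loop; the names and values below are invented):

    # Instead of N copy-pasted blocks that differ only in a name and an offset,
    # describe the differences as data and write the logic exactly once.
    SENSORS = [
        # (name, register_offset, scale)
        ("temp",     0x00, 0.5),
        ("pressure", 0x04, 1.0),
        ("humidity", 0x08, 0.1),
    ]

    def read_all(read_register):
        # read_register is whatever function reads a raw value at an offset
        return {name: read_register(offset) * scale
                for name, offset, scale in SENSORS}

Adding a channel is now one table row instead of another pasted block.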
Arguments about performance are moot at this point; I'm well aware of the constraints, and this is not something they're affected by.
The problem I have is trying to take their code in and understand what it's trying to do, and to do that you must scan up and down HUGE sections of the code, even 10k+ lines in one file, because their design didn't even use multiple files!
Does their code function? Yes. Does it work? Yes. The problem is readability and maintainability: completely nonexistent.
I see it so often I almost begin to second-guess myself and think.. am I the crazy one here? No. And it's not their fault; it's the education system. They weren't taught better, so they think this is just what programmers do: hugely mundane copy-pasting, changing a little thing here and there, done. No, actual software engineers architect systems and write code in a way that lets them do it in the laziest way possible. Not how these folks do it.. it's like all they know are if statements and switch statements and everything else is unneeded.. fuck structures and shit, just hard-code it all... explicitly write everything, let's not be smart about anything.
I know I’ve said it before but with covid and winning so much more buisness did to competition going under I never got around to doing my YouTube channel and web series of how I believe software should be taught across the board.. it’s more than just syntax it’s a way of thinking.. a specific way of architecting any software embedded or high level.
Anyway, rant off; had to get that off my chest. I literally want to sit in the corner and cry this weekend at the horrible code I'm reviewing, and it just constantly keeps happening. Over and over and over. The more people I bring on or projects I acquire, it's like, fuck me, wtf is this shit!!! Take some pride in the code you write!
-
Our web department was deploying a fairly large sales campaign (equivalent to a ‘Black Friday’ for us), and the day before, at 4:00PM, one of the devs emails us and asks “Hey, just a heads up, the main sales page takes almost 30 seconds to load. Any chance you could find out why? Thanks!”
We click the URL they sent, and sure enough, 30 seconds on the dot.
Our department manager almost fell out of his chair (a few ‘F’ bombs were thrown).
DBAs sit next door, so he shouts…
Mgr: ”Hey, did you know the new sales page is taking 30 seconds to open!?”
DBA: “Yea, but it’s not the database. Are you just now hearing about this? They’ve had performance problems for over a week now. Our traces show it’s something on their end.”
Mgr: “-bleep- no!”
Mgr tries to get a hold of anyone …no one is answering the phone..so he leaves to find someone…anyone with authority.
4:15 he comes back..
Mgr: “-beep- All the web managers were in a meeting. I had to interrupt and ask if they knew about the performance problem.”
Me: “Oh crap. I assume they didn’t know or they wouldn’t be in a meeting.”
Mgr: “-bleep- no! No one knew. Apparently the only ones who knew were the 3 developers and the DBA!”
Me: “Uh…what exactly do they want us to do?”
Mgr: “The –bleep- if I know!”
Me: “Are there any load tests we could use for the staging servers? Maybe it’s only the developer servers.”
DBA: “No, just those 3 developers testing. They could reproduce the slowness on staging, so no need for the load tests.”
Mgr: “Oh my –bleep-ing God!”
4:30 ..one of the vice presidents comes into our area…
VP: “So, do we know what the problem is? John tells me you guys are fixing the problem.”
Mgr: “No, we just heard about the problem half hour ago. DBAs said the database side is fine and the traces look like the bottleneck is on web side of things.”
VP: “Hmm, no, John said the problem is the caching. Aren’t you responsible for that?”
Mgr: “Uh…um…yea, but I don’t think anyone knows what the problem is yet.”
VP: “Well, get the caching problem fixed as soon as possible. Our sales numbers this year hinge on the deployment tomorrow.”
- VP leaves -
Me: “I looked at the cache, it’s fine. Their traffic is barely a blip. How much do you want to bet they have a bug or a mistyped url in their javascript? A consistent 30 second load time is suspiciously indicative of a timeout somewhere.”
Mgr: “I was thinking the same thing. I’ll have networking run a trace.”
4:45 Networking ran their trace, and sure enough, there was some relative path to ‘something’ pointing at a local resource that wasn’t on development; it was waiting and timing out after 30 seconds. Fixed the path and the page loaded instantaneously. Network admin walks over..
NetworkAdmin: “We had no idea they were having problems. If they had told us last week, we could have identified the issue. Did anyone else think a 30 second load time was a bit suspicious?”
4:50 VP walks in (“John” is the web team manager)..
VP: “John said the caching issue is fixed. Great job everyone.”
Mgr: “It wasn’t the caching, it was a mistyped resource or something in a javascript file.”
VP: “But the caching is fixed? Right? John said it was caching. Anyway, great job everyone. We’re going to have a great day tomorrow!”
VP leaves
NetworkAdmin: “Ouch…you feel that?”
Me: “Feel what?”
NetworkAdmin: “That bus John just threw us under.”
Mgr: “Yea, but I think John just saved 3 jobs. Remember that.”
-
Python's documentation is savage though ... 😂
"it’s your problem if the file is twice as large as your machine’s memory."
-
Large comment block at the top of the file, commenting explicit line numbers, as in: "line 365: copy to a new image".
Only every time the comment block grew, the line numbers got more and more off. -_-
-
I've found and fixed every kind of "bad bug" I can think of over my career, from allowing negative financial transfers to weird platform-specific behaviour. Here are a few of the more interesting ones that come to mind...
#1 - Most expensive lesson learned
Almost 10 years ago (while learning to code) I wrote a loyalty card system that ended up going national. Fast forward 2 years, and by some miracle the system still worked, with services running on 500+ POS servers in large retail stores uploading thousands of transactions each second. Due to this increased traffic, and to stay ahead of any trouble, we decided to add a load balancer to our backend.
This was simply a matter of re-assigning the IP and would cause 10-15 minutes of downtime (for the first time ever). We made the switch and everything seemed perfect. Too perfect...
After 10 minutes, every phone in the office started going berserk - calls were coming in about store servers irreparably crashing all over the country, taking all the tills offline and forcing stores to close their doors midday. It was bad, and we couldn't conceive how it could possibly be us or our software to blame.
Turns out we made the local service write any web service errors to a log file before retrying, for debugging purposes. A perfectly sensible thing to do, if I hadn't forgotten to cap the size of, or clear, the log file. In about 15 minutes of downtime, each store's error log proceeded to grow and consume every available byte of HD space before crashing Windows.
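The standard guard against exactly this is a size-capped, rotating log. In Python it's a single handler; the original service predated this and wasn't Python, so treat it as the shape of the fix rather than the fix itself:

    import logging
    from logging.handlers import RotatingFileHandler

    logger = logging.getLogger("pos-uploader")
    # Cap the error log at 5 MB with 3 old copies kept; disk usage stays
    # bounded no matter how long the backend is unreachable.
    logger.addHandler(RotatingFileHandler(
        "upload-errors.log", maxBytes=5 * 1024 * 1024, backupCount=3))

    logger.error("web service call failed, retrying...")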
#2 - Hardest to find
This was a true "Nessie" bug.. We had a single codebase powering a few hundred sites. Every now and then the web server would spontaneously die and vomit a bunch of SQL statements and sensitive data back to the user, causing huge concern, but I could never remotely replicate the behaviour - until 4 years later it happened to one of our support staff and I could pull out their network & session info.
Turns out that years back, when the server was first set up, each domain was added as an individual "Site" on IIS, but they all shared the same root directory and hence the same session path. It would have remained unnoticed if we had not grown, but as our traffic increased, every so often two users of different sites would end up sharing a session ID, causing the server to promptly implode on itself.
#3 - Most elegant fix
Same bastard IIS server as #2. The codebase was the most insecure, unstable travesty I've ever worked with: SQL injection vulns in EVERY URL, SQL statements stored in COOKIES... this thing was irreparably fucked up but had to stay online until it could be replaced. Basically every other day it got hit by bots and ended up sending blue-pill spam or mining shitcoin, and I would simply delete the instance and recreate it in a semi-un-compromised state, which was an acceptable uptime solution for the business... until we were DDOS'ed for 5 days straight.
My hands were tied and there was no way to mitigate it except for stopping individual sites as they came under attack and starting them after it subsided... (for some reason they seemed to be targeting by domain instead of ip). After 3 days of doing this manually I was given the go ahead to use any resources necessary to make it stop and especially since it was IIS6 I had no fucking clue where to start.
So I stuck to what I knew and deployed a $5 VM running an Nginx reverse proxy with heavy caching and rate limiting, linked to a custom fail2ban plugin, in front of the insecure server. The attacks died instantly, the server sped up 10x, and it was never compromised by bots again (presumably since they got back a Linux user agent). To this day I marvel at this miracle $5 fix.
-
'Yay!! My program runs and is giving expected output.'
** Professor gives large input file **
Segmentation fault (core dumped)
'FML'
(My story in every algorithms lab)
-
Ok, so when I inherit a Wordpress site I've really stopped expecting anything sane. Examples: evidence that the Wordpress "developer" (that term is used in the loosest sense possible) has thought about his/her code or even evidence that they're not complete idiots who wish to make my life hell going forwards.
Have a look at the screen shot below - this is from the theme footer, so loaded on every page. The screenshot only shows a small part of the file. IT LITERALLY HAS 3696 lines.
Firstly, let's excuse the frankly eye-watering if statement checking the post ID. That made me facepalm immediately.
The insanity comes from the thousands of lines of jQuery code, duplicated to hell and back, that change the color of various dividers scattered throughout the site.
To make things thousands of times worse, they are ALL HAND CODED.
Even if JavaScript were the only way I could format these particular elements, I certainly wouldn't duplicate the same code for every element. After copy-pasting that jQuery a couple of times, a normal developer would think of one word pretty quickly: repetition.
When a good developer notices repetition, ways to abstract the crap away are the first thing that comes to mind.
Hell, when I was first learning to code god knows how long ago I always used functions to avoid repetition.
In this case, with a few seconds of thought, this "developer" could have created a single jQuery handler and used data attributes within the HTML. Hell, as bad as that is, it's better than the monstrosity I'm looking at now.
I'm aware Wordpress is associated with bad developers due to it's low barrier to entry, but this site is something else.
The scary thing is that I know the agency that produced this. They are very large, use Wordpress exclusively and have some stupidly huge clients that would be know nationally.
Wordpress truly does attract some of the most awful "developers" and deserves its reputation.
If you're a good developer and use Wordpress I feel sorry for you, as you're in small numbers from my experience.
Rant over, have vented a bit and feel better. Thanks, devRant.
-
Long rant ahead.. 5k characters pretty much completely used. So feel free to have another cup of coffee and have a seat 🙂
So.. a while back this flash drive was stolen from me, right. Well it turns out that other than me, the other guy in that incident also got to the police 😃
Now, let me explain the smiley face. At the time of the incident I was completely at fault. I had no real reason to throw a punch at this guy, and my only "excuse" would be that I was drunk as fuck; I'd never drunk so much as I did that day. Needless to say, not a very good excuse, and I don't treat it as such.
But that guy, and whoever else he was with, was the one who stole that flash drive from me (or at least part of the group that did).
Context: https://devrant.com/rants/2049733 and https://devrant.com/rants/2088970
So that's great! I thought that I'd lost this flash drive and most importantly the data on it forever. But just this Friday evening as I was meeting with my friend to buy some illicit electronics (high voltage, low frequency arc generators if you catch my drift), a policeman came along and told me about that other guy filing a report as well, with apparently much of the blame now lying on his side due to him having punched me right into the hospital.
So I told the cop, well, most of the blame is on me really, I shouldn't have started that fight to begin with, and for that matter shouldn't have drunk that much, yada yada yada.. anyway he walked away (good grief, as I had that friend visiting to purchase those electronics at that exact time!) and he said that this case could just be classified. Maybe just come along next week to the police office to file a proper explanation, but maybe even that won't be needed.
So yeah, great. But for me there's more to it of course - that other guy knows more about that flash drive and the data on it that I care about. So I figured, let's go to the police office and arrange an appointment with this guy. And I got thinking about the technicalities, for if I get that drive back and want to recover its data.
So I've got 2 phones, 1 rooted but reliant on the other, unrooted one for a data connection to my home (because Android Q, and no bootable TWRP available for it yet). And theoretically a laptop that I could put Arch on no problem, but its display backlight is cooked. So if I want to bring that one I'd have to rely on a display from them. Good luck getting that done. No option. And then there's a flash drive that I can bake up with a portable Arch install that I can sideload from one of their machines, but on that.. even more so: good luck getting that done. So my phones are my only option.
Just to be clear, the technical challenge is to read that flash drive and get as much data off of it as possible. The drive is 32GB large and has about 16GB used. So I'll need at least that much on whatever I decide to store a copy on, assuming unchanged contents (unlikely). My Nexus 6P with a VPN profile to connect to my home network has 32GB of storage. So theoretically I could use dd and pipe it to gzip to compress the zeroes. That'd give me a resulting file that's close to the actual usage on the flash drive in size. But just in case.. my OnePlus 6T has 256GB of storage but it's got no root access.. so I don't have block access to an attached flash drive from it. Worst case I'd have to open a WiFi hotspot to it and get an sshd going for the Nexus to connect to.
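For reference, the dd-piped-to-gzip trick amounts to something like this; sketched in Python here, though on the phone it would really just be a shell pipeline, and the device path is whatever the drive enumerates as:

    import gzip
    import shutil

    # Raw block-for-block read of the flash drive, compressed on the fly;
    # long runs of zeroes (unused space) shrink to almost nothing.
    with open("/dev/sdb", "rb") as src, \
         gzip.open("flashdrive.img.gz", "wb") as dst:
        shutil.copyfileobj(src, dst, length=1024 * 1024)  # 1 MiB pieces

(Block access still requires root, hence all the gymnastics with the rooted phone.)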
And there we have it! A large storage device, no root access, that nonetheless can make use of something else that doesn't have the storage but satisfies the other requirements.
And then we have things like parted to read out the partition table (and if unchanged, cryptsetup to read out LUKS). Now, I don't know if Termux has these and frankly I don't care. What I need for that is a chroot. But I can't just install Arch x86_64 on a flash drive and plug it into my phone. Linux Deploy to the rescue! 😁
It can make chrooted installations of common distributions on arm64, and it comes extremely close to actual Linux. With some Linux magic I could make that able to read the block device from Android and do all the required sorcery with it. Just a USB-C to 3x USB-A hub required (which I have), with the target flash drive and one to store my chroot on, connected to my Nexus. And fixed!
Let's see if I can get that flash drive back!
P.S.: if you're into electronics and worried about getting stuff like this stolen, customize it. I happen to know one particular property of that flash drive that I can use for verification, although it wasn't explicitly customized. For instance, in that flash drive there was a decorative LED. Those are current-limited by a resistor. The factory default can be, say, 200 ohm; replace it with one of a higher value. That way you can verify without any doubt that it's yours. Along with other extra security additions, this is one of the things I'll be adding to my "keychain v2".
-
Have you ever thought that even today, if you had a very large "file", say 10 terabytes, it would take 74 hours to transfer anywhere in the world on a 300 Mb/s connection? It would therefore still be much faster to fly it physically anywhere, even with the ~5 hours it takes to transfer it to some sort of drive(s) at 5 gigabits a second.
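The arithmetic, for anyone checking:

    10 TB = 8 × 10^13 bits
    network:      8 × 10^13 bits / (3 × 10^8 bit/s) ≈ 2.7 × 10^5 s ≈ 74 hours
    local drives: 8 × 10^13 bits / (5 × 10^9 bit/s) = 1.6 × 10^4 s ≈ 4.4 hours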
-
Retarded senior web dev:
shouting 'STOP' to the ones who pointed out his design flaws
cannot accept a JS file with more than 100 lines
nitpicking others, not limited to his own group
eager to try bleeding-edge alpha packages for a large application
left the company before finishing the project he started
-
Many of you who have a Windows computer may be familiar with robocopy, xcopy, or move.
These functions? Programs? Whatever they may be, were interesting to me because they were the first things that got me really into batch scripting in the first place.
What was really interesting to me was how I could run multiples of these scripts at a time.
<storytime>
It was a warm spring day in the year 2007, and my science teacher at the time needed a way to get files from the school computer to her hard drive faster. The time the computer was suggesting was 2 hours. Far too long for her. I told her I'd build her something that could work faster than that. And so started the program that would take up more of my time than the AI I created back in 2009.
</storytime>
This program would scan the entirety of the computer's file system and create an xcopy batch file for each directory. After parsing these files, it would then run all the batch files at once. Multithreading as it were? Looking back on it, the throughput probably wasn't any better than the default copying program Windows already had, but the amount of time it took was less. Instead of 2 hours to finish the task, it took 45 minutes. My thought for justifying this program was: instead of giving all the paperwork to one man, split the paperwork among many men. So, while a large file is being copied, many smaller files can be copied during that time.
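A rough modern equivalent of the trick in Python (this is not the original batch program, just the split-and-copy-concurrently idea; dirs_exist_ok needs Python 3.8+):

    import shutil
    from concurrent.futures import ThreadPoolExecutor
    from pathlib import Path

    def copy_tree_parallel(src: Path, dst: Path, workers: int = 8) -> None:
        # One copy job per top-level subdirectory, all running at once, so
        # small files keep flowing while a large file is being copied.
        subdirs = [p for p in src.iterdir() if p.is_dir()]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            for sub in subdirs:
                pool.submit(shutil.copytree, sub, dst / sub.name,
                            dirs_exist_ok=True)

    copy_tree_parallel(Path("C:/schoolwork"), Path("E:/backup"))

(Loose files at the top level are skipped for brevity, and any per-job errors are silently dropped; a real tool would collect the futures and check them.)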
After that day I really couldn't keep my hands off this program. As my knowledge of programming increased, so did my likelihood of editing a piece of its code.
The sheer number of updates that this program has gone through is amazing. At version 6.25 it now sits as a standalone batch file. It used to consist of 6 files plus however many xcopy batch files it created for the file migration; now it's just 1 file and dirt simple to run (well, the front-end anyway; the back-end is a masterpiece of weirdness, honestly), and it automates adding all the necessary directories and files. Oh, and the name is Latin for "imitate"; I figured that's a reasonable name for a copying program.
I was 14, so my creativity lacked in the naming department >_<
-
POSIX tools are teaching gold. According to the unix philosophy, they have to be small and every output must be capable of being the input of another program.
That makes them easy to build. At least that's often the case; nobody would call grep simple these days, but some of them are quite trivial.
I was explaining the unix philosophy to my wife, and since she is learning python3 right now, I opened up vim and wrote cat. After a minute it printed out /usr/share/dict/words. After two minutes it printed a given file. After three minutes it threw an error if the file didn't exist. A little later it accepted multiple files. Half an hour later it accepted flags, and the -n flag was implemented. And there was an interesting question of what would happen if someone opened an incredibly large file. Solution? Read it piece by piece and write it directly.
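For reference, a minimal sketch of roughly where that session ends up: multiple files, a missing-file error, the -n flag, and piece-by-piece reading so a huge file never has to fit in memory (Python 3.8+ for the := operator):

    import sys

    def cat(paths, number_lines=False):
        for path in paths:
            try:
                with open(path, "rb") as f:
                    if number_lines:
                        for i, line in enumerate(f, start=1):
                            sys.stdout.buffer.write(b"%6d\t%s" % (i, line))
                    else:
                        while chunk := f.read(64 * 1024):  # piece by piece
                            sys.stdout.buffer.write(chunk)
            except FileNotFoundError:
                print(f"cat: {path}: No such file or directory", file=sys.stderr)

    if __name__ == "__main__":
        args = sys.argv[1:]
        cat([a for a in args if a != "-n"], number_lines="-n" in args)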
If someone teaches programming, please try this and come back to me if it works. The idea is that you recreate existing POSIX programs, thereby introducing people to those programs and making them comfortable with them. A few off the top of my head that I believe are quite easy to build:
ls, more, mkdir, sort, uniq, find, pwd, whoami, id, groups, passwd
-
I was asked to fix a critical issue which had high visibility among the higher-ups and was blocking QA from testing.
My dev lead (who was more like a dev manager) was having one of his insecure moments of “I need to get credit for helping fix this”, probably because he steals the oxygen from those who actually deserve to be alive and he knows he should be fired, slowly...over a BBQ.
For the next few days, I was bombarded with requests for status updates. Idea after idea of what I could do to fix the issue was hurled at me when all I needed was time to make the fix.
Dev Lead: “Dev X says he knows what the problem is and it’s a simple code fix and should be quick.” (Dev X is in the room as well)
Me: “Tell me, have you actually looked into the issue? Then you know that there are several race conditions causing this issue and the error only manifests itself during a Jenkins build and not locally. In order to know if you’ve fixed it, you have to run the Jenkins job each time which is a lengthy process.”
Dev X: “I don’t know how to access Jenkins.”
And so it continued. Just so you know, I’ve worked at controlling my anger over the years, usually triggered by asinine comments and decisions. I trained for many years with Buddhist monks atop remote mountain ranges, meditated for days under waterfalls, contemplated life in solitude as I crossed the desert, and spent many phone calls talking to Microsoft enterprise support while smiling.
But the next day, I lost my shit.
I had been working out quite a bit too so I could have probably flipped around ten large tables before I got tired. And I’m talking long tables you’d need two people to move.
For context, unresolved comments in our pull request process block the ability to merge. My code was ready and I had two other devs review and approve my code already, but my dev lead, who has never seen the code base, gave up trying to learn how to build the app, and hasn’t coded in years, decided to comment on my pull request that upper management has been waiting on and that he himself has been hounding me about.
Two stood out to me. I read them slowly.
“I think you should name this unit test better” (That unit test existed before my PR)
“This function was deleted and moved to this other file, just so people know”
A devil greeted me when I entered hell. He was quite understanding. It turns out he was also a dev.
-
So a follow up to my last Mathematica rant:
I have a JSON file made up of arrays of arrays of arrays with the outermost layer containing ~10,000 arrays.
So, my graphing works perfectly the first time for one of my graphs. I fix another unrelated graph, graph the whole file, and suddenly the first one stops working. The file read-in only reads in the array {2,13}. I double-checked the contents of the file; they were as large as always.
Then, I proceed to look for bugs, find none, and decide to restart Mathematica. This doesn't help.
So I go back, find no bugs, and eventually am so fed up that I just restart Mathematica again, no changes.
Suddenly, the array reads in fine. Waiting for the graphs to come out but I think they'll be fine.
WTF, Mathematica? Why must I restart TWICE to make bugs caused by your application go away?
-
I swear GNU/Linux is the pure definition of a badly designed OS/kernel.
1) The separate file system. Of all the established standards, Linux uses ext4, which can only be read by Linux. Not NTFS, not FAT32, you know, the common ones.
2) Unintuitiveness and inefficiency of workflow. Linux is extremely inefficient, especially the CLI versions, where one cannot perform several tasks simultaneously.
3) It's MESSY. The use of a terminal is incredibly uncomfortable, because the text is tightly spaced, and monochromatic in root. When looking at a large chunk of text, my eyes hurt on a deeper level than the physical.
4) It has the most retarded way of handling drives. Why not assign drive letters and names? Why is it /dev/sda1, /dev/sdb1? If I have two drives of the same capacity, I cannot differentiate between them. How am I supposed to know which is my system drive and which is the portable hard drive that I'm formatting? And this stupid disk utility fdisk. What the fuck is that? Why does the command o wipe the device? Why does t select a partition? What the fuck?
5) Stupid naming system. Most CLI commands have deliberately stupid and hard-to-remember names. Also the flags, such as -x, -c or -v, say nothing to me. Reading through the manual in white, tightly spaced monochromatic text is impossible.
6) Error messages that don't make sense. How am I supposed to know what "Error! [err=/dev/null, arch="27xE39Tmx849D" result="success"]" is supposed to mean? Searching for it just strips out the specifics, and I find nothing.
7) The general hype about it being "focused on developers". It's not. It's really not. As a developer myself I find it absolutely painful to write code on Linux. It's sluggish and requires its own set of IDEs and software packages.
People say "Oh you can't write and compile code on Windows". Yes you can. Windows has the exact same set of compilers as Linux, like gcc and g++. Windows has a versatile and powerful command line. It's hidden from the regular user because it's actually user friendly, made for people, not aliens. The fact that you have to download a package manager first to access new ones is what flies over many people's heads.
Go on, start a wreck in the comments.
-
There are a couple of them to list! But to sum up my main ones (biggest personal heroes):
John McCarthy, one of the founding fathers of Artificial Intelligence, credited with coining the term (sometime before 1960 if memory serves right) and a mathematical prodigy. The man based the original model of the Lisp programming language on lambda calculus. Many modern concepts we have in programming were implemented in one way or another from his systems back in the day, and as a data analyst and ML nut..... well, I am a big fan.
Herb Sutter: C++ programmer extraordinaire. I appreciate him more for his lectures and published articles than anything else. Incredibly smart and down to earth and manages to make C++ less intimidating while still approaching it with respect.
Rich Hickey: The mastermind behind Clojure, the Lisp dialect for the JVM. Rich is really talented and his lectures behind his motivations and reasons behind everything he does with Clojure are fascinating to see.
Ryan Dahl: Awww shit, y'all know how it is. The man changed web development for good, both in the backend and the frontend. The concept of people writing their own servers to run their pages was not new, but the Node JS runtime environment made it more widely available by means of a simple-to-use language that was already popular with web developers. I would venture to say that Ryan's amazing contributions to JS made the language better; as it stands, the language continues to evolve, and new features that make it overall better keep being added. He is currently building Deno, a runtime environment for TypeScript, in Rust.
Anders Hejlsberg: This dude was everywhere man....the original author of Turbo Pascal and the lead of Delphi back in the day. These RAD tools paved the way for what would be a revolution in the computing world. The dude is also the lead architect and designer of the C# programming language as well as TypeScript.
This fucker is everywhere and I love it.
Yukihiro "Matz" Matsumoto: Matsumoto-san is the creator of the Ruby programming language. Not only am I a die-hard fan of Ruby, but also of the core philosophies that the man keeps at the heart of his language design: make the developer happy, and the principle of least surprise. I also follow MINASWAN, a term coined by the Ruby community that stands for "Matz is nice and so we are nice" <---- because being cool to others is better than being a passive-aggressive cunt.
Steve Wozniak: I feel as if the man does not get enough recognition... the man designed the Apple II computer, which (regardless of how much most of y'all bitch and whine) paved the way for modern microcomputers. The dude is also credited with designing one of the first programmable universal remotes (which momma said was shitty), but he did it nonetheless.
Alan Kay: Developed Smalltalk and the original OOP way of doing things. Smalltalk as a concept is really fucking interesting. If you guys ever get the chance, play with Pharo, which is a modern Smalltalk. The thing is really interesting, and the overall idea of Smalltalk can be grasped in very little time. The software scales beautifully in terms of project building, and the idea of hosting a program as its own runtime environment and IDE by preserving state through images is just mind-blowing to me. It makes file-based programs feel.... well.... quaint.
Those are some of the biggest dudes for me. I know the list is large, but I wanted to give credit to the people that inspired me the most. Honorary mentions go to other language creators and engineers of course, but the list would be way too long!
-
I felt like being the cause for “that dreaded legacy code“ and wrote 250 lines of C preprocessor macros for generating bitfields in a large header file automatically, with the goal of simplifying and clarifying register access for all peripherals in the end. Then, I found out that SDCC's optimisation for bitfields is absolutely awful (if existent at all), and I don't really want to use these abstractions if they have a performance impact.
Did I deserve that?
-
My laptop battery is absolute rat-shit, it drains half of itself when I try to copy a large file...
-
U guys know anyone who would ask "Can I turn off this power switch?" AFTER they've already turned it off? My mother did that. On a PC that was in the middle of a large file upload over a ridiculously slow internet connection...
-
Data scientists and related devs, how do you handle large datasets?
I was given a .txt file containing 1M+ edges of a directed graph. I tried to analyze it with networkx, but my computer killed the process as it was eating too much CPU/memory.
I would be grateful for any advice!
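Not advice from the original thread, but one common answer: skip networkx's dict-of-dicts (hundreds of bytes of Python-object overhead per edge) and stream the file into a compressed sparse matrix instead. A sketch with numpy/scipy, assuming a whitespace-separated edge list with 0-based integer node ids:

    import numpy as np
    from scipy.sparse import coo_matrix

    edges = np.loadtxt("edges.txt", dtype=np.int64)  # shape: (n_edges, 2)
    n = int(edges.max()) + 1

    adj = coo_matrix(
        (np.ones(len(edges), dtype=np.int8), (edges[:, 0], edges[:, 1])),
        shape=(n, n),
    ).tocsr()

    out_degree = np.asarray(adj.sum(axis=1)).ravel()  # cheap global stats

If actual graph algorithms are needed, libraries like igraph or graph-tool handle million-edge graphs far more comfortably than networkx.
-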
Dev Diary Entry #56
Dear diary, the part of the website that allows users to post their own articles - based on an robust rights system - through a rich text editor, is done! It has a revision system and everything. Now to work on a secure way for them to upload images and use these in their articles, as I don't allow links to external images on the site.
Dev Diary Entry #57
Dear diary, today I finally finished the image uploading feature for my website, and I have secured it as well as I can.
First, I check filesize and filetype client-side (for user convenience), then I check the same things serverside, and only allow images in certain formats to be uploaded.
Next, I completely disregard the original filename (and extension) of the image and generate a UUID instead, using fileinfo/mimetype to determine the extension. I then recreate the image server-side, either at original dimensions or downsized if too large, and store the new image (and its thumbnail) in a non-shared private folder outside the web root, inaccessible to other users, and add an image entry to my database containing the file path, the user who uploaded it, all that jazz.
I then serve the image to the users through a server-side script instead of allowing them direct access to the image. Great success. What could possibly go horribly wrong?
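Condensed, that pipeline looks something like this. Sketched in Python, though the fileinfo/mimetype mention suggests the real site is PHP, so every name here is invented:

    import uuid
    from pathlib import Path
    from PIL import Image

    STORAGE = Path("/srv/uploads-private")  # outside the web root
    ALLOWED = {"JPEG": ".jpg", "PNG": ".png"}
    MAX_DIM = 2048

    def store_upload(tmp_path: str) -> str:
        Image.open(tmp_path).verify()       # cheap sanity check: is it an image?
        img = Image.open(tmp_path)          # verify() spends the handle; reopen
        if img.format not in ALLOWED:
            raise ValueError("unsupported image type")
        ext = ALLOWED[img.format]
        img.thumbnail((MAX_DIM, MAX_DIM))   # downsize if too large, never upscale
        name = uuid.uuid4().hex + ext       # original filename fully ignored
        img.save(STORAGE / name)            # re-encoded server-side
        # ...record name + uploader in the database here...
        return name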
Dev Diary Entry #58
Dear diary, I am contemplating scrapping the idea of allowing users to upload images, text, comments or any other contents to the website, since I do not have the capacity to implement the copyright-filter that will probably soon become a requirement in the EU... :(
Wat to do, wat to do...
-
I'm amazed so many people have "one" favourite editor. I have a whole bunch depending on the situation:
- IntelliJ whenever dealing with Java files
- VS whenever dealing with .NET
- VS code whenever dealing with Salesforce
- Notepad++ when just opening "any old file" to do some quick editing (never been won over to Sublime)
- vim when needing to edit files in a console environment
- nano as the second choice in the above situation when vim isn't available
- Emeditor when needing to open / work with very large files
I've never even remotely found a "one size fits all" solution.
-
So that is why strangers want to send me videos/etc. on WhatsApp then!
Not that I could ever get it to install mind you!
Related links:
https://dailymail.co.uk/news/...
----------------
Jeff Bezos' cell phone was hacked in 2018 after he received a malicious WhatsApp message from the crown prince of Saudi Arabia, months before the National Enquirer exposed his affair, it has been claimed.
The Amazon billionaire received a video file containing malicious code from Mohammed Bin Salman's personal phone number, The Guardian reported on Tuesday.
According to forensic examination of the phone afterwards, the message was sent on May 1, 2018. Within hours, a large amount of data from Bezos' phone was extracted. There is no detail of what kind of data was taken.
-----------
https://indiatoday.in/technology/...
-----------------------
https://dailymail.co.uk/sciencetech...
However, there does not appear to be any reports of the vulnerability being actively exploited in the wild.
-----------------------
Apart from the one above maybe..
So, be careful out there!
-
Looks like copying a large file, e.g. 1GB, from a Remote Desktop Connection will also affect SQL Server performance, somehow slowing down SQL transactions 100000x.
What a new thing to experience 😆
-
Back in grammar school we started programming in TI-Basic on a TI89 Titanium as it was part of math class (calculus and geometry). I didn't really understand much because the teacher thought it was a great idea to start with recursively calculating GCD (and we were in a sort of "linguist profile", nobody had ever touched a line of code in their lives before). I still liked it though and by some coincidence I got an old Win95 compaq notebook to play with from a friend.
I started playing around with the CMD prompt and batch files and could apply some of the things I had learned on the TI, like GOTO or If statements. I still didn't know what I was doing of course, and so it happened that I used the > file pipe when trying to compare two values. Suddenly there was a file with some code fragments and I started to get what I had done. I put the file pipe into an endless GOTO loop and was amused how those few lines filled up the whole desktop with nonsense files. I went on to refine this a little so I could control it with another file that acted as a kill switch when present. Over the next weeks I played some more with it and made it write out and start another batch file that would check whether the original script was still there and recreate it if not.
That notebook was so large and heavy I could not bring it to school, so I wrote all code by hand on paper and typed it in when I got home, that way I could still code in class when I was bored and no one would notice.
So my first ever "program" that I wrote myself was some lousy malware.
-
Sooooo this is the thing.
For a stupid fucking project at work we basically have to scrum manage a bunch of individual components on a rather large web app.
We start with the html and css and js bs and we all have to work on different sections of one page at a time. Large blocks right? Ok cool.
Originally I suggested building everything inside individual PHP files and then stacking them up with require(). As fucking simple as fucking that. Except that the manager does not have PHP on her PC. The other two developers don't either. I am the only one that fucks with PHP OUTSIDE our fucking servers.
Go fucking figure... the lead developer does not fuck with PHP outside the servers..... man.
So, because I knew it would be a shitstorm over something as basic as installing, I dunno... fucking XAMPP, my manager said she needs a different solution.
Fuck it... fine... whatever. I know Go. So I made a fucking server which, once fired up, lets you just code the templates and paste them where they need to go. Docs and everything.. a sane folder structure and everything, and a fucking pipeline for the assets and everything. I would have thought that shit was good enough, but I even added a cmd tool that merges all the fucking HTML files together into one HTML file with all the shit included.
All in Golang. It works, it's fast, and I can just give them the fucking folder with the exe and it will work.
I dunno if this was the best way to do it. But it took me maybe 20 mins to do it and it works.
I would have expected our manager to be impressed, but she legit did not give two fucking shits about the fact that one of her developers was able to create this mini server for a static-site shitstain project in 20 minutes.
Man, I don't want praise. She thinks that jQuery is the best thing in the world, so I don't expect much. But shit, man....... a better reaction would have been nice. She basically went "meh, ok, as long as it works".
I also showed them a demo of a Flutter project to replace the shitty-ass webview-filled school app that they have for Android and iOS. Shit is native and it looks beautiful. Ask me what she said.
Go on, fucking ask me.
She said that if it was going to take me much time to continue on it, she would rather leave it to the third-party vendor that currently makes the app.
I told her that their shitty app costs the school 40 fucking thousand dollars a year and that I could do it in a fucking month, which would also be better since it would raise the salaries of me and the other 2 developers and, more importantly, make us more valuable to the school.
She said that she would think about it because we have a lot of projects.
I
Fucking
Hate
It
When someone fucks with my ability to make more money. I hate it, fam. And I fucking despise being limited by other people.
Fuck this week.
I am never gonna grow in here. Ever. But it pays the bills, so fuck it.
-
The more I learn about web dev, the more I realise the reason it's messed up. There are 2 major problems: the people who created the various important concepts and tools for web dev were 1) working without any collaboration or agreement on the philosophy, and 2) too stubborn about their ideology, I guess.
There is no limit to anything's functionality, and the limits that are "defined" are batshit crazy. For example:
====================================
HTML creator: "I am gonna make a language that provides a skeleton for a web page. It will just have the text and basic markers to let the scripting and styling engines/languages know which text is supposed to be rendered and how.
It won't provide any click or loading functionality.
someone: "So i guess opening a page or loading an image would be handled by JS or other programming language? also, bold , italic or division would be added via CSS?"
HTMLguy : Nah, my html engine would ALSO do that.
someone: what, why? won't that just be stupid and against your philosophy?
HTMLguy: WHAT? am too awesome, can't hear you
w3c , 50 yrs later : sorry can't change this, gotta support the 50 yrs of web dev and billion sites
=================================
CSS guy: I am gonna make the world's best beautifying stylesheet language to provide colors, styling, fonts and backgrounds to a page. every loadings and clicks would be handled somewhere else
Some1: cool, then clicks, hover and running of animation would be handled by JS only
CSSguy :Umm, i guess i could handle those.
Some1 wha-?
CSSguy : Thankyou Thankyou Thankyou for the nobel price!
====================================
JS guy : I am gonna make a god web programming language! It can do everything: add/remove html tags, add styling, control animations, control browser, handle clicks , perform operations, everything!
some1: cool! you must be making a very large programming language with lots of modules.
JS guy: No! i am gonna keep it small. no built in classes and file imports! just use the functions directly. if someone wants the additional lib functionality, install them on your server
some1 : innovative! what's typeof NaN ?
JSguy: shut up.
-
FX [ Buys something on Amazon.. ]
FX [ Decides to leave feedback / review... ]
--------
We apologize but this account has not met the minimum eligibility requirements to write a review. If you would like to learn more about our eligibility requirements, please see our community guidelines.
--------
What would those be?!?
I mean, I've only been a customer since, like, forever!
FX [ Goes to read small print.. ]
----------
Eligibility
To contribute to Community Features (for example, Customer Reviews, Customer Answers), you must have spent at least £40 on Amazon using a valid payment card in the past 12 months.
----------
Well, FU then.
The item took 4 months to arrive as it was. (Which I thought might be worth mentioning in the review, along with how the item is actually the same as another item with a different name..)
This item was supposed to be, according to the reviews, superior to the other brand..
Only, they are both made by the same parent company, and even come in the same type and style of boxes!
So, really are "The same"..
So you don't need to wait 4 months to buy the better version.. (Which is oddly cheaper..)
But I can't tell anyone this, because Amazon won't let me..
They come in different sizes (small, medium, large I think.. small for fingernails, larger for toenails, though if you have small toenails, small will do for both! Or larger for fingernails if you have huge fingers..) and lots of different models (with different features..), priced from $10 USD to $30 depending upon which one you want.
Related links:
https://youtube.com/watch/...
> Seki Edge Nail Clippers The Best Fingernail
> Clippers On The Planet? - Hygiene Episode 01
https://youtube.com/watch/...
> TAKUMINOWAZA Stainless Steel Cuticle
> Nipper Nail Clipper Handmade in Japan
Shows you a good selection in relation to hand size.
I went for the G-1200 (pictured) as it has a nail catcher, but I wasn't sure if it would get in the way of cutting toenails, or if the nail catcher bit was removable or not.
It doesn't get in the way, and is removable.
It doesn't work very well in practice to catch nails mind you..
The other model is the G-1113, similar, no nail catcher, less plastic.
Both come with a really good nail file part, so you don't have to lug half a brick around to finish your nails with..
Only took me 50 years to hear about them..
This has been a public service announcement..
-
A continuation of the worst idiot I ever worked for, on possibly the worst project in the world. (The guy who said YouTube watching doesn't cost data, downloading the videos offline does.)
Guy sends me a template for a patent application.. I ask him why, and he's all secretive until he takes me into a meeting with the patent officers of the organization to reveal his grand plans.
Here goes his idea. He wanted to file a patent for a sonar made for large vehicles in India. His idea was that in India people tend to overtake buses while the buses are turning, and get run over by the large vehicles. True to some extent, but a completely overkill solution for a minor issue that could be solved by educating the masses. I try to explain this to him, and he's pissed off. He starts throwing random, made-up stats at me, saying 2000 people die every day on every street. I'm like WHAT??? I look at the patent officer, and he gives me that "don't look at me dude, I'm just here for any questions about the patent process" look. He's busy doodling in his notebook while I try everything possible to invalidate the stupid idea my client has barfed all over the meeting room and the attendants. I even bring out the technical challenges, leaving aside the practicality of the nonsense. I ask him how to distinguish between a pedestrian, a parked vehicle, a dog, a cow.. to which he responds with an on-the-spot, thoughtless answer: heat signatures!! In 5 minutes we went from sonar to heat maps, in a tropical country such as India.. He now wants a hybrid solution.
He was about to start yelling when I caved in, on the condition that I'd want nothing to do with the idea after finishing the patent application.. Made up some document and sent it to the asshole, only to never hear about it again.. Thank god for that.. R&D my ass..
-
Time for a rant about shitstaind, suspend/hibernate, and if there's room for it at the end probably swappiness, and Windows' way of dealing with this.
So yesterday I wanted to suspend my laptop like usual, to get those goddamn fans to shut up when I'm sleeping. Shitstaind.. pinnacle of init systems.. nope, couldn't do it. Hibernation on the other hand, no problem mate! So I hibernated the laptop and resumed it just now. I'm baffled by this.
I'll oversimplify a bit here (but feel free to comment how there's more to it regardless) but basically with suspend you keep your memory active as well as some blinkenlights, and everything else goes down. Simple enough.. except ACPI and I will not get into that here, curse those foul lands of ACPI.
With hibernation you do exactly the same, but on top of that, you also resume the system after suspending it, and freeze it. While frozen, you send all the memory contents to the designated swap file/partition. Regarding the size of the swap file, it only needs to be big enough to fit the memory that's currently in use. So in a 16GB RAM system with 8GB swap, as long as your used memory is under 8GB, no problem! It will fit. After you've moved all the memory into swap, you can shut down the entire system.
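If you want to sanity check that rule of thumb yourself, here's a minimal Go sketch (mine, not from any distro tooling; the field names are the standard /proc/meminfo ones, and the kernel can actually fit more than this since it compresses the image and drops caches first):
package main

import (
    "bufio"
    "fmt"
    "os"
    "strconv"
    "strings"
)

// meminfo parses /proc/meminfo into a map of field name -> value in kB.
func meminfo() (map[string]int64, error) {
    f, err := os.Open("/proc/meminfo")
    if err != nil {
        return nil, err
    }
    defer f.Close()
    m := make(map[string]int64)
    s := bufio.NewScanner(f)
    for s.Scan() {
        // lines look like "MemTotal:       16333852 kB"
        fields := strings.Fields(s.Text())
        if len(fields) < 2 {
            continue
        }
        if v, err := strconv.ParseInt(fields[1], 10, 64); err == nil {
            m[strings.TrimSuffix(fields[0], ":")] = v
        }
    }
    return m, s.Err()
}

func main() {
    m, err := meminfo()
    if err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
    used := m["MemTotal"] - m["MemAvailable"] // rough "currently in use" figure
    fmt.Printf("in use: %d kB, free swap: %d kB\n", used, m["SwapFree"])
    if used <= m["SwapFree"] {
        fmt.Println("a hibernation image should fit in swap right now")
    } else {
        fmt.Println("not enough free swap to hibernate right now")
    }
}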
Now here's the problem with how shitstaind handled this... It's blatantly obvious that hibernation is an extension of suspend (sometimes called S3, see e.g. https://wiki.ubuntu.com/Kernel/...) and that therefore the hibernation shouldn't have been possible either. The pinnacle of init systems.. can't even suspend a system, yet it can hibernate it. Shitstaind sure works in mysterious ways!
On Windows people would say it's a hardware issue though, so let's talk a bit about that clusterfuck too. And I'll even give you a life hack that saves 30GB of storage on your Windows system!
Now I use Windows 7 only, next to my Linux systems. Reason for it is it's the least fucked up version of Windows in my opinion, and while it's falling apart in terms of web browsing (not that you should on an EOL system), it's good enough for le games. With that out of the way... So when you install Windows, you'll find that out of the box it uses around 40GB of storage. Fairly substantial, and only ~12GB of it is actually system data. The other 30-ish GB are used by a hibernation file (size of your RAM, in C:\hiberfil.sys) and the page file (C:\pagefile.sys, and a little less than your total RAM.. don't ask me why). Disable both of those and on a 16GB RAM system, you'll save around 30GB storage. You can thank me later.
What I find strange though is that aside from this obscene amount of consumed storage, is that the pagefile and hibernation file are handled differently. In Linux both of those are handled by the swap, and it's easy to see why. Both are enabled by the concept of virtual memory. When hibernating, the "real" memory locations are simply being changed to those within swap. And what is the pagefile? Yep.. virtual memory. It's one thing to take an obscene amount of storage, but only Windows would go the extra mile and do it twice. Must be a hardware issue as well.
Oh, and swappiness. This is a concept that many Linux users seem to misunderstand. Intuitively you'd think that the swappiness determines what percentage of memory it takes for the kernel to start swapping, but this is not true. Instead, it's a ratio of sorts that the kernel uses when determining how important the memory and swap are. Each bit of memory has a chance to be put into either depending on the likelihood of it being used soon after, and with the swappiness you're tuning this likelihood to be either in favor of memory or swap. This is why a swappiness of 60 is default most of the time, because both are roughly equally important, and swap being on disk is already taken into account. When your system is swapping only and exactly the memory that's unlikely to be used again, you know you've succeeded. And even on large memory systems, having some swap is usually not a bad idea. Although I'd definitely recommend putting it on SSD in a partition, so that there's no filesystem overhead and so that it's still sufficiently fast, even when several GB of memory are being dumped in.6 -
My implementation of facebook's haystack storage solution. It's certainly not a faithful recreation, but I think this served my needs better.
The idea is you store all of your files in one large file, and just write down where each of your files starts and ends. This particular implementation I called an indexed haystack because it gives you back an index, sort of like an array.
I was attracted to the idea because it makes the file structure of the server so much more simple, and backups so much easier when you only have a few files rather than a few thousand. Facebook came up with it because it was more efficient to store a million photos all in the same file rather than in a million separate ones.
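A minimal sketch of the idea in Go (not Facebook's code and not my production version; a real one would also persist the index alongside the data file):
package main

import (
    "fmt"
    "os"
)

// entry records where one stored blob lives inside the haystack file.
type entry struct {
    offset, size int64
}

type Haystack struct {
    f     *os.File
    index []entry // index i -> location of blob i
}

func Open(path string) (*Haystack, error) {
    f, err := os.OpenFile(path, os.O_CREATE|os.O_RDWR|os.O_APPEND, 0o644)
    if err != nil {
        return nil, err
    }
    return &Haystack{f: f}, nil
}

// Put appends data to the haystack and returns its index.
func (h *Haystack) Put(data []byte) (int, error) {
    fi, err := h.f.Stat()
    if err != nil {
        return 0, err
    }
    off := fi.Size() // O_APPEND writes always land at the end
    if _, err := h.f.Write(data); err != nil {
        return 0, err
    }
    h.index = append(h.index, entry{offset: off, size: int64(len(data))})
    return len(h.index) - 1, nil
}

// Get reads blob i back out by seeking straight to its offset.
func (h *Haystack) Get(i int) ([]byte, error) {
    e := h.index[i]
    buf := make([]byte, e.size)
    _, err := h.f.ReadAt(buf, e.offset)
    return buf, err
}

func main() {
    h, err := Open("haystack.bin")
    if err != nil {
        panic(err)
    }
    i, _ := h.Put([]byte("a photo, say"))
    blob, _ := h.Get(i)
    fmt.Println(i, string(blob))
}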
There is a 100GB limit to each haystack but that isn't technical, it's just a sensible thing to do.15 -
i'm starting a project where i will have a large amount of audio clips, anywhere from a few seconds to about an hour long, and i need to store them based on which user created them and what group they were created in (so they will be sorted based on two integers). i'll need to concatenate and/or merge the audio files frequently, and i may need to filter which audio i use based on users and time created.
how should i store the audio? i'm pretty sure a database is the best option, but should i consider using the file system? if i shant, should i use mysql or postgres? i know postgres has more types and supports complex queries.
does anyone have experience who can help?8 -
I dug up my old ledger web app that I wrote when I was in my late twenties, as I realized with a tight budget toward the end of this year, I need to get a good view of future balances. The data was encrypted in gpg text files, but the site itself was unencrypted, with simple httpasswd auth. I dove into the code this week, and fixed a lot of crap that was all terrible practice, but all I knew when I wrote it in the mid-2000s. I grabbed a letsencrypt cert, and implemented cookies and session handling. I moved from the code opening and parsing a large gpg file to storing and retrieving all the data in a Redis backend, for a massive performance gain. Finally, I switched the UI from white to dark. It looks and works great, and most importantly, I have that future view that I needed.1
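The Redis side of it really is just gets and sets; roughly this shape, sketched here in Go with made-up key names (the app itself isn't Go):
package main

import (
    "context"
    "fmt"

    "github.com/redis/go-redis/v9"
)

func main() {
    ctx := context.Background()
    rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

    // one entry per ledger line, keyed by date and sequence (made-up scheme)
    err := rdb.Set(ctx, "ledger:2023-12-01:0001", `{"amount":-42.50,"payee":"rent"}`, 0).Err()
    if err != nil {
        panic(err)
    }
    val, err := rdb.Get(ctx, "ledger:2023-12-01:0001").Result()
    if err != nil {
        panic(err)
    }
    fmt.Println(val) // no giant gpg file to open and parse on every request
}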
-
Ugh. That may have been a mistake.
I'm deep in a large effort to refactor my project. It's a one man deal and something I've been working on pretty much every day in some fashion for nearly 10 years (five years ago I started a scratch rewrite to move from a fully CGI server rendered application to a browser rendered asynchronous version built around JS) and that took me three years.
I started this refactor about 8 weeks ago. Turns out I've been tackling the largest modules and progress has been decent. So that's good.
But I got to wondering ... Just how much code is there?
So I whipped up a quick script to do some calculations. Read each file and get a line and word count, skipping empty lines.
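The script boiled down to something like this Go sketch (reconstructed for illustration; the .js filter stands in for whichever extensions are being counted):
package main

import (
    "bufio"
    "fmt"
    "os"
    "path/filepath"
    "strings"
)

func main() {
    var lines, words int
    err := filepath.Walk(".", func(path string, info os.FileInfo, walkErr error) error {
        if walkErr != nil {
            return walkErr
        }
        if info.IsDir() || filepath.Ext(path) != ".js" {
            return nil
        }
        f, err := os.Open(path)
        if err != nil {
            return err
        }
        defer f.Close()
        s := bufio.NewScanner(f)
        for s.Scan() {
            line := strings.TrimSpace(s.Text())
            if line == "" {
                continue // skip empty lines
            }
            lines++
            words += len(strings.Fields(line))
        }
        return s.Err()
    })
    if err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
    fmt.Printf("%d lines, %d words\n", lines, words)
}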
In JS it turns out I have 83,973 lines and 467,683 words.
On the back end, 86,230 lines and 580,422 words.
Average publishing stats say there are about 250 words per printed page.
That means I'm confronting refactoring 1,870 pages of JS. That's the size of several decent sized novels. (I think I've done the equivalent of maybe 400 at this point).
Makes me feel like the walls are creeping in to know how much is left to go ... -
Sometimes we would get a request that involves adding or changing something in a rather large and poorly made codebase which me and my lead have not had the time to change.
This b how shit goes:
* the lead gets a call after an email was sent with apparently only 5 secs of response time (impatient fucks)
* lead calls me in next to his station to listen to the call
* i b listening and shit, not even taking notes and shit, looking all secret weapon and shit.
Texas as fuck.
* lead puts shit on hold and looks at me
Lead: "Allright. You know the codebase as well as I do, what you think?"
Me: pffft gimme 30 mins and Ill whip out yo solution
Lead: we positive on the estimate?
Me: as positive as the Texas Rangers sucking ass but we still love em, fuck the Astros
Lead: there is only room for one team
Me: only one
**fist bump
* goes back to the call:
Lead: yeah its gonna take 2 days at most.
Aaaaaaaaaaaaaaand we do finish them in 30 mins. The trick is in doing it extra fast so we have enough time to fuck around or do some other shit and to make it seem like we do some hard shit. After maybe 6 hours we tell them that we managed to fix it before time.
Texas....as....fuck
Btw me and the lead talk about whatever while we code the stuff, most of the time I do it since my boy has heavy eye problems and I want him to relax. He has been training me a lot in regards to knowing the codebase; before I got here it was only him for two fucking campuses and the man did an outstanding job. My boy got my ass and I got his.
Teamwork, the southern gentleman's way.
Texas.
P.d while coding it he said that one of the file sizes was too big to handle, i said "das what she said" and our female manager said "i heard that".......i could have sworn that she gave me a lil wink. Well damn.8 -
Okay, I really need some help here.
We're building quite a large application that will serve as the backbone for the whole company. We have to implement some sort of role system. We're debating whether to store the roles in the DB or in code (some sort of config file). Personally I don't see any reason to do so. What are your thoughts?13 -
Any other language: Hey fuckface, you can't name this variable by a single letter, tf is wrong with you? use some descriptive shit.
Golang: lmao fuck u
I really find it interesting how we use short variable names for items in Golang. Kinda makes sense when you think about it. Most of these names come up in short methods whose mental model lets you know and remember what you are doing; they even make sense when going through the std lib, in which that shit is all over the place. YET years of going by other languages have made me squint my eyes a bit in frustration every time I see it.
Say for example that a function takes an io.Writer. What would you call the parameter? You could argue that writer would be sensible since it's right there in the signature, but what about when the io.Writer itself is a file or a socket or whatever? Would writer be funny or strange? Nah, fuck it, just w. It makes sense, but x wouldn't. I find these points to make sense even if I don't like them.
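To make it concrete, here's a toy example of my own in the same style the std lib uses; in nine lines, w, r and zw are all the context you need:
package main

import (
    "compress/gzip"
    "io"
    "os"
    "strings"
)

// gz copies everything from r to w, gzip-compressed.
func gz(w io.Writer, r io.Reader) error {
    zw := gzip.NewWriter(w)
    if _, err := io.Copy(zw, r); err != nil {
        zw.Close()
        return err
    }
    return zw.Close()
}

func main() {
    if err := gz(os.Stdout, strings.NewReader("large file, tiny names")); err != nil {
        panic(err)
    }
}
Blow gz up to a few hundred lines though, and w and r start to hurt; the convention only holds because the functions stay small.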
Now, would this practice be acceptable in C? You are supposed to write the same kind of modular code in C, composing large functionality out of separate units of code, yet I am sure this practice of single-letter variable names is something C engineers dislike greatly.
Are Go devs just doing this out of blind love for their preference in languages? And how would this work if mfkers add generics to Go (I hope not; Go is simple enough to understand if you extend functionality through the empty interface, but that is a preference of mine as well)
The more I use Go the more I like it to be honest, I think the code looks ugly syntactically, but that is subjective as all hell and based on my constant preference for a language to look like Ruby, which even though it might not be everyone's cup of tea it remains to my eyes as the most beautiful language in existence, again, an obvious personal preference.19 -
I once had to fix a webservice endpoint another developer added that accepted any file from the public internet and loaded it directly onto an NFS file mount with the rest of the site's image assets and then inserted a record of the file into SQL via a hand-stitched query with parameters from the endpoint.
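For reference, the hand-stitched part at least has an easy fix; a hedged Go sketch of a parameterized insert (table, columns and driver are invented for illustration, and the original wasn't Go):
package main

import (
    "database/sql"
    "log"

    _ "github.com/go-sql-driver/mysql" // driver choice is arbitrary here
)

// saveUpload records an uploaded file; placeholders keep the
// user-supplied values out of the SQL text entirely.
func saveUpload(db *sql.DB, name, mime string, size int64) error {
    _, err := db.Exec(
        "INSERT INTO site_assets (file_name, mime_type, byte_size) VALUES (?, ?, ?)",
        name, mime, size,
    )
    return err
}

func main() {
    db, err := sql.Open("mysql", "user:pass@tcp(localhost:3306)/shop")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
    if err := saveUpload(db, "boots.jpg", "image/jpeg", 123456); err != nil {
        log.Fatal(err)
    }
}
That still leaves the "accepts any file from the public internet" part, which no query style can fix.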
I was working for a large enterprise company at the time... I was very disappoint. -
Fridays..
I was sitting in the office today, reading some documentation before leaving for the weekend. Suddenly my co-worker next to me shouts fuck.
He is one of our database engineers, and he got notified that one of the prod db servers was at 100% and its storage at 95%, out of nowhere. So he checked it: someone else had started a comparison between two large datasets.
Line by line.
Logging every cell result in a log file.
That resulted in an insanely fast growing log file, a few hundred gigabytes in size.
So he looked up who did that stuff and tried to call him, but it seems the other dev left right after starting his crappy script.
So he had to get his private number to call him, so they could kill that script without breaking the prod db… -
When I first started down the path to becoming a developer, I was a "business analyst" managing our department's reports. They came from a daily query run in MS Access, kicked off by a scheduled task, and were emailed out to all the managers including the VP of the entire business unit. I created
views in the database and sent out the same spreadsheet with the view in Excel daily, since management didn't want "change". Granted, this was at a large health care company in the US that didn't want to invest in a real dashboard for their reports. The only thing that changed in the daily email and file was the file name with the current date. I left the company a while ago and recently applied for a similar position for the shits and gigs. Interviewed with the IT manager, and three years later they're still using the same Excel macro I wrote.2 -
!rant - well maybe
I really wonder what the end product is going to be for Deno and TypeScript when it comes to managing dependencies. Thus far the general idea is to have a deps.ts file in which the required dependencies are fetched through a URL, cached into the project and then imported from that file onwards.
This seems interesting to me, and I would venture to say that it eliminates some of the pain points of running Node applications (we all know the dread caused by overly large node_modules folders), but would y'all say this is the right approach? Rather than stopping people from generating a large pool of dependencies, the issue would just persist, with the dependencies coming from the internet during runtime rather than living in the file system of the application.
Either way, I still remain a big fan of Ryan Dahl and his creations and can't wait to see Deno stable enough to test out on a couple of projects.2 -
I need help!
I have recently scraped A LOT of news article data and formatted it as training data. This adds up to 5.6 GB. Where can I upload it for people to access, as I think it's quite useful? I have tried GitHub LFS (Large File Storage) but topped out my account at 1 GB. Also, I don't really want to pay for hosting.2 -
I recently refactored the horrible main.js of one of our clients. I didn't even know you could fit so much shit in "just" 700 lines of code (yes, it's really that big...). After 3 hours full of swearing and grinding teeth about this piece of shit, I was finally done and tested it.
It was so incredibly satisfying to see the page loading twice as fast! -
Ideas I've had over the years that could pan out and be useful:
SMS-DB: Stands for SMS-Data Burst. Used to allow those with low cell signal or no data plan to transfer data between a phone and some client via the standard SMS text space. Would be slow, but would act kinda like dial-up over SMS (as mobile lines are compressed on all service levels, even LTE, so traditional dial-up wouldn't work!) I have a general idea on how packets would be laid out, but that's about it so far...
everything2PNG: Allows one to transpose any file's data into a PNG at 3 bytes per pixel (full color RGB), which allows for a "compression" of sorts (about 91 to 93% on preliminary tests) AND allows further, more efficient compression of the resulting file. (Plus... it's just kinda cool to see files transposed as PNGs.) I actually have a simple transposer to go to PNG, but can't yet go back. Large files (around 600MB) use upwards of 4GB with efficient paging and other optimizations via NumPy so far, so it's not *viable* yet, but it's coming along nicely.
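The forward direction looks roughly like this stripped-down sketch (in Go purely for illustration; the actual transposer is the NumPy thing mentioned above, and a real one must record the original byte length, say in a header pixel, so the padding can be undone):
package main

import (
    "image"
    "image/color"
    "image/png"
    "math"
    "os"
)

// bytesToPNG packs raw bytes into an RGB square, 3 bytes per pixel.
func bytesToPNG(data []byte, out string) error {
    n := (len(data) + 2) / 3 // pixels needed; the last one may be padded
    side := int(math.Ceil(math.Sqrt(float64(n))))
    img := image.NewRGBA(image.Rect(0, 0, side, side))
    for i := 0; i < n; i++ {
        var rgb [3]byte
        end := i*3 + 3
        if end > len(data) {
            end = len(data)
        }
        copy(rgb[:], data[i*3:end])
        img.Set(i%side, i/side, color.RGBA{rgb[0], rgb[1], rgb[2], 255})
    }
    f, err := os.Create(out)
    if err != nil {
        return err
    }
    defer f.Close()
    // PNG's own DEFLATE pass is where the "compression" comes from
    return png.Encode(f, img)
}

func main() {
    data, err := os.ReadFile(os.Args[1])
    if err != nil {
        panic(err)
    }
    if err := bytesToPNG(data, os.Args[1]+".png"); err != nil {
        panic(err)
    }
}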
RPi-GPIO Interconnection Bus: A master/slave or round robin method to allow for Raspberry Pis to communicate using GPIO, which can help free up network bandwidth in RPi cloud computing clusters. At most, this'd allow for 4 bits used for pushing to the GPIO "bus", and 4 bits used for pulling from the "bus". 8 pins total are usually unused minimum, so either 3 or 4 pins for upload, 3 or 4 for download, and potentially 1 or 2 for commands, general non-data communication, etc. I made a version of this concept using Round Robin for a client, but it was horribly slow. (I also don't have distribution rights for the code, so i'm working from scratch.) Definitely doable. -
Be me
>run kubeadm join phase control-plane-prepare
>get error: {Large Error} {Missing conf file...}
Please use 'kubeadm join phase control-plane-prepare' to generate {conf file}
>mfw
I called you to generate that file. Could you please do your job -
Working with someone who made repetitive css code like this:
#home {
x
}
#blog {
y
}
where x and y have exactly the same child class structure with small differences, making the css file somewhat large.2 -
Should I modify the file, or copy the file?
It is already in use for a similar, but different, case.
Duplicated code vs. doing one thing per class:
a large refactor is needed for this one.
#codestruggles2 -
Last job and current job I got mostly the same way. Current job was done slightly more effectively.
Here is what I did both times:
* Each day I checked all the job sites for developer jobs in the locations I was willing to travel to. I made bookmarks to various search pages so I could quickly see the results.
* I regularly searched for websites of any IT companies or large corporations that had offices in those locations. I bookmarked these and would check each day to see if they had job openings on their websites.
* Every job I applied for I made a folder with the date and job description.
* Inside the job folders I made a notes.txt file with the wording of the job and links to the ad. I googled the company and added notes like peoples names, etc. to these notes files.
* For every job I made minor alterations to my resume to make sure it aligned with the job ad and copied it to the job folder
* I created another text document called cover_letter.txt which had a written letter describing all my experience that matched with the job ad
* Where possible I would call and speak with someone to get more detail about the job and updated the letter and resume accordingly
* Finally I would email or post the letter and resume
Using this method I was able to apply to several jobs every day and I was able to reuse and improve on the letters as the weeks went by. Also since I applied for a lot of jobs when someone replied I had the job ad available to look at.
For both the last and current jobs I moved countries. The difference was that the previous time I moved first and then started looking, while for my current job I started looking before I moved. For the current job, employers seemed to welcome my situation and I had several job interviews lined up for after my arrival. I felt it put me in a better light, since I was essentially unavailable until my arrival date, compared to before, when I was unemployed, looking, and getting desperate.
For the job I have now, I was interviewed over Skype while overseas and then in person the day after landing in the country. They quickly told me I would be hired. It seemed good, so I canceled the other interviews. Sorry, no exciting circumstances.1 -
Having spent days wrestling to get an old phone working just enough for my mum to use it, and before I waste any more time trying to update the OS on another one.. does anyone have suggestions for a replacement, available now or coming soon, that fits the following requirements:
Costs less than $100 (Either new or second hand.)
Clamshell design, so screen is protected inside.
Physical QWERTY keyboard, with separate number row and backlit with large keys.
2G or 3G network capable.
Ideally bug free..
I just spent ages getting a simple thing like the ringtone to work correctly as it wouldn't let you choose the one you wanted, and the USB to PC link wasn't working, so in the end had to connect it to the internet via the wifi router, and download the working ringtone file off my own website..
I tried updating the OS, but just got errors, which I suspect is because I was using W10, and the updating only works in XP..
And I haven't got around to installing XP again on my backup PC..16 -
I hate when programming books have shit code examples.
Just came across these, in a single example app in a Go book:
- inconsistent casing of names
- ignoring godoc conventions about how comments should look (see the sketch after this list)
- failing to provide comments beyond captain obvious level ones
- some essential functionality delegated to a "utils" file, and they should not be there (the whole file should not exist in such a small project. If you already dump your code into a "utils" here, what will you do in a large project?)
- arbitrary project structure. Why are some things dumped in package main, while others are separated out?
- why is the db connection string hardcoded, yet the IP and port for the app to listen on are configurable from a json file?
- why does the data access code contain random functions that format dates for templates? If anything, these should really be in "utils".
- failing to use gofmt
And these are just at first glance. Seriously man, wtf!
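For the comments point specifically, the convention being ignored is tiny: a doc comment is a full sentence that starts with the name it documents. A made-up sketch of what the book should have modeled:
// Package store reads and writes the app's saved articles.
package store

import "time"

// Article is one saved news article and when it was fetched.
type Article struct {
    Title   string
    Fetched time.Time
}

// Recent returns the articles fetched within the last d.
func Recent(all []Article, d time.Duration) []Article {
    cutoff := time.Now().Add(-d)
    var out []Article
    for _, a := range all {
        if a.Fetched.After(cutoff) {
            out = append(out, a)
        }
    }
    return out
}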
I wanted to check what topics could be useful from the book, but I guess this one is a stinker. It's just a shame that beginners will work through stuff like this and think this is the way it should be done.2 -
Out of the Operating Systems, I think Windows has the best file moving system.
Yes, you're unable to transfer files if they're currently in use, but you can start new file moving jobs in the middle of a current file moving process, and as of Windows 7, files that are unable to be moved for whatever reason don't cancel out the entire move.
Linux is next best because files will move regardless of if they're in use or not, and a file that is unable to be moved for whatever reason also won't cancel the entire move. However, if a new file move process is started, it will pause until the current move job is completed, which is a pain if moving a lot of large files.
Mac is the worst. One failed move results in cancelling the entire process. If a file would be duplicated or can't be moved and you cancel that specific move, you have to start it all over again. There's no way to skip just that file; the whole move starts over.4 -
For me that would be Proxmox. I know, people like it - but for no apparent reason it decided to nuke half my ZFS datasets in a pool, with no logic behind it whatsoever. All disks were tested, all came out good. Within the same pool there were datasets that were lost and some that remained.
I really don't get it. Looking at Proxmox' source code, it's more or less the command line tools and then there's the web interface (e.g. https://github.com/proxmox/...). Oh and they have the audacity to use their own file extension. Why not I guess?
Anyway, half my data was gone. I couldn't tell how or why or what the fuck even happened there. But Proxmox runs Debian underneath and I've been rather pissed about Proxmox' idea of "don't touch the host system aaa" for a while at that point. So I figured, fuck it I'll just take pure Debian then and write my own slightly better garbage on top of that. And as such the distribution project was born. I've been working on it for a little over a year now. And I've never had such issues again.
I somewhat get the idea of "don't touch the host" now, but still not quite. Yes, the more you do in the containers, the better. And the less you do on the host in terms of reconfiguration, the longer it will stay alive. That goes for any system: more reconfiguration usually means less stability and a system that's harder to replace. But sometimes you just have to work from the host. Like, say, migrating a container between hosts, which my code can do. You can't do that from a container, at all. There are good reasons to work with the host. Proxmox doesn't tell you that. Do they expect their users to be idiots? Only enterprise sysadmins amirite?
So yeah, that project - while I do take inspiration from it in mine - I don't like it. It's enterprise, it has the ZFS and the Ceph and the LXC and the VM's - woohoo! Not like anyone could implement that on a base Debian system. But they have the configuration database (pmxcfs), a distributed configuration database a couple of MB in size and capped there, woah!
Ok sure it isn't Microsoft or IBM or Oracle or whatever, and those are definitely worse. But those are usually vendor lock-ins.. I avoid those on that premise alone :)3 -
Background: We switched from just simple old PHP and JS using notepad++ to PHPStorm and its infinite configurables, Symfony 4, Twig, Composer, Doctrine, Yarn, NPM, Bootstrap, ( thank the stars we didn't try to add Docker in with all this ), any other junk I'm missing here? Then upgraded to Symfony 5.
Symfony's autowiring: madness behind the curtains. I get frustrated about when and where I can just magically inject these dependencies or use config variables, you know, like the ones you define in service.yaml. Hmm, "service".yaml. In a controller you can say getParameter() but in a service you have to inject the parameter, FROM THE "SERVICE".yaml!!! Autowiring drives me nuts. Ok, so we can supply dependencies using the constructor, that's great! Within a controller you never have to instantiate the object you're passing to the constructor (autowiring handles that). That's cool, weird when you try to trace it for the first few times, but nice I guess. Feels like half-assin' it. What bugs me here is that it only works in controllers... I guess out of the box.. I'm not even sure. To get that feature to work for services you have to make some yaml edits. Right? Maybe? Some of the Symfony tutorials have you code up some junk then trash it. Change config then wipe that out and do X instead... so I have no idea what "out of the box" for Symfony really is.
Found this cool article that describes my frustrations in better terms and seems like a good resource to learn about autowiring. I need to continue my yaml wizardry classes. https://alanstorm.com/symfony-autow...
.....And on to YAMLs, or CSS, or JS or any other friggin' change you make to a file anywhere... Make a change, reload page, nothing... nope you have to do some hidden cheat combo of yarn dostuff -> cache:clear -> cache:warmup -> cache:cache:the:cache ... I really really hate this crap. Maybe I'm too old school for all this junk. It was simple with pure PHP. Edit code, push file, reload page, and oh look it changed! Done. So happy! Ok, Ok. Occasionally the js or css might get cached by the browser and you have to ctrl/f5 or Shift/f5 .. one of those. With this framework there's just so much more that you have to remember to do get some new feature of your site loaded.
Now, I totally get wanting to use some type of entity framework, but I feel like my entire world turned backwards. Designing tables using something like MySQL Workbench made sense. I can see all the columns and datatypes right there as I'm building them. From what I've experienced now with Symfony/Doctrine, you have to make an entity, get a shit-ton of questions lobbed at you, and if it's a relation field you have to really have a clear idea of the cardinality up front. Then we migrate that to the database. Carefully read through the SQL if you really really just want to use migrations:migrate in Prod. That alter table could cost you some downtime if your table is large.
Some days man.... -
I've been working for so long with API integrations, and one part of that is security. We perform SSL key exchanges for 2-way verification, and a large percentage of those partners provide me with their own pkcs12 file which contains their private AND public keys! What's the sense of the exchange!? I think they implement it just to boast that they "know" how SSL works,
-
VSCode. I used to be a WebStorm guy, but at one point I found out that I could do like 85% of the stuff in VSCode, and switched over. Things I still kinda miss from the JetBrains ecosystem:
- the elaborate refactoring
- the built-in navigation across the file and the project
- the really clever expand select and go to open/closing bracket (VSCode is kinda getting there, but for expand select it honours camel case words and that can't be turned off, it's weird with HTML files with inlined JS or CSS; for bracket jumping it must rely on an extension)
- the way that everything within the UI is predictable and navigable with keyboard only (tried opening a dropdown in VSCode without having a specific keybinding for that specific dropdown? In WebStorm it was Alt+Up/Alt+Down for any dropdown that has focus IIRC)
- the visual way of changing a colour theme (in VSCode you have to guess what is what before modifying a value; by the way this is an idea for an extension that I might research)
What I like about VSCode:
- the speed (although it can get slow with large files; on the other hand JetBrains IDEs are not that slow except for the startup, given that you're not working on a potato, but here we are)
- its extensibility and very active extension development (and the fact that it's rather easy to write your own extensions, although I haven't benefited from that very much)
- the ease of syncing settings (the Settings Sync extension and now the built-in mechanism introduced I think earlier this month)
- it's free (so I don't have to pay for it myself or nag to my employer to issue me a license)
I've tried Sublime and it's hands down the fastest thing I've seen (it can open a 100 MB text file on the shittiest computer you can find and edit it efficiently); the problem is that it's not so rich in extensions. I've tried vim, nano and whatnot, but I'm far from that, just not my cup of tea. I'm okay with the occasional file edit while SSH'd somewhere, but that's all.
In an ideal world we'd have something like Sublime's performance with VSCode's ecosystem and JetBrains', well, brains... -
Writing a large file/making lots of edits, going to save, and then realizing you typed "nano some_system_file" instead of "sudo nano some_system_file"
-
Hi guys, if you are a front end dev (especially a react dev) please read this and share your thoughts.
I recently started with react.js. But I don't like the idea of nesting components. I know it's too early to talk about it, as I'm not even halfway through the tutorials, but I'm losing motivation to learn react.js.
This never happened to me before. I've learned a few frameworks in the past, Django and CodeIgniter. They follow MVC/MVT architecture, and writing code in them looks cleaner and simpler.
In react, JSX is confusing at first. You have to read the same line twice or thrice to understand it. I'm not saying JSX is bad, but it's not readable enough.
In the early lessons I learnt that in react everything is a component, and every component comes under one root component. Don't you guys think this will get messy for a large application? You are dealing with a number of nested components from one file to another.
I'm not against react. But the way react forces you to write code is not something I enjoy. Let me know your thoughts. Maybe I'll get some kinda booster to continue with react.1 -
I am working on partitioning my life and getting my tech stuff and online life organized. Partially fun, partially dread. Still one of the better things I'm dealing with right now.
Tech stuff mainly includes desktop PC (Qubes OS), network (to be driven by openwrt) and smartphone (already running Lineage OS, but I want to build my own LOS). This is the fun part. I want to add a NAS, but I'm too cheap for a proper one (at least for my >20TB media).
Furthermore offline stuff: Remove clutter, get analog documents properly organized (with a sustainable system) and possibly digitalized. I already have maybe half of the things I own in boxes each with a specific purpose (e.g. audio cables, network cables and game controllers each have their own box). Can be tiresome, but it's easy to see a progress and that makes it quite okay.
Online life: That's a big one. A large chunk is email and the hundreds of website accounts. I have them in a keepass file, but all running under the same address. Unfortunately I need to have a Facebook account for some purposes, but I'd like to start over with a new one. Not so easy when you have to transfer group admin privileges though; when I tried last time I tripped some system and the new account was banned. Annoying. -
Github be like:
Want control over your files? Host your own LFS! (And the same goes even for those who pay money for storage packs to boost their LFS storage)
FUCK THIS SHIT... I am a poor student. I also don't have a fucking credit card!! Can't you improve your system instead of asking people to host their shit themselves?
Also, why do they even have access to delete user files??!! They literally asked me either to give the SHA sums of the files I want to restore so they can delete the rest, as one option, or to provide the hashes of the files to be deleted, as another.
And the hashes are not even secret (as the files are in an open repository).
Which means, if you have a large file on a public repository and animosity with a GitHub staff member, BOOM! That file is no more!!10