Search - "virtual folder"
-
What an absolute fucking disaster of a day. Strap in, folks; it's time for a bumpy ride!
I got a whole hour of work done today. The first hour of my morning because I went to work a bit early. Then people started complaining about Jenkins jobs failing on that one Jenkins server our team has been wanting to decom for two years but management won't let us force people to move to new servers. It's a single server with over four thousand projects, some of which run massive data processing jobs that last DAYS. The server was originally set up by people who have since quit, of course, and left it behind for my team to adopt with zero documentation.
Anyway, the 500GB disk is 100% full. The memory (all 64GB of it) is fully consumed by stuck jobs. We can't track down large old files to delete, because du chokes on the workspace folder: thousands of subfolders and no RAM to spare. We decide to basically take a hacksaw to it, deleting the workspace for every job not currently in progress. This of course fucked up some really poorly-designed pipelines that relied on workspaces persisting between jobs, so we had to deal with complaints about that as well.
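For the curious, the hacksaw pass boiled down to something in this spirit. A sketch only - the workspace path and the age cutoff are my guesses, not what we actually ran:

# du couldn't cope, so approximate "not currently in progress" by
# modification age and delete those workspaces wholesale:
find /var/lib/jenkins/workspace -mindepth 1 -maxdepth 1 -type d \
    -mtime +7 -exec rm -rf {} +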
So we get the Jenkins server up and running again just in time for AWS to have a major incident affecting EC2 instance provisioning in our primary region. People keep bugging me to fix it, I keep telling them that it's Amazon's problem to solve, they wait a few minutes and ask me to fix it again. Emails flying back and forth until that was done.
Lunch time already. But the fun isn't over yet!
I get back to my desk to find out that new hires and people who recently got new Mac laptops can't even install our toolchain, because management has started handing out M1 Macs without telling us, and all our tools are compiled solely for x86_64. That took some troubleshooting to even figure out, because the only error people got from Homebrew was that the formula was empty, when it clearly wasn't.
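The usual stopgap, and roughly the direction we looked in (a sketch, not necessarily what we ended up shipping), is to run an Intel copy of Homebrew under Rosetta 2:

# Install Rosetta 2 so x86_64 binaries can run on the M1 at all:
softwareupdate --install-rosetta --agree-to-license

# Install the Intel build of Homebrew (it lands in /usr/local, not /opt/homebrew):
arch -x86_64 /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

# Then force the formula through the Intel brew ("our-internal-tool" is a placeholder):
arch -x86_64 /usr/local/bin/brew install our-internal-tool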
After figuring out that problem (but not fully solving it yet), one team starts complaining to us about a GitHub problem, because we manage the GitHub org. Except it's not a GitHub problem, and I already knew this, because they are a Problem Team that uses some technical authoring software with Git integration while having only the barest understanding of what Git actually does. Turns out it's a Git problem: an update for Git was pushed out recently that patches a big bad vulnerability, and the way it was patched causes problems for them because they're using Git wrong (multiple users accessing the same local repo on a Samba share). It's a huge vulnerability, so my entire conversation with them went sort of like:
"Please don't."
"We have to."
"Fine, here's a workaround, this will allow arbitrary code execution by anyone with physical or virtual access to this computer that you have sitting in an unlocked office somewhere."
"How do I run a Git command I don't use Git."
So, that dealt with, I start taking a look at our toolchain, trying to figure out whether I can just cross-compile it to arm64 for the M1 MacBooks or whether it will be a more involved fix. And I find all kinds of horrendous shit left behind by the people who wrote the tools and who, naturally, left them for us to adopt when they quit over a year ago. I'm talking entire functions in a tool used by hundreds of people that were put in as a joke, poorly documented functions I am still trying to puzzle out, exactly zero comments in the code, and abbreviated function names like "gars", "snh", and "jgajawwawstai".
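The first sanity check on the cross-compile question looks something like this - a sketch assuming the tools are plain C, with a made-up file name:

# Build one fat binary containing both architectures:
clang -arch x86_64 -arch arm64 -o mytool mytool.c

# Confirm both slices made it in (should list x86_64 and arm64):
lipo -info mytool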
While I'm looking into that, the person from our team who is responsible for incident communication finally gets the AWS EC2 provisioning issue reported to IT Operations, who sent out an alert to affected users that should have gone out hours earlier.
Meanwhile, according to the health dashboard in AWS, the issue had already been resolved three hours before the communication went out, and the ticket remains open at this moment, as far as I know.
-
Yesterday, Microsoft showed me once again what it means to "obey".
I tried to install Microsoft SQL Server 2012 on a virtual machine running Windows 7.
The installation center asked me to choose an installation folder for SQL Server.
No matter which folder I chose, the setup replied with the error message "The installation folder is invalid".
So I asked our platform services team whether they had actually given me administrative rights on the VM.
They had. I had full access to the components of my VM.
After a few days I finally realized that I had picked the wrong ISO for the SQL Server installation.
Instead of SQL Server 2012 including Service Pack 3, I had picked the standalone Service Pack 3 update for SQL Server 2012.
So all along, Microsoft was trying to tell me with "The installation folder is invalid" that the setup could not find an existing installation of Microsoft SQL Server 2012 to apply the service pack to.
God damned!!1!
-
Hello and welcome to a presentation in which I will tell you my thoughts on the shortcomings of modern-day computers and programming practices.
Computers are based on a very fundamental and old idea: folders and files. A file is a concrete chunk of data, whereas a folder is a group of files, and the whole thing comes from the real-life concept of paper files and folders, invented in 1898 by a guy called Edwin G. Seibels. Now, it might be quite obvious already that a filing concept from 1898 is not the best way for computers to function in the year 2020, but alas, it is.
Unless, of course, you step into the world of a programmer.
A programmer's world is much different: they use the idea of a data structure, or in simpler terms, an object. An object is just like what you would think of as an object in your head, something with different properties that you can reason about in different ways. Take your mobile phone: it has a battery percentage, it has a screen size, it has free space available. Programmers use these data structures to analyse data very quickly, like finding all phones with a screen size bigger than a certain value.
The problem is that programmers still use files and folders to create the programs that use these objects.
Consider this example.
Let's say you want to create a virtual version of a drink bottle. Consider what properties it will have: colour, volume, height, width, depth, material, and so on.
As a programmer, you can leverage programming features and change the properties of a drink bottle directly: if you want to change the colour, you just say drink bottle "dot" colour equals blue, or red.
But if the drink bottle were represented as a file, all the drink bottle's data would be inside that one file, so you would have to open the whole file, find the line or section that holds the colour data, select it, delete what's there, and type in your new value.
One way to explain this better is to imagine a folder that represents the drink bottle, containing one file for each property I described before: colour, volume, and so on. Now you could just open that folder, find the file for colour (either by looking with your eyes or with a file search in the folder), open it, and edit the value inside. This way of editing objects more closely represents the way programmers, and a program itself, interact with objects inside a running programming language.
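Here is that folder-per-object idea as a literal shell sketch:

# One folder per object, one file per property:
mkdir drink_bottle
echo "blue" > drink_bottle/colour
echo "600 ml" > drink_bottle/volume

# "drink bottle dot colour equals red" becomes:
echo "red" > drink_bottle/colour
cat drink_bottle/colour    # prints: red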
But the thing is, programmers don't use the folder/file way of creating objects and putting them into programs, because it would be too cumbersome. They create one file per object, or put lots of objects in a single file, and when the program runs it creates the objects from that data; when it stops, the objects are gone. So there is no actual link between the object described in a file and the object the program creates from it: if you change the object in your program, the change does not get saved back to the file.
So programmers created databases to house these objects. But there is still a flaw in databases: they are hard to interface with. Mostly they are just used to send data to or retrieve data from programmatically; you can't really browse a database the way you can browse the files on your computer. You can, but database interfaces are not made to be easily navigated the way files and folders are.
As it stands, there is no way to store objects instead of files on your computer and interact with them in complex ways the way programmers can inside the programs they create.
If the idea of an object became standard the way files and folders are standard, I think it would empower humans a great deal to express things far more easily and fluidly than they can today.
Thanks for reading.
-
I continue to internally read and study Smalltalk, in an effort to see where we might have FUCKED UP and gone backwards in terms of software engineering, since I do not believe that complex source-code-based languages are the solution.
So I have Pharo. Nothing too complex, really: everything is an object, yet you have room for building DSLs on top of a simple object model with no issue. The system browser can be opened across multiple screens (morph windows inside a Smalltalk system), in which you can edit your code in composable blocks with no issues. Blocks, a particular feature of the language (think Ruby, for a more modern comparison), give ample room for functional programming. Thus far we have FP and OO (the original, mind you) styles out in the open for development.
Your main code can be executed and instantly ALTER the live environment of a program as it is running; if what you are trying to do is stupid, it won't affect the live instance. Live programming is ahead of its time, and impressive, considering how old Smalltalk is. GUI applications can also be deployed headless (this too is old in terms of how this shit was first distributed). So I can go ahead and package the virtual machine with the entire application into a folder and distribute it across an organization ("but why!!!! that package is 80+ MBs!"). Yeah, cuz it carries the entire virtual machine. But go ahead and give it to the Mac user, or the Linux user: it will run, natively, once it is clicked.
Server-side applications run in similar fashion to PHP in terms of request lifecycles and how session storage is handled, which I find interesting: no additional runtimes, drop it on a server, configure it properly and off you go. But this is common in other languages, so really not that much of a point.
BUT if a user is using your application over a network and you change it and send that change over the network, then the change is damn near instant and fault tolerant, due to the nature of the language.
Honestly, I don't know what went wrong or why we are not bringing this shit to the masses. The language was built for fucking kids; it was the first "y'all too stupid to get it, so here is simple" engine, and we still said "nah, fuck it: unlimited file-system-based programs, horrible build engines and {}; all over the place".
I am now writing a large budget-managing application in Pharo Smalltalk, which I want to go ahead and put to the test soon at my institution. I do not have any issues thus far, other than my documentation help being literally "read the source code of the package system", which is easy as shit since it is already included inside. My scripts are small, my class hierarchies cover on themselves, AND testing is part of the system. I honestly see no faults other than "well....fuck you, I like opening vim and editing 300000000 files".
And honestly that is fine, and other languages may be a better fit for a given selection of problems. But my question stands: why is a paradigm that fits procedural, functional and OBVIOUSLY OO styles, while including an all-encompassing IDE, NOT more famous?
-
I have a small NUC-like machine in my home with an old external hdd connected to it. I use it to run my local gitlab and nextcloud, and to test a few websites I build for the lolz.
If you too have a homelab, whether it's a single Raspberry Pi or an entire room full of racks, you know damn well that everything you have running locally as a web service keeps going until it doesn't, for whatever fucking reason. This time, it was my nextcloud's turn.
The machine runs Arch Linux. I chose it since I already use it on my coding laptop, and being a rolling release means I don't have to manually upgrade to a newer version, risking various fuck-ups and the consequent screaming of profanity.
The downside is that Arch is a bleeding-edge distro, so, despite it being pretty good as far as security is concerned, as updates are pushed out some packages may still require legacy software to work as intended, since obviously not all developers of all packages can release simultaneously.
The problem was that PHP reached 8.2.x but Nextcloud couldn't use anything beyond 8.1, so the suggested solution was to install php-legacy, a package with a set of utilities the cloud could use instead of mainline PHP.
Pretty easy, right? Fuck my life, here we go.
I edited Apache httpd's configuration to link the new libraries and updated every reference in every virtual host that could possibly screw up the web server.
Done.
Then I went on and disabled the mainline php-fpm, created a new systemd unit that would run the legacy executable instead, and afterwards edited Nextcloud's additional configs so they use that.
Done, getting a bit dizzy, but I reboot everything and breathe.
At this point the migration should be complete, but wait: the server returns an error saying that the application is still trying to use PHP 8.2+... wait, what in the sysadmin Christ?
Back to the Nextcloud config. Everything is set, everything else in every other fucking php-legacy and web server file is fine, the old fpm service is disabled, I am confused. And why in the FUCKING FUCK is the new php-fpm unit failing to start at boot with "error 78/config - directory not found"? Hello? Am I being trolled by a shitty dual-core Amazon fake NUC?
Maybe yes, because it turns out the unit referenced a directory on the external hdd, which gets mounted at boot time after the unit itself starts. So nothing much: just a matter of tinkering with cron jobs, a reboot, and at least that one was off my balls.
But why is the server still not responding correctly? Why? WHY?
After slamming my cock on the keyboard here and there, scrolling back through all the config files, I think to myself: hmmm, my gitlab is working flawlessly. Well yeah, I didn't need to install the whole web stack for that; everything was nice and easy, wrapped in a docker container. So why am I even here? Why the fuck am I bothering with all this layered web-app bullshit? Why don't I just run the up-to-date docker image that someone else has already set up for me, back up all the data and reupload it into the application?
Oh joy, you can't imagine, after 3... almost 4 hours of pure computer-touching, the relief I felt at seeing the blue web page with the "welcome to nextcloud" title.
Right now it's copying back all the files, and the external hdd is now linked to include the data folder.
Like really, everything was solved in two lines of bash.
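Something in this spirit, give or take (a sketch - the port, the volume name and the hdd mount point are placeholders, not my actual setup):

# Run the official Nextcloud image, keeping the data folder on the external hdd:
docker run -d --name nextcloud --restart unless-stopped \
    -p 8080:80 \
    -v nextcloud_html:/var/www/html \
    -v /mnt/external/nextcloud-data:/var/www/html/data \
    nextcloud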
I am still fuming, but at least I learned a valuable lesson: if you want a service up for yourself, implement it and deploy it in the most straightforward way you can, giving MAXIMUM priority to already fully-working options that are out there just waiting to be downloaded and used. I swing my scrotal sack on web-app elegance as long as it's MY homelab in MY place.
Eat a fat dick, PHP.
sudo pacman -Rns nextcloud
sudo systemctl disable --now php-fpm-legacy
sudo pacman -Rns php-legacy
sudo pacman -Rns $(sudo pacman -Qdtq)
-
Leaving things out of VCS. My usual folder structure is like this:
- Project name:
|-- env (virtual environment)
|-- Project name (git repo)
\-- (keys, credentials, etc.)
It makes sense, but after a while, more and more important stuff starts piling up in the outer folder (not version-controlled).
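One hedged way out (a sketch with made-up file patterns): make the outer folder the repo root instead, and ignore the bits that must stay out, so at least everything important has one versioned home:

# From the outer "Project name" folder:
git init
cat >> .gitignore <<'EOF'
env/
*.pem
*.key
credentials*
EOF
-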
Ok, so for the past whole day I have been trying to make a vhost work on my brand new laptop, running Ubuntu 16.04 LTS... When I installed the OS, I set up hard disk encryption and, on top of it, user home folder encryption. Don't ask me why I did both.
Setting up a vhost is simple and straightforward; I have done it hundreds, maybe thousands of times, on various Linux distros, server and desktop releases alike.
And of course, as it usually happens, opposed to all logic and reason, setting up a virtual host on this machine didn't work. No matter what I do, I get a 403 (access not allowed).
Everything is correctly set: directory params in the Apache config, vhost paths, directory params within the vhost, all the usual stuff.
I thought I was going crazy. I went back to several live servers I maintain: exactly the same setup that doesn't work on my machine. Googled it, SO'd it, and all I could see was exactly what I had been doing... I ended up checking every single line character by character, in disbelief that I could not find the problem.
And then I finally figured it out, after losing one whole day of my life on it:
I was trying to point the vhost to a folder inside my user's home folder, which is set to be encrypted.
Aaaaaand of course, even with all the right permissions, Apache cannot read anything from it.
As soon as I tried any other folder outside my home folder - it worked.
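In hindsight, one command would have pointed straight at it (a sketch - on Arch-like setups Apache runs as the http user, www-data on Ubuntu, and the path is made up):

# Can the Apache user actually read into the encrypted home?
sudo -u www-data ls -la /home/me/sites/mysite
# "Permission denied" here, while your own user reads the folder just
# fine, means the vhost config was never the problem.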
I cannot believe that nobody has encountered this issue before on Stack Overflow or anywhere else.
-
Has this ever happened to you?
You open up your text editor and start typing commands in its integrated terminal, but the editor simultaneously tries to activate your virtual environment by running a script:
& "c:///project/folder/..."
But since you're in the middle of typing your command, the script call gets mangled in with your input and fails to run, so you've got to do it by hand.
git fet & "c:///project/folder/..."
Unrecognized command "get"7