Search - "cron linux"
-
*Now that's what I call a Hacker*
MOTHER OF ALL AUTOMATIONS
This seems like a long post, but you'll definitely +1 it after reading.
xxx: OK, so, our build engineer has left for another company. The dude was literally living inside the terminal. You know, that type of guy who loves Vim, creates diagrams in Dot and writes wiki posts in Markdown... If something - anything - requires more than 90 seconds of his time, he writes a script to automate it.
xxx: So we're sitting here, looking through his, uhm, "legacy"
xxx: You're gonna love this
xxx: smack-my-bitch-up.sh - sends a text message "late at work" to his wife (apparently). Automatically picks reasons from an array of strings, randomly. Runs inside a cron-job. The job fires if there are active SSH-sessions on the server after 9pm with his login.
xxx: kumar-asshole.sh - scans the inbox for emails from "Kumar" (a DBA at one of our clients). Looks for keywords like "help", "trouble", "sorry" etc. If keywords are found - the script SSHes into the client's server and rolls back the staging database to the latest backup. Then sends a reply "no worries mate, be careful next time".
xxx: hangover.sh - another cron-job that is set to specific dates. Sends automated emails like "not feeling well/gonna work from home" etc. Adds a random "reason" from another predefined array of strings. Fires if there are no interactive sessions on the server at 8:45am.
xxx: (and the Oscar goes to) fuckingcoffee.sh - this one waits exactly 17 seconds (!), then opens an SSH session to our coffee machine (we had no frikin idea the coffee machine was on the network, ran Linux and had sshd up and running) and sends some weird gibberish to it. Looks binary. Turns out this thing starts brewing a mid-sized half-caf latte and waits another 24 (!) seconds before pouring it into a cup. The timing is exactly how long it takes to walk to the machine from the dude's desk.
xxx: holy sh*t I'm keeping those
Credit: http://bit.ly/1jcTuTT
The bash scripts weren't bogus; you can find them at this GitHub URL:
https://github.com/narkoz/...
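For flavor, here's a minimal sketch of how a script like smack-my-bitch-up.sh could work. Everything in it (the Twilio calls, the numbers, the excuse list, the username) is a placeholder; the real scripts live in the repo above.

```bash
#!/usr/bin/env bash
# Hypothetical sketch. Cron entry, weekdays at 9pm:
#   0 21 * * 1-5 /home/xxx/bin/late.sh
# If my SSH sessions are still open, text my wife a random excuse.

EXCUSES=("Crunch time, sorry" "Release night, ugh" "Prod is on fire")

# No active session under my login? Then I'm not stuck at work; do nothing.
who | grep -q '^xxx ' || exit 0

# Pick a random excuse and send it via Twilio (all credentials are placeholders).
BODY="Late at work. ${EXCUSES[RANDOM % ${#EXCUSES[@]}]}"
curl -s "https://api.twilio.com/2010-04-01/Accounts/$TWILIO_SID/Messages.json" \
  -u "$TWILIO_SID:$TWILIO_TOKEN" \
  --data-urlencode "From=$MY_NUMBER" \
  --data-urlencode "To=$WIFE_NUMBER" \
  --data-urlencode "Body=$BODY"
```
-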
So that high level prank from yesterday.
Senior Linux engineer, the fucker.
He somehow installed shitloads of cron jobs onto my system.
Every few minutes it would create a new user with a freaking complicated password. Then it would install an OpenSSH server in case it wasn't installed yet. After that it'd set iptables rules to allow incoming AND outgoing connections on port 22.
That was one badass ansible script though!
I'm not sure what more there is to it, because sometimes when I removed crons, they'd magically appear again later, AND I forgot to check the boot scripts, so I might be fucked again when I get to work today!
Plus side, I finally fully understand cron 😅
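For the curious, the payload described above boils down to something like this. A hypothetical reconstruction; the usernames and the package manager are assumptions.

```bash
#!/usr/bin/env bash
# Hypothetical reconstruction of the prank payload, fired by cron (*/5 * * * *):
# create a user with a nasty password, make sure sshd exists, open port 22.

U="prank$(date +%s)"
useradd -m "$U"
echo "$U:$(openssl rand -base64 24)" | chpasswd    # the "freaking complicated" password

# Install OpenSSH if it's missing (assuming a Debian-based victim).
command -v sshd >/dev/null 2>&1 || apt-get install -y openssh-server

# Allow SSH traffic in AND out.
iptables -A INPUT  -p tcp --dport 22 -j ACCEPT
iptables -A OUTPUT -p tcp --sport 22 -j ACCEPT
```
-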
Every Unix command eventually becomes an internet service:
grep -> Google
rsync -> Dropbox
man -> Stack Overflow
cron -> IFTTT
-
Pranks again today. Mother of God the level of those pranks is becoming high as fuck.
Define high?
Having to debug shit at the system level (cron, firewalling, users, sometimes even digging through logs/dmesg) because weird shit happens all day long.
This is upping my Linux skills a lot though! I love it 😍
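If you ever end up on the receiving end, this is roughly the sweep that kind of debugging involves (all stock commands; the log path assumes a Debian-flavored box):

```bash
# Where prank-level weirdness tends to hide:
crontab -l                                      # your own cron jobs
ls /etc/cron.d /etc/cron.daily /var/spool/cron  # system-wide cron locations
iptables -L -n -v                               # firewall rules
awk -F: '$3 >= 1000 {print $1}' /etc/passwd     # unexpected regular users
dmesg | tail -50                                # recent kernel messages
grep CRON /var/log/syslog | tail -20            # what cron actually ran
```
-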
I've got a confession to make.
A while ago I refurbished this old laptop for someone and ended up installing Bodhi on it. While I was installing it, however, I did have some wicked thoughts...
What if I could ensure that the system remains up-to-date by running an updater script in a daily cron job? That may cause the system to go unstable, but at least it'd be up-to-date. Windows Update for Linux.
What if I could ensure that the system remains protected from malware by periodically logging into it and checking up, and siphoning out potential malware code? The network proximity that's required for direct communication could be achieved by offering them free access to one of my VPN servers, in the name of security or something like that. Permanent remote access, in the name of security. I'm not sure if Windows has this.
What if I could ensure that the system remains in good integrity by disabling the user from accessing root privileges, and having them ask me when they want to install a piece of software? That'd make the system quite secure, with the only penetration surface now being kernel exploits. But it'd significantly limit what my target user could do with their own machine.
In the end I discarded all of these thoughts, because it'd be too much work to implement and maintain, and it'd be really unethical. I felt filthy just from thinking about these things. But the advantages of something like this - especially automated updates, which are a real issue on my servers, where I tend to forget to apply them within a couple of weeks - can't just be disregarded. Perhaps Microsoft is on to something?
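That said, the automated-updates part is the one defensible idea in there. On an Ubuntu-based system like Bodhi it can be as small as the sketch below; the log path is an assumption, and the stock unattended-upgrades package is the saner route anyway.

```bash
#!/bin/sh
# Hypothetical /etc/cron.daily/auto-update for a Debian/Ubuntu-based box.
# Runs once a day; output goes to a log file instead of cron mail.
apt-get update -qq
DEBIAN_FRONTEND=noninteractive apt-get -y upgrade >> /var/log/auto-update.log 2>&1
```
-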
Alright, so my previous rant got a way better response than I expected! (https://devrant.io/rants/832897)
So here's the first project, the one I can't quite seem to get started on :/.
DISCLAIMER: I AM NOT PROMOTING PIRACY, I JUST CAN'T FIND A SUITABLE SERVICE WHICH HAS ALL THE MUSIC I WANT. I REGULARLY BUY ALBUMS. Before everyone starts to go batshit crazy regarding piracy: this is legal in The Netherlands for personal use. I think that supporting the artists you love is very good, and I actually regularly pay for albums and so on, but:
- I want all the music from about every artist in my scene. Neither Deezer nor Spotify has it all, and I'm not going to get them both (they each have about half of the music I want). Their services are awesome, but I'm not going to pay for something if I can't listen to all the music I like; hell, some artists (on Deezer mostly) only have half their music on there, and it's mostly not better on Spotify.
- I'd happily buy all the albums, because I love supporting the artists I love, but buying everything is just way too fucking much. "Get a premium music streaming subscription!" - see the first point.
You can either agree or disagree with me but that's not what this rant is about so here we go:
The idea is to create a command-line program (basically only needs to be called by a cron job every day or so) which will check your favourite YouTube channels (sorry, haven't found a suitable non-Google YouTube replacement yet) every day and look for new uploads. If there are any, it will download them, convert them to MP3 or whatever music format you'd like, and place them in the right folder. Example with a favourite artist of mine:
1. Script checks if there are any new uploads from Gearbox Digital (underground raw hardstyle label).
2. Script detects two new uploads.
3. Script downloads the files (I managed to get that done through the youtube-dl software - Linux only, or also Mac?) and converts them to MP3 in my case (through FFmpeg maybe?).
4. Script copies them to the music library folder but then the specific sub-folder for Gearbox Digital in this case.
You should be able to put as many channels in there as you want. I've tried this with the official YouTube Data API, which worked pretty fine tbh (the data gathering through that API). The ideal case would be to work without the API, as youtube-dl and youtube-dlg do. That's just too complicated for me :).
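For what it's worth, youtube-dl can already cover steps 1 through 4 in a single invocation. A minimal cron-able sketch; the paths and the channels-file format are made up:

```bash
#!/usr/bin/env bash
# Hypothetical daily cron job: grab new uploads per channel as MP3.
# --download-archive makes youtube-dl skip anything already fetched,
# and -x/--audio-format hands the conversion to ffmpeg under the hood.

LIBRARY="$HOME/Music"

# ~/.channels holds lines like: <channel/playlist URL> <folder name>
while read -r url name; do
  youtube-dl -x --audio-format mp3 \
    --download-archive "$LIBRARY/.archive" \
    -o "$LIBRARY/$name/%(title)s.%(ext)s" \
    "$url"
done < "$HOME/.channels"
```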
So, thoughts?
-
I recently started using Linux on my desktop.
I just love how when I see some minor thing I don't like about the operating system I can just change it myself!
Want to remove the menu icon? Simple change to the settings. Want your downloads folder to be clean? Simple cron job which asks you if you want to clean your downloads folder every day.
Man, I love having the freedom to screw up my operating system!
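That downloads-folder job can literally be a one-liner. A sketch, assuming an X session on display :0 and zenity installed:

```bash
# Hypothetical crontab entry: every evening at 18:00, ask before wiping ~/Downloads.
# rm only runs if the dialog's "Yes" button is clicked (zenity exits 0).
0 18 * * * env DISPLAY=:0 zenity --question --text="Clean Downloads?" && rm -rf ~/Downloads/*
```
-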
Typed crontab -r instead of crontab -e; gonna be a long weekend recovering crons from log files.
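For anyone in the same hole: if cron logging is on, past runs can be reconstructed from syslog. A sketch, assuming a Debian-style /var/log/syslog:

```bash
# Each run is logged like:  CRON[1234]: (user) CMD (the actual command line)
grep CRON /var/log/syslog* | grep -oP '\(\w+\) CMD \(.*\)' | sort -u
```
-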
I fucked up. I used the shebang line #!/usr/bin/env python3 in a script that was being run every 5 minutes by a cron job. This generated an email to a system that dropped a file for processing and sent an age email for each file every minute. Because the Linux OS-generated emails didn't contain a keyword, the script exited by design, but I forgot to uncomment the delete-temp-file line. This started on Wednesday before a 4-day weekend. By the time I got in on Monday I was 40GB over my email quota and receiving 2500 emails a minute. I fixed the script and stopped the emails, but now I have to clear out those emails. Here it is Wednesday, and I am deleting 1 MB every 3 seconds. This is painful.
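The underlying gotcha: cron mails every byte of a job's output to its owner unless told otherwise. The usual guard looks like this sketch (paths are placeholders):

```bash
# Crontab guard: silence cron's own mail and keep output in a log instead.
MAILTO=""
*/5 * * * * /path/to/script.py >> /var/log/script.log 2>&1
```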
-
FUCKING SYSTEMD PIECE OF CRAP.
*Punches a wall or something*
Ugh, the newest version of PHP-FPM apparently has a dependency on a systemd package. The package doesn't change the system's init daemon to systemd, but just the fact that it's there, that more and more stuff is becoming dependent on that bloated piece of crap, is driving me crazy.
I hate systemd from the bottom of my soul - not for being a bad piece of software by any means; the systemd environment is quite well fitted together - but for being a monolithic monstrosity that is taking over more and more of the traditionally independent system services.
It would be absolutely fine in my book if it allowed a user or admin to choose which parts of systemd to install, so that, at its core, it would be a mere init daemon.
But noooooo, systemd has to take over cron, the system DNS resolver, home and user management, and I bet that's not the end.
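The takeover is easy to demo on any systemd box. These tools are real, though not all of them will be present on every install:

```bash
systemctl list-timers   # systemd timers: the cron replacement
resolvectl status       # systemd-resolved: the DNS resolver takeover
homectl list            # systemd-homed: the home/user management takeover
```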
GNU/Linux is becoming GNU/SystemD/Linux...
-
Spent over an hour on a shell script that wasn't working properly. When I run it, it works perfectly. Every time cron executes it, it does nothing, not even log an error.
It took me that long to realize that the user the cron was running as didn't have permission to write to my log file... You'd think I'd have realized this when my error scripts didn't log...
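A defensive pattern that would have surfaced this in minutes; the log path and tag are hypothetical:

```bash
#!/usr/bin/env bash
LOG=/var/log/myscript.log   # hypothetical path
# Verify the log is writable before doing anything; if not, leave a trace in syslog.
if ! touch "$LOG" 2>/dev/null; then
  logger -t myscript "cannot write $LOG (running as $(id -un))"
  exit 1
fi
exec >>"$LOG" 2>&1   # from here on, all stdout/stderr lands in the log
```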
(On that note, the Bandit games at OverTheWire have been an awesome refresher for getting back into the swing of Linux - highly recommend.)
-
Most of the faculty on my college's IT engineering department aren't exactly adept with Linux, despite the fact that 10/12 labs in our building run on Ubuntu.
Last week, a really great professor (who doesn't take any classes I can attend) from the Electronics and Communications department and I wrote some bash scripts to automate updates and so on, staying back after college until late evening to try to get the PCs updated.
We'll be trying to use SSH to update as many computers as we can remotely (something like the sketch below), and trying to learn to use cron to automate the whole updating deal.
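Roughly what we're aiming for; the host-list file, the user, and passwordless sudo are all assumptions:

```bash
#!/usr/bin/env bash
# Hypothetical lab-wide update round over SSH.
# ssh -n keeps ssh from eating the rest of the host list on stdin.
while read -r host; do
  ssh -n -o ConnectTimeout=5 "labadmin@$host" \
    'sudo apt-get update -qq && sudo apt-get -y upgrade' \
    || echo "FAILED: $host" >> failed-hosts.txt
done < lab-hosts.txt
```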
I'm learning this stuff on the side, since it's not on my syllabus at all, and the professor isn't even related to the departments that run the labs usually.
We're not getting anything for doing this, the head of my department (who has it in for me) has no idea about this, and nobody else is bothered enough to learn either.
-
Made this one-liner today:
hostname $(curl nsanamegenerator.com | grep body | sed -e 's/<.*;//g' | sed -e 's/<.*>//g')
and added it to my laptop's crontab...
-
I have a small NUC-like machine in my home with an old external hdd connected to it. I use it to run my local gitlab, nextcloud and to test a few websites I build for the lolz.
If you too have a homelab, whether it's a single Raspberry Pi or an entire room full of racks, you know damn well that everything you run locally as a web service keeps going until it doesn't, for whatever fucking reason. This time, it was my nextcloud's turn.
The machine has arch linux running, I chose it since I already use it on my coding laptop and being a rolling release means I don't have to manually upgrade to a newer version, risking various fuck-ups and consequent screaming of profanity.
The downside is that Arch is a bleeding-edge distro, so, despite being pretty good where security is concerned, as updates are pushed out some packages may still require legacy software to work as intended, since obviously not all developers of all packages can release simultaneously.
The problem was that php reached 8.2.x but nextcloud couldn't use anything beyond 8.1, so the highlighted solution was to download php-legacy, a package with a set of utilities which the cloud could use instead of mainline php.
Pretty easy, right? fuck my life, here we go.
I edited apache-httpd's configurations to link the new libraries, updated every reference in every virtual host that could possibly screw up the web server.
Done.
Then I went on and disabled the php-fpm mainline, creating a new systemd unit that would instead run the legacy executable and afterwards I edited nextcloud's additional configs so they use that instead.
Done, getting a bit dizzy, but I reboot everything and breathe.
At this point the migration should be complete, but wait, the server returns an error saying that the application is still trying to use php 8.2+...wait, what in the sysadmin Christ?
Back to nextcloud config, everything is set, everything else in every other fucking php-legacy and web server is fine, the old fpm service is disabled, I am confused, and why in the FUCKING FUCK is the new php-fpm unit failing to start at boot with "error 78/config - directory not found"? Hello? Am I being trolled by a shitty dual-core amazon fake NUC?
Maybe yes, cause it turns out that the unit was referencing a directory on the external hdd, which gets mounted at boot time after the unit itself starts. So, nothing much: just a matter of tinkering with cron jobs and a reboot, and at least this one is off my balls.
But why still isn't the server responding correctly? why? WHY?
After slamming my cock on the keyboard here and there, scrolling back through all the config files, I think to myself: hmmm, my gitlab is working flawlessly. Well yeah, I didn't need to install the whole web stack for that; everything was nice and easy, wrapped in a docker container... so why am I even here? Why the fuck am I bothering with all this layered web-app bullshit? Why don't I just run the up-to-date docker image that someone else has already set up for me, back up all the data and reupload it to the application?
Oh joy, you can't imagine, after 3...almost 4 hours of pure computer-touching the relief I had from seeing the blue web page with the "welcome to nextcloud" title.
Right now it's copying back all the files, and the external hdd is now linked to include the data folder.
Like really, everything was solved in two lines of bash.
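In that spirit, something like the following, with the port, data path and image tag as placeholders:

```bash
docker run -d --name nextcloud -p 8080:80 \
  -v /mnt/external/nextcloud:/var/www/html \
  --restart unless-stopped nextcloud:latest
```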
I am still fuming, but at least I learned a valuable lesson, if you want a service up for yourself, implement it and deploy it as fucking easy straight-forward as you can, giving MAXIMUM priority to already fully-working options that are out there just waiting to be downloaded and used. I swing my scrotal sack on web-apps elegance as long as it's MY homelab in MY place.
Eat a fat dick php.
sudo pacman -Rns nextcloud
sudo systemctl disable --now php-fpm-legacy
sudo pacman -Rns php-legacy
sudo pacman -Rns $(sudo pacman -Qdtq)
-
Having problems with getting user's IP address with PHP.
So basically I made a custom DDoS protection for my linux server.
It works like this: the PHP website gathers the visitor's IP address when they perform a certain action (in this case, registering an account). All visitor IPs are stored in ips.txt, securely, on my website's FTP.
Then my linux server has iptables rules setup in a way where it blocks all traffic except my website traffic.
On the Linux server I have a cron job which pulls the whitelisted IPs every 5 minutes from the website's FTP and then whitelists them all in iptables (a sketch of that puller follows below).
That way only the visitor IPs (of those who registered an account on my website) are whitelisted on my Linux server.
In case of a DDoS attack, all traffic is dropped except for the whitelisted visitor IPs gathered from the website's ips.txt.
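For reference, the puller side can stay tiny. A sketch, assuming the ips.txt URL and leaving the default-drop rules to be managed elsewhere:

```bash
#!/usr/bin/env bash
# Hypothetical 5-minute cron job on the Linux server (URL is a placeholder).
curl -fsS https://example.com/ips.txt -o /tmp/ips.txt || exit 1
while read -r ip; do
  # Insert an ACCEPT rule only if it isn't already present (-C checks for it).
  iptables -C INPUT -s "$ip" -p tcp -j ACCEPT 2>/dev/null \
    || iptables -I INPUT -s "$ip" -p tcp -j ACCEPT
done < /tmp/ips.txt
```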
Now I'm having a problem. My PHP script is not accurate. Some visitors to my website are not being whitelisted, because they might have a different IPv4 address than the one the PHP website sees. So basically I am looking for some PHP script/library that would gather ALL of a visitor's IPv4 IPs, then whitelist them.
Also, regarding IPv6: my iptables rules are all default (which means all IPv6 visitor traffic is allowed), so the problem is not with visitors that have IPv6. The problem is that my script isn't getting ALL the IPv4 addresses assigned to the user.
Can you recommend a PHP library for that? So far I've used https://github.com/marufhasan1/... but apparently it's not accurate enough.
-
CRON JOBS SUCK. @LINUX YEAH YOU HEARD ME
MY PROGRAM WRITES INFO TO A DATABASE, SENDS EMAILS AND OUTPUT IS PIPED TO A LOG FILE. NONE OF THESE THINGS HAVE OCCURRED DURING THE CRON RUN SO I DON'T KNOW WHAT IS OR ISN'T WORKING.
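Deep breath. The usual first moves when a cron job silently no-ops, as a sketch (Debian-style log path, placeholder program path):

```bash
# 1. Did cron even fire the job?
grep CRON /var/log/syslog | tail -20

# 2. Reproduce cron's stripped-down environment interactively:
env -i HOME="$HOME" SHELL=/bin/sh PATH=/usr/bin:/bin /path/to/program

# 3. Capture output on the next run by editing the crontab entry:
#    * * * * * /path/to/program >> /tmp/program.log 2>&1
```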