I wrote a Student Information System for my midterm project back in '94, written in Clipper and running on MS-DOS.
I demoed it and explained to the panel of professors how it tracked enrollments, payments, class schedules, grades and attendance for each and every student. It had user authentication, auditing and reporting functionality.
It also had a lite version, likewise written in Clipper, that could be installed on a professor's laptop so that he/she could update records even at home and sync with the db at school via a BBS. Telix for DOS (self-taught) was my choice for the BBS, as it was shareware, had built-in Zmodem support and came with its own programming language called SALT (Script Application Language for Telix) that could be used for automating tasks. The lite version of my project would dump the updates into an ASCII file, compress the file using PKZIP, use the laptop's modem to dial up the school's BBS and send the file across using the Zmodem protocol.
The main version would then download the file(s) from the BBS and proceed to do a sync.
After doing the demo and answering all their questions, the panel asked me to wait outside the room, called me back in after 15 minutes and told me that I didn't have to attend that class for the remainder of the term. The happiness as my classmates outside the room gawked at me felt like King Midas himself had given my balls his golden touch.
Then in '97, two years after I graduated, I accompanied my cousins to a different campus of the same school for their enrollment, and right there at the bottom of the screen were my initials on a very, very familiar UI! They had actually used, and were still using, my school project. Needless to say, my cousins didn't believe it was written by me.
-
Manager: The site is loading too slow. How can we improve this?
Me: *F5 & look at the network log* The server is taking too long to respond to some image requests. We could encode them into the HTML to have them all delivered in a single request.
Manager: GTmetrix says we need to compress the images.
Me: *reads GTmetrix report* We would only get a 150kB improvement. It won't even be noticeable.
Manager: If the images take a long time to load, it means that they're too big, right?
Me: Or the server is taking a long time to respond to our request for them, which is the case.
Manager: Compress the images and upload them.
Me: *compresses the images and uploads them* Done.
Manager: I don't see any improvement.
Me: If only there was someone who could have predicted such an outcome...
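(For the record, the inlining idea from above is basically a one-liner - a sketch, assuming a PNG and GNU base64:)
echo "<img src=\"data:image/png;base64,$(base64 -w0 hero.png)\">" >> page.html # the image now rides along in the single HTML response
-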
I love the Hollywood Technical Jargon Generator:
"The SDRAM transmitter is down, compress the auxiliary address so we can synthesize the GB matrix!"3 -
Trying to compress 1TB of data,
will take 40 hours... I expected worse. Good boy WinRAR, I will buy you this year, promise..
-
!rant
As a kid I tried to compress one zip file 4-5 times... praising myself that I had just found the world's best idea to save hard-disk space!!! Only to realise that sometimes, I'm stupid.
-
The company I work for wants me to create an app to compress "foxit phantompdf" files, so that they are still readable after compression.
(referencing my last rant)
-
Do NOT "compress" your code by leaving out braces in control structures and putting the 300 char statement on the same fucking line as the control structure!
Yes, your code file becomes vertically shorter than the usual 3000 lines, BUT my brain tumor grows proportionally larger.
-
Life Before the Computer
An application was for employment
A program was a TV show
A cursor used profanity
A keyboard was a piano!
Memory was something that you lost with age
A CD was a bank account
And if you had a 3-inch floppy
You hoped nobody found out!
Compress was something you did to garbage
Not something you did to a file
And if you unzipped anything in public
You'd be in jail for awhile!
Log on was adding wood to a fire
Hard drive was a long trip on the road
A mouse pad was where a mouse lived
And a backup happened to your commode!
Cut - you did with a pocket knife
Paste you did with glue
A web was a spider's home
And a virus was the flu!
I guess I'll stick to my pad and paper
And the memory in my head
I hear nobody's been killed in a computer crash
But when it happens they wish they were dead!
-
TLDR; i wrote recursive compression with a random algorithm to fuck with some lazy-ass girl.
one day, a classmate i barely knew told me she had a family reunion and couldn't do her programming assignment, which would be collected the next morning, so she asked me to do it. i said i needed to put a price tag on it because i wanted to buy a new RasPi --i didn't know her either so i didn't feel bad about it. i told her i needed $20 and after some bargaining it settled at $15. i worked on it for about 3 hours, told her it was finished and sent her a demo video as proof. she was happy with the result and would come to my house later that night to get the source code. at night, she came, and gave me only 8 bucks. of course i got mad; with every argument she threw at me i resisted giving her the source code. but since i was too tired to get into a longer argument i accepted the 8 bucks and went upstairs to get the source code. but instead of giving her the actual source code, i wrote a quick script to compress the source code folder 50 times recursively with a random compression algorithm--sometimes gzip, sometimes lzma. and gave her the final 50-times-compressed source code. EAT SHIT MOTHERFUCKER
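for the curious, a minimal reconstruction of that script could look like this (using xz for the lzma passes is an assumption, and it works on a copy of the folder):
cur="source_copy" # a COPY of the real source folder
for i in $(seq 1 50); do
  tar -cf "layer_$i.tar" "$cur" # wrap the previous layer
  rm -rf "$cur"
  if (( RANDOM % 2 )); then
    gzip "layer_$i.tar" && cur="layer_$i.tar.gz" # coin flip: gzip...
  else
    xz "layer_$i.tar" && cur="layer_$i.tar.xz" # ...or lzma
  fi
done
echo "deliver this: $cur"
-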
Long rant ahead.. 5k characters pretty much completely used. So feel free to have another cup of coffee and have a seat 🙂
So.. a while back this flash drive was stolen from me, right. Well it turns out that other than me, the other guy in that incident also got to the police 😃
Now, let me explain the smiley face. At the time of the incident I was completely at fault. I had no real reason to throw a punch at this guy and my only "excuse" would be that I was drunk as fuck - I've never drank so much as I did that day. Needless to say, not a very good excuse and I don't treat it as such.
But that guy and whoever else it was that he was with, that was the guy (or at least part of the group that did) that stole that flash drive from me.
Context: https://devrant.com/rants/2049733 and https://devrant.com/rants/2088970
So that's great! I thought that I'd lost this flash drive and most importantly the data on it forever. But just this Friday evening as I was meeting with my friend to buy some illicit electronics (high voltage, low frequency arc generators if you catch my drift), a policeman came along and told me about that other guy filing a report as well, with apparently much of the blame now lying on his side due to him having punched me right into the hospital.
So I told the cop, well, most of the blame is on me really, I shouldn't have started that fight to begin with, and for that matter shouldn't have drunk that much, yada yada yada.. anyway he walked away (good grief, as I had that friend visiting to purchase those electronics at that exact time!) and he said that this case could just be classified then. Maybe just come along next week to the police office to file a proper explanation, but maybe even that won't be needed.
So yeah, great. But for me there's more to it of course - that other guy knows more about that flash drive and the data on it that I care about. So I figured, let's go to the police office and arrange an appointment with this guy. And I got thinking about the technicalities, in case I see that drive back and want to recover its data.
So I've got 2 phones, 1 rooted but reliant on the other, unrooted one for a data connection to my home (because Android Q, and no bootable TWRP available for it yet). And theoretically a laptop that I can put Arch on no problem, but its display backlight is cooked. So if I want to bring that one I'd have to rely on a display from them. Good luck getting that done. No option. And then there's a flash drive that I can bake up with a portable Arch install that I can sideload from one of their machines, but on that.. even more so - good luck getting that done. So my phones are my only option.
Just to be clear, the technical challenge is to read that flash drive and get as much data off of it as possible. The drive is 32GB large and has about 16GB used. So I'll need at least that much on whatever I decide to store a copy on, assuming unchanged contents (unlikely). My Nexus 6P with a VPN profile to connect to my home network has 32GB of storage. So theoretically I could use dd and pipe it to gzip to compress the zeroes. That'd give me a resulting file that's close to the actual usage on the flash drive in size. But just in case.. my OnePlus 6T has 256GB of storage but it's got no root access.. so I don't have block access to an attached flash drive from it. Worst case I'd have to open a WiFi hotspot to it and get an sshd going for the Nexus to connect to.
And there we have it! A large storage device, no root access, that nonetheless can make use of something else that doesn't have the storage but satisfies the other requirements.
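Concretely, the dd-piped-to-gzip part could look something like this (a sketch; the block device path and output location are assumptions):
dd if=/dev/block/sda bs=4M status=progress | gzip -1 > /sdcard/drive.img.gz # zeroes compress to almost nothing
gunzip -c /sdcard/drive.img.gz | dd of=/dev/block/sda bs=4M # and later, to write it back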
And then we have things like parted to read out the partition table (and if unchanged, cryptsetup to read out LUKS). Now, I don't know if Termux has these and frankly I don't care. What I need for that is a chroot. But I can't just install Arch x86_64 on a flash drive and plug it into my phone. Linux Deploy to the rescue! 😁
It can make chrooted installations of common distributions on arm64, and it comes extremely close to actual Linux. With some Linux magic I could make that able to read the block device from Android and do all the required sorcery with it. Just a USB-C to 3x USB-A hub required (which I have), with the target flash drive and one to store my chroot on, connected to my Nexus. And fixed!
Let's see if I can get that flash drive back!
P.S.: if you're into electronics and worried about getting stuff like this stolen, customize it. I happen to know one particular property of that flash drive that I can use for verification, although it wasn't explicitly customized. But for instance, in that flash drive there was a decorative LED. Those are current-limited by a resistor. Factory default might be, say, 200 ohm - replace it with one of a higher value. That way you can verify without any doubt that it's yours. Along with other extra security additions, this is one of the things I'll be adding to my "keychain v2".
-
I was in my second year of university when I joined the internship. I knew the business idea sucked and that he wouldn't be able to carry out the operations either. Little did I know that I would work with the dumbest team ever, literally, the dumbest.
So, the major chunk of the software was outsourced to a consultancy. I was a tech intern, and we were developing an Android App that will save your parking location, let you reserve locations and all etc.
I knew I had stepped onto the wrong turf, but again, I had nothing better to do that summer. So, for a very meager stipend, I said yes to a very stupid project. Let the stupidity flow...
~ The boss had quit his job for this dumb idea with no funding, no team, nothing.
~ He was pursuing a certification course in Android development from somewhere, where the final project was a calculator!
~ He had little to no tech skills, hardly knew Java, but was leading an Android app dev project in Java. He had little to no managerial, marketing or sales skills either.
~ For a brief period, I had to work along with the consultancy guys to ramp up their work. They would take backups on a USB drive every evening, and share each other's code using the same. VCS died a painful death that day.
~ They hardly wrote functions; rather, they wrote very long code in the main (onCreate) function. Code style died of cancer.
~ They couldn't compress an image before sending it to a server. I had to do it for them.
~ Had no concept of creating utility classes.
And best of all,
~ Wrote 20 cases (switch case) with the same code! Instead of using a loop...
-
Client: hey here are the original final 15 megapixel pictures for the website (to replace placeholders that have been replaced by other placeholders), please implement them.
Me: okay, let me just compress them a bit. *save for web and devices, upload*
Client: they are a bit pixely, could you make them larger?
Me: errrr. well, that will kill load time
*repeat this process 8 times over F-ing chat until pic is F-ing huge*
Client: okay, now that the picture is nice and crisp and huge, please add some blur in Photoshop
-
I'm surprised how, after using a gulp task to minify js, css and html, smoosh, and compress to a gz file, the whole web size went down from 200kb to 20kb.
That makes me think what a shit web developer I am, and how cool those things are.
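Roughly what that pipeline does, sketched with standalone CLI tools instead of the actual gulp tasks (package names assumed: terser, clean-css-cli, html-minifier-terser):
npx terser src/app.js --compress --mangle -o dist/app.js # minify js
npx cleancss -o dist/style.css src/style.css # minify css
npx html-minifier-terser --collapse-whitespace src/index.html -o dist/index.html # minify html
gzip -k -9 dist/app.js dist/style.css dist/index.html # the final .gz step
-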
Set up a 2GB upload to run and a 6GB folder to compress while I went to do an errand. Came back to find the computer had rebooted itself while I was out. No reason for it in the event logs. Just a random reboot for giggles, I guess. File upload aborted with no resume, and I'm unsure if the full folder compressed. Have to start over.
-
(long post is long)
This one is for the .net folks. After evaluating the technology top to bottom and even reimplementing several examples I commonly use for smoke testing new technology, I'm just going to call it:
Blazor is the next Silverlight.
It's just beyond the pale in terms of being architecturally flawed, and yet they're rushing it out as hard as possible to coincide with the .Net 5 rebranding silo extravaganza. We are officially entering round 3 of "sacrifice .Net on the altar of enterprise comfort." Get excited.
Since we've arrived here, I can only assume the Asp.net Ajax fiasco is far enough in the past that a new generation of devs doesn't recall its inherent catastrophic weaknesses. The architecture was this:
1. Create a component as a "WebUserControl"
2. Any time a bound DOM operation occurs from user interaction, send a payload back to the server
3. The server runs the code to process the event; it spits back more HTML
Some client-side js then dutifully updates the UI by unceremoniously stuffing the markup into an element's innerHTML property like so much sausage.
If you understand that, you've adequately understood how Blazor works. There's some optimization like signalR WebSockets for update streaming (the first and only time most blazor devs will ever use WebSockets, I even see developers claiming that they're "using SignalR, Idserver4, gRPC, etc." because the template seeds it for them. The hubris.), but that's the gist. The astute viewer will have noticed a few things here, including the disconnect between repaints, inability to blend update operations and transitions, and the potential for absolutely obliterative, connection-volatile, abusive transactional logic flying back and forth to the server. It's the bring out your dead approach to seeing how much of your IT budget is dedicated to paying for bandwidth and CPU time.
Blazor goes a step further in the server-side render scenario and sends every DOM event it binds to the server for processing. These include millisecond-scale events like scroll which, at least according to GitHub issues, devs are quickly realizing require debouncing, though they aren't quite sure how to accomplish that. Since this immediately becomes an issue with tickets saying things like, "scroll event crater server, Ugg need help! You said Blazorclub good. Ugg believe, Ugg wants reparations!" the team chooses a great answer to many problems for the wrong reasons:
gRPC
For those who aren't familiar, gRPC has a substantial amount of compression primarily courtesy of a rather excellent binary format developed by Google. Who needs the Quickie Mart, or indeed a sound markup delivery and view strategy when you can compress the shit out of the payload and ignore the problem. (Shhh, I hear you back there, no spoilers. What will happen when even that compression ceases to cut it, indeed). One might look at all this inductive-reasoning-as-development and ask themselves, "butwai?!" The reason is that the server-side story is just a way to buy time to flesh out the even more fundamentally broken browser-side story. To explain that, we need a little perspective.
The relationship between Microsoft and its enterprise customers is your typical mutually abusive co-dependent relationship. Microsoft goes through phases of tacit disinterest, where it virtually ignores them. And rightly so; the enterprise customers tend to be weaksauce, mono-platform, mono-language types who come to work, collect a paycheck, and go home. They want to suckle on the teat of the vendor that enables them to get a plug-and-play experience for delivering their internal systems.
And that's fine. But it's also dull; it's the spouse that lets themselves go, it's the girlfriend in the distracted boyfriend meme. Those aren't the people who keep your platform relevant and competitive. For Microsoft, that crowd has always been the exploratory end of the developer community: alt.net, and more recently, the dotnet core community (StackOverflow 2020's most loved platform, for the haters). Alt.net seeded every competitive advantage the dotnet ecosystem has, and dotnet core capitalized on them. Like DI? You're welcome. Are you enjoying MVC? Your gratitude is understood. Cool serializers, gRPC/protobuf, 1st-class APIs, metadata-driven clients, code generation, micro ORMs, etc., etc., et al. Dear enterpriseur, you are fucking welcome.
Anyways, b2blazor. So, the front end (Blazor WebAssembly) story begins with the average enterprise FOMO. When enterprises get FOMO, they start to Karen/Kevin super hard, slinging around money, privilege, premiere support tickets, etc. until Microsoft, the distracted boyfriend, eventually turns back and says, "sorry babe, wut was that?" You know, shit like managers unironically looking at cloud reps and demanding to know if "you can handle our load!" Meanwhile, any actual engineer hides under the table facepalming and trying not to die from embarrassment.
-
This is a story of how I did a hard thing in bash:
I need to extract all files with the extension .nco from a disk. I don't want to use the GUI (which only works on Windows), and I don't want to install any new programs. NCO files are basically like zip files.
Problem 1: The file headers (or something) are broken, and 7zip (7z) can only extract an archive if it has a .zip extension.
Problem 2: the find command gives me paths relative to the disk that start with . (a dot)
Solution: Use sed to delete the dot. Use sed to convert to a full path. Save to a file. Load lines from the file and for each one, cp to ~/Desktop/file.zip && 7z e ~/Desktop/file.zip -oOutputDir (extract the file to OutputDir).
Problem 3: Most filenames contain whitespace. cp doesn't work when given the path wrapped in quotes.
Patch: Use bash parameter substitution to change each whitespace to \whitespace.
(Note: I found it easier to apply the seds one after another than to put it all in one command)
Why the fuck would anyone compress 345 images into their own archive format used by an uncommon, Windows-only, paid back-up tool?
Little me (12 years old), knowing nothing about compression or backup or common software, decided to use the already-installed shitty program.
This is a big deal for me because it's really the first time I've strung so many cool commands together to achieve the desired result in bash (been using Ubuntu for half a year now). Funny thing is, the images uncompressed are 4.7GB and the raw files are about 1.4GB, so I would have been better off not doing anything at all.
Full command:
find -type f -name "*.nco" |
sed 's/^\.//' |
sed 's/.*/\/media\/mitiko\/2011-2014_1&/' > unescaped-paths.txt # strip the leading dot, prepend the disk's mount point
cat unescaped-paths.txt | while read line; do echo "${line// /\\ }" >> escaped-paths.txt; done # escape the whitespace
rm unescaped-paths.txt
cat escaped-paths.txt | while read line; do (echo "$line" | grep -Eq '.*[^db]\.nco') && echo "$line" >> paths.txt; done # skip the *db.nco metadata files
rm escaped-paths.txt
cat paths.txt | while read line; do eval cp $line ~/Desktop/file.zip && 7z e ~/Desktop/file.zip -oImages >/dev/null; done # eval makes the escaped spaces survive word splitting
-
I think I am going to start doing a let's play series for Minecraft and post it online. I spend a lot of time playing modpacks with my kids, and I should just start recording and add audio commentary later. I can compress the video and do speed-ups for boring parts and cut other segments. If I create a spreadsheet for the modpack and do it by the numbers, then I can burn through the modpack fast. I like watching let's play series to figure out modpacks. Especially Direwolf's. However, these let's play episodes are a good 30 to 45 minutes long. I want to play, not watch someone else play. I hate the ones where they don't know how to do the modpack and they don't cut the video. Direwolf does a great job editing his videos. His videos are FULL of good content.
If I can get the videos short enough, it will keep the attention span of the mass plebes who cannot find their way out of a paper bag. This is definitely catering to the lowest common denominator. I can turn my hobby into something useful. I want to try to compress the 30 minute video into 5 to 10 minutes. It will be a minecraft junky playthrough. Maybe add some time lapse stuff too.
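(The speed-up part is a one-liner in ffmpeg - a sketch, with 4x chosen arbitrarily:)
ffmpeg -i raw.mp4 -filter:v "setpts=0.25*PTS" -an timelapse.mp4 # 4x speed, audio dropped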
I think I should do the playthrough first, reset the server and do it again with video capture. I want to incorporate comedy into the videos too. Gaming should be fun. I wonder how much space a 30 minute video will take?
-
Custom image format update:
Currently converts a 12KB PNG image into a 172KB BIN file; now to compress the fuck out of it!
Update: fuck, should I use a buffer or simple binary data for file structure manipulation....
-
- Make it engaging, and avoid too much bullshit.
- Everyone should get to air their concerns; regardless of your position, everyone should have a say in the matters being discussed.
And for the love of god, compress it, don't make it unnecessarily long. It's not fun to attend something that takes an hour of your working time, because time is precious and you lose time to fix shit.
-
It's disgusting how a package like compress-commons can have 1.3 million weekly downloads yet no documentation whatsoever and not even any relevant comments in the code. Honestly, fuck the JavaScript ecosystem.
-
The devil in my mind said "Let's make a CSS compressor and de-compressor."
I said OK.
And then the mf added "use emojis as symbols".
Like, damn, why not.
Expect me to make a sideloader for CSS and a de-compressor and shit.
yee
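A toy sketch of the emoji "codec" idea (pure sed; the token table is obviously made up):
sed -e 's/display:flex/🧘/g' -e 's/position:absolute/📍/g' style.css > style.mini.css # compress
sed -e 's/🧘/display:flex/g' -e 's/📍/position:absolute/g' style.mini.css > style.css # de-compress
-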
Anyone else have 100 .txt files on their computer called note(54).txt? Or TODO(32).txt? Yeah... Someday I will compress them into one so I don't miss something important for work. Why don't we have issue tracking?!?!
-
Me: *ranting to my friend who isn't a dev (but is in IT and has a good knowledge of networking) about a problem I came across in my project*
“I don’t know how I’ll send folders”
Him: “just compress it before hand and decompress when it’s sent”
Me: *mind fucked*
Have any of you guys gotten surprisingly good ideas from non-programmers?
-
When you carefully compress and cache all your webpage resources to boost page loads, but 3rd-party social plugins make your website slow as hell.
-
I was working on a project with one of the project managers. Despite several discussions, he was not willing to provision a couple of extra drives for database backups. Also, because it's how they had always worked, developers were allowed to make changes to the production databases directly.
Since I knew it was going to burn some day despite his negligence, I wrote a script to take full database backups every night, compress them, and remove old backups, all within the drives we had on the server. Set it to run automatically with the scheduler.
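(A hypothetical reconstruction of that nightly job; the engine being MySQL and the two-week retention are assumptions:)
STAMP=$(date +%F)
mysqldump --all-databases | gzip > "/backups/full-$STAMP.sql.gz" # dump and compress in one pass
find /backups -name 'full-*.sql.gz' -mtime +14 -delete # prune, so the existing drives don't fill up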
One day it happened that one of the junior developers deleted a major table, taking the whole production down. Next thing you know everyone went crazy. Since I felt bad for the managers and users, I was able to restore the database using the backup from last night.
You know who jumped in first, before senior management, to take credit for all this and got some nice kudos.. that project manager. Also, you know who got burned.. it would not be a rant if I did not get schooled for not following the wisdom of the project manager.
Anyways, we are still not taking database backups (as per the project manager)
-
Okay, so after a few days of thinking, I'm sure about what I'm about to write:
Best: discovering how to use streams while making a service that had to extract a tar.gz, extract the tar.gz files within it, filter the extracted files and correct some of them, then compress each folder as a tar.gz and compress all the archives into a .zip. The amazing thing for me is that with streams I could do all the operations in just two passes, maybe one if I had more time, saving disk write time.
Worst: upgrading a bunch of legacy Access 97 apps and their VBA code to Access 2013
-
So this is a story all about how:
I went from my brother being asked to clone the disk of an old laptop to a new one, to my m.2 SSD being held in with duct tape because of it, and to me having to run a tar command that took over 5 hours to complete.
So my bro got asked by a friend's mom to help her with "PC problems". In short, the old laptop was barely usable, and they ordered a new one on the recommendation of my brother. Turns out the laptop he recommended should have had a 1TB HDD in it. Instead it had a 256GB m.2. Note: the old disk was 500GB.
So cloning wasn't possible. He then told me that they had also bought a regular SATA SSD which was supposed to go in instead of the non-present HDD. So I <- want to emphasize this! went to work.
Ordered an SSD in the right form factor.
Made a disk image of the old HDD.
Had to drill out the screw holding in my m.2 SSD as it was stuck.
It's now held in with a piece of Kapton tape.
Wrote the old disk's image onto the new disk.
It doesn't boot in the new laptop.
Spent an evening trying to rebuild boot partitions.
Fuck it, let's install Windows 10 (instead of 7) and copy all the shit over manually!
Let's buy Office for 30$ on G2A as it would be a pain in the ass to explain that it's normally a subscription.
Previous rant happened.
Brought them the laptop yesterday, telling them I did all the work.
I collected the SSD they ordered, explaining I ordered the correct one and would take this one for myself.
Still had the 500GB img file of the disk and wanted to compress it in order to put some data back in place.
The tar command took over 5 hours to complete! I haven't checked on it yet as I went to sleep.
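(In hindsight: tar adds nothing for a single file, and gzip is single-threaded. Something like pigz, assuming it's available, would have cut those 5 hours down a lot:)
pigz -k disk.img # parallel gzip across all cores; -k keeps the original
-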
If in an emergency you have to hide porn..
Just put it all in one folder and then compress
it with zip/rar and change the extension to something none of your computer's programs understand, or just change it to .class or .cpp!
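The whole trick as a one-liner (names purely illustrative):
zip -r -q homework.zip stuff/ && mv homework.zip AbstractBeanFactory.class # nothing to see here
-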
IMAGE COMPRESSION QUESTION
let's say i upload a 100x100 photo from my android device. this image has a size of e.g. 2MB. not a lot. if i compress it, then the size will be e.g. 300kB. cool. upload is thunderbolt for any internet speed.
let's consider this case: a random ass motherfucker decides it is cool to upload a 10000x10000 image that has a size of e.g. 300MB. compressed, it would be e.g. 150MB, which is still a lot as fuck for one pic.
here's my question: where should the compression be handled? at the backend (REST API server) or at the client (android image compression library)?
because if i try to send a 150MB pic to the server and their internet sucks - but to be fucking honest, even the best internet speed would take way too long to upload - is it better to do the compression on the backend or on the client?
or should i do compression in android? if i should do compression on the client, then should i:
1) do the compression on the main thread with a progress dialog to make them wait until the compression PLUS the fucking upload is done, or
2) do the compression + THE upload in a background thread, in which case there's room for a verbose amount of fuckups (internet dies, phone explodes, etc.) and the app crashes?
which (one) option of the 2 suboptions from the second parent option branch?
of course this is an extremely unrealistic case - it is possible, but that's not my point: my point is, WHERE SHOULD THE COMPRESSION (as some kind of universal standard) BE HANDLED AT?
-
Duck! This sloppy, whiny winnfsd.
Yay! Let's use state-of-the-art Docker with a VirtualBox VM on Windows 10.
Don't get me wrong.
The Docker containers in this VM are doing a great job performance-wise.
But the very moment a Docker container uses a folder mounted via the Windows network filesystem, all hell breaks loose.
Building a vendor folder using a composer Docker image with 84 packages takes about 15 seconds when the cache has been warmed up.
The same Docker command pointed at a folder mounted from the Windows filesystem, with a warmed-up cache, takes about 10 minutes!@&&@""+&
And what is the duckin' reason for this delay?
Because every transfer of a teeny tiny file has to establish a connection to the fat-ass Windows OS and has to pass its glorious "security" layer.
DUCK it!
For real.
I'm currently working on a shell script which builds the whole vendor folder on a volume in the Docker VM.
After completion, the shell script will compress the folder to one file.
This one file will be transferred over this god damned network filesystem.
Finally the script will unpack the compressed vendor folder into its destination folder.
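(The plan, sketched as a script; volume and image names are assumptions, and the project files are presumed already seeded into the volume:)
docker run --rm -v vendor_cache:/app -w /app composer install # 1) build vendor/ on a native Docker volume
docker run --rm -v vendor_cache:/app alpine tar -czf /app/vendor.tar.gz -C /app vendor # 2) squash it into one file
docker run --rm -v vendor_cache:/app alpine cat /app/vendor.tar.gz > vendor.tar.gz # 3) one single transfer across the cursed mount
tar -xzf vendor.tar.gz # 4) unpack at the destination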
*sigh*
What year is it?!??
-
!rant
TL;DR: I want to add a vcodec to FFmpeg but don't understand how, or haven't found the right resources yet.
I just learned the very basics recently and thought it would be great for some automation.
For now, I want to use it to compress my anime collection. I found a modded version of HEVC (x265) tuned for anime, which I want to include as a vcodec. Unfortunately, even after researching it a bit, I don't understand how to do so. Does somebody here know how to do it?
Modded HEVC.
Warning: the README contains anime images
https://github.com/msg7086/...
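If the mod keeps x265's public API, a new vcodec may not even be needed - building FFmpeg against the modded library could be enough. A sketch, with every path assumed:
cd x265-mod/build/linux # the cloned modded tree
cmake -DCMAKE_INSTALL_PREFIX="$HOME/x265-mod" ../../source
make -j"$(nproc)" && make install
cd ~/ffmpeg && PKG_CONFIG_PATH="$HOME/x265-mod/lib/pkgconfig" ./configure --enable-gpl --enable-libx265
make -j"$(nproc)"
-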
How hard can it fucking be to just bundle, minify and compress CSS and JS in webpack?
FFS, am I retarded?
-
It's 2.30AM now, and I'm guiding my high school friend through compressing a file... It has been 2 hours now..
-
Ticket: implement a compression algorithm for crypto object X
Details: object too big, we must devise a way to compress it. A deflate algorithm should be added here, yada yada yada, we did not have the time, yada yada...
Go see the crypto provider's documentation... It has compression options... -_-
You lazy fucking stack-overflow-copying dimwits!!! Jesus fucking Christ! This reached production like this; I've got clients complaining about the size of the payload because you are a bunch of lazy fucks who can't even read simple documentation!!!
I want to kill someone for wasting my time and patience... Don't call me for this kind of crap... I have better things to do!
I mean, the time it took you to write the ticket should have sufficed...
-
I'm trying to monetize my Vue.js app.
But Google is telling me that there is no content inside my website.
And of course there is no content, because Vue's render engine compresses everything into JavaScript.
What the hell is this? 😠😨😨😨
So should I conclude that framework apps cannot be monetized?
Please tell me no.
So how can I monetize it?
-
Storytime.
The Prometheus tales
Part IV - A new FUBAR.
A new and very fascinating problem emerged a few days ago, after feeding some node definitions to the new titan instance.
It's a storage fuck-up. A major one.
If I'm informed correctly, the latest Prometheus should have the same (or even better) log-compression algorithms for metrics as the old one - because these fuckers are so damn good at what they are doing: compressing some fucking logs.
The new instance is aggregating metrics as planned. Grafana works like a fucking charm.
Nevertheless, for very fascinating but unknown reasons, the new instance creates 50GB of metrics in under 4 fucking hours.
Am I missing something here? Some magic parameter that has to be passed to the titan, that enables the hardcore compress-them-fuckers-feature?
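(Ahead of that session: the knobs that usually rein this in are the scrape interval in prometheus.yml and the retention flags - a sketch, assuming a stock Prometheus 2.x setup:)
prometheus --config.file=/etc/prometheus/prometheus.yml --storage.tsdb.retention.time=15d --storage.tsdb.retention.size=50GB # size-based cap exists in newer 2.x only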
Debugging session is tomorrow.
To be continued.
-
I need some advice to avoid stressing myself out. I'm in a situation where I feel stuck between a rock and a hard place at work, and it feels like there's no one to turn to. This is a long one, because context is needed.
I've been working on a fairly big CMS based website for a few years that's turned into multiple solutions that I'm more or less responsible for. During that time I've been optimizing the code base with proper design patterns, setting up continuous delivery, updating packaging etc. because I care that the next developer can quickly grasp what's going on, should they take over the project in the future. During that time I've been accused of over-engineering, which to an extent is true. It's something I've gotten a lot better at over the years, but I'm only human and error prone, so sometimes that's just how it is.
Anyways, after a few years of working on the project I get a new colleague that's going to help me on my CMS projects. It doesn't take long for me to realize that their code style is a mess. Inconsistent line breaks and naming conventions, really god awful anti-pattern code. There's no attempt to mimic the code style I've been using throughout the project, it's just complete chaos. The code "works", although it's not something I'd call production code. But they're new and learning, so I just sort of deal with it and remain patient, pointing out where they could optimize their code, teaching them basic object oriented design patterns like... just using freaking objects once in a while.
Fast forward a few years until now. They've learned nothing. Every time I read their code it's the same mess it's always been.
Concrete example: a part of the project uses Vue to render some common components in the frontend. Looking through the code, there is currently *no* attempt to include any air between functions, or any part of the code for that matter. Everything gets transpiled and minified so there's absolutely NO REASON to "compress" the code like this. Furthermore, they have often directly manipulated the DOM from the JavaScript code rather than rendering the component based on the model state. Completely rendering the use of Vue pointless.
And this is just the frontend part of the code. The backend is often orders of magnitude worse. They will - COMPLETELY RANDOMLY - sometimes leave in 5-10 lines of whitespace for no discernible reason. It frustrates me to no end. I keep asking them to verify their staged changes before every commit, but nothing changes. They also blatantly copy/paste bits of my code to other components without thinking about what they do. So I'll have this random bit of backend code that injects 3-5 dependencies there's simply no reason for and that aren't being used. When I ask why they put them there I simply get an "I don't know, I just did it like you did it".
I simply cannot trust this person to write production code, and the more I let them take over things, the more technical debt we accumulate. I have talked to my boss about this, and things have improved, but nowhere near where I need them to be.
On the other side of this are my project manager and my boss. They, of course, both want me to implement solutions with low estimates, and as fast and simply as possible. Which would be fine if I wasn't the only person fighting against this technical debt on my team. Add in the fact that specs are oftentimes VERY implicit, so I'm stuck guessing what we actually need and having to constantly ask if this or that feature should exist.
And then, out of nowhere, I get assigned a another project after some colleague quits, during a time I’m already overbooked. The project is very complex and I'm expected to give estimates on tasks that would take me several hours just to research.
I'm super stressed and have no one I can turn to for help, hence this post. I haven't put the people in this post in the best light, but they're honestly good people that I genuinely like. I just want to write good code, but it's like I have to fight for my right to do it.
-
Just wanted to free up some space and separate all of my projects.
First idea ... failed!
mksquashfs /home/tracktraps/Development/myproject1 ~/Squash/myproject1.sfs -info -progress -b 1048576 -comp xz -Xdict-size 100%
mkdir /mnt/myproject1
mount ~/Squash/myproject1.sfs /mnt/myproject1
unionfs -o allow_other,nonempty ~/.unionfs/changes/myproject1=rw:/mnt/myproject1/=ro ~/Development/Project1
Too much CPU overhead, too many folders, can't delete files, everything gets mixed up ...
Second idea ... failed!
dd if=/dev/zero of=~/Imgs/myproject1.btfs bs=1M count=10240
mkfs.btrfs ~/Imgs/myproject1.btfs
mount -o defaults,noatime,autodefrag,compress,compress-force,inode_cache ~/Imgs/myproject1.btfs ~/Development/Project1
Well ... little overhead, gzip compression, saved a lot of space, but a fixed img size.
Third idea ... yay!
truncate -s 200G ~/Imgs/myproject1.btfs
mkfs.btrfs ~/Imgs/myproject1.btfs
mount -o defaults,noatime,autodefrag,compress,compress-force,inode_cache ~/Imgs/myproject1.btfs ~/Development/Project1
Well ... little overhead, gzip compression, saved a lot of space ... but wait ... why do my btfs files consume more and more space?
Hmm ... time for a little bash and my beloved systemd timers.
for f in `find . -type f -name "*.btfs"`
do
project=${f%.*} # strip the .btfs extension to get the project name
btrfs balance start -v -dusage=100 ~/Development/$project
btrfs balance start -v -musage=100 ~/Development/$project
fstrim ~/Development/$project
fallocate -d -v $f # dig holes where the image is all zeroes, returning the space
done
-
What are some tips for shooting and editing cell phone videos?
WTC was great, but somehow the camera app's algorithm decided to process out the sun and a lot of the colors....
Any way to fix it sorta like Lightroom?
Also, how do I compress the videos to YouTube's size.... without uploading to YouTube?
https://youtu.be/FEmuDC9tkyo
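For the compression question, a common ffmpeg recipe (CRF trades size for quality; 23 is a sane default):
ffmpeg -i clip.mp4 -c:v libx264 -crf 23 -preset slow -c:a aac -b:a 128k clip_small.mp4 # smaller H.264 re-encode
-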
Client: Any way we can speed up that image load? Maybe compress the image?
Me: No, we need to compute whether they have earned that image... that's what takes so long.
Client: Oh, I thought it would just somehow show if they've earned it! lol
-
Thank you, mod_pagespeed, for using shit methods to compress the source and for your amazing work with the client-side cache. The whole site was fucked up for a day and I didn't notice.
Note: press Ctrl+F5 20 times if you tweak anything in JS. Even if it's 100% working, PageSpeed can fuck it up. Turn that shit off.
-
We’re only random people living in random places, speaking random languages, eating random food, sleeping, studying and working random hours. Traveling to random points on a sphere.
Just the random range is different.
Just random stuff happens at the crossroads of two random dots, and the entropy speeds up or slows down.
Nothing special at all.
Just a finite state machine iteration.
I mean, the amount of effort we put into explaining infinity is outstanding.
What if there is no infinity at all?
What if infinity is just a misunderstanding in our interpretation of the world around us? It's just pixels, resolution, gaussian splatting, quantum state, you name it.
Hey man, the world is flat. Just put it in 2d space. How much space do you need, from a simulation perspective where patient eyes can only see up to a certain amount of light particles per second through a shitty lens?
Propose world optimization techniques: slow down subject perception, tiredness introduced. Compress memory, sleep introduced. Limit neurons, CPU power assigned. Deploy on cloud - put it to life. Exit 0, body failure. Exit 1, suicide. Kill -9, killed by tty from ip EARTH.X.Y.
What can you do to make the world around this planet look alive? Make it blink.
We developers are lazy and I believe that nature is even more lazy than us.
You think you're going to the elevator right now? You're going to the preloader. Looking out the window equals playing video from playback. It never goes live, it's just a precomputed FSM. Cars, trains, airplanes? Preloaders everywhere. Highways to split traffic to cities and communication. The road and city planning department is a matrix maintenance department. And don't get me started about space.
Space is empty because it's not even finished. So they put it all behind glass called the milky way. You know how glass looked 500 years ago? It was milky, so it's the milky way, so we don't see shit.
If the space were finished I'd have started writing this text from mars, finished it and sent it from earth, but no, it's light years, guys; light years is not a second for matter. A light year is a second of the injected-thoughts exchange only. Thoughts of the global computer called generative AI that they introduced on local computing devices called cloud.
Even the preloader system is not present; they left us with the one map and an overpopulated demo. What a shit hole. I bet they're increasing the temperature right now to erase this alpha build and cash out. Obviously there are so many bugs here that this one can't be fixed anymore. Too many viruses.
Hope for 0days to start happening so we can escape using time travel or something.
I bet they cut the budget or something, moved the team to other projects. Or even worse, the solar system team got laid off, because we are just neurons ordered to do it. And now we're stuck in some maintenance mode, no new physics, no new thoughts to pursue, just slow degeneration. I would pay more for the next run and switch to another galaxy far far away where they at least have more modern light-speed technology.
What do you think about it, Trinity? Not even worth wasting your time on. No white rabbit this time.
I do not recommend this game at this stage of early access.
- only one available map; despite promises of expansion over the years, not a single dlc has arrived,
- missing space adventures
- no galaxy travel mode, only teaser trailers of what you can do in other "universes"
- developers don't respond to complaints
- despite the diversity of species and buildings, at first sight the world looks too generic
- instead of new features bots with mind manipulation, AB testing and data harvesting was introduced
- death anti-cheat mode installed