Search - "compressing"
-
!Story
The day I became the 400-pound Chinese hacker known as 4chan.
I built this front-end solution for a client (but behind a back end login), and we get on the line with some fancy European team who will handle penetration testing for the client as we are nearing dev completion.
They seem... pretty confident in themselves, pretty disrespectful to the LAMP environment, and they get the client worrying that even though it's behind a login, the project is still vulnerable. No idea why the client hired an uppity .NET house to test a LAMP app. I don't even bother asking these questions anymore...
And worse, they insist we allow them to scrape for vulnerabilities BEHIND the server-side login. As though a user were already compromised.
So, I know I want to fuck with them. And I sit around and smoke some weed and just let this issue marinate in my crazy-ass brain for a bit, trying to think of a way I can obfuscate all this localStorage data and what it's doing... And then, inspiration strikes.
I know this library for compressing JSON. I only use it when localStorage space gets tight, and this project was only storing a few k to localStorage... so compression was unnecessary, but what the hell. Problem: it would be obvious from exposed source that it was being called.
After a little more thought, I decide to override the addslashes and stripslashes functions and to do the compression/decompression from within those overrides.
I then minify the whole thing and stash it in the minified jQuery file.
So, what LOOKS from the exposed client-side code to be a simple addslashes ends up compressing the JSON before putting it in localStorage. And what LOOKS like a stripslashes decompresses.
Now, the compression does some bit math that frankly is over my head, but the practical result is that if you dump the compressed data, it looks like Mandarin mixed with random characters. As a result, everything that can be seen in dev tools looks like the image.
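Roughly the idea, as a minimal sketch (not the actual project code, and assuming an lz-string-style compressor whose UTF-16 output renders as CJK-looking glyphs):

// These overrides LOOK like ordinary escaping helpers in the exposed source.
function addslashes(str) {
  // actually compresses the JSON string before it hits localStorage
  return LZString.compress(str);
}

function stripslashes(str) {
  // actually decompresses what addslashes stored
  return LZString.decompress(str);
}

// What shows up in dev tools is unreadable "Mandarin" gibberish:
localStorage.setItem('state', addslashes(JSON.stringify({ user: 'bob' })));
const state = JSON.parse(stripslashes(localStorage.getItem('state')));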
So we GIVE the penetration team login credentials... they log in and start trying to crack it.
I sit and wait. Grinning as fuck.
Not even an hour goes by and they call an emergency meeting. I can barely contain laughter.
We get my PM, me, and several guys from their team on the line. They share their screen and show the dev tools.
"We think you may have been compromised by a Chinese hacker!"
I mute and laugh my ass off. Holy shit, this is maybe the best thing I've ever done.
My PM, who has seen me use the JSON compression technique before and knows exactly what's up, starts telling them about it so they don't freak out. And finally I unmute and manage a "Guys... I'm standing right here." between gasps of laughter.
If only it were more common to use video on these calls, because I WISH I could have seen their faces.
Anyway, they calmed their attitude down, we told them how to decompress the localStorage, and they still didn't find jack shit, because I'm a fucking badass: even after we gave them the keys to the login and the keys to my secret localStorage, it only led to AWS Cognito-protected async calls.
Anyway, that's the story of how I became a "Chinese hacker" and made a room full of penetration testers look like morons with a (reasonably) simple JS trick.
-
A client obsessed with *security* won’t give us access to the server that hosts the website we built.
Code release involves building templates, compressing the changed files into a zip folder, and emailing them to the client with instructions on where the changed files go.
-
I like coding at night; nobody bothers you... Anyway, I'll never forget when I had to write a Huffman compressor (and decompressor) in C for a school project. It was New Year's Eve 2016; as fireworks were going off outside the window of my room, I was fixing bugs. Then, around 4 AM, I fixed the last of them. I felt exhilarated as I started compressing and decompressing random images from the internet and comparing hashes.
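The verification step is the satisfying part: compress, decompress, compare hashes. A quick sketch of that check, in Node rather than the original C, with gzip standing in for the Huffman coder and a hypothetical test file:

const crypto = require('crypto');
const fs = require('fs');
const zlib = require('zlib');

const sha256 = (buf) => crypto.createHash('sha256').update(buf).digest('hex');

const original = fs.readFileSync('random-image.jpg'); // hypothetical test file
const roundTripped = zlib.gunzipSync(zlib.gzipSync(original));

// For any lossless codec the hashes must match exactly
console.log(sha256(original) === sha256(roundTripped)); // true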
One of the best New Year's Eves ever... Don't look at me like that... I like being weird.
-
I am calling this a premonition rant: a premonition of more rants to come.
I have a feeling in my bones.
We have a newly acquired fat-cat customer with bucks to blow, for whom we have already done some digital work and a swag bag of marketing perkiness.
I will call the CEO of this whale "The Porcupine".
The Porcupine has a business degree and industry experience, but nothing to do with websites or applications.
It claims to be a visual perfectionist yet never delivers an overall coherent review.
It likes to fixate on minor brand style differences in websites and apps we have built.
The Porcupine seems to be always busy with policy and legal and other things rather than participating in their own projects.
Procrastination on feedback or reviews until the day before release is common.
Many overtime hours worked, and not a sliver of thanks. The haughty attitude is indicative of somebody who thinks web development is like desktop publishing.
"It's just code" was the response to a crash production-server change, one they were warned was a risk, that borked all of our responsive templates and took 3 hours to fix.
Their entire brand is shades of pea green, grey and lime. No serif fonts because they suck. Arial and Helvetica are boss.
One of my devs missed a CSS style on some privacy policy hyperlink text, which fell back to Times New Roman, and I had various account directors and our CEO on the phone telling me how embarrassing it was for us to let this happen.
Anyway. They pay on time and the cost estimates for all the upcoming work are juicy.
We have shitloads going on for an upcoming hard-date conference, and everything is already compressing.
Therefore I can already smell doom and feel those porcupine quills getting closer to my ass as I beg their AD today: any feedback on the 10 or so project reviews yet?
Nope.
-
I have huge admiration for the people who work on free software, those who have made very good tools for almost everything: the Debian community, the FSF.
I'm also in awe of those who treat computing as a science and have made big discoveries in AI, compression methods, pentesting...
I'm wondering what it's like to work in these two worlds.
-
Me: I will not care about others anymore.
Also me: converting the screenshot to JPG, removing EXIF data, and compressing it with TinyPNG before sending it to friends.
-
Okay, so after a few days of thinking, I'm sure about what I'm about to write:
Best: discovering how to use streams while making a service that had to extract a tar.gz, extract the tar.gz within it, filter the extracted files and correct some of them, then compress each folder as a tar.gz and compress all the archives into a .zip. The amazing thing for me is that with streams I could do all the operations in just two passes, maybe one if I'd had more time, saving disk-write time.
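The core trick, sketched very loosely in Node (file names are made up, and a real version would put a tar parser between the gunzip and the filter; this only shows how one streaming pass avoids buffering the whole payload):

const fs = require('fs');
const zlib = require('zlib');
const { pipeline, Transform } = require('stream');

// Stand-in for the "correct some files" step: a pass-through transform
const fixup = new Transform({
  transform(chunk, _enc, cb) {
    cb(null, chunk); // real code would rewrite bytes here
  },
});

pipeline(
  fs.createReadStream('input.tar.gz'),   // hypothetical input
  zlib.createGunzip(),                   // decompress as the bytes flow
  fixup,                                 // fix up content mid-stream
  zlib.createGzip(),                     // recompress on the way out
  fs.createWriteStream('output.tar.gz'), // hypothetical output
  (err) => { if (err) console.error('pipeline failed:', err); },
);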
Worst: upgrading a bunch of legacy Access 97 apps and their VBA code to Access 2013.
-
Why do SAMBA network drives have to suck this much? Yeah I understand that compiling to a network drive is probably a bad idea just for performance reasons alone but can't you at least not fuck with my git repo?
$ git gc
Enumerating objects: 330, done.
Counting objects: 100% (330/330), done.
Delta compression using up to 24 threads
Compressing objects: 100% (165/165), done.
Writing objects: 100% (330/330), done.
Total 330 (delta 177), reused 281 (delta 151), pack-reused 0
error: unable to open .git/objects/7e: Not a directory
error: unable to open .git/objects/7e: Not a directory
fatal: unable to mark recent objects
fatal: failed to run prune
$ git gc
error: unable to open .git/objects/00: Not a directory
fatal: unable to add recent objects
fatal: failed to run repack
$ git gc
Enumerating objects: 330, done.
Counting objects: 100% (330/330), done.
Delta compression using up to 24 threads
Compressing objects: 100% (139/139), done.
Writing objects: 100% (330/330), done.
Total 330 (delta 177), reused 330 (delta 177), pack-reused 0
Removing duplicate objects: 100% (256/256), done.
error: unable to open .git/objects/05: Not a directory
error: unable to open .git/objects/05: Not a directory
-
I've been running Xubuntu for a while now and decided to try out Ubuntu's latest release.
So I back up my home and workspace directories by compressing them into archives, intending to move them to another partition.
Long story short, I wiped my partition and did a clean install, only to realize a little later that I had moved the backups only in my mind and not on the laptop 😭😭
That is how I lost all my local projects (the ones I hadn't pushed)
Well fuck me 😒
-
I want to use Babel or TypeScript for the first time, because from what I've read it's the way to go for compressing JavaScript and making it browser compatible. If that's false, please correct me.
There's a question I've got about this. Right now I'm using a PHP router file that handles requests, picks the right .js file, and compresses it, so I can write modular JavaScript functions and include them when needed.
My question is: what do I have to change in my setup to switch to the mentioned technologies?
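One possible shape of the change, as a very rough sketch (assuming @babel/core and @babel/preset-env are installed; the file names are made up): instead of the PHP router compressing scripts on the fly, a build step transpiles and minifies them ahead of time and you serve the output.

// build.js - hypothetical one-file build step
const babel = require('@babel/core');
const fs = require('fs');

const source = fs.readFileSync('src/app.js', 'utf8');

const { code } = babel.transformSync(source, {
  presets: ['@babel/preset-env'], // transpile modern JS for older browsers
  comments: false,
  minified: true,                 // crude size reduction; a bundler/minifier does this better
});

fs.writeFileSync('dist/app.js', code);

In practice most setups wire this into a bundler rather than a hand-rolled script, but the idea is the same: the compression/compatibility work moves out of the request path and into a build step.
-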
1) A widely used compression algorithm as my very own invention.
2) Master electrical engineering
3) A universal OS kernel as my own invention. I will declare war on Linux and Windows.
-
IMAGE COMPRESSION QUESTION
Let's say I upload a 100x100 photo from my Android device. This image has a size of e.g. 2 MB. Not a lot. If I compress it, the size will be e.g. 300 kB. Cool. The upload is thunderbolt-fast on any internet speed.
Now consider this case: a random-ass motherfucker decides it's cool to upload a 10000x10000 image that has a size of e.g. 300 MB. Compressed, it would be e.g. 150 MB, which is still a lot as fuck for one pic.
Here's my question: where should the compression be handled? At the backend (REST API server) or at the client (Android image compression library)?
Because if I try to send a 150 MB pic to the server and their internet sucks (and to be fucking honest, even the best internet speed would take way too long for that upload), is it better to do the compression on the backend or on the client?
Or should I do the compression in Android? If I should do the compression on the client, then should I:
1) do the compression on the main thread, with a progress dialog that makes them wait until the compression PLUS the fucking upload is done, or
2) do the compression + the upload in a background thread, in which case there's room for a verbose amount of fuckups (internet dies, phone explodes, etc.) and the app crashes?
Which of the two sub-options from the second parent branch?
Of course this is an extremely unrealistic case. It's possible, but that's not my point. My point is: WHERE SHOULD THE COMPRESSION (as some kind of universal standard) BE HANDLED?
-
I have to do a transfer of about 2 GB of data from one remote server to another. Any suggestions?
My idea was to do multiple curl requests while compressing the data using gzcompress.
Preliminary testing shows that won't work. Now I'm considering putting the data in a file on our S3 bucket for the other server to obtain.
-
So apparently devRant has an image dimension limitation~
Compressing it from 9 MB to 2.7 MB still gives an error on a JPG with a resolution of 4156x2258!
@dfox is it just me, or is this a random bug?
-
I'm assigned to write guides for customers who need to connect through Microsoft Remote Desktop.
I made one on a Windows laptop, fine, but they also need guides for Mac OS X and iOS. Dat Emacs-finger-torture screenshot button combo... And once I was done with that, I needed a quick way to get the 20-odd screenshots onto my work computer (Linux), edit them to have a unified design, and share them with my coworker, so I put them up on Google Drive. It started acting up right away when compressing them for download on Linux... then on the Windows computer... and once I had the zips, Windows' built-in unzip couldn't extract them, neither could 7-Zip, and neither could Linux unzip -e. I ended up just downloading them one by one on both computers. I almost got a nosebleed from that..
-
I'm trying to move a backup User folder from a dead Windows computer over to cloud storage. In checking the size of the folder before compressing it, it shows as 1.08GB, which I guessed would be reasonable. So I zipped it and it came out to be 48GB!! Compressed to 75%! So I went one folder deeper and checked the properties of all files there. That came out to be much larger than 1.08GB. Thinking Windows has some problem revealing the true size of a parent folder and its content, I did a Google search. Sure enough, it's a bug where incorrect folder size is reported. What the heck is going on at Microsoft that this blatant of a bug would ever have a chance of getting into the code? And why is this single user at 48 GB _compressed_? I'd understand if the user was a photographer, but he's just a gamer, and these aren't applications, just save files and profile settings!
https://social.technet.microsoft.com/...
-
I understand way too little about web data types. While having to store a shitload of data in cookies (sorry for that; no localStorage for local sites, insensitive though), I was so proud of compressing strings with bit-shifting, only to find out that URI encoding bloats Chinese characters massively. FML
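A quick illustration of the bloat (not the original code, just the effect): anything outside ASCII turns into several percent-escaped bytes once it has to be made cookie-safe.

const packed = '\u4e2d\u6587'; // "中文" - the kind of output that bit-packing into UTF-16 tends to produce
console.log(packed.length);                     // 2 characters in the string
console.log(encodeURIComponent(packed));        // "%E4%B8%AD%E6%96%87"
console.log(encodeURIComponent(packed).length); // 18 characters once URI-encoded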
-
So I'm trying to see if Python offers language features that make it easier, on Linux, to do quick jobs like converting a raster into sectional image files and compressing them.
Trying to make this multi-platform.
So the question then becomes: since I have a test build location (so as not to fuck up my host system), how do I point PyCharm to the Python bindings AND the installed lib in its custom location?
See, this kind of setup information is something that would make getting started with things like this much easier and quicker.
You can't tell me there isn't a purpose for this. This seems like a reasonable use case.
-
Does anyone know of a good tool or service for compressing SVGs for the web?
I've got about 80 files I need to reduce in size.
Help!