Long but worth it...
So I was cleaning out my Google Drive last night and deleted some old (2 years and up) files. I also deleted my old work folder; it was for an ISP I worked for over 2 years ago. After deleting the files I had a little twinge of "Man, I hope they're not still using those". But seriously, it'd be a pretty big security risk if I were still the owner of those files... right? Surely they copied them and deleted all the info from the originals. IP addresses, Cisco configs, usernames and passwords for various devices, pretty much everything but customer info.
Guess who I get a call from this morning... "Hi this is Debbie from 'ISP'. I was trying to access the IP Master List and I can't anymore. I was just told to call you and see if there's any way to get access to it again" (Not her real name...)
I had to put her on hold so I could almost die of laughter...
Me: "Sorry about that Debbie, I haven't worked for that company for over 2 years. Your telling me in all that time no one thought to save them locally? No one made a copy? I still had the original documents?!"
Long pause
D: "Uh... Apparently not..."
Another long pause
D: "So is there any way you can give me access to them again?"
Me: "They're gone Debbie. I deleted them all last night."
D: Very worried voice "Can... Can you check?"
This, kids, is why you never assume you'll always have access to a cloud-stored file: make local copies!!
A little bit of background on this company, the owner's wife fired me on trumped up "time card discrepancy" issues so she could hire her freshly graduated business major son. The environment over there was pretty toxic anyway...
I feel bad for "Debbie" and the other staff there; it's going to be a very bad week for them. I also hope it doesn't impact any customers. But... it is funny as hell, especially since, as I was clearing out my desk, I warned the owner to save copies and to plan on them being gone soon. Apparently he never listened.
This is why you should have a plan in place... And not just wing it...
PS. First Post!
The programmer and the interns part 2.
We will discuss numerous events that happened over the past week or so.
Case 0:
We had our weekly engineering meeting. The interns were invited as well.
We hold meetings in the generic, big, corporate meeting rooms with a huge table in the middle.
There were more than enough chairs for everyone yet the most motivated and awkward intern (let's call him Simon) chose to stand, cause "it's cool man, I always stand". At this point we all know that he probably read about Agile stand up meetings and is confusing it with this one. Otherwise he's simply trying to stand out from the rest. (See what I did there?)
Anyway, the meeting started way later than planned (what a surprise) and took much longer than Simon expected. Everybody is sitting and listening to the CTO while occasionally glancing at the weird-looking intern standing awkwardly and refusing to sit because it would make his original intentions pointless. He even tried to nod with a serious face and his hands crossed when the CTO said something and looked in his general direction. The meeting was about an hour and a half long, but with the delay it was at least 2.5 hours.
At the end Simon was so exhausted that he fell asleep on the office puff, was forgotten and locked inside. 3 hours later, when I was home, I received a call from him with his sleepy-trying-to-sound-awake voice telling me the news. Luckily there's a 24/7 NOC team that could rescue him.
Case 1:
An intern who was late on his Linux test connected to every test VM (should I remind you that each of them has a personal VM but they share root passwords?) and tried to reset them with "sleep 10s; shutdown -h now".
He took down all 13 of those so I had to turn them on and switch passwords again.
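Just to show how little effort that takes when everyone shares the same root password (the rant doesn't say how he connected; the hostnames and the sshpass call below are purely made up):
for i in $(seq -w 1 13); do
  sshpass -p "$SHARED_ROOT_PASSWORD" ssh "root@testvm-$i" 'sleep 10; shutdown -h now' &
done
wait
One shared password, one loop, thirteen dead VMs.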
Case 2:
One of the interns didn't do any of his training chores. Apparently he forgot what he was told to use, ignored all online documentation and used Windows CMD with Linux commands for almost a week already.
Case 3:
Simon uses Vim to write all text possible. Even mails: he writes them in Vim, then selects all and copies it into the mail body. He spent half a day on a homework task I gave them. He wrote everything inside one text file using Vim. When he was done he saved the file and quit the editor. He then said "Oh shit! I've forgot to sign my name!". I explicitly told him that there's absolutely no need for that because I can see which mail address the file was sent from. He said "I don't even need a program for that!" and gave a couple of strokes on the keyboard.
Later I received an email from him with a .txt attachment. When I opened it the only text that was inside was "by Simon ;)".
I logged into his machine and checked the last command run on the file:
echo "by Simon ;)" > linuxtasks.txt
Case 4:
The girl here uses a MacBook. She keeps getting confused with the terminal windows and rebooting her own machine instead of the remote VM.
Case 5:
Haven't checked yet how this happened, but one of the interns deleted the GUI from his local CentOS.
Good morning! It's time for practiseSafeHex's most incompetent co-worker!
Today's contestant is a very special one.
*sitcom audience: WHY?*
Glad you asked, you see if you were to look at his linkedin profile, you would see a job title unlike any you've seen before.
*sitcom audience oooooooohhhhhh*
we're not talking software developer, engineer, tech lead, designer, CTO, CEO or anything like that. No, no, our new entrant "G" surpasses all of those with the title ..... "Software extraordinaire".
*sitcom audience laughs hysterically*
I KNOW! Wtf does that even mean? As a previous dev-ranter pointed out, does this mean he IS quality code? I'd say he's more like a trash can ... where his code belongs.
*ba dum tsssss*
Ok ok, let's get on with the show. Here are some reasons why "G" is on the show:
One of G's tasks was to build an analytics gathering library for iOS, similar to google analytics where you track pages and events (we couldn't use google's). G was SO good at this job he implemented 2 features we didn't even ask for:
- If the library was unable to load its config file (for any reason) it would throw an uncatchable system integrity error, crashing the app.
- If anything was passed into any of the functions that wasn't expected (null, empty array etc.) it would crash the app as it was "more efficient" to not do any sanity checks inside the library.
This caused a lot of issues as some of the data needed to come from the clients server. The day we launched the app, within the first 3 hours we had over 40k crash logs and a VERY angry client.
Now, what makes this story important is not the bugs themselves, come on how many times have we all done something stupid? No the issue here was G defended all of this as the right thing to do!
.. and no he wasn't stoned or drunk!
G claimed if he couldn't get the right settings / params he wouldn't be able to track the event and then our CEO wouldn't have our usage data. To which I replied:
"So your solution was to not give the client an app instead? ... which also doesn't give the CEO his data".
He got very angry and asked me "what would you do then?". I offered a solution, something like: why not have a default tag for "error" or "unknown" where, if there's an issue, we send up whatever we have plus the file name and store it somewhere else. I was told I was being ridiculous as it wasn't built to track anything like that and that would never work ... his solution? ... pull the library out of the app and forget it.
... once again giving everyone no data.
G later moved onto another cross-platform style project. The backend team were particularly unhappy as they got no spec of what needed to be done. All they knew was it was a single endpoint dealing with a very complex model. There were no Java classes, super classes, abstract classes or even interfaces, just this huge chunk of mocked data. So the lead and I sat down with him and asked where the interfaces for the backend were, or the designs / architecture for them, etc.
His response, to this day frightens me ... not makes me angry, not bewilders me ... scares the living shit out of me that people like this exist in the world and have successful careers.
G: "hhhmmm, I know how to build an interface, but i've never understood them ... Like lets say I have an interface, what now? how does that help me in any way? I can't physically use it, does it not just use up time building it for no reason?"
us: "... ... how are the backend team suppose to understand the model, its types, integrate it into the other systems?"
G: "Can I not just tell them and they can write it down?"
**
I'll just pause here for a moment, as you'll likely need to read that again out of sheer disbelief
**
I've never seen someone die inside the way the lead did. He started a syllable and his face just dropped, eyes glazed over and he instantly lost all the will to live. He replied:
" wel ............... it doesn't matter ... its not important ... I have to go, good luck with the project"
*killed the screen share and left the room*
Now, I know you are all dying in suspense to know what happened to that project, so I can drop the shocking bombshell that it was in fact cancelled. Thankfully only ~350 man-hours were spent on it
... yep, not a typo.
G's crowning achievement however will go down in history. VERY long story short, backend got deployed to the server and EVERYTHING broke. Lead investigated, found mistakes and config issues on every second line, load balancer wasn't even starting up. When asked had this been tested before it was deployed:
G: "Yeah I tested it on my machine, it worked fine"
lead: "... and on the server?"
G: "no, my machine will do the same thing"
lead: "do you have a load balancer and multiple VM's?"
G: "no, but Java is Java"
... and with that it's time to end today's episode. Will G be our most incompetent? ... maybe.
Tune in later for more of practiceSafeHex's most incompetent co-worker!!!
Oh, man, I just realized I haven't ranted one of my best stories on here!
So, here goes!
A few years back the company I work for was contacted by an older client regarding a new project.
The guy was now pitching to build the website for the Parliament of another country (not gonna name it, NDAs and stuff), and was planning on outsourcing the development, as he had no team and was only aiming to take care of the client service/project management side of the project.
Out of principle (and also to preserve our mental integrity), we have purposely avoided working with government bodies of any kind, in any country, but he was a friend of our CEO and pleaded until we signed on board.
Now, the project itself was way bigger than we expected, as they wanted more of an internal CRM, centralized document archive, event management, internal planning, multiple-interface, role-based-access-restricted monster of an administration interface, complete with a regular user website, also packed with all kinds of features, dashboards and so on.
Long story short, a lot bigger than what we were expecting based on the initial brief.
The development period was hell. New features were coming in on a weekly basis. Already implemented functionality was constantly being changed or redefined. No requests we ever made about clarifications and/or materials or information were ever answered on time.
They also somehow bullied the guy that brought us the project into also including the data migration from the old website into the new one we were building, and we somehow ended up having to extract meaningful, formatted, sanitized content by parsing static HTML files and connecting it to downloadable files (almost every page in the old website had files available to download) that we also needed to include in a sane way.
Now, don't think the files were simple URL paths we could trace to a folder/file path, oh no!!! The links were some form of hash combination that had to be exploded and tested against some kind of database relationship tables that only had hashed indexes relating to other tables, which also only had hashed indexes relating to some other tables that kept a database of the website pages' HTML file naming. So what we had to do was identify the files based on a combination of hashed indexes and re-hashed HTML file names that in the end would give us a filename for a real file that we then had to search for inside a list of over 20 folders not related to one another.
So we did this. Created a script that processed the hell out of over 10000 HTML files, database entries and files and re-indexed and re-named all this shit into a meaningful database of sane data and well organized files.
So, with this we were nearing the finish line for the project, which by now had exceeded the estimated time by over two times.
We test everything, retest it all again for good measure, pack everything up for deployment, simulate on a staging environment, give the final client access to the staging version, get them to accept that all requirements are met, finish writing the documentation for the codebase, write detailed deployment procedure, include some automation and testing tools also for good measure, recommend production setup, hardware specs, software versions, server side optimization like caching, load balancing and all that we could think would ever be useful, all with more documentation and instructions.
As the project was built on PHP/MySQL (as requested), we recommended a Linux environment for production. Oh, I forgot to tell you that over the development period they kept asking us to also include steps for Windows procedures along with our regular documentation. Was a bit strange, but we added it in there just so we can finish and close the damn project.
So, we send them all the above and go get drunk as fuck in celebration of getting rid of them once and for all...
Next day: hung over, I get to the office, open my laptop and see on new email. I only had the one new mail, so I open it to see what it's about.
Lo and behold! The fuckers over in the other country that called themselves "IT guys", and were the ones making all the changes and additions to our requirements, were not capable enough to follow step by step instructions in order to deploy the project on their servers!!!
[Continues in the comments]
I absolutely HATE "web developers" who call you in to fix their FooBar'd mess, yet can't stop themselves from dictating what you should and shouldn't do, especially when they have no idea what they're doing.
So I get called in to a job improving the performance of a Magento site (and let's just say I have no love for Magento for a number of reasons) because this "developer" enabled Redis and expected everything to be lightning fast. Maybe he thought "Redis" was the name of a magical sorcerer living in the server. A master conjurer capable of weaving mystical time-altering spells to inexplicably improve the performance. Who knows?
This guy claims he spent "months" trying to figure out why the website couldn't load faster than 7 seconds at best, and his employer is demanding a resolution so he stops losing conversions. I usually try to avoid Magento because of all the headaches that come with it, but I figured "sure, why not?" I mean, he built the website less than a year ago, so how bad can it really be? Well...let's see how fast you all can facepalm:
1.) The website was built brand new on Magento 1.9.2.4...what? I mean, if this were built a few years back, that would be a different story, but building a fresh Magento website in 2017 in 1.x? I asked him why he did that...his answer absolutely floored me: "because PHP 5.5 was the best choice at the time for speed and performance..." What?!
2.) The ONLY optimization done on the website was Redis cache being enabled. No merged CSS/JS, no use of a CDN, no image optimization, no gzip, no expires rules. Just Redis...
3.) Now to say the website was poorly coded was an understatement. This wasn't the worst coding I've seen, but it was far from acceptable. There was no organization whatsoever. Templates and skin assets are being called from across 12 different locations on the server, making tracking down and finding a snippet to fix downright annoying.
But not only that, the home page itself had 83 custom database queries to load the products on the page. He said this was so he could load products from several different categories and custom tables to show on the page. I asked him why he didn't just call a few join queries, and he had no idea what I was talking about.
4.) Almost every image on the website was a .PNG file, 2000x2000 px and lossless. The home page alone was 22MB just from images.
There were several other issues, but those 4 should be enough to paint a good picture. The client wanted this all done in a week for less than $500. We laughed. But we agreed on the price only because of a long relationship and because they have some referrals they got us in the door with. But we told them it would get done on our time, not theirs. So I copied the website to our server as a test bed and got to work.
After numerous hours of bug fixes, recoding queries, disabling Redis and opting for a higher InnoDB cache (more on that later), image optimization, JS/CSS/HTML combining, render-unblocking and minification, lazyloading images, tweaking Magento to work with PHP7, installing OpCache and setting up basic htaccess optimizations, we smashed the loading time down to 1.2 seconds total, and most of that time was for external JavaScript plugins deemed "necessary". Time to First Byte went from a staggering 2.2 seconds to about 45ms. Needless to say, we kicked its ass.
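Not part of the original work, but if you want to check numbers like these yourself, curl will print them for any URL (the shop address here is obviously a placeholder):
curl -o /dev/null -s -w 'TTFB: %{time_starttransfer}s  total: %{time_total}s\n' https://shop.example.com/
That one-liner reports the time to first byte and the total load time for a single request.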
So I show their developer the changes and he's stunned. He says he'll tell the hosting provider create a new server set up to migrate the optimized site over and cut over to, because taking the live website down for maintenance for even an hour or two in the middle of the night is "unacceptable".
So trying to be cool about it, I tell him I'd be happy to configure the server to the exact specifications needed. He says "we can't do that". I look at him confused. "What do you mean we 'can't'?" He tells me that even though this is a dedicated server, the provider doesn't allow any access other than a jailed shell account and cPanel access. What?! This is a company averaging 3 million+ per year in revenue. Why don't they have an IT manager overseeing everything? Apparently for them, they're too cheap for that, so they went with a "managed dedicated server", "managed" apparently meaning "you only get to use it like a shared host".
So after countless phone calls arguing with the hosting provider, they agree to make our changes. Then the client's developer starts getting nasty out of nowhere. He says my optimizations are not acceptable because I'm not using Redis cache, and now the client is threatening to walk away without paying us.
So I guess the overall message from this rant is not so much about the situation, but the developer and countless others like him that are clueless, but try to speak from a position of authority.
If we as developers don't stop challenging each other in a measuring contest and learn to let go when we need help, we can get a lot more done and prevent losing clients. </rant>
A wild Darwin Award nominee appears.
Background: Admins report that a legacy nightly update process isn't working. Ticket actually states problem is obviously in "the codes."
Scene: Meeting with about 20 people to triage the issue (blamestorming)
"Senior" Admin: "update process not working, the file is not present"
Moi: "which file?"
SAdmin: "file that is in ticket, EPN-1003"
Moi: "..." *grumbles, plans murder, opens ticket*
...
Moi: "The config dotfile is missing?"
SAdmin: "Yes, file no there. Can you fix?"
Moi: "Engineers don't have access to the production system. Please share your screen"
SAdmin: "ok"
*time passes, screen appears*
Moi: "ls the configuration dir"
SAdmin: *fails in bash* > ls
*computer prints*
> ls
_.legacyjobrc
Moi: *sees issues, blood pressure rises* "Please run list all long"
SAdmin: *fails in bash, again* > ls ?
Moi: *shakes* "ls -la"
SAdmin: *shonorable mention* > ls -la
*computer prints*
> ls -la
total 1300
drwxrwxrwx- 18 SAdmin {Today} -- _.legacyjobrc
Moi: "Why did you rename the config file?"
SAdmin: "Nothing changed"
Moi: "... are you sure?"
SAdmin: "No, changed nothing."
Moi: "Is the job running as your account for some reason?"
SAdmin: "No, job is root"
Moi: *shares screenshot of previous ls* This suggests your account was likely used to rename the dotfile, did you share your account with anyone?
SAdmin: "No, I rename file because could not see"
Moi: *heavy seething* so, just to make sure I understand, you renamed a dotfile because you couldn't see it in the terminal with ls?
SAdmin: "No, I rename file because it was not visible, now is visible"
Moi: "and then you filed a ticket because the application stopped working after you renamed the configuration file? You didn't think there might be a correlation between those two things?"
SAdmin: "yes, it no work"
Interjecting Director: "How did no one catch this? Why were there no checks, and why is there no user interface to configure this application? When I was writing applications I cared about quality"
Moi: *heavy seething*
IDjit: "Well? Anyone? How are we going to fix this"
Moi: "The administrative team will need to rename the file back to its original name"
IDjit: "can't the engineering team do this?!"
Moi: "We could, but it's corporate policy that we have no access to those environments"
IDjit: "Ok, what caused this issue in the first place? How did it get this way?!"
TFW you think you've hit the bottom of the idiocy barrel, and the director says, "hold my mango lassi."
[This makes me sound really bad at first, please read the whole thing]
Back when I first started freelancing I worked for a client who ran a game server hosting company. My job was to improve their system for updating game servers. This was one of my first clients and I didn't dare to question the fact that he was getting me to work on the production environment, as they didn't have a development one set up. I came to regret that decision when, out of nowhere during the first test, files just started deleting. I panicked as one would and tried to stop the webserver it was running on, but oh no, he hadn't given me access to any of that. I thought well shit, I might as well see where I fucked up, since it was midnight for him and I wasn't able to get a hold of him. I looked at every single line hundreds of times trying to see why it would have started deleting files. I found no cause. Exhausted (this was 6am by this point), I pretty much passed out. I woke up around 5 hours later with my face on my keyboard (I know you've all done that) only to see a good 30 messages from the client screaming at me. It turns out that during that time every single client's game server had been deleted. Before responding and begging for forgiveness, I decided to take another crack at finding the root of the problem. It wasn't my fault. I had found the cause! It turns out a previous programmer had a script that would run "rm -rf" + (insert file name here) on the old server files, only he had fucked up the line and it would run "rm -rf /". I have never felt more relieved in my life. This script had been disabled by the original programmer, but the client had set it to run again so that I could remake the system. Now, I was never told about this specific script as it was for a game they didn't host anymore.
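My guess at what that fucked-up line looked like (the rant never shows the actual script, so the variable name here is invented): if the variable holding the server directory ends up empty, plain string concatenation quietly turns the command into "rm -rf /".
GAME_DIR=""                               # never got set: the bug
# rm -rf "/$GAME_DIR"                     # expands to: rm -rf /
rm -rf "/${GAME_DIR:?GAME_DIR not set}"   # the ":?" guard aborts with an error instead of wiping the disk
Bash's ${var:?} expansion is about the cheapest protection against exactly this.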
I realise this is getting very long so I'll speed it up a bit.
He didn't want to take the blame and said I added the code and it was all my fault. He told me I could be on live chat support for 3 months at his company or pay $10,000. Out of all of this, I had at least made sure to document what I was doing and back up every single file before I touched them, which managed to save my ass when it came to him threatening legal action. I showed him my proof, which resulted in him trying to guilt-trip me into working for him for free, as he had lost about 80% of his clients. By this point I had been abused constantly for 4 weeks by this son of a bitch. As I was underage, he had said that if we went to court he'd take my parents' house and make them live on the street. So how does one respond? A simple "Fuck off you cunt" and a block.
That was over 8 years ago and I haven't heard from him since.
If you've made it this far, congrats, you deserve a cookie!6 -
Our website once had its config file (“old” .cgi app) open and available if you knew the file name. It was ‘obfuscated’ with the file name “Name of the cgi executable”.txt. So for browsing.cgi, the config file was browsing.txt.
After I discovered the SQL Server admin password in plain text and reported it to the VP, he called a meeting.
VP: “I have a report that you are storing the server admin password in plain text.”
WebMgr: “No, that is not correct.”
Me: “Um, yes it is, or we wouldn’t be here.”
WebMgr: “It’s not a network server administrator, it’s SQL Server’s SA account. Completely secure since that login has no access to the network.”
<VP looks over at me>
VP: “Oh..I was not told *that* detail.”
Me: “Um, that doesn’t matter, we shouldn’t have any login password in plain text, anywhere. Besides, the SA account has full access to the entire database. Someone could drop tables, get customer data, even access credit card data.”
WebMgr: “You are blowing all this out of proportion. There is no way anyone could do that.”
Me: “Uh, two weeks ago I discovered the catalog page was sending raw SQL from javascript. All anyone had to do was inject a semicolon and add whatever they wanted.”
WebMgr: “Who would do that? They would have to know a lot about our systems in order to do any real damage.”
VP: “Yes, it would have to be someone in our department looking to do some damage.”
<both the VP and WebMgr look at me>
Me: “Open your browser and search on SQL Injection.”
<VP searches on SQL Injection..few seconds pass>
VP: “Oh my, this is disturbing. I did not know SQL injection was such a problem. I want all SQL removed from javascript and passwords removed from the text files.”
WebMgr: “Our team is already removing the SQL, but our apps need to read the SQL server login and password from a config file. I don’t know why this is such a big deal. The file is read-only and protected by IIS. You can’t even read it from a browser.”
VP: “Well, if it’s secured, I suppose it is OK.”
Me: “Open your browser and navigate to … browse.txt”
VP: “Oh my, there it is.”
WebMgr: “You can only see it because your laptop had administrative privileges. Anyone outside our network cannot access the file.”
VP: “OK, that makes sense. As long as IIS is securing the file …”
Me: “No..no..no.. I can’t believe this. The screen shot I sent yesterday was from my home laptop showing the file is publicly available.”
WebMgr: “But you are probably an admin on the laptop.”
<couple of awkward seconds of silence…then the light comes on>
VP: “OK, I’m stopping this meeting. I want all admin users and passwords removed from the site by the end of the day.”
Took a little longer than a day, but after reviewing what the web team changed:
- They did remove the SQL Server SA account, but replaced it with another account with full admin privileges.
- Replaced the “App Name”.txt with centrally located config file at C:\Inetpub\wwwroot\config.txt (hard-coded in the app)
When I brought this up again with my manager..
Mgr: “Yea, I know, it sucks. WebMgr showed the VP the config file was not accessible by the web site and it wasn’t using the SA password. He was satisfied by that. Web site is looking to beat projections again by 15%, so WebMgr told the other VPs that another disruption from a developer could jeopardize the quarterly numbers. I’d keep my head down for a while.”
Today was fucking awesome!
I always wanted to do a project in C++ since I've been more of a Java guy for years now.
And today, I finally wrote a full console program in C++! (For windows, it's a .exe)
The purpose of that program is to show if a file has a file lock on it (because of copying for example).
It started as simple as that, but got complicated quickly:
- It needs colors! So I added colors.
- Just a single file? Boring. I need wildcards, so I can put a * for anything in the file name! Yep.
- Just one directory? Boring. I need a recursive directory walk! Got it.
- But wait! There has to be an option to switch between recursive and wildcard/single mode! So I checked if the first argument equals "-r"! Hacky but works.
- Oh uh... that spams a lot now! The purpose was to show locked files, so I need another argument to specify that I only want to see locked files! Damn, now it gets hard... I need a Linux-like command line argument parser (this -h and -s "hello" stuff). So I took the opportunity to write one myself! Done.
- Refactoring everything to use my new fancy parser...
- Adding more and more arguments, just because I can:
- "-d" hides "access denied" messages
- "-l" shows only locked files
- "-r" activates recursive directory walk
- "-f" formats everything nicely, basically printf("%-150.150s | %s", filename, locked); a maximum width which get's truncated if too long so everything lines up nicely
- "-h" which of course displays the help page
- "-w file" watches a file, if the file is locked it will refresh every 500ms, if it's still locked nothing happens, if it's unlocked, the program prints "unlocked" in green and exits. And yes, it does have a rotating line (something like this: "-" "\" "|" "/" "-" and so forth...)
That project was just awesome to make. I learn languages fastest if I just do a big project in them, and today, I really learned a lot.
Thank you for reading all this!
Monday morning, went to the local grocery store to get myself some croissants and 2 bottles of wine.
Cashier: "Already at it in the morning, you sure about that?"
Me: "Long story short, I've got a Wi-Fi driver from Intel to debug and rewrite, and it's a fucking piece of shit.. can't go at it without hitting or preferably exceeding the Ballmer Peak... Also I'm awake since yesterday evening already."
Why even ask? Yeah I'm a fucking alcoholic, and guess why that is.. stupid nontechnical fucks, certified enganeers like that motherfucker at Intel who wrote this pile of garbage called ipw2200, and technology that can't be arsed to work properly on its own unless I build the fucking thing myself, just to name a few reasons.
You know what, fucking piece of shit from Intel, whoever it is? How about I let you choke on my dick while fucking hanging you with a sharp metal wire that's carrying 2kVAC from a microwave transformer, just to see whether I'd nut first, or you either choke, get electrocuted, or get your fucking throat slit first. Certificates aren't an excuse for committing this fucking pile of shit and calling it a fucking product!!
Now, it's time to dive into this giant stinking fucking turd I guess.. first glass of wine to get myself prepared for the shitstorm that's a giant 20k LoC C file with barely any comments, to look what the fuck causes this fucking pile of shit to disconnect and ask for WPA credentials after a while, despite having them stored.. and not reconnect after that, because why the fuck would you?!
Story time:
Yesterday I wanted to go to the theater with my girlfriend. It was her idea because as a student you can get reduced tickets for the play, but only via the online store exactely two hours before the play starts. We had already tried two weeks before but with no success. So this time I said i want to be on my pc with a proper browser and not a mobile version like last time. So we are sitting at home me in front of their website on one screen and with a clock on the other screen. Two minutes realy i hit refresh and I get a selection for the reduced tickets, nice.
You would think.
After selecting the amount: ERROR, cannot get your tickets. I was like fuck, they are already sold out because it's a popular play. But hey, let's try again. I got one ticket but not the second one, okay strange, let's try again, same ERROR again. WHAT the FUCK, no feedback whatsoever. My girlfriend then had the idea that they maybe restricted the amount of reduced tickets to one (it does not state this explicitly, but hey, let's give it a shot). Use a second browser, select one ticket. ERROR, cannot get you the amount of seats. Rage level near 1000: why did it work two minutes before but not anymore? Trying around for five more minutes finally got the second ticket.
Now the real fun begins.
Proceeding to checkout should not be that hard, you would think, but you need to be registered for that. Okay, so let's do that. The salutation is not required, neither is the address for the tickets, but you need to have a company name??!!!!! The fuck?? I am not self-employed and neither are most other people around here, so why is this field mandatory? Being a little under stress I decided to found the "asdf" company with my girlfriend.
Now one would think checking out is easy. Not so fast.
After accepting the terms of service, another ERROR: unable to accept your data. What data? I did not input anything new! Where does this come from? Ok, never mind, I am going to pay with credit card, that must work!
ERROR: Internal paymentservice initialization failure! Sorry, what? I thought maybe I had been idle too long in this browser and they do not reserve the tickets for that long (which would be no surprise to me at this point). Let's try again. Nope, same error.
Now my rage level was really over 9000, but we really wanted to go, so I decided to call the customer SUPPORT. Or better to say, I had an answering machine telling me for ten minutes how sorry they are that this takes so long, yeah you bet. Then, and this is now really great: the support guy asks me: "What error do you see? Internal paymentservice initialization failure?" I was like, okay, he knows this, so they need to know how to handle it. FUCK NO. "Sorry I can't help you. This is our payment system, maybe they (IT) are doing some maintenance, I can't help you. Call the theater directly, good day." Sorry, what just happened? You fuckers are the vendors for the tickets for nearly all big events around here and the theater explicitly states to call you for tickets, but you can not help me? Like hell.
This process took 25 very frustrating minutes and I was really angry and wanted to quit. Then I saw that there is also a PayPal option which I had not tried. With very little hope I selected everything for the payment, registered with PayPal, and they told me I already had an account. So I reactivated this five-year-old account, paid with all the mobile passwords and TANs, to finally, after 30 fucking minutes, get a PDF file for a ticket. Repeated the last step for the second ticket and with some time left to get there we were off.
Been reviewing A LOT of client code and suppliers' code lately. I just want to sit in the corner and cry.
Somewhere along the line the education system has failed a generation of software engineers.
I am an embedded C programmer, so I'm pretty low level, but I have worked up and down and across the abstractions in the industry. The high-level guys, I think, don't make these same mistakes due to the stuff they learn in CS courses regarding OOD.. in reference to how to properly architect software in a modular way.
I think it may be that too often the embedded software is written by EEs and not CEs, and due to their curriculum they lack good software architecture design.
Too often I will see huge functions with large blocks of copy-pasted code where the only difference is a variable name. All stuff that can be turned into tables and iterated through so the function can be less than 20 lines long in the end, which is like a 100x improvement when the function started out as 2000 lines, because they decided to hard-code everything and not let the code and processor do what they're good at.
Arguments of performance are moot at this point, I’m well aware of constraints and this is not one of them that is affected.
The problem I have is trying to take their code in and understand what it's trying to do, and to do that you must scan up and down HUGE sections of the code, even 10k+ lines in one file, because their design was to not even use multiple files!
Does their code function? Yes. Does it work? Yes. The problem is readability and maintainability: completely non-existent.
I see it so often I almost begin to second-guess myself and think.. am I the crazy one here? No. And it's not their fault, it's the education system. They weren't taught it, so they think this is just what programmers do.. hugely mundane copy-pasting of words, change a few little things here and there and done. NO. Actual software engineers architect systems and write code in a way so they can do it in the laziest way possible. Not how these folks do it.. it's like all they know are if statements and switch statements and everything else is unneeded.. fuck structures and shit, just hard-code it all... explicitly write everything, let's not be smart about anything.
I know I’ve said it before but with covid and winning so much more buisness did to competition going under I never got around to doing my YouTube channel and web series of how I believe software should be taught across the board.. it’s more than just syntax it’s a way of thinking.. a specific way of architecting any software embedded or high level.
Anyway, rant off, had to get that off my chest. I literally want to sit in the corner and cry this weekend at the horrible code I'm reviewing, and it just constantly keeps happening. Over and over and over. The more people I bring on or projects I acquire, it's like fuck me, wtf is this shit!!! Take some pride in the code you write!
OK I can't deal with this user anymore.
This morning I get a text. "My laptop isn't getting emails anymore I'm not sure if this is why?" And attached is a screenshot of an email purporting to be from "The <company name> Team". Which isn't even close to the sort of language our small business uses in emails. This email says that his O365 password will soon be expiring and he needs to download the attached (.htm) file so he can keep his password. Never mind the fact that the grammar is awful, the "from" address is cheesy and our O365 passwords don't expire. He went ahead and, in his words, "Tried several of his passwords but none of them worked." This is the second time in less than a year that he's done this and I thought we were very clear that these emails are never real, but I'll deal with that later.
I quickly log into the O365 admin portal and reset his password to a randomly-generated one. I set this to be permanent since this isn't actually a password he should ever be needing to type. I call him up and explain to him that it was a phishing email and he essentially just gave some random people his credentials so I needed to reset them. I then help him log into Outlook on his PC with the new password. Once he's in, he says "so how do I reset this temporary password?" I tell him that no, this is his permanent password now and he doesn't need to remember it because he shouldn't ever need to be typing it anyway. He says "No no no that won't work I can't remember this." (I smile and nod to myself at this point -- THAT'S THE IDEA). But I tell him when he is in the office we will store the password in a password manager in case he ever needs to get to it. Long pause follows. "Can't I just set it back to what it was so I can remember it?"
Please allow me to share my thoughts, since I can't fully vent my frustration because we have this so-called fasting to control our anger towards a person we currently disagree with.
A letter from your loving, sincere, pretty and gorgeous working partner to my young, chubby, smart and clever colleague:
Please do cooperate in times of live editing from the FTP, since the CTO is not and never will be going to appreciate version control, since the CTO is too tired to give a shit and just wants deliverables delivered perfectly as fuck regardless of the resources that we have.
As you know, I tolerated you not getting the freedom of live editing that you experienced with your previous team lead. All I ask of you is to get a fresh file from the FTP whenever we touch the same file, because firstly, God knows as well as I do how frustrating it is to have your hard work replaced and gone. Secondly, I don't want you to experience what a pain in the ass this can be in the long run, and lastly, I don't want any hard feelings to be wasted just because of this.
P.S. I'm too shy to send this to you because I don't want to hurt your feelings and don't want to sound too seriouz and feel old. I also hope we share the same telepathic understanding so we can agree with each other.
Your loving, sincere, pretty and gorgeous working partner,
xoxo ❤️
(thinking of stating my first name) 😂
Hi every developer! My name is Allen. English is not my native language, so forgive me if I say something that does not make any sense. Let me tell you the story of how I became a programmer. (I am still learning.) My first computer was a DELL OptiPlex GX 720 desktop. My father bought it for our self-employed work. Before he allowed me to use the computer, I used to sit next to him and watch what he did, what he clicked and what he got. When he allowed me to use the computer, I was slow at typing. One or 2 WPM (words per minute). My father taught me how to use the computer, and very slowly my typing speed improved. I understood how to use the computer, but one day I did something I came to regret. I was playing with some executables; when I double-clicked them, they did not work. I used to associate files with apps. I would associate music files with every player I wanted. So, I did what I was used to: I associated exe files with Windows Media Center! The computer started to open hundreds of Windows Media Center (WMC for short) windows; whenever an app was clicked, it opened Windows Media Center. Today, I realize that Windows was trying to open every app and every process that regularly runs. However, since I had associated them with WMC, instead of the app itself it opened WMC. Some days after the mistake, I wondered how apps work and how I could create my own. I began my search with Google. Every time I searched, it said "learn to code" or something like that. I saw some C++ code, but it was disgusting. When I read just a few lines of hello world code in Java, it was too complex.
What I saw:
#$$#% $%&$%&*#!@
~
(&*%&$ (_(*^% #&&* (^^$(&^$%^( %^*$())
~
^$70^(`*#%`*#&%^)*!" Hello world "#@
~
~
The actual code:
class helloworld
{
    public static void main(String args[])
    {
        System.out.println("Hello World!");
    }
}
I looked for an easy way, but my attempts failed. Then I pushed myself to learn how to code. I tried learning Java, but it was still very complex. I tried LibertyBASIC. From LibertyBASIC to Java: after learning LibertyBASIC, it was easy!
LibertyBASIC -> Java -> Ruby -> NOW, C# and XAML
Today, I am learning C# and XAML.
My first OS : Windows 7
My first Computer : DELL OptiPlex GX 720
My first successful click : The Start menu
My first used App : Microsoft Encarta 2009
My first created App : Hi-Lo (a number-guessing game, written in LibertyBASIC)
Thank you for reading this long story.
Many of you who have a Windows computer may be familiar with robocopy, xcopy, or move.
These functions? Programs? Whatever they may be, were interesting to me because they were the first things that got me really into batch scripting in the first place.
What was really interesting to me was how I could run multiples of these scripts at a time.
<storytime>
It was a warm spring day in the year 2007, and my science teacher at the time needed a way to get files from the school computer to the hard drive faster. The amount of time that the computer was suggesting was 2 hours. Far too long for her. I told her I'd build her something that could work faster than that. And so started the program that would take up more of my time than the AI I had created back in 2009.
</storytime>
This program would scan the entirety of the computer's file system and create an xcopy batch file for each of its directories. After parsing these files, it would then run all the batch files at once. Multithreading, as it were? Looking back on it, the throughput probably wasn't any better than the default copying program Windows already had, but the amount of time that it took was less. Instead of 2 hours to finish the task, it took 45 minutes. My thought for justifying this program was that instead of giving one man all the paperwork, you split the paperwork among many men. So, while a large file is being copied, many smaller files can be copied during that time.
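The same trick in shell terms, for anyone who never touched Windows batch (hypothetical paths, GNU xargs assumed; the real program generated one xcopy .bat per directory instead):
find /mnt/source -mindepth 1 -maxdepth 1 -type d -print0 | xargs -0 -P4 -I{} cp -r {} /mnt/backup/
Each top-level directory gets its own cp job, four running at a time, so small files keep flowing while a big one is still being copied.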
After that day I really couldn't keep my hands off this program. As my knowledge of programming increased, so did my likelihood of editing a piece of the code in this program.
The sheer amount of updates that this program has gone through is amazing. At version 6.25 it now sits as a standalone batch file. It used to consist of 6 files plus however many xcopy batch files it created for the file migration; now it's just 1 file and dirt simple to run (well, the front-end anyway, the back-end is a masterpiece of weirdness, honestly), and it automates adding all the necessary directories and files. Oh, and the name is Latin for Imitate; I figured it's a reasonable name for a copying program.
I was 14, so my creativity lacked in the naming department >_<
Do you have a ‘Drama Queen’ on your team?
This happened last week.
DK = Drama Queen
DK: “OMG..the link to the document isn’t working! All I get is page not found. I’m supposed to update the notes for this project…and now I can’t! What the _bleep_ and I supposed to do now?!...I don’t understand how …”
This goes on for it seems 5 minutes.
Me: “Hold on...someone probably accidentally mistyped the file name or something. I’m sure the document is still there.”
DK: “Well, I’ll never find it. Our intranet is a mess. I’m going to have to tell the PM that the project is delayed now and there is nothing I can do about it because our intranet is such a mess.”
Me: “Maybe, but why don’t you open up the file and see where the reference is?”
DK: “Oh, _bleep_ no…it is HTML…I don’t know anything about HTML. If the company expects me to know HTML, I’m going to have to tell the PM the project is delayed until I take all the courses on W3-Schools.”
Me: “Um…you’ve been developing as long as I have and you have a couple of blogs. You know what an anchor tag is. I don’t think you have to take all those W3 courses. It’s an anchor tag with a wrong HREF, pretty easy to find and fix”
DK: “Umm…I know *my* blog…not this intranet mess. Did you take all the courses on W3-Schools? Do you understand all the latest web html standards?”
Me: “No, but I don’t think W3 has anything to do the problem. Pretty sure I can figure it out.”
DK: “ha ha…’figuring it out’. I have to know every detail on how the intranet works. What about the javascript? Those intranet html files probably have javascript. I can’t make any changes until I know I won’t break anything. _bleep_! Now I have to learn javascript! This C# project will never get done. The PM is going to be _bleep_issed! Great..and I’ll probably have to work weekends to catch up!”
While he is ranting…I open up the html file, locate the misspelling, fix it, save it..
Me: “Hey..it’s fixed. Looks like Karl accidently added a space in the file name. No big deal.”
DK:”What!!! How did you…uh…I don’t understand…how did you know what the file name was? What if you changed something that broke the page? How did you know it was the correct file? I would not change anything unless I understood every detail. You’re gonna’ get fired.”
Me: “Well, it’s done. Move on.”
I've never used Windows in my day-to-day life. No kidding.
When I got my father's first computer, I used an old distribution called BBC Linux. I didn't have any computer knowledge, it was my first contact with a computer, so I went to a friend's house and asked for a CD to install on my computer. I don't know if this friend ended up making a "gotcha" and thought I'd give up, but I just read the manuals and fell in love. That was year 2000.
Then I used Conectiva Linux, then I went to Red Hat 9, then Slackware, then in 2007 I started using Solaris. And I stayed on Solaris (Solaris 10, Solaris Nevada and OpenSolaris) until 2011.
In 2011 I bought a Mac. I stayed at Apple until 2020, when I couldn't stand Apple forcing me to buy new computers (I still don't understand how a 2011 iMac, i5 (4 Hyper Thread cores) with 16GB of RAM, 1TB SSD only runs up to High Sierra).
Then I bought a Dell. It came with Windows 10, the first thing I did was install WSL2. I could not stand it, the system is bad, sorry. I installed OpenSuse and have been using it for two years.
It's just that every day someone tells me "how can you use this"? "There is no alternative to Windows, do you want to be different?"
I know that my story was the reverse of the "mainstream", so I'm going to talk about my vision of Windows, that in my brain it is actually the "alternative".
- Having a file explorer without "tabs" in 2022 is unthinkable for me.
- I love terminal. And the Windows terminal is very limited. "ps ... | awk ... | xargs ..." is a must for me. "find ./ -name '...' -exec ..."... these things on Windows are totally "different" and have the "powershell way" while all other operating systems keep the same form. And cygwin is not an option. As Wine for serious work is also not.
- Dragging a file into the terminal, and having it write its path, is so natural, that when Windows didn't do it, I was dismayed.
- I've always used StarOffice, OpenOffice and now LibreOffice. All the people in my story received my documents and reports as a PDF and no one complained. Until a coworker saw me editing in LibreOffice and said "oh I want it in word format". As long as he didn't know, everything was fine, right?
- Windows is paid. And is there advertising? I don't understand. And I refuse. If you want to display advertising, then excuse me. I have no problem paying, I'm not an opensource shiite. It's just that paying and not working bothers me much more than an opensource that I can fix or expect a fix knowing the good will of the people involved.
- Hyper-V is a joke. QEMU/KVM is better, and Bhyve on FreeBSD which is a very young project, is already a million times better than Hyper-V.
- Developing in C/C++ for Windows is only possible in two ways: Either you've always lived in Windows and your brain is conditioned, or you compile with MSYS2 (CLang or GCC).
- There is no significant evolution of the windows desktop since 95.
- Multiple workspace support with multiple monitors, not ready. It's another joke.
- REGEDIT does not need any comment.
- The system loses performance over time. I still don't know how Windows achieves this.
- I've seen people complain about desktop fragmentation on Unix and Linux. Many DEs end up leaving applications with different themes (like running a Qt application in Gnome and GTK in KDE), but to be quite honest, the lack of Windows standard bothered me much more. Even Microsoft's own software is completely different: Control Panel, Calculator, Paint and Office, To-Do, and Settings, have horrible style differences and look-and-feel fragmentation.
- Dark mode has not been implemented. It's another joke. Many applications are white while everything else is dark. Sorry, even on Linux which is a mess, this has been resolved. And well resolved.
- NTFS? Serious?
- C:, D:.. It doesn't convince me since DOS.
- Bloatware.
- News "biased" in the search bar is a lack of respect for those who use the computer to work.
And that. For me, Windows is the alternative operating system. I can't take Windows seriously, for me it's an experimental one like Haiku or ReactOS. It's good to play.
About market share, it doesn't convince me to use it. But it convinces me to sell. I've always developed applications to run on Windows. And when I need to, I turn on a VM to compile the project. But in everyday life? Impractical.
I've kinda ghosted DevRant so here's an update:
VueJS is pretty good and I'm happy using it, but it seems I need to start with React soon to gain more business partnerships :( I'm down to learn React, but I'd rather jump into Typescript or stick with Vue.
Webpack is cool and I like it more than my previous Gulp implementation.
Docker has become much more usable in the last 2 years, but it's still garbage on Windows/Mac when running an application that runs on Symfony...without docker-sync. File interactions are just too slow for some of my enterprise apps. docker-sync was a life-saver.
I wish I had swapped ALL links to XHR requests long ago. This pseudo-SPA architecture that I've got now (still server-side rendered) is pretty good. It allows my server to do what servers do best, while eliminating the overhead of reloading CSS/JS on every request. I wrote an ES6 component for this: https://github.com/HTMLGuyLLC/... - Frankly, I could give a shit if you think it's dumb or hate it or think I'm dumb, but I'd love to hear any ideas for improving it (it's open source for a reason). I've been told my script is super helpful for people who have Shopify sites and can't change the backend. I use it to modernize older apps.
ContentBuilder.js has improved a ton in the last year and they're having a sale that ends today if you have a need for something like that, take a look: https://innovastudio.com/content-bu...
I bought and returned a 2019 Macbook pro with i9. I'll stick with my 2015 until we see what's in store for 2020. Apple has really stopped making great products ever since Jobs died, and I can't imagine that he was THAT important to the company. Any idiot on the street can you tell you several ways they could improve the latest models...for instance, how about feedback when you click buttons in the touchbar? How about a skinnier trackpad so your wrists aren't constantly on it? How about always-available audio and brightness buttons? How about better ports...How about a bezel-less screen? How about better arrow keys so you can easily click the up arrow without hitting shift all the time? How about a keyboard that doesn't suck? I did love touch ID though, and the laptop was much lighter.
The Logitech MX Master 3 mouse was just released. I love my 2s, so I just ordered it. We'll see how it is!
PHPStorm still hasn't fixed a couple things that are bothering me with the terminal: can't reorder tabs with drag and drop, tabs are saved but don't reconnect to the server so the title is wrong if you reopen a project and forget that the terminal tabs are from your last session and no longer connected. I've accidentally tried to run scripts locally that were meant for the server more than once...
I just found out this exists: https://caniuse.email/
I'm going to be looking into Kubernetes soon. I keep seeing the name (docker for mac, digitalocean) so I'm curious.
AWS S3 Glacier is still a bitch to work with in 2019...wtf? Having to set up a Python script with a bunch of dependencies in order to remove all items in a vault before you can delete it is dumb. It's like they said "how can we make it difficult for people to remove shit so we can keep charging them forever?". I finally removed almost 2TB of data, but my computer had to run that script for a day....so dumb...
I was asked to fix a critical issue which had high visibility among the higher ups and was blocking QA from testing.
My dev lead (who was more like a dev manager) was having one of his insecure moments of “I need to get credit for helping fix this”, probably because he steals the oxygen from those who actually deserve to be alive and he knows he should be fired, slowly...over a BBQ.
For the next few days, I was bombarded with requests for status updates. Idea after idea of what I could do to fix the issue was hurled at me when all I needed was time to make the fix.
Dev Lead: “Dev X says he knows what the problem is and it’s a simple code fix and should be quick.” (Dev X is in the room as well)
Me: “Tell me, have you actually looked into the issue? Then you know that there are several race conditions causing this issue and the error only manifests itself during a Jenkins build and not locally. In order to know if you’ve fixed it, you have to run the Jenkins job each time which is a lengthy process.”
Dev X: “I don’t know how to access Jenkins.”
And so it continued. Just so you know, I’ve worked at controlling my anger over the years, usually triggered by asinine comments and decisions. I trained for many years with Buddhist monks atop remote mountain ranges, meditated for days under waterfalls, contemplated life in solitude as I crossed the desert, and spent many phone calls talking to Microsoft enterprise support while smiling.
But the next day, I lost my shit.
I had been working out quite a bit too so I could have probably flipped around ten large tables before I got tired. And I’m talking long tables you’d need two people to move.
For context, unresolved comments in our pull request process block the ability to merge. My code was ready and I had two other devs review and approve my code already, but my dev lead, who has never seen the code base, who gave up trying to learn how to build the app, and who hasn’t coded in years, decided to comment on my pull request that upper management has been waiting on and that he himself has been hounding me about.
Two stood out to me. I read them slowly.
“I think you should name this unit test better” (That unit test existed before my PR)
“This function was deleted and moved to this other file, just so people know”
A devil greeted me when I entered hell. He was quite understanding. It turns out he was also a dev.3 -
MTP is utter garbage and belongs to the technological hall of shame.
MTP (media transfer protocol, or, more accurately, MOST TERRIBLE PROTOCOL) sometimes spontaneously stops responding, causing Windows Explorer to show its green placebo progress bar inside the file path bar, which never reaches the end, and sometimes to whiningly show "(not responding)" with that white layer of mist fading in. It sometimes lists files' dates as 1970-01-01 (which is the Unix epoch), and sometimes shows former names of folders prior to being renamed, even after refreshing. I refer to them as "ghost folders". As is well known, large directories load extremely slowly over MTP. A directory listing with one thousand files can take well over a minute to load. On mass storage and FTP? Three seconds at most. Sometimes, new files are not even listed until rebooting the smartphone!
Arguably, MTP "has" no bugs. It IS a bug. There is so much more wrong with it that it does not even fit into one post. Therefore it has to be expanded into the comments.
When moving files within an MTP device, MTP does not directly move the selected files, but creates a copy and then deletes the source file, causing both needless wear on the mobile device's flash memory and the loss of the files' original date and time attributes. Sometimes, the simple act of renaming a file causes Windows Explorer to stop responding until unplugging the MTP device. It actually once unfroze after more than half an hour, where I did something else in the meantime, but come on, who likes to wait that long? Thankfully, this has not happened to me on Linux file managers such as Nemo yet.
When moving files out using MTP, Windows Explorer does not move and delete each selected file individually, but only deletes the whole selection after finishing the transfer. This means that if the process crashes, no space has been freed on the MTP device (usually a smartphone), and one will have to carefully sort out a mess of duplicates. Linux file managers thankfully delete the source files individually.
Also, for each file transferred from an MTP device onto a mass storage device, Windows has the strange behaviour of briefly creating a file on the target device with the size of the entire selection. It does not actually write that amount of data for each file, since it couldn't do so in this short time, but the current file is listed with that size in Windows Explorer. You can test this by refreshing the target directory shortly after starting a file transfer of multiple selected files originating from an MTP device. For example, when copying or moving out 01.MP4 to 10.MP4, while 01.MP4 is being written, it is listed with the file size of all 01.MP4 to 10.MP4 combined, on the target device, and the file actually exists with that size on the file system for a brief moment. The same happens with each file of the selection. This means that the target device needs almost twice the free space as the selection of files on the source MTP device to be able to accept the incoming files, since the last file, 10.MP4 in this example, temporarily has the total size of 01.MP4 to 10.MP4. This strange behaviour has been on Windows since at least Windows 7, presumably since Microsoft implemented MTP, and has still not been changed. Perhaps the goal is to reserve space on the target device? However, it reserves far too much space.
When transferring from MTP to a UDF file system, it sometimes fails to transfer ZIP files and only copies the first few bytes; 208 or 74 bytes in my testing.
When transferring several thousand files, Windows Explorer also sometimes decides to quit and restart in the midst of the transfer. Also, I sometimes move files out by loading a part of the directory listing in Windows Explorer and then hitting "Esc" because it would take too long to load the entire directory listing. It actually once assigned the wrong file names, which I noticed since file naming conflicts would occur where the source and target files with the same names would have different sizes and time stamps. Both files were intact, but the target file had the name of a different file. You'd think they would figure something like this out after two decades, but no. On Linux, the MTP directory listing is only shown after it is loaded in its entirety. However, if the directory has too many files, it fails with a "libmtp: couldn't get object handles" error without listing anything.
Sometimes, a folder appears empty until refreshing one more time. Sometimes, copying a folder out causes a blank folder to be copied to the target. This is why on MTP, only a selection of files and never folders should be moved out, due to the risk of the folder being deleted without everything having been transferred completely.
(continued below)29 -
Not happy about the modern wave of apps producing really long file name suffixes rather than trying to stick to just 3. For example .afphoto. It doesn't appeal to my OCD nature.3
-
I'm a TA myself and just yesterday wanted to defend my fellow TAs and CS/IT teachers from some of the rants here. Of course, not all of the rants are unfair, but I found a few quite unfair towards us, and I can fully understand a TA getting confused and tired after 5-7 hours of helping and wrapping your head around some of the harder problems the students run into.
However, I'm also a student myself and right now I'm fucking fed up with the shit my supervisor gives me regularly .. So let the rant flow!
(disclaimer: the following text uses “you” to address the rant recipient. So, dear reader, don't feel offended)
First off, why do you fucking care when and especially where I'm working on your project when you know I'm only working part time, since I'm usually tutoring students by daylight? Having me come in after my TA shift to work on your project instead of letting me go home, get some rest and food, and start working with a fresh head is neither helping you nor very productive. Also, if you want me to be productive and use your fucking tools to get going faster, you better not make me fucking debug your fucking tools. For instance, I don't even have the same first name, so all your fucking paths are invalid on my fucking machine! Also, I get that your machine is more powerful than mine and I don't really care about it, as long as you don't fucking push convoluted, messy, timing-sensitive scripts and make me search for the correct values on my machine. And, if a file your script is trying to delete is not there, aborting is not valid exception handling!
And don't get me started on the scripts that actually do some work besides setting up your fucking toolchain! -
A couple of years ago, we decided to migrate our customers' data from one data center to another; this is the story of how it went well.
The product was a Facebook canvas and mobile game with 200M users, which represented approximately 500 GiB of data to move, stored in MySQL and Redis. The source was in Dallas, and the target was New York.
Because downtime prevents users from spending their money on our "free" game, we decided to avoid it as much as possible.
In our main MySQL table (manually sharded into 100 tables), we had a modification TIMESTAMP column. We decided to use it to check whether a user needed to be copied to the new database. The rest of the data consisted of a savegame stored as gzipped JSON in a LONGBLOB column.
A Go program was developed to continuously track whether a user's data needed to be copied again every time progress was made on their savegame. The process went like this: first, the JSON was unzipped to detect bot users with no progress, which we simply dropped; then the data was exported into a custom binary file with fast compression to reduce the file size. Next, the exported file was copied to the new servers using rsync, and a second Go program did the import on the new MySQL instances.
The 1st loop took 1 week to copy; the 2nd took 1 day; a couple of hours for the 3rd, and so on. At the end, copying the latest versions of all the savegames took roughly a couple of minutes.
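(For the curious: the converging copy loop boils down to something like the sketch below. The real tool was written in Go; this Python-flavored version uses made-up host, credentials, table and column names, and stubs out the actual export/import steps.)

```python
# The real exporter was a Go program; this is just a Python-flavored sketch of
# the "copy whatever changed since the last pass" idea. Host, credentials,
# table and column names are made up, and export_and_copy() is a stub.
import datetime

import pymysql

def modified_since(conn, shard, cutoff):
    """Return user ids on one shard whose savegame changed after `cutoff`."""
    with conn.cursor() as cur:
        cur.execute(
            f"SELECT user_id FROM users_{shard:02d} WHERE modified > %s",
            (cutoff,),
        )
        return [row[0] for row in cur.fetchall()]

def export_and_copy(user_ids):
    """Stub: drop bots, dump savegames to the compressed binary format, rsync, import."""
    ...

conn = pymysql.connect(host="dallas-db", user="game", password="secret", database="game")
cutoff = datetime.datetime(1970, 1, 1)

while True:
    pass_started = datetime.datetime.utcnow()
    changed = []
    for shard in range(100):          # the 100 manually sharded user tables
        changed += modified_since(conn, shard, cutoff)
    export_and_copy(changed)
    cutoff = pass_started             # the next pass only picks up newer edits
    if len(changed) < 1000:           # delta is small enough for the final sync
        break
```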
On the Redis side, some of the data was cache that we knew could be dropped without impacting the user's experience. The rest were big bunches of data, so we simply SCANned each Redis instance and produced the same kind of custom binary files. The process was fast enough to launch just once during the migration: it took 15 minutes because we were able to parallelise across the 22 instances.
It took 6 months of meticulous preparation. On D-day, the process went smoothly, but we shut down our service for one long hour because of a typo in a domain name.1 -
Coworker: Oh, I couldn't find my Excel file, can you help me to find it?
Me: What is your file name?
Coworker: The file was created a long time ago; I forgot the file name. But I opened it yesterday.
Me: Ok, let's check open recent. (It's surprisingly empty)
Coworker: Yeah, I cleared it just before you arrived. I thought clearing the recent items would show older items.
Me: ...Ok, let's do a search on all Excel files. Which drive did you save your file to?
Coworker: I don't remember.
Me: (After searching) There are 1000 Excel files. You can start by eliminating the items you remember, and eventually you will find your file. That's all I can help with.
Coworker: @x$(/"! ?!
There's always a forgetful coworker around me who thinks I am a magician.2 -
Okay ... Windows, I really tried to be nice, but that's it. You get off my ssd right now ...
Just got the error: "filename is too long - unable to delete file" (full name incl. path)
Seriously, WHY?
I mean, sure ... Long file paths/names are a thing and this is why there are limitations. I am totally fine with it!
But why the hell does Windows allow you to create those files if it is unable to delete them later ...
I don't get it. Maybe I just hit my head too hard as a child and someone may enlighten me ...
PS: windows was running in a VM to give it a real serious try after years on Linux4 -
Not a rant, but I'm proud of myself :3
To make a long story short, I wanted a wrapper for an API (https://esi.evetech.net), but there were none that were updated or actively maintained, so I built my own. The first version had all the non-authed endpoints, and 2 days ago I finally got all the authed endpoints done, plus added features like a config file (for storing the token and project name) :D
I'm one happy fox rn!8 -
With the billions of dollars Google has, they can't even build a proper file manager for their Android operating system.
The pre-installed file manager on Android OS, codenamed "DocumentsUI", is functionally crippled and lacks the most basic functionality.
First of all, there is no range selection or A-to-B selection of items. If many items need to be selected, each item has to be tapped individually. Meanwhile, ES File Manager had A-to-B selection since at least 2012, back when Android OS was an operating system of freedom, before Android OS got cucked.
Like any low-tier mobile app, the file manager by Google also lacks a draggable scroll bar, so long lists have to be scrolled through manually. Even the file manager of Windows Mobile 6.5 Professional has a draggable scroll bar! And Windows Mobile 6.5 Professional was released in 2009! Samsung "My Files" had a draggable scroll bar in 2013, but it was later inexplicably removed.
Its search feature can only search the entire storage, not an individual folder, and lacks filters such as date and file type.
Obviously, as in any terrible Android file manager, after items are selected for copying and moving, tapping "Copy to..." or "Move to..." navigates back to the initial directory rather than staying in the current directory. The user is forced to navigate all the way to the folder with the selected files if the intention was moving files to a sub folder. Any Android file manager that does this automatically qualifies as a low-tier file manager.
The file manager by Google even lacks a "details" feature which shows information such as the exact file size and name and the total size and file count of a folder. Some file managers such as the one by MediaTek are unable to show the details for multiple selected items, which is somewhat forgivable, but the Google file manager does not have a "details" feature to begin with.
Files are always sorted alphabetically after each start. The Google file manager does not memorize if the user selects sorting "by size" or "by last modified". As one might expect, it indeed lacks reverse sorting.
Of course, there is no "open with" feature where the application can be selected manually, and there is no ability to create new blank files, and it lacks tabbed browsing, and does not show the number of files inside folders in list view. ES File Manager (before it became adware in ~2016) has all of these features.
Last but not least, there has been a bug where cancelling a file move operation deletes the source folder without it having been transferred. Presumably it has been patched by now; however, a bug where tapping "cancel" leads to data loss is inexcusable. It shows the app has not even been properly tested, let alone properly created.
http://archive.today/2020.10.27-160...
Google could have hired a college student who could have built something better than the scrapyard-worthy "file manager" they have built.
But granted, at least Google's ever-so-terrible file manager does not limit file names to fifty (50) characters like Samsung's TouchWiz file manager, also known as "My Files", did until at least 2016. There is no way to know what went through the head of the programmer who implemented this pointless limitation. Google's file manager also correctly handles file name conflicts by renaming the new files.
Microsoft built a better file manager for their operating system decades earlier than what Google threw together. Microsoft spent more of their money building a proper file manager.6 -
A friend of mine asked me yesterday for help with his bachelor thesis.
He wants to write about MySQL internals in regards to BLOB storage / usage.
We had a veeeerrrry long discussion....
And found a loooot of scary internet pages.
It's so .... Insane....
What some people with doctor titles or higher education generate...
Isn't content. More poo...
Most "blogs" / "articles" or whatever the author named it were missing all kinds of relevant data (version, configuration, anything relevant) but full of opinionated / biased bullshit.
Highlights were:
- we store a lot of BLOB data, backups take long and require more space
(you store additional data in a database, whaddya expect???!!!!)
- interesting guesswork about locking without any reference (interesting since it was sometimes so far away from reality that it looked more like quantum physics)
- storing blobs means that _each_ blob entry will be stored in a separate file (without any reference, but if an RDBMS did that... It would end in an amazing fireball I guess)
- BLOBs are bad since they can only represent the file content; the database cannot distinguish whether it's an MP3 / MPG or anything like that...
(Ehm. Yeah. And a database cannot distinguish whether you store under "Name" a name or gibberish?!)
I somehow think that some people got their doctorates and post this gibberish nonsense just so people stay dumb enough to give them a job...
Like the TV repairman who steals the batteries from the remote.
Even conspiracy theories were more convincing -
Why, in 2023, do we still have a path length limit in Windows?
I know it's not that trivial to find a good solution, but at least if I've got a .zip which I can't extract in that directory because of a few files with too-long file names, let me know in advance (and not during the extraction) and maybe (amazing idea) let me know how many steps I have to go back in the directory tree to make it work…12 -
just found a vulnerability in the website of the 3rd best high school in my country.
TL;DR: they had a c99 shell buried in some folders.
i am a beginner html/sql/php guy and really was looking into learning a bit here and there about them because i really like problem solving and found out ctfs mainly focus on this part of programming. i am a c++ programmer who does school-contest-style programming problems and i really enjoy them.
now back on topic.
with this urge to learn more web programming i said to myself what other method to learn better than real life sites! so i did just that. i first checked my school site. right click. inspect element. it seemed the site was made with wordpress. after looking more into the html code for the site i concluded all the images and files i could see on the site were from a folder on the server named 'wp-content/uploads'. i checked the folder. and here it got interesting. i did a get request on the site. saw the details. then i checked the site. bingo! there are 3 folders named '2017', '2018', '2019'. i said to myself: 'i am god.'
i could literally see all the announcements they have made from 2017-2019. and they were organised by month!!! my curiosity to see everything got me to the final destination.
with this adrenaline i thought about another site. in my city i have the 3rd most acclaimed high school in the country. what about checking their security?
so i typed the web address. looked around. again, right click, inspect element and looked around the source code. this time i was more lucky. this site is handmade!!! i was soooo happy because with my school's site i was restricted with what they have made with wordpress and i don't have much experience with it.
and so i began looking at what requests the site made for the logos and other links. it seemed all the other links on the site were in this format: www.site.com/index.php?home. and i was very confused and still am. is this referencing some part of the site in the index.php file? is the whole site written inside the index.php file and with the question mark you just get to a part of the site? i don't really get it.
so nothing interesting inside the networking tab, just some stylesheets for the site's design i guess. i switched to the debugger tab and holy moly!! yes, it had that tree structure. very familiar. just like a project inside codeblocks or something similar. and then it clicked. there was the index.php file! and there was another folder from which i'd seen nothing in the network tab. i finally got a lead!! i returned to the network tab, did a request to see the spgm folder and boooom a site appeared and i saw some files and folders from 2016. there was a spgm.js file and a spgm.php file. there were contrib, flavors, gal and lang folders. then it once again clicked! the lang folder was last updated this year in february. i checked the folder and there were some files named lang with the extension named after their language and these files were last updated in 2016 so i left them alone. but there was this little snitch, this little 650K file named after the name of the school's site with the extension '.php' aaaaand it was last modified this year!!!! i was so excited! i thought i found a secret and different design of the site or something completely else! i clicked it and at first i was scared there was this black/red theme going on my screen and something was a little odd. there were no school announcements or events, nononoooo. this was still a tree structured view. at the top of the site it's written '!c99Shell v. 1.0...'
this was a big nono. i saw i could access all kinds of folders. then i switched to the normal school website and tried to access a folder i had seen named userfiles and got a 403 forbidden error. wopsie. i then switched to the c99 shell website and tried to access the userfiles folder and my boy showed all of its contents. it was nakeeed naked. like very naked. and in the userfiles folder there were all, but i mean ALL files and folders they have on the server. there was a file with the salary of each job available in the school. some announcements. there was a list with all the students who failed classes. there were folders for contests they held. it was an absolute mess and i couldn't believe it.
i stopped and looked at the monitor. what have i done? just to learn some web programming i had leaked the server of the 3rd most famous high school in my country. imagine a black hat who would have seriously caused more damage. currently i am writing an email to the school to upgrade their security because it is reaaaaly bad.
and the journey didn't end here. i 'hacked' the site 2 days ago and just now i thought about writing an email to the school. after i found i could access the WHOLE server i searched for the real attacker, so if you want to know how this one went let me know in the comments.
sorry for the long post, but i couldn't hold it in anymore13 -
Client be like:
Pls, could you give the new Postgres user the same perms as this one other user?
Me:
Uh... Sure.
Then I find out that, for whatever reason, all of their user accounts have disabled inheritance... So, wtf.
Postgres doesn't really allow you to *copy* perms of a role A to role B. You can only grant role A to role B, but for the perms of A to carry over, B has to have inheritance allowed... Which... It doesn't.
So... After a bit of manual GRANT bla ON DATABASE foo TO user, I ping back that it is done and breathe a sigh of relief.
Oooooonly... They ping back like -- Could you also copy the perms of A on all the existing objects in the schema to B???
Ugh. More work. Let's see... List all permissions in a schema and... Holy shit! That's thousands of tables and sequences, how tf am I ever gonna copy over all that???
Maybe I could... Disable the pager of psql, and pipe the list into a file, parse it by the magic of regex... And somehow generate a fuckload of GRANT statements? Uuuugh, but that'd kill so much time. Not to mention I'd need to find out what the individual permission letters in the output mean... And... Ugh, ye, no, too much work. Let's see if SO knows a solution!
And, surprise surprise, it did! The easiest, simplest-to-understand way was to make a schema-only dump of the database, grep it for user A, substitute their name with B, and then feed it back in.
What I didn't expect is for the resulting filtered and altered grant list to be over 6800 LINES LONG. WHAT THE FUCK.
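(For reference, that dump-and-substitute trick boils down to roughly the sketch below. Role and database names are placeholders, and you would want to eyeball the generated file before feeding it to psql.)

```python
# Sketch of the dump-and-substitute approach. Role and database names are
# placeholders; pg_dump/psql are assumed to be on PATH with credentials set up.
import subprocess

SOURCE_ROLE = "role_a"   # the role whose grants we want to mirror
TARGET_ROLE = "role_b"   # the new role
DATABASE = "foo"

# A schema-only dump contains every GRANT statement but none of the data.
dump = subprocess.run(
    ["pg_dump", "--schema-only", "--dbname", DATABASE],
    capture_output=True, text=True, check=True,
).stdout

# Keep only the GRANT lines that mention the source role and rewrite them.
grants = [
    line.replace(SOURCE_ROLE, TARGET_ROLE)
    for line in dump.splitlines()
    if line.startswith("GRANT") and SOURCE_ROLE in line
]

with open("grants_for_b.sql", "w") as out:
    out.write("\n".join(grants) + "\n")

# Review the file, then apply it with: psql -d foo -f grants_for_b.sql
```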
...And, shortly after I apply the insane number of grants... I get another ping. Turns out the customer's already figured out a way to grant all the necessary perms themselves, and I... No longer have to do anything :|
Joy. Utter, indescribable joy.
Is there any actual security reason for disabling inheritance in Postgres? (14.x) I'd think that if an account got compromised, it doesn't matter if it has the perms inherited or not, cuz you can just SET ROLE yourself to the granted role with the actual perms and go ham...3 -
2005 called. It wants its numbered file names back.
While I am mostly satisfied with "celluloid" as a worthy successor to xplayer, the first major disappointment I stumbled upon is `celluloid-shot0001.jpg`. Are we in 2005?
Just like xplayer, Celluloid, the new default media player of Linux Mint, should use proper, i.e. time-stamped, names such as `celluloid-2023-04-10T00-47-42.jpg` or `celluloid-video_file_name-2023-04-10T00-47-42.jpg` for screenshots taken from videos. This would eliminate the possibility of file name conflicts if files are moved into other directories, make screenshots searchable by video file name, retain the date and time information if the files are moved to a device that does not support date and time stamp retention, such as MTP (Media Transfer Protocol), and allow for date range selection using wildcards in the terminal (e.g. `celluloid-2023-04*` for all screenshots from April 2023). Besides, PNG screenshots should be supported too, but that's out of scope here.
As a reference, the gnome and mate screenshot tools also pre-fill time stamps into the file name field.
Numbered file names were useful in an era when there was no VFAT and file names had to fit the 8.3 format, which could not possibly hold a date and a time, and compact cameras used such names, but those times are long over. Just like the useless and annoying pull-to-refresh gesture on mobile apps and the Media Transfer Protocol, numbered file names belong in the technological graveyard.
If numbers are really desirable, at least `celluloid-shot0001.2023-04-10T00-47-42.jpg` should be used, to include both a number and a date. The command to get this date format is `date +"%Y-%m-%dT%H-%M-%S"`. For compatibility across operating systems, dashes instead of colons have to be used to separate hours and minutes and seconds.
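Generating such a name is a one-liner in most languages; here is a quick Python sketch of the proposed format (the prefix and extension are just examples):

```python
# Sketch: build a screenshot name in the proposed format, e.g.
# celluloid-video_file_name-2023-04-10T00-47-42.jpg
from datetime import datetime

def screenshot_name(video_file_name: str) -> str:
    stamp = datetime.now().strftime("%Y-%m-%dT%H-%M-%S")  # dashes, not colons
    return f"celluloid-{video_file_name}-{stamp}.jpg"

print(screenshot_name("video_file_name"))
```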
Numbered file names are a thing of the past. Use time stamps.2 -
There were many issues that came about during my entire employment, but I woke up today with some, honestly, quite bizarre questions from my manager that made me open an account here. This is just the latest in many frustrations I have had.
For context, my manager is more of a "tech lead" who maintains a few projects; the number can probably be counted on one hand. So he does have the knowledge to make changes when needed.
A few weeks ago, I was asked to develop a utility tool to retrieve users from Active Directory and insert them into a MSSQL Database, pretty straight forward and there were no other requirements.
I developed it, tested it, pushed it to our repository, then deployed the latest build to the server that had Active Directory, told my manager that I had done so and left it at that.
A few weeks later,
Manager: "Can you update the tool to now support inserting to both MSSQL and MySQL?"
Me: "Sure." (Would've been nice to know that beforehand since I'm already working on something else but I understand that maybe it wasn't in the original scope)
I did that and redeployed it, and even wrote documentation explaining what it did and how it worked. As per his request, I wrote technical documentation as well, explaining in more depth how it works. The documents were uploaded too.
A few days after I have done so,
Manager: "Can you send me the built program with the documentation directly?"
I said nothing and just did as he asked even though I know he could've just retrieved it himself considering I've uploaded and deployed them all.
This morning,
Manager: "When I click on this thing, I receive this error."
Me: "Where are you running the tool?"
Manager: "My own laptop."
Me: "Does your laptop have Active Directory?"
Manager: "Nope, but I am connected to the server with Active Directory."
Me: "Well the tool can only retrieve Active Directory information on a PC with it."
Manager: "Oh you mean it has to run on the PC with Active Directory?"
Me: "Yeah?"
Manager: "Alright. Also, what is the valid value for this configuration? You mentioned it is the Database connection string."
After that I just gave up and stopped responding. Not long after, he sent me a screenshot of the configuration file where he finally figured out what to put in.
A few minutes later,
Manager: "Got this error." And sends a screenshot that tells you what the error is.
Me: "The connection string you set is pointing to the wrong database schema."
Manager: "Oh whoops. Now it works. Anyway, what are these attribute values you retrieve from Active Directory? Also, what is the method you used to connect/query/retrieve the users? I need to document it down for the higher ups."
Me: "The values are the username, name and email? And as mentioned in the technical documentation, it's retrieving using this method."
The 2+ years I have been working with this company have been some of the most frustrating of my entire life. But thankfully, this is the final month I will be working with them.21 -
When you debug nasty code and the first committer has put his/her name on top of the file. Searching for the author's home address after a long day.
-
Data wrangling is messy
I'm doing the vegetation maps for the game today, maybe rivers if it all goes smoothly.
I could probably do it by hand, but there's something like 60-70 ecoregions to chart, each with their own species, both fauna and flora. And each has an elevation range it's found at in real life, so I want to use the heightmap to dictate that. Who has time for that? It's a lot of manual work.
And the night prior I'm thinking "oh this will be easy."
yeah, no.
(Also why does Devrant have to mangle my line breaks? -_-)
Laid out the requirements, how I could go about it, and the more I look the more involved it gets.
So what I think I'll do is automate it. I already automated some of the map extraction, so I don't see why I shouldn't just go the distance.
Also it means, later on, when I have access to better, higher resolution geographic data, updating it will be a smoother process. And even though I'm only interested in flora at the moment, there's no reason I can't reuse the same system to extract fauna information.
Of course, with in-game design there are some things you'll want to fudge. When the players are exploring outside the Rockies in a mountainous area, maybe I still want to spawn the occasional mountain lion as a mid-tier enemy, even though our survivor might be outside the cat's natural habitat. This could even be the prelude to a task you have to do: go take care of a dangerous creature outside its normal hunting range. And who knows why it is there? Wildfire? Hunted by something *more* dangerous? Poaching? Maybe a nuke plant exploded and drove all the wildlife from an adjoining region?
who knows.
Having the extraction mostly automated goes a long way to updating those lists down the road.
But for now, flora.
For deciding plants and other features of the terrain what I can do is:
* rewrite pixeltile to take file names as input,
* along with a series of colors as a key (which are put into a SET to check each pixel against)
* input each region, one at a time, as the key, and the heightmap as the source image
* output only the region in the heightmap that corresponds to the ecoregion in the key (there's a rough sketch of this step right after this list).
* write a function to extract the palette from the outputted heightmap. (is this really needed?)
* arrange colors on the bottom or side of the image by hand, along with (in text) the elevation in feet for reference.
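A rough sketch of the masking step mentioned in the list above, using Pillow; the file names and the key color are placeholders:

```python
# Rough sketch of the per-region masking step, using Pillow. Assumes the biome
# map and the heightmap have the same dimensions; names and colors are made up.
from PIL import Image

def extract_region(heightmap_path, biomemap_path, key_colors, out_path):
    """Keep heightmap pixels whose biome-map color is in key_colors; blank the rest."""
    height = Image.open(heightmap_path).convert("RGB")
    biomes = Image.open(biomemap_path).convert("RGB")
    key = set(key_colors)                     # the SET the list above mentions
    out = Image.new("RGB", height.size, (0, 0, 0))
    for x in range(height.width):
        for y in range(height.height):
            if biomes.getpixel((x, y)) in key:
                out.putpixel((x, y), height.getpixel((x, y)))
    out.save(out_path)

# e.g. one ecoregion keyed by a single color
extract_region("heightmap.png", "biomes.png", [(86, 124, 27)], "region_042.png")
```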
For automating this entire process I can go one step further:
* Do this entire process with the key colors I already snagged by hand, outputting region IDs as the file names.
* set up selenium
* selenium opens a link related to each elevation map of a specific biome, and saves the text links (so I don't have to hand-open them)
* I'll save the species and text by hand (assuming elevation data isn't listed)
* once I have a list of species and other details, I save them to csv, json, or another format
* then selenium opens this list, opens Wikipedia for each species, one at a time, and searches the text for elevation
* selenium saves out the species name (or an "unknown"), the elevation, and the biome ID to a text file, plus maybe the elevation code (from the heightmap) as a number or a color (probably a number, since that simplifies changing the heightmap later on); see the sketch after this list
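And a rough sketch of the Wikipedia-elevation step from the list above; the species list, the regex, and the output columns are placeholders I'd refine later:

```python
# Rough sketch of the elevation scrape with Selenium. The species list, the
# regex, and the output columns are placeholders; the real run would also
# carry the biome ID along with each species.
import csv
import re

from selenium import webdriver

species = ["Pinus ponderosa", "Artemisia tridentata"]   # placeholder list
elevation_re = re.compile(r"([\d,]+)\s*(?:m|metres|meters|ft|feet)\b", re.IGNORECASE)

driver = webdriver.Firefox()
rows = []
for name in species:
    driver.get("https://en.wikipedia.org/wiki/" + name.replace(" ", "_"))
    match = elevation_re.search(driver.page_source)
    rows.append([name, match.group(1) if match else "unknown"])
driver.quit()

with open("species_elevation.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```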
Having done all this, I can start to assign species types to specific world tiles. The outputs for each region act as a reference.
The only problem with the existing biome map (you can see it below, it's ugly) is that it has a lot of "in-between" colors. There are a few things I can do here. I can treat those as a "mixing" between regions, dictating the chance of one biome's plants or the other's spawning. This seems a little complicated and dependent on a scraped-together standard rather than actual data. So I'm thinking instead what I'll do is implement biome transitions in code, which makes more sense and decouples it from relying on the underlying data. It also prevents species and terrain from generating in, say, towns on the borders of a region, where certain plants or terrain features would be unnatural. Part of what makes an ecoregion unique is that geography has led to relative isolation and evolutionary development of each region (usually thanks to mountains, rivers, and large impassable expanses like deserts).
Maybe I'll stuff it all into a giant bson file or maybe sqlite. Don't know yet.
As an entry level programmer I may not know what I'm doing, and I may be supposed to be looking for a job, but that won't stop me from procrastinating.
Data wrangling is fun.1 -
Revision to the Peter/Dilbert Principle, the ProjektAquarius principle: a company will systematically shift the least competent employee on to the assignments the competent employees can't be bothered to do until they become an integral part of the team and drag you down with them. (E.g. eventually they completely fuck up your delivery process, although it's probably still cheaper and quicker than having them do anything else.)
ProjektAquarius principle: A Case Study
We have an engineer who is getting paid quite a bit more than me. Over time his responsibilities have gradually been reduced to documentation and running our almost entirely automated build. Well, today the build failed. He pulls me over to tell me, and says he's confused because there is a file there he has never seen before and a file he has always seen that isn't there (basically, a file got renamed; it was not non-obvious). Answer: change the file name.
Then he comes over and tells us that it's failing again because the script is not finding a file. So a coworker of mine and I go over. He explains the whole build process to us when we ask if there is any point in the script that would help us identify where the script is looking for the file and failing (there wasn't but that's besides the point).
Turns out, he had decided to put the assembly list in order. Normally no problem, but the list is in source destination pairs. So the fucking file was being put in a different directory than the one the script was looking for it in and failing. And that's the story about how my company just paid 3 engineers a quarter of a man hour each for something that would have been resolved in 30 seconds via file search/copying and pasting a file path. Related note: our process for building an install is now about 4 hours long with no change on process besides the BCAK. -
Colleague trying to create a Visual Studio project and getting the error message that the file path name is too long. (Raging noises of agony)
Me laughing inside because I was facing the same issue a few days ago.
Now I am using VS on mac. Still a pile of crap but at least no issues with file paths anymore 😔5 -
This was a project for school; we had to simulate an app that traced bus routes over a map.
All the teams but mine did it in Java (as a desktop app); we took another approach and did it on Android with the Maps API.
I had fun coding a parser. The parser's job was to read a file, load the bus routes, and draw them on the map.
It was structured like:
NAME
COLOR
<lat, long>
<lat, long>
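A parser for that format only takes a handful of lines. The real one was Java on Android; this is roughly the same idea sketched in Python, with the file name made up:

```python
# Quick sketch of the route-file parser (the real one was Java on Android).
# Format per the description above: NAME, then COLOR, then "<lat, long>" lines.
def parse_routes(path):
    routes, pending_name = [], None
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if not line:
                continue
            if line.startswith("<"):                 # a "<lat, long>" point
                lat, lon = line.strip("<>").split(",")
                routes[-1]["points"].append((float(lat), float(lon)))
            elif pending_name is None:               # first line of a route: NAME
                pending_name = line
            else:                                    # second line: COLOR
                routes.append({"name": pending_name, "color": line, "points": []})
                pending_name = None
    return routes

# e.g. print a summary instead of drawing polylines on the map
for route in parse_routes("routes.txt"):
    print(route["name"], route["color"], len(route["points"]), "points")
```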
The fun part was coding it and telling my teammates "chill out, it will work", so we finished, built and ran it and... done! First code working smooth AF.
I know it's a simple parser and a simple app, but it was a nice feeling not having to debug the app.1 -
I’m working on a new app I’m pretty excited about.
I’m taking a slightly novel (maybe 🥲) approach to an offline password manager. I’m not saying that online password managers are unreliable, I’m just saying the idea of giving a corporation all of my passwords gives me goosebumps.
Originally, I was going to make a simple “file encrypted via password” sort of thing just to get the job done. But I’ve decided to put some elbow grease into it, actually.
The elephant in the room is what happens if you forget your password? If you use the password as the encryption key, you’re boned. Nothing you can do except set up a brute-forcer and hope your CPU is stronger than your password was.
Not to mention, if you want to change your password, the entire data file will need to be re-encrypted. Not a bad thing in reality, but definitely kinda annoying.
So actually, I came up with a design that allows you to use security questions in addition to a password.
But as I was trying to come up with “good” security questions, I realized there is virtually no such thing. 99% of security question answers are one or two words long and come from data sets that have relatively small pools of answers. The name of your first crush? That’s easy, just try every common name in your country. Same thing with pet names. Ice cream flavors. Favorite fruits. Childhood cartoons. These all have data sets in the thousands at most. An old XP machine could run through all the permutations over lunch.
So instead I’ve come up with these ideas. In order from least good to most good:
1) [thinking to remove this] You can remove the question from the security question. It’s your responsibility to remember it and it displays only as “Question #1”. Maybe you can write it down or something.
2) there are 5 questions and you need to get 4 of them right. This does increase the possible permutations, but still does little against questions with simple answers. Plus, it could almost be easier to remember your password at this point.
All this made me think “why try to fix a broken system when you can improve a working system”
So instead,
3) I’ve branded my passwords as “passphrases” instead. This is because instead of a single, short, complex word, my program encourages entire sentences. Since the difficulty of brute-forcing a password grows exponentially as its length increases, and it is easier to remember a phrase than a complicated amalgamation of letters, numbers, and symbols, a passphrase should be preferred. Sprinkling in the occasional symbol to prevent dictionary attacks will make them practically uncrackable.
In addition? You can have an unlimited number of passphrases. Forgot one? No biggie. Use your backup passphrases, then remind yourself what your original passphrase was after you log in.
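For what it's worth, the passphrase-to-key step I have in mind looks roughly like the sketch below. It assumes PBKDF2-HMAC-SHA256 plus Fernet from the cryptography package; the iteration count and where the salt lives are still open design choices, not a final spec.

```python
# Minimal sketch of deriving a Fernet key from a passphrase. Assumes the
# `cryptography` package; iteration count and storing the salt next to the
# encrypted file are design choices, not a final spec.
import base64
import hashlib
import os

from cryptography.fernet import Fernet

def key_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    raw = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)
    return base64.urlsafe_b64encode(raw)   # Fernet expects a urlsafe-base64 32-byte key

salt = os.urandom(16)                       # would be stored alongside the vault file
vault = Fernet(key_from_passphrase("correct horse battery staple, with a twist!", salt))

token = vault.encrypt(b'{"example.com": "hunter2"}')
print(vault.decrypt(token))
```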
All this accomplished on a system that runs entirely locally is, in my opinion, interesting. Probably it has been done before, and almost certainly it has been done better than what I will be able to make, but I’m happy I was able to think up a design I am proud of.8 -
In the war on bandwidth consumption, work has cut off torrent access. So I, like a child looking for porn (actually I was doing that too), found a way around it. I use http://filestream.me to cache my torrents. Then I go to the http://Uptobox.com file host and log in to my account, which I created with my fake mailinator.com email address, where I use the remote URL upload feature to pull the files in from filestream. I change the file name to VM-update.dll (I don't know why I chose a DLL originally, but I realise no one asks why you were downloading a DLL). All of this, except the downloading, is done in Opera Web Browser with VPN on (a little extra paranoia goes a long way).2