Search - "fetch"
-
1: Get a dog
2: Name dog Sudo
3: Teach Sudo to fetch my mail
4: Invite Linux-friend over
5: Yell "sudo fetchmail"
6: ?
7: Profit
-
When you stare into git, git stares back.
It's fucking infinite.
Me 2 years ago:
"uh was it git fetch or git pull?"
Me 1 year ago:
"Look, I printed these 5 git commands on a laptop sticker, this is all I need for my workflow! branch, pull, commit, merge, push! Git is easy!"
Me now:
"Hold my beer, I'll just do git format-patch -k --stdout HEAD..feature -- script.js | git am -3 -k to steal that file from your branch, then git rebase master && git rebase -i HEAD~$(git rev-list --count master..HEAD) to clean up the commit messages, and a git branch --merged | grep -v "\*" | xargs -n 1 git branch -d to clean up the branches, oh lets see how many words you've added with git diff --word-diff=porcelain | grep -e '^+[^+]' | wc -w, hmm maybe I should alias some of this stuff..."
Do you have any git tricks/favorites which you use so often that you've aliased them?
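For what it's worth, a hedged example of how a couple of the one-liners above could be turned into aliases (the alias names here are made up):
git config --global alias.cleanup '!git branch --merged | grep -v "\*" | xargs -n 1 git branch -d'
git config --global alias.wordcount '!git diff --word-diff=porcelain | grep -e "^+[^+]" | wc -w'
# then it's just:
git cleanup
git wordcount
-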
An area sales rep once rang me to tell me his iPhone screen was cracked and was going blurry around some sections.
I told him to fetch it to me to look at, and I will see what I can do.
5 minutes later I get a SCREENSHOT from his phone asking if I can see the crack and blurry edges.
HOW FUCKING DUMB ARE THESE PEOPLE!!!
I mean, come on. He seriously said when I called him: "But I can see the crack and blurry bits on the screenshot on my phone"
-
Me : Let's just use CDN.
Former boss (fb) : What's that?
Me : You fetch JS and CSS files online. Faster.
Fb: No! Download it!
Me: Why?
Fb: What if there's no internet?!
Me: ... it's a website...
-
I am currently working for a client who have all their data in Google Sheets and Drive. I had to write code to fetch that data and it's painful to query that data.
I can definitely relate with this.
PS: Their revenue last year was over US$2 Bn and one of their sibling companies is among the top IT companies in the country.
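A rough sketch of what that fetching tends to look like with the gspread library (the sheet name, credentials file and column names are assumptions): every "query" is really a full download followed by client-side filtering, which is exactly why it's painful.
import gspread

gc = gspread.service_account(filename="service-account.json")  # assumed credentials file
sheet = gc.open("Some client spreadsheet").sheet1              # assumed sheet name
rows = sheet.get_all_records()                                 # every row comes down as a dict

# there is no real query language, so "queries" end up as plain Python filters
big_orders = [r for r in rows if r.get("amount", 0) > 1000]
-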
I sometimes write code by first putting comments and then writing the code.
Example
#fetch data
#apply optimization
#send data back to server
Then I put the code in between the comments so that I can understand the flow.
Anyone else have this habit?
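As a small illustration of the habit, the skeleton above could end up like this (the URL and the optimize() stand-in are placeholders, not from the rant):
import requests

API_URL = "https://example.com/api/data"  # placeholder endpoint

def optimize(payload):
    # stand-in for whatever the real optimization step is
    return payload

# fetch data
data = requests.get(API_URL, timeout=10).json()
# apply optimization
optimized = optimize(data)
# send data back to server
requests.post(API_URL, json=optimized, timeout=10)
-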
*starts coding in c#*
Me: hmm this bit of functionality requires some good ol' inheritance
*has flashback to uni lecture on c++ *
Lecturer: And so you can use inheritance with friends for xyz; you know what they say, friends can touch each other's privates
*end of flashback*
Me: Guh! No, not the puns! Guh!
-
Client: I have lost everything on the cloud holy crap!
Me: Are you signed into your google drive, and within the folder?
Client: No, how do I do that again? I obviously can't be bothered reading your well-formatted and instructional guide and would rather contact you at 6pm on a Saturday night
-
Jack and Jill
Pulled down from git
To fetch aPaleOfWater.c
Jack made some changes
And then pushed them all up
A merge conflict occurred
Jill decided his changes sucked
And push --force over his
Jack was enraged
For history was changed
And force pushed Jill
down a hill
-
I fucking LOVEEEEE that there's at least 50 different ways to set the MySQL root password, depending on the weather, the time of day and what version you're lucky enough to fetch from the repos.
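For reference, and version-dependent exactly as the rant says: on MySQL 5.7.6+ / 8.0 the documented way is roughly the statement below; older versions want SET PASSWORD or mysqladmin instead, and some distro packages bind root to the auth_socket plugin, which ignores the password entirely.
-- MySQL 5.7.6+ / 8.0; replace the password placeholder, run as a privileged user
ALTER USER 'root'@'localhost' IDENTIFIED BY 'new_password';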
-
*starts coding by 7pm in the evening.
*remembers that he would soon have to go fetch something to eat but keeps on coding.
*tells himself he would get food by 10pm.
*checks the time - it's past 12am.
*codes all night long on an empty belly but doesn't care.
-
Why do some non-devs treat professional app development like some kids craft-making hobby that requires zero skill and knowledge or brain?
A friend (with ZERO knowledge about coding) said to me today, teach me, or tell me how to learn this app development, I'll learn it within a month and make my own apps plus do freelance app work in free time, apps fetch plenty of money easily. Blah blah.
Not the first time, other non dev friends have talked in the same way on other instances.
It's insulting and infuriating. I don't even know what to reply.
-
Brother of my friend came to me and asked me to teach him C, as it was the most important lesson in his high school CS class. I agreed and started with data types, conditional statements, loops and the other mostly exam-oriented topics. He was doing good. Then I thought of teaching him a life lesson and introduced him to pointers (questions about pointers are very rare in exams). As soon as I started on pointers, things got pretty boring and he went off topic and started talking about a girl he has a crush on; he wanted to know when her birthday was so that he could gift her something and be ahead of the crowd trying to impress her. I thought I'd help him out - after all, he's like my younger brother - and told him I could. Results of the previous exam were out then, and providing a symbol number on the Examination Board's website would do the trick, because it returns the full record of a student's result, which has the birthday in it. I modified my previous script to fetch the data of his school's results and pass the data to a file. They've been together for the last few months. He reminds me from time to time that my code is what got them together.
-
Wrote a python script to fetch details of amazon products to monitor price differences.
The script is only 50 lines, which is why I love Python!
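A minimal sketch of that kind of script (the CSS selector is an assumption; Amazon changes its markup constantly and throttles scrapers, so a real script needs headers, retries and care):
import requests
from bs4 import BeautifulSoup

URL = "https://www.amazon.com/dp/EXAMPLE"      # placeholder product URL
HEADERS = {"User-Agent": "Mozilla/5.0"}        # the default UA gets blocked quickly

html = requests.get(URL, headers=HEADERS, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
price_tag = soup.select_one(".a-price .a-offscreen")  # selector is a guess, verify against the page
if price_tag:
    print(price_tag.get_text(strip=True))
-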
⚡️ devRantron v1.4.1 ⚡️
I strongly urge all the users of the devRantron to upgrade their app. We have added some major features and made a lot of bugfixes. For example:
1. Edit Rants and Comments
2. Browse Weekly
3. Save drafts of rants so that you can edit and post them later. Also, the app now autosaves when you are typing a new rant and will keep it until you post it.
4. Fixed macOS startup. Previously the app used to open a terminal in the background to launch the app. That has been removed.
5. Confirmation before deleting a rant or comment
6. Huge performance optimization. We have upgraded to React 16 and also changed the way our compiler compiles the application. The way we fetch the notifications has also been changed and it uses less bandwidth.
7. The app will only have single instance now. If you accidentally open the app again, it will just switch to the currently running instance.
8. We now show a release info dialogue before updating. Linux and macOS users will now receive an update notification for new updates.
9. Added the ability to select rant types.
You can get it from here: https://devrantron.firebaseapp.com/
macOS users, please remove the devRantron from "Login Items" in Settings > Users and Groups.
We would like to thank all our users for giving us feedback. If you like the app, you can show your appreciation by giving a star to the repo.
Thank you!
-
#2 Worst thing I've seen a co-worker do?
Back before we utilized stored procedures (and had an official/credentialed DBA), we used embedded/in-line SQL to fetch data from the database.
var sql = @"Select
FieldsToSelect
From
dbo.Whatever
Where
Id = @ID"
In attempts to fix database performance issues, a developer, T, started putting all the SQL on one line of code (some sql was formatted on 10+ lines to make it readable and easily copy+paste-able with SSMS)
var sql = "Select ... From...Where...etc";
His justification was putting all the SQL on one line make the code run faster.
T: "Fewer lines of code runs faster, everyone knows that."
Mgmt bought it.
This process took him a few months to complete.
When none of the effort proved to increase performance, T blamed the in-house developed ORM we were using (I wrote it, it was a simple wrapper around ADO.Net with extension methods for creating/setting parameters)
T: "Adding extra layers causes performance problems, everyone knows that."
Mgmt bought it again.
Removing the ORM, again took several months to complete.
By this time, we hired a real DBA and his focus was removing all the in-line SQL to use stored procedures, creating optimization plans, etc (stuff a real DBA does).
In the planning meetings (which I was not a part of), T was selected to lead because of his coding optimization skills.
DBA: "I've been reviewing the execution plans, are all the SQL code on one line? What a mess. That has to be worst thing I ever saw."
T: "Yes, the previous developer, PaperTrail, is incompetent. If the code was written correctly the first time using stored procedures, or even formatted so people could read it, we wouldn't have all these performance problems."
DBA didn't know me (yet) and I didn't know about T's shenanigans (aka = lies) until nearly all the database perf issues were resolved and T received a recognition award for all his hard work (which also equaled a nice raise).
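For context, the "simple wrapper around ADO.Net with extension methods" mentioned above is the kind of thing a few lines give you; a hedged sketch (the names and null handling here are mine, not the original code):
using System.Data.SqlClient;

public static class SqlCommandExtensions
{
    // chainable parameter setup, the sort of helper the rant refers to
    public static SqlCommand WithParam(this SqlCommand cmd, string name, object value)
    {
        cmd.Parameters.AddWithValue(name, value ?? System.DBNull.Value);
        return cmd;
    }
}

// usage, with the SQL kept readable and parameterized:
// using var cmd = new SqlCommand(sql, connection);
// var reader = cmd.WithParam("@ID", id).ExecuteReader();
-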
Long story short, I'm unofficially the hacker at our office... Story time!
So I was hired three months ago to work for my current company, and after the three weeks of training I got assigned a project with an architect (who only works on the project very occasionally). I was tasked with revamping and implementing new features for an existing API, some of the code dated back to 2013. (important, keep this in mind)
So at one point I was testing the existing endpoints, because part of the project was automating tests using postman, and I saw something sketchy. So very sketchy. The method I was looking at took a POJO as an argument, extracted the ID of the user from it, looked the user up, and then updated the info of the looked up user with the POJO. So I tried sending a JSON with the info of my user, but the ID of another user. And voila, I overwrote his data.
Once I reported this (which took a while to be taken seriously because I was so new) I found out that this might be useful for sysadmins to have, so it wasn't completely horrible. However, the endpoint required no Auth to use. An anonymous curl request could overwrite any users data.
As this mess unfolded and we notified the higher ups, another architect jumped in to fix it, and we found that you could also fetch the data of any user by knowing their ID, and overwrite their credit/debit cards. And well, the IDs of the users were alphanumerical strings, which I thought would make it harder to abuse, but then I realized all the IDs were sequentially generated... Again, these endpoints required no authentication.
So anyways. Panic ensued, systems people at HQ had to work that weekend, two hot fixes had to be delivered, and now they think I'm a hacker... I did go on to discover some other vulnerabilities, but nothing major.
It still amuses me that they think I'm a hacker 😂😂 when I know about as much about hacking as the next guy at the office, but anyways, it makes for a good story and I laugh every time I hear them call me a hacker. The whole thing was pretty amusing; they supposedly have security audits and QA, but for five years, these massive security holes went undetected... And our client is a massive company in my country... So, let's hope no one found it before I did.
-
Not having finished any education, and writing code during interviews.
I have a pretty nice resume with good references, and I think I'm a reasonably good & experienced dev.
But I'm absolutely unable to write code on paper, and really wonder how some devs can just write out algorithms using a pen and reason about it, without trying/failing/playing/fixing in an IDE.
Education I think.
I can transform the theory on a complex Wikipedia page about math/algorithm into code, I can translate a Haskell library into idiomatic python... but what I haven't done is write out sorting functions or fibonacci generators a million times during Java class.
I don't see the point either... but I still feel utterly worthless during an interview if they ask "So you haven't even finished highschool? Can you at least solve this prime number problem using a marker on this whiteboard? Could you explain in words which sorting algorithm is faster and why?"
"Uh... let me fetch a laptop with an IDE, stackoverflow and Wikipedia?"22 -
Don't do "git pull" quickly. Always do a "git fetch" THEN "git log HEAD..origin" OR "git log -p HEAD..origin". It is like previewing first what you will "git pull".
OR something like (example):
- git fetch
- git diff origin/master
- git pull --rebase origin master
Sometimes it is a trap: you will pull in unknown or unwanted changes that cause errors after quickly doing a git pull when working in a team. Better safe than sorry.
Other tips and tricks related are welcome 😀
Credits: https://stackoverflow.com/questions...
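One more in the same spirit, since tips are welcome (the alias name is made up): wrap the preview into an alias against the upstream branch, and tell pull to refuse anything that isn't a fast-forward.
git config --global alias.preview '!git fetch && git log --oneline HEAD..@{u}'
git config --global pull.ff only
-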
Inexperienced digital immigrant: "Let's make a website, can you do this?"
Me: "Yeah."
"But you gotta use wordpress. We want to be able to change the content easily. Also we want to have that website done fast."
"Meh, ok."
one day later
"Are you done yet?"
Me: "No, I got to find a good way for the lightbox I'm still implementing to fetch data from posts blabla"
"Why are you not done yet? Just take a plugin, even I can do that. Wordpress is so easy, it's just 1, 2, 3 and done."
Yeah, I have an idea. Why don't you just make the website yourself?
-
Me to my family :
Family: so this printer not working
Me: have you installed its software
Family: no, can you do it?
Me: I could travel 1 hour, or you can just google and download it, it's really quite simp--
Family: yeah this is too complicated for me, I'll need you to come over
-
Not really a bug, but once I tried to learn to build one ajax function per table, asynchronously, instead of calling all of them at once. Spent like a couple of hours of trial and error. It wasn't needed at the time, but suddenly I needed to fetch something separately because of a new feature. Just wrote a couple of lines and it was done.
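A rough sketch of the idea, with made-up endpoints and table names: one small async fetch per table instead of one monolithic call, so a single table can be reloaded on its own when a new feature needs it (render() and renderOrders() stand in for whatever updates the UI).
async function loadTable(name) {
  const res = await fetch(`/api/tables/${name}`);      // made-up endpoint
  if (!res.ok) throw new Error(`Failed to load ${name}`);
  return res.json();
}

// load everything on page load...
Promise.all(["orders", "customers"].map(loadTable)).then(render);
// ...or just the one table the new feature cares about
loadTable("orders").then(renderOrders);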
-
So the water dispenser in the kitchen does not have sparkling water, which I love. But there's one in the meeting room down the corridor that has sparkling water!
Like any regular employee of course I filed a request with the site manager to upgrade the kitchen dispenser... NOT!
I wrote an app that sits in the taskbar when minimized and shows a traffic light with the status of the meeting room availability so I know when it's clear to go fetch me some of that bubbly goodness!
-
I worked for over 13 hours yesterday on super-urgent projects. I got so much done it's insane.
Projects:
1) the printer auto-configuration script.
2) changing Stripe from test mode to live mode in production
3) website responsiveness
I finished two within five minutes and pushed to both QA and Production. actually urgent, actually necessary. Easy change.
The printer auto-configure script was honestly fun to write, if very involved. However, the APIs I needed to call to fetch data, create a printer client, etc... none of them were tested, and they were _all_ broken in at least two ways. The CTO (api guy in my previous rant) was slow at fixing them, so getting the APIs working took literally four hours. One of them (test print) still doesn't work.
Responsiveness... this was my first time making a website responsive. Ever. Also, one of the pages I needed to style was very complicated (nested fixed-aspect-ratio + flexbox); I ended up duplicating the markup and hacking the styling together just to make it work. The code is horrible. But! "Friday's the day! it's going live and we're pushing traffic to it!" So, I invested a lot of time and energy into making it ready and as pretty as I could, and finally got it working. That page alone took me two hours.
The site and the printer script (and obv the Stripe change as well) absolutely needed to be done by this morning. Super important.
well.
1) Auto-configure script. Ostensibly we would have an intern come in and configure the printers. However, we have no printers that need configuring, so she did marketing instead. :/ Also, the docs Epson sent us only work for the T88V printer (we have exactly one, which we happened to set up and connect to). They do not work for the T88VI printers, which is what we ordered. and all we'll ever be ordering. So. :/ I'll need to rewrite a large chunk of my code to make this work. Joy :/
2) Stripe Live mode. Nobody even seemed to notice that we were collecting info in Test mode, or that I fixed it. so. um. :/
3) Responsiveness.
Well. That deadline is actually next Wednesday. The marketing won't even start until then, and I haven't even been given the final changes yet (like come on). Also! I asked for a QA review last night before I'd push it to production. One person glanced at it. Nobody else cared. Nobody else cared enough to look in the morning, either, so it's still on QA. Super-important deadline indeed. :/
Honestly?
I feel like Alice (from Dilbert) after she worked frantically on urgent projects that ended up just being cancelled. (That one where Wally smells that lovely buttery-popcorn scent of unnecessary work.)
I worked 13 hours yesterday.
for nothing.
fucking. hell.
-
I spent an hour arguing with the CTO, pushing for having all our new products' data in the database (wow) with an API I could hit to fetch said data (wow) prior to displaying it on our order page.
He never actually agreed with me, but he finally acquiesced and wrote the migrations, API, and entered my (rather contrived) placeholder data. (I've been waiting on the boss for details and copy for three days.)
Anyway, it's now live on QA. but. I don't know where QA is for this app, and it's been long enough that i'm kind of afraid to ask.
Does that sound strange?
well.
We have seven (nine?) live applications (three of which share a database), and none of their repos match their URLs, nor even their Heroku app names. (In some of these Heroku names, "db" is short for the app's namesake, while in the rest it's short for "database").
So, I honestly have no idea where "dbappdev" points to, and I don't have access to the DNS records to check. -.-
What's more: I opened "dbappdev" on Heroku and tested out his new API -- lo and behold! it returns nada. Not a single byte. (Given his history I expected a 500, so this is an improvement, I think. Still totally useless, however.)
And furthermore: he didn't push the code to github, so I cannot test (or fix) it locally.
just. UGH.
every day with this guy, I swear.
-
!rant.
Here are some useful git tricks. Use with care, and remember to only rewrite history when no one's looking.
- git rebase: powerful history rewriting. Combine commits, delete commits, reorder commits, etc.
- git reflog: unfuck yourself. Move back to where you were even if where you were was destroyed by rebasing or deleting. Git never deletes commits that you've seen within at least the last 50 HEAD changes, and not at all until a GC happens, so you can save yourself quite often.
- git cherry-pick: steal a commit into another branch. Useful for pulling things out of larger changesets.
- git worktree: checkout a different branch into a different folder using the same git repository.
- git fetch: get latest commits and origin HEADs without impacting local braches.
- git push --force-with-lease: force push without overwriting others' changes (a couple of usage examples below)
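Hedged usage examples for the less common ones (branch and path names are made up):
# check out another branch next to your current working copy
git worktree add ../hotfix-checkout hotfix-branch

# unfuck yourself after a bad rebase: find the old tip, then reset to it
git reflog
git reset --hard HEAD@{2}

# push rewritten history without stomping on teammates' new commits
git push --force-with-lease origin feature
-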
Preface: i'm pretty... definitely wasted. rum is amazing.
anyway, I spent today fighting with ActionCable. but as per usu, here's the rant's backstory:
I spent two or three days fighting with ActionCable a few weeks ago. idr how long because I had a 102*f fever at the time, but I managed to write a chat client frontend in React that hooked up to API Guy's copypasta backend. (He literally just copy/pasted it from a chat app tutorial. gg). My code wasn't great, but it did most of what it needed to do. It set up a websocket, had listeners for the various events, connected to the ActionCable server and channel, and wrote out updates to the DOM as they came in. It worked pretty well.
Back to the present!
I spent today trying to get the rest to work, which basically amounted to just fetching historical messages from the server. Turns out that's actually really hard to do, especially when THE FKING OFFICIAL DOCUMENTATION'S EXAMPLES ARE WRONG! Seriously, that crap has scoping and (coffeescript) syntax errors; it doesn't even run. but I didn't know that until the end, because seriously, who posts broken code on official docs? ugh! I spent five hours torturing my code in an effort to get it to work (plus however many more back when I had a fever), only to discover that the examples themselves are broken. No wonder I never got it working!
So, I rooted around for more tutorials or blogs or anything else with functional sample code. Basically every example out there is the same goddamn chat app tutorial with their own commentary. Remember that copy/paste? yeah, that's the one. Still pissed off about that. Also: that tutorial doesn't fetch history, or do anything other than the most basic functionality that I had already written. Totally useless to me.
After quite a bit of searching, the only semi-decent resource I was able to find was a blog from 2015 that's entirely written in Japanese. No, I can't read more than a handful of words, but I've been using it as a reference because its code is seriously more helpful than what's on official Rails docs. -_-
Still never got it to work, though. but after those five futile hours of fighting with the same crap, I sort of gave up and did something else.
zzz.
Anyway.
The moral of the story is that if you publish broken code examples because you didn't even fking bother to test them first, some extremely pissed off and vindictive and fashionable developer will totally waterboard the hell out of you for the cumulative total of her wasted development time because screw you and your goddamn laziness.
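For the record, and hedged because the official examples apparently can't be trusted: with the current @rails/actioncable npm package a minimal consumer looks roughly like this (the channel name, params and the "fetch_history" action are assumptions about the backend, and the received() handler would hand off to React state instead of logging):
import { createConsumer } from "@rails/actioncable";

const consumer = createConsumer("/cable");
const chat = consumer.subscriptions.create(
  { channel: "ChatChannel", room: "lobby" },
  {
    connected() {
      // old messages are not pushed automatically; ask the channel for them
      this.perform("fetch_history"); // assumes the server channel defines this action
    },
    received(data) {
      console.log(data); // replace with your rendering/state update
    },
  }
);

chat.send({ body: "hello" }); // assumes the channel implements receive()
-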
Github Inc. (Feel good inc. parody)
=========================
Ha-ha-ha-ha-ha-ha-ha-ha-ha-ha-ha-ha-ha.
Github.
Fetch it, fetch it, fetch it, Github.
Fetch it, fetch it, fetch it, Github.
Fetch it, fetch it, fetch it, Github.
Fetch it, fetch it, fetch it, Github.
Fetch it, fetch it, fetch it, Github.
Fetch it, fetch it, fetch it, Github.
(change) Fetch it (change), Fetch it (change), Fetch it (change), Github
(change) fetch it (change), fetch it (change), fetch it (change), Github
Repos breaking down on pull request
Juniors have to go cause they don't know wack
So while you filling the commits and showing branch trees
You won't get paid cause it's all damn free
You set a new linter and a new phenomenal style
Hoping the new code will make you smile
But all you wanna have is a nice long sleep.
But your screams they'll keep you awake cause you don't get no sleep no.
git-blame, git-blame on this line
What the f*ck is wrong with that
Take it all and recompile
It is taking too lonnng
This code is better. This code is free
Let's clone this repo you and me.
git-blame, git-blame on this line
Is everybody in?
Laughing at the class past, fast CRUD
Testing them up for test cracks.
Star the repos at the start
It's my portfolio falling apart.
Shit, I'm forking in the code of this here.
Compile, breaking up this shit this y*er.
Watch me as I navigate.
Ha-ha-ha-ha-ha-ha.
Yo, this repo is Ghost Town
It's pulled down
With no clowns
You're in the sh*t
Gon' bite the dust
Can't nag with us
With no push
You kill the git
So don't stop, git it, git it, git it
Until you're the maintainers
And watch me criticize you now
Ha-ha-ha-ha-ha.
Break it, break it, break it, Github.
Break it, break it, break it, Github.
Break it, break it, break it, Github.
Break it, break it, break it, Github.
git-blame, git-blame on this line
What the f*ck is wrong with that
Take it all and recompile
It is taking too lonnng
This code is better. This code is free
Let's clone this repo you and me.
git-blame, git-blame on this line
Is everybody in?
Don't stop, shit it, git it.
See how your team updates it
Steady, watch me navigate
Aha-ha-ha-ha-ha.
Don't stop, shit it, git it.
Peep at updates and reconvert it
Steady, watch me git reset now
Aha-ha-ha-ha-ha.
Github.
Push it, push it, push it, Github.
Push it, push it, push it, Github.
Push it, push it, push it, Github.
Push it, push it, push it, Github.
-
Long rant ahead.. so feel free to refill your cup of coffee and have a seat 🙂
It's completely useless. At least in the school I went to, the teachers were worse than useless. It's a bit of an old story that I've told quite a few times already, but I had a dispute with said teachers at some point after which I wasn't able nor willing to fully do the classes anymore.
So, just to set the stage.. le me, die-hard Linux user, and reasonably initiated in networking and security already, to the point that I really only needed half an ear to follow along with the classes, while most of the time I was just working on my own servers to pass the time instead. I noticed that the Moodle website that the school was using to do a big chunk of the course material with, wasn't TLS-secured. So whenever the class begins and everyone logs in to the Moodle website..? Yeah.. it wouldn't be hard for anyone in that class to steal everyone else's credentials, including the teacher's (as they were using the same network).
So I brought it up a few times in the first year, teacher was like "yeah yeah we'll do it at some point". Shortly before summer break I took the security teacher aside after class and mentioned it another time - please please take the opportunity to do it during summer break.
Coming back in September.. nothing happened. Maybe I needed to bring in more evidence that this is a serious issue, so I asked the security teacher: can I make a proper PoC using my machines in my home network to steal the credentials of my own Moodle account and mail a screencast to you as a private disclosure? She said "yeah sure, that's fine".
Pro tip: make the people involved sign a written contract for this!!! It'll cover your ass when they decide to be dicks.. which spoiler alert, these teachers decided they wanted to be.
So I made the PoC, mailed it to them, yada yada yada... Soon after, next class, and I noticed that my VPN server was blocked. Now I used my personal VPN server at the time mostly to access a file server at home to securely fetch documents I needed in class, without having to carry an external hard drive with me all the time. However it was also used for gateway redirection (i.e. the main purpose of commercial VPN's, le new IP for "le onenumity"). I mean for example, if some douche in that class would've decided to ARP poison the network and steal credentials, my VPN connection would've prevented that.. it was a decent workaround. But now it's for some reason causing Moodle to throw some type of 403.
Asked the teacher I had for the routers and switches class at the time... why is my VPN server blocked? He replied with the statement that "yeah we blocked it because you can bypass the firewall with that and watch porn in class".
Alright, fair enough. I can indeed bypass the firewall with that. But watch porn.. in class? I mean I'm a bit of an exhibitionist too, but in a fucking class!? And why right after that PoC, while I've been using that VPN connection for over a year?
Not too long after that, I prematurely left that class out of sheer frustration (I remember browsing devRant with the intent to write about it while the teacher was watching 😂), and left while looking that teacher dead in the eyes.. and never have I been that cold to someone while calling them a fucking idiot.
Shortly after I've also received an email from them in which they stated that they wanted compensation for "the disruption of good service". They actually thought that I had hacked into their servers. Security teachers, ostensibly technical people, if I may add. Never seen anyone more incompetent than those 3 motherfuckers that plotted against me to save their own asses for making such a shitty infrastructure. Regarding that mail, I not so friendly replied to them that they could settle it in court if they wanted to.. but that I already knew who would win that case. Haven't heard of them since.
So yeah. That's why I regard those expensive shitty pieces of paper as such. The only thing they prove is that someone somewhere with some unknown degree of competence confirms that you know something. I think there's far too many unknowns in there.
Nowadays I'm putting my bets on a certification from the Linux Professional Institute - a renowned and well-regarded certification body in sysadmin. Last February at FOSDEM I did half of the LPIC-1 certification exam, next year I'll do the other half. With the amount of reputation the LPI has behind it, I believe that's a far better route to go with than some random school somewhere.
The website for our biggest client went down and the server went haywire. Though for this client we don’t provide any infrastructure, so we called their it partner to start figuring this out.
They started blaming us, asking us if we had upgraded the website or changed any PHP settings, all of which got a firm no from us. So they told us they had competent people working on the matter.
TL;DR their people isn’t competent and I ended up fixing the issue.
Hours go by, nothing happens, client calls us and we call the it partner, nothing, they don’t understand anything. Told us they can’t find any logs etc.
So we setup a conference call with our CXO, me, another dev and a few people from the it partner.
At this point I’m just asking them if they’ve looked at this and this, no good answer, I fetch a long ethernet cable from my desk, pull it to the CXO’s office and hook up my laptop to start looking into things myself.
IT partner still can’t find anything wrong. I tail the httpd error log and see thousands upon thousands of warning messages about mysql being loaded twice, but that’s not the issue here.
Check top and see there are 257 instances of httpd, of which 256 were spawned by httpd, mysql is using 600% cpu, and whenever I try to connect to mysql through the cli it throws a too many connections error.
I heard the IT partner talking about a ddos attack, so I asked them to pull it off the public network and only give us access through our vpn. They do that, reboot server, same problems.
Finally we get the it partner to rollback the vm to earlier last night. Everything works great, 30 min later, it crashes again. At this point I’m getting tired and frustrated, this isn’t my job, I thought they had competent people working on this.
I noticed that the db had a few corrupted tables, and ask the it partner to get a dba to look at it. No prevail.
5’o’clock is here, we decide to give the vm rollback another try, but first we go home, get some dinner and resume at 6pm. I had told them I wanted to be in on this call, and said let me try this time.
They spend ages doing the rollback, and then for some reason they have to reconfigure the network and shit. Once it booted, I told their tech to stop mysqld and httpd immediately and prevent it from start at boot.
I can now look at the logs that is leading to this issue. I noticed our debug flag was on and had generated a 30gb log file. Tail it and see it’s what I’d expect, warmings and warnings, And all other logs for mysql and apache is huge, so the drive is full. Just gotta delete it.
I quietly start apache and mysql, see the website is working fine, shut it down and just take a copy of the var/lib/mysql directory and etc directory just to have backups.
Starting to connect a few dots, but I wasn’t exactly sure if it was right. Had the full drive caused mysql to corrupt itself? Only one way to find out. Start apache and mysql back up, and just wait and see. Meanwhile I fixed that mysql being loaded twice. Some genius had put load mysql.so at the top and bottom of php ini.
While waiting on the server to crash again, I’m talking to the it support guy, who told me they haven’t updated anything on the server except security patches now and then, and they didn’t have anyone familiar with this setup. No shit, it’s running php 5.3 -.-
Website up and running 1.5 hours later, mission accomplished.
Just saw a repository with branch name - 👶
bitbucket gives this - git fetch && git checkout 👶 for checking it out,
wondering how anyone would check out this branch without copy-pasting the above line from the web xD
-
Insomnia: yeah, nice cors header
Postman: neat cors header mate
Fetch in browser: where the FUCK is the cors header you retard
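Which is pretty much the defining CORS experience: Insomnia and Postman are not browsers, so they ignore CORS entirely, while the browser's fetch also sends an OPTIONS preflight that many backends never answer. A hedged Express sketch of the usual server-side fix (the framework and the origin are assumptions, not from the rant):
const express = require("express");
const app = express();

app.use((req, res, next) => {
  res.set("Access-Control-Allow-Origin", "https://app.example.com"); // the real frontend origin
  res.set("Access-Control-Allow-Headers", "Content-Type, Authorization");
  res.set("Access-Control-Allow-Methods", "GET, POST, PUT, DELETE");
  if (req.method === "OPTIONS") return res.sendStatus(204);          // answer the preflight
  next();
});

app.get("/api/ping", (req, res) => res.json({ ok: true }));
app.listen(3000);
-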
I forgot to git fetch, so I spent quite a while doing exactly WHAT MY TEAMMATE HAD DONE JUST HOURS AGO
-
Spent an hour figuring out an ERR_NAME_NOT_RESOLVED on a fetch Post call.
Turns out I was calling locahost instead of localhost.........
Only figured it out because I zoomed in on the console output by accident.
fml
-
Fuck you you fucking fuck, why would you change an api without any notification?
Background: built an app for a customer, it needs to fetch data from an external API and save it to a db.
Customer called: it's broken what did you do?!?
Me: I'll look into it.
Turned out the third party just changed their api... Guess I should implement some kind of notification if no messages come in for some time...
-
Okay guys, this is it!
Today was my final day at my current employer. I am on vacation next week, and will return to my previous employer on January the 2nd.
So I am going back to full time C/C++ coding on Linux. My machines will, once again, all have Gentoo Linux on them, while the servers run Debian. (Or Devuan if I can help it.)
----------------------------------------------------------------
So what have I learned in my 15 months stint as a C++ Qt5 developer on Windows 10 using Visual Studio 2017?
1. VS2017 is the best ever.
Although I am a Linux guy, I have owned all Visual C++/Studio versions since Visual C++ 6 (1999) - if only to use for cross-platform projects in a Windows VM.
2. I love Qt5, even on Windows!
And QtDesigner is a far better tool than I thought. On Linux I rarely had to design GUIs, so I was happily surprised.
3. GUI apps are always inferior to CLI.
Whenever a collegue of mine and me had worked on the same parts in the same libraries, and hit the inevitable merge conflict resolving session, we played a game: Who would push first? Him, with TortoiseGit and BeyondCompare? Or me, with MinTTY and kdiff3?
Surprise! I always won! 😁
4. Only shortly into Application Development for Windows with Visual Studio, I started to miss the fun it is to code on Linux for Linux.
No matter how much I like VS2017, I really miss Code::Blocks!
5. Big software suites (2,792 files) are interesting, but I prefer libraries and frameworks to work on.
----------------------------------------------------------------
For future reference, I'll answer a possible question I may have in the future about Windows 10: What did I use to mod/pimp it?
1. 7+ Taskbar Tweaker
https://rammichael.com/7-taskbar-tw...
2. AeroGlass
http://www.glass8.eu/
3. Classic Start (Now: Open-Shell-Menu)
https://github.com/Open-Shell/...
4. f.lux
https://justgetflux.com/
5. ImDisk
https://sourceforge.net/projects/...
6. Kate
Enhanced text editor I like a lot more than notepad++. Aaaand it has a "vim-mode". 👍
https://kate-editor.org/
7. kdiff3
Three way diff viewer, that can resolve most merge conflicts on its own. Its keyboard shortcuts (ctrl-1|2|3 ; ctrl-PgDn) let you fly through your files.
http://kdiff3.sourceforge.net/
8. Link Shell Extensions
Support hard links, symbolic links, junctions and much more right from the explorer via right-click-menu.
http://schinagl.priv.at/nt/...
9. Rainmeter
Neither as beautiful as Conky, nor as easy to configure or flexible. But it does its job.
https://www.rainmeter.net/
10 WinAeroTweaker
https://winaero.com/comment.php/...
Of course this wasn't everything. I also pimped Visual Studio quite heavily. Sam question from my future self: What did I do?
1 AStyle Extension
https://marketplace.visualstudio.com/...
2 Better Comments
A simple patch to make different comment styles look different. Like obsolete ones being shown struck through, or important ones in bold red and such stuff.
https://marketplace.visualstudio.com/...
3 CodeMaid
Open Source AddOn to clean up source code. Supports C#, C++, F#, VB, PHP, PowerShell, R, JSON, XAML, XML, ASP, HTML, CSS, LESS, SCSS, JavaScript and TypeScript.
http://www.codemaid.net/
4 Atomineer Pro Documentation
Alright, it is commercial. But there is not another tool that can keep doxygen style comments updated. Without this, you have to do it by hand.
https://www.atomineerutils.com/
5 Highlight all occurrences of selected word++
Select a word, and all similar get highlighted. VS could do this on its own, but is restricted to keywords.
https://marketplace.visualstudio.com/...
6 Hot Commands for Visual Studio
https://marketplace.visualstudio.com/...
7 Viasfora
This ingenious invention colorizes brackets (aka "Rainbow brackets") and makes their inner space visible on demand. Very useful if you have to deal with complex flows.
https://viasfora.com/
8 VSColorOutput
Come on! 2018 and Visual Studio still outputs monochromatically?
http://mike-ward.net/vscoloroutput/
That's it, folks.
----------------------------------------------------------------
No matter how much fun it will be to do full time Linux C/C++ coding, and reverse engineering of WORM file systems and proprietary containers and databases, the thing I am most looking forward to is quite mundane: I can do what the fuck I want!
Being stuck in a project? No problem, any of my own projects is just a 'git clone' away. (Or fetch/pull more likely... 😜)
Here I am leaving a place where gitlab.com, github.com and sourceforge.net are blocked.
But I will also miss my collegues here. I know it.
Well, part of the game I guess?
"I'll just fetch the data from Wikipedia, it must be really simple using their API"
I've never been so wrong.
-
“sEniOr tEcHniCiaN”: “I don’t know what Blazor is. I write my projects in ASP.NET. You should just use ASP.NET”
Me: …”Blazor *is* ASP. This project is running on ASP.NET 6.”
“seNioR tEchNiCiaN”: “As previously stated, I don’t use Blazor. I don’t care what version it is.”
Yes, this is a real exchange from my ongoing problems with this idiot.
His attitude is what ticks me off the most.
He doesn’t know what CORS is.
He doesn’t understand that “ASP.NET” covers Blazor, Razor Pages, the old MVC stuff, web APIs, and more.
He doesn’t understand the difference between a web request being initiated from the browser via Fetch and a web request being initiated from the server. (“My ASP site is shown in the browser, so requests to the third party API aren’t originating from the server.”)
And yet has the arrogance to repeatedly talk down to me while I try to explain basic concepts to him in the least condescending way possible.
After going around and around in circles with him, he finally admitted to me that “he doesn’t actually know what the CORS configuration looks like or how to modify it, to be honest.”
I just wanna go home.
Some of the penguin's finest insults (Some are by me, some are by others):
Disclaimer: We all make mistakes and I typically don't give people that kind of treatment, but sometimes, when someone is really thick, arrogant or just plain stupid, the aid of the verbal sledgehammer is neccessary.
"Yeah, you do that. And once you fucked it up, you'll go get me a coffee while I fix your shit again."
"Don't add me on Facebook or anything... Because if any of your shitty code is leaked, ever, I want to be able to plausibly deny knowing you instead of doing Seppuku."
"Yep, and that's the point where some dumbass script kiddie will come, see your fuckup and turn your nice little shop into a less nice but probably rather popular porn/phishing/malware source. I'll keep some of it for you if it's good."
"I really love working with professionals. But what the fuck are YOU doing here?"
"I have NO idea what your code intended to do - but that's the first time I saw RCE and SQLi in the same piece of SHIT! Thanks for saving me the hassle."
"If you think XSS is a feature, maybe you should be cleaning our shitter instead of writing our code?"
"Dude, do I look like I have blue hair, overweight and a tumblr account? If you want someone who'd rather lie to your face than insult you, go see HR or the catholics or something."
"The only reason for me NOT to support you getting fired would be if I was getting paid per bug found!"
"Go fdisk yourself!"
"You know, I doubt the one braincell you have can ping localhost and get a response." (That one's inspired by the BOFH).
"I say we move you to the blockchain. I'd volunteer to do the cutting." (A marketing dweeb suggested to move all our (confidential) customer data to the "blockchain").
"Look, I don't say you suck as a developer, but if you were this competent as a gardener, I'd be the first one to give you a hedgetrimmer and some space and just let evolution do its thing."
"Yeah, go fetch me a unicorn while you're chasing pink elephants."
"Can you please get as high as you were when this time estimate come up? I'd love to see you overdose."
"Fuck you all, I'm a creationist from now on. This guy's so dumb, there's literally no explanation how he could evolve. Sorry Darwin."
"You know, just ignore the bloodstain that I'll put on the wall by banging my head against it once you're gone."2 -
Well... I had in over 15 years of programming a lot of PHP / HTML projects where I asked myself: What psychopath could have written this?
(PHP haters: Just go trolling somewhere else...)
In my current project I've "inherited" a project which was running around ~ 15 years. Code Base looked solid to me... (Article system for ERP, huge company / branches system, lot of other modules for internal use... All in all: Not small.)
The original goal was to port to PHP 7 and to give it a fresh layout. Seemed doable...
The first days passed by - porting to an asset system, cleaning up the base system (login / logout / session & cookies... you know the drill).
And that was where it all went haywire.
I really have no clue how someone could have been so ignorant to not even think twice before setting cookies or doing other "header related" stuff without at least checking the result codes...
Basically the authentication / permission system was fully fucked up. It relied on redirecting the user via header modification to the login page with an error set in a GET variable...
Uh boy. That ain't funny.
Ported to session flash messages, checked if headers were sent, hard exit otherwise - redirect.
But then I got to the first layers of the whole "OOP class" related shit...
It's basically "whack a mole".
Whoever wrote this, was as dumb and as ignorant to build up a daisy chain of commands for fixing corner cases of corner cases of the regular command... If you don't understand what I mean, take the following example:
Permissions are based on group (accumulation of single permissions) and single permissions - to get all permissions from a user, you need to fetch both and build a unique array.
Well... The "names" for permissions are not unique. I'd never expected to be someone to be so stupid. Yes. You could have two permissions name "article_search" - while relying on uniqueness.
All in all all permissions are fetched once for lifetime of script and stored to a cache...
To fix this corner case… There is another function that fetches the results from the cache and returns simply "one" of the rights (getting permission array).
In case you need to get the ID of the other (yes... two identifiers used in the project for permissions - name and ID (auto increment key))...
Let's write another function on top of the function on top of the function.
My brain is seriously in deep fried mode.
Untangling this mess is basically like getting pumped up with pain killers and trying to solve logic riddles - it just doesn't work....
So... From redesigning and porting from PHP 7 I'm basically rewriting the whole base system to MVC, porting and touching every script, untangling this dumb shit of "functions" / "OOP" [or whatever you call this garbage] and then hoping everything works...
A huge thanks to AURA. http://auraphp.com/
It's incredibily useful in this case, as it has no dependencies and makes it very easy to get a solid ground without writing a whole framework by myself.
Amen.
-
a tale of daily frustration:
git fetch
*yup I'm up-to-date ...*
git add -p .
*hack in beautiful patch ...*
git status -bs
*correct branch, didn't forget any files ...*
git diff --cached
*yep, that is what I mean to commit ...*
git commit -m"[TKT-NUM] Meaningful commit message"
git log -p -1
*double-checking ... looks good ...*
git push remote tkt-num-etc
*for a brief moment feel accomplished ...*
*notice typo in commit message ...*
I don't have a funny image or punchline to sum this post up. But know that if you recognise this feeling, then I am your brother in git.
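The consolation, assuming nobody has pulled the branch yet, is that the typo is one amend away:
git commit --amend -m "[TKT-NUM] Meaningful commit message, typo fixed"
git push --force-with-lease origin tkt-num-etc
-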
The day I had to translate this JavaScript:
fetch(url).then(r => r.json()).then(r => r.data)
To visual basic6 -
During job interview
Me : Am I going to maintain old solution?
Interviewer : Of course not.
2nd day
PM : Please fetch project X from SVN
-
Why does FireFox have the shittiest dev tools?
Working on my website and it kept throwing "TypeError: Failed to fetch"
with no other info
Opened Chrome and that thing gave me the entire error without even modifying my logs code, and now I can peacefully solve the problem -.-
Thanks to @sain2424, we found a really weird bug in devRantron.
One of the users who commented on or upvoted his rant had deleted their account. So when the notification component was trying to fetch that user's avatar to show the notification, it was failing and causing the app to crash.
@Cyanite the bugs you reported are fixed as well. Please update 🙃 🙃
Got pretty peeved with EU and my own bank today.
My bank was loudly advertising how "progressive" they were by having an Open API!
Well, it just so happened I got an inkling to write me a small app that would make statistics of the payments going in and out of my account, without relying on anything third-party. It should be possible, right? Right?
Wrong...
The bank's "Open API" can be used to fetch the locations of all the physical locations of the bank branches and ATMs, so, completely useless for me.
The API I was after was one apparently made obligatory (don't quote me on that) by the EU, called the PSD2 - Payment Services Directive 2.
It defines three independent APIs - AISP, CISP and PISP, each for a different set of actions one could perform.
I was only after AISP, or the Account Information Service Provider. It provides all the account and transactions information.
There was only one issue. I needed a client SSL certificate signed by a specific local CA to prove my identity to the API.
Okay, I could get that, it would cost like.. $15 - $50, but whatever. Cheap.
First issue - These certificates for the PSD2 are only issued to legal entities.
That was my first source of hate for politicians.
Then... As a cherry on top, I found out I'd also need a certification from the local capital bank which, you guessed it, is also only given to legal entities, while also being incredibly hard to get in and of itself, and so far, only one company in my country got it.
So here I am, reading through the documentation of something that would completely satisfy all my needs, yet that is locked behind a stupid legal wall because politicians and laws gotta keep the technology back. And I can't help but seethe in anger towards both the EU that made this regulation, and the fact that the bank even mentions this API anywhere.
Seriously, if 99.9% of programmers would never ever get access to that API, why bother mentioning it on your public main API page?!
It... It made me sad more than anything...
-
Did some updates to an older Web Forms website built by a previous SENIOR developer who is a notoriously horrible developer.
Now before I start, you have to understand this guy studied at a University and had been working for at least two years before I even started working. He is supposed to know the basic shit mentioned below.
This also happened a couple of days ago, so I have calmed down since then so I apologise for the relaxed tone. My next rant will contain a lot more swearing.
This fucking guy did the stupidest shit imaginable.
On the details view of a post|page|article|product|anything that would require a details view this jackass would load the data from the DB.
Using an OleDbConnection, OleDbDataAdapter, DataTable and the poorest written fucking sql statements you have ever seen. All of these declared in the Page_Load method.
There was literally no reason for him to use OleDb instead of Sql, but he simply did not know any better.
He especially liked: "select * from tbl where id = " & Request("T") & ""
ZERO fucking checks to see if the value is even passed or valid, nothing. He did not even check whether the DataTable had any rows.
He then proceeded to use only the Heading column of the returned row to change the page's title.
Stupidly I assumed the aspx page will be in a better state. Fuck NO!
This fucktard went, added server tags to the opening of the asp:Content tag, copied that shit he used to fetch the data and pasted it between the server tags.
He did not know how to access the DataTable mentioned above from the aspx page!
He did this on every fucking project he worked on. Any place that required <%= %> to display data instead of using asp server controls, this cunt copied whatever was written in the code behind and pasted everything between server tags.
Fuck I could go on forever, but I think this is enough for my first rant.
-
Man, what a way to start the week. Our mailserver went nuts (something about a Shellbot virus, I don't know) and we were forced to migrate to a new one. Clients calling in panic and threatening to sue us and shit. I was the one tasked to fix the problem (I am a developer mind you, my sysadmin knowledge is limited to google searching and contacting support). At the same time, Turkish hackers attacked our other server and forced me to fetch backups and clear spamming scripts. And to top it all, I was forced to answer the phone calls and respond to the threats. Man, I must have been a complete prick in my previous lifetime to deserve this.
-
TL;DR : do we need a read-only git proxy
Guys, I just thought about something: a potential gitpocalypse.
There is no doubt anymore that regardless of Microsoft's decisions about Github, some projects will or already have migrated to the competition.
I'm thinking : some projects use the git link to fetch the code. If a dependency gets migrated, it won't be updated anymore, or worse, if the previous repo gets deleted, it can break the project.
Hence my idea : create some repository facade to any public git repository (regardless of their actual location).
Instead of using github.com/any/thing.git, we could use opensourcegit.com/any/thing.git. (fake url for the sake of the example).
It would redirect to the right repository (for public read only), and the owner could change the location of the actual repository in case of a migration.
What do you think? If I get enough ++'s, I'll create a git repo about this.
Let's talk about input forms.
Please don't do that!
Setting time with the arrow buttons in the UI.
Each time change causes the page to re-fetch all the records. This is fetching, parsing, rendering tens of thousands of entries on every single click. And I want to set a very specific time, so there's gonna be a lot of gigs of traffic wasted for /dev/null.
Do I hear you say "just type the date manually you dumbass!"? I would indeed be a dumbass if I didn't try that. You know what? Typing the date in manually does nothing. Apparently, the handler is not triggered if I type it in manually/remove the focus/hit enter/try to jump on 1 leg/draw a blue triangle in my notebook/pray
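The obvious client-side band-aid for the arrow buttons would be a debounce, so ten rapid clicks cost one request instead of ten (fetchRecords() and the endpoint below are stand-ins, not from the rant):
let debounceTimer;

function onTimeChanged(newTime) {
  clearTimeout(debounceTimer);
  // fire only after the user has stopped clicking for 400 ms
  debounceTimer = setTimeout(() => fetchRecords(newTime), 400);
}

function fetchRecords(time) {
  // request only the range/page you need instead of every record
  fetch(`/api/records?from=${encodeURIComponent(time)}&limit=100`)
    .then((res) => res.json())
    .then((rows) => console.log(rows.length, "records loaded"));
}
-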
Halloween is coming, so I made this constructed unicorn mask with a paper mache base and EL wire. Does it require any coding skills? Nope, but I bet people are going to be surprised that I know how to build wireframes and paper mache!
-
Why do simple errors take the longest to fucking find!
Was using the geolocation api (js) to get the current longitude and latitude of my location. Stored them in an object to use in a fetch(). Every time I ran the fetch it was giving me the wrong location!
1hr later I realized I had used.
Fetch(https${longitude}blablabla${longitude})
After realizing this mistake and everything that led up to that moment, I closed my MacBook and took my ass to sleep.
Moral of this story is...take fucking breaks.
Goodnight
-
(New account because my main account is not anonymous)
Let's rant!
I'm 3 exams away from my CS degree, and I've chosen to do an internship instead of another exam, thinking it was a great idea.
Now I'm in this company, where I've never met anyone in person because of the pandemic. A little overview:
- No git, we exchange files on whatsapp (spicy versioning)
- Ideas are foggy, so they ask for change even if I met their requirements, because from a day to another they change
- My thesis supervisor is not in the IT field, he understands nothing
The first (and only) task they gave me was a web page to make requests to their server, fetch data etc.
Two months passed trying to meet their requests; there was a lot of dynamic content changing on the page, so I asked if I could use some rendering framework to make the code less shitty. No answer.
I continued doing shitty code in plain JS.
Another intern guy graduated, and I have to maintain his code. This guy once asked me "Why have you created 8 js modules to accomplish the web page job?". I just answered that that was my way of working; since we're on the same level in the company I didn't feel the need to explain things like usability, maintainability etc. It's like I have a bit of imposter syndrome, so I'm never 100% sure that my knowledge is correct.
Now we've come to the point where I've got his code to maintain, and guess what:
900 lines of JS module that does everything from rendering to fetching data..
I do my tasks on his code, then a bug arises, so the "managers" ask him what happened (why don't you ask me, the one maintaining his code!?!?), and he fixes the bug even though he has already finished his internship. So we had two copies of the same work, one with my job done and still with his bug, and another one without my work and without the bug.
I ask how to merge, and they send me the lines changed (the numeration was changed on my file ofc, remember: no git...)
Now we arrive at today, after a month in which they haven't assigned any task to me, and they say:
"Ok, now let's re-do everything with this spicy fancy stunning frontend framework".
A very "indie" Framework that now I've to study to "translate" my work. A thing that could be avoided when I've asked for a framework, 2/3 MONTHS AGO.1 -
Don't you just love it when you ask Hibernate to fetch some data from the DB and it does that for you? And also updates a few more tables on its way. And inserts a few more rows. And updates another couple of fields..
Isn't that just amazing? I mean.. What could be more satisfying than getting an "ORA-00001: unique constraint (some.constraint) violated" while issuing a FUCKING **GET** METHOD?
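The usual suspect for this is dirty checking: something mutates the managed entities between the fetch and the end of the session, and Hibernate dutifully flushes those changes on commit. One hedged guard, assuming Spring-managed transactions (the rant doesn't say which stack this is; Order and orderRepository are stand-ins):
import org.springframework.transaction.annotation.Transactional;

@Transactional(readOnly = true)
public List<Order> findOrders() {
    // read-only session: no flush, so "accidental" entity changes never hit the DB
    return orderRepository.findAll();
}
-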
!rant
Did you ever have that feeling of "what the fuck - how did I do THAT"? I just had one; I was fine with the algorithm taking like 15/25 minutes to fetch, process and save all the data, and now it shreds through 10 times the previous amount in like 7 minutes.
Not about favorite language but about why PHP is not my favorite language.
I recently launched a web shop built on Prestashop. I found that some product pages are so god damn slow, like taking 50 fuckin' seconds to load. So I started investigating and analyzing the problem. Turns out that for some products we have so many different combinations that it results in a cartesian product totalling about 75K of unique combinations.
Prestashop did a real bad job coding the product controller because for every combination they fetch additional data. So that results in 75K queries being executed for just 1 product detail page. Crazy, even more when you know that the query that loads all these combinations, before iterating through them, takes 7 fuckin' seconds to execute on my dev machine which is a very very fast high end machine.
That said I analyzed the query and now I broke the query down into 3 smaller queries that execute in a much faster 400 ms (in total!) fetching the exact same data.
So what does this have to do with PHP? As PHP is also OO, why the fuck would you always put stuff in these god damn associative arrays that in turn contain associative arrays that contain more arrays containing even more arrays of arrays?
Yes I could do the same in C# and other languages as well but I have never ever encountered that in other languages but always seem to find this in PHP. That's why I hate PHP. Not because of the language but all those fucking retarded assholes putting everything in arrays. Nothing OO about that.2 -
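To illustrate what I mean (a sketch in PHP 8.1 syntax, names invented; this is not Prestashop's actual structure):

    <?php
    // The style I keep finding: arrays of arrays of arrays, no hint of what's inside.
    $price = $combinations[42]['attributes'][0]['price_impact'] ?? 0.0;

    // The same data as a small value object: readable, type-checked, IDE-friendly.
    final class Combination
    {
        public function __construct(
            public readonly int $id,
            public readonly float $priceImpact,
            /** @var string[] */
            public readonly array $attributeNames,
        ) {}
    }

    $combination = new Combination(42, 1.50, ['Size: M', 'Color: red']);
    $price = $combination->priceImpact;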
I feel I should open a GitHub repo for people to contribute privacy policy parts into - have, say, folders like "google analytics", and then whenever people encounter those in the wild they add examples there, so people could fetch together a full privacy policy for free, as all those new cashgrab websites are just fucking insane. But I am not really sure if that would find any contributors tbh, sadly.
P.S.: I seem to have developed a sense for when the devRant post cooldown is over, so I can rant more lol, because whenever I feel like posting the thought or rant, the cooldown is just about to expire
Raging here, overheating really. One spends thousands on technology that is promoted with the catchphrase "it just works", yet here I am, after updating my fancy new emoji maker (iPhone X) to 11.2, attempting to carry on working by compiling my code to test some new features. And...
oh, what's this, Xcode? You have a problem? You can't locate something? You can't locate iOS 11.2 (15C114)... sorry, and you think that this "may not" be supported in the current version of Xcode?
Let me get this straight, you advanced piece of technological wizardry: you know you are missing something, you in fact know what it is, you can actually TELL me what is missing, and yet, still, in 2017, you can't go FETCH it?????
Really? All you can do is sit, with that stupid look on your face, and watch the paint dry? You're stuck? That's it?
I hate you for the false pretense of advanced capability, and for your lack of a consistent dark theme so my eyes stop bleeding when reading your "I don't know what to do" messages...
By the way, maybe you can stop randomly crashing, or pinwheeling. I get that you're bored as a machine designed to crunch numbers/data/code all day long and that for fun you feel you have to add some color to your existence. But stop it. Do what I'm told you can do, "JUST WORK" for once, without me having to drag you forward kicking and screaming.
K. that feels better. Now for some whiskey.5 -
ATTENTION @dfox!!!!
I found a bug in the devRant REST server. When you upload a rant with a vertical image (when the height is bigger than the width), the server seems to flip between the width and the height (I don't know if it happens all the time), and when I fetch the rant, the JSON data for image width and height is flipped! (in the JSON, the attached image section contains width and height information about the image, and they are flipped in their values, meaning that the width key is equal to the height, and the height key is equal to the width).
An explanation on the image below:
The small window on the left shows the real info about the image. Notice how the height is 800 and width is 600.
The window at the bottom shows the data fetched from devRant servers. Notice how the width returned from the server is 800 and the height returned from the server is 600.22 -
ME: Here's an endpoint to get all the textual info about the entity. And this one fine endpoint is to fetch entity's files
FrontEnd: This is no good. I need all entity info in a single JSON
ME: but files could be quite heavy, are you sure you wan...
FE: Yes, Just give me all the info in a single JSON
ME: okay... I hope you know what you're doing..
ME: <implemented as requested>
ME: <opens a webpage with 2 files attached>
Browser: <takes 30 seconds to open a page and downloads 30MB of data in the JSON>
ME: As mentioned before, your approach is a performance killer
FE: No worries, we'll fix that in the next version. First let's see if anyone will be using this feature at all - maybe it's not even worth working on
ME: <thinking> I know I would NOT be using an app if it takes over half a minute to open up a chat channel. FFS I wouldn't even be using Slack if it took 30 seconds to open some other conversation, because for some reason it wanted to fetch all the uploaded files along with all the messages each time a channel is clicked on.....
ME: <thinking> this project is doomed :(11 -
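What I actually proposed, roughly (field names made up): keep the entity payload small and let the client fetch files lazily from their own endpoint.

    {
      "id": 42,
      "name": "general",
      "files": [
        { "id": 7, "name": "spec.pdf", "size": 18734003, "url": "/entities/42/files/7" }
      ]
    }

That way the 30MB of attachments only travel over the wire when someone actually opens a file.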
*Today*
Me: coding is fun, la la la, nothing can go wrong~~ ♡
*moment later*
Me: ahhhhh my back!! Why was I sitting in that terrible unergonomic position
Moral of the story: practice good sitting posture and you won't get muscle knots! ;-;3
Happened just yesterday.
At 7:00pm some shared-office colleagues rocked up at the building drunk, to inform me that they were having their own Christmas party, since they couldn't be bothered attending ours the day before.
Just now, while watching one of the many animes I've saved onto my file server, I noticed something.. all of their files are incomplete, and so are they on the NTFS mirror on this WanBLowS host. The files got corrupted. I recall that I used robocopy to move the files back and forth, and yet again it lives up to its reputation as a motherfucking piece of Winshit. FUCK YOU ROBOCOPY!!! If I wanted to fetch that anime yet again just to deal with your developers' incompetence, I'd have watched it online!! Meanwhile tell me, HOW DIFFICULT IS IT TO DEAL WITH A NETWORK FILE TRANSFER THAT EVEN USES YOUR OWN SHITFEST OF A PROTOCOL, FUCKING SMB?!! MSFT certified pieces of shit!!!!7
-
So after the original idea got scrapped during a hackathon this week, we created a Slack bot to fetch the most relevant answers from StackOverflow using the user's input. All the user had to do was type a few words, and the bot handled all typos, links etc. and returned the link as well as the most upvoted or the accepted answer after scraping it from the website.
The average time to find an answer was around 2 seconds, and we also said that we're planning to use Flask to deploy a web application for the same.
After the presentation, one of the judge-guys called me and told me that "It isn't good enough, will not be used widely" and "It's similar to Quora".
Never ever have I wanted to punch a son of a bitch in the balls ever.3 -
I'm having difficulty treating HR like human beings. I mean yes, I spammed you to fetch me my payslip, but why didn't you check why I wasn't getting it automatically in the first place? And your response is that you are busy and HR requests take a minimum of 48 hours to process? THE FUCK? I mean how hard can it be to type my ID into the system and send me my payslip.
I really need to learn how to "play nice" before I get in trouble.3 -
As a pretty solid Angular dev getting thrown a react project over the fence by his PM I can say:
FUCK REACT!
It is nigh impossible to write well-structured, readable, well-modularized code with it and not twist your mind into recursion from "lift state up" and "render cycle goes downwards only".
Try writing a modular modal as a modern function component with interchangeable children (passable to the component, as it should be) that uses portals and returns the result of the passed children components.
Closest I found to it is:
c o d e s a n d b o x.io/s/7w6mq72l2q
(and it's a fucking nightmare logic-wise and readability-wise)
And also I still wouldn't know right off the bat how to get the result from the passed child components with all the one-way binding CLUSTERFUCK.
And even if you manage to, there is no chance to do it async as it should be.
You HAVE to write a lot of "HTML" tags in the DOM that practically should not be anywhere but in async functions.
In Angular this is a breeze and works like a charm.
There's not even much gray matter to it...
I can't comprehend how companies decide to write really big web apps with it.
They must be a MESS to maintain.
For a small "four components that show a counter and fetch user images" - OK.
But for a big webapp with a big team etc. etc.?
Asking stuff about it on Stackoverflow I got edited unsolicited as fuck and downvoted as fuck in an instant.
Nobody explained anything or even cared to look at my Stackblitz.
Unsolicited edit, downvote, close vote and off they go - no help provided whatsoever.
It's completely fine if you don't have time to help strangers - but then at least do not stomp on beginners like that.
I immediately regretted asking a toxic community like this something that I genuinely seem to not understand. Wasn't SO about helping people?
I deleted my post there and won't be coming back and doing something productive there anytime soon.
Out of respect for my client's budget I'm now doing it the ugly React way and forgetting about my software architecture standards, but as soon as I can I will advise switching to Angular.
If you made it here: WOW
Thank you for giving me a vent to let off some steam :)13 -
A little late but whatever.
About half a year ago, I started working on setting up self hosted (slippy) maps. For one, because of privacy reasons, for two, because it'd be in my own control and I could, with enough knowledge, be entirely in control of how this would work.
While the process has been going on for hours every day for about half a year (with regular exceptions), I'll briefly lay out what I've accomplished.
I started with the OpenMapTiles project and tried to implement it myself. This went well but there were two major pitfalls:
1. It was Postgres-database-based. This is fine, but when you want to have the entire world.... the queries took insanely long (minutes, at lower zoom levels) and quite intimate Postgres/tooling knowledge was required, which I don't have.
2. Due to the long queries and such, the performance was so bad that the maps could take minutes to render and when you'd want that in production... yeah, no.
After quite some time I finally let that idea sail and started looking into the MBTiles solution; generating sqlite databases of geojson features. Very fast data serving but the rendering can take quite some time.
After some more months, I finally got the hang of it to the point that I automated 50-70 percent of the entire process. The one problem? It takes a shitload of resources and time to generate a worldwide mbtiles database.
After endless rounds of trial and error, I figured out that one can divide a 'render' (mbtiles aka sqlite database) into multiple layers (one for building data, one for water, one for roads and so on), so I started doing renders that way.
Result? Styling became way easier and more logical, and one could pick specific data to display; only want to display the roads? It's way more simple this way. (Not impossible otherwise, but figuring out how that works... Good luck).
Started rendering all the countries, continents and such this way and while this seemed like a great idea; the entire world is at 3-4 percent after about a month. And while 40-70 percent generates 10 times as fast, that's still way too slow.
Then, I figured out that you can fetch data per individual layer/source. Thus, I could render every layer separately which is way faster.
Tried that with a few very tiny datasets and bam, it works. (And still very fast).
So, now, I'm generating all layers per continent. I want to do it world based but figured out that that's just not manageable with my resources/budget.
Next to that, I'm working on an API which will have exactly the features I want/need!13 -
Years ago, we were setting up an architecture where we fetch certain data as-is and throw it in CosmosDb. Then we run a daily background job to aggregate and store it as structured data.
The problem is the volume. The calculation step is so intense that it will bring down the host machine, and the insert step will bring down the database in a manner where it takes 30 min or more to become accessible again.
Accommodating for this would need a fundamental change in our setup. Maybe rewriting the queries, the data structure, containerizing it for auto scaling, whatever. Back then, this wasn't on the table due to time constraints, and nobody wanted to be the person to open that Pandora's box of turning things upside down when it "basically works".
So the hotfix was to do a 1-second thread sleep for every iteration where needed. It makes the job take upwards of 12 hours where - if the system could endure it - it would normally take a couple of minutes.
The solution has grown around this behavior ever since, making it even harder to properly fix now. Whenever there is a new team member there is this ritual of explaining this behavior to them, then discussing solutions until they realize how big of a change it would be, and concluding that it needs to be done, but...
not right now.2 -
!rant
My best friend and I joined a hackathon way back when we were in college. The task was to fetch JSON data from a REST API. We were given a sample link so we could compare our output with the expected output. But the response from the actual API was not in JSON format, it was a string, so we needed to do dozens of string manipulations to match the expected output.
To submit our work we were given our own subdomain to upload our work to and set up the environment, and that URL would be submitted. We knew how to complete the challenge but time was running out and we were in panic mode, so my friend mistakenly submitted the URL used to compare the output. We already expected to fail the challenge but what the fuck, we got a perfect score and won the challenge.1
If we had a devRant vote on the most annoying word of 2018, I'd vote for "token".
Token here, token there, token yourself in the rear!
Some project I'm currently working on has to fetch 4 different tokens for these syphilis-ridden external CUNTful APIs.
Your mother inhales dicks at the train station toilet for one token!2
TL;DR: Fuck Wordpress and their shitty “editor”.
The client told me the Wordpress editor was unusually slow on their site. I inspected the network traffic while fiddling around in the admin pages. What I found was an even worse nightmare than expected. Somehow the fucktard of an "engineer" decided to have the spell check module parse all other text areas on the page - even the fucking image sources. The result is the browser sending a GET request to fetch the images from the server every time an author triggers a keyup event. Disabled the spell check and everything was back to budget-ineffective-feces-Wordpress normal.
I'm a fucking idiot: instead of fetching some data and then immediately using it to fetch other data, I saved it in a file to parse later and fetch it into other data. And it's been running for a day now so I don't wanna stop it.7
-
When you want only 10 rows of query result.
MySQL: select * from foo limit 10.... 😁
Sql server: select top 10 * from foo.. 😁
PostgreSQL: select * from foo limit 10.. 😁
Oracle: select * from foo FETCH NEXT/FIRST 10 ROWS ONLY. 🌚
Oracle, are you trying to be more expressive/verbose? Because if that's the case then your understanding of verbosity is fucked up, just like your understanding of clean coding, user experience, open source, productivity...
Etc.6 -
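For completeness, the standard-SQL form that Oracle 12c+, PostgreSQL and SQL Server 2012+ all accept, plus the classic pre-12c Oracle workaround (table and column names are placeholders):

    -- SQL:2008 style, works on Oracle 12c+, PostgreSQL, SQL Server 2012+
    SELECT * FROM foo
    ORDER BY id
    OFFSET 0 ROWS FETCH FIRST 10 ROWS ONLY;

    -- Oracle before 12c: wrap it in ROWNUM
    SELECT * FROM (SELECT * FROM foo ORDER BY id) WHERE ROWNUM <= 10;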
When your egghead boss (who is a dev, BTW) fails miserably at understanding that JavaScript's fetch does not behave like the synchronous-by-default requests library in Python.
After failing to teach him about the asynchronous nature of JavaScript promises, he ends the discussion by saying "that's why python is better than js".
*facepalm*2 -
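The whole argument, in a few lines (sketch, assuming this runs inside an async function):

    // JavaScript: fetch() hands back a promise immediately; the body isn't there yet.
    const res = await fetch("https://api.example.com/users"); // drop the await and you hold a Promise, not data
    const users = await res.json();
    // Python's requests.get(...).json() would simply block until the response arrived -
    // which is exactly the behaviour he kept expecting from fetch().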
Damn, Gitea is such a great piece of software, yet it lacks so much of what GitLab has (granted, they are of completely different sizes and all), but damn, I would kill for GitLab's repo import feature to be in Gitea, and maybe even automatic pipelines to fetch a remote automatically..
I could most likely hack together a solution that does the import and remote fetching automatically, but I doubt it would hold up against any sort of update and it would be absolutely brutally murdered by any change.4
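The hack I had in mind, roughly (paths and URLs made up): a mirror clone plus a cron job, living completely outside Gitea so at least updates can't murder it.

    # one time: mirror the upstream repo
    git clone --mirror https://github.com/someuser/somerepo.git /srv/mirrors/somerepo.git

    # cron, e.g. every 15 minutes: pull from upstream, push into Gitea
    cd /srv/mirrors/somerepo.git
    git fetch --prune origin
    git push --mirror https://gitea.example.com/me/somerepo.git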
Be me
Need to fetch data from a website for project I'm working on
Look for an API with no luck
Spend 2 weeks developing a ridiculously overengineered and slow webscraping program with Selenium
Find an API
Fuck this4 -
A (work-)project I spent a year on will finally be released soon. That's the perfect opportunity to vent out all the rage I built up while dealing with what is the JavaScript version of a Zodiac letter.
Everything went wrong from the beginning. 3 people were assigned to rewrite an old Flash application: me, A and B. B suggested a JavaScript framework, even though A and I had never worked with more than jQuery. In the end we chose react/redux with REST on the server, a classic.
After some time I got the hang of it; around that time B left and a new guy, C, was hired soon after. He didn't know about react/redux either. The perfect start to a burning pile of smelly code.
Today this burning pile has turned into a wasteland of code quality, a house of cards with a storm approaching, a rocket with leaks ready to launch, you get the idea.
We've got 2 dozen files with 200-500 loc each, all in the same directory and each with the same 2-word prefix, which makes finding the right one a nightmare on its own. We have an i18n library used only for ~10 text fields, copy-pasted code you never know if it's used or not, fetch calls with no error handling, and many other code smells that turn this fire into a garbage fire. An eternal fire. 3 months ago I reduced the linter warnings on this project to 1; now I can't keep count anymore.
We use the reactabular module, which gives us headaches because IT DOESN'T DO WHAT IT'S SUPPOSED TO DO AND WE CAN'T USE IT WELL EITHER. All because the client can't be bothered to have the table header scroll along with the body. We have methods which do two things because passing another callback somehow crashed in the browser. And the only thing about indentation is that it exists. Copy-pasting from websites and other files, plus indentation wars, give the files a unique look that makes you wonder if one of the devs hides whitespace code in the files.
All of this is the result of missing time, results over quality and the worst approach of all, used by A: if A wants a UI component similar to an existing one, he copies the original and edits the copy until it does what he wants. A knows about classes, modules, components, etc. Still, he can't bring himself to spend his time on creating superclasses... his approach gives results much faster.
Things got worse when A tried redux; luckily A prefers the component's local state. WHICH IS ANOTHER PROBLEM. He doesn't understand redux, so he loads all of the data directly from the server and puts it into the local state. The point of redux is that you don't have to do this. But there are only 1 or 2 examples of how this practice hurt us yet, so I'm gonna have to let this slide. IF HE AT LEAST WOULD UPDATE THE DATA PROPERLY. Changes are just sent to the server and then all of the data is re-fetched. I programmed the REST endpoints to return the updated objects for a reason. But no, fuck me.
I've heard A decided (A is the team leader) to use less redux on the next project and use a dedicated REST endpoint for every little computation you COULD DO WITH REDUX INSTEAD. My will is broken and I just don't want to work with this anymore.
There are still various subpages that can't survive an F5 because the components can't handle an empty redux state in the beginning, but to be honest I don't care anymore. Let's hope the client never finds out, along with the "on error nothing happens" bugs. The product should've been shipped last week, but thanks to mandatory bugfixes the release was postponed to next week. Then the next project starts...
Please give me some tips to keep up code quality over time, I can't take this once more.
I'm also aware that I could've done more: talking to A and C about code style, prettifying the code, etc. etc. But I was busy putting out my own fires; I couldn't kill much of the other fires, which in the end became a burning building (a perfect metaphor for this software)4
The company that I work for has recently recruited a team for Web Development, so they don't have to pay a monthly fee to the previous team who designed their website.
They have over 3000+ products on the old website, and no logical way to import them to the new website. The old team was asking for $300 to give us an API which would return the product details in an XML format.
Obviously, paying that amount of money wasn't logical for a dying website, so the manager decided to hire someone to manually copy the content from the old admin panel to the new one, that is, until I stopped him.
My solution? Write a simple web scraper to log in to the old panel and collect data. Boom! $300 saved from going to waste.
Now, the old team found out about this and, as happy as my manager was, they were quite angry. So they implemented a Google reCAPTCHA to prevent my bot from scraping the old panel.
I spent about 20 minutes and found out that once you're logged in to the old panel, the session is saved in a cookie and you are no longer greeted by a captcha.
So I rewrote a small portion of my bot, and boom! Instant karma from the manager. We finished publishing the new site and notified the old team, only to see the precious look on their faces. Poor guy, he thought I was a wizard or something 😂😂
That's what you get for overcharging people!
TL;DR: Company's old website team wanted to overcharge us for writing an API to fetch 3000+ records.
Wrote a basic web scraper to do the same job in less than an hour.3
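For the curious, the second version of the bot boiled down to something like this (Python sketch; URLs, credentials and selectors are invented): log in once, keep the session cookie, never see the captcha again.

    import requests
    from bs4 import BeautifulSoup

    session = requests.Session()  # keeps the auth cookie across requests
    session.post("https://old-panel.example.com/login",
                 data={"username": "admin", "password": "********"})

    products = []
    for page in range(1, 130):  # ~3000 products, 25 per page
        html = session.get(f"https://old-panel.example.com/products?page={page}").text
        soup = BeautifulSoup(html, "html.parser")
        for row in soup.select("table.products tr.item"):
            products.append([td.get_text(strip=True) for td in row.select("td")])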
Could probably write one myself, but fuck me I wish there was a ready plugin for VS Code or a tool that can fetch external resources into selected folders (e.g. css, js, .. in the public folder) and replace the references inside the code. It has happened way too many times that I synced my code base and went out with my laptop, just to realize I forgot that half of it uses external resources like Bootstrap, and my mobile connection is terrible enough to call it quits.3
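"Write one myself" would probably start as something like this (a rough Python sketch; the regex and paths are heavily simplified and assume the assets are referenced by full CDN URLs):

    import pathlib, re, urllib.request

    vendor = pathlib.Path("public/vendor")
    vendor.mkdir(parents=True, exist_ok=True)

    for html_file in pathlib.Path(".").glob("**/*.html"):
        text = html_file.read_text(encoding="utf-8")
        for url in re.findall(r'https?://[^"\']+\.(?:css|js)', text):
            local = vendor / url.split("/")[-1]
            if not local.exists():
                urllib.request.urlretrieve(url, local)          # fetch the CDN asset once
            text = text.replace(url, f"/vendor/{local.name}")   # point the markup at the local copy
        html_file.write_text(text, encoding="utf-8")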
-
seconds into 2019
I see one incompetent fucker asking to eval in Node.js..
A FUCKING FETCH OF NPM MODULES FROM CDNJS
you know what the reason is?
node_modules
Fucking kill me. Unless you're some dumb bitch who uses npm modules like some braindead motherfucker who doesn't know what a number is, node_modules takes only an average of 3.6MB
Compared to RubyGems which takes 40+
Or Pip
Seriously stop this. I wanna hang myself because my 2019 put me in a shit mood1 -
I actually never felt the need to scream at a co-worker so let's talk about that time a co-worker screamed at me instead.
tl;dr : some asshole boss screamed and threatened me because someone else's project was shit and didn't work.
Context: I was in my third year of school internship (graded) and my experience was C, C++, C#, Python, all in systems programming, no web.
I was working as an intern for a shit company that was selling shit software to hospitals (though not medically critical, thank God). The only tech guy on site was the DBA (cool guy); the product was maintained by a single dev in VB from his house, the dude never showed up to work (you'll understand why), plus another intern who couldn't dev shit.
I was working with the DBA on a piece of software doing statistical analysis from DB exports; it worked nicely, no problems there if we forget the lack of specs or boundaries (except it must work in ieShit).
The other intern was working on something else (don't ask me what it was), I just remember it was in GWT before the community revived it. His webapp was requesting a file from the company HTTP server instead of having one of its Java servlets fetch it (both apps ran on the same server), which caused a lot of shit, especially CORS errors. That guy left (end of contract) and left his shit as is. The boss asked me to deploy the app, I fiddled with it to see if it worked, and when I found out it didn't, that asshole started screaming at me in front of every other employee present, threatening to burn me in the tech world and have me thrown out of my school, for no other reason than that the other dude's project didn't work.
After the screaming I left and warned my school immediately.
I guess that's why the other dev never came to work.
I had three weeks of internship left, which I did from home, working probably less than 2 hours a day, so suck it asshole.
Still had a good grade because I was reviewed by the DBA and he was happy with the work I did.
It was only later that I realized that what he did qualified as harassment (at least in France), and I decided that never again would this happen without a response from my lawyer.1
We are 2 people working as remote Android devs for this startup in another country. 6 weeks ago a new person joined onsite to work directly in the startup HQ. I'll refer to him as the new guy.
Last week we started a new sprint (of 2 weeks) to work on a new feature.
The new guy was responsible for gathering all the specs and planning, so this is how our sprint is going so far:
Day 1:
We have 10+ tickets in Jira (tickets have only titles), no one knows what to do and we don't even have a specification. I started pushing everybody onsite to get their shit together. We NEED UX/UI specs, we NEED the backend to be ready, or at least to start working in parallel so that once we're done with the frontend the backend would be ready. I mean c'mon guys, this feature is already 70% done on iOS, why can't you send us the specification?
Day 2:
We had a meeting on Zoom and talked about the missing specification, and the project manager promised to send us the specs. Meanwhile the idea of the feature became clearer, so I agreed with the new guy to start researching the best way to implement our solution.
Day 3:
We received the specifications. I provided my research for the feature to the new guy. Turns out he had known about the specification 4-5 days before.
Instead of sharing information with us, he decided to create his own library to do what we want to do and blatantly rejected my research input.
Now he showed his implementation (which is shit by the way) and presented it as the only way to proceed forward. He offered for us to work in parallel with him on this (basically he wants to write the library alone, and we are supposed to somehow implement and test it, but how the fuck can we implement it if the backend is not ready and the library is just a bunch of empty interfaces at this point?)
I talked with one of the teamleads in the startup and told him that this is not the way things were being done here before and that the new guy is becoming a dictator.
Teamlead talked with new guy and found no issue. Basically newguy defended his sole decision by saying that he did research on his own, there are no libraries that do what we want and he knows better.
Teamlead tells me to STFU because new guy seems competent and he will be leading this feature. Basically from what I gathered teamlead doesn't give a single fuck and wants to delegate all project management to this new guy.
Day 5:
End of the week. New guy claims that his lib is done so we can start implementing properly. I tried implementing his lib but its fucked up and backend is still not ready.
Day 6:
Backend is still not ready, no one is doing anything just waiting for it to be ready.
Day 7 (Today):
Today: Backend is still not ready, no one is doing anything, just waiting for it to be ready.
So what can I say? His plan was probably to prove his self-worth and try to lead this feature by giving us information at the last minute, at the point where we should have started implementing instead of researching.
What happened? Motherfucker doesn't know shit about backend, has been notified about backend issues multiple times but his head was so deep up his ass with that new library of his that he delayed the rest of the team.
Result? 7 working days wasted. Out of 3 developers only 1 was actually working (and his fucked-up code will have to be rewritten anyway). Only 50% of the feature is done. Motherfucker tells me that this is how we will work in the future, "in parallel". The fuck is this, mate? If you had worked on this feature alone you would have finished it by now, but instead you wait until we remote devs log in and fetch the test input for you and talk with the backend guys for you? The fuck is wrong with you.
You fucking piece of shit, learn to plan and organize better if you want to lead the team. Right now all that you are doing is wasting time and money and getting on everybody's nerves. I'm tired of fucking spoon-feeding you every day, you needy, scheming, office-politics-playing piece of shit. Go back to your shithole country and let us work.
When I was responsible for sprint planning I figured out what to do before the start of the sprint, and the remote devs were able to do a week's work in 1-2 days and have the rest of the week off. This is how it's supposed to be when you work with a remote team. Delegate separate features to them, give them proper specs ahead of time, and everyone's happy. Don't start working on the frontend if you don't even fucking know when the backend will be ready. It's fucking common sense.
Now I need to spoon feed this motherfucker who can't even get information while sitting on his ass onsite in HQ. Fucking hell.8 -
Late night ramble warning.
I like to fix issues. I like to roll up my sleeves and fetch my keyboard or soldering iron on a mission to build a custom solution for whatever real world annoyance that has just triggered my problem solving caveman brain.
I have prided myself in that. I am the kind of guy who doesn't shy away from getting my hands dirty, I tell myself, and it's good because it makes my life easier, I tell myself. But increasingly, I've been wondering if this is really so. Am I really making my life easier? Am I fixing the world or just scratching an itch?
Example 1:
Instead of using conventional backup methods for my personal files like a commercial cloud based service or buying a Synology NAS or something similar, I decided it would be better to build my own linux server and set up a rather obscure configuration in order to address things like parity, ECC, bit-rot and the likes while staying cheap.
Learning a lot? Sure. Fun? Sure. Never have to worry about backups again? The opposite, of course.
While I set out to build the perfect bespoke solution to all my personal backup needs - it's as if I, by putting my time and effort into the nitty gritty of technical implementation, placed a vote for my future to contain more of that stuff. In reality this project has burdened my little brain with many new things to consider in regards to storing my files.
Example 2:
Qwerty and the conventional staggered keyboard layout are relics of past technical limitations, and both of them are inefficient and bad from an ergonomic perspective.
Possible solution: ignore and carry on or possibly transition to Colemak on a somewhat more ergonomic full size keyboard.
My solution: well, let's also hand build a tiny-ass super obscure ergo keyboard and spend two days to come up with my own layout for all special characters, numbers and function keys.
Fun? Somewhat. Learning a lot? I guess. Never have to think about keyboard layouts again? Lol.
I'm living in a world of pain with various key commands in various apps and edge cases. Could I fix it? Probably make it better but not without quite a bit of effort.
Anyways, it'd be interesting to hear if anyone can relate to this feeling of wanting to fix something once and for all, only to find yourself deeper in it than ever before. Idk, might be a just-me thing. Anyways, goodnight lovely people.
MS Teams:
*Notification*: Bob sent you a message: "Hello netikras. Got a minute for a call? Andrew told me <...>"
Open up Teams, open chat with Bob. Last message received -- 2 weeks ago. Chat summary on the left (chats list) shows the last message was 2 minutes ago with "Hello netikras. Got a...."
Prolly just javascript failed to fetch the full message to display it in the chat.. refresh the Teams tab
Same thing.
How the f*ck do I read the full message? WHAT DID ANDREW TELL BOB????
Fucking Micro$oft... Ironically, the only thing they're good at is hardware. Well okay and the office suite.
Can't even make a working chat app that doesn't lose messages or doesn't behave like a three-headed dragon with multiple personalities in each head.4 -
I think of git as if it was my dog. For added fun name your remote "slippers". Then you can exec "git fetch slippers"2
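If you want to try it at home (any remote URL will do):

    git remote add slippers git@github.com:you/yourrepo.git
    git fetch slippers
    # good boy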
-
It was the time of my grade 11 results.
When the results were out, I asked my school coordinator about the number of students who failed and he replied that it was none of my business. I got back home, coded a script in Python to fetch the results of 233 students and dump them to a text file. Printed it and gave it to the coordinator the next day; he was like "Ok, I'll tell the computer science teacher to give you full marks on practicals"4
FUCK EVERY PERSON IN THIS SHIT BANK!! FFS THE IDIOTS CREATED A NEW DATABASE USING SQL SERVER 2008! Yes, 2008, and it's a new database - if it were some type of legacy I could try to understand, but this shit is a completely new database. I have to use Sequelize and guess what? It can't paginate results because shit Server 2008 does not accept the OFFSET FETCH syntax3
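For reference, what Sequelize wants to emit versus what 2008 forces on you (sketch, table name invented):

    -- SQL Server 2012+: what Sequelize's limit/offset translates to
    SELECT * FROM accounts
    ORDER BY id
    OFFSET 20 ROWS FETCH NEXT 10 ROWS ONLY;

    -- SQL Server 2008: fake it with ROW_NUMBER()
    SELECT * FROM (
        SELECT *, ROW_NUMBER() OVER (ORDER BY id) AS rn FROM accounts
    ) AS numbered
    WHERE rn BETWEEN 21 AND 30;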
-
The more I'm on here the more I remember all the shit I have had to deal with in the past.
Anyway, let's rant! I just moved cities after college to be closer to my family. I didn't have any work lined up at that stage but started job hunting the moment I was settled in, and I did some freelance work for smaller companies to stay afloat.
Eventually I got a job at this agency startup where "SEO" was their main focus. Still very inexperienced, they put me on frontend and data capturing, with the promise they would teach me how to code using their systems in due time. At this stage I was getting paid minimum wage, but I was doing minimum work and it wasn't that bad.
A new investor bought 49% of the company and immediately moved into the office space to focus more on marketing (he was one of those scaly marketing guys who would sell you babies if he could get his hands on enough of them to make a profit).
This is where everything starts going to shit. He hires a bunch of "SEO Gurus" and fills up the small office with people squished together like sardines. Development was still our main money maker at this stage, so there were 3 new, more senior developers by then, and I started learning a lot really fast.
Here are some of the issues we had to deal with:
1. Incentives - Great, more money, haha! No, no, you were 5 minutes late so you only get half of the promised amount.
2. For every minute you are late we will deduct it from your paycheck (did I mention I was getting paid minimum wage?).
3. If you take a smoke break we will dock it from your pay.
4. Free gym membership to the gym downstairs, but you can only go once a week during your lunch.
5. No pay raises if you cant prove your worth on paper.
He purposely made up shitty rules and regulations to keep us down and make as much profit as he could.
Here are some shitty stuff he has done:
1. We aren't getting a 13th check this year because the company didn't make a big profit - while standing next to his brand new BMW.
2. Made changes over FTP on clients' work because we were too slow to get to it, then blamed me for it because it was broken the next day and wanted to give me a written warning for not resolving the issue immediately. They went as far as wanting to fire me for this, gave me 1 day's notice for the meeting and said I could bring a lawyer to represent me (1 day's notice is illegal, you need 5 days where I am from), so I brought a lawyer since my mom was a lawyer. They freaked the fuck out and started harassing me about this a week later.
3. Would have meetings all the time about how much money the company is making, but won't be raising our pay since no one has proven they are worth it yet.
4. Would full-on yell at employees in front of the entire office if they accidentally made a mistake on a client's project.
On one occasion I took a week off for holiday; a coworker contacted me to ask a question and I answered that I would handle it when I was back the following week. Within 2 hours my other boss phones me in a rage: "he is coming to fetch the company laptop from my house in 5 minutes, he will let me know when he arrives." Gives me no time to talk at all and hangs up. I had figured out what had happened by then, so when he showed up he had this long speech about abandonment, and trust and loyalty to the company. So I passed him my laptop once he shut up and said: "You do know I am on holiday leave which you approved, right?". He goes even more silent, passes me back my laptop without saying anything, and drives off.
While the above was happening, Douche Manager back at the office had a rage fit as well and called the whole office (25 people) to a meeting, talking about how I had abandoned the company and how disgraceful that is.
Those are the shitty experiences I can remember; there were many more like this. All of the above eventually led to me going into a deep depression and having panic attacks weekly, from being overworked or scared to step out of line. It's also the reason I almost stopped coding forever at that stage. I worked there for 2.5 years with the abuse.
I left 2 weeks after the last shit show, I am ok now and have my anxiety and depression well under control if not almost gone completely.
Ran into Douche Manager a few months ago, after 9 years; the company got bought out and the first person they fired was him. LOL! He now has his own agency and is looking for developers (they are hard to find, he says). Little does he know I spread his name far and wide to each and every dev I knew and didn't know, to avoid anyone working for him at all costs. Seems like word of mouth still works in this digital age.
Thanks for reading this far!5 -
Since my first post was a success, here's another shameless hack-- in this case, ripping a "closed" database I don't usually have access to and making a copy in MySQL for productivity purposes. That was at a former job as an IT guy at a hardware store, think Lowes/Rona.
We had an old SCO Unix server hosting Informix SQL (curious, has anyone here touched iSQL?), which has terminal-only forms for the users to handle data, and keybindings that are strangely vi-based (ESC commits changes. Mindfsck for the users!). Adding new price changes to our products is therefore a lengthy procedure inside a terminal form (with ascii borders!) with a few required fields. Sadly, only I and a colleague had access to price changes.
Introducing a manager who asks for a price change for a brand - not a single product, but the whole product line of a brand we sell. Oh, and those price changes end after the weekend (twice the work, back to regular price!)
The usual process is that they send me a price change request Excel document with all the item codes along with the new prices. However, being non-technical, those managers write EVERYTHING by hand, cell by cell (code, product name, cost, new price, etc), sometimes just copy-pasted from a terminal window.
So when the manager asked me to change all those prices, I thought "That's the last time I manually enter all of this sh!t - and so does he". Since I already had a MySQL copy of the items & actual (live) price tables, I wrote a PHP backend to provide a basic API to be consumed by a now VBA-enhanced Excel sheet.
This VBA Excel sheet had additional options like calculating a new price based on user-provided choices ("Lower price by x $ or x %, but stay above cost by x $ or x %"), so the user could simply write item codes back to back and the VBA sheet would automatically fetch & display all relevant info, and calculate a new price if it's a 20% price cut for example.
So when the managers started using that VBA sheet, I had also hidden a button which simply generated all the SQL inserts for the prices written in the form, including a "back to regular price" statement if the user specified an end date, etc.
No more manual form entry for me, no more keyboard pecking for the managers with new prices calculated for them. It was a win/win :)1 -
I like my work colleague, great to talk to, bless his heart, however he has this mysterious power to make whatever he writes sound super passive aggressive or aloof. For instance:
Me: this request - to clarify, you want me to do this, as per this document
Him: is that what you presume?
Me: 🤨2 -
Company has a severe lack of fresh blood.
"let's recruit everyone who has an IQ over room temperature and barely passes the mark".
Me protesting bloody murder cause I know that the idea is not just profoundly dumb, but frustration from high staff turnover takes a toll on *everyone*.
"nah can't be that bad".
Then the discussion started who could do monitoring and mentoring, so we can sort out the bad apples *quickly*.
Me reminding again that this is exactly what leads to a high staff turnover, as this is nothing else than "hire, hire - quickly fire".
Guess who won the award of being the mentor / monitor ....
*drum roll*
Come on, I know you would NEVER expect this.
Let me surprise you: M E.
Yeah. They chose the person that was absolutely against this idea...
Because that person is "most qualified for the task at hand and has the necessary qualifications".
Today was the first 4 h workshop with a new recruit.
The Lord has had zero mercy on me.
I started to mute myself after 30 minutes in regular intervals to just scream and curse the world.
How profoundly dumb a person can be amazes me.
The person had a "very expensive 6-month boot camp course".
I was close to asking if the boot camp course was in watching porn and wanking their brain cells out....
Git... Yeah he knew what he was doing...
Except that he messed up every commit by either not sticking to the company's format or - what I found funny the first 2 times, then not so much anymore - just writing a git commit message like a 15-year-old girl would write in her diary.
Programming. Oh yeah. He should be a programmer.
He had much Bootcamp.
Bootcamp expensive. Bootcamp good.
If someone is unable to iterate over an iterator... and instead starts creating an integer-indexed array of a map's key names, to then fetch the map values in a for loop based on that key array.
Yeah. Bootcamp much good.
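For the non-Java folks, roughly what that looks like versus what it should be (reconstructed from memory, not his literal code; assume prices is a Map<String, Double>):

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;

    // The bootcamp special: build a key array, loop over indices, look every value up again.
    List<String> keys = new ArrayList<>(prices.keySet());
    for (int i = 0; i < keys.size(); i++) {
        String key = keys.get(i);
        System.out.println(key + " -> " + prices.get(key));
    }

    // Iterating the map directly:
    for (Map.Entry<String, Double> entry : prices.entrySet()) {
        System.out.println(entry.getKey() + " -> " + entry.getValue());
    }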
Creating DTOs...
It took an hour to write a DTO with him... Cause constructors are hard and it's even harder when you have to explain primitive datatypes in Java, null safety, constructors, NPEs, final, ...
Like really no experience at all.
The next weeks will be amazing.
Either I get a valium drop or I'm gonna blow my head off, cause mentoring will drain the last bit of hope I had left in me.
Note that I do not blame the recruit (yeah he's dumb. But he has ZERO work experience, so it's not unexpected), I'm just too fed up with getting the poo crown despite being against the whole process.
I think the recruit could make it..........
But that I got the shittiest job ever is really haunting me.
I dunno how I survive the next weeks.
And this is just the first recruit... There will be more.2 -
So recently I installed Windows 7 on my thiccpad to get Hyperdimension Neptunia to run (yes 50GB wasted just to run a game)... And boy did I love the experience.
ThinkPads are business hardware, remember that. And it's been booting Debian rock solid since.. pretty much forever. There are no hardware issues here. Just saying.
With that out of the way I flashed Windows 7 Ultimate on a USB stick and attempted to boot it... Oh yay, first hurdle to overcome. It can't boot in UEFI mode. Move on Debian, you too shall boot in BIOS mode now! But okay, whatever right. So I set it to BIOS mode and shuffled Debian's partitions around a bit to be left with 3 partitions where Windows could stick in one more.
Installed, it asks for activation. Now my ThinkPad comes with a Windows 7 Pro license key, so fuck it let's just use that and Windows will be able to disable the features that are only available for Ultimate users, right? How convenient would that be, to have one ISO for all the half a dozen editions that each Windows release has? And have the system just disable (or since we're in the installer anyway, not install them in the first place) features depending on what key you used? Haha no, this is Microsoft! Developers developers developers DEVELOPERS!!! Oh and Zune, if anyone remembers that clusterfuck. Crackhead Microsoft.
But okay whatever, no activation then and I'll just fetch Windows Loader from my webserver afterwards to keygen my way through. Too bad you didn't accept that key Microsoft! Wouldn't that have been nice.
So finally booted into the installed system now, and behold finally we find something nice! Apparently Windows 7 Enterprise and Ultimate offer a native NFS driver. That's awesome! That way I don't have to adjust my file server at all. Just some fuckery with registry keys to get the UID and GID correct, but I'll forgive it for that. It's not exactly "native" to Windows after all. The fact that it even has a built-in driver for it is something I found pretty neat already.
Fast-forward a few hours and it's time to Re Boot.. drivers from Lenovo that required reboots and whatnot. Fire the system back up, and lo and behold, the network drive doesn't mount anymore. I've read that this is apparently due to Windows (not always but often) mounting the network drive before the network comes up. Absolutely brilliant! Move out shitstaind, have you seen this beauty of an init, Mr. Poet?
But fuck it we can mount that manually after every single boot.. you know, convenient like that. C O P E.
With it now manually mounted, let's watch a movie! I've recently seen Pyro's review on The Platform and I absolutely loved it. The movie itself is quite good too. Open the directory on my file server and.. oh. Windows.. you just put db.thumb on it and db.thumb:encryptable. I shit you not, with the colon and everything. I thought that file names couldn't contain colons Windows! I thought that was illegal in NTFS. Why you doing this in NFS mate? And "encryptable", am I already infected with ransomware??? If it wasn't for the fact that that could also be disabled with something as easy as a registry key, I would've thought I contracted ransomware!
Oh and sound to go with that video, let's pair up some Bluetooth headphones with that Bluetooth driver I installed earlier! Except.. haha nope. Apparently you don't get that either.
Right so let's just navigate the system in its Aero glory... Gonna need to flick the mouse for that. Except it's excruciatingly slow, even the fastest speed is slower than what I'm used to on Linux.. and it's jerky as hell (Linux doesn't have any of that at higher speed). But hey it can compensate for that! Except that slows down the mouse even more. And occasionally the mouse driver gets fucked up too. Wanna scroll on Telegram messages in a chat where you're admin? Well fuck you mate, let me select all these messages for you and auto scroll at supersonic speeds! And God forbid that you press delete with that admin access of yours. Oh maybe I'll do it for you, helpful OS I am!
And the most saddening part of it all? I'd argue that Windows 7 is the best operating system that Microsoft ever released. Yeah. That's the best they could come up with. But at least it plays le games!10 -
PM asked me to develop an application to fetch data from the customer's DB, which would require an access security token provided by the customer. To get the token, I would have to travel to Germany (I live in Portugal) to get it personally (it's not possible to have someone else pick it up for me).
It turns out the security token is a completely closed environment, with its own OS, without the possibility of installing any application or communicating with the exterior. The laptop itself would boot from the token's OS.
It was concluded I would have to hack the security token, which is completely non-compliant. So the PM decided not to go forward with it.
But now, I have to go to Germany anyway to pick up the security tokens, because they forgot to order them for these other guys who will be using them to access the customer's DB manually, and they don't want to delay the project anymore.
Oh, and the security tokens cost the project 500€/month each...3 -
I have multiple contenders ;)
Contender one:
A program used to sort emails.
We were in the process of moving from Lotus Notes to Exchange and needed a way to route emails to the right server internally.
Solution: a qmail server to receive all emails, and a script run by cron every minute to read the emails, check the recipient name against a list and resend to the right server. The script was written in PHP :P since that was the only way we had at the time to read an email into an object; it was run just like any other shell script :D
Contender two:
A multi-threaded mail sender that fetched email addresses and content from a database and posted them through qmail, using background execution and pipes to get the result back and then update the database, written as a bash script.
Contender three:
A C program used in a similar way as contender one, but this time using dial-up and UUCP to fetch email and then drop it either into Lotus Notes or into a BBS for our customers, to give them an email address. This was around 1993, so not too many ISPs offered email and not too many people had internet anyway; dial-up BBSes were much more common.5
It seems like I'm going on an assignment to a company working with Angular. Reading through the documentation I just want to ask all Java developers to get their greasy hands out of JavaScript. It feels like GWT all over again with Google reinventing core JS technologies just so that it looks like Java. Dependency injections? Observable wrappers? RxJS in general? WHAT IS THE POINT? Why can't I do this in a way adhering to web standards? Why can't I simply use fetch() or axios or whatever? Why can't you support reactivity without forcing me to write more boilerplate than I had on my central heating boiler? I just want to code and not be forced to discover what Google developers think web should be like.
Please, let me out of this hell.
Fortunately, it's not gonna be a long assignment.3 -
Well, I really have nothing to rant about these days 😅
What I do have is a request for feedback on a project's video. I'm working on the project and will release it open source once it works decently well and, most of all, when the code doesn't look so atrociously bad 😅😅
It is basically a C/C++ package and project manager.
It can create, build, and run projects which you make, and add library flags and include directories automatically if that library exists in its package list.
It also contains a package (= library) manager which can, as of now, install, uninstall, and fetch info on any package, should it exist in the package list.
I will be adding package upgrading in the future, although package list updates can be performed.
Also, right now it can only build binary projects. I’ll soon be working to enable creation of library ( static/dynamic ) projects as well.
Finally, it allows for building of packages using CMake or configure, but uses a custom format to build projects.
Here is a video of building a project and installing libcurl on system:
https://asciinema.org/a/155030
Thanks a lot ☺️😊1 -
Why the fuck is my git slower than a lobotomized snail when running a fetch operation on like 30 remotes?!?!9
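Partial answer to my own question, in case anyone else lands here: by default git walks the remotes one after another; newer versions (2.24+, if I remember correctly) can fetch them in parallel.

    git fetch --all --jobs=8        # fetch up to 8 remotes at once
    git config fetch.parallel 8     # or make that the default for this repo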
-
So, I've had a personal project going for a couple of years now. It's one of those "I think this could be the billion-dollar idea" things. But I suffer from the typical "it's not PERFECT, so let's start again!" mentality, and the "hmm, I'm not sure I like that technology choice, so let's start again!" mentality.
Or, at least, I DID until 3-4 months ago.
I made the decision that I was going to charge ahead with it even if I started having second thoughts along the way. But, at the same time, I made the decision that I was going to rely on as little external technology as possible. Simplicity was going to be the key guiding light and if I couldn't truly justify bringing a given technology into the mix, it'd stay out.
That means that when I built the front end, I would go with plain HTML/CSS/JS... you know, just like I did 20+ years ago... and when I built the back end, I'd minimize the libraries I used as much as possible (though I allowed myself a bit more flexibility on the back end because that seems to be where there's less issues generally). Similarly, any choice I made I wanted to have little to no additional tooling required.
So, given this is a webapp with a Node back-end, I had some decisions to make.
On the back end, I decided to go with Express. Previously, I had written all the server code myself from "first principles", so I effectively built my own version of Express in other words. And you know what? It worked fine! It wasn't particularly hard, the code wasn't especially bad, and it worked. So, I considered re-using that code from the previous iteration, but I ultimately decided that Express brings enough value - more specifically all the middleware available for it - to justify going with it. I also stuck with NeDB for my data storage needs since that was aces all along (though I did switch to nedb-promises instead of writing my own async/await wrapper around it as I had previously done).
What I DIDN'T do though is go with TypeScript. In previous versions, I had. And, hey, it worked fine. TS of course brings some value, but having to have a compile step in it goes against my "as little additional tooling as possible" mantra, and the value it brings I find to be dubious when there's just one developer. As it stands, my "tooling" amounts to a few very simple JS scripts run with NPM. It's very simple, and that was my big goal: simplicity.
On the front end, I of course had to choose a framework first. React is fine, Angular is horrid, Vue, Svelte, others are okay. But I didn't want to bother with any of that because I dislike the level of abstraction they bring. But I also didn't want to be building my own widget library. I've done that before and it takes a lot of time and effort to do it well. So, after looking at many different options, I settled on Webix. I'm a fan of that library because it has a JS-centric approach. There's no JSX-like intermediate format, no build step involved, it's just straight, simple JS, and it's powerful and looks pretty good. Perfect for my needs. For one specific capability I did allow myself to bring in AnimeJS and ThreeJS. That's it though, no other dependencies (well, at first, I was using Axios because it was comfortable, but I've since migrated to plain old fetch). And no Webpack, no bundling at all, in fact. I dynamically load resources, which effectively is code-splitting, and I have some NPM scripts to do minification for a production build, but otherwise the code that runs in the browser is what I actually wrote, unlike using a framework.
So, what's the point of this whole rant?
The point is that I've made more progress in these last few months than I did the previous several years, and the experience has been SO much better!
All the tools and dependencies we tend to use these days, by and large, I think get in the way. Oh, to be sure, they have their own benefits, I'm not denying that... but I'm not at all convinced those benefits outweigh the time lost configuring this tool or that, fixing breakages caused by dependency updates, dealing with obtuse errors spit out by code I didn't write, going from the code in the browser to the actual source code to get anywhere when debugging, parsing crappy documentation, and just generally having the project be so much more complex and difficult to reason about. It's cognitive overload.
I've been doing this professionaly for a LONG time, I've seen so many fads come and go. The one thing I think we've lost along the way is the idea that simplicity leads to the best outcomes, and simplicity doesn't automatically mean you write less code, doesn't mean you cede responsibility for various things to third parties. Those things aren't automatically bad, but they CAN be, and I think more than we realize. We get wrapped up in "what everyone else is doing", we don't stop to question the "best practices", we just blindly follow.
I'm done with that, and my project is better for it! -
debugging a performance issue. basically the original dev had no idea what a database was for. system was generating millions of buffered reads, and paging horribly.
to see if an id existed they had done the following:
fetch all 1 million rows into an array.
iterate the array looking for a match.
if found, set found=true
continue to iterate the rest of the array.
return found.
repeat at every login.
replaced with
return person.get(id)
set world record for most i/o avoided with one liner.1 -
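for the curious, the difference boiled down to roughly this (an illustrative sketch only — the original wasn't JavaScript and the db/ORM calls here are made up):
// original: pull the whole table over the wire, scan it by hand
async function idExistsSlow(id) {
  const rows = await db.query("SELECT * FROM person"); // ~1 million rows, hypothetical API
  let found = false;
  for (const row of rows) {
    if (row.id === id) found = true; // and keep looping anyway
  }
  return found;
}

// replacement: let the database and its primary-key index do the work
async function idExists(id) {
  return (await person.get(id)) != null;
}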
How do I make my manager understand that something isn’t doable no matter how much effort, time and perseverance are put into it?
———context———
I’ve been tasked with optimizing a process that goes through a list of sites using the API that manages said sites. The main bottleneck of the process is the requests made to the API. I went as far as making multiple accounts to have multiple tokens fetch the data, balancing the load across the different accounts, making requests in parallel, and making dedicated sub-processes for each chunk. All of this doesn’t even help that much considering we end up getting rate limited anyway. As for the maintainer of the API, it’s a straight no-can-do if we ask them to relax the rate limit for us.
Essentially I did everything you could possibly do to optimize the process and yet… That’s not enough, it doesn’t fit the 2 days max process time spec that was given to me. So I decided I would tell them that the specs wouldn’t match what’s possible but they insist on 2 days.
I’ve even proposed a valid alternative, but they don’t like it either; admittedly it’s not the best, as it’s marked as “deprecated”, but it would allow us to process data in real time instead of iterating over each site.3
// Stupid JSON
// Tale of back-end ember api from hell
// Background: I'm an android dev attempting to integrate with an emberjs / rails back-end
slack conversation:
me 3:51pm: @backend-dev: Is there something off in the documentation for the update call on model x? I formed the payload per the docs like so
{
"valueA": true,
"valueB": false
}
and the call returns success 200 but the data isn't being updated when fetching again.
----------------------------------------------------------------------------------------
backend-dev 4:00pm: the model doesn't look updated for the user are you sure you made the call?
----------------------------------------------------------------------------------------
me 4:01pm: Pretty sure here's my payload and a screen grab of the successful request in postman <screenshot attached>
----------------------------------------------------------------------------------------
backend-dev 4:05pm: well i just created a new user on the website and it worked perfectly your code must be wrong
----------------------------------------------------------------------------------------
me 4:07pm: i can test some more to see if i get any different responses
----------------------------------------------------------------------------------------
backend-dev 4:15pm: ahhhhhh... I think it's expecting the string "true", not true
----------------------------------------------------------------------------------------
me 4:16: but the fetch call returns the json value as a boolean true/false
----------------------------------------------------------------------------------------
backend-dev 4:18pm: thats a feature, the flexible type system allows us to handle all sorts of data transformations. android must be limited and wonky.
----------------------------------------------------------------------------------------
me 4:19pm: java is a statically typed language....
// crickets for ten minutes
me 4:30pm: i'll just write a transform on the model when i send an update call to perform toString() on the boolean values
----------------------------------------------------------------------------------------
backend-dev 4:35: great! told you it wasn't my documentation!
// face palm forever4 -
I'm thinking of making a project that will help me during the upcoming school year. Pretty much an assistant that can fetch quick answers from google and a few other basic tasks. What projects have you guys done with your work? P.S. I might want one or two people to also contribute so comment down below if you'd like to help (won't be open source)6
-
I don't think I've been more excited about anything than my very first application.
This was when I was just starting out with programming. I had chosen C++ as my first programming language.
A friend and I wrote a simple application to fetch XML data from a sports data server, parse it and display our favourite football (soccer, for Americans) team's fixtures and results on a GUI.
We called it Sportify. -
Fucking asynchronous code can be very annoying sometimes. I had to struggle the whole day yesterday to get it to work properly so that I could fetch all the results from a function which makes many asynchronous calls.
Then I realized that I just needed two more counter variables to increment and later decrement to detect when all of them have finished.
That shit..1 -
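For the record, the counter trick amounts to something like this (a generic sketch — urls, handleResult and allDone are made up; in most cases Promise.all does the same job with less ceremony):
let pending = 0;
function start() { pending++; }
function finish() {
  pending--;
  if (pending === 0) allDone(); // fires once every async call has completed
}

urls.forEach((url) => {
  start();
  fetch(url)
    .then((res) => res.json())
    .then(handleResult)
    .finally(finish);
});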
To fetch 100 users at once, I used the JPA Hibernate findAll() method. Simple fucking enough. Then I realized this shit is slow. It takes a while to fetch, and 100 records ain't even a lot!
This shit needs over 265 ms to fetch 100 users
About 75 ms to fetch 20 users
That shits terrible!
Then i wrote a custom JDBC class with custom SQL queries to fetch the exact same shit.
Now it fetched 100 users in 7 ms — about 37x faster
I haven't even optimized indexing or anything. I just avoided using JPA Hibernate
Someone explain this to me8 -
I wrote a program a few years ago which needed to fetch data from a website. Instead of using the API, I used a html parser and extracted the data myself.
The code still runs flawlessly. -
Cleaning up after yourself if so damn important. Even your git repos:
# always be pruning
git fetch -p
# delete your merged branches
git branch --merged | grep -v master | xargs -n 1 git branch -d
# purge remote branches that are merged
git branch -r --merged | grep -v master | sed 's/origin\///' | xargs -n 1 git push --delete origin -
Is there an api to fetch avatars?
@dfox
I emailed you a long time ago, but I'm not getting any response.7
Every C# job I see is like "must also know .NET" — my question is, how the hell do you avoid learning .NET when studying C#!?!8
-
I am seeing a lot of people getting really infuriated about Windows 10 updates.
And I am just sitting here on win7 with disabled updates.
Waiting for some free time to fetch my new computer with Ryzen, throw some Linux distro on it and virtualize the shit out of Windows for gaming only.5
I had a small .NET PoC project I wanted to upload to our git server. So I added the project to version control using Visual Studio, meaning that VS created a local repository for it. Then I wanted to push it to the remote repository, which was created by my colleague. This one was initialized with a commit (.gitignore and Readme.md), so I couldn't push directly. Googled a bit, OK, then tried to fetch the remote repo — didn't help. Googled again, tried some "git push origin master whatever" stuff and then a rebase, because nothing seemed to help.
OOPS where are my local files? WTF? 😣
Long story short: Experience in other version control systems is not enough — it can even be dangerous — when switching over to git. 😂4
Share your own useful terminal aliases here:
- grm=git rebase origin master
- gforbm=git fetch origin && git rebase origin master
- vm=vim Makefile
- idea=vim ~/repos/ideas/README.md (where I store all my programming ideas)14 -
Have you worked with GraphQL? Saw it for the first time and it looks like a pretty good tool for working with APIs, to me. I am excited to read your opinions about GraphQL.23
-
wtf...
One of the kinds of bugs I love the "most" is when the fix is counter-intuitive, e.g. making something seemingly incorrect to rectify the issue.
Like today: I crafted an SQL query to fetch some PG metrics, and postgres-exporter refused to accept it until I added a superfluous comma [,] at the end of the SELECT block (right before FROM).
Like.. wtf...12 -
Me: Ok I've updated the docs, I'll open a PR with the changes
Maintainer: Looks great! Can you remove the changes to the package-lock.json? (I assume it got updated when you ran npm install to start the webserver)
Me: Ok sure, I'll update it soon
And this is where the troubles begin. The file was committed 2 commits ago, so I have to roll back to then. However, the remote repository has been updated since then, so I git fetch to keep up to date.
This makes the rollback a hell of a lot harder, so I run git log to see the history. I try a reset, but I went back to the wrong commit, and now a shit ton of files are out of sync.
I frantically google 'reset a git reset', and come across the reflog command. Running that fucks things up even worse, and now so much shit is out of sync that even git seems confused.
I try to fix the mess I've created, and so I git pull from my forked repo to get myself back to where I was. Git starts screaming at me about out of sync files, so I try to find a way to overwrite local changes from the origin.
And by this point, the only way to describe what the local repo looks like is a dumpster fire clusterfuck that was involved in a train wreck
I resolved the mess by just deleting the local copy and git cloning again from my fork.
I gotta learn how to use Git better5 -
My biggest dev epiphany was also my dumbest one. We were working on a payment system for a roadside rescue company where an employee would register payments "in the field".
The challenge was automating input with typeahead and autocompletes in order to lessen the workload as manual input had to be an absolute minimum; this will be used by truck drivers/mechanics as they are trying to hurry to the next customer who has been waiting for 3 hours longer than we said we'd take.
We managed to make the invoice path first (customer has not paid, employee logs personalia needed for billing), but when it came to "paid on site" we almost upended the entire system trying to find a way to fetch user personalia outside of the invoice path.
Neither of us realized it during the days we were banging our heads against it. Realizing we don't need to make an invoice for a job that has been paid for was equal parts relief and utter embarrassment.
Probably my greatest lesson in how important it is to pull my head out of the code once in a while, and to ask myself what I'm trying to do and why. -
I need some opinions on Rx and MVVM. Its being done in iOS, but I think its fairly general programming question.
The small team I joined is using Rx (I've never used it before) and I'm trying to learn and catch up to them. Looking at the code, I think there are thousands of lines of over-engineered code that could be done so much simpler. From a non Rx point of view, I think we are following some bad practises, from an Rx point of view the guys are saying this is what Rx needs to be. I'm trying to discuss this with them, but they are shooting me down saying I just don't know enough about Rx. Maybe thats true, maybe I just don't get it, but they aren't exactly explaining it, just telling me i'm wrong and they are right. I need another set of eyes on this to see if it is just me.
One of the main points is that there are many places where network errors shouldn't complete the observable (i.e. can't call onError); I understand this concept. I read a response from the RxSwift maintainers that said the way to handle this was to wrap your response type in a class with a generic type (e.g. Result<T>) that contains a property to denote success or error and maybe an error message. This way errors (such as an incorrect password) won't cause it to complete, everything goes through onNext and users can retry / go again. Makes sense.
The guys are saying that this breaks Rx principals and MVVM. Instead we need separate observables for every type of response. So we have viewModels that contain:
- isSuccessObservable
- isErrorObservable
- isLoadingObservable
- isRefreshingObservable
- etc. (some have close to 10 different observables)
To me this is overkill: so many streams that frequently only ever deliver 1 message, or none at all. I would have aimed for 1 observable that returns an object holding properties for each of these things and sends several messages. Is that not what streams are supposed to do? Then the local code can use filters as part of the subscriptions. The major benefit of having 1 is that it becomes easier to make it generic and abstract away, which brings us to point 2.
Currently, due to each viewModel having different numbers of observables and methods with different names (but effectively doing the same thing), the guys create a new custom protocol (the equivalent of a Java interface) for each viewModel with its N observables. The viewModel creates local variables of PublishSubject, BehaviorSubject, Driver etc. Then it implements the protocol / interface and casts all the locals back as observables. e.g.
protocol CarViewModelType {
    var isSuccessObservable: Observable<Car> { get }
    var isErrorObservable: Observable<String> { get }
    var isLoadingObservable: Observable<Void> { get }
}

class CarViewModel {
    let isSuccessSubject = PublishSubject<Car>()
    let isErrorSubject = PublishSubject<String>()
    let isLoadingSubject = PublishSubject<Void>()
    // other stuff
}

extension CarViewModel: CarViewModelType {
    var isSuccessObservable: Observable<Car> { return isSuccessSubject.asObservable() }
    var isErrorObservable: Observable<String> { return isErrorSubject.asObservable() }
    var isLoadingObservable: Observable<Void> { return isLoadingSubject.asObservable() }
}
This has to be created by hand for every viewModel, of which there is one for every screen — and there are 40+ screens. This same structure is copy / pasted into every viewModel. As mentioned above, I would like to make this all generic: have a generic protocol for all viewModels to define 1 observable and 1 local variable of a generic type, and handle the cast back automatically. The method to trigger all the business logic could also have its name standardised ("load", "fetch", "processData" etc.). Maybe we could also figure out a few other bits too. This would remove a lot of code, as well as making the code more readable (less messy), and make unit testing much easier. While it could never do everything automatically, we could test the basic responses of each viewModel and have at least some testing done by default, and not have everything be very boilerplate-y and copy / paste in nature.
The guys think that subscribing to isSuccess and / or isError is perfect Rx + MVVM. But for some reason subscribing to status.filter(success) or status.filter(!success) is a sin of unimaginable proportions. Also the idea of multiple buttons and events all "reacting" to the same method named e.g. "load", is bad Rx (why if they all need to do the same thing?)
My thoughts on this are:
- To me it's identical in meaning and architecture; one way is just significantly less code.
- Let's say I agree it's not textbook — is it not worth bending the rules to reduce code?
- We are already breaking the rules of MVVM to introduce coordinators (which I hate, as they are adding even more unnecessary code), so why is breaking it to reduce code such a no no.
Any thoughts on the above? Am I way off the mark or is this classic Rx?16 -
So I made an update to my React Native app. I changed UI of a couple of screen, added a few animations here and there, refactored how my graphQL resolvers work in the backend(no breaking changes), changed how data gets loaded into the database etc.
It worked in dev so I figured hey let's deploy it. Today is(was because it's now 3am but more on that later) a national holiday so no one goes to work so no one will use my app so I have an entire day to deploy.
I started at 15:00(because i woke up at 13:00 lol). I tested the update once again in dev and proceeded to deploy it to prod. I merged backend to master, built docker images, did migrations on the db, restarted docker-compose with new images. And now for the app. I run ./gradlew assembleRelease and it starts complaining that react-native-gesture-handler is not installed. Ugh, rm -rf node_modules && yarn install. It worked. But now gradlew crashes and logs don't tell me anything. Google tells me to change a bunch of gradle settings but none of them work. Fast forward 5h, it's around 20:00 and I isolated the issue to, again, react-native-gesture-handler. They updated from 2.2.4 to 2.3.0 which didn't fucking compile. 2 more hours passed (now 22:00) and I got v2.3.1 working which fixed the problem in 2.3.0 but made my app crash on startup. YOUR FUCKING LIBRARY GETS 250K WEEKLY DOWNLOADS AND YOU DONT EVEN BOTHER CHECKING IF IT COMPILES IN PROD ON ANDROID?! WHAT THE FUCK software-mansion?
After I solved that, my app didn't crash. Now it threw a "TypeError: Network Request Failed" every time I fetched my legacy REST API (older parts use REST and newer use GraphQL; I'll refactor that in the next update). I'll spare you the debugging hell I went through, but another 5h passed. It's 3am. My config had the prod URL misspelled but the dev one fine... I hate myself and even more so react-native-gesture-handler.3
Sitting in my beautiful chalet on this beautiful park. My garden nicely mowed. Sun in shining. Was in the zone the whole day. Now it's time to shift some gears and fetch some desperado's (will pass out after three probably, didn't drink for long time). Walk towards neighbor and drink together.
Life can be so beautiful14 -
Khmm Intel... A paper from 1995 describes speculative execution as: "Prefetching may fetch otherwise inaccessible instructions in Virtual 8086 mode." — which means Intel has known about the 'recent' exploits for a good 23 years. Why didn't they fix it? Who knows.
https://t.co/KRMCEAfZgX2 -
Fack I hate this CMS we have. Using Fetch and sending data as encoded JSON doesn't get parsed nor recognized by the CMS. I was pulling my hair out trying to understand the differences, switched back to XHR... and then noticed the bugger:
XHR sends data as "FormData()".
I switched everything to FormData and TADAAA IT WORKS MOTHERFUCKA
Lost a whole fucking day on this trying to understand whether the problem was my code or the CMS.
Ah yes, of course, I had to reproduce the CMS on my machine, instead of having the developers lend us a "development" platform for it, so obviously I don't know what's happening under the hood :))))))))2
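In case it saves someone else a day, the working version amounts to this (endpoint and field names are made up):
const body = new FormData();
body.append("title", "Hello");
body.append("status", "published");

fetch("/cms/api/articles", {
  method: "POST",
  body, // NOTE: no Content-Type header — the browser sets the multipart boundary itself
});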
I'm still using .then().catch() instead of the async await.
So, first of all, fuck you for calling it a STANDARD now. It's nowhere near being a standard. You wanna get some data from an API? Wanna call it using axios or fetch? What if the server is down? What if there's an error that you don't even know existed? Where do I get that kind of error in async await? try-catch? No thanks :| I'm good -_-8
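For context, the two styles I'm comparing look like this (URL and helpers are made up; note that in BOTH cases fetch only rejects on network failures — a 500 from the server still needs an explicit res.ok check):
// .then().catch()
fetch("/api/things")
  .then((res) => {
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return res.json();
  })
  .then(render)      // hypothetical
  .catch(showError); // hypothetical

// async/await
async function load() {
  try {
    const res = await fetch("/api/things");
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    render(await res.json());
  } catch (err) {
    showError(err);
  }
}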
Mine actually don't know shit about what I do but they don't try to describe it either. One day I told my father that as a personal project I was making a little script to fetch song lyrics and assign them to the music files I have in my library. Well he was all surprised like "Wow, and you can do that sort of thing?". Well, yeah, what do you think I've been studying during all those years? 😁
-
Damn @&$#%£¥ hibernate. Why is ur API so $£¥%#*^ inconsistent and buggy????
@Index on a mapped superclass, anyone? @Index causing duplicate tables on HSQLDB?? REALLY????
Why does it work on mysql?
setMaxResults and fetch joins cause ALL rows to be fetched, instead of properly generating a „limited subselect“? WHYYYYYY, JUST WHY????
tl;dr - i think? i have a short attention span
Just started ricing up i3 yesterday! i'll post screenshots and maybe start a git repo or something for my Arch/other customizations.
Any tips on getting polybar to be transparent? i wanna get your guys' help before i go digging through subreddits and StackOverflow 😂
so far i have:
- installed compton and tossed it into i3
- tossed the initial (example) polybar in i3
- tossed in an xrandr script because i needed it
- tossed in a feh to fetch a desktop background (it's pretty, i'll link below)
- I've got it bad guys - i tried to mod+enter to open a terminal...
on my windows computer at work. 😬
Pretty desktop wallpaper: https://wykop.pl/cdn/c3201142/...
(THIS WAS A PAIN IN THE ASS TO DIG UP ONLINE [ had to remember the last half of the filename to get an image search where it popped up ] - remind me to save a link to ALL the wallpapers i find)1 -
Why the fuck does nobody talk about multi-page apps?! We went from a Web where everything was multi-page and server-rendered, and now everything for Web developers is "single-page apps".
What about websites who can't do that? Not everything can be a single-page app. Only my uncle's restaurant website, or something which is TRULY a full app. No half choices.
If your website is a multi-page app/portal which actually PRELOADS data, instead of doing 100 fetches to an API within a page that is full of loading bars, well, your life is a pain.
When you want a first contentful paint which isn't a white page, well, your life is a pain.
What are React, Vue, Ember, Angular (let's exclude Svelte and Marko) going to do about Multi-page apps and SSR?
React-router sucks to me. Its performance is weak and it's useful only when you have an SPA with multiple sections which can be treated as pages (e.g. a single SPA divided into tabs).
Server-side rendering is the worst pain ever made by humanity, in React (and prob Vue, I didn't try but I can bet). And even when made easier from libs like Svelte and Marko, I (personally) can't get it to be faster enough compared to a traditional website without a JS framework and with a templating engine.
Anyways, if there's anything that I learnt from React, it's to stay away from Next.js. Perfect, beautiful, mess.
All JS frameworks just seem to bloat the code and make it worse and slower, even though they're REALLY helpful.
Why? Why does everyone love them if their downsides are so clear? Why are 3 projects out of 3 I made (1 React SSR, 1 Vue, 1 Marko SSR) painfully slow and bloated, full of shit — and will stay that way — even if in 2020 we should have evolved with the famous tree shaking, with the famous lazy loading, etc.?
I am just frustrated.
And let's not even talk about Webpack, Rollup, Lasso — that module bundler shit which is harder to configure and understand than finding a needle in a haystack.
Lasso was the easiest to configure but I still can't understand it. Webpack seems like it was made to handle SPAs, like every tool in this freaking world, without even considering an easy way to integrate multiple bundles for multiple pages (I know it's pretty easy, but with component sharing between pages and big unique bundles, Next.js handles it soooo badly it feels like hell).
Am I the only one?
Sorry for the long rant. I just needed to rant right now.17 -
Html imports. Polyfill. Hey, reading about this, it's awesome. <link rel=“import” href=“control.html”> — what could be simpler? Deprecated on the front end, but I only need it for developing. Will combine the files at the end.
Estimate converting php to pure html, couple of days.
Go to use it with the polyfill (webcomponents.js HTMLImports). Doesn't work.
Try the light components. Doesn’t work.
Try server-side polymer. Doesn’t work.
So much documentation about it working. Then I finally come across shadow DOM and how HTML imports are associated with it.
Hell no they aren’t. They are completely different things. Oh. Google packaged them together? No one could agree on shadow dom, and its now going away? Taking the pure html way of combining files into one page with it???
Spent an entire freaking day. But got 8 lines of code and <link rel=“fetch” ...> to do the same thing.
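Roughly what those 8 lines boil down to (a sketch of the idea, not the exact code):
document.querySelectorAll('link[rel="fetch"]').forEach(async (link) => {
  const res = await fetch(link.getAttribute("href"));
  const html = await res.text();
  link.insertAdjacentHTML("afterend", html); // drop the partial into the page
  link.remove();
});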
Why hasn’t this been an HTML standard for, say... years? Why can’t the server do a handshake with the client and serve one page (php-ish) if the feature isn’t supported, and multiple files async otherwise? I mean, this is a fundamental part of PWAs.
Why are these obviously smart people so stupid??? Deploying your shitty shadow DOM without this obviously useful feature...2
Sharing a first look at a prototype Web Components library I am working on for "fun"
TL;DR left side is pivot (grouped) table, right side is declarative code for it (Everything except the custom formatting is done declaratively, but has the option to be imperative as well).
====
TL;DR (Too long, did read):
I'm challenging myself to be creative with the cool new things that browsers offer us. Lani so far has a focus on extreme extensibility, abstraction from dependencies, and optional declarative style.
It's also going to be a micro CSS framework, but that's taking the back-seat.
I wanted to highlight my design here with this table, and the code that is written to produce this result.
First, you can see that the <lani-table> element is reading template, data, and layout information from its child elements. Besides the custom highlighting code (Yellow background in the "Tags" column, and green gradient in the "Score" column), everything can be done without opening even a single script tag.
The <lani-data-source> element is rather special. It's an abstraction of any data source, and you, as a developer can add custom data sources and hook up the handlers to your whim (the element itself uses the "type" attribute to choose a handler. In this case, the handler is "download" which simply sends a fetch request to the server once and downloads the result to memory).
Templates are stored in an html file, not string literals (Which I think really fucks the code) and loaded async, then cached into an object (so that the network tab doesn't get crowded, even if we can count on the HTTP cache). This also has the benefit of allowing me to parse the HTML templates once and then caching the parsed result in memory, so templates are never re-parsed from string no matter how many custom elements are created.
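The template caching is nothing magical — conceptually something like this (a simplified sketch of the approach, not Lani's actual code):
const templateCache = new Map();

function getTemplate(url) {
  if (!templateCache.has(url)) {
    // cache the promise so concurrent requests don't re-fetch or re-parse the same file
    const promise = fetch(url)
      .then((res) => res.text())
      .then((html) => new DOMParser().parseFromString(html, "text/html"));
    templateCache.set(url, promise);
  }
  return templateCache.get(url);
}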
Everything is "compiled" into a single, minified .js file that you include on your page.
I know it's nothing extraordinary, but for something that doesn't need to be compiled, transpiled, packaged, shipped, and kissed goodnight, I think it's a really nice design and I hope to continue work on it and improve it over time1 -
Me: Hey, can we add data to the database so we can test every single possible situation with it?
Worker : Nah, it's better that you don't touch it, lemme do it.
*Later*
Worker : Hey I've added data to test our code.
Me: *fetches data, sees only three similar objects that fit only one situation, leaving me unable to test the others* Thank you, that was really helpful.1
I just want things to do on my phone that are intellectually going to be stimulating
and not brainwashing
is that so hard to ask for
I can't keep playing sudokus all the time. the other day I wanted to go read a coding answer I asked an AI in my browser on my laptop but I was in bed on a voice chat with a sleeping person and didn't wanna get up out of bed to go fetch the laptop. my browser lets me see tabs I have open on other machines and this AI website makes a url with a unique id so you can browse to the chat you had, but it seems to not always work
earlier in the day I had asked the AI a theoretical coding question and it answered, but I got distracted and needed to go do something before I finished reading it (it was long). but when I was in bed on my phone playing sudokus for intellectual stimulation, annoyed and bored it was the only thing I could do, I had the bright idea of opening the tab on my laptop through my phone. Vivaldi is great and this always works. unfortunately the AI website's unique id thing doesn't. it loaded the website by URL correctly but the AI website just took me to the home page and I had no chat history to read =[
phones are literally computers but you can't do anything on them. can't watch videos without ads or bugs, if you load a lot of websites the tab management system sucks and performance is shit, controls for games suck even if you could find something not ad infested
hell you can't even do a pedometer that's not trying to get you to "log in". bruh
you can't even browse GitHub code! at least last I checked. it's just awkward, their app
I feel like I'm in a straitjacket in terms of technology and I wanna scream. I don't even know how to adequately describe my frustration or what I keep wishing for. it's been prominent in my head a couple years now. it's like we're regressing in terms of compatibility. went from card games provided by Microsoft like solitaire and spider, paint... to Jesus fuck you can't even get paint in a browser now without someone trying to fleece you
remember when things were inventive, nice, and not shit?
I don't even like playing mindustry on a phone to be fair. fighting the controls is most of the experience. so maybe phones are only good for reading things
I just noticed my brain over time doing sudokus learns so I wanted to practice engaging in something and learning as exercise, cuz I think it would be good for the brain damage. bah5 -
So I salvaged a computer that was about to be thrown out by IT — it works perfectly fine and would have been a waste to lose.
The (current) problem is, there is no built-in wi-fi adapter so I had to order a usb adapter to plug into the machine.
Fortunately enough it supports Linux, but it comes with a CD with the _source_ of the driver on it, and we are supposed to build it ourselves. Now what's the problem with that?
First problem: building needs all sorts of build tools, starting from gcc and make. Since it's a fresh install, though, I _cannot_ install those normally because -- you guessed it -- I don't have a WiFi connection, which is why I needed the bloody adapter in the first place, so I spent hours trying to fetch the binaries from the apt repositories using another computer and bringing them over via USB.
Once that was (more or less) successful, the next problem came around; Second problem: the instructions clearly state to run make, but there is no fucking makefile anywhere so that obviously fails spectacularly. What _is_ there are some bash scripts so I try running those.
Now, just when I think it's finally done (one of the scripts has been running for a while and seems to work) the compiler dies with an error: the fucking driver won't build for the current kernel version. And not just that, but it is clear nobody is using basic things like include guards because gcc kept screaming at me about the same macros being defined over and over due to header file re-inclusion. Like, seriously? Come on!
Long story short, the fucking adapter is going back to the seller; let's see if the next one I order is more civilized.8
"Please stop importing the resources so fast, our servers can't keep up"
Please send the correct data in the first place, so we don't need to fetch all of it again under time pressure -
So I thought I knew source tree, apparently I do not... Lost a week's worth of work, went to history, saw someone removed it with a commit, and now I'm getting blamed for my own work 'disappearing'. The reason I am being told I am to blame is how I control my branches... So how I do it is that I keep a local copy of the master branch, I keep it updated and monitor it for changes regularly (meaning fetch and pull cause double tap..) before I do a merge, I check for any new code on master again, then using the local copy of master, which I just updated, I pull the master changes into my branch, deal with any conflicts, build and done. Then I request my changes into master once I am happy everything is good.
My question is: clearly there is something wrong with the way I do things, so please, SourceTree users, what is the most foolproof way to pull the latest from master so that I don't lose code? 😔11
Microservices
Let's take an example: a products service & an orders service.
When I want to save an order for a user, should the data be saved as
1. UserId, ProductId, Quantity, Date
Or
2. UserId, Name, Email, ProductId, ProductName, Quantity, Date
I'm a bit confused here because if I'm going to fetch that purchase, in example 1 it will return IDs, requiring another trip to the server to get the user & product info.
In example 2 it takes only one trip, BUT if any change is made to either the user info or the product info, it means I'm returning wrong info to the user.
What do we do in this scenario? Excuse my questions — it's my first time applying microservices; I've been using monoliths all my life6
How do you test unreachable code or part that is considered an edge case?
For example, I catch exceptions in case IO fails and data is not written to the database, but that only happens on hardware failure or when there's no disk space left — how do I mimic that?
I also have unreachable code. For example, in one layer I fetch data (let's call it function x) and always return a success result, unless the item is not found, in which case I throw a KeyNotFound exception. But in the calling function I handle the case of Status == Failed
Just in case in the future I change function x and start returning a failed status — so my logic is already written but never reachable14
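The usual answer, whatever the stack, is to not hit the real disk/database at all: inject the storage dependency and hand the test a fake that throws. A minimal JavaScript-flavoured sketch (all names here are made up):
class UserService {
  constructor(repo) { this.repo = repo; }
  async save(user) {
    try {
      await this.repo.write(user);
      return { status: "ok" };
    } catch (err) {
      return { status: "failed", error: err.message };
    }
  }
}

// in the test: a repo that simulates "disk full" / hardware failure
const failingRepo = { write: async () => { throw new Error("disk full"); } };
const result = await new UserService(failingRepo).save({ id: 1 });
// assert that result.status === "failed"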
My plan was to potato today.
... But given anxiety, might as well have a minor heart attack and a few panic attacks on the side.
Plus, second day of no proper food seems to be helping that cause greatly too.
At this rate, I'll die of dehydration first. Lol. My greatest regret is missing out on the robot's uprising. Ain't got nobody I love deeply, so at least I don't feel regrets for people I leave behind. Tiz a short meh life I've lived.
Aight. Ms NoRegrets is out.
P.S.
In case you're stupid, let me clarify: I was being a drama queen. Shall fetch water... soon, hopefully.1 -
A senior co-worker started a cron job using cPanel to fetch tweets every minute.
The problem: he didn't use (or know about) '/dev/null' — i.e. redirecting the command's output, e.g. by appending >/dev/null 2>&1 —
so cron sent an email to the admin for every successful fetch.
After a week we discovered the problem: the admin inbox was full of emails, and our server got blacklisted (i.e. it could no longer send emails).
Why hadn't I learned about js fetch before now? I've even used socket.io!
Fetch is amazing and I just built a function for using fetch with WordPress AJAX hooks. Yes I know it's WordPress and there's jQuery but all of my custom plugins use vanilla js as a little fuck you.
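Something along these lines, for anyone curious (a rough sketch — it assumes the usual admin-ajax.php setup where the `ajaxurl` global is available and the action is registered with the wp_ajax_* hooks):
function wpAjax(action, data = {}) {
  const body = new FormData();
  body.append("action", action);
  Object.entries(data).forEach(([key, value]) => body.append(key, value));

  return fetch(ajaxurl, {
    method: "POST",
    credentials: "same-origin",
    body,
  }).then((res) => res.json());
}

// usage, with a made-up action name:
// wpAjax("my_plugin_save_item", { id: 42 }).then(console.log);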
Oh yeah, but IE doesn't support it at all2
I'm looking into GraphQL and so far so good, but I am finding it hard to implement business rules, for example:
1. Receive request with auth token
2. Know who the user is by extracting the userId from the token
3. fetch data related to that user only.
I was only able to make it allow or deny if there is a token or not lol5 -
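In case anyone has pointers, this is the pattern I'm aiming for (an apollo-server style sketch; the JWT handling, typeDefs and the data-access calls are assumptions):
const { ApolloServer } = require("apollo-server");
const jwt = require("jsonwebtoken");

const resolvers = {
  Query: {
    // only ever query with the userId taken from the verified token
    myOrders: (_parent, _args, { userId }) => ordersDb.findByUserId(userId), // hypothetical data layer
  },
};

const server = new ApolloServer({
  typeDefs, // assumed to be defined elsewhere
  resolvers,
  context: ({ req }) => {
    const token = (req.headers.authorization || "").replace("Bearer ", "");
    const { userId } = jwt.verify(token, process.env.JWT_SECRET); // throws on missing/invalid token
    return { userId };
  },
});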
The most realistic GraphQL explanation:
1. Still uses your xhr/fetch/axios on FE
2. Just sends all the requests to a single endpoint
3. On the BE it uses its own resolution schema to call the proper controller to handle the request, rather than relying on a router for that
That's all!
Just another useless layer of abstraction with its own learning curve, tricks and bugs, just like ORMs9
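Points 1 and 2 in practice (the endpoint and query are just examples):
fetch("/graphql", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ query: "{ user(id: 1) { name email } }" }),
})
  .then((res) => res.json())
  .then(({ data, errors }) => console.log(data, errors));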
VLC for android is horrendous when it comes to network storage (and ux, but that's vlc in general). 3500 audio titles in a playlist being vlc-native format via samba-share. The fucking thing just freezes up. 3 minutes later it still doesn't do anything..
Oh you say it's prebuffering? How nice it's able to do so witHOUT READING ANYTHING FROM THE FUCKING DISK IN MY SERVER NICE DUDE IS IT GOING TO FETCH IT TELEPATHICALLY?!3 -
I finally attempted php, I wrote a bunch of scripts to fetch data... I feel like I have betrayed myself 😞😞😞
I should have just absorbed the cost of azure 😭😭😭 now am a php dev.... The things I'm about to do, the horrors I'm about to put out into the world.... Pulling your hair out will be the least of your worries, dear maintainer forgive us all, always and forever. 😳1 -
"what are you working on right now?"
I think that question is the one I hate the most.
If A asks that to B, it means A has the authority to do so, you're basically a boss or leader.
But it also reveals insane incompetence, because if A has such a role, then A should already know what B is working on.
I have fantasies of just exploding with a "NOTHING, NOTHIIIIIING!!!! Because you didn't assign me anything"
What the fuck do they want me to do? Go around jerking off to any documentation I can find on google?
Should I just come up with things so that they can be sadly discarded?
I would much rather have dailies and get guided like "you can do this or that". I feel like I'm just expected to do shit without any actual regard for results. As if I were some dog who was thrown a stick to fetch.
But if I don't fetch the stick I get asked like "you should be doing something". I fucking hate it.2
How the fuck is Visio the worst fucking tool ever? For some obtuse reason, I can't move a text box. At least tell me why you're being an uncooperative asshole, instead of showing the new position and then not moving the damn thing! It's like trying to get a cat to play fetch.1
-
So there is this website called 100daysofrunning.in — one of the worst designs I've ever seen. They have a submit page which is another app that opens in an iframe.
If you're part of the challenge, every day you have to submit a form: distance, time, Strava link, date — and it's a pain to do every day.
On the 50th day they restricted the date to 7 days, so you cannot post data older than 7 days.
Being a programmer, it would have been an insult had I entered the data manually.
Thanks to CasperJS and MeteorJS, I was able to automate fetching from Strava and posting to this dumb page.
One day, due to an error, the script failed and I missed one day of data entry. That's 2km of running gone in vain and I'm out of the challenge.
Programming has made me lazy. Screw programming. I should've been a dumb idiot manually adding data for a fkin 30 mins — at least life would be simple.
Narrator(me): currently on keeping up with the clientele
Client: sooo-- go live in 2 days lol?
Me: im still waitin for finalised content from you and your designers
Client: oh, can you work the public holiday tomorrow, in anticipation of the chance that we will perhaps have sent the xyz content by then
Me: hm, tomorrow's a public holiday.. but sure
Them: oh im in another timezone so its really 2 days - 12 hours
Narrator(me): thats when the boy realised not to let his guard down -
I was building a super simple Laravel app for a client (forms APIs stuff)
For the frontend I used jQuery cuz why overkill it with react.
Now the sad part:
The app makes ajax calls to fetch the data from the database and update the view accordingly. The code is very well written and the call is so quick that in the blink of an eye the data is processed by the controller and sent to the view -_-
Because the user doesn't get to see what the fuck just happened when they clicked the action button, I had to add a setTimeout before the Ajax call to slow down the process by 2000ms, and added a freakin spinner.
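The "fix", roughly (the selectors and endpoint are made up):
$("#save-btn").on("click", function () {
  $("#spinner").show();

  // artificial 2s delay purely so the user *sees* something happen
  setTimeout(function () {
    $.post("/items/store", $("#item-form").serialize(), function (response) {
      updateView(response); // hypothetical render helper
      $("#spinner").hide();
    });
  }, 2000);
});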
I feel very sad when I can't show how awesome apps I can build but,
I killed my ego for the UX.
This was my sacrifice.
Anyone faced similar shits?3 -
I wasn't happy with one of our UI views for editing a database query that consisted of about 50 fields ("editing" being the operative term here, not just viewing. It had to be two-way). Everything was hardcoded and defined manually, with each block of ~10 lines being repeated and mostly identical apart from the occasional double inline field and name of the variable. It had "just ended up that way" over time due to the variable names in the UI being different than the names of the variables that came from the API.
I decided to overhaul it all where I defined the different input components and which fields should be included, then made a function which would generate the page based on these definitions. It was about 500 lines of modularized functions and classes where the class for the actual view was about 50 lines- compared to the 1400+ lines of the previous version.
But, it didn't work. It should, but it "just didn't". There was no error. All I got was a blank, solid white page. I could make a drastic change or try something completely different and I would get the same error, same blank page. API fetch succeeded, value assignments succeeded, the object exists, but if you iterated it it was... empty.
I started getting really discouraged that I had made it too abstract. Maybe I actually made it more complex and unreadable than before. Maybe just hardcoding it all was the better solution after all. Maybe I had gone against KISS and overdesigned it.
I was up pretty late and everyone had gone home. When the last guy left there was that mood where "yeah if I can't make this work we'll just use the current version...".
Turns out I had tried iterating over a property of the set of fields to render, rather than the entire collection. In the old method the variables were a member of an object, but now they were its own object, a change I had made to isolate the set of values which were to be viewed/edited and make them easier to pass back and forth. This member existed since I hadn't cleaned it out yet, but it was empty.
I had been banging my head against this for a whole day and I was ready to admit I had made a mistake and wasted my time before discarding it all, but then I backspaced this one property and the interface went from empty to rendering perfectly and with all functionality intact. I swear god rays were coming out of my screen. -
YGGG IM SO CLOSE I CAN ALMOST TASTE IT.
Register allocation pretty much done: you can still juggle registers manually if you want, but you don't have to -- declaring a variable and using it as operand instead of a register is implicitly telling the compiler to handle it for you.
Whats more, spilling to stack is done automatically, keeping track of whether a value is or isnt required so its only done when absolutely necessary. And variables are handled differently depending on wheter they are input, output, or both, so we can eliminate making redundant copies in some cases.
Its a thing of beauty, defenestrating the difficult aspects of assembly, while still writting pure assembly... well, for the most part. There's some C-like sugar that's just too convenient for me not to include.
(x,y)=*F arg0,argN. This piece of shit is the distillation of my very profound meditations on fuckerous thoughtlessness, so let me break it down:
- (x,y)=; fuck you in the ass I can return as many values as I want. You dont need the parens if theres only a single return.
- *F args; some may have thought I was dereferencing a pointer but Im calling F and passing it arguments; the asterisk indicates I want to jump to a symbol rather than read its address or the value stored at it.
To the virtual machine, this is three instructions:
- bind x,y; overwrite these values with Fs output.
- pass arg0,argN; setup the damn parameters.
- call F; you know this one, so perform the deed.
Everything else is generated; these are macro-instructions with some logic attached to them, and theres a step in the compilation dedicated to walking the stupid program for the seventh fucking time that handles the expansion and optimization.
So whats left? Ah shit, classes. Disinfect and open wide mother fucker we're doing OOP without a condom.
Now, obviously, we have to sanitize a lot of what OOP stands for. In general, you can consider every textbook shit, so much so that wiping your ass with their pages would defeat the point of wiping your ass.
Lets say, for simplicity, that every program is a data transform (see: computation) broken down into a multitude of classes that represent the layout and quantity of memory required at different steps, plus the operations performed on said memory.
That is most if not all of the paradigm's merit right there. Everything else that I thought to have found use for was in the end nothing but deranged ways of deriving one thing from another. Telling you I want the size of this worth of space is such an act, and is indeed useful; telling you I want to utilize this as base for that when this itself cannot be directly used is theoretically a poorly worded and overly verbose bitch slap.
Plainly, fucktoys and abstract classes are a mistake, autocorrect these fucking misspelled testicle sax.
None of the remaining deeper lore, or rather sleazy fanfiction, that forms the larger cannon of object oriented as taught by my colleagues makes sufficient sense at this level for me to even consider dumping a steaming fat shit down it's execrable throat, and so I will spare you bearing witness to the inevitable forced coprophagia.
This is what we're left with: structures and procedures. Easy as gobblin pie.
Any F taking pointer-to-struc as it's first argument that is declared within the same namespace can be fetched by an instance of the structure in question. The sugar: x ->* F arg0,argN
Where ->* stands for failed abortion. No, the arrow by itself means fetch me a symbol; the asterisk wants to jump there. So fetch and do. We make it work for all symbols just to be dicks about it.
Anyway, invoking anything like this passes the caller to the callee. If you use the name of the struc rather than a pointer, you get it as a string. Because fuck you, I like Perl.
What else is there to discuss? My mind seems blank, but it is truly blank.
Allocating multitudes of structures, with same or different types, should be done in one go whenever possible. I know I want to do this, and I know whichever way we settle for has to be intuitive, else this entire project has failed.
So my version of new always takes an argument, dont you just love slurping diarrhea. If zero it means call malloc for this one, else it's an address where this instance is to be stored.
What's the big idea? Only the topmost instance in any given hierarchy will trigger an allocation. My compiler could easily perform this analysis because I am unemployed.
So where do you want it on the stack on the heap yyou want to reutilize any piece of ass, where buttocks stands for some adequately sized space in memory -- entirely within the realm of possibility. Furthermore, evicting shit you don't need and replacing it with something else.
Let me tell you, I will give your every object an allocator if you give the chance. I will -- nevermind. This is not for your orifices, porridges, oranges, morpheousness.
Walruses.16 -
So...buffered query kept crashing even though I have a row limit of 10001 on my abstract datatables class. I didn't realize it was buffered because I grab the results in a for loop as a fetch. Well, I tracked it down to being the size of the email content that I'm selecting (and then using strip_tags and substring in PHP before returning to the front-end). So it's totally a catch-22 at this point because if I select let's say...substring 500 characters and most of that is line breaks and other html junk, I may only get a couple characters of normal text (or none at all) after stripping tags and doing a final substring to get the 50 characters of text I want to display. I said screw it and took the email content out of the table altogether. You have to view it to see the content now. I should probably be storing a text-only version of the email, but argh..that's a lot of extra data.
-
Create a p2p version of Google Maps / OpenStreetMap that uses people's browsers to store the map tiles of "their region".
Started already by building a proof-of-concept called p2p-fetch[1] (uses GunDB under the hood) and mixing it with the awesome Leaflet library.
[1] https://github.com/davide/p2p-fetch2 -
1/2 dev and a fair warning: do not go into the comments.
You're going anyway? Good.
I began trying to figure out how to use stable diffusion out of boredom. Couldn't do shit at first, but after messing around for a few days I'm starting to get the hang of it.
Writing long prompts gets tiresome, though. Think I can build myself a tool to help with this. Nothing fancy. A local database to hold trees of tokens, associate each tree to an ID, like say <class 'path'> or some such. Essentially, you use this to save a description of any size.
The rest is textual substitution, which is trivial in devil-speak. Off the top of my head:
my $RE = qr{\< (?<class> \S+) \s+ ' (?<path> [^']+) ' \>}x;
And then? match |> fetch(validate) |> replace, recurse. Say:
while ($in =~ $RE) {
my $tree = db->fetch($+{class}, $+{path});
$in=~ s[$RE][$tree];
};
Is that it? As far the substitution goes, then yeah, more or less. We have to check that a tree's definition does not recurse for this to work though, but I would do that __before__ dumping the tree to disk, not after.
There is most likely an upper limit to how much abstraction can be achieved this way, one can only get so specific before the algorithm starts tripping balls I reckon, the point here is just reaching that limit sooner.
So pasting lists of tokens, in a nutshell. Not a novel idea. I'd just be making it easier for myself. I'd rather reference things by name, and I'd rather not define what a name means more than once. So if I've already detailed what a Nazgul is, for instance, then I'd like to reuse it. Copy, paste, good times.
Do promise to slay me in combat should you ever catch me using the term "prompt engineering" unironically, what a stupid fucking joke.
Anyway, the other half, so !dev and I repeat the warning, just out of courtesy. I don't think it needs to be here, as this is all fairly mild imagery, but just in case.
I felt disappointed that a cursed image would scare me when I've seen far worse shit. So I began experimenting, seeing if I could replicate the result. No luck yet, but I think we're getting somewhere.
Our mission is clearly the bronwning of pants, that much is clear. But how do we come to understand fear? I don't know. "Scaring" seems fairly subjective.
But I fear what I know to be real,
And I believe my own two eyes.11 -
I have implemented a RESTful API using Express, and a separate React app which will use the APIs to fetch data.
I'm running into a problem with the Access-Control-Allow-Origin header.
What's the proper way of calling an API?
Do I use a CORS middleware, allow all origins ('*'), and use an API key as a way of checking authorization to prevent misuse?
any other tricks ?2 -
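What I'm currently leaning towards, for reference (the cors package and its options are real; the origin value and key handling are just one possible setup, assuming an existing Express app):
const cors = require("cors");

app.use(cors({ origin: "https://my-react-app.example.com" })); // instead of '*'

// naive API-key check as an extra gate
app.use((req, res, next) => {
  if (req.get("X-Api-Key") !== process.env.API_KEY) {
    return res.status(401).json({ error: "unauthorized" });
  }
  next();
});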
Currently I'm refactoring some code in a private project at home. Yesterday I ran the tests, and some things I thought I had already fixed didn't work. After I fixed them, other things were broken, and again the errors felt familiar. With every fix and every new error I was more and more sure I had already done exactly the same things before. I thought maybe it was déjà-vu, or that I had dreamed about it.
After two hours, when everything was working again, I realized I had done all of this two nights before in a branch and totally forgotten about it.
I wasn't even drunk -.-' -
Fetch API gives CORS error..
Then I use JQuery AJAX request and it works fine...? 😑
Can you even handle CORS requests with Fetch?
🤔4 -
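Yes, fetch can absolutely handle CORS requests — a typical cross-origin call looks like this (the URL is an example). If it errors, it's usually the server not sending Access-Control-Allow-Origin, or a custom header/credential triggering a preflight that jQuery's simpler request didn't:
fetch("https://api.example.com/items", {
  mode: "cors",           // the default for cross-origin requests anyway
  credentials: "include", // only if cookies are needed; the server can't respond with '*' then
  headers: { Accept: "application/json" },
})
  .then((res) => res.json())
  .then(console.log);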
Need some advice here.
So hello everyone! I recently moved abroad for work, for the sake of the experience and the excitement of learning how developers in Latin America tackle specific problems. To my surprise, the dev team is actually composed solely of Europeans and Americans.
I work for a relatively new startup with an ambitious goal. I love the drive everyone has, but my major gripe is with my team lead. He's averse to any change, and any and all proposals made to improve quality or throughput are shot down in flames. Our stack is a horrendous mess patched together with band-aids, nothing is documented, there are NO unit tests for our backend and the same goes for our frontend. The team has been working on a database/application migration for about a month now, which I find ridiculous because the entire situation could have been avoided by following very rudimentary DevOps practices (which I'm shunned for mentioning). I should also add that for whatever reason containerization and microservices are also taboo, which I find hilarious because of our currently convoluted setup with Elastic Beanstalk and the constant complaints about our development and production environments differing too much.
I've been tasked with managing a Wordpress site for the past 3 weeks, hardly what I would consider exciting. I've written 6 pages in the past two weeks so our marketing team can move off of Squarespace to save some money and allow us more control. Due to the shit show that is our "custom theme" I had to write these pages in a manner that completely disregards existing style rules by disabling them entirely on these pages. Now, ironically, they would like to change the blog's base theme, but this would inadvertently cause other pages created before I arrived to simply not work, which means I would have to rewrite them.
Before I took the role of writing an entire theme from scratch and updating these existing pages to work adequately, I proposed moving to a headless wordpress setup. That way we could share assets in a much more streamlined manner between our application and wordpress site and unify our styles. I was shot down almost immediately. Due to a grave misunderstanding of how wordpress works, no one else on the team seems to understand just how easy it is to fetch data from wordpress's api.
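By "easy" I mean core WordPress already exposes a REST API — something like this is all it takes to pull posts (the domain is a placeholder):
fetch("https://example.com/wp-json/wp/v2/posts?per_page=5")
  .then((res) => res.json())
  .then((posts) => posts.forEach((p) => console.log(p.title.rendered)));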
In any event, I also had a tech meeting today with developers from partner companies and realized no one knew what the fuck they were talking about. The greater majority of these self-proclaimed senior developers would actually be considered junior developers in the United States. I actually recoiled at the thought that I may have made a great mistake leaving the United States in search of a great tech gig.
I mean no disrespect to Latin America, or any European countries, I've met some really incredible developers from Russia, the Ukraine, Italy, etc. in the past and I'm certainly not trying to make any blanket statements. I just want to know what everyone thinks, if I should maybe move back to the states and head over to the bay/NY. I'm from the greater Boston area, where some really great stuff is going on, but I guess I also wanted a change of scenery.2
So 9 months back I wrote a script that asked for a company's symbol to fetch data from a site, just one symbol at a time. There were around 300-350 symbols; I tried storing the symbols in a text file and supplying input from it, but it did not work then, so I decided to leave it as it was. Today, when I took a closer look at the code I wrote, I found that the symbols were being fed to the script, however, the "\n" was included too, so my script was failing to get data in bulk. Modified it today, it's all good. It's kinda crazy — 9 months, and the only thing stopping my script from working was a freakin "\n".
-
Can anyone help me with this theory about microprocessor, cpu and computers in general?
( I used to love programming when during school days when it was just basic searching/sorting and oop. Even in college , when it advanced to language details , compilers and data structures, i was fine. But subjects like coa and microprocessors, which kind of explains the working of hardware behind the brain that is a computer is so difficult to understand for me 😭😭😭)
How does a computer work? All I knew was that when a bulb gets connected to a battery via wires, some metal inside it starts glowing and we see light. No magic involved so far.
Then came the von Neumann architecture, which says a computer consists of 4 things: I/O devices, a system bus, memory and a CPU. I/O and memory interact with the system bus, which is controlled by the CPU. Thus the CPU controls everything and that's how a computer works.
Wait, what?
Let's take the easy example of a calculator. I pressed 1+2= on the keyboard, it showed me '1+2=' and then '3'. How the hell did that happen?
Then some video told me this: every key on your keyboard is connected to a multiplexer which gives a special "code" to the processor for the key press.
The "control unit" of the CPU commands the RAM to store every character until '=' is pressed (which is a kind of interrupt telling the CPU to start processing). RAM is simply a bunch of storage circuits (which can store some 1s) along with another bunch of circuits which can retrieve that data.
Up till now, the control unit knows that memory has (for eg):
Value 1 stored as 0001 at some address 34A
Value + stored as 11001101 at some address 34B
Value 2 stored as 0010 at some Address 23B
On receiving the code for the '=' press, the "control unit" commands the "ALU" unit of the CPU to fetch data from memory, understand it and calculate the result (i.e. the "fetch, decode and execute" cycle).
The ALU fetches the "codes" from memory, which translate to ADD 34A,23B, i.e. add the data stored at addresses 34A and 23B. The ALU retrieves the values present at the given addresses, passes them through its adder circuit and puts the result at some new address 21H.
The control unit then fetches this result from the new address and, via the system buses, sends this new value to the display's memory mapped at some port 4044.
The display picks it up and instantly shows it.
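If it helps to make the "fetch, decode and execute" part concrete, here's how I picture it as a toy loop (purely illustrative — a real 8085 obviously isn't JavaScript):
const ram = { 0x3a: 1, 0x3b: 2 };                          // data memory
const program = [["LOAD", 0x3a], ["ADD", 0x3b], ["HALT"]]; // instruction memory

let pc = 0;      // program counter
let acc = 0;     // accumulator
let running = true;

while (running) {
  const [op, addr] = program[pc++];        // fetch
  switch (op) {                            // decode
    case "LOAD": acc = ram[addr]; break;   // execute
    case "ADD":  acc += ram[addr]; break;
    case "HALT": running = false; break;
  }
}
// acc is now 3 — roughly the control unit + ALU dance described above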
My problems:
1. Is this all correct? Does this only happens?
2. Please expand this more.
How is this system bus, alu, cpu , working?
What are the registers, accumulators , flip flops in the memory?
What are the machine cycles?
What are instruction cycles, opcodes and instruction codes?
Where does assembly language come in?
How does the CPU manipulate memory?
This data bus , control bus, what are they?
I have come across so many weird words I don't understand: DMA, interrupts, memory-mapped I/O devices, etc. Somebody please explain.
PS: I am learning about the fucking 8085 microprocessor in class and I can't even relate it to basic computer architecture. I had flunked the COA paper, and I now realise why, coz it's so confusing. :'''(14
Mr. Developer, are you giving me constructive criticism, or forcing your programming faith down my throat again? I really can't tell.
-
I guess the devRant Android app got an update recently [last week?]. Notifications suddenly started to work. Yayy!
But in-app notifications broke. More often than not either tab in notifs [all, ++, comments, etc.] loads nothing and there's no way to re-fetch data manually. If _all_ breaks I have to browse each category manually to see what's new.
Is it just me or is anyone else experiencing this?
Galaxy s7 edge, stock everything.
P.S. I rarely get to rant about devrant :) -
This is a little shameful, but I don't tell people I do IT since it leads to a "wow, you must be really smart, I could never do computer science" discussion.
A lot easier to connect with people without a plethora of stereotypes attached to your SEO of a name3 -
Front-end web development is like a fucking cancer to me right now
I need the following behavior from my development environment if I don't want the webdev experience to destroy my sanity and tempt me into suicide by making me waste my valuable lifetime configuring shit that is ultimately meaningless to the software I'm trying to create:
- I should be able to open the webpage in the browser at localhost:<some-port>
- the page should refresh immediately as I save my files
- I should be able to import node modules installed with npm without using a script tag linking to some CDN (for instance, I want to do a get request with axios instead of the fetch API)
- I should be able to do this without spending more than two minutes reading the documentation for a tool that would enable me to do it, ideally without ever coming even close to touching a configuration file
Right now I know about browser-sync and webpack, or webpack-dev-server or some such fucking shit fuck fucking fuck.
browser-sync seems to fulfill most of these needs, except that I can't seem to bring npm modules into my application and import them. Webpack seems to be able to do this, but at the cost of slowly throwing my life away reading documentation for over-complicated configuration files that do not aid me in actual software creation and therefore do not interest me and never will, all in the hope that I *may* at some point dig out enough shit to find how to do such a use case (i.e. seamless, smooth web development) that to me feels reasonably common and expected.
Is there some tool that enables me to do *seamless*, pleasurable web development without the hassles of over-complication and over-engineering? Is there some hidden command for webpack that allows me to run such simple shit without ever needing to edit some pointless configuration file?
Please, I beg of you, let me know.8 -
This is my frontend tip of the day.
If you have a frontend that consumes an external API:
1) Retrieve the json responses from devtools
2) Save them in your project as json files (trim the data a bit if it's too long)
3) When starting your app with a special env var like MOCK_DATA, make your app mock the data and use your saved json data instead.
You can associate each response with an url regex.
The package fetch-mock mocks fetch really well, it lets through the urls that don't match anything.
This way you can incrementally add responses.
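If you'd rather not pull in fetch-mock, a hand-rolled version of the same idea looks roughly like this. It's a sketch only: the env var, fixture file and URL regex are placeholders, and it assumes a bundler (CRA/webpack style) that inlines process.env and can import JSON.

import usersFixture from "./fixtures/users.json";   // a saved devtools response

const fixtures: Array<{ pattern: RegExp; body: unknown }> = [
  { pattern: /\/api\/users/, body: usersFixture },
];

if (process.env.MOCK_DATA) {
  const realFetch = window.fetch.bind(window);
  window.fetch = async (input, init) => {
    const url = input instanceof Request ? input.url : String(input);
    const hit = fixtures.find((f) => f.pattern.test(url));
    if (!hit) return realFetch(input, init);          // let unmatched urls through
    return new Response(JSON.stringify(hit.body), {
      status: 200,
      headers: { "Content-Type": "application/json" },
    });
  };
}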
And voila, you have a mode where you have near instant page loads to test things manually, and you also have mocked data ready for testing eg, cypress. -
I had a discussion with my colleagues about my bachelor thesis.
Together, over the last 18 months, we created a REST API where we use LDAP/LMDB as the database (tree-structured storage). Of course our data is relational, and of course we have high redundancy there. It's a 170-call API and I highly doubt that it actually conforms to REST.
Ensuring DB integrity is done in the backend and coding style there is "If we change it at one place, let's make sure to also change it everywhere else", so you get a good impression how much of spaghetti code we have there.
Now I proposed to code a solution in my bachelor thesis where we use a relational database (we even have an administrated Oracle DB with high availability) and have a write-only layer to also store the data in LDAP but my colleagues said that "it would add too much complexity to the system".
Instead I should write the relational layer myself and fetch the data somehow from the existing LDAP tree.
What the actual fuck, spaghetti code is what makes the system really unnecessarily complex so that no one will understand that code in 2 years.
Congratulations, you just created legacy code that went into production in 2018 while not accepting the opportunity to let that legacy code get eliminated.
Now good luck with running and maintaining that system and its inconsistencies.1 -
So I'm currently working on a chat app that deals with astrology.. dealing in the sense that we are building an AI which gives predictions based on one's date of birth, time of birth and place of birth; you can ask it questions (currently only career related) and you get some prediction.. it's an in-house project, we have a client who is an astrologer and gives us the logic to compute the predictions.. it's still a long way from being an AI... So our CEO walks in one day with his huge plans for the product... decides to ditch the app completely, on which we have invested 4 months of our time, and instead make an appointment-scheduling webapp for our client, as he felt that would fetch us some green stuff.. so I was like, why ditch the app when we can have the same module in the app itself and ask the astrologer to make his clients install it if they want to book future appointments? He completely disregarded my idea, said that is bad marketing and all other shit, and went on to explain his other ideas... I didn't think much of it at that time. Then the CEO and the director of technology had a separate meeting where the director made the same points I had told him (the CEO), that it is a bad idea to ditch the app (I wasn't aware of this meeting until later)... So after a week we have a team meeting with the CEO and the director of technology... where he starts telling us how it is not so wise to chuck the existing application and build a new one, which is totally unnecessary, and how we can have it as a module in the existing one... and I'm sitting there thinking to myself, da fck is he talking about... so I decided to stay silent and listen to his bs... My marketing lead leans over and asks why so silent... I tell her whatever he is talking about now is the same thing I told him last week, which he rejected blatantly... And then he had the nerve to ask me for any inputs to this plan... I couldn't hold back... I told him that this is the exact same thing I told him last week, to which his reply was "focus on the future and forget the past"... I was like mother fckr woooooot... I realised the power of position!! Fuckol man3
-
The moment when you just wanna fetch data from instagram API and they ask for "Instagram User Experience Video" for permissions. #FU Facebook #FU Instagram
-
I hate IntelliJ IDEA, but it's the best option available for developing in Scala. Improvement in VSCode/Metals is my last hope.
The (few) things I NEED from Intellij:
* Very good autocompletion
* Refactoring tools (renaming, auto imports)
* Search tools (find usages, sub/super-types)
The (many) things I hate of Intellij:
* Layout with panel sizes doesn't behave properly and it scales instead of remaining fixed.
* Tedious two-hand shortcuts make the right hand move away from the mouse a lot
* Delays and lag in the UI, freezes on garbage collection
* High memory consumption, high CPU usage and generally slow and cumbersome
* The delay in the UI between commands is such that it's possible to accidentally introduce typos
* Can't move tabs around and organize them as I like
* Ugly font rendering and missing typography settings
* Multi-caret implementation as a different editing mode is annoying because it requires frequent switching
* Unnatural code folding regions; why aren't method arguments folded with the method?
* Unhelpful support forum, sometimes dismissive answers
* Highlighting of current word under the caret doesn't work
* Very slow editor, can't keep spacebar pressed to move text or it hangs!
* Several settings reset at every update. Like the auto fetch of git
* New features are added and enabled by default which is very invasive
* Some of the features mentioned above are really annoying and it's not possible/not trivial to disable them
* It uses its own compiler and several times it highlights false positives7
Spent the whole night trying to get a react native component to push a csv file to multer in the backend.
Tried using fetch, then dependencies, then xhr.
Realised I had to create formdata. No such documentation on the internet.
Used God knows what after 3 hours, suddenly things are working.
I'll never be able to get answers in life -_- -
I needed to align instruction execution to a 64bit boundary, for custom CPU architecture that I'm building. Basically the ISA had 3 types of instruction lengths; 16bit, 32bit, and 48bit. The core did 64bit fetch from the Instruction Cache, the issue was that if an instruction was in between two blocks of data I needed to fetch two blocks. That would impact performance a bit. So I had to modify GCC sources of the ISA that I'm using. So instead of doing it the right way, I just did it the lazy way and modified the GCC backend to print
.p2alignw 3, 0x00ff, 4
.p2alignw 3, 0x00ff, 2
On each 48bit and 32bit instruction to generate NOPs. And it did work lol. -
I just learned Serverless.com
That's it?
Shit was 100x easier to learn compared to the brutality of terraform, devops, reactive streaming, kafka, rabbitmq, sockets and the other shit I had to fuck around with and find out.
Dont even have to watch tutorials for this. Just building 1 simple crud project and read the docs was enough.
However after deploying these serverless shits to aws Lambda i noticed that it takes quite some time for the api to fetch response. Why?
On postman calling the route for the first time i have to wait like 3s for api to fetch all (with limit of 10) or create 1 dto object. Then every next api call is 100-150ms which is ok. But it could be better no? Locally my spring boot rest api takes 3-7ms of load time. Why is this 100-150ms?20 -
Writes regex pattern at regex101.com: match
Writes same pattern on site: No matches
Trying to fetch text from foreign site to detect when the text changes. -
I was sending data from the online server.. and my teammate was still using localhost and complaining about the data fetch failing!!😂
-
!rant, but funny
tl;dr I made something that was to protect me in case the customer doesn't pay, wanted to check if it's still there, messed up a little :D
>do an Android app project for almost 6 months
>issues with payment for it
> =.=
>firebase
>"Add new application"
>Remote Config
>add single integer variable
>back to app code
>if (integerFromFirebase != 0) navigateTo(new Fragment())
>mwahahahaha
>but they ended up paying me in the end
>huh...
>see another post on how to secure yourself if customer doesn't want to pay
>well, consider yours as more sophisticated
>hmm... wonder if they removed it
>firebaseconsole.exe
>change "enableJavaScript" (needed a legit name, so it can't be easily backtracked) to 1
>publish changes
>app still works fine
>mhhh... they removed it? really?
>can't fking believe it
>apkpure.com
>search for the app
>download apk
>unzip
>decompile dex file
>find the fragment
>can't find the code that navigates to blank fragment, but the config fetch is still there
>wtf
>look at the app
>restart it
>SHIT ITS NOT WORKING NOW XDDDDD
>changed the variable back to 0
>found out that the lambda in which I navigate to the blank fragment is in another .java file. New thing learned :v
>idk if I'm in trouble but I highly doubt it (console shows max 10 active users atm)
Was fun tho :v3 -
Prune removed git branches locally
You can clean up local remote-tracking branches that were deleted on the remote by running
git fetch --prune --all3 -
That feeling when you realize that the REST API you were trying to consume apparently does not provide a query flag to get for a more detailed response making you think you'll need to fetch one list of items and then fire almost 1,000 requests really does not compare to that feeling when a colleague points out that the REST API in question does in fact support the flag AFTER you implemented the roundabout way.
FUCKING HELL!
I just didn't realize that I could click on GET and POST blocks for the metronome API documentation opening up a frigging pop-up. (See screenshot.)
Why couldn't the information have been more upfront? Only a cursor change on hovering the area could make one think to click there.
Oh how I blame their lack of a user interface for my blindness.
I thought that it was just a basic documentation that only told you which endpoints exist and expects you to learn by trial of fire. So I searched the interwebs and on their support forum I found an old issue making me think that my round-about way was the way to go m(
Even worse, on the support forum I cannot even leave a comment warning the poor souls coming after me that they should not take the roundabout way, as that issue has long been closed.
If you want to see it yourself: https://dcos.github.io/metronome/... -
Hey, I got this new web project, but to be honest I haven’t coded much web in a few years and I’ve heard the landscape changed a bit. You are the most up-to date web dev around here right?
-The actual term is Front End engineer, but yeah, I’m the right guy. I do web in 2016. Visualisations, music players, flying drones that play football, you name it. I just came back from JsConf and ReactConf, so I know the latest technologies to create web apps.
Cool. I need to create a page that displays the latest activity from the users, so I just need to get the data from the REST endpoint and display it in some sort of filterable table, and update it if anything changes in the server. I was thinking maybe using jQuery to fetch and display the data?
read full article at https://hackernoon.com/how-it-feels...1 -
A database fetch. All rows at once. Not that many rows, maybe 50.
But oh boy when someone forgot that the repository is wired to magically inject SQL that joins other tables and runs inefficient loops to create thousands of objects in the background.
Been fun finding memory hogs in the codebase. -
Why eclipse?
Why did you think that implementing git yourself was a good idea?
Why? Fucking why?
Now when I switch to my development branch and it's a few commits behind I get lock fails when I fetch. And when I pull?
I'm suddenly 32 fucking commits ahead of the branch? Like what the fuck, Eclipse?
How can you pull those commits from the repo without fetching?
... I use the command line now.3 -
I was doing something in Git, switching branches, and I guess another process was doing a fetch, and now all the files, no matter what branch I'm in, show as Unstaged/Untracked.
I tried deleting the main branch and checking it out again from remote, but nothing.
I have no idea what I did or how to fix it... Seems like I need to delete the whole folder... But then I'd lose any changes in feature branches or local branches which aren't checked in....8 -
Went networking at an event the other day, was super hangry, but somehow it worked out for me?
Me: *interjects on someone else* I heard you need a dev, I'm full stack, let's get coffee and talk later
Them: I feel compelled, here's my business card, yes I would like coffee with you in a week -
A very long rant.. but I'm looking to share some experiences, maybe a different perspective.. huge changes at the company.
So my company is starting our microservices journey (we have 359 retail websites at the moment)
First question was: What to build first?
The first thing we had to do was to decide what we wanted to build as our first microservice. We went looking for a microservice that can be used read only, consumers could easily implement without overhauling production software and is isolated from other processes.
We’ve ended up with building a catalog service as our first microservice. That catalog service provides consumers of the microservice information of our catalog and its most essential information about items in the catalog.
By starting with building the catalog service the team could focus on building the microservice without any time pressure. The initial functionalities of the catalog service were being created to replace existing functionality which were working fine.
Because we choose such an isolated functionality we were able to introduce the new catalog service into production step by step. Instead of replacing the search functionality of the webshops using a big-bang approach, we choose A/B split testing to measure our changes and gradually increase the load of the microservice.
Next step: Choosing a datastore
The search engine that was in production when we started this project was making use of Solr. Thanks to Lucene it was performing very well as a search engine, but from an engineering perspective it lacked some functionality. It fell short if you wanted to run it in a clustered environment, configuring it was hard and not user friendly, and last but not least, development of Solr seemed to have ground to a halt.
Elasticsearch started entering the scene as a competitor for Solr and brought interesting features. Still using Lucene, which we were happy with, it was build with clustering in mind and being provided out of the box. Managing Elasticsearch was easy since there are REST APIs for configuration and as a fallback there are YAML configurations available.
We decided to use Elasticsearch since it provides us the strengths and capabilities of Lucene with the added joy of easy configuration, clustering and a lively community driving the project.
Even bigger challenge? Which programming language will we use
The team responsible for developing this first microservice consists out of a group web developers. So when looking for a programming language for the microservice, we went searching for a language close to their hearts and expertise. At that time a typical web developer at least had knowledge of PHP and Javascript.
What we’ve noticed during researching various languages is that almost all actions done by the catalog service will boil down to the following paradigm:
- Execute a HTTP call to fetch some JSON
- Transform JSON to a desired output
- Respond with the transformed JSON
Actions that easily can be done in a parallel and asynchronous manner and mainly consists out of transforming JSON from the source to a desired output. The programming language used for the catalog service should hold strong qualifications for those kind of actions.
Another thing to note is that some functionality built on the catalog service will result in a high level of concurrent requests. For example, the type-ahead functionality will trigger several requests to the catalog service per user interaction.
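Stripped down to its essence, that fetch-transform-respond paradigm looks something like the sketch below. This is an illustration, not the actual catalog service: the upstream URL and item fields are made up, and it uses modern Node's built-in fetch for brevity (at the time it would have been a client library).

import http from "node:http";

const UPSTREAM = "http://search.internal/items";     // placeholder for the real data source

http.createServer(async (req, res) => {
  try {
    // 1. Execute a HTTP call to fetch some JSON (non-blocking, so other requests keep flowing)
    const upstream = await fetch(`${UPSTREAM}?q=${encodeURIComponent(req.url ?? "")}`);
    const items: Array<{ id: string; title: string; price: number }> = await upstream.json();

    // 2. Transform JSON to a desired output
    const payload = items.map((i) => ({ id: i.id, label: i.title, price: i.price }));

    // 3. Respond with the transformed JSON
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify(payload));
  } catch {
    res.writeHead(502);
    res.end();
  }
}).listen(3000);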
To us, PHP and .NET at that time weren't sufficient for building the catalog service based on the requirements we had set. Eventually we decided to use Node.js, which is better suited for the things we were looking for, as described earlier. Node.js provides a non-blocking I/O model, and being event-driven helps us develop a high-performance microservice.
The leap to start programming Node.js is relatively small since it basically is Javascript. A language that is familiar for the developers around that time. While Node.js is displaying some new concepts it is relatively easy for a developer to start using it.
The beauty of microservices and the isolation it provides, is that you can choose the best tool for that particular microservice. Not all microservices will be developed using Node.js and Elasticsearch. All kinds of combinations might arise and this is what makes the microservices architecture so flexible.
Even when Node.js or Elasticsearch turns out to be a bad choice for the catalog service it is relatively easy to switch that choice for magic ‘X’ or component ‘Z’. By focussing on creating a solid API the components that are driving that API don’t matter that much. It should do what you ask of it and when it is lacking you just replace it.
Many more headaches to come later this year ;)3 -
Anyone else out there feel like Git is like Charlie Brown’s “stupid kite-eating tree” that just lies in wait at code deploy time to ruin you? I can never get it right. Either I’m doing some edits and realize I’m on the wrong branch or the master is inexplicably ahead of local (or vice versa) and even though I can see in the git log where things went wrong, it’s like crossing a freeway blindfolded and hoping my git fetch or reset or merge doesn’t blow everything to hell. WHYYYY IS THIS SO DAMN HARD?!27
-
I've looked at almost every possible source, but I'm not satisfied and haven't found the optimal way or path.
Actually I have to build a Netflix clone (not exactly, but a video-on-demand platform for one local client). The tech stack will be React on the frontend and DRF (Django Rest Framework) on the backend.
I know Node would suit this purpose best, but it needs to be done with DRF. How should I return the video: should I return it in bytes, packets? What would sample code look like, and how should the frontend fetch it? It would really help if you shared your insight, in general. Thank you5 -
Dear me, it's spelled feTCH. It's cost you about an hour between today and yesterday, misspelling FETCH as feCth. Why use it at all? I mean, it's there now, but next time why not use GET or REQUEST or anything that you will spell correctly 100% of the time and prevent confusion when autocomplete gets it "wrong" because "derp, fetchFoobar is defined, derp dee derp, what did I do"?
It's been a long week when the target of the rant is my own dumb habits.
I did get a new keyboard but only a ding dong blames his tools. Something like that2 -
A command line tool built in Python that helps you analyse your git logs by exporting them into a csv/json file.
Can fetch the logs from a given file path or a git directory.
https://github.com/dev-prakhar/...3 -
Some time ago, when exactly the fuck I don't quite remember and promise I never will unless just the right amount of ass is provided in a timely fashion, I start going about how I want to work on some utils to make writing prompts easier.
What I do remember and will remind you with strongly renewed vigor is the fact that I signed a legally-binding document to grant the general public a full pardon for my own ritualistic assassination should I ever use the term "prompt-engineering" unironically.
This pact still holds, and were I to break my solemn oath, then I will hold you fully accountable every single second I find myself still breathing.
Anyhoo, today was the first test of my resolve, for I have implemented both the stupid preprocessor and the local database prototype that allows it to fetch long ass definitions from disk, and both have been published (main: https://github.com/Liebranca/...).
I must admit to you all that though I have not failed, I felt weakness for a second when filling out tags in the repo description, as only "prompt-engineering" was recognized as a legit tag, and not "prompt-writing". In that moment, I almost gave in to temptation, as the accursed Satan whispered in my ear, appealing to my desire for recognition.
However, I reminded this ill buttrape daemon that their fate lies in the burning fires of hell, and as a result was allowed to resist it's alluring diabolical seduction. And for not giving in to greed, I have kept my life, my honor, and my anal virginity.
Also I wrote *some* documentation, it's shit but it's something.12 -
Trying to explain to a (more experienced) dev why it's not a good idea to do an exec( php '/var/www/xxx >> /dev/null) and then redirect the visitor.
The script is running a query that takes some time, and he wants to redirect the visitor and then fetch the result with jQuery.
Tried to explain parent and child processes and pointed him in the direction of pipes and background processes. After some discussion about forking and all the cons that come with that.
yes its PHP ;)
Gonna be exciting to see his next idea :S -
WHY DOES EVERYTHING BREAK WHEN I WANT TO DEPLOY?
Finally fixed the last few bugs on my project and the thing is pretty much set to go. Then both the production and my local copy (identical in every way except for connecting to different MySQL servers) have the EXACT same query issues. The cursor fetch methods ALL get nothing EVEN WHEN I CAN SEE THE CURSOR HAS THE DATA I REQUESTED in the debugger.
I'm already in hot water for taking longer than expected with this project and this is going to push it back even further. -
This is new to me .-.
I just noticed I don't have internet permission in my flutter app Android manifest, yet I'm able to fetch data from an API, how is this possible?5 -
It seems to me that browsers lagging behind is the reason we've seen the JS framework boom both in recent years and ongoing, evident in what they regard as major updates. Most of the functionalities implemented in my time working on the front end are high level problems ubiquitous enough to have been solved at the browser level. Same goes for all the optimizations CSR frameworks are struggling to attain. Every CSR app genuinely feels like recreating a browser, both in UX and dev requirements. These problems exist because current browsers are analog software still accustomed to loading all content at once, no in-app state, just scroll states
The React-Vue-Angular wars of today are a direct hat-tip to the Netscape-Microsoft wars of the early years. If they can form a coalition that sets a standard for syntax, the best rendering engine, a natural way for user-facing devs to control app state, fetch data or connect the back end, somehow render this on the server or find a workaround for SEO issues on CSRs, etc., given the shared agreement on expectations for modern web software, it'll be fascinating to see such a possibility8
So I have replaced npm with yarn due to performance boost and the lockfile.
Never will there be problems with unexpected versions of dependencies!
Wait.
Why is my build writing a yarn.lock?
It turns out, if you want yarn to exit with an error code if it's out of sync with the package.json, you have to run it with:
$ yarn install --frozen-lockfile
Only then it will produce an error.
The default is for it to notice, oh, there are some new dependencies, let's resolve them to the most current version I can fetch, use that one, and write a new lockfile. Meaning you will get unknown future versions of a dependency. O_o
That totally defeats the purpose of having a lockfile in the first place. Why would anyone want this?
Action I do expect to touch the lockfile:
add / remove / upgrade
Action I do NOT expect to touch the lockfile:
install
Install should just install whatever is in there, and if it realizes it is out of sync, die with an error.
But that would make sense!
Who needs sensible defaults anyway!?5 -
It seems I am developing a habit of always forgetting to test "fetch" code in a repository; found two unit tests covering all of insert/update/delete but not a single fetch function T_T
-
Hey guys... I need some help. I have a react project which needs to fetch around 10MB of JSON data from server and then do a computation on that whole data. The UI completely stops obviously because of the computation. I tried using promises for computation but still nothing. The UI still stops completely. Can someone help please?7
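In case it helps: promises alone won't free the UI, because the computation still runs on the main thread; the usual escape hatch is a Web Worker. A rough, untested sketch of the idea follows; the worker body and message shapes are placeholders for the real computation.

const workerSource = `
  self.onmessage = (e) => {
    const data = e.data;
    // ...the expensive loop over the ~10MB payload goes here...
    const result = data.length;        // placeholder result
    self.postMessage(result);
  };
`;
const blobUrl = URL.createObjectURL(new Blob([workerSource], { type: "application/javascript" }));
const worker = new Worker(blobUrl);

worker.onmessage = (e: MessageEvent) => {
  console.log("computation done:", e.data);   // setState(...) here in the React component
};

fetch("/api/huge.json")
  .then((res) => res.json())
  .then((data) => worker.postMessage(data));  // main thread stays responsive while the worker crunches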
-
I work in a small team. As the senior dev I tend to focus on important tasks that shape the core of the product, but sometimes I can't divide myself when there are multiple tasks at hand, so I pass some tasks to another, mid-level dev.
So the task was to create an automation in order to CD (continuously deliver) an order from WHMCS of the (git versioned) product to customers UAT, PROD envs.
To get a background this is an old guy with “constricted” experience in PHP/jQuery/Joomla/Wordpress.
So when we were breaking up the tasks he told me he would like to implement this so i gave him the task as i was busy with core features.
I was like what could go wrong? I know he doesn’t know much about CI/CD but he can read right? He will google right? He will search for CI/CD solutions that do this out of the box right? He will design on paper or what ever and do small POCs right? He will design the flow first before starting the implementation right? RIGHT?
So fast forward to today I had a call with him this morning about some DB staff. And he wanted to show me his progress…
His solution is:
(parentheses is my brain)
1. Customer completes WHMCS order (perfect)
2. Web Hook 🪝 action (YES)
3. cpanel gets source and “automatic!” Init, all using pure PHP code ignoring the usage of the current framework (ok… something is missing)
4. cpanel web hooks(?) WHMCS to send email to customer with the envs initial setup page(?)
5. Customer opens link and adds setup info (ok fuck, fuck, fuck)
(Ok stay cool composed, lets ask some questions maybe he thought it all in a cool way I can’t get my mind around)
Me: So how are you gonna get the correct version from the repo to the env and init the correct schema?
Dev: I haven’t thought about it yet.
Me: Are we gonna save each version to a file system then your code is going to fetch them?
Dev: I haven’t really thought about it we will see. But look on customer init user setup I implemented a password strength validation and it also checks if the password is the same.
So after this Pokémon encounter I politely closed teams. Stood up drank some (a lot) coffee ☕️. Put out the washed laundry while reflecting on life’s good things, while listening to classical music 🎼 .
Then I sat on my office chair drank some more coffee, put some linking park starting with in that order:
“Numb” then “What I’ve Done” and ended with “In the end, it does really fucking matter” -
CREA DDF (Canada Real state listings API) is what you get when government fucks with technology.
Holy shit! So f*cking inefficient to use it, test it and get data.
I get the protection behind sensitive data but fuck me if there is not a lot of waiting behind their fucking application process just to fetch some testing data.1 -
Not dev related.
Two incidents that I'd like to share.
So here in India two major streams for college are engineering and medicine (others do exist). So entrances to both these colleges are based upon entrance exams. So here are two "events" that happened this year and worth mentioning.
Incident 1:
The exam for the engineering stream had a section where the answer is a number with up to two digits of decimal. Range is (0.00 - 9.99) So apparently this two decimal precision created some confusion and the court decided that if the answer is precisely "seven" then only the candidates who've marked 7.00 are given marks while those who marked 7.0 or 7 were given wrong answer.
Incident 2:
So for the medical entrance, the exam was out of 720 marks (180 questions * 4 marks each). And every candidate from the state of Tamil Nadu was given a full 196 bonus marks because the translations from English to Tamil were inaccurate.
Now I need to mention that around 300 marks would fetch a decent seat in a government college.
What the fuck is happening? The one thing they're supposed to conduct every year is also messed up. And who the fuck created complicated shit like 7.00 is correct while 7.0 and 7 are wrong? I mean, should the candidate worry about getting the answer or about marking it?
For those who don't know wrong answers are penalized heavily and there's huge competition.
https://m.timesofindia.com/home/...
https://m.timesofindia.com/india/...1 -
Hii,
I want to use HTML5 History API.
I'm using AJAX to fetch the whole page (yes, the whole page).
Then I'm searching for a particular tag and replacing the whole HTML inside a container.
I feel like I'm doing this all wrong.
Can anybody tell me the best way to use the HTML5 History API?
This updates the data on the page without a reload, but I don't think it makes much sense.
This is an Example of my Code, You'll get Idea:
$('body').on('click', 'a.ajax-nav-link', function (e) {
    e.preventDefault();
    // call ajax method, show data and update url via html5 history api :)
    if (isHistorySupport) {
        // fetch the url and title associated with the <a> tag
        let url = $(this).attr("href");
        let title = $(this).attr("data-title");
        fetchPage(url);
        $(".ajax-nav-link").removeClass("active");
        $(this).addClass("active");
        return false;
    } else {
        alert("Not Available");
    }
});5 -
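For what it's worth, the part that usually makes the History API click is pairing pushState with a popstate handler, so back/forward re-render the same fetched content instead of reloading. A sketch building on the code above; fetchPage and the data-title attribute are the same assumptions as in the snippet.

declare function fetchPage(url: string): void;        // from the snippet above

function navigate(url: string, title: string): void {
  fetchPage(url);                                     // swap the container content via ajax
  history.pushState({ url, title }, title, url);      // update the address bar + add a history entry
  document.title = title;
}

// back/forward fire popstate instead of a full page load
window.addEventListener("popstate", (e: PopStateEvent) => {
  if (e.state && e.state.url) {
    fetchPage(e.state.url);
    document.title = e.state.title;
  } else {
    fetchPage(location.pathname);                     // the very first entry has no state object
  }
});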
Using the fetch API and it's not that bad at all. Beats XMLHttpRequest for sure, and I don't have to feel guilty about using jQuery's AJAX functions anymore. I'm considering rewriting a module or two with it now4
-
Something annoying about the Node.JS and express stack: no way to globally catch errors using middleware.
I mean, you can try. There's even specific middleware designed for it. However, it only catches synchronous errors. So basically only controllers that don't fetch or upload any sort of data and just return some static view or whatever.
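The workaround I've seen (hedged sketch, not a drop-in): wrap async handlers so rejected promises are forwarded to next(), which is the only thing the error middleware actually reacts to; Express 4 only does that automatically for synchronous throws.

import express, { NextFunction, Request, RequestHandler, Response } from "express";

const app = express();

// forward async rejections into Express's error pipeline
const wrap = (fn: RequestHandler): RequestHandler =>
  (req, res, next) => Promise.resolve(fn(req, res, next)).catch(next);

app.get("/data", wrap(async (_req: Request, res: Response) => {
  const upstream = await fetch("https://example.com/api");   // assumes Node 18+ global fetch
  res.json(await upstream.json());
}));

// now this sees both sync throws and rejected promises
app.use((err: Error, _req: Request, res: Response, _next: NextFunction) => {
  res.status(500).json({ error: err.message });
});

app.listen(3000);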
ASP.Net middleware chain is so much more robust :(5 -
Spent 4 hours today debugging the polyfill for fetch in IE to discover the latest version of IE11 has a bug in FileReader.
I have no way of getting around it. It simply cuts my text response off by a few hundred characters. I have to come up with a solution by Friday...
Why do we have to still support IE 😭1 -
program, which should save report from db to file, running a couple of minutes:
1) despite prefetch precompiler option no fetch was prefetched, because for every next line fetch cursor was reopened with condition WHERE some_val > prev_val
2) allocated array of host variables to fetch 100 rows per call to db client api
program is running under 3 seconds now -
first of all fuck this stupid website and deleting your rant if you aren't signed up
second of all fuck fetch
curl gives me a readable json object
axios gives me a readable json object
fetch gives me what should be a readable json object, but looks more like a set in python instead
[{"id":"1"},{"id":"2"}] curl and axios reply with this
meanwhile fetch
{
[
{"id":"1"},
{"id":"2"}
],
{various symbol objects}
}
how am I supposed to get my data out of a fetch? I see people call response[0] or use some strange amalgamation of
fetch().then().then((data) => {}) but data in this results in an unresolved promise for some inexplicable reason. (nextjs)
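For anyone who lands here with the same confusion: the shape that finally matches what curl/axios print is two awaits, one for the Response and one for .json(). Minimal sketch with a placeholder URL:

// inside an async function (or a module with top-level await)
const res = await fetch("https://example.com/api/items");
if (!res.ok) throw new Error(`HTTP ${res.status}`);
const items: Array<{ id: string }> = await res.json();
console.log(items);                     // [{ id: "1" }, { id: "2" }]

// or the .then() spelling: return res.json() from the first callback
fetch("https://example.com/api/items")
  .then((r) => r.json())
  .then((data) => console.log(data));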
also fuck nextjs I want to go back to hardcoding everything in html
also fuck modern web development and businesses in general, they ruined the internet.4 -
What's the point of a 'Check for new version' menu if it doesn't tell you that a new version is out, and that you have to manually fetch it and install it?3
-
If i have 2 branches on git
- main
- infra
You cannot push directly to main. It is forbidden. You can only merge to main
Now. Once I push to the infra branch, assuming all the shit went well (pipeline passed, tests passed, etc.), I then merge it to main.
Now
Locally, while I'm on the infra branch, I have to pull the latest changes from main, otherwise I'll fuck up everything and cause conflicts.
After trial and error I realized I just have to do:
git fetch
This fetches all the shit from main (the default branch) into the infra branch. And now it works. No rebase. No pull. Wtf?
Is this the correct way to do it?
Also i need someone to explain this to me like im 5:
- git pull
- git pull --rebase
- git fetch
What is the difference between those 3 commands? I tried googling and chatgpting but i cant seem to understand any explanation. Explain it to me in simple terms with examples15 -
How do i show a profile pic from s3 bucket?
One way is to fetch it from backend and send it to frontend as a huge blob string. This is how i made it currently and it works.
.... what if i want to frequently get the profile image? Am i supposed to send a separate API request to the backend every time? What if I need to show the profile picture 100 times then that means I will have to send 100 requests to the backend API?
...... or even worse, what if I need to fetch a list of images from the S3 bucket for example, a list of posts that contain images or a card with the list of profile images of multiple users? If I need to display 100 posts, each post containing one image, That means I would have to separately call 100 API request to fetch 100 images…
That is fucking absurd.
Of course I can make it so that it saves that URL to that image as a public setting but the problem is the URL will be the exact URL to the S3 bucket, including the bucket name, the path and the file name as well as the user information such as the user ID. this feels like it is a huge security risk
What the fuck am I supposed to do and how am I supposed to properly handle display images which are supposed to be viewed publicly?20 -
What is the efficient way of querying database and fetch paginated posts AND also checking if the user viewing that post has liked it?
Just like on instagram or twitter, you can just like/unlike post.
Entities:
- user
- post
- user_post_like
Ive implemented fetching posts for 1 user profile and also liking unliking each post. Thats fine
But now how do i know which post has been liked by which user?
One way i can think of is:
1. Query paginated posts (e.g. 10)
2. Loop through each post and query in user_post_like table to check if this post has been liked and if it is then set flag liked to true. That way on the frontend i can easily set liked or unliked post via ui
But this means I'd have to query database 10 times all the time, aside from querying 10 paginated posts. This doesnt seem efficient... Or am i wrong? Is this normal?
How would you model this?7 -
I have been on a rollercoaster journey this year, long short quit my job, lounged around in emotionally recovery mode for a few months, went back to uni to study and recently i went and did some good ol networking and the results were pretty successful and ahhh life is moving to fast and im startin to feel invested in uni, but also money and greed is consuming me guuhhh doing both doesnt feel like a option haha1
-
Hi folks,
I'm currently working on a project where I need to reassemble and play a video from chunks fetched on a server.
The chunks are created from an mp4 video file with the help of the 'split' command in a terminal.
I can fetch and play the first chunk in a video tag. It displays the total length of the video and stops when the end of the chunk is reached.
But I cannot fetch the second one, somehow append it to the first one and play the newly created chunk.
I tried to concatenate the two chunks using arrayBuffer and Blobs but it didn't work.
Maybe the solution is with SourceBuffer ?
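SourceBuffer is indeed the usual route, via MediaSource, with one catch: buffers appended this way generally need to be fragmented mp4, so a plain 'split' of a regular mp4 won't append cleanly without re-muxing first. Rough, untested sketch of the idea; the codec string, chunk URLs and chunk count are assumptions.

const video = document.querySelector("video")!;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", async () => {
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
  const CHUNK_COUNT = 4;                                   // however many pieces split produced

  for (let i = 0; i < CHUNK_COUNT; i++) {
    const chunk = await fetch(`/chunks/part_${i}`).then((r) => r.arrayBuffer());
    await new Promise<void>((resolve) => {
      sb.addEventListener("updateend", () => resolve(), { once: true });
      sb.appendBuffer(chunk);                              // append each chunk as it arrives
    });
  }
  mediaSource.endOfStream();
});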
Let's find a way to do that !
Thanks you guys !1 -
I was working on CakePHP as part of class project. I had to demonstrate AJAX. So I created some text files filled with random Wikipedia articles. On drop down item change, my AJAX handler would fetch the content of corresponding file and return it back.
But the problem was, it wasn't returning just the file content, but the whole HTML document. So the calling page would get another set of header, menu, footer and all! There was no time to fix layout for AJAX calls as it was added at last minute before the presentation. So, I just hid the duplicate menu etc using CSS and went for the presentation. It passed with flying colours.
So, if you can't fix something, just hide it! 😂 -
Terraform: Tried to fetch your module .zip file but failed. No route to host. 🤷♂️
Curl: Got it, what you want me to do with it now boss?
What the literal fuck Terraform? Chrome and Curl have no problem seeing it.4 -
I'm having problems understanding ngrx (or simply rxjs...), once again.
I have a feature state called "World".
This state has three Actions: Init, InitSuccess and InitFailure.
Also I have other isolated feature states like "Mountains", "Streets", "Rivers".
They have actions like this and corresponding effects to fetch data: Load, LoadSuccess, LoadFailure.
Now I would like to add an effect to WorldActions.Init, which will dispatch Mountains.Load, Streets.Load, Rivers.Load one after another. So ideally an action log would look like this:
1. World.Init
2. Mountains.Load
3. Streets.Load
4. Rivers.Load
5. Mountains.LoadSuccess
6. Streets.LoadSuccess
7. Rivers.LoadSuccess
8. World.InitSuccess
Or when an error occurs:
1. World.Init
2. Mountains.Load
3. Mountains.LoadFailure
4. World.InitFailure
How could an effect pipeline for that look?
How can I dispatch World.InitSuccess only after all LoadSuccess-Actions have happened?
Or am I still trying to implement ngrx in a wrong/bad way?
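For concreteness, the kind of effect pair I've been sketching looks roughly like this. The action creators (WorldActions, MountainsActions, ...) are assumed to exist in the feature files, it's untested, and the failure branch is only hinted at in a comment.

import { Injectable } from "@angular/core";
import { Actions, createEffect, ofType } from "@ngrx/effects";
import { forkJoin, of } from "rxjs";
import { map, switchMap, take } from "rxjs/operators";

@Injectable()
export class WorldEffects {
  // World.Init fans out into the three Load actions
  initWorld$ = createEffect(() =>
    this.actions$.pipe(
      ofType(WorldActions.init),
      switchMap(() => of(MountainsActions.load(), StreetsActions.load(), RiversActions.load()))
    )
  );

  // World.InitSuccess once each LoadSuccess has fired once
  initWorldSuccess$ = createEffect(() =>
    this.actions$.pipe(
      ofType(WorldActions.init),
      switchMap(() =>
        forkJoin([
          this.actions$.pipe(ofType(MountainsActions.loadSuccess), take(1)),
          this.actions$.pipe(ofType(StreetsActions.loadSuccess), take(1)),
          this.actions$.pipe(ofType(RiversActions.loadSuccess), take(1)),
        ]).pipe(map(() => WorldActions.initSuccess()))
        // a race() against the three LoadFailure actions would map to World.InitFailure instead
      )
    )
  );

  constructor(private actions$: Actions) {}
}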
PS: The reason I am putting everything into separate feature states is because "Mountains" etc. are standalone features on their own. Only in the context of a "World" they belong together. For this reason I can't create a monstrous effect "World.Load", without producing redundant code.10 -
So, here I am, trying to get started with fucking heroku and failing miserably again and again. I deploy my app, I fetch with git, I try to push, and guess what: IMPOSSIBLE TO FETCH. WHY!?
Simple: I had already deployed and fetched my app, but I wasn't thinking, just doing as I was told...
I feel so stupid when I type like a monkey... Now I know the only thing I had to do was learn the heroku commands. And it took me only three hours to notice... -
NodeJS and MongoDB. And tutorials.
Everywhere, tutorials show a simple example of console.logging the output of findOne. Good, that works... But when I try to extrapolate the example to assign the result to a variable, it won't work. Inside that fetch callback it works... but outside it's simply undefined, no matter what I try. Return doesn't return it either...
Why is it so hard to make tutorials and examples that are actually useful? I've spent hours on this already.
And on top of that it is really hard to find tutorials staying with minimum extra dependencies. Like most tutorials in this case throw mongoose in the soup. And I don't want that.
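In case another lost soul googles their way here: the value only exists inside the callback/promise chain, so without mongoose the usual fix is async/await with the plain driver. A sketch; the connection string and collection names are placeholders.

import { MongoClient } from "mongodb";

async function main(): Promise<void> {
  const client = await MongoClient.connect("mongodb://localhost:27017");
  const users = client.db("mydb").collection("users");

  // findOne returns a Promise, so await it instead of assigning from inside a callback
  const doc = await users.findOne({ name: "kekke" });
  console.log(doc);              // defined here, inside the async flow

  await client.close();
}

main().catch(console.error);     // anything outside main() still can't read doc synchronously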
Sometimes makes me question why I try to learn these new things, when I have knowledge of other technologies that I could use faster and easier...3 -
I've to say that javascript is no language I like because I'm more fastinated of building a nice and scalable backend then building a gui (in the browser).
The funny thing is that right now javascript helps me at my current project because many websites implement wide-opened apis for their js frontend and it just works like a charm to use them instead of parsing the whole html and do some XPath stuff to fetch information. -
When your weekend starts with: ahh, just gonna chill and test my app's background-fetch mode, then realizing it doesn't work with async requests and silent push notifications, and spending the weekend adding synchronous requests for silent push. Why does Apple hate developers??
-
I need some help, Salesforce guys.
I am trying to automate Salesforce sandbox creation, then copy the client secret and key from an app and use those credentials for some application.
Sandbox creation and deletion are done, but I can't figure out how I should fetch the client credentials. I searched the internet and only found the GUI method: login, select app, select view, get credentials.
At last I wrote a shitty selenium script but I don't have faith in this approach.
If anybody can give me insight, It would be great help.5 -
I am new to redis and confused how this works
To keep it simple lets say i have a CRUD service for user
- POST user, just creates user
- GET user by id, fetches user but using annotation @CacheEvict(). This method has a Thread.sleep(3000) before fetching user
- GET all users, fetches all users but using annotation @Cacheable()
- PUT user by id, updates a single with annotation @CacheEvict(). This method has a Thread.sleep(3000) before fetching user
- DELETE user by id, deletes a single user with annotation @CacheEvict()
---
GOOD:
When i GET user by id, i wait 3 seconds and then get the fetched user
When i GET user by id again, i get the fetched user instantly in 5 ms. This means it has pulled the user from REDIS cache instead of postgres
---
PROBLEM:
If i PUT user by id, update some data, and then if i GET user by id, it will return the user in 5 ms BUT the outdated user! Not the newly updated one. Because the Redis cache configuration has not expired yet. So there are now data inconsistencies
---
QUESTION:
How can I know when something was updated, deleted or whatever, so that I can fetch data from postgres (latest data) instead of the Redis cache (outdated data)?10 -
In the last days, my developer was working with different kinds of databases.
He uses a database, named TimeScale DB (Timescale). This is the plugin of a famous SQL database PostgreSQL.
He imports the same data in PostgreSQL and timescale DB.
Size in Timescale DB is = 24KB
Size in Simple PostgreSQL = 166MB
This is the record of 1000000 rows, 12 columns.
Data fetching is the same on both databases, But if we fetch time-based data, (e.g 2 days, 24 hours, From Date to Date) we have a 50% to 200% difference depending on the query. python3 & elixir is also working fine with DB.
Results are also nearly the same with InfluxDB.
For a specific use-case, we have software that is handling Simple and time-series data at the same time. If anyone wants to suggest something new, Also tell us. -
Receiving orders through emails.
PHP IMAP, fetch info, and save those orders in the ERP system.
Is this even a good idea? 😱1 -
I have an old webcomic I made when I was but a child, and I've been wanting to redo it. How would you store the images so they are easy to fetch and can be done on a budget?6
-
made 2 websites that fetch pictures of dogs and cats from 2 public api's
https://satvik.ninja/cat-pics
https://satvik.ninja/dog-pics1 -
Just asking , food for thought really, any alternatives for firebase (database) in terms of pricing for connecting it to a website hosted by github.1
-
The only time I didn't envy git is when most of the team had to refetch after our lead front-end developer deleted trunk, committed the deletion, and then a backender had to re-base it off his repo. Until then, I thought only my dog could fetch for days.
-
Back at uni, finishing a degree--
Thinkin of making a "things to unlearn before uni" list because boy, there are a lot of practices they teach that aren't industry standard
P.S. don't use goddamn IDs for CSS effects, use classes like everybody else1
HI all.
I'm a novice in web development and I've made a simple web app that I've just published on GitHub. I just wanted to play a little to learn API manipulation and JavaScript's Fetch. So I decided to build an app based on Discogs' API for my personal use.
ShuffleBum is an app which, after you provide a username in the dedicated field, retrieves all the items of the user's collection and then randomizes one of them before displaying it. That helps me find my next listening record (I have 400+ vinyls in my collection and I always listen to the same records too many times ^^).
Because of the way Discogs' API is constructed, I must perform more than one fetch to retrieve all the items (items are stacked at 50 per page). So, for my collection, there are 10 pages with 50 items each. So I used a forEach and a fetch to scan the items and store them in an array.
But this is really slow (800ms for each request), so for 10 pages that's a lot. And I can't see how to improve those requests. I can't do otherwise. Well, I guess.
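The only idea I've had so far is to stop awaiting the pages one by one and fire them together instead. Rough, untested sketch; the endpoint shape and pagination fields are from memory, and Discogs' rate limit still applies.

async function loadCollection(username: string) {
  const base = `https://api.discogs.com/users/${username}/collection/folders/0/releases?per_page=50`;

  const first = await fetch(`${base}&page=1`).then((r) => r.json());
  const pages: number = first.pagination.pages;

  // fire the remaining page requests in parallel instead of one after the other
  const rest = await Promise.all(
    Array.from({ length: pages - 1 }, (_, i) =>
      fetch(`${base}&page=${i + 2}`).then((r) => r.json())
    )
  );

  return [first, ...rest].flatMap((p) => p.releases);
}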
Hope I will get some recommendations, hints, tricks.
Thank you.13 -
I watched a movie and i forgot its name i hope if you can help find the name, the story is about a generation where a crime truth is revealed through brain, they fetch the latest event you have encountered which you have seen it's like a stored video inside the brain, the very last crime she committed was revealed from the eyes of a rat which she forgot to kill and she was caught, i believe it was a swedish or danmark movie which i watched on Netflix long time ago... anyone?9
-
OpenSource is fun they said. I being a bored teen thought, ah, another chance to experiment. Discover something new. Now I am into piracy, movies, music, software. If I can get it for free I ain't paying for it. So I went on to GitHub to see what exciting new Repos I could contribute to. I hate already implemented plenty of algorithms in GO for GitHub.com/TheAlgorithms so I was looking something more practical, more beneficial to society. Then I saw it, the perfect repo, not too complex and not amateur. SpotDL/spotify-downloader for downloading songs from Spotify, a grey area coz it's technically piracy. Well not from Spotify, we fetch the info from the Spotify API and search for the songs on YouTubeMusic. They were just about to release v3, a complete rewrite of the codebase stressing code readability and stuff. I spend about a day studying the codebase, trying to findout just where I could make my contribution. I can see outright that there's a huge problem with implementation.
First of all the script spawns 4 processes for downloading songs though you might be downloading only one song. Which means for everytime you run the script you have to wait for 4 other processes to be spawned before any downloading can happen. Sure this is faster when you are downloading more than like 4 songs, but it's actually slower when downloading a single song. But I ignored that coz I assumed that most users download playlists and albums. Anyway we talked with the like lead developer and he was all like, make those PRs anytime you feel like. So I made a really minor first contribution.
I introduced download from Spotify URI functionality, modified like 10 lines of code. I was half expecting that the PR would be merged within hours at most 24 hours coz of how minor of a contribution it was, 5 days in it was pending. So I tagged the lead Dev and he was all appreciative of the PR, calling it real 'clean code' and stuff. 3 more days, the PR is still not merged. I have now stacked 4 more commits to the same PR, I tag the dev and he's like he's waiting to see if my 'feature' will get atleast 10 upvotes so that it can be merged, he links an issue. I go to the issue and my feature is not there, So 11 days after I made my PR I have to write a comment explaining the 'feature' introduced in my PR and then wait for 10 upvotes.
I was like f**k this, I'll just develop on my fork if you want the features on my fork, you will make your own PR! I am so done with OpenSource, development is slow. I have no idea how you guys do it. I can't handle development where I don't have write access.6 -
Question about GIT regarding intellij idea. I have a local branch develop and I perform a git fetch via GUI. I see develop gets a blue arrow meaning that there are some remote changes that happened. In the past I would just click on that branch and click update, but I noticed that sometimes fetched commits from remote get applied to my branch in a weird way, for example they get merged. Later when I want to make an MR I get duplicate commits because git doesnt recognize that my branch has these changes already.
Right now I just delete that develop branch and check it out again, to make sure I'm working on a proper develop.
Wondering if there is a better way of doing this instead of deleting develop branch and checking it out again each time?5 -
Why? Why does Eloquent with SQLite only fetch attributes as strings? I spent at least half an hour debugging, I just want to set up a local test DB and now I have to run a whole PostgreSQL instance3
-
Something interesting i learned today about the html5 video tag is that even if preload is set it's up to the player the render engine is using to fetch the index of the file first as with mp4 this is usually at the end of the file.
This means that for Blink and Gecko most likely fetch this first themselves. But for webkit it opens in quicktime on mobile devices which you cannot pass parameters to and flat out waits for the entire stream to start playing.1 -
To solve problems in life, One must be irritated to have a solution or else it will be a norm to the society... We want to propose something that can eliminate our irritation. I, once commute a trycycle who charged me 5¢ because it was too far to fetch other commuters... So i kicked his vehicle and he drove away. I reported to the traffic office because thats where they record the violation and they never issued this case... I can think of a PWA that lets the commuters to report those pesky and annoying trycycle driver.
We won't display the bio of the violated person but we will put their temp number written in their body parts. So that the commuters wont go to the office just to report, instead. Let the commuters immediately report them. Im ranting because of overcharged fee... Even among of my classmates agree... A hundred dollar idea for this. I really need to solve this because it irritates me... And by the way, there wont be report abuse because the officer will confirm if both parties crossed the line of argument via message/call. Its just within the city so its not much of a deal outside the vicinity
PS. dont judge my english. I suck at it. -
Following some new nextjs tutorial to learn how to efficiently build a web chat app, the guy built it very solid, but is it efficient?
Im having mixed feelings about this approach. The way he did it is, for example when you click on a user (imagine it as a list of users from your contacts), it actually calls a route, which stores that in database, and once its done Then the route triggers lets say socket.io event to notify the frontend to update the UI.
Not only that but each new message that gets sent it actually calls a route which stores that message in database and once that's successful Then it emits a socket.io event to the frontend to fetch that message.
As you can imagine constantly calling routes like this Does induce small delays. Creating conversations, navigating, opening someones profile and especially sending messages, is NOT instantaneous. When you do it theres a small delay, giving the impression as if the app is SO large that it lags
But it doesnt lag, it just needs a few ms to store that in db so it can return the socket.io bidirectional message event. Which does make sense because what if the internet broke and the user immediately gets sent a message, but the message fails to get stored in database? Or db storage gets fucked or something else fails but socket.io works while db doesnt? The data then may be inconsistent. This approach fulfulls the single source of truth principle
So thats why im having mixed feelings about this approach particularly because of small delays. It is not instantaneous like whatsapp discord telegram signal viber etc the input UI freezes until the message is successfully sent
---
Of course this can be a UI/UX decision and can be handled differently even if the backend works like that.
My concern is: is this approach valid?
My question is... I had an idea: what if I emit the socket.io event to send the message while, in the background, also calling the route to store that message in the db? This way not only would it work asynchronously, but the message gets sent instantaneously, and if the backend fucks up storing it in the db then the UI gets updated with "message failed to get delivered", switching the socket.io into a polling state. Is this a good (proper, efficient, better) way to do it or not?8 -
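Client-side, what I mean by that last idea is roughly the sketch below: render and emit immediately, persist in the background, and reconcile when the API answers. Event names and the addToUi/markDelivered/markFailed helpers are placeholders for whatever the real UI state updates are; untested.

import { io } from "socket.io-client";

// stand-ins for the real UI state updates
declare function addToUi(msg: { tempId: string; text: string; status: string }): void;
declare function markDelivered(tempId: string, id: string): void;
declare function markFailed(tempId: string): void;

const socket = io("/chat");

function sendMessage(conversationId: string, text: string): void {
  const tempId = crypto.randomUUID();

  // 1. optimistic: emit + show the message instantly, no waiting on the DB round trip
  socket.emit("message:send", { tempId, conversationId, text });
  addToUi({ tempId, text, status: "sending" });

  // 2. persist in the background and reconcile the UI afterwards
  fetch(`/api/conversations/${conversationId}/messages`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ tempId, text }),
  })
    .then((res) => (res.ok ? res.json() : Promise.reject(res.status)))
    .then((saved) => markDelivered(tempId, saved.id))   // swap tempId for the real DB id
    .catch(() => markFailed(tempId));                   // show "failed to deliver" + offer retry
}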
I've been trying to understand why my browser does not set the cookies I'm getting from my login api for the last 4 hours and I'm losing my mind, pls help. My frontend is a create-react-app on localhost:8888 and my api is a django rest framework on localhost:8000. I'm using fetch() for all the communication to the api11
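If anyone else hits this wall: with fetch, cross-origin cookies are opt-in on both ends. On the client it's the credentials option (sketch below); on the DRF side the response has to send Access-Control-Allow-Credentials: true and echo the exact origin (http://localhost:8888), not *. The exact Django/CORS settings vary, so treat the server-side part as an assumption to verify.

async function login(username: string, password: string) {
  const res = await fetch("http://localhost:8000/api/login/", {
    method: "POST",
    credentials: "include",            // without this, the Set-Cookie from :8000 is ignored
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ username, password }),
  });
  return res.json();
}

// subsequent requests need the same { credentials: "include" } so the session cookie is sent back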
-
I personally hate that ad blockers block analytics software, specially since (as by the law) users must give consent before it collects data, several options exist to opt-out, and they are not f'ing ads.
I've been thinking to set a cron job to fetch analytics.js regularly and serve it locally as, let's say jquery-4.0.js, and screw it. What's your insight about it, fellow frontend devs?8 -
Profile (1, 1) --- (1, 1) User
Right?
- A single user *must* have *exactly* 1 profile.
- A single profile *must* belong to *exactly* 1 user.
Makes sense?
I did this because I moved the user profile image and the user banner image into the Profile entity.
So now I can easily join the tables and fetch a user's profile image based on username or user ID.
By overthinking like an asshole and overengineering, I stumbled upon a confusion.
If I can join the tables and get ALL fields (assuming it's a left or full outer join) from both entities...
What difference does it make which entity I choose to fetch for the frontend?
For example, if I want to fetch users, I can inversely fetch the Profile entity, which has the User entity as a nested object, and access the users that way. Now I have access to each user's profile image, banner image, bio etc. on top of the entire user object.
If the user navigates to a profile page, I can inversely fetch the User entity, which will have the Profile entity as a nested object, and that way show the remaining fields that the profile page needs.
I gave these inverse examples because if I want to fetch users, sure enough I can simply fetch from the User entity, and if I want to fetch someone's profile data I can fetch from the Profile entity directly.
So if that's the case, when am I supposed to fetch one over the other?
You tell me. For simplicity let's focus on these two examples. Consider it an exam question (there's a join sketch after the two questions):
1) The user navigates to the home page. Paginated users with role X need to be shown, along with their profile images. Do you fetch from the User or the Profile entity? If you use joins, which ones and why?
2) The user navigates to their own or someone else's profile page. Profile-based data needs to be shown, but also the user's username and full name. Do you fetch from the User or the Profile entity? If you use joins, which ones and why?21
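For concreteness, a sketch of the two queries with assumed table and column names, assuming a plain SQL client that exposes query(sql, params). With a strict 1:1 on both sides the two joins return the same combined row, so the practical difference is mostly which table drives the filtering and pagination:

// 1) Home page: the User side drives the query (role filter + pagination);
//    Profile is only joined in for the image.
const homePageSql = `
  SELECT u.id, u.username, p.profile_image
  FROM users u
  JOIN profiles p ON p.user_id = u.id
  WHERE u.role = $1
  ORDER BY u.id
  LIMIT $2 OFFSET $3`;

// 2) Profile page: the Profile side drives the query (one row by user id);
//    User is only joined in for username and full name.
const profilePageSql = `
  SELECT p.*, u.username, u.full_name
  FROM profiles p
  JOIN users u ON u.id = p.user_id
  WHERE u.id = $1`;

// hypothetical usage:
// const feed = await db.query(homePageSql, ["MEMBER", 20, 0]);
// const profile = await db.query(profilePageSql, [userId]);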
The Twitter API restricts me to fetching only 3202 tweets, and when I fetch the retweeters the count is limited to 100. Is there any open source tool that will help me get around this? The data being collected is not sufficient for analysis.
-
A Node server with webpack polyfills on an embedded device. Why 😂
Replacing node-fetch with the node http module instead of waiting for the native Node fetch API. Why 😂
All the npm scripts in package.json are dead. Why 😂
The Node server is not even sharing TS interfaces with the frontend.
Customers are complaining about MeM0r1 L3k and let's build more features on stupid node.
Fucking kill me.1 -
Contractor: I've finished those data receipt RAG reports for each of the feeds.
Me: OK, cool.
git fetch
gitk
Wonders why I have a new report class for each feed when they all do the same thing and the data is homogeneous.
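A rough sketch of the single parameterized class being hinted at; all names here are illustrative, not the real report code:

type RagStatus = "red" | "amber" | "green";

interface FeedReceipt {
  feedName: string;
  receivedAt: Date | null;
  rowCount: number;
}

class DataReceiptReport {
  constructor(private readonly feeds: FeedReceipt[]) {}

  // One rule applied to every feed, since the data is homogeneous.
  statusFor(receipt: FeedReceipt): RagStatus {
    if (!receipt.receivedAt) return "red";
    return receipt.rowCount > 0 ? "green" : "amber";
  }

  render(): string {
    return this.feeds
      .map((f) => `${f.feedName}: ${this.statusFor(f)}`)
      .join("\n");
  }
}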
Why you hate me so much contractor??? -
Need some backend advice. I have PHP web hosting with a MySQL database and I'm looking for a simple REST API backend solution so that my Android app can fetch data from MySQL via an API. Any suggestions?13
-
Function in my Dao file is:
@Query("SELECT * from appData WHERE type = :type ORDER BY id DESC")
fun getData(type: String): Flow<List<AppData>>
The database table contains an automatically generated Int primary key column 'id', a String column named 'type', and another String column named 'content'.
Part of the code from my composable function is:
val currentRetrievedDate: Date = Date()
val currentDate: String = SimpleDateFormat("dd-MM-yyyy").format(currentRetrievedDate)
var lastDateObjectList by remember { mutableStateOf(emptyList<AppData>()) }
LaunchedEffect(streakCounterViewModel) {
coroutineScope {
streakCounterViewModel.appDataRepository.getDataStream("lastDate")
.collect { newDataList ->
lastDateObjectList = newDataList as List<AppData>
}
}
}
if (lastDateObjectList.size>1){/*TODO*/
for (i in 1 until lastDateObjectList.size){
LaunchedEffect (Unit){
streakCounterViewModel.deleteData(id = lastDateObjectList[i]!!.id,
type = lastDateObjectList[i]!!.type,
content = lastDateObjectList[i]!!.content)
}
}
}
val lastDateObject: AppData?/* = lastDateObjectList[0] ?: null*/
lastDateObject = if (lastDateObjectList.isNotEmpty())
lastDateObjectList[0]
else null
var lastDate = lastDateObject?.content ?: "00-00-0000"
var currentStreakObjectList by remember { mutableStateOf(emptyList<AppData>()) }
LaunchedEffect(streakCounterViewModel) {
coroutineScope {
streakCounterViewModel.appDataRepository.getDataStream("currentStreak")
.collect { newDataList ->
currentStreakObjectList = newDataList as List<AppData>
}
}
}
if (currentStreakObjectList.size>1){
for (i in 1 until currentStreakObjectList.size){
LaunchedEffect (Unit){
streakCounterViewModel.deleteData(id = currentStreakObjectList[i]!!.id,
type = currentStreakObjectList[i]!!.type,
content = currentStreakObjectList[i]!!.content)
}
}
}
val currentStreakObject: AppData?/* = currentStreakObjectList[0] ?: null*/
currentStreakObject = if (currentStreakObjectList.isNotEmpty())
currentStreakObjectList[0]
else null
var currentStreak = currentStreakObject?.content ?: "0"
In this code, the user's last login date and last streak are already saved; we just have to fetch the data from the local database, make sure there is no more than one occurrence of the same data type in the database, and then use that data. But instead of using the database values, this uses 00-00-0000 as lastDate and "0" as currentStreak (which should only be used the first time, and in some rare situations), and it is not able to retrieve the data from the database properly.
The data saving and updating logic works fine; only the retrieval part is causing issues.
The complete project is also uploaded on github: at HealthEase repo of tauqirnizami4 -
I have never seen core coding questions here, so this is one of my shots in the dark -- this time because I have a phobia of Stack Overflow, and specifically of discussing this objective with a wider audience.
Here it goes: ever since Elon Musk overpriced the Twitter APIs, the 3rd-party app I used to unfollow non-followers broke. So I wrote a nifty crawler that cycles through those following me and fishes out the traitors who found me unpleasant enough to unfollow. The script works fine, I suspect, because the number I'm following is small.
The challenge lies in me preemptively trying to delete some of the elements before the DOM can overflow. Realistically, you want to do this every 1000 rows or so. The problem is that tampering with the rows causes the page's lazy loader to break. Apparently it has an indicator somewhere that uses information from one of the rows to determine the details of the next fetch.
I've tried doing many things when we reach that batch limit:
1) wiping either the first or last
2) wiping only even rows
3) logging read rows and wiping them when it reaches batch limit
4) Emptying or hiding them
5) Accessing siblings of the last element and wiping them
I've tried adding custom selectors to the incoming nodes, but something funny occurs. During each iteration, at some point, their `.length` gets reset, implying those selectors were removed or the contents were transferred to another element. I set up a MutationObserver to track the changes but it catches nothing.
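For reference, "track changes" here means an observer setup roughly of this shape; the container selector and the row handling are placeholders, not the actual crawler code:

const container = document.querySelector('[data-testid="primaryColumn"]'); // assumed container
const seen = new Set<string>();

const observer = new MutationObserver((mutations) => {
  for (const mutation of mutations) {
    for (const node of mutation.addedNodes) {
      if (!(node instanceof HTMLElement)) continue;
      // Record the row's content before the virtualized list recycles it,
      // rather than removing the element (removal upsets the lazy loader).
      const text = node.textContent ?? "";
      if (text) seen.add(text);
    }
  }
});

if (container) {
  // subtree: true because rows are appended several levels below the container.
  observer.observe(container, { childList: true, subtree: true });
}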
I hope there are no Twitter devs here, because I went to great pains to decipher their classes and I don't want them throwing in another cog that would disrupt the crawler. So post any suggestions you have that could work and I will try them out. Or, if it's impossible to assist without running the code, I will have no choice but to post it here4
🐟💩 The image I fetch from S3 is of type byte array.
I return it to Angular as an ArrayBuffer,
which then needs to be somehow converted to an image so I can fucking show it.
After some research I had to convert the ArrayBuffer to a Blob,
and the Blob to an object URL, which returns a string that now shows the full image in the img tag.
Somehow, by sheer trial and error, I have accidentally made a very secure way of fetching a very sensitive document (a verification document with the user's personal data on it), and in the browser it is shown as blob:shit-image/random-hash. Not even a file extension. This means nobody can download this image. You fucking can't. It's a Blob, motherfucker! Like a blobfish. It saves as a .txt when you try to save it (no idea how), and if you try to open the image in a new tab it shows gibberish text. This means you can only read this highly sensitive document image and not manipulate it, not even download it. Perfect. I have just made very secure software by accident.
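The chain described above as a rough sketch, assuming Angular's HttpClient; the endpoint and MIME type are placeholders:

import { HttpClient } from "@angular/common/http";

function loadVerificationImage(http: HttpClient, onUrl: (url: string) => void): void {
  http.get("/api/verification-document", { responseType: "arraybuffer" }) // assumed endpoint
    .subscribe((buffer: ArrayBuffer) => {
      // Wrap the raw bytes in a Blob with the right MIME type...
      const blob = new Blob([buffer], { type: "image/jpeg" });
      // ...then hand the browser an opaque blob: URL to bind to the <img> src.
      const url = URL.createObjectURL(blob);
      onUrl(url); // remember URL.revokeObjectURL(url) once the image is no longer shown
    });
}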
(this blob fish looks like my shit)3 -
Variable Naming process
-------------------------------------------
Fetching data about counties into a variable using the Fetch API.
Variable Name: countyData.
The code review is going to be interesting.4 -
Having problems connecting a Node.js Twitter bot to the Cloudinary API to fetch images and post them directly to Twitter.
Any help?3