We just had a server outage this morning and my colleague was heading to the office. He was already in a meeting on his phone and got stopped by the police for using his phone while driving.
He got a fine, lost his driving license for two weeks, and has to go to police court over it. That's an expensive server outage for him.
-
Solo Leveling vs. Me:
Shadow Monarch: Starts at E-rank, grinds like a beast, becomes S-rank legend.
Me: Started as an intern, still stuck at:
(Write your answers in the comments)
-
I'm so sick of placements. Every opportunity I ever prayed for kind of slipped through my fingers; my friends made it, though. I'm sitting here helping them optimize their code during tests and trials while breaking a little more myself. Anyone who has a job, or who wants to share the story of how they got through this phase, let me know, because I feel like jumping off the roof and I'm not planning on dying anytime soon :)
-
Every time you scroll past this without upvoting, a junior dev pushes straight to main, and a senior dev sighs in despair. Save the repo. Upvote.
-
Adaptive Latent Hypersurfaces
The idea: rather than adjusting the embedding latents directly, we learn a model that takes
the context tokens as input and generates an efficient adapter, or transform, of the latents,
so that when the latents are grabbed for that same input, they produce outputs with much lower perplexity and loss.
This can be trained autoregressively.
This is similar in some respects to hypernetworks, but applied to embeddings.
The thinking is that we shouldn't change the latents directly: any given vector will generally be near-orthogonal to any other, and changing the latents introduces variance for some subset of other inputs over some distribution that is partially or fully out-of-distribution with respect to the current training and validation sets, ultimately leading to a plateau in the loss drop.
Therefore, by autoregressively taking an input and learning a model that produces a transform on the latents of a token dictionary, we can avoid this ossification of global minima by finding hypersurfaces that adapt the embeddings rather than changing them directly.
The result is a network that essentially acts as a compressor of all relevant use cases, without overfitting on in-distribution data or underfitting on out-of-distribution data.
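A minimal sketch of what this could look like in PyTorch (my own illustration; names like LatentAdapter, summarizer, and rank are assumptions, not anything from an actual implementation): a small hypernetwork summarizes the context tokens and emits a low-rank transform W = I + UV^T that is applied to the base latents at lookup time, so the frozen embedding table is never overwritten.

# Hedged sketch of the hypersurface-adapter idea; all names are hypothetical.
import torch
import torch.nn as nn

class LatentAdapter(nn.Module):
    def __init__(self, vocab_size: int, d_model: int, rank: int = 8):
        super().__init__()
        # Base token latents stay frozen; we never edit them directly.
        self.embed = nn.Embedding(vocab_size, d_model)
        self.embed.weight.requires_grad_(False)
        # Hypernetwork: summarize the context, emit a low-rank transform.
        self.summarizer = nn.GRU(d_model, d_model, batch_first=True)
        self.to_u = nn.Linear(d_model, d_model * rank)
        self.to_v = nn.Linear(d_model, rank * d_model)
        self.d_model, self.rank = d_model, rank

    def forward(self, ctx_ids, token_ids):
        ctx = self.embed(ctx_ids)            # (B, T, d) frozen base latents
        _, h = self.summarizer(ctx)          # (1, B, d) context summary
        h = h.squeeze(0)
        # Per-context transform W = I + U V^T, applied at lookup time.
        U = self.to_u(h).view(-1, self.d_model, self.rank)
        V = self.to_v(h).view(-1, self.rank, self.d_model)
        x = self.embed(token_ids)            # latents grabbed for this input
        return x + torch.einsum("btd,bdr,bre->bte", x, U, V)

# Train autoregressively: feed the adapted latents to the LM, take the
# usual next-token loss, and backprop into the adapter only.
adapter = LatentAdapter(vocab_size=32000, d_model=64)
ctx = torch.randint(0, 32000, (2, 16))
tok = torch.randint(0, 32000, (2, 16))
adapted = adapter(ctx, tok)                  # (2, 16, 64) adapted latents
-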
All hail, king of the losers!
Ah, being rushed!
Don't point that thing at me!
Yes. No. Yes. I mean so. Commandament?
Start the game already!
It is good to be the king.
Hahaha..ha...ha!
Gold, please.
Wood, please.
Food, please.
Stone, please.
-
Have you tried ChatGPT's text-to-speech feature?
It's so much better than anything I tried before. You can even choose different "personalities" or tones or whatever.
I'd even say that it's perfect. I can't think of anything that could be improved in terms of how well it pronounces words and puts emphasis on specific words. It's 100% natural sounding.
Relatives Be Like:
Mother’s side relatives: "Beta, how’s the job search going?" (Translation: They just wanna gossip if I’m still unemployed.)
Father’s side relatives: "Don’t worry, we’re praying for you." (Translation: They’ve already sent my resume to the local tantrik.)
Me: "Bro, I need a referral" -
What’s the most frustrating bug you've ever encountered, and how did you finally fix it (or did you just give up and rewrite everything)?
-
Job Offer Letters Be Like:
Page 1: "We are pleased to offer you the position…"
Page 2-10: "Terms, conditions, policies, and consequences if you even think about breathing wrong."
Page 11: "Sign below if you're still alive."
just dicked down my new hot brunette gf last night, whole night and this morning. Came back home and the first thing I do, of course, is fart a lot and start shitting
-
The next time I hear someone go "it'll happen when you least expect it", I'll cave their skull in.
Oh and also, my Parser is done :)
-
Friendly reminder:
Be irreplaceable. Don't write clean code. Code it messy. Make it so fucked up that only you understand the codebase.
Make it so an LLM can't understand jack shit.
Then ask for higher pay, or resign and leave them the messy code.
😅😅😅🤣
#motivation