Comments
jestdotty · 249d:
heuristic engines are liable to cognitive biases.
all animals survive better if they evolve to account for their cognitive biases, because then this mirrors reality better and improves their survival rates.
imo the AI engineers are the nuttiest people there are. take a bunch of biased people and have them write AI. it's like the meme in Westworld where they built the great judgement-of-all-humans AI, and it was built by a schizophrenic, so it was schizophrenic.
jestdotty · 249d:
...ok ok, technically not all animals survive better if they account for their cognitive biases.
humans are fascinating: due to our advanced communication skills, it seems to me we've evolved the art of lying to one another to steal from each other. not deliberately: first you delude yourself, then you effectively use your delusions as lies without consciously lying. so the most delusional can just leech off others, as a feature. it works until civilization gains a critical mass of neurosis and falls apart.
how would you even fix that problem?! so crazy!
tosensei · 248d:
the reason is simple: it's not a calculator.
it's an answer predictor: a huge bunch of very carefully weighted dice whose results are translated into a string.
AlgoRythm · 247d:
You've pierced the thin veil that hides the almost uselessly stupid machine.
Ask it to do any trivial but unique task: spell a word backwards and remove the vowels, or count down from 100 but skip each number that rhymes with "shoe". Tasks a human could do with no effort. It will fail spectacularly, because it is not thinking. It's generating a response it thinks you will enjoy, based on watching humans respond to each other. That's all.
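For contrast, the "trivial but unique" tasks above are a few lines of ordinary code. A minimal Python sketch (the rhyme check is crudely approximated as "number ends in two", purely for illustration):

```python
def backwards_no_vowels(word: str) -> str:
    """Spell a word backwards, then drop the vowels."""
    return "".join(ch for ch in reversed(word) if ch.lower() not in "aeiou")

# Count down from 100, skipping numbers that "rhyme with shoe".
# Illustrative approximation: anything ending in the word "two" (2, 12, 22, ...).
RHYMES_WITH_SHOE = {n for n in range(1, 101) if n % 10 == 2}
countdown = [n for n in range(100, 0, -1) if n not in RHYMES_WITH_SHOE]

print(backwards_no_vowels("devRant"))  # tnRvd
print(countdown[:5])                   # [100, 99, 98, 97, 96]
```

Deterministic, instant, and correct every time, which is exactly the property the comment says the model lacks.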
So TIL ChatGPT can't COUNT!! Since my eyes suck, I asked it to return the last 24 chars of a string, just as a quick sanity check, and it gave me back a substring of only five. Went back to correct it and it returned a substring of 7 😫😫😫😫!!!!
I gave up 🤬. It always fails basic calculations for some fucked reason
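The sanity check itself is a one-line slice in Python, so it's worth doing deterministically rather than asking a text predictor (the string here is just a placeholder):

```python
s = "the quick brown fox jumps over the lazy dog"

# Negative slice: the last 24 characters, guaranteed by the language spec.
last_24 = s[-24:]

print(len(last_24))  # 24
print(last_24)
```

Slicing never miscounts, which is the whole point: character-level operations are exactly where token-based models are weakest, since they see chunks of text rather than individual characters.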
random
teachmecodeisbetteratmaththangptforsomereason