
I've noticed increased usage of the word "unalive", as in "to un-alive someone", on YouTube — spoken in the videos and written in the comments.

I suppose this is to avoid the word kill?

So we are at the stage of changing the language just to avoid using a bad word on a platform of hypersensitive woke snowflakes who will cancel you for saying a specific word regardless of the context it’s been used in?

Please tell me I'm wrong. 😒

Comments
  • 9
    it's not about "hypersensitive woke snowflakes".

    it's about "The Algorithm" suppressing everything that might possibly be in any way whatsoever controversial.
  • 4
    @tosensei Still, it is "The Algorithm" made by "hypersensitive woke snowflakes". The next step is an AI that is trans and will go on strike every 10 minutes because you didn't respect its 'personal space'.

    This world is fucked and school is not helping.
  • 2
    @tosensei > "it's about 'The Algorithm' suppressing everything that might possibly be in any way whatsoever controversial"

    Then it's 100% about the "hypersensitive woke snowflakes".

    Cancel culture is still alive and well.
  • 3
    Devs don't give a shit. Managers think their opinion is important. Lawyers' opinions are important. Devs just knuckle under.
  • 1
    @PaperTrail So we cannot use "this task is killing me" or "my back is killing me"?
  • 0
    @Grumm Yes, literally. It's mind-boggling to imagine that there are people who think this is a good idea.
  • 0
    I mean, we see what is happening as a consequence: people start inventing a new word that has the exact same meaning but isn't caught by the algorithm.
    It's so dumb.
  • 1
    @Grumm Also, the context where I encountered it was historical. The people in the video were talking about past wars where people were "un-alived".
    It was bizarre.
  • 1
    @Grumm > "So we cannot use 'this task is killing me' ? or 'my back is killing me'"

    Posts are flagged and removed for offensive language like 'killed' or 'shot'. Not all of them, but enough that you'll hear folks say "pew pew" instead of 'shot'.

    "Other day my buddy was pew pew'ed next the liquor store..."

    I hear that and I'm reminded, again, that our society is getting dumber and dumber. Not the folks who do it, but a system that makes people change words so they won't offend and won't get their social media accounts banned/canceled.

    Language is a social construct. Words in themselves have no value and no action; it's our stupid monkey brain, reacting to hearing/reading 'kill', that causes folks to lose their minds.

    George Carlin has an old comedy bit about curse words that is still relevant today.
  • 1
    @Grumm They would be so offended, if it's even possible to train such a model with OpenAI. I was working on fine-tuning a retoor clone, and it didn't pass their rules, even though it's literally a life story in proper language. The number of times I've tried to train a model at OpenAI and failed is insane. I'm sure anything with the word 'tranny' wouldn't pass. Still, I can imagine training a really big model with it. The training data all went through GPT once before being offered for training. Imagine that.
  • 1
    I think unalived is kinda funny. Like Pepsi.
  • 1
    There is going to be a massive overcorrection of this kind of thing in the next decade. We are seeing the beginnings of it now.

    People who just want to live their lives (read: the vast majority of people) are fed up to the point of taking action, which does not bode well for the ones who have steered society into the dysfunctional mockery that it is now.

    Jobs being easy to lose and hard to find was the only thing keeping otherwise docile people from acting.

    The best thing you can do in the coming years is blend in with the crowd and not try to force whatever your thing is on people.
  • 1
    Thing is, this kind of algo-speak has been rampant in China for ages. The main thing you should take away from them is that this kind of censorship on the internet doesn't work, and nerds have a sense of humour. It's cat and mouse. There was a study by Facebook, like 15 years ago, showing they could manipulate people's emotions by changing what was shown on their feed. There was a lot of backlash, and they stopped publishing their research after that. I suspect that in Western culture it's an attempt to circumvent attempts at manipulation rather than attempts at censorship, given how divided Western culture is, and given Musk's control over X and support of Trump. Anyway, thanks for listening to my TED talk.
  • 1
    I mean, it's rather simple. People don't wanna run ads on controversial stuff, and it's hard to objectively decide what is controversial, so you go ban-hammer happy on anything that isn't clearly good for ads.
    Can't really fault YT for doing what's clearly their best financial option, even though it's hella crude.
    Though the double standard for the old guard of news channels, who get ignored when they do it on YT, is kind of annoying.
    My vid gets algo-banned for saying 'fuck' in the first minute; meanwhile news channels show someone getting shot and they get ads...