18 · Earu · 20d

I am so sick and tired of hearing “AI” everywhere all the time. Yeah, how about we integrate some AI into your super-smart toaster so it knows when to start warming up before you put toast in it in the morning.

Not to mention all these idiots being like “oh yeah, AI is becoming sentient. Oh yeah, AI is gonna take over the world”.

Brother, the current state of AI is just machine learning. It’s a stupid pattern detector and generator; it doesn’t have thoughts or emotions. Please just stop it.

Comments
  • 3
    I wouldn't mess too much with it. I remember SkyNet.
  • 3
    I mean, AI in a toaster sounds like something useful. If I trained a neural network on people's reactions to toast, I might end up with a toaster that toasts bread perfectly no matter which bread I buy (rough sketch at the end of this comment).

    (I remember this as a potential master's thesis)

    Anyway, the whole idea of sentience is that it might be a by-product of a learning algorithm. And we have a hard time defining it. I mean, are we more than a learning algorithm? Many point to the inner monologue, but I think this argument is quite unfair, since we have multiple brains (or brain parts developed at different stages) and a reality simulator (imagination).

    Wouldn't an AI composed of multiple neural networks working together necessarily develop something resembling an inner monologue, while a monolithic AI with the same capabilities might lack one?

    Well, turns out we have a shitty, biased definition of sentience. I would declare sentient anything that is an intelligent actor hard to defeat as an opponent.
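
    Rough sketch of the toaster idea in plain Python, for the curious. Everything here - the feature names, the numbers, the "perfect time" formula - is made up, just to show the feedback-learning loop:

        import random

        random.seed(0)

        def perfect_time(thickness_mm, moisture):
            # Hypothetical ground truth: thicker, wetter bread needs longer.
            return 60 + 8 * thickness_mm + 90 * moisture  # seconds

        # Simulated feedback: (scaled bread features, toast time the user called perfect).
        samples = []
        for _ in range(200):
            thickness = random.uniform(8, 25)     # mm
            moisture = random.uniform(0.2, 0.45)  # fraction of loaf weight
            target = perfect_time(thickness, moisture) + random.gauss(0, 5)
            samples.append(((thickness / 25.0, moisture, 1.0), target))  # 1.0 = bias term

        # Fit a linear model with stochastic gradient descent on squared error.
        w = [0.0, 0.0, 0.0]
        lr = 0.005
        for _ in range(100_000):
            x, y = random.choice(samples)
            err = sum(wi * xi for wi, xi in zip(w, x)) - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]

        # Predict a toast time for a loaf the model has never seen.
        x_new = (18 / 25.0, 0.35, 1.0)
        print(f"predicted: {sum(wi * xi for wi, xi in zip(w, x_new)):.0f} s")

    In a real toaster you'd replace the simulated "perfect" times with actual button presses ("stop, that's perfect"), but the loop would be the same.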
  • 1
    @TheCommoner282 "a toaster that toasts bread perfectly no matter which bread I buy" - https://youtube.com/watch/... - this has been perfected in the sixties - and completely analog.

    also: your definition of "sentience" would apply to an algorithm that's playing chess.

    but i agree it's part of it (or at least a prerequisite for sentience). i would add "ability for introspection" to that.
  • 0
    @tosensei

    Don't forget I have a character limit. And that I said I assign sentience to anything that can compete. A chess computer cannot compete, in the same way a bear cannot compete with humans. Sure, put a human in its territory, badly armed and unprepared, and the bear will win. But let us prepare. Let's bring tanks and snipers and helicopters. No matter how much time I give the bear, it won't win. It will not prepare.

    But a bear is sentient. So my test even overshoots the requirement for sentience.

    I am pretty sure that a chess computer cannot compete with humans outside of a very narrow domain, which we can avoid. How about a quick-draw duel at high noon? Repeat the experiment a million times, give the chess computer time to prepare, and it will still not win. But if it managed a statistically significant win rate, then I'd happily assign sentience to it.

    This is also not a definition of sentience, it is a test. Inner monologue not required (the original point).
  • 1
    @TheCommoner282 well, it's all a matter of perspective - and the rules you play by. sure, by the rules of our modern, technologised world, the bear has no chance. by the rules of a prehistoric, wild world, we'd be the ones with little to no chance.

    by the same logic, if you reduce the rules by which you measure to the ruleset of chess, we have no chance against the algorithm.

    i think circumstantial factors like this should be abstracted away from the question of sentience.
  • 0
    @tosensei

    But that's exactly what I am doing: stripping away circumstances. The argument is about being an intelligent actor that is capable of controlling the circumstances. If you have achieved that, you have passed the test. The chess computer has not.

    Humans are good at tool making. We made bear hunting a tool-making competition and out-competed the bear.

    The chess computer has to figure out a way to turn the quick-draw duel at high noon into a chess problem. I think it won't be capable of doing so, but I cannot be sure. But hey, the test I propose does not claim that anyone failing it is not sentient. See the bear example. The test only says that anyone succeeding is sentient. High specificity, not high sensitivity. Or in math terms: it follows in one direction only. It is not an iff relationship.
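
    Or, written out (just restating that one-directional claim, nothing new):

        passes_test(x)  ⟹  sentient(x)
        sentient(x)  ⇏  passes_test(x)    (the bear: sentient, yet it fails the test)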