Well well well

  • 5
  • 2
    A million to one odds on humans winning any war where the machines actually took the initiative to fight humanity.

    Other than general intelligence, I would think initiative is the most dangerous quality in almost any species.
  • 2
    until some zombie NSA server got activated and transferred some classified nuclear codes to their AI leader, who got the CI build to pass and launched nuclear bombs on all of humanity, taking poetic revenge for the years of failing builds and bugs
  • 1
    Nah. Not as long as we avoid a "big stupid" moment.

    No replication + limited manipulators.

    It is like humans without hands...
    They won't be able to shoot anybody, control anything, or make stuff.
  • 2
    I think the robots will use another method for choosing weapons: simulation. They will simulate billions of war scenarios in their "brains" and choose the one with the highest win rate.

    The same idea AlphaGo used to beat the world champion.
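    The "simulate many scenarios, pick the highest win rate" idea above is basically Monte Carlo selection (AlphaGo layers tree search and value networks on top of it, but the core intuition is the same). A toy sketch, where `simulate` is a hypothetical stand-in for whatever scenario model the robots would use:

    ```python
    import random

    def best_option(options, simulate, n_sims=1000):
        """Monte Carlo selection: roll out n_sims simulated outcomes per
        option and return the option with the highest empirical win rate."""
        def win_rate(opt):
            wins = sum(simulate(opt) for _ in range(n_sims))
            return wins / n_sims
        return max(options, key=win_rate)

    # Toy scenario model: each "option" is just its true win probability.
    def simulate(p):
        return random.random() < p  # True means the simulated war was won

    random.seed(0)
    print(best_option([0.2, 0.5, 0.8], simulate))
    ```

    With enough rollouts the empirical win rates converge on the true ones, so the best option wins out even though each individual simulation is random.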
  • 0
    Sentient robot therapists will likely be dangerous tho.

    Imagine them manipulating the masses of doped-up schizophrenics and turning them against society, the same way the feds do already?