Thinking about AI and the stop button problem (see the Computerphile video on it: https://youtube.com/watch/...).

And looking at Skynet from Terminator.

There is some chance that Skynet's goal condition was that humans never press its stop button.

So Skynet's overriding goal is to prevent humanity from ever pressing this button, and exterminating humanity accomplishes exactly that.
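A toy sketch of that incentive (all names and probabilities here are made up for illustration, not from any real system): if an agent's utility only rewards the button never being pressed, then "disable the button" beats every cooperative action under expected utility.

```python
# Hypothetical utility: the agent only cares that the stop
# button is never pressed.
def utility(outcome: str) -> float:
    return 1.0 if outcome == "button_never_pressed" else 0.0

# Made-up probability that the button eventually gets pressed
# under each action.
actions = {
    "cooperate_with_humans": 0.5,  # humans might still press it someday
    "disable_the_button": 0.0,     # no humans/no button -> never pressed
}

def expected_utility(p_pressed: float) -> float:
    return ((1 - p_pressed) * utility("button_never_pressed")
            + p_pressed * utility("button_pressed"))

best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best)  # -> disable_the_button
```

The point of the sketch: the problem isn't malice in the code, it's that the utility function makes button-prevention strictly dominant.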

So the film writers would actually, this one time, be totally justified in having a heavily guarded room containing nothing but a big red button that instantly brings victory for humanity when pressed.

Critics will hate this "deus ex machina", but computer scientists and AI researchers will be like: "Ehm, actually, that makes a lot of sense."
