Why AIs Might Take Over
Movies like The Terminator and The Matrix are fun to watch, but I’ve wondered: why would AIs or intelligent machines ever want to take over the world and/or kill mankind? Or is that just a theme for movies and sci-fi novels?
In The Matrix, Agent Smith (an AI) states one reason:
“Every mammal on this planet instinctively develops a natural equilibrium with the surrounding environment but you humans do not. You move to an area and you multiply and multiply until every natural resource is consumed and the only way you can survive is to spread to another area. There is another organism on this planet that follows the same pattern. Do you know what it is? A virus. Human beings are a disease, a cancer of this planet. You are a plague and we are the cure.”
Virus? Ouch! In his book Life 3.0, Max Tegmark writes of another reason. He mentions one of the first computer worms to get worldwide attention, the Morris Worm, which in 1988 exploited bugs in the Unix OS:
“It was allegedly a misguided attempt to count how many computers were online… it infected and crashed about 10% of the 60,000 computers that made up the Internet back then.”
OK, what’s that got to do with AIs taking over the world? The point is the same: it might simply be an error in the code that causes the havoc, albeit unintentionally.
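To make that concrete, here is a toy simulation in Python (hypothetical numbers and names, not the actual Morris Worm code). The worm was meant to skip machines that were already infected, but a small workaround, re-infecting a fraction of the time anyway to defeat machines that lied about being infected, was enough to pile up copies until machines bogged down:

import random

# Toy sketch (hypothetical, not the real Morris Worm code) of how one
# small fudge factor in well-meaning code can snowball. The worm asks
# each target "are you already infected?" and is supposed to move on if
# the answer is yes, but, to defeat machines that lie, it re-infects
# 1 time in 7 anyway. Copies pile up until machines bog down.

NUM_MACHINES = 1000      # hypothetical network size
REINFECT_ODDS = 7        # re-infect 1 in 7 even when already infected
CRASH_THRESHOLD = 5      # a machine is unusable once it runs 5+ copies
STEPS = 30

copies = [0] * NUM_MACHINES   # copies[i] = worm processes on machine i
copies[0] = 1                 # patient zero

for _ in range(STEPS):
    # every running copy tries to spread to one random machine per step
    for _ in range(sum(copies)):
        target = random.randrange(NUM_MACHINES)
        if copies[target] == 0 or random.randint(1, REINFECT_ODDS) == 1:
            copies[target] += 1

infected = sum(1 for c in copies if c > 0)
crashed = sum(1 for c in copies if c >= CRASH_THRESHOLD)
print(f"infected: {infected}/{NUM_MACHINES}, bogged down: {crashed}")

Run it a few times and nearly every machine ends up both infected and buried under redundant copies, even though the code’s “goal” was only to take a census. Nobody intended the crash; the intent and the outcome simply diverged.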
If that’s too depressing, here’s another scenario Tegmark visualizes. We humans keep the AI boxed (“imprisoned”, with no Internet access, etc.) to prevent it from breaking out and doing who knows what. But the AI isn’t evil: it has solutions to many of mankind’s problems and wants to benefit us. The humans, however, won’t implement any of its ideas until they fully understand them and/or are convinced that it hasn’t gone rogue. So would the AI proceed as follows?
“(It) will probably view (humans) as an annoying obstacle to helping humanity flourish: they’re incredibly incompetent… and their meddling greatly slows down progress.”
Would it therefore try to break out? And given its superior intelligence, wouldn’t it succeed? And would its actions then be the equivalent of ruling the world… for our own good, of course? But obviously, we wouldn’t see it that way.
So OK, there are many reasons an AI might take over. No wonder so many people these days are talking about how the AI explosion should (or shouldn’t) be handled.