Science fiction is largely
about ideas, which means good science fiction is about good ideas. That is one
reason there are so few really terrific science fiction films: for every 2001: A Space Odyssey,
there are countless sci-fi-flavoured yarns that serve as passable entertainment,
but are not what one would call good expositions of ideas.
Given that it's such a
compelling subject, both scientifically and philosophically, some of the best
science fiction films I've seen have been about artificial intelligence (AI),
and the relationship between humankind and machines. The first two Terminator
films and Spike Jonze's Her are prime examples, as is Ex Machina, which I saw
very recently and thoroughly enjoyed.
Ex Machina tells the story
of a young programmer who is selected to lodge in the home of AI designer
Nathan and to evaluate the human qualities of one
of Nathan's androids, to see whether she passes the Turing test (that is, exhibits
intelligent behaviour equivalent to, or indistinguishable from, that of a
human). For fear of spoiling the movie, I won't say any more about the plot - but if you haven't seen it, I thoroughly recommend you do.
There is, though, one big
question that I think the movie gets wrong. In one scene Nathan tells us how he
sees human history as one tiny passage of time in a long and complex
evolution of mind, informing us that, in his view, artificial intelligence is
the future intelligence that's going to live far beyond the human intelligence
that created it:
"One day the AIs are going to look back on us the
same way we look at fossil skeletons on the plains of Africa.
An upright ape living in dust with crude language and tools, all set for
extinction."
I doubt this very much.
This view seems to be a projected future based on the reality of the past. It's
easy to see why. Think back to a few hundred thousand years ago, and consider those
primeval grunts from our ancestors as they began to make sense of their
surroundings. They never could have imagined that those inceptive primate
sounds would one day evolve into the entire world of languages, literature,
poetry, philosophy, science and technology that we have today. And just as
Lord Of The Rings, the Manhattan skyline, space travel and the Large Hadron Collider
would have been far beyond the imaginative reach of our primeval
ancestors, so too is much of humanity's future evolution beyond us today.
But whatever form the
evolution of mind takes, it won't be 'us' as thinking beings who are all set
for extinction, because the most important thing that survives is the cognita
(and its representation) on which any future technology is based. The human
mind is the most sophisticated aggregation of matter in the entire universe
(that we know of), and as such, it seems to me that machines we create cannot
be any more intellectually sophisticated than the minds that create them (just
as machines cannot
be more wicked than the wickedness of the minds that create them). That's obviously not to deny that we can create machines more
sophisticated than us at particular tasks, and computers that execute those
tasks far faster than we ever could, because clearly we can.
We are picking up pace in
this modern age, reaching new heights and new depths in ever shorter passages of
time. My prediction is that we'll one day evolve into creatures of pure thought
that require no monetary currency, no food or drink, probably not even a heart,
lungs, hands or feet. Our cognition will probably be sustained by software far
beyond our present imagination. But I very much doubt the AI we create will
ever assume dominion of its own to the extent that it becomes the supreme
species at the expense of the human mind, consigning it to extinction.
I believe that humans will
always be implicitly involved in the evolution of our own cognita, enabling
us to retain a co-operative relationship with any machines we create, however
advanced, and however powerful. We are not going to create anything that, by
comparison, turns us into the proverbial "upright
apes living in dust with crude language and tools, all set for extinction",
because all future innovation and
advancement is itself going to be part of the evolutionary process of the human
mind, not distinct from it in a way that we'd allow to threaten the existence
of the minds that engendered it.