I watched Terminator 3: Rise of the Machines on TV last night. Fox's new series, 'Terminator: The Sarah Connor Chronicles,' debuts next weekend, so expect a lot of Terminator promotion in the next seven days. Compared to the unexpected cult classic of the original Terminator, and the campy fun of its sequel, T2, the third film in the series is difficult to classify. It was hardly received with critical acclaim. No, the film was largely panned, as was its hero, who was about to embark upon a real-life mission to terminate California's then-governor, Gray Davis.
And yet for me, T3 is by far the most disturbing film of the bunch. The ending took me completely by surprise. After I saw it the first time, I went into a deep funk, muttering for days about humans, robots, and Armageddon. Without being specific, I'll reveal that T3's ending is very, very bleak. It's also (time travel aside) very, very plausible.
One line in the film bothered me the most. The good terminator tells John Connor, "Judgment Day is inevitable." Why, I asked? This flew in the face of one of the film series' primary themes—'There is no Fate but what we Make.' So why would Armageddon suddenly be inevitable? Where did that come from?
I believe I've found an answer, and it comes in an unusual book called Deep Survival: Who Lives, Who Dies, and Why by Laurence Gonzales.
Deep Survival looks at wilderness accidents not only from the usual human-factors perspective, but also through cutting-edge hard science—specifically, Chaos and Complexity Theory.
If we define a special kind of system that is both relatively complicated (many parts) and tightly coupled (that is, any given part of the system is capable of affecting the whole), Complexity Theory tells us that large-scale accidents are not in fact anomalies, but are actually normal.
Let's try an example: a large party of climbers all roped together. The system is complex. It is also tightly coupled. Complexity Theory expects that, over time, this system will self-organize—and 'organize,' in this context, means creating a catastrophic accident. One climber falls. He pulls down all the others.
This is not an argument in favor of never roping up. But it is a sobering statement on the bargain climbers make when they do rope up. For the security of a belay, we are trading a high-probability accident in which one person falls for a lower-probability accident in which many people fall. And Complexity Theory tells us that, over time, a large-party fall is inevitable.
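The trade-off can be made concrete with a toy Monte Carlo sketch. This is my illustration, not anything from Gonzales's book, and the probabilities and the `simulate` function are invented for the example: unroped, every slip costs one climber; roped, a slip is usually caught by the party, but a rare failed catch drags everyone down at once.

```python
import random

def simulate(trips, n_climbers, p_slip, p_belay_fails):
    """Compare total falls over many trips.

    Unroped: each slip claims exactly one climber.
    Roped: each slip is either held by the party, or (with
    probability p_belay_fails) pulls the entire party down.
    """
    unroped_falls = 0
    roped_falls = 0
    mass_falls = 0
    for _ in range(trips):
        # count how many climbers slip on this trip
        slips = sum(random.random() < p_slip for _ in range(n_climbers))
        unroped_falls += slips
        # roped: one failed catch among the slips dooms the whole party
        if any(random.random() < p_belay_fails for _ in range(slips)):
            roped_falls += n_climbers
            mass_falls += 1
    return unroped_falls, roped_falls, mass_falls

random.seed(42)
print(simulate(10_000, 6, 0.01, 0.05))
```

Run it and the pattern matches the argument above: roping up produces far fewer total falls, but the falls it does produce are rare, whole-party catastrophes rather than frequent single ones.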
Complexity Theory thus offers a warning with a rather broad application: beware added layers of complexity in life, especially those designed to prevent accidents.
A common subject of talk on the Whitney Portal Message Board is how to make climbing Mt. Whitney safer for the large numbers of inexperienced hikers who make the attempt each year. If we accept the theory's contention that accidents are normal, we see that any action we take to reduce accidents (e.g., requiring tracking and communication devices, safety gear, etc.) may come at the cost of enabling large-scale catastrophes in the future. To put it another way, as we become more dependent on equipment and technology, we become more vulnerable to a self-organizing event.
And so, full circle, it's time to come back to Judgment Day. In the Terminator films, the immense information grid—internet, military and civilian systems, communications—becomes a self-aware Artificial Intelligence called SkyNet. Shortly after becoming self-aware (self-organizing), the SkyNet system correctly perceives that humans are a threat to its existence. In control of all military computers, SkyNet launches a massive nuclear attack against humanity.
You couldn't put together a better example of what Complexity Theory calls a normal accident. Inevitable.