Common sense and a little skepticism may help protect us from an overreliance on complex analytics.
The volcanic eruption in Iceland last week has focused attention once again on how unprepared we are to manage technology—as well as the world around us. First we fail to predict a volcano that is pushing up from the ground like a teenager's pimple, and then we cancel more than 100,000 airline flights based on what is now claimed to be a flawed computer simulation. What's next?
The decade-long debate over climate change makes me a big believer in IBM's concept of a "Smarter Planet": can't we just figure out what's going wrong and fix it? Apparently not. Living on planet Earth is clearly akin to driving a 4.5-billion-year-old car in a country where all the mechanics are under the age of 15 yet proud to exhibit their recently issued certificates from local technical colleges. And how many countries now have nuclear weapons? I was much impressed by the Iranian cleric this week who blamed earthquakes on promiscuous women. Maybe they also caused Iceland's volcanic eruption!
It's a bit ironic that the source of the flying ash is Iceland, the country whose failing banks sounded the starting bell of the worldwide recession in 2008. Excuse the stretch from volcano to economic crisis, but the same underlying question runs through both: who is in charge here?
It's true we can't control everything, but we should be able to control our reactions rationally. Grounding airplanes throughout Europe seemed like a good idea at first, but the data in the computer model was apparently flawed, so there's a question of whether the planes were ever in danger in the first place. Oh well, what's another $1.7 billion loss to an industry that was already more than $9 billion in the red last year? Who wrote that volcanic ash program, anyway? Ultimately, European airlines told their collective governments to take a hike: we're flying whether you like it or not! The American system puts more responsibility for safety at the airline level, though the FAA can still ground planes and close airports.
I can't decide whether we are relying too much on computers or not enough. The recent economic downturn was certainly exacerbated by computer models that decision-makers didn't understand. The models spit out numbers that showed acceptable risk levels when in fact things were going haywire. Given the flawed assumption that housing prices would rise indefinitely, of course the programs produced the wrong results. They started with the wrong assumption.
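The garbage-in, garbage-out point can be made concrete in a few lines of code. This is a toy, purely hypothetical sketch—not any bank's actual model—in which the rosy assumption about housing prices is baked into a default parameter, so the program dutifully reports that a shaky loan is fine:

```python
# A toy, purely hypothetical risk check -- not any bank's actual model.
# The flawed assumption (collateral prices only ever rise) is baked in
# as a default, so the program dutifully reports "acceptable risk."

def projected_collateral_value(price, years, annual_growth=0.05):
    """Project a home's value, assuming prices grow by annual_growth per year."""
    return price * (1 + annual_growth) ** years

def loan_looks_safe(loan_balance, collateral_price, years=5):
    """The loan 'passes' if the projected collateral covers the balance."""
    return projected_collateral_value(collateral_price, years) >= loan_balance

# With the rosy default, a loan already larger than its collateral passes:
print(loan_looks_safe(loan_balance=240_000, collateral_price=200_000))  # True

# Correct the assumption to a 10% annual decline, and the same loan fails:
print(projected_collateral_value(200_000, 5, annual_growth=-0.10) >= 240_000)  # False
```

Nothing in the arithmetic is wrong; only the assumption is. That's exactly why the output looked so trustworthy.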
The result, however, was that the computers gave top management reassurance that everything was fine when, in fact, it wasn't. Some have said that IT played a role in the creation of the recession because we believed in our own calculations and we gave assurances to management that everything was safe. There is no doubt the programs we created magnified the volume and speed of the complex hedge systems that were obviously flawed. Without computers, the recession would not have happened as fast or been as severe.
What are the lessons to be learned from the Eyjafjallajökull volcano eruption, the Icelandic bank failures, and the worldwide housing-inspired recession? The lessons are 1) you can't trust a computer model without empirical testing, and 2) someone must be responsible for every decision—and it can't be a computer. The computer is a tool that a person uses to help make a decision; it's not a decision-maker. If the data is bad, it's the decision-maker's responsibility to recognize it and get the correct data.
A few people might disagree with me on that, however. Albert Einstein, for instance, once said, "If the facts don't fit the theory, change the facts." But Albert Einstein didn't have to endure the current recession. I personally prefer a line from Donald R. Gannon: "Get the facts, or the facts will get you. And when you get them, get them right—or they will get you wrong." Management has to be disciplined when it uses sophisticated computer models; no single model is universally applicable, and none is always going to be right.
There is a big push within IBM today to sell business analytics. The assumption is that if you can collect, sort, and analyze data, then you will make better decisions. Generally, I suppose that is true. But as the stock market ads so clearly state: "Past performance is no guarantee of future results." Analytics has its place in decision-making, but just because housing prices have gone up for the past several years does not ensure they will continue to rise. Other factors come into play, and it seems fair to say that it should be the job of IT to point out to management the potential shortcomings of any predictive analysis.
Many banks today are reducing the scope of predictive analytics when it comes to things like derivative contracts. When it comes to predicting the effect of volcanic ash on aircraft, maybe we're still putting too much faith in an untested computer model. We certainly can make use of more and better information to build a Smarter Planet, but let's keep a tight rein on the decisions that evolve after that information becomes available. We can't afford any more runaway financial derivative bubbles or software mistakes that hobble an entire industry. Let's get the facts so the facts don't get us.
Any Iranian clerics in the audience who associate earthquakes with promiscuity, please raise your hands to indicate that you understand.
MC Press Online