Computer Crash Could Come In 2000 Unless Changes Made, Congress Told
WASHINGTON (AP) _ It sounds like the plot from a high-tech thriller: Governments and businesses around the world dawdle while the clock ticks toward a global computer meltdown set to begin exactly at the turn of the century.
To keep this the stuff of fiction, a House panel was told Tuesday, federal agencies and the leaders of America’s financial sector must continue working to correct data-reading problems in outdated computer programs that will reach a crisis point when the calendar reaches 2000.
The problem is that when the forerunners of today’s massive computer programs were first designed, storage space was at a premium. To save memory on the old-fashioned mainframes, code writers simply omitted the first two digits of the year _ meaning that 1998, for example, would be stored as 98, 1999 as 99, and so on.
The year 2000 would be read as 00. Since the systems are coded to assume that all years begin with 19--, computers will interpret 00 to mean 1900, if modifications are not made.
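The arithmetic failure described above can be sketched in a few lines. This is an illustrative reconstruction, not code from any system named in the story; the function names are invented for the example.

```python
def two_digit_year(year):
    """Store a year the way the old mainframe programs did:
    keep only the last two digits."""
    return year % 100

def years_elapsed(start, end):
    """Naive interval math on two-digit years, as an
    unmodified program would compute it."""
    return two_digit_year(end) - two_digit_year(start)

# 1998 to 1999 works as expected: 99 - 98 = 1.
print(years_elapsed(1998, 1999))   # 1

# But 1999 to 2000 becomes 00 - 99, so the program concludes
# that 99 years have run backward.
print(years_elapsed(1999, 2000))   # -99
```

Any calculation built on such a difference _ an age, an interest period, a schedule _ inherits the negative result.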
``Unless an effective response is soon initiated, on the first day of the year 2000, private industry and government computers could malfunction,″ said Rep. Steve Horn, R-Calif., whose Government Management, Information and Technology subcommittee held a hearing on the issue. ``There’s much work needed to be done. We’ve got an immovable deadline.″
The potential ramifications of the problem are huge, and could affect everything from the nation’s computerized missile defense system to automated checking-account deductions used by thousands of Americans to pay bills or make mutual-fund or retirement account investments.
``There’s no magic pill...unless you can legislate that the year 2000 be pushed back,″ Louis Marcoccia, director of data systems for the New York Transit Authority, told the panel.
``Anyone who offers to quickly and cheaply fix the problem is offering us a silver bullet that does not exist, and is doing us no favors,″ added George Munoz, the Treasury Department’s chief financial officer.
And while there are no easy answers, the government and the nation’s financial industry are already taking steps to prevent a catastrophic systems crash.
``The solution to the problem is obvious, but labor-intensive,″ said Dean Mesterharm, a systems manager for the Social Security Administration.
There is no automated way for a big company _ or a huge federal agency _ to correct the situation.
``Each line (of computer code) must be examined individually to see if a change is needed,″ explained Mesterharm, who added that the equivalent of 300 work years will be needed at his agency before changes are finalized.
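One common category of line-by-line change _ not described by Mesterharm, but widely used at the time _ was ``windowing″: rather than widening every stored date to four digits, a pivot year decides which century a two-digit year belongs to. The sketch below is hypothetical, with an assumed pivot of 30.

```python
# Assumption for illustration: two-digit years below the pivot
# are read as 20xx, all others as 19xx.
PIVOT = 30

def expand_year(yy):
    """Interpret a stored two-digit year using a fixed pivot window."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

print(expand_year(99))   # 1999
print(expand_year(0))    # 2000
```

The tradeoff is that the fix is temporary: once real years pass the pivot, the ambiguity returns, which is one reason such changes had to be weighed line by line.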
Mesterharm said the Social Security Administration, whose computer programs include 30 million lines of code, will have its new system in place by the end of 1998. That will allow for a full year of testing before the end of the century, he said.
The Internal Revenue Service is on the same schedule, Munoz added.
Cost estimates for the project vary widely, but those who testified Tuesday said the government tab could run to $30 billion _ money Horn insisted should come from savings found in agency budgets.
Michael B. Tiernan, an official of the Securities Industry Association, said that Wall Street firms alone are likely to spend $3 billion fixing the problem.
Marcoccia said that other nations, particularly in Europe, have lagged behind the U.S., but are now beginning to deal with the problem as well.
Computer expert Kevin Schick predicted that up to 30 percent of the U.S. government’s computers would not be ready for the turn of the century. Munoz and others said part of the remediation effort calls for installing ``firewalls″ to prevent faulty information from penetrating newly improved databases.
One cause of the problem was that no computer expert working in the 1960s expected that the systems designed then would still be in use today, augmented by technical breakthroughs like the improved storage capacity of today’s microchips.
``We are the victims of technology,″ said Emmet Paige Jr., a computer systems expert at the Defense Department. ``This wasn’t a stupid mistake.″