NY Times : The Legacy of a Debacle That Wasn't (with more Studies coming)


NY Times : The Legacy of a Debacle That Wasn't

MORE EVIDENCE THAT SOMEONE IS "STUDYING Y2K":

In the long run, the biggest impact may be on society's capacity to respond to complex management challenges, said Mark Haselkorn, an engineering professor at the University of Washington who is working on a major Year 2000 study sponsored by the National Research Council. But teasing out the lessons has been a challenge. Although he originally hoped to have the study published in September, he is now aiming for the spring.

"It's much more difficult to write than if everything had fallen apart," Mr. Haselkorn said.

AND a little note on the "push" for IV & V from Data Dimensions. And more proof that ***AS EACH ONE OF THE **MYTHS** FAILED TO APPEAR.....THE "CONSULTANTS" STARTED NEW ONES***: from "not enough time, start today" to "not enough programmers, hire today" to "not enough PROOF, start testing today".........


The company eventually cut its consultant rolls to 500 from 900 and switched to quality assurance and information testing work, fields closer to its Year 2000 experience. It escaped the recent carnage among those that stayed in e-business consulting, but its stock has tumbled to less than $1 a share and it has yet to complete the tough climb back to profitability. "2000 was a very painful year," Mr. Allen said.

-------------------------------------------

The Legacy of a Debacle That Wasn't

The Y2K phenomenon risked leaving computers and other electronic equipment unable to function properly. A year later, information technology managers are still finding dividends from their huge investment.

LINK TO TIMES

http://www.nytimes.com/2001/01/01/technology/01COMP.html

The Legacy of a Debacle That Wasn't

By BARNABY J. FEDER

City agencies in Portland, Ore., have consolidated their welter of e-mail systems. New Jersey's biggest utility has established a full-time quality assurance department for its software. And enterprises from Wall Street to Texas are keeping emergency teams and systems together to tackle new computer challenges or ward off future disasters.

Those are a few legacies of the operation mounted worldwide to prepare computer systems for 2000 — an effort that cost an estimated $250 billion or more in repairs and contingency planning, all necessitated by a longstanding and seemingly trivial shortcut in computer programming.

The Y2K phenomenon, as it came to be known, risked leaving computers and other electronic equipment unable to function properly. But the crucial transition passed with only scattered malfunctions, and a year later information technology managers are still finding dividends from their huge investment.
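For context on the "seemingly trivial shortcut" the article refers to, here is a minimal sketch of how two-digit year storage breaks across the century boundary. It is written in Python for readability rather than the COBOL of most affected systems, and the function names are hypothetical, chosen only for illustration.

# A naive two-digit-year calculation, as many legacy programs did it.
def age_of_account(opened_yy, current_yy):
    # Years are stored as two digits, so 2000 arrives as 0.
    return current_yy - opened_yy

# An account opened in 1997, checked in 1999: works as expected.
print(age_of_account(97, 99))   # 2

# The same account checked in 2000 ("00"): the result goes negative.
print(age_of_account(97, 0))    # -97, not 3

# The remediated form stores and compares four-digit years.
def age_of_account_fixed(opened_yyyy, current_yyyy):
    return current_yyyy - opened_yyyy

print(age_of_account_fixed(1997, 2000))  # 3

Multiplied across date comparisons, sort orders and expiration checks in millions of programs, that is the class of failure the $250 billion repair effort was meant to head off.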

The effort endowed the managers with updated information systems, software development tools and disaster plans, not to mention their most complete inventory of computer systems and how they were being used. It also helped big enterprises identify managers capable of handling complex projects reaching into every part of their operations. Business leaders, regulators, lawyers, insurers and the public all became more technically literate.

"The big benefits are soft, like changes in attitudes and procedures for information technology," said Terry Milholland, chief information and technology officer for Electronic Data Systems, which designs and runs computer systems for other companies.

Hanging on to such benefits takes work. Electronic Data, for example, is maintaining disaster recovery plans and networks set up to manage the Year 2000 threats that never materialized. The company has applied for a patent on the processes its global network of eight regional centers used to monitor disruptions and communicate with one another and the company's Millennium Management Center in the basement of its headquarters in Plano, Tex., Mr. Milholland said.

In retrospect, many say, the Year 2000 challenge showed the value of the Internet in coordinating work that was huge in scope and often involved disparate groups. It also broke down barriers between information specialists and business managers within companies and among companies that normally compete. In some cases, it spawned ad hoc groups spanning industries, suppliers, customers and regulators.

One of the most successful groups was formed by the Securities Industry Association in New York to help stock exchanges, brokerage firms and other financial institutions cooperate. After the successful transition, the structures and methods used on the Year 2000 problem were quickly adapted to what many on Wall Street view as an even bigger challenge: the move to settle trades within one day instead of three.

Like the transition to pricing stocks in decimals, now set for this spring, the settlement project was delayed by Wall Street's need to focus on Year 2000 issues, according to industry executives. But that experience has provided a number of useful precedents for planning, testing, regulatory issues, publicity and use of the Internet for coordinating the work.

"This is a different mountain to climb, but the best practices are the same," said Arthur L. Thomas, the Merrill Lynch executive who was chairman of the industry steering committee overseeing the Year 2000 effort and now has a comparable role in the next-day-settlement project.

Mr. Thomas noted that Year 2000 concerns were more pervasive: dates were buried in every computerized aspect of the business, and the readiness of public utilities or other support services also had to be scrutinized. But the new trade settlement process, scheduled for 2004, actually requires more difficult changes to the industry's computer systems, he said.

Richard Hofland, who headed the city of Portland's Year 2000 project, has drawn on another Year 2000 lesson: that it is much easier to find and clean up computer problems when everyone is using the same products. "We are doing a lot better job now of getting city departments to share software applications development work, databases and development tools," he said. All city agencies except the police are on a single e-mail system now, down from six before the Year 2000 program began.

In addition, Mr. Hofland said, emergency services groups like the police and fire department now have a much better picture of the role computer systems play in city services. These days, Mr. Hofland is routinely involved in disaster recovery exercises.

At Public Service Electric and Gas, New Jersey's largest power utility, a permanent quality assurance department is being set up to test all new software, according to Vincent Scatuccio, its information technology manager. That move is based on experience with a team formed to perform such tests on the Year 2000 work of programmers, who previously had done the testing themselves.

The Year 2000 legacy has some special twists for businesses that thrived or even depended on the preparation effort. Law firms that geared up for Year 2000 advice and a projected wave of damage lawsuits have used the expertise they gained to expand their work in electronic commerce, technology patent litigation and other areas once viewed as exotic. But for others, it has been a hard adjustment.

"Y2K made our company," said Peter A. Allen, chairman, president and chief executive of Data Dimensions, a consulting firm in Bellevue, Wash., whose stock surpassed $40 a share in 1997 as awareness of the potential computer problems spread. Mr. Allen tried to switch the firm's focus to e-business consulting as Year 2000 work wound down in 1999, but dot-com companies like Scient and Viant had already moved into the field armed with deep bank accounts courtesy of Wall Street's infatuation with electronic commerce. Data Dimensions was hit with staggering losses trying to catch up.

The company eventually cut its consultant rolls to 500 from 900 and switched to quality assurance and information testing work, fields closer to its Year 2000 experience. It escaped the recent carnage among those that stayed in e-business consulting, but its stock has tumbled to less than $1 a share and it has yet to complete the tough climb back to profitability. "2000 was a very painful year," Mr. Allen said.

But as different as the world looks a year later for firms like Mr. Allen's, and whatever lessons the Year 2000 episode has offered, at least one thing has not changed much: human nature.

Much of the work and expense of the repair effort might have been avoided if the problem had not been neglected for so long. And many computer experts now suspect that the uneventful transition lulled most computer users back into many of the lax practices that contributed to the Year 2000 problems.

"Ninety percent of the lessons learned have already been forgotten," said Howard Rubin, an information technology consultant based in Pound Ridge, N.Y. The first casualties have been tasks like carefully documenting changes in software and maintaining up-to-date inventories of equipment and software, jobs he likened to a homeowner's monthly balancing of the checkbook.

"There's a big crush on to produce new systems and they're back to leaving the deposit slips all over the desk," Mr. Rubin said.

In the long run, the biggest impact may be on society's capacity to respond to complex management challenges, said Mark Haselkorn, an engineering professor at the University of Washington who is working on a major Year 2000 study sponsored by the National Research Council. But teasing out the lessons has been a challenge. Although he originally hoped to have the study published in September, he is now aiming for the spring.

"It's much more difficult to write than if everything had fallen apart," Mr. Haselkorn said.



-- Anonymous, January 01, 2001

