"What Programmers Don't Know"

greenspun.com : LUSENET : TimeBomb 2000 (Y2000) : One Thread

The following is a glimpse into the nature of Y2K information systems problems. Having been a programmer in 3rd and 4th generation languages, I recall writing lots of date handling logic -- in almost every program I wrote (perhaps because much of my career has been focused on financial systems). The programmer below had a somewhat different experience, and hadn't encountered as much need for date handling during his career.

Many programmers honestly don't understand why Y2K is such a major problem, mainly because they haven't encountered, and therefore haven't had to debug, many date handling problems.

Problems similar to the one described below will occur countless times worldwide, in every company, in every industry, and in every government agency -- regardless of technical environment, throughout the coming year (perhaps longer). A programmer can only debug one at a time. And, as we shall see together on a grand (albeit obscure) scale, tired and pressured programmers make mistakes. God knows I made a lot during my programming years. Perhaps you can relate.....

(link follows article)

What Programmers Don't Know
All Rights Reserved © NewsMax.com
David M. Allen
February 26, 1999

I've been programming real-time applications for almost twenty years now, so I guess that makes me part of the problem. I honestly don't remember doing any date calculations involving years until my most recent project. I wrote a tool to translate our hardware's system time into the common "mm/dd/yy hh:mm:ss" format using the standard Unix system calls available on our system workstations.

Well, imagine my surprise when I converted a time-stamp that extended into the year 2000 and got "01/01/:0" as the date! The month and day were always correct, but the first digit of the two-digit year was a colon instead of a zero. The year 2001 was reported as ":1", and so on. When I tried 2010, I got "01/01/;0". The colon was changed to a semicolon. At 2020, the semicolon became a "<". (Just imagine trying to do math on 01/01/:0.) It turned out that the operating system call was following the ASCII codes: after "9" comes ":", ";", etc. I made provision for this quirk in my software and began work on the corresponding tool to convert from the common time format back to our hardware's version.

That's when I discovered that we had "missed a spot." In one of our main time conversion utilities, a logic error would produce incorrect times at the calendar roll-over to 2000. It was a simple thing, really: the two-digit year was used to determine the number of seconds since 1970, by subtracting 70 from the year value. For 1999, 99 - 70 = 29 years. For 2000, 00 - 70 = -70 years, not the 30 the programmer was expecting.

This software had already passed the formal system tests and had been deployed; and of course, it worked perfectly. But when I ran a simulation into the year 2000 (not using the formal test procedures), all of the time-stamps were bogus. The self-test software detected the bad time-stamp the first time it logged some status, and an error message was generated. When the error message was logged, it was stamped with another bogus time, which the self-test software detected, generating another error message, which was logged with a bogus time-stamp.... The system crashed within a few seconds, and would not accept any commands nor report any status, other than the time-stamp errors. The fix was easy, but the hardware was off-line for hours during the upgrade. I'm just glad this particular control and status hardware box wasn't controlling and statusing a nuclear power plant.

The point is, this software was designed to operate smoothly through the calendar roll-over. Everyone thought that it had been tested to do just that. The programmer who produced the bad code had left the project months before the formal system test was run, successfully! He didn't know he'd made this mistake. Nobody knew. And I found it only by accident.

Mr. Anonymous says the power grid isn't date sensitive, but the equipment that controls and monitors the generating plants most certainly is. Maybe the computers will crash and leave the power generators running. That would be convenient, but Mr. Murphy wouldn't bet on it.

David M. Allen http://216.46.238.34/articles/?a=1999/2/26/91811
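The two failure modes Mr. Allen describes are easy to reproduce. Here is a minimal C sketch of the kind of arithmetic involved -- the names are illustrative, not from his actual program:

#include <stdio.h>
#include <time.h>

/* Bug 1: printing the two-digit year one ASCII digit at a time.
   struct tm counts years since 1900, so 2000 arrives as 100.
   100 / 10 == 10, and '0' + 10 is ':' in ASCII -- hence "01/01/:0". */
static void format_date(const struct tm *t, char *buf)
{
    int yy = t->tm_year;                    /* years since 1900: 99, 100, 110 ... */
    buf[0] = '0' + (t->tm_mon + 1) / 10;
    buf[1] = '0' + (t->tm_mon + 1) % 10;
    buf[2] = '/';
    buf[3] = '0' + t->tm_mday / 10;
    buf[4] = '0' + t->tm_mday % 10;
    buf[5] = '/';
    buf[6] = '0' + yy / 10;                 /* 2000 -> ':', 2010 -> ';', 2020 -> '<' */
    buf[7] = '0' + yy % 10;
    buf[8] = '\0';
}

/* Bug 2: seconds since 1970 computed from a two-digit year.
   99 - 70 = 29 years; 00 - 70 = -70 years, not the +30 intended.
   (Leap days ignored; this is only a sketch.) */
static long long seconds_since_1970(int two_digit_year)
{
    return (long long)(two_digit_year - 70) * 365LL * 24 * 60 * 60;
}

int main(void)
{
    char buf[16];
    struct tm t = {0};
    t.tm_year = 100;                        /* the year 2000 */
    t.tm_mon  = 0;
    t.tm_mday = 1;
    format_date(&t, buf);
    printf("%s\n", buf);                              /* prints 01/01/:0 */
    printf("%lld\n", seconds_since_1970(0));          /* a large negative number */
    return 0;
}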

-- TA (sea_spur@yahoo.com), December 03, 1999.

Answers

Mr. Allen,

This explains an article from some months ago in the Washington Post, in which a reporter interviewed the developer of ASCII at his home in the country, where he had recently moved. He took the time to show the reporter his Swedish-made water filter and said they would use it to drain the pond out back if necessary.

Still don't understand why no one reacted to this article!

-- d----- (dciinc@aol.com), December 03, 1999.


---d--

Do you know where I can find the ASCII article you mentioned? Somehow I missed it.

Thanks.

Terry

-- TA (sea_spur@yahoo.com), December 03, 1999.


A perfect, real example of how a product can be designed, used, fixed, tested, and still fail. The 85/15 rule in action (on average you will only find 85% of the errors; the other 15% only show up in real-world usage).

The reason no one responds to articles like this is that almost anyone who read it who was not technical would have stopped reading after the second technical detail... no ability/skill to deal with technical issues. Stuff like this is WAY too complicated for folks who have no real technical bent.

Our technology is beyond the comprehension of 99% of the population it supports.

-- ..- (dit@dot.dash), December 03, 1999.


-d-

Category: Programmers'_Views
Date: 1999-07-19 10:36:58
Subject: Bob Bemer, Inventor of ASCII, Saw Y2K Coming in 1979. Now He Has Bugged Out.
Link: http://www.washingtonpost.com/wp-srv/WPlate/1999-07/18/...
Comment: This is a well-written, lively summary of y2k. It is written in a lighthearted style. I don't think the author is really all that bubbly over y2k. One thing is sure: Bob Bemer isn't.

The introduction leads into an article on Bemer, inventor of ASCII, early programmer, and developer of a solution to y2k that was heralded by the press two years ago as a reason not to worry about y2k, but which is no longer mentioned.

Bemer has moved from the city. He is buying survival gear such as water purifiers. He is getting ready.

The public, of course, is not fazed by y2k. There's no problem. "They" have fixed it, or will fix it.

This is the guy who might have fixed it, if anyone had listened in 1979.

This is the longest y2k article I remember seeing in any mainstream newspaper. But it's not just any newspaper. This is from the WASHINGTON POST (July 18).

I strongly suggest that you click through and read the whole article. Print it out. Mail it to a friend.

But don't let ZD Net's Mitch Ratcliffe hear about it. It might upset him.

After you have read it, look at your environment. Where do you live? How much better informed than Bob Bemer are you?

Time really is running out, and not just for Mitch Ratcliffe.

* * * * * * * * *

We are knocking at the door of a high-rise apartment in Baileys Crossroads, with a question so awful we are afraid to ask it. We do not wish to cause a heart attack.

A woman invites us in and summons her husband, who shuffles in from another room. She is 78. He is 82. They met in the 1960s as middle-management civil servants, specialists in an aspect of data processing so technical, so nebbishy, that many computer professionals disdain it. He was her boss. Management interface blossomed into romance. Their marriage spans three decades. They are still in love.

"You know how we use Social Security numbers alone to identify everyone?" she says. She points proudly to her husband. "That all started with this kid!"

The kid has ice cube spectacles and neck wattles. He has been retired for years. Some of his former colleagues guessed he was deceased. His phone is unlisted. We located him through a mumbled tip from a man in a nursing home, followed up with an elaborate national computer search. Computers--they're magic. . . .

Here is what we have to ask him: Are you the man who is responsible for the greatest technological disaster in the history of mankind? Did you cause a trillion-dollar mistake that some believe will end life as we know it six months from now, throwing the global economy into a tailspin, disrupting essential services, shutting down factories, darkening vast areas of rural America, closing banks, inciting civic unrest, rotting the meat in a million freezers, pulling the plug on life-sustaining medical equipment, blinding missile defense systems, leaving ships adrift on the high seas, snarling air traffic, causing passenger planes to plummet from the skies?

Obligingly, he awaits the question. . . .

Technology has been the propulsive force behind civilization, but from time to time technology has loudly misfired. In the name of progress, there have been profound blunders: Filling zeppelins with hydrogen. Treating morning sickness with Thalidomide. Constructing aqueducts with lead pipes, poisoning half the population of ancient Rome. Still, there is nothing that quite compares with the so-called "Millennium Bug." It is potentially planetary in scope. It is potentially catastrophic in consequence. And it is, at its heart, stunningly stupid. It is not like losing a kingdom for want of a nail; it is like losing a kingdom because some idiot made the nails out of marshmallows. . . .

Never has a calamity been so predictable, and so inevitable, tied to a deadline that can be neither appealed nor postponed. Diplomacy is fruitless. Nuclear deterrence isn't a factor. This can't be filibustered into the next Congress. . . .

The search for a culprit is an honored American tradition. It nourishes both law and journalism. When things go bad, we demand a fall guy. A scapegoat. A patsy.

Today we'll search for one, and find him. . . .

The Y2K problem wasn't just foreseeable, it was foreseen.

Writing in February 1979 in an industry magazine called Interface Age, computer industry executive Robert Bemer warned that unless programmers stopped dropping the first two digits of the year, programs "may fail from ambiguity in the year 2000."

This is geekspeak for the Y2K problem.

Five years later, the husband-wife team of Jerome T. and Marilyn J. Murray wrote it much more plainly. In a book called "Computers in Crisis: How to Avoid the Coming Worldwide Computer Systems Collapse," they predicted Y2K with chilling specificity.

Few people read it. The year was 1984, and to many, the book seemed very 1984-ish: a paranoid Orwellian scenario. ComputerWorld magazine reviewed it thus:

"The book overdramatizes the date-digit problem. . . . Much of the book can be overlooked."

How could we have been so blind?

Basically, we blinded ourselves, like Oedipus. It seemed like a good idea at the time. . . .

Why didn't people realize earlier the magnitude of the problem they were creating?

And when they did realize it, why was the problem so hard to solve? . . . .

We sought the answer from the first man to ask the question.

Robert Bemer, the original Y2K whistleblower, lives in a spectacular home on a cliff overlooking a lake two hours west of a major American city. We are not being specific because Bemer has made this a condition of the interview. We can say the car ride to his town is unrelievedly horizontal. The retail stores most in evidence are fireworks stands and taxidermists.

In his driveway, Bemer's car carries the vanity tag "ASCII." He is the man who wrote the American Standard Code for Information Interchange, the language through which different computer systems talk to each other. He also popularized the use of the backslash, and invented the "escape" sequence in programming. You can thank him, or blaspheme him, for the ESC key.

In the weenieworld of data processing, he is a minor deity.

We had guessed Bemer would be reassuring about the Y2K problem.

Our first question is why the heck he recently moved from a big city all the way out to East Bumbleflop, U.S.A.

It's a good place to be next New Year's Eve, he says. From a kitchen drawer he extracts two glass cylinders about the size of the pneumatic-tube capsules at a drive-through teller. Each is filled with what appears to be straw.

"They're Danish," he says. "They cost $500. We ran water with cow [poop] through them and they passed with flying colors."

They're filters, to purify water. If Y2K is as bad as he fears, he says, cocking a thumb toward his backyard, "we can drain the lake."

Bemer is 79. He looks flinty, like an aging Richard Boone still playing Paladin.

He has started a company, Bigisoft, that sells businesses a software fix for the Y2K problem. So, for selfish reasons, he doesn't mind if there is widespread concern over Y2K, though he swears he really thinks it is going to be bad. That's why he has requested that we not mention the town in which he lives. He doesn't want nutballs descending on him in the hellish chaos of Jan. 1, somehow blaming him.

Who, then, is to blame?

Bemer rocks back in his chair and offers a commodious smile.

In one sense, he says, he is.

Binary Colors

In the late 1950s, Bemer helped write COBOL, the Esperanto of computer languages. It was designed to combine and universalize the various dialects of programming. It also was designed to open up the exploding field to the average person, allowing people who weren't mathematicians or engineers to communicate with machines and tell them what to do. COBOL's commands were in plain English. You could instruct a computer to MOVE, ADD, SEARCH or MULTIPLY, just like that.

It was a needed step, but it opened the field of programming, Bemer says, to "any jerk."

"I thought it would open up a tremendous source of energy," he says. "It did. But what we got was arson."

There was no licensing agency for programmers. No apprenticeship system. "Even in medieval times," Bemer notes dryly, "there were guilds." When he was an executive at IBM, he said, he sometimes hired people based on whether they could play chess.

There was nothing in COBOL requiring or even encouraging a two-digit year. It was up to the programmers. If they had been better trained, Bemer says, they might have known it was unwise. He knew.

He blames the programmers, but he blames their bosses more, for caving in to shortsighted client demands for cost-saving.

"What can I say?" he laughs. "We're a lousy profession." . . . .

The longer a program is used, the larger the database and supporting material that grow around it. If, say, a program records and cross-references the personnel records in the military, and if the program itself abbreviates years with two digits, then all stored data, all files, all paper questionnaires that servicemen fill out, will have two-digit years. The cost of changing this system goes way beyond the cost of merely changing the computer program.

It's like losing your wallet. Replacing the money is no sweat. Replacing your credit cards and ATM card and driver's license and business-travel receipts can be a living nightmare.

And so, even after computer memory became cheaper, and data storage became less cumbersome, there was still a powerful cost incentive to retain a two-digit year. Some famously prudent people programmed with a two-digit date, including Federal Reserve Chairman Alan Greenspan, who did it when he was an economics consultant in the 1960s. Greenspan sheepishly confessed his complicity to a congressional committee last year. He said he considered himself very clever at the time. . . .

A group did adopt a written standard for how to express dates in computers.

We are looking at it now.

It is a six-page document. It is so stultifying that it is virtually impossible to read. It is titled "Federal Information Processing Standards Publication 4: Specifications for Calendar Date." It is dated Nov. 1, 1968, and took effect on Jan. 1, 1970, precisely when Brooks says the lines on the graph crossed, precisely when a guiding hand might have helped.

On Page 3, a new federal standard for dates is promulgated. . . .

Federal Information Processing Standards Publication 4, Paragraph 4 and Subparagraph 4.1, is another of those statements. Here it is, in its entirety:

Calendar Date is represented by a numeric code of six consecutive positions that represent (from left to right, in high to low order sequence) the Year, the Month and the Day, as identified by the Gregorian Calendar. The first two positions represent the units and tens identification of the Year. For example, the Year 1914 is represented as 14, and the Year 1915 is represented as 15.

Ah.

The Y2K problem.

Set in stone.

By the United States government.

FIPS 4, as it was called, was limited in scope. It applied only to U.S. government computers, and only when they were communicating from agency to agency. Still, it was the first national computer date standard ever adopted, and it influenced others that followed. It would have affected any private business that wanted to communicate with government computers. It might have been a seed for change, had it mandated a four-digit year. . . .

Link: http://www.washingtonpost.com/wp-srv/WPlate/1999-07/18/...
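To see why that six-position code was a time bomb, here is a minimal C sketch (purely illustrative, not from the standard or the article) of how two-digit YYMMDD dates go wrong the moment 2000 arrives:

#include <stdio.h>
#include <string.h>

int main(void)
{
    /* FIPS 4 style six-position dates: YYMMDD */
    const char *dec_31_1999 = "991231";
    const char *jan_01_2000 = "000101";

    /* A plain string (or numeric) comparison puts 2000 before 1999. */
    if (strcmp(jan_01_2000, dec_31_1999) < 0)
        printf("%s sorts before %s -- the \"ambiguity in the year 2000\"\n",
               jan_01_2000, dec_31_1999);

    /* Elapsed-time arithmetic goes negative the same way. */
    int yy_now = 0, yy_then = 99;           /* 2000 and 1999 as two-digit years */
    printf("elapsed years: %d\n", yy_now - yy_then);   /* -99, not 1 */
    return 0;
}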

----------------------------------------------------------------------



-- Familyman (prepare@home.com), December 03, 1999.


try here

Or Father of ASCII Tackles Y2k

-- just_me (nada@nowhere.com), December 03, 1999.



SYSOP:

Please delete my thread

-- Familyman (prepare@home.com), December 03, 1999.


WAIT --- PLEASE DO NOT DELETE THIS THREAD!!!

It has lots of great information!

-- TA (sea_spur@yahoo.com), December 03, 1999.


"Our technology is beyond the comprehension of 99% of the population it supports"

----------------------------------------------------------------------

Velcro is beyond the comprehension of 99% of the population

-- MegaMe (CWHale67@aol.com), December 03, 1999.


Cause and effect, and reality, are beyond the comprehension of 99% of the people.

-- A (A@AisA.com), December 04, 1999.

Granted, the government decision was idiotic, as most government decisions are (EGTTTS -- Everything Government Touches Turns to Shit) -- but that is no excuse for industry not adopting and using a 4-digit year in date representations. There ARE industry organizations, such as the American National Standards Institute (ANSI). The Institute of Electrical and Electronics Engineers (IEEE) had a computing subgroup as far back as the 1950s.

It was a management decision, of various private companies, that set up the coming fiasco. Pure and simple.

Much as I hate government, I say you can't lay it all at government/military feet. Individuals and companies are SUPPOSED to be "smarter" than the pigs.

Companies could have adopted and used 4-digit years for all the interconnected BUSINESS applications, chopping off the first two digits when exporting date data to the government clowns (assuming YYYYMMDD format). It's easier to chop off two digits than to try to figure out what two digits to prefix with. But they didn't. A pox on all their houses.
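A minimal sketch of that point in C (the function names and the windowing rule are mine, purely illustrative): truncating a four-digit year on export is trivial, while restoring the century on import is guesswork.

#include <stdio.h>
#include <string.h>

/* YYYYMMDD -> YYMMDD: exporting to a two-digit format just drops the century. */
static void export_two_digit(const char *yyyymmdd, char *yymmdd)
{
    strcpy(yymmdd, yyyymmdd + 2);
}

/* YYMMDD -> YYYYMMDD: importing needs a windowing rule, e.g. 00-69 => 20xx,
   70-99 => 19xx, and the rule is only ever a guess. */
static void import_two_digit(const char *yymmdd, char *yyyymmdd)
{
    int yy = (yymmdd[0] - '0') * 10 + (yymmdd[1] - '0');
    sprintf(yyyymmdd, "%s%s", (yy < 70) ? "20" : "19", yymmdd);
}

int main(void)
{
    char exported[16], restored[16];
    export_two_digit("20000101", exported);  /* "000101" -- unambiguous to produce */
    import_two_digit(exported, restored);    /* right only if the window guess holds */
    printf("%s -> %s\n", exported, restored);
    return 0;
}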

-- A (A@AisA.com), December 04, 1999.



TA,

I think familyman was only asking to have his dup ASCII answer removed.

Anyway, I think that most programmers DO KNOW this. I think the 85/15 ratio may be a little off, but it is true that one almost never finds all bugs when testing. No matter how hard you try to put together a "full" set of test data, something that you never expect comes back to bite you. It's virtually impossible to check out all possible conditions in anything but the most simple program.

Even in cases where program A works just like you expected, its output goes to program B, and program C, and program D. When you have something like expanding a date field, and you forgot about, or didn't know about, program C, well, core dump city -- or better yet, the program runs but produces junk. This is why end-to-end testing is so important. But even with this type of testing, you are not guaranteed to have checked all conditions in any, or all, programs in a production stream. It may run today, but fail tomorrow.
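A minimal sketch of how that bites, assuming a hypothetical fixed-width record layout (not from any real system): program A has widened its date field to eight characters, but a downstream reader still expects the old six-character field and quietly reads the wrong bytes.

#include <stdio.h>
#include <string.h>

int main(void)
{
    /* New, widened layout written by program A: ACCOUNT(8) DATE(8) AMOUNT(6) */
    const char *record = "ACCT0001" "20000101" "001500";

    /* Downstream reader still coded for the old layout: ACCOUNT(8) DATE(6) AMOUNT(6) */
    char date[7], amount[7];
    memcpy(date, record + 8, 6);    date[6]   = '\0';  /* expects YYMMDD here      */
    memcpy(amount, record + 14, 6); amount[6] = '\0';  /* this offset is now wrong */

    printf("date=%s amount=%s\n", date, amount);
    /* Prints "date=200001 amount=010015": the date field now holds "200001"
       (month 00 if read as YYMMDD), and the amount has swallowed two digits
       of the year. No crash, just junk. */
    return 0;
}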

This stuff isn't easy. And I'm sure that Murphy is alive and well, and is going to have one hell of a party on 01/01/00, err, 01/01/2000. I just hope that he doesn't screw around with the "critical" stuff too much...

Tick... Tock... <:00=

-- Sysman (y2kboard@yahoo.com), December 04, 1999.

